haiqu.block_vector_loading()

This notebook demonstrates how to use haiqu.block_vector_loading() to prepare a quantum state from large 1D and 2D vectors with high fidelity. It slices the input vector or matrix into blocks, which are encoded independently. The method trades qubits for fidelity: each block is a smaller, simpler vector that is easier to load, but encoding the blocks requires more qubits. For the 64x64-pixel image used below, Vector Loading prepares a state with ~87% fidelity, while Block Vector Loading achieves ~97% fidelity.

What does it do? Block Vector Loading prepares a quantum state whose amplitudes match a given data vector, sliced into multiple blocks.

How do I use it? Pass a real or complex vector or matrix and the desired number of blocks to create a data loading job, then retrieve the result with job.result().

What are the options? Optional parameters include circuit-synthesis hyperparameters such as num_layers, truncation_cutoff, and fine_tuning_iterations.

Which options do you recommend? Start with a few blocks and observe the state fidelity returned by the job result, as well as the number of qubits required. If needed, increase the number of blocks and tune the synthesis parameters to reach the desired fidelity.

Initialize the benchmark

Import the necessary libraries, initialize the Haiqu SDK, and create the target quantum state.
import qiskit
import numpy as np
import pandas as pd
from haiqu.sdk import haiqu
import matplotlib.pyplot as plt
from skimage import data, color, transform
from qiskit.circuit.library import StatePreparation

haiqu.login()
haiqu.init("Block Vector Loading Tutorial")

target_size = 64

image_rgb = data.astronaut()
image_gray = color.rgb2gray(image_rgb)
image_resized = transform.resize(image_gray, (target_size, target_size), anti_aliasing=True)

plt.imshow(image_resized, cmap='gray')
plt.title("Image for Block Vector Loading")
plt.show()
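To build intuition for the block slicing, the sketch below (plain NumPy, not part of the Haiqu SDK) splits a 64x64 array into a 4x4 grid of blocks, mirroring the num_blocks=(4, 4) call used later. Each 16x16 block holds 256 amplitudes and fits on 8 qubits, with 4 more qubits needed to index the 16 blocks; the exact qubit accounting inside block_vector_loading() may differ, so treat this purely as an illustration of the slicing.

```python
import numpy as np

# Stand-in for the resized image: any 64x64 array works here.
image = np.random.default_rng(0).random((64, 64))

num_blocks = (4, 4)  # 4x4 grid of blocks, as in the tutorial
bh = image.shape[0] // num_blocks[0]  # block height: 16
bw = image.shape[1] // num_blocks[1]  # block width: 16

# Slice the image into independent blocks, row-major over the grid.
blocks = [
    image[i * bh:(i + 1) * bh, j * bw:(j + 1) * bw]
    for i in range(num_blocks[0])
    for j in range(num_blocks[1])
]

# Each block is a much smaller vector than the full image ...
amps_per_block = blocks[0].size                  # 256 amplitudes
qubits_per_block = int(np.log2(amps_per_block))  # 8 qubits
# ... at the cost of extra qubits to index the blocks.
index_qubits = int(np.log2(len(blocks)))         # 4 qubits

print(len(blocks), amps_per_block, qubits_per_block, index_qubits)
```

Loading a 256-amplitude block is a far easier synthesis problem than loading all 4096 amplitudes at once, which is why the blocked approach reaches higher fidelity for the same circuit-depth budget.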
Run benchmark scenarios Prepare a quantum state with Qiskit’s default and Haiqu’s methods
# Scenario 1: standard method
qiskit_sp = StatePreparation(image_resized.ravel(), normalize=True)
circuit_qiskit = qiskit.QuantumCircuit(qiskit_sp.num_qubits, name="Qiskit")
circuit_qiskit.compose(qiskit_sp, inplace=True)

# Scenario 2: Haiqu vector loading
state_gate, fidelity_vector = haiqu.vector_loading(data=image_resized.ravel()).result()
circuit_vector = qiskit.QuantumCircuit(state_gate.num_qubits, name="VL")
circuit_vector.compose(state_gate, inplace=True)
print(f"Vector loading prepares the image with fidelity: {fidelity_vector:.3f}")

# Scenario 3: Block Vector loading with 4x4 blocks
block_gate, fidelity_block = haiqu.block_vector_loading(data=image_resized.tolist(), num_blocks=(4, 4)).result()
circuit_block = qiskit.QuantumCircuit(block_gate.num_qubits, name="BVL")
circuit_block.compose(block_gate, inplace=True)
print(f"Block Vector loading prepares the image with fidelity: {fidelity_block:.3f}")
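The fidelity values printed above can be read as the squared overlap |⟨target|prepared⟩|² between the normalized target amplitudes and the state the synthesized circuit actually prepares. A minimal NumPy sketch of that comparison, independent of the SDK and using a toy "prepared" vector in place of a real circuit output:

```python
import numpy as np

def state_fidelity(target, prepared):
    """Squared overlap |<target|prepared>|^2 of two state vectors (normalized here)."""
    t = np.asarray(target, dtype=complex).ravel()
    p = np.asarray(prepared, dtype=complex).ravel()
    t = t / np.linalg.norm(t)
    p = p / np.linalg.norm(p)
    return abs(np.vdot(t, p)) ** 2

target = np.array([1.0, 2.0, 3.0, 4.0])
exact = target.copy()                             # perfect preparation
noisy = target + 0.1 * np.array([1, -1, 1, -1])   # toy approximation error

print(f"exact preparation fidelity: {state_fidelity(target, exact):.3f}")  # 1.000
print(f"noisy preparation fidelity: {state_fidelity(target, noisy):.3f}")
```

A fidelity of 1.0 means the prepared state matches the target exactly (up to global phase); values below 1.0 quantify the approximation error introduced by circuit synthesis.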
Block Vector Loading loads the image with much better fidelity than standard Vector Loading. Haiqu's loading methods also outperform the standard method in circuit complexity, as shown in the table below:
# we consider an ideal device with all-to-all connectivity
device_ideal = haiqu.get_device("aer_simulator")

circuit_qiskit_transpiled = haiqu.transpile(circuit_qiskit, device=device_ideal)

circuit_vector_transpiled = haiqu.transpile(circuit_vector, device=device_ideal)

circuit_block_transpiled = haiqu.transpile(circuit_block, device=device_ideal)

haiqu.compare_metrics(circuit_qiskit_transpiled, circuit_vector_transpiled, circuit_block_transpiled)