Quantum neural network

“Neural networks are not black boxes. They are a big pile of linear algebra.” - Randall Munroe, xkcd

Machine learning has a wide range of models for tasks such as classification, regression, and clustering. Neural networks are one of the most successful models, having experienced a resurgence in use over the past decade due to improvements in computational power and advanced software libraries. The typical structure of a neural network consists of a series of interacting layers that perform transformations on data passing through the network. An archetypal neural network structure is the feedforward neural network, visualized by the following example:


../_images/neural_network.svg


Here, the neural network depth is determined by the number of layers, while the maximum width is given by the layer with the greatest number of neurons. The network begins with an input layer of real-valued neurons, which feed forward onto a series of one or more hidden layers. Following the notation of [1], if the \(n\) neurons at one layer are given by the vector \(\mathbf{x} \in \mathbb{R}^{n}\), the \(m\) neurons of the next layer take the values

\[\mathcal{L}(\mathbf{x}) = \varphi (W \mathbf{x} + \mathbf{b}),\]

where

  • \(W \in \mathbb{R}^{m \times n}\) is a matrix,

  • \(\mathbf{b} \in \mathbb{R}^{m}\) is a vector, and

  • \(\varphi\) is a nonlinear function (also known as the activation function).

The matrix multiplication \(W \mathbf{x}\) is a linear transformation on \(\mathbf{x}\), while \(W \mathbf{x} + \mathbf{b}\) represents an affine transformation. In principle, any nonlinear function can be chosen for \(\varphi\), but often the choice is fixed from a standard set of activations that include the rectified linear unit (ReLU) and the sigmoid function acting on each neuron. Finally, the output layer enacts an affine transformation on the last hidden layer, but the activation function may be linear (including the identity), or a different nonlinear function such as softmax (for classification).
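As a purely classical illustration, a single fully connected layer of this form can be written in a few lines of NumPy. This is a minimal sketch; the helper name ``dense_layer`` and the sizes are arbitrary choices for the example:

import numpy as np

def dense_layer(x, W, b):
    """Classical fully connected layer: varphi(W x + b) with a ReLU activation."""
    return np.maximum(W @ x + b, 0)

# example sizes: n = 3 input neurons, m = 2 output neurons
x = np.array([0.5, -1.0, 2.0])
W = np.random.normal(size=(2, 3))
b = np.random.normal(size=(2,))

print(dense_layer(x, W, b))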

Layers in the feedforward neural network above are called fully connected as every neuron in a given hidden layer or output layer can be connected to all neurons in the previous layer through the matrix \(W\). Over time, specialized versions of layers have been developed to focus on different problems. For example, convolutional layers have a restricted form of connectivity and are suited to machine learning with images. We focus here on fully connected layers as the most general type.

Training of neural networks uses variations of the gradient descent algorithm on a cost function characterizing the similarity between outputs of the neural network and training data. The gradient of the cost function can be calculated using automatic differentiation, with knowledge of the feedforward network structure.
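For instance, the gradient of a simple squared-error cost for a layer like the one above can be obtained with TensorFlow's automatic differentiation. This is a minimal sketch with arbitrary toy data:

import tensorflow as tf

# toy data and parameters for a single fully connected layer (arbitrary sizes)
x = tf.constant([[0.5, -1.0, 2.0]])             # shape (1, 3)
y_target = tf.constant([[1.0, 0.0]])            # shape (1, 2)
W = tf.Variable(tf.random.normal([3, 2]))
b = tf.Variable(tf.zeros([2]))

with tf.GradientTape() as tape:
    y = tf.nn.relu(x @ W + b)                   # layer output
    cost = tf.reduce_mean((y - y_target) ** 2)  # squared-error cost

# gradients used by (stochastic) gradient descent
dW, db = tape.gradient(cost, [W, b])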

Quantum neural networks aim to encode neural networks into a quantum system, with the intention of benefiting from quantum information processing. There have been numerous attempts to define a quantum neural network, each with varying advantages and disadvantages. The quantum neural network detailed below, following the work of [1], has a CV architecture and is realized using standard CV gates from Strawberry Fields. One advantage of this CV architecture is that it naturally accommodates the continuous nature of neural networks. Additionally, the CV model can easily apply nonlinear transformations using the phase space picture - a task that qubit-based models struggle with, often relying on measurement postselection, which has a probability of failure.

Implementation

A CV quantum neural network layer can be defined as

\[\mathcal{L} := \Phi \circ \mathcal{D} \circ \mathcal{U}_{2} \circ \mathcal{S} \circ \mathcal{U}_{1},\]

where

  • \(\mathcal{U}_{k}=U_{k}(\boldsymbol{\theta}_{k},\boldsymbol{\phi}_{k})\) is an \(N\) mode interferometer,

  • \(\mathcal{D}=\otimes_{i=1}^{N}D(\alpha_{i})\) is a single mode displacement gate (Dgate) acting on each mode with complex displacement \(\alpha_{i} \in \mathbb{C}\),

  • \(\mathcal{S}=\otimes_{i=1}^{N}S(r_{i})\) is a single mode squeezing gate (Sgate) acting on each mode with squeezing parameter \(r_{i} \in \mathbb{R}\), and

  • \(\Phi=\otimes_{i=1}^{N}\Phi(\lambda_{i})\) is a non-Gaussian gate on each mode with parameter \(\lambda_{i} \in \mathbb{R}\).

Note

Any non-Gaussian gate such as the cubic phase gate (Vgate) represents a valid choice, but we recommend the Kerr gate (Kgate) for simulations in Strawberry Fields. The Kerr gate is more accurate numerically because it is diagonal in the Fock basis.
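As a quick illustration of this diagonality, the following sketch (using the "fock" backend with a small cutoff, purely for demonstration) applies a Kerr gate to a Fock state and confirms that only its phase changes:

import numpy as np
import strawberryfields as sf
from strawberryfields import ops

eng = sf.Engine("fock", backend_options={"cutoff_dim": 5})
prog = sf.Program(1)

with prog.context as q:
    ops.Fock(2) | q[0]     # prepare the Fock state |2>
    ops.Kgate(0.3) | q[0]  # Kerr gate K(kappa) = exp(i kappa n^2)

ket = eng.run(prog).state.ket()
print(np.round(np.abs(ket), 3))  # amplitudes unchanged: [0. 0. 1. 0. 0.]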

The layer is shown below as a circuit:


../_images/layer.svg


These layers can then be composed to form a quantum neural network. The width of the network can also be varied between layers [1].

Reproducing classical neural networks

Let’s see how the quantum layer can embed the transformation \(\mathcal{L}(\mathbf{x}) = \varphi (W \mathbf{x} + \mathbf{b})\) of a classical neural network layer. Suppose \(N\)-dimensional data is encoded in position eigenstates so that

\[\mathbf{x} \Leftrightarrow \ket{\mathbf{x}} := \ket{x_{1}} \otimes \ldots \otimes \ket{x_{N}}.\]

We want to perform the transformation

\[\ket{\mathbf{x}} \Rightarrow \ket{\varphi (W \mathbf{x} + \mathbf{b})}.\]

It turns out that the quantum circuit above can do precisely this! Consider first the affine transformation \(W \mathbf{x} + \mathbf{b}\). Leveraging the singular value decomposition, we can always write \(W = O_{2} \Sigma O_{1}\) with \(O_{k}\) orthogonal matrices and \(\Sigma\) a positive diagonal matrix. These orthogonal transformations can be carried out using interferometers without access to phase, i.e., with \(\boldsymbol{\phi}_{k} = 0\):

\[U_{k}(\boldsymbol{\theta}_{k},\mathbf{0})\ket{\mathbf{x}} = \ket{O_{k} \mathbf{x}}.\]

On the other hand, the diagonal matrix \(\Sigma = {\rm diag}\left(\{c_{i}\}_{i=1}^{N}\right)\) can be achieved through squeezing:

\[\otimes_{i=1}^{N}S(r_{i})\ket{\mathbf{x}} \propto \ket{\Sigma \mathbf{x}},\]

with \(r_{i} = \log (c_{i})\). Finally, the addition of a bias vector \(\mathbf{b}\) is done using position displacement gates:

\[\otimes_{i=1}^{N}D(\alpha_{i})\ket{\mathbf{x}} = \ket{\mathbf{x} + \mathbf{b}},\]

with \(\mathbf{b} = \{\alpha_{i}\}_{i=1}^{N}\) and \(\alpha_{i} \in \mathbb{R}\). Putting this all together, we see that the operation \(\mathcal{D} \circ \mathcal{U}_{2} \circ \mathcal{S} \circ \mathcal{U}_{1}\) with phaseless interferometers and position displacement performs the transformation \(\ket{\mathbf{x}} \Rightarrow \ket{W \mathbf{x} + \mathbf{b}}\) on position eigenstates.
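This decomposition can be verified numerically. The following NumPy sketch (with an arbitrary random matrix) checks that the SVD factors reproduce the affine transformation:

import numpy as np

W = np.random.normal(size=(3, 3))

# SVD: W = O2 @ Sigma @ O1, with O1, O2 orthogonal and Sigma diagonal and non-negative
O2, c, O1 = np.linalg.svd(W)
Sigma = np.diag(c)

x = np.random.normal(size=3)
b = np.random.normal(size=3)

# the transformation built from the decomposition agrees with W x + b
print(np.allclose(O2 @ (Sigma @ (O1 @ x)) + b, W @ x + b))  # True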

Warning

The TensorFlow backend is the natural simulator for quantum neural networks in Strawberry Fields, but this backend cannot naturally accommodate position eigenstates, which require infinite squeezing. For simulation of position eigenstates in this backend, the best approach is to use a displaced squeezed state (prepare_displaced_squeezed_state) with high squeezing value r. However, to avoid significant numerical error, it is important to make sure that all initial states have negligible amplitude for Fock states \(\ket{n}\) with \(n\geq \texttt{cutoff_dim}\), where \(\texttt{cutoff_dim}\) is the cutoff dimension.
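For example, the following sketch (assuming the same "tf" backend, with an illustrative squeezing value and cutoff) prepares a finitely squeezed and displaced approximation of a position eigenstate and checks that it fits within the cutoff:

import strawberryfields as sf
from strawberryfields import ops

cutoff_dim = 20
eng = sf.Engine("tf", backend_options={"cutoff_dim": cutoff_dim})
prog = sf.Program(1)

with prog.context as q:
    ops.Sgate(1.0) | q[0]   # finite squeezing approximating a position eigenstate
    ops.Xgate(0.2) | q[0]   # small position displacement

state = eng.run(prog).state
# the trace should be close to 1; a noticeably smaller value indicates that
# amplitude has leaked past the Fock cutoff and cutoff_dim should be increased
print(state.trace())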

Finally, the nonlinear function \(\varphi\) can be achieved through a restricted type of non-Gaussian gates \(\otimes_{i=1}^{N}\Phi(\lambda_{i})\) acting on each mode (see [1] for more details), resulting in the transformation

\[\otimes_{i=1}^{N}\Phi(\lambda_{i})\ket{\mathbf{x}} = \ket{\varphi(\mathbf{x})}.\]

The operation \(\mathcal{L} = \Phi \circ \mathcal{D} \circ \mathcal{U}_{2} \circ \mathcal{S} \circ \mathcal{U}_{1}\) with phaseless interferometers, position displacements, and restricted non-Gaussian gates can hence be seen as enacting a classical neural network layer \(\ket{\mathbf{x}} \Rightarrow \ket{\varphi(W \mathbf{x} + \mathbf{b})}\) on position eigenstates.

Extending to quantum neural networks

In fact, CV quantum neural network layers can be made more expressive than their classical counterparts. We can do this by lifting the above restrictions on \(\mathcal{L}\), i.e.:

  • Using arbitrary interferometers \(U_{k}(\boldsymbol{\theta}_{k},\boldsymbol{\phi}_{k})\) with access to phase and general displacement gates (i.e., not necessarily position displacement). This allows \(\mathcal{D} \circ \mathcal{U}_{2} \circ \mathcal{S} \circ \mathcal{U}_{1}\) to represent a general Gaussian operation.

  • Using arbitrary non-Gaussian gates \(\Phi(\lambda_{i})\), such as the Kerr gate.

  • Encoding data outside of the position eigenbasis, for example using instead the Fock basis.

Moreover, the gates in a single layer form a universal gate set, making the CV quantum neural network a model for universal quantum computing, i.e., a sufficient number of layers can carry out any quantum algorithm implementable on a CV quantum computer.

CV quantum neural networks can be trained both through classical simulation and directly on quantum hardware. Strawberry Fields relies on classical simulation to evaluate cost functions of the CV quantum neural network and the resultant gradients with respect to the parameters of each layer. However, this becomes an intractable task with increasing network depth and width. Ultimately, direct evaluation on hardware will likely be necessary for large-scale networks; an approach for hardware-based training is mapped out in [2]. The PennyLane library provides tools for training hybrid quantum-classical machine learning models, using both simulators and real-world quantum hardware.

Example CV quantum neural network layers are shown, for one to four modes, below:


../_images/layer_1mode.svg

One mode layer


../_images/layer_2mode.svg

Two mode layer


../_images/layer_3mode.svg

Three mode layer


../_images/layer_4mode.svg

Four mode layer


Here, the multimode linear interferometers \(U_{1}\) and \(U_{2}\) have been decomposed into two-mode phaseless beamsplitters (BSgate) and single-mode phase shifters (Rgate) using the Clements decomposition [3]. The Kerr gate is used as the non-Gaussian gate.

Code

First, we import Strawberry Fields, TensorFlow, and NumPy:

import numpy as np
import tensorflow as tf
import strawberryfields as sf
from strawberryfields import ops

Before we begin defining our optimization problem, let’s first create some convenient utility functions.

Utility functions

The first step to writing a CV quantum neural network layer in Strawberry Fields is to define a function for the two interferometers:

def interferometer(params, q):
    """Parameterised interferometer acting on ``N`` modes.

    Args:
        params (list[float]): list of length ``max(1, N-1) + (N-1)*N`` parameters.

            * The first ``N(N-1)/2`` parameters correspond to the beamsplitter angles
            * The second ``N(N-1)/2`` parameters correspond to the beamsplitter phases
            * The final ``N-1`` parameters correspond to local rotations on the first ``N-1`` modes

        q (list[RegRef]): list of Strawberry Fields quantum registers the interferometer
            is to be applied to
    """
    N = len(q)
    theta = params[:N*(N-1)//2]
    phi = params[N*(N-1)//2:N*(N-1)]
    rphi = params[-N+1:]

    if N == 1:
        # the interferometer is a single rotation
        ops.Rgate(rphi[0]) | q[0]
        return

    n = 0  # keep track of free parameters

    # Apply the rectangular beamsplitter array
    # The array depth is N
    for l in range(N):
        for k, (q1, q2) in enumerate(zip(q[:-1], q[1:])):
            # skip even or odd pairs depending on layer
            if (l + k) % 2 != 1:
                ops.BSgate(theta[n], phi[n]) | (q1, q2)
                n += 1

    # apply the final local phase shifts to all modes except the last one
    for i in range(max(1, N - 1)):
        ops.Rgate(rphi[i]) | q[i]

Warning

The Interferometer class in Strawberry Fields does not reproduce the functionality above. Instead, Interferometer applies a given input unitary matrix according to the Clements decomposition.
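As a quick sanity check (not part of the layer construction itself), the function above can be applied to a small two-mode program and the resulting circuit inspected:

import numpy as np
import strawberryfields as sf

N = 2
num_params = max(1, N - 1) + N * (N - 1)   # 3 parameters for two modes

prog = sf.Program(N)
params = np.random.normal(size=num_params)

with prog.context as q:
    interferometer(params, q)

prog.print()   # one BSgate followed by an Rgate on the first mode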

Using the above interferometer function, an \(N\) mode CV quantum neural network layer is given by the function:

def layer(params, q):
    """CV quantum neural network layer acting on ``N`` modes.

    Args:
        params (list[float]): list of length ``2*(max(1, N-1) + N**2 + N)`` containing
            the parameters for the layer
        q (list[RegRef]): list of Strawberry Fields quantum registers the layer
            is to be applied to
    """
    N = len(q)
    M = int(N * (N - 1)) + max(1, N - 1)

    int1 = params[:M]
    s = params[M:M+N]
    int2 = params[M+N:2*M+N]
    dr = params[2*M+N:2*M+2*N]
    dp = params[2*M+2*N:2*M+3*N]
    k = params[2*M+3*N:2*M+4*N]

    # begin layer
    interferometer(int1, q)

    for i in range(N):
        ops.Sgate(s[i]) | q[i]

    interferometer(int2, q)

    for i in range(N):
        ops.Dgate(dr[i], dp[i]) | q[i]
        ops.Kgate(k[i]) | q[i]
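The slicing above implies the per-layer parameter count stated in the docstring. As a quick check:

# per-layer parameter count: 2*M + 4*N = 2*(max(1, N-1) + N**2 + N)
for N in [1, 2, 3, 4]:
    M = N * (N - 1) + max(1, N - 1)
    print(N, 2 * M + 4 * N)   # 1 -> 6, 2 -> 14, 3 -> 28, 4 -> 46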

Finally, we define one more utility function to help us initialize the TensorFlow weights for our quantum neural network layers:

def init_weights(modes, layers, active_sd=0.0001, passive_sd=0.1):
    """Initialize a 2D TensorFlow Variable containing normally-distributed
    random weights for an ``N`` mode quantum neural network with ``L`` layers.

    Args:
        modes (int): the number of modes in the quantum neural network
        layers (int): the number of layers in the quantum neural network
        active_sd (float): the standard deviation used when initializing
            the normally-distributed weights for the active parameters
            (displacement, squeezing, and Kerr magnitude)
        passive_sd (float): the standard deviation used when initializing
            the normally-distributed weights for the passive parameters
            (beamsplitter angles and all gate phases)

    Returns:
        tf.Variable[tf.float32]: A TensorFlow Variable of shape
        ``[layers, 2*(max(1, modes-1) + modes**2 + modes)]``, where the Lth
        row represents the layer parameters for the Lth layer.
    """
    # Number of interferometer parameters:
    M = int(modes * (modes - 1)) + max(1, modes - 1)

    # Create the TensorFlow variables
    int1_weights = tf.random.normal(shape=[layers, M], stddev=passive_sd)
    s_weights = tf.random.normal(shape=[layers, modes], stddev=active_sd)
    int2_weights = tf.random.normal(shape=[layers, M], stddev=passive_sd)
    dr_weights = tf.random.normal(shape=[layers, modes], stddev=active_sd)
    dp_weights = tf.random.normal(shape=[layers, modes], stddev=passive_sd)
    k_weights = tf.random.normal(shape=[layers, modes], stddev=active_sd)

    weights = tf.concat(
        [int1_weights, s_weights, int2_weights, dr_weights, dp_weights, k_weights], axis=1
    )

    weights = tf.Variable(weights)

    return weights

Optimization

Now that we have our utility functions, let's begin defining our optimization problem. In this particular example, we create a 1 mode CVQNN with 8 layers and a Fock-basis cutoff dimension of 6. We will train this QNN to output a desired target state: the single photon state.

# set the random seed
tf.random.set_seed(137)
np.random.seed(137)


# define width and depth of CV quantum neural network
modes = 1
layers = 8
cutoff_dim = 6


# defining desired state (single photon state)
target_state = np.zeros(cutoff_dim)
target_state[1] = 1
target_state = tf.constant(target_state, dtype=tf.complex64)

Now, let’s initialize an engine with the TensorFlow "tf" backend, and begin constructing our QNN program.

# initialize engine and program
eng = sf.Engine(backend="tf", backend_options={"cutoff_dim": cutoff_dim})
qnn = sf.Program(modes)

# initialize QNN weights
weights = init_weights(modes, layers) # our TensorFlow weights
num_params = np.prod(weights.shape)   # total number of parameters in our model

To construct the program, we must create and use Strawberry Fields symbolic gate arguments. These will be mapped to the TensorFlow variables on engine execution.

# Create array of Strawberry Fields symbolic gate arguments, matching
# the size of the weights Variable.
sf_params = np.arange(num_params).reshape(weights.shape).astype(str)
sf_params = np.array([qnn.params(*i) for i in sf_params])


# Construct the symbolic Strawberry Fields program by
# looping and applying layers to the program.
with qnn.context as q:
    for k in range(layers):
        layer(sf_params[k], q)

where sf_params is a NumPy array of shape [layers, 2*(max(1, modes-1) + modes**2 + modes)] containing the symbolic gate arguments for the quantum neural network.

Now that our QNN program is defined, we can create our cost function. Our cost function simply executes the QNN on our engine using the values of the input weights.

Since we want to maximize the fidelity \(f(w) = |\langle \psi(w) | \psi_t\rangle|^2\) between our QNN output state \(|\psi(w)\rangle\) and our target state \(|\psi_t\rangle\), we compute the inner product between the two statevectors, as well as the norm of their difference \(\left\lVert \psi(w) - \psi_t\right\rVert\), which is used as the cost to be minimized.

Finally, we also return the trace of the output QNN state. This should always have a value close to 1. If it deviates significantly from 1, this is an indication that we need to increase our Fock-basis cutoff.

def cost(weights):
    # Create a dictionary mapping from the names of the Strawberry Fields
    # symbolic gate parameters to the TensorFlow weight values.
    mapping = {p.name: w for p, w in zip(sf_params.flatten(), tf.reshape(weights, [-1]))}

    # run the engine
    state = eng.run(qnn, args=mapping).state
    ket = state.ket()

    difference = tf.reduce_sum(tf.abs(ket - target_state))
    fidelity = tf.abs(tf.reduce_sum(tf.math.conj(ket) * target_state)) ** 2
    return difference, fidelity, ket, tf.math.real(state.trace())

We are now ready to minimize our cost function using TensorFlow:

# set up the optimizer
opt = tf.keras.optimizers.Adam()
cost_before, fidelity_before, _, _ = cost(weights)

# Perform the optimization
for i in range(1000):
    # reset the engine if it has already been executed
    if eng.run_progs:
        eng.reset()

    with tf.GradientTape() as tape:
        loss, fid, ket, trace = cost(weights)

    # one repetition of the optimization
    gradients = tape.gradient(loss, weights)
    opt.apply_gradients(zip([gradients], [weights]))

    # Prints progress at every rep
    if i % 1 == 0:
        print("Rep: {} Cost: {:.4f} Fidelity: {:.4f} Trace: {:.4f}".format(i, loss, fid, trace))


print("\nFidelity before optimization: ", fidelity_before.numpy())
print("Fidelity after optimization: ", fid.numpy())
print("\nTarget state: ", target_state.numpy())
print("Output state: ", np.round(ket.numpy(), decimals=3))

Out:

Rep: 0 Cost: 2.0001 Fidelity: 0.0000 Trace: 1.0000
Rep: 1 Cost: 1.9978 Fidelity: 0.0001 Trace: 1.0000
Rep: 2 Cost: 1.9897 Fidelity: 0.0002 Trace: 1.0000
Rep: 3 Cost: 1.9794 Fidelity: 0.0006 Trace: 1.0000
Rep: 4 Cost: 1.9681 Fidelity: 0.0010 Trace: 1.0000
Rep: 5 Cost: 1.9632 Fidelity: 0.0016 Trace: 1.0000
Rep: 6 Cost: 1.9563 Fidelity: 0.0023 Trace: 1.0000
Rep: 7 Cost: 1.9476 Fidelity: 0.0031 Trace: 1.0000
Rep: 8 Cost: 1.9377 Fidelity: 0.0041 Trace: 1.0000
Rep: 9 Cost: 1.9268 Fidelity: 0.0052 Trace: 1.0000
Rep: 10 Cost: 1.9196 Fidelity: 0.0064 Trace: 1.0000
Rep: 11 Cost: 1.9130 Fidelity: 0.0077 Trace: 1.0000
Rep: 12 Cost: 1.9055 Fidelity: 0.0091 Trace: 1.0000
Rep: 13 Cost: 1.8971 Fidelity: 0.0107 Trace: 1.0000
Rep: 14 Cost: 1.8880 Fidelity: 0.0124 Trace: 1.0000
Rep: 15 Cost: 1.8789 Fidelity: 0.0142 Trace: 1.0000
Rep: 16 Cost: 1.8695 Fidelity: 0.0162 Trace: 1.0000
Rep: 17 Cost: 1.8601 Fidelity: 0.0183 Trace: 1.0000
Rep: 18 Cost: 1.8505 Fidelity: 0.0205 Trace: 1.0000
Rep: 19 Cost: 1.8410 Fidelity: 0.0229 Trace: 1.0000
Rep: 20 Cost: 1.8327 Fidelity: 0.0254 Trace: 1.0000
Rep: 21 Cost: 1.8241 Fidelity: 0.0280 Trace: 1.0000
Rep: 22 Cost: 1.8145 Fidelity: 0.0308 Trace: 1.0000
Rep: 23 Cost: 1.8060 Fidelity: 0.0337 Trace: 1.0000
Rep: 24 Cost: 1.7979 Fidelity: 0.0367 Trace: 1.0000
Rep: 25 Cost: 1.7897 Fidelity: 0.0398 Trace: 1.0000
Rep: 26 Cost: 1.7815 Fidelity: 0.0431 Trace: 1.0000
Rep: 27 Cost: 1.7732 Fidelity: 0.0464 Trace: 1.0000
Rep: 28 Cost: 1.7649 Fidelity: 0.0498 Trace: 1.0000
Rep: 29 Cost: 1.7566 Fidelity: 0.0533 Trace: 1.0000
Rep: 30 Cost: 1.7484 Fidelity: 0.0569 Trace: 1.0000
Rep: 31 Cost: 1.7403 Fidelity: 0.0606 Trace: 1.0000
Rep: 32 Cost: 1.7322 Fidelity: 0.0644 Trace: 1.0000
Rep: 33 Cost: 1.7242 Fidelity: 0.0683 Trace: 1.0000
Rep: 34 Cost: 1.7164 Fidelity: 0.0723 Trace: 1.0000
Rep: 35 Cost: 1.7087 Fidelity: 0.0763 Trace: 1.0000
Rep: 36 Cost: 1.7012 Fidelity: 0.0804 Trace: 1.0000
Rep: 37 Cost: 1.6938 Fidelity: 0.0846 Trace: 1.0000
Rep: 38 Cost: 1.6866 Fidelity: 0.0888 Trace: 1.0000
Rep: 39 Cost: 1.6795 Fidelity: 0.0931 Trace: 1.0000
Rep: 40 Cost: 1.6726 Fidelity: 0.0975 Trace: 1.0000
Rep: 41 Cost: 1.6659 Fidelity: 0.1019 Trace: 1.0000
Rep: 42 Cost: 1.6593 Fidelity: 0.1063 Trace: 1.0000
Rep: 43 Cost: 1.6529 Fidelity: 0.1108 Trace: 1.0000
Rep: 44 Cost: 1.6467 Fidelity: 0.1154 Trace: 1.0000
Rep: 45 Cost: 1.6405 Fidelity: 0.1199 Trace: 1.0000
Rep: 46 Cost: 1.6346 Fidelity: 0.1245 Trace: 1.0000
Rep: 47 Cost: 1.6287 Fidelity: 0.1291 Trace: 1.0000
Rep: 48 Cost: 1.6230 Fidelity: 0.1337 Trace: 1.0000
Rep: 49 Cost: 1.6173 Fidelity: 0.1384 Trace: 1.0000
Rep: 50 Cost: 1.6117 Fidelity: 0.1430 Trace: 1.0000
Rep: 51 Cost: 1.6062 Fidelity: 0.1476 Trace: 1.0000
Rep: 52 Cost: 1.6007 Fidelity: 0.1523 Trace: 1.0000
Rep: 53 Cost: 1.5952 Fidelity: 0.1569 Trace: 1.0000
Rep: 54 Cost: 1.5897 Fidelity: 0.1616 Trace: 1.0000
Rep: 55 Cost: 1.5842 Fidelity: 0.1662 Trace: 1.0000
Rep: 56 Cost: 1.5786 Fidelity: 0.1708 Trace: 1.0000
Rep: 57 Cost: 1.5731 Fidelity: 0.1754 Trace: 1.0000
Rep: 58 Cost: 1.5674 Fidelity: 0.1800 Trace: 1.0000
Rep: 59 Cost: 1.5617 Fidelity: 0.1846 Trace: 1.0000
Rep: 60 Cost: 1.5560 Fidelity: 0.1892 Trace: 1.0000
Rep: 61 Cost: 1.5502 Fidelity: 0.1938 Trace: 1.0000
Rep: 62 Cost: 1.5445 Fidelity: 0.1984 Trace: 1.0000
Rep: 63 Cost: 1.5389 Fidelity: 0.2030 Trace: 1.0000
Rep: 64 Cost: 1.5333 Fidelity: 0.2076 Trace: 1.0000
Rep: 65 Cost: 1.5276 Fidelity: 0.2122 Trace: 1.0000
Rep: 66 Cost: 1.5219 Fidelity: 0.2168 Trace: 1.0000
Rep: 67 Cost: 1.5161 Fidelity: 0.2215 Trace: 1.0000
Rep: 68 Cost: 1.5101 Fidelity: 0.2261 Trace: 1.0000
Rep: 69 Cost: 1.5040 Fidelity: 0.2307 Trace: 1.0000
Rep: 70 Cost: 1.4977 Fidelity: 0.2354 Trace: 1.0000
Rep: 71 Cost: 1.4912 Fidelity: 0.2400 Trace: 1.0000
Rep: 72 Cost: 1.4845 Fidelity: 0.2446 Trace: 1.0000
Rep: 73 Cost: 1.4775 Fidelity: 0.2492 Trace: 1.0000
Rep: 74 Cost: 1.4703 Fidelity: 0.2538 Trace: 1.0000
Rep: 75 Cost: 1.4629 Fidelity: 0.2583 Trace: 1.0000
Rep: 76 Cost: 1.4553 Fidelity: 0.2630 Trace: 1.0000
Rep: 77 Cost: 1.4474 Fidelity: 0.2676 Trace: 1.0000
Rep: 78 Cost: 1.4392 Fidelity: 0.2724 Trace: 1.0000
Rep: 79 Cost: 1.4308 Fidelity: 0.2772 Trace: 1.0000
Rep: 80 Cost: 1.4222 Fidelity: 0.2822 Trace: 1.0000
Rep: 81 Cost: 1.4132 Fidelity: 0.2873 Trace: 1.0000
Rep: 82 Cost: 1.4040 Fidelity: 0.2926 Trace: 1.0000
Rep: 83 Cost: 1.3945 Fidelity: 0.2980 Trace: 1.0000
Rep: 84 Cost: 1.3848 Fidelity: 0.3036 Trace: 1.0000
Rep: 85 Cost: 1.3748 Fidelity: 0.3094 Trace: 1.0000
Rep: 86 Cost: 1.3646 Fidelity: 0.3153 Trace: 1.0000
Rep: 87 Cost: 1.3543 Fidelity: 0.3214 Trace: 1.0000
Rep: 88 Cost: 1.3438 Fidelity: 0.3276 Trace: 1.0000
Rep: 89 Cost: 1.3334 Fidelity: 0.3340 Trace: 1.0000
Rep: 90 Cost: 1.3231 Fidelity: 0.3406 Trace: 1.0000
Rep: 91 Cost: 1.3129 Fidelity: 0.3473 Trace: 1.0000
Rep: 92 Cost: 1.3028 Fidelity: 0.3543 Trace: 1.0000
Rep: 93 Cost: 1.2925 Fidelity: 0.3614 Trace: 1.0000
Rep: 94 Cost: 1.2821 Fidelity: 0.3686 Trace: 1.0000
Rep: 95 Cost: 1.2715 Fidelity: 0.3759 Trace: 1.0000
Rep: 96 Cost: 1.2606 Fidelity: 0.3832 Trace: 1.0000
Rep: 97 Cost: 1.2493 Fidelity: 0.3905 Trace: 1.0000
Rep: 98 Cost: 1.2376 Fidelity: 0.3978 Trace: 1.0000
Rep: 99 Cost: 1.2257 Fidelity: 0.4051 Trace: 1.0000
Rep: 100 Cost: 1.2152 Fidelity: 0.4123 Trace: 1.0000
Rep: 101 Cost: 1.2057 Fidelity: 0.4197 Trace: 1.0000
Rep: 102 Cost: 1.1951 Fidelity: 0.4272 Trace: 1.0000
Rep: 103 Cost: 1.1841 Fidelity: 0.4345 Trace: 1.0000
Rep: 104 Cost: 1.1739 Fidelity: 0.4417 Trace: 1.0000
Rep: 105 Cost: 1.1641 Fidelity: 0.4487 Trace: 1.0000
Rep: 106 Cost: 1.1538 Fidelity: 0.4554 Trace: 1.0000
Rep: 107 Cost: 1.1427 Fidelity: 0.4620 Trace: 1.0000
Rep: 108 Cost: 1.1325 Fidelity: 0.4685 Trace: 1.0000
Rep: 109 Cost: 1.1229 Fidelity: 0.4749 Trace: 1.0000
Rep: 110 Cost: 1.1116 Fidelity: 0.4812 Trace: 1.0000
Rep: 111 Cost: 1.1032 Fidelity: 0.4875 Trace: 0.9999
Rep: 112 Cost: 1.0936 Fidelity: 0.4937 Trace: 0.9999
Rep: 113 Cost: 1.0821 Fidelity: 0.4998 Trace: 0.9999
Rep: 114 Cost: 1.0717 Fidelity: 0.5058 Trace: 0.9999
Rep: 115 Cost: 1.0628 Fidelity: 0.5117 Trace: 0.9999
Rep: 116 Cost: 1.0528 Fidelity: 0.5175 Trace: 0.9999
Rep: 117 Cost: 1.0420 Fidelity: 0.5233 Trace: 0.9999
Rep: 118 Cost: 1.0329 Fidelity: 0.5289 Trace: 0.9999
Rep: 119 Cost: 1.0234 Fidelity: 0.5345 Trace: 0.9999
Rep: 120 Cost: 1.0138 Fidelity: 0.5402 Trace: 0.9999
Rep: 121 Cost: 1.0055 Fidelity: 0.5458 Trace: 0.9999
Rep: 122 Cost: 0.9962 Fidelity: 0.5514 Trace: 0.9999
Rep: 123 Cost: 0.9864 Fidelity: 0.5570 Trace: 0.9998
Rep: 124 Cost: 0.9781 Fidelity: 0.5626 Trace: 0.9998
Rep: 125 Cost: 0.9695 Fidelity: 0.5682 Trace: 0.9998
Rep: 126 Cost: 0.9607 Fidelity: 0.5736 Trace: 0.9998
Rep: 127 Cost: 0.9518 Fidelity: 0.5790 Trace: 0.9998
Rep: 128 Cost: 0.9445 Fidelity: 0.5844 Trace: 0.9998
Rep: 129 Cost: 0.9367 Fidelity: 0.5898 Trace: 0.9998
Rep: 130 Cost: 0.9276 Fidelity: 0.5952 Trace: 0.9997
Rep: 131 Cost: 0.9177 Fidelity: 0.6005 Trace: 0.9997
Rep: 132 Cost: 0.9120 Fidelity: 0.6058 Trace: 0.9997
Rep: 133 Cost: 0.9034 Fidelity: 0.6111 Trace: 0.9997
Rep: 134 Cost: 0.8945 Fidelity: 0.6163 Trace: 0.9996
Rep: 135 Cost: 0.8868 Fidelity: 0.6214 Trace: 0.9996
Rep: 136 Cost: 0.8785 Fidelity: 0.6265 Trace: 0.9996
Rep: 137 Cost: 0.8690 Fidelity: 0.6314 Trace: 0.9996
Rep: 138 Cost: 0.8621 Fidelity: 0.6364 Trace: 0.9995
Rep: 139 Cost: 0.8545 Fidelity: 0.6413 Trace: 0.9995
Rep: 140 Cost: 0.8445 Fidelity: 0.6463 Trace: 0.9995
Rep: 141 Cost: 0.8374 Fidelity: 0.6513 Trace: 0.9995
Rep: 142 Cost: 0.8296 Fidelity: 0.6563 Trace: 0.9994
Rep: 143 Cost: 0.8215 Fidelity: 0.6611 Trace: 0.9994
Rep: 144 Cost: 0.8138 Fidelity: 0.6658 Trace: 0.9994
Rep: 145 Cost: 0.8044 Fidelity: 0.6705 Trace: 0.9993
Rep: 146 Cost: 0.8001 Fidelity: 0.6752 Trace: 0.9993
Rep: 147 Cost: 0.7937 Fidelity: 0.6799 Trace: 0.9993
Rep: 148 Cost: 0.7847 Fidelity: 0.6845 Trace: 0.9992
Rep: 149 Cost: 0.7759 Fidelity: 0.6891 Trace: 0.9992
Rep: 150 Cost: 0.7693 Fidelity: 0.6937 Trace: 0.9991
Rep: 151 Cost: 0.7606 Fidelity: 0.6984 Trace: 0.9991
Rep: 152 Cost: 0.7540 Fidelity: 0.7030 Trace: 0.9990
Rep: 153 Cost: 0.7469 Fidelity: 0.7074 Trace: 0.9990
Rep: 154 Cost: 0.7378 Fidelity: 0.7117 Trace: 0.9989
Rep: 155 Cost: 0.7326 Fidelity: 0.7159 Trace: 0.9989
Rep: 156 Cost: 0.7260 Fidelity: 0.7201 Trace: 0.9988
Rep: 157 Cost: 0.7168 Fidelity: 0.7244 Trace: 0.9988
Rep: 158 Cost: 0.7103 Fidelity: 0.7286 Trace: 0.9987
Rep: 159 Cost: 0.7035 Fidelity: 0.7327 Trace: 0.9987
Rep: 160 Cost: 0.6949 Fidelity: 0.7366 Trace: 0.9986
Rep: 161 Cost: 0.6891 Fidelity: 0.7403 Trace: 0.9985
Rep: 162 Cost: 0.6819 Fidelity: 0.7441 Trace: 0.9984
Rep: 163 Cost: 0.6728 Fidelity: 0.7481 Trace: 0.9984
Rep: 164 Cost: 0.6664 Fidelity: 0.7518 Trace: 0.9983
Rep: 165 Cost: 0.6589 Fidelity: 0.7554 Trace: 0.9982
Rep: 166 Cost: 0.6515 Fidelity: 0.7591 Trace: 0.9982
Rep: 167 Cost: 0.6454 Fidelity: 0.7627 Trace: 0.9981
Rep: 168 Cost: 0.6385 Fidelity: 0.7661 Trace: 0.9980
Rep: 169 Cost: 0.6311 Fidelity: 0.7696 Trace: 0.9979
Rep: 170 Cost: 0.6262 Fidelity: 0.7731 Trace: 0.9979
Rep: 171 Cost: 0.6192 Fidelity: 0.7765 Trace: 0.9978
Rep: 172 Cost: 0.6116 Fidelity: 0.7797 Trace: 0.9977
Rep: 173 Cost: 0.6052 Fidelity: 0.7830 Trace: 0.9976
Rep: 174 Cost: 0.5990 Fidelity: 0.7864 Trace: 0.9975
Rep: 175 Cost: 0.5920 Fidelity: 0.7897 Trace: 0.9974
Rep: 176 Cost: 0.5859 Fidelity: 0.7926 Trace: 0.9973
Rep: 177 Cost: 0.5791 Fidelity: 0.7957 Trace: 0.9972
Rep: 178 Cost: 0.5732 Fidelity: 0.7990 Trace: 0.9971
Rep: 179 Cost: 0.5669 Fidelity: 0.8020 Trace: 0.9970
Rep: 180 Cost: 0.5607 Fidelity: 0.8046 Trace: 0.9969
Rep: 181 Cost: 0.5541 Fidelity: 0.8074 Trace: 0.9968
Rep: 182 Cost: 0.5496 Fidelity: 0.8104 Trace: 0.9967
Rep: 183 Cost: 0.5442 Fidelity: 0.8132 Trace: 0.9966
Rep: 184 Cost: 0.5371 Fidelity: 0.8156 Trace: 0.9964
Rep: 185 Cost: 0.5339 Fidelity: 0.8180 Trace: 0.9962
Rep: 186 Cost: 0.5283 Fidelity: 0.8206 Trace: 0.9961
Rep: 187 Cost: 0.5231 Fidelity: 0.8233 Trace: 0.9960
Rep: 188 Cost: 0.5182 Fidelity: 0.8256 Trace: 0.9959
Rep: 189 Cost: 0.5126 Fidelity: 0.8276 Trace: 0.9957
Rep: 190 Cost: 0.5078 Fidelity: 0.8297 Trace: 0.9955
Rep: 191 Cost: 0.5015 Fidelity: 0.8321 Trace: 0.9954
Rep: 192 Cost: 0.4982 Fidelity: 0.8345 Trace: 0.9953
Rep: 193 Cost: 0.4927 Fidelity: 0.8363 Trace: 0.9951
Rep: 194 Cost: 0.4875 Fidelity: 0.8376 Trace: 0.9949
Rep: 195 Cost: 0.4836 Fidelity: 0.8391 Trace: 0.9947
Rep: 196 Cost: 0.4780 Fidelity: 0.8411 Trace: 0.9946
Rep: 197 Cost: 0.4750 Fidelity: 0.8434 Trace: 0.9944
Rep: 198 Cost: 0.4710 Fidelity: 0.8453 Trace: 0.9943
Rep: 199 Cost: 0.4653 Fidelity: 0.8469 Trace: 0.9941
Rep: 200 Cost: 0.4618 Fidelity: 0.8494 Trace: 0.9939
Rep: 201 Cost: 0.4574 Fidelity: 0.8525 Trace: 0.9938
Rep: 202 Cost: 0.4535 Fidelity: 0.8553 Trace: 0.9937
Rep: 203 Cost: 0.4509 Fidelity: 0.8573 Trace: 0.9935
Rep: 204 Cost: 0.4468 Fidelity: 0.8595 Trace: 0.9933
Rep: 205 Cost: 0.4428 Fidelity: 0.8619 Trace: 0.9932
Rep: 206 Cost: 0.4390 Fidelity: 0.8637 Trace: 0.9930
Rep: 207 Cost: 0.4350 Fidelity: 0.8654 Trace: 0.9928
Rep: 208 Cost: 0.4307 Fidelity: 0.8673 Trace: 0.9927
Rep: 209 Cost: 0.4280 Fidelity: 0.8693 Trace: 0.9925
Rep: 210 Cost: 0.4238 Fidelity: 0.8707 Trace: 0.9923
Rep: 211 Cost: 0.4216 Fidelity: 0.8718 Trace: 0.9921
Rep: 212 Cost: 0.4165 Fidelity: 0.8737 Trace: 0.9919
Rep: 213 Cost: 0.4148 Fidelity: 0.8755 Trace: 0.9918
Rep: 214 Cost: 0.4111 Fidelity: 0.8767 Trace: 0.9916
Rep: 215 Cost: 0.4069 Fidelity: 0.8776 Trace: 0.9914
Rep: 216 Cost: 0.4033 Fidelity: 0.8791 Trace: 0.9912
Rep: 217 Cost: 0.4014 Fidelity: 0.8810 Trace: 0.9911
Rep: 218 Cost: 0.3976 Fidelity: 0.8820 Trace: 0.9909
Rep: 219 Cost: 0.3947 Fidelity: 0.8824 Trace: 0.9906
Rep: 220 Cost: 0.3920 Fidelity: 0.8834 Trace: 0.9904
Rep: 221 Cost: 0.3866 Fidelity: 0.8851 Trace: 0.9903
Rep: 222 Cost: 0.3834 Fidelity: 0.8861 Trace: 0.9901
Rep: 223 Cost: 0.3841 Fidelity: 0.8864 Trace: 0.9898
Rep: 224 Cost: 0.3780 Fidelity: 0.8886 Trace: 0.9897
Rep: 225 Cost: 0.3760 Fidelity: 0.8904 Trace: 0.9896
Rep: 226 Cost: 0.3731 Fidelity: 0.8913 Trace: 0.9893
Rep: 227 Cost: 0.3699 Fidelity: 0.8924 Trace: 0.9891
Rep: 228 Cost: 0.3665 Fidelity: 0.8939 Trace: 0.9890
Rep: 229 Cost: 0.3646 Fidelity: 0.8951 Trace: 0.9888
Rep: 230 Cost: 0.3611 Fidelity: 0.8963 Trace: 0.9887
Rep: 231 Cost: 0.3601 Fidelity: 0.8972 Trace: 0.9884
Rep: 232 Cost: 0.3568 Fidelity: 0.8983 Trace: 0.9883
Rep: 233 Cost: 0.3531 Fidelity: 0.8995 Trace: 0.9881
Rep: 234 Cost: 0.3500 Fidelity: 0.9005 Trace: 0.9879
Rep: 235 Cost: 0.3503 Fidelity: 0.9019 Trace: 0.9878
Rep: 236 Cost: 0.3476 Fidelity: 0.9029 Trace: 0.9876
Rep: 237 Cost: 0.3426 Fidelity: 0.9035 Trace: 0.9873
Rep: 238 Cost: 0.3426 Fidelity: 0.9044 Trace: 0.9872
Rep: 239 Cost: 0.3382 Fidelity: 0.9057 Trace: 0.9871
Rep: 240 Cost: 0.3367 Fidelity: 0.9065 Trace: 0.9869
Rep: 241 Cost: 0.3329 Fidelity: 0.9073 Trace: 0.9866
Rep: 242 Cost: 0.3316 Fidelity: 0.9086 Trace: 0.9865
Rep: 243 Cost: 0.3288 Fidelity: 0.9099 Trace: 0.9863
Rep: 244 Cost: 0.3256 Fidelity: 0.9106 Trace: 0.9861
Rep: 245 Cost: 0.3246 Fidelity: 0.9112 Trace: 0.9859
Rep: 246 Cost: 0.3203 Fidelity: 0.9122 Trace: 0.9857
Rep: 247 Cost: 0.3190 Fidelity: 0.9135 Trace: 0.9857
Rep: 248 Cost: 0.3167 Fidelity: 0.9143 Trace: 0.9855
Rep: 249 Cost: 0.3121 Fidelity: 0.9146 Trace: 0.9852
Rep: 250 Cost: 0.3113 Fidelity: 0.9157 Trace: 0.9851
Rep: 251 Cost: 0.3081 Fidelity: 0.9164 Trace: 0.9849
Rep: 252 Cost: 0.3071 Fidelity: 0.9167 Trace: 0.9847
Rep: 253 Cost: 0.3033 Fidelity: 0.9181 Trace: 0.9846
Rep: 254 Cost: 0.3009 Fidelity: 0.9189 Trace: 0.9844
Rep: 255 Cost: 0.2994 Fidelity: 0.9192 Trace: 0.9841
Rep: 256 Cost: 0.2949 Fidelity: 0.9205 Trace: 0.9841
Rep: 257 Cost: 0.2969 Fidelity: 0.9221 Trace: 0.9840
Rep: 258 Cost: 0.2946 Fidelity: 0.9231 Trace: 0.9839
Rep: 259 Cost: 0.2894 Fidelity: 0.9233 Trace: 0.9836
Rep: 260 Cost: 0.2903 Fidelity: 0.9231 Trace: 0.9833
Rep: 261 Cost: 0.2848 Fidelity: 0.9242 Trace: 0.9832
Rep: 262 Cost: 0.2839 Fidelity: 0.9261 Trace: 0.9833
Rep: 263 Cost: 0.2826 Fidelity: 0.9273 Trace: 0.9832
Rep: 264 Cost: 0.2779 Fidelity: 0.9277 Trace: 0.9829
Rep: 265 Cost: 0.2762 Fidelity: 0.9275 Trace: 0.9825
Rep: 266 Cost: 0.2734 Fidelity: 0.9282 Trace: 0.9824
Rep: 267 Cost: 0.2698 Fidelity: 0.9297 Trace: 0.9824
Rep: 268 Cost: 0.2679 Fidelity: 0.9307 Trace: 0.9823
Rep: 269 Cost: 0.2647 Fidelity: 0.9310 Trace: 0.9821
Rep: 270 Cost: 0.2635 Fidelity: 0.9310 Trace: 0.9818
Rep: 271 Cost: 0.2590 Fidelity: 0.9324 Trace: 0.9817
Rep: 272 Cost: 0.2585 Fidelity: 0.9338 Trace: 0.9817
Rep: 273 Cost: 0.2552 Fidelity: 0.9344 Trace: 0.9815
Rep: 274 Cost: 0.2530 Fidelity: 0.9344 Trace: 0.9812
Rep: 275 Cost: 0.2494 Fidelity: 0.9354 Trace: 0.9811
Rep: 276 Cost: 0.2480 Fidelity: 0.9367 Trace: 0.9811
Rep: 277 Cost: 0.2451 Fidelity: 0.9374 Trace: 0.9809
Rep: 278 Cost: 0.2426 Fidelity: 0.9374 Trace: 0.9806
Rep: 279 Cost: 0.2394 Fidelity: 0.9383 Trace: 0.9806
Rep: 280 Cost: 0.2390 Fidelity: 0.9399 Trace: 0.9807
Rep: 281 Cost: 0.2376 Fidelity: 0.9409 Trace: 0.9806
Rep: 282 Cost: 0.2331 Fidelity: 0.9413 Trace: 0.9803
Rep: 283 Cost: 0.2297 Fidelity: 0.9412 Trace: 0.9800
Rep: 284 Cost: 0.2261 Fidelity: 0.9421 Trace: 0.9799
Rep: 285 Cost: 0.2242 Fidelity: 0.9435 Trace: 0.9799
Rep: 286 Cost: 0.2214 Fidelity: 0.9442 Trace: 0.9798
Rep: 287 Cost: 0.2176 Fidelity: 0.9444 Trace: 0.9795
Rep: 288 Cost: 0.2165 Fidelity: 0.9452 Trace: 0.9794
Rep: 289 Cost: 0.2122 Fidelity: 0.9463 Trace: 0.9794
Rep: 290 Cost: 0.2102 Fidelity: 0.9469 Trace: 0.9792
Rep: 291 Cost: 0.2069 Fidelity: 0.9476 Trace: 0.9790
Rep: 292 Cost: 0.2039 Fidelity: 0.9486 Trace: 0.9790
Rep: 293 Cost: 0.2006 Fidelity: 0.9492 Trace: 0.9788
Rep: 294 Cost: 0.1981 Fidelity: 0.9497 Trace: 0.9787
Rep: 295 Cost: 0.1949 Fidelity: 0.9506 Trace: 0.9786
Rep: 296 Cost: 0.1916 Fidelity: 0.9514 Trace: 0.9785
Rep: 297 Cost: 0.1908 Fidelity: 0.9516 Trace: 0.9783
Rep: 298 Cost: 0.1865 Fidelity: 0.9529 Trace: 0.9783
Rep: 299 Cost: 0.1832 Fidelity: 0.9536 Trace: 0.9782
Rep: 300 Cost: 0.1801 Fidelity: 0.9541 Trace: 0.9780
Rep: 301 Cost: 0.1768 Fidelity: 0.9549 Trace: 0.9779
Rep: 302 Cost: 0.1735 Fidelity: 0.9559 Trace: 0.9779
Rep: 303 Cost: 0.1698 Fidelity: 0.9565 Trace: 0.9778
Rep: 304 Cost: 0.1674 Fidelity: 0.9571 Trace: 0.9776
Rep: 305 Cost: 0.1649 Fidelity: 0.9582 Trace: 0.9777
Rep: 306 Cost: 0.1607 Fidelity: 0.9589 Trace: 0.9776
Rep: 307 Cost: 0.1586 Fidelity: 0.9592 Trace: 0.9773
Rep: 308 Cost: 0.1543 Fidelity: 0.9599 Trace: 0.9772
Rep: 309 Cost: 0.1511 Fidelity: 0.9610 Trace: 0.9773
Rep: 310 Cost: 0.1474 Fidelity: 0.9617 Trace: 0.9772
Rep: 311 Cost: 0.1438 Fidelity: 0.9619 Trace: 0.9769
Rep: 312 Cost: 0.1425 Fidelity: 0.9626 Trace: 0.9769
Rep: 313 Cost: 0.1373 Fidelity: 0.9637 Trace: 0.9769
Rep: 314 Cost: 0.1351 Fidelity: 0.9643 Trace: 0.9768
Rep: 315 Cost: 0.1293 Fidelity: 0.9647 Trace: 0.9766
Rep: 316 Cost: 0.1298 Fidelity: 0.9652 Trace: 0.9765
Rep: 317 Cost: 0.1235 Fidelity: 0.9660 Trace: 0.9765
Rep: 318 Cost: 0.1221 Fidelity: 0.9666 Trace: 0.9764
Rep: 319 Cost: 0.1184 Fidelity: 0.9672 Trace: 0.9764
Rep: 320 Cost: 0.1123 Fidelity: 0.9678 Trace: 0.9763
Rep: 321 Cost: 0.1113 Fidelity: 0.9681 Trace: 0.9762
Rep: 322 Cost: 0.1045 Fidelity: 0.9687 Trace: 0.9761
Rep: 323 Cost: 0.1030 Fidelity: 0.9693 Trace: 0.9761
Rep: 324 Cost: 0.0982 Fidelity: 0.9698 Trace: 0.9760
Rep: 325 Cost: 0.0953 Fidelity: 0.9699 Trace: 0.9757
Rep: 326 Cost: 0.0905 Fidelity: 0.9702 Trace: 0.9755
Rep: 327 Cost: 0.0863 Fidelity: 0.9707 Trace: 0.9755
Rep: 328 Cost: 0.0848 Fidelity: 0.9712 Trace: 0.9755
Rep: 329 Cost: 0.0794 Fidelity: 0.9715 Trace: 0.9754
Rep: 330 Cost: 0.0789 Fidelity: 0.9714 Trace: 0.9750
Rep: 331 Cost: 0.0732 Fidelity: 0.9718 Trace: 0.9749
Rep: 332 Cost: 0.0715 Fidelity: 0.9724 Trace: 0.9751
Rep: 333 Cost: 0.0683 Fidelity: 0.9728 Trace: 0.9750
Rep: 334 Cost: 0.0626 Fidelity: 0.9728 Trace: 0.9747
Rep: 335 Cost: 0.0599 Fidelity: 0.9727 Trace: 0.9744
Rep: 336 Cost: 0.0561 Fidelity: 0.9728 Trace: 0.9742
Rep: 337 Cost: 0.0525 Fidelity: 0.9730 Trace: 0.9741
Rep: 338 Cost: 0.0473 Fidelity: 0.9732 Trace: 0.9741
Rep: 339 Cost: 0.0452 Fidelity: 0.9734 Trace: 0.9741
Rep: 340 Cost: 0.0402 Fidelity: 0.9733 Trace: 0.9738
Rep: 341 Cost: 0.0376 Fidelity: 0.9730 Trace: 0.9734
Rep: 342 Cost: 0.0338 Fidelity: 0.9728 Trace: 0.9731
Rep: 343 Cost: 0.0287 Fidelity: 0.9729 Trace: 0.9731
Rep: 344 Cost: 0.0281 Fidelity: 0.9729 Trace: 0.9730
Rep: 345 Cost: 0.0231 Fidelity: 0.9727 Trace: 0.9727
Rep: 346 Cost: 0.0213 Fidelity: 0.9722 Trace: 0.9722
Rep: 347 Cost: 0.0231 Fidelity: 0.9720 Trace: 0.9720
Rep: 348 Cost: 0.0242 Fidelity: 0.9719 Trace: 0.9719
Rep: 349 Cost: 0.0253 Fidelity: 0.9720 Trace: 0.9720
Rep: 350 Cost: 0.0274 Fidelity: 0.9720 Trace: 0.9720
Rep: 351 Cost: 0.0303 Fidelity: 0.9717 Trace: 0.9718
Rep: 352 Cost: 0.0281 Fidelity: 0.9714 Trace: 0.9715
Rep: 353 Cost: 0.0307 Fidelity: 0.9712 Trace: 0.9713
Rep: 354 Cost: 0.0284 Fidelity: 0.9715 Trace: 0.9715
Rep: 355 Cost: 0.0279 Fidelity: 0.9718 Trace: 0.9718
Rep: 356 Cost: 0.0277 Fidelity: 0.9719 Trace: 0.9720
Rep: 357 Cost: 0.0238 Fidelity: 0.9719 Trace: 0.9719
Rep: 358 Cost: 0.0244 Fidelity: 0.9719 Trace: 0.9719
Rep: 359 Cost: 0.0184 Fidelity: 0.9722 Trace: 0.9722
Rep: 360 Cost: 0.0196 Fidelity: 0.9726 Trace: 0.9726
Rep: 361 Cost: 0.0173 Fidelity: 0.9726 Trace: 0.9726
Rep: 362 Cost: 0.0192 Fidelity: 0.9726 Trace: 0.9726
Rep: 363 Cost: 0.0206 Fidelity: 0.9728 Trace: 0.9728
Rep: 364 Cost: 0.0190 Fidelity: 0.9728 Trace: 0.9728
Rep: 365 Cost: 0.0208 Fidelity: 0.9728 Trace: 0.9728
Rep: 366 Cost: 0.0190 Fidelity: 0.9728 Trace: 0.9728
Rep: 367 Cost: 0.0195 Fidelity: 0.9727 Trace: 0.9727
Rep: 368 Cost: 0.0198 Fidelity: 0.9726 Trace: 0.9726
Rep: 369 Cost: 0.0166 Fidelity: 0.9727 Trace: 0.9727
Rep: 370 Cost: 0.0160 Fidelity: 0.9727 Trace: 0.9727
Rep: 371 Cost: 0.0246 Fidelity: 0.9724 Trace: 0.9725
Rep: 372 Cost: 0.0203 Fidelity: 0.9726 Trace: 0.9726
Rep: 373 Cost: 0.0257 Fidelity: 0.9728 Trace: 0.9728
Rep: 374 Cost: 0.0239 Fidelity: 0.9727 Trace: 0.9727
Rep: 375 Cost: 0.0215 Fidelity: 0.9724 Trace: 0.9724
Rep: 376 Cost: 0.0212 Fidelity: 0.9725 Trace: 0.9725
Rep: 377 Cost: 0.0219 Fidelity: 0.9728 Trace: 0.9728
Rep: 378 Cost: 0.0208 Fidelity: 0.9728 Trace: 0.9728
Rep: 379 Cost: 0.0221 Fidelity: 0.9726 Trace: 0.9726
Rep: 380 Cost: 0.0186 Fidelity: 0.9728 Trace: 0.9728
Rep: 381 Cost: 0.0270 Fidelity: 0.9732 Trace: 0.9732
Rep: 382 Cost: 0.0276 Fidelity: 0.9732 Trace: 0.9733
Rep: 383 Cost: 0.0172 Fidelity: 0.9730 Trace: 0.9730
Rep: 384 Cost: 0.0277 Fidelity: 0.9726 Trace: 0.9727
Rep: 385 Cost: 0.0221 Fidelity: 0.9728 Trace: 0.9728
Rep: 386 Cost: 0.0250 Fidelity: 0.9732 Trace: 0.9733
Rep: 387 Cost: 0.0282 Fidelity: 0.9733 Trace: 0.9733
Rep: 388 Cost: 0.0176 Fidelity: 0.9731 Trace: 0.9731
Rep: 389 Cost: 0.0284 Fidelity: 0.9726 Trace: 0.9726
Rep: 390 Cost: 0.0284 Fidelity: 0.9725 Trace: 0.9726
Rep: 391 Cost: 0.0187 Fidelity: 0.9729 Trace: 0.9729
Rep: 392 Cost: 0.0295 Fidelity: 0.9734 Trace: 0.9734
Rep: 393 Cost: 0.0289 Fidelity: 0.9734 Trace: 0.9735
Rep: 394 Cost: 0.0217 Fidelity: 0.9732 Trace: 0.9732
Rep: 395 Cost: 0.0268 Fidelity: 0.9727 Trace: 0.9727
Rep: 396 Cost: 0.0253 Fidelity: 0.9727 Trace: 0.9727
Rep: 397 Cost: 0.0238 Fidelity: 0.9730 Trace: 0.9731
Rep: 398 Cost: 0.0231 Fidelity: 0.9733 Trace: 0.9733
Rep: 399 Cost: 0.0207 Fidelity: 0.9733 Trace: 0.9733
Rep: 400 Cost: 0.0259 Fidelity: 0.9730 Trace: 0.9731
Rep: 401 Cost: 0.0206 Fidelity: 0.9731 Trace: 0.9731
Rep: 402 Cost: 0.0240 Fidelity: 0.9733 Trace: 0.9734
Rep: 403 Cost: 0.0262 Fidelity: 0.9734 Trace: 0.9734
Rep: 404 Cost: 0.0154 Fidelity: 0.9733 Trace: 0.9733
Rep: 405 Cost: 0.0271 Fidelity: 0.9729 Trace: 0.9730
Rep: 406 Cost: 0.0267 Fidelity: 0.9729 Trace: 0.9730
Rep: 407 Cost: 0.0228 Fidelity: 0.9732 Trace: 0.9732
Rep: 408 Cost: 0.0274 Fidelity: 0.9735 Trace: 0.9735
Rep: 409 Cost: 0.0237 Fidelity: 0.9735 Trace: 0.9735
Rep: 410 Cost: 0.0207 Fidelity: 0.9732 Trace: 0.9732
Rep: 411 Cost: 0.0242 Fidelity: 0.9733 Trace: 0.9733
Rep: 412 Cost: 0.0198 Fidelity: 0.9736 Trace: 0.9736
Rep: 413 Cost: 0.0211 Fidelity: 0.9736 Trace: 0.9736
Rep: 414 Cost: 0.0191 Fidelity: 0.9734 Trace: 0.9734
Rep: 415 Cost: 0.0211 Fidelity: 0.9731 Trace: 0.9731
Rep: 416 Cost: 0.0210 Fidelity: 0.9732 Trace: 0.9732
Rep: 417 Cost: 0.0175 Fidelity: 0.9735 Trace: 0.9735
Rep: 418 Cost: 0.0182 Fidelity: 0.9735 Trace: 0.9735
Rep: 419 Cost: 0.0208 Fidelity: 0.9733 Trace: 0.9733
Rep: 420 Cost: 0.0189 Fidelity: 0.9734 Trace: 0.9734
Rep: 421 Cost: 0.0200 Fidelity: 0.9737 Trace: 0.9737
Rep: 422 Cost: 0.0224 Fidelity: 0.9737 Trace: 0.9738
Rep: 423 Cost: 0.0161 Fidelity: 0.9735 Trace: 0.9735
Rep: 424 Cost: 0.0205 Fidelity: 0.9733 Trace: 0.9733
Rep: 425 Cost: 0.0173 Fidelity: 0.9735 Trace: 0.9735
Rep: 426 Cost: 0.0193 Fidelity: 0.9737 Trace: 0.9737
Rep: 427 Cost: 0.0183 Fidelity: 0.9737 Trace: 0.9737
Rep: 428 Cost: 0.0187 Fidelity: 0.9734 Trace: 0.9734
Rep: 429 Cost: 0.0205 Fidelity: 0.9734 Trace: 0.9734
Rep: 430 Cost: 0.0169 Fidelity: 0.9738 Trace: 0.9738
Rep: 431 Cost: 0.0199 Fidelity: 0.9739 Trace: 0.9739
Rep: 432 Cost: 0.0192 Fidelity: 0.9738 Trace: 0.9738
Rep: 433 Cost: 0.0158 Fidelity: 0.9737 Trace: 0.9737
Rep: 434 Cost: 0.0214 Fidelity: 0.9738 Trace: 0.9738
Rep: 435 Cost: 0.0183 Fidelity: 0.9737 Trace: 0.9738
Rep: 436 Cost: 0.0224 Fidelity: 0.9735 Trace: 0.9735
Rep: 437 Cost: 0.0226 Fidelity: 0.9735 Trace: 0.9735
Rep: 438 Cost: 0.0171 Fidelity: 0.9737 Trace: 0.9737
Rep: 439 Cost: 0.0225 Fidelity: 0.9740 Trace: 0.9740
Rep: 440 Cost: 0.0192 Fidelity: 0.9739 Trace: 0.9739
Rep: 441 Cost: 0.0207 Fidelity: 0.9737 Trace: 0.9737
Rep: 442 Cost: 0.0193 Fidelity: 0.9737 Trace: 0.9737
Rep: 443 Cost: 0.0197 Fidelity: 0.9740 Trace: 0.9740
Rep: 444 Cost: 0.0202 Fidelity: 0.9740 Trace: 0.9740
Rep: 445 Cost: 0.0180 Fidelity: 0.9738 Trace: 0.9738
Rep: 446 Cost: 0.0196 Fidelity: 0.9739 Trace: 0.9739
Rep: 447 Cost: 0.0187 Fidelity: 0.9741 Trace: 0.9741
Rep: 448 Cost: 0.0196 Fidelity: 0.9741 Trace: 0.9742
Rep: 449 Cost: 0.0159 Fidelity: 0.9739 Trace: 0.9739
Rep: 450 Cost: 0.0229 Fidelity: 0.9737 Trace: 0.9738
Rep: 451 Cost: 0.0220 Fidelity: 0.9739 Trace: 0.9739
Rep: 452 Cost: 0.0193 Fidelity: 0.9741 Trace: 0.9741
Rep: 453 Cost: 0.0203 Fidelity: 0.9741 Trace: 0.9741
Rep: 454 Cost: 0.0182 Fidelity: 0.9740 Trace: 0.9740
Rep: 455 Cost: 0.0196 Fidelity: 0.9740 Trace: 0.9740
Rep: 456 Cost: 0.0196 Fidelity: 0.9743 Trace: 0.9743
Rep: 457 Cost: 0.0183 Fidelity: 0.9743 Trace: 0.9743
Rep: 458 Cost: 0.0177 Fidelity: 0.9740 Trace: 0.9740
Rep: 459 Cost: 0.0184 Fidelity: 0.9739 Trace: 0.9739
Rep: 460 Cost: 0.0161 Fidelity: 0.9742 Trace: 0.9742
Rep: 461 Cost: 0.0197 Fidelity: 0.9744 Trace: 0.9744
Rep: 462 Cost: 0.0176 Fidelity: 0.9743 Trace: 0.9743
Rep: 463 Cost: 0.0186 Fidelity: 0.9740 Trace: 0.9741
Rep: 464 Cost: 0.0156 Fidelity: 0.9741 Trace: 0.9741
Rep: 465 Cost: 0.0216 Fidelity: 0.9744 Trace: 0.9745
Rep: 466 Cost: 0.0206 Fidelity: 0.9745 Trace: 0.9745
Rep: 467 Cost: 0.0163 Fidelity: 0.9742 Trace: 0.9742
Rep: 468 Cost: 0.0184 Fidelity: 0.9741 Trace: 0.9741
Rep: 469 Cost: 0.0164 Fidelity: 0.9743 Trace: 0.9743
Rep: 470 Cost: 0.0170 Fidelity: 0.9743 Trace: 0.9743
Rep: 471 Cost: 0.0156 Fidelity: 0.9741 Trace: 0.9741
Rep: 472 Cost: 0.0163 Fidelity: 0.9742 Trace: 0.9742
Rep: 473 Cost: 0.0154 Fidelity: 0.9742 Trace: 0.9742
Rep: 474 Cost: 0.0156 Fidelity: 0.9743 Trace: 0.9743
Rep: 475 Cost: 0.0158 Fidelity: 0.9742 Trace: 0.9742
Rep: 476 Cost: 0.0184 Fidelity: 0.9743 Trace: 0.9743
Rep: 477 Cost: 0.0179 Fidelity: 0.9744 Trace: 0.9744
Rep: 478 Cost: 0.0176 Fidelity: 0.9744 Trace: 0.9744
Rep: 479 Cost: 0.0197 Fidelity: 0.9745 Trace: 0.9745
Rep: 480 Cost: 0.0185 Fidelity: 0.9747 Trace: 0.9747
Rep: 481 Cost: 0.0213 Fidelity: 0.9746 Trace: 0.9746
Rep: 482 Cost: 0.0185 Fidelity: 0.9745 Trace: 0.9745
Rep: 483 Cost: 0.0190 Fidelity: 0.9744 Trace: 0.9744
Rep: 484 Cost: 0.0205 Fidelity: 0.9744 Trace: 0.9744
Rep: 485 Cost: 0.0169 Fidelity: 0.9745 Trace: 0.9745
Rep: 486 Cost: 0.0174 Fidelity: 0.9744 Trace: 0.9744
Rep: 487 Cost: 0.0169 Fidelity: 0.9744 Trace: 0.9744
Rep: 488 Cost: 0.0175 Fidelity: 0.9746 Trace: 0.9746
Rep: 489 Cost: 0.0169 Fidelity: 0.9746 Trace: 0.9746
Rep: 490 Cost: 0.0164 Fidelity: 0.9744 Trace: 0.9744
Rep: 491 Cost: 0.0160 Fidelity: 0.9744 Trace: 0.9744
Rep: 492 Cost: 0.0167 Fidelity: 0.9746 Trace: 0.9746
Rep: 493 Cost: 0.0160 Fidelity: 0.9746 Trace: 0.9746
Rep: 494 Cost: 0.0170 Fidelity: 0.9744 Trace: 0.9744
Rep: 495 Cost: 0.0158 Fidelity: 0.9745 Trace: 0.9745
Rep: 496 Cost: 0.0171 Fidelity: 0.9746 Trace: 0.9746
Rep: 497 Cost: 0.0163 Fidelity: 0.9746 Trace: 0.9746
Rep: 498 Cost: 0.0169 Fidelity: 0.9746 Trace: 0.9746
Rep: 499 Cost: 0.0165 Fidelity: 0.9746 Trace: 0.9746
Rep: 500 Cost: 0.0162 Fidelity: 0.9747 Trace: 0.9747
Rep: 501 Cost: 0.0169 Fidelity: 0.9748 Trace: 0.9748
Rep: 502 Cost: 0.0151 Fidelity: 0.9747 Trace: 0.9747
Rep: 503 Cost: 0.0149 Fidelity: 0.9747 Trace: 0.9747
Rep: 504 Cost: 0.0177 Fidelity: 0.9747 Trace: 0.9747
Rep: 505 Cost: 0.0169 Fidelity: 0.9748 Trace: 0.9748
Rep: 506 Cost: 0.0159 Fidelity: 0.9747 Trace: 0.9747
Rep: 507 Cost: 0.0172 Fidelity: 0.9746 Trace: 0.9746
Rep: 508 Cost: 0.0161 Fidelity: 0.9747 Trace: 0.9747
Rep: 509 Cost: 0.0180 Fidelity: 0.9749 Trace: 0.9749
Rep: 510 Cost: 0.0149 Fidelity: 0.9748 Trace: 0.9748
Rep: 511 Cost: 0.0218 Fidelity: 0.9745 Trace: 0.9745
Rep: 512 Cost: 0.0216 Fidelity: 0.9745 Trace: 0.9746
Rep: 513 Cost: 0.0158 Fidelity: 0.9748 Trace: 0.9748
Rep: 514 Cost: 0.0176 Fidelity: 0.9750 Trace: 0.9750
Rep: 515 Cost: 0.0151 Fidelity: 0.9749 Trace: 0.9749
Rep: 516 Cost: 0.0193 Fidelity: 0.9747 Trace: 0.9747
Rep: 517 Cost: 0.0169 Fidelity: 0.9747 Trace: 0.9747
Rep: 518 Cost: 0.0180 Fidelity: 0.9750 Trace: 0.9750
Rep: 519 Cost: 0.0172 Fidelity: 0.9750 Trace: 0.9750
Rep: 520 Cost: 0.0167 Fidelity: 0.9748 Trace: 0.9748
Rep: 521 Cost: 0.0176 Fidelity: 0.9748 Trace: 0.9748
Rep: 522 Cost: 0.0174 Fidelity: 0.9750 Trace: 0.9750
Rep: 523 Cost: 0.0159 Fidelity: 0.9750 Trace: 0.9750
Rep: 524 Cost: 0.0186 Fidelity: 0.9749 Trace: 0.9749
Rep: 525 Cost: 0.0148 Fidelity: 0.9749 Trace: 0.9749
Rep: 526 Cost: 0.0173 Fidelity: 0.9751 Trace: 0.9751
Rep: 527 Cost: 0.0155 Fidelity: 0.9751 Trace: 0.9751
Rep: 528 Cost: 0.0182 Fidelity: 0.9749 Trace: 0.9749
Rep: 529 Cost: 0.0158 Fidelity: 0.9750 Trace: 0.9750
Rep: 530 Cost: 0.0196 Fidelity: 0.9753 Trace: 0.9753
Rep: 531 Cost: 0.0191 Fidelity: 0.9753 Trace: 0.9753
Rep: 532 Cost: 0.0163 Fidelity: 0.9751 Trace: 0.9751
Rep: 533 Cost: 0.0175 Fidelity: 0.9749 Trace: 0.9750
Rep: 534 Cost: 0.0173 Fidelity: 0.9751 Trace: 0.9751
Rep: 535 Cost: 0.0162 Fidelity: 0.9752 Trace: 0.9752
Rep: 536 Cost: 0.0179 Fidelity: 0.9751 Trace: 0.9751
Rep: 537 Cost: 0.0153 Fidelity: 0.9750 Trace: 0.9750
Rep: 538 Cost: 0.0191 Fidelity: 0.9753 Trace: 0.9753
Rep: 539 Cost: 0.0178 Fidelity: 0.9753 Trace: 0.9753
Rep: 540 Cost: 0.0175 Fidelity: 0.9751 Trace: 0.9751
Rep: 541 Cost: 0.0172 Fidelity: 0.9751 Trace: 0.9751
Rep: 542 Cost: 0.0173 Fidelity: 0.9753 Trace: 0.9753
Rep: 543 Cost: 0.0167 Fidelity: 0.9753 Trace: 0.9753
Rep: 544 Cost: 0.0174 Fidelity: 0.9752 Trace: 0.9752
Rep: 545 Cost: 0.0167 Fidelity: 0.9753 Trace: 0.9753
Rep: 546 Cost: 0.0176 Fidelity: 0.9754 Trace: 0.9754
Rep: 547 Cost: 0.0176 Fidelity: 0.9754 Trace: 0.9754
Rep: 548 Cost: 0.0163 Fidelity: 0.9752 Trace: 0.9752
Rep: 549 Cost: 0.0157 Fidelity: 0.9753 Trace: 0.9753
Rep: 550 Cost: 0.0173 Fidelity: 0.9754 Trace: 0.9754
Rep: 551 Cost: 0.0157 Fidelity: 0.9754 Trace: 0.9754
Rep: 552 Cost: 0.0184 Fidelity: 0.9753 Trace: 0.9753
Rep: 553 Cost: 0.0169 Fidelity: 0.9754 Trace: 0.9754
Rep: 554 Cost: 0.0185 Fidelity: 0.9756 Trace: 0.9756
Rep: 555 Cost: 0.0186 Fidelity: 0.9755 Trace: 0.9755
Rep: 556 Cost: 0.0157 Fidelity: 0.9753 Trace: 0.9753
Rep: 557 Cost: 0.0158 Fidelity: 0.9754 Trace: 0.9754
Rep: 558 Cost: 0.0175 Fidelity: 0.9756 Trace: 0.9756
Rep: 559 Cost: 0.0167 Fidelity: 0.9755 Trace: 0.9755
Rep: 560 Cost: 0.0172 Fidelity: 0.9753 Trace: 0.9753
Rep: 561 Cost: 0.0162 Fidelity: 0.9754 Trace: 0.9754
Rep: 562 Cost: 0.0182 Fidelity: 0.9756 Trace: 0.9756
Rep: 563 Cost: 0.0181 Fidelity: 0.9756 Trace: 0.9756
Rep: 564 Cost: 0.0155 Fidelity: 0.9754 Trace: 0.9754
Rep: 565 Cost: 0.0148 Fidelity: 0.9754 Trace: 0.9754
Rep: 566 Cost: 0.0188 Fidelity: 0.9756 Trace: 0.9757
Rep: 567 Cost: 0.0180 Fidelity: 0.9757 Trace: 0.9757
Rep: 568 Cost: 0.0161 Fidelity: 0.9755 Trace: 0.9755
Rep: 569 Cost: 0.0157 Fidelity: 0.9754 Trace: 0.9755
Rep: 570 Cost: 0.0179 Fidelity: 0.9757 Trace: 0.9757
Rep: 571 Cost: 0.0173 Fidelity: 0.9757 Trace: 0.9757
Rep: 572 Cost: 0.0163 Fidelity: 0.9755 Trace: 0.9755
Rep: 573 Cost: 0.0152 Fidelity: 0.9755 Trace: 0.9755
Rep: 574 Cost: 0.0189 Fidelity: 0.9758 Trace: 0.9758
Rep: 575 Cost: 0.0184 Fidelity: 0.9758 Trace: 0.9758
Rep: 576 Cost: 0.0153 Fidelity: 0.9756 Trace: 0.9756
Rep: 577 Cost: 0.0152 Fidelity: 0.9756 Trace: 0.9756
Rep: 578 Cost: 0.0178 Fidelity: 0.9758 Trace: 0.9758
Rep: 579 Cost: 0.0170 Fidelity: 0.9758 Trace: 0.9758
Rep: 580 Cost: 0.0166 Fidelity: 0.9756 Trace: 0.9756
Rep: 581 Cost: 0.0157 Fidelity: 0.9757 Trace: 0.9757
Rep: 582 Cost: 0.0178 Fidelity: 0.9759 Trace: 0.9759
Rep: 583 Cost: 0.0174 Fidelity: 0.9758 Trace: 0.9758
Rep: 584 Cost: 0.0159 Fidelity: 0.9757 Trace: 0.9757
Rep: 585 Cost: 0.0154 Fidelity: 0.9757 Trace: 0.9757
Rep: 586 Cost: 0.0168 Fidelity: 0.9759 Trace: 0.9759
Rep: 587 Cost: 0.0155 Fidelity: 0.9759 Trace: 0.9759
Rep: 588 Cost: 0.0183 Fidelity: 0.9757 Trace: 0.9757
Rep: 589 Cost: 0.0177 Fidelity: 0.9757 Trace: 0.9757
Rep: 590 Cost: 0.0162 Fidelity: 0.9759 Trace: 0.9759
Rep: 591 Cost: 0.0173 Fidelity: 0.9759 Trace: 0.9759
Rep: 592 Cost: 0.0150 Fidelity: 0.9758 Trace: 0.9758
Rep: 593 Cost: 0.0154 Fidelity: 0.9759 Trace: 0.9759
Rep: 594 Cost: 0.0137 Fidelity: 0.9759 Trace: 0.9759
Rep: 595 Cost: 0.0170 Fidelity: 0.9757 Trace: 0.9757
Rep: 596 Cost: 0.0148 Fidelity: 0.9757 Trace: 0.9757
Rep: 597 Cost: 0.0186 Fidelity: 0.9760 Trace: 0.9760
Rep: 598 Cost: 0.0194 Fidelity: 0.9760 Trace: 0.9760
Rep: 599 Cost: 0.0151 Fidelity: 0.9758 Trace: 0.9758
Rep: 600 Cost: 0.0187 Fidelity: 0.9757 Trace: 0.9757
Rep: 601 Cost: 0.0197 Fidelity: 0.9757 Trace: 0.9757
Rep: 602 Cost: 0.0158 Fidelity: 0.9759 Trace: 0.9759
Rep: 603 Cost: 0.0184 Fidelity: 0.9760 Trace: 0.9760
Rep: 604 Cost: 0.0204 Fidelity: 0.9759 Trace: 0.9759
Rep: 605 Cost: 0.0166 Fidelity: 0.9759 Trace: 0.9759
Rep: 606 Cost: 0.0179 Fidelity: 0.9759 Trace: 0.9759
Rep: 607 Cost: 0.0195 Fidelity: 0.9759 Trace: 0.9759
Rep: 608 Cost: 0.0158 Fidelity: 0.9759 Trace: 0.9759
Rep: 609 Cost: 0.0183 Fidelity: 0.9760 Trace: 0.9760
Rep: 610 Cost: 0.0193 Fidelity: 0.9761 Trace: 0.9761
Rep: 611 Cost: 0.0164 Fidelity: 0.9760 Trace: 0.9760
Rep: 612 Cost: 0.0176 Fidelity: 0.9759 Trace: 0.9759
Rep: 613 Cost: 0.0181 Fidelity: 0.9759 Trace: 0.9759
Rep: 614 Cost: 0.0151 Fidelity: 0.9761 Trace: 0.9761
Rep: 615 Cost: 0.0176 Fidelity: 0.9762 Trace: 0.9762
Rep: 616 Cost: 0.0169 Fidelity: 0.9761 Trace: 0.9761
Rep: 617 Cost: 0.0156 Fidelity: 0.9760 Trace: 0.9760
Rep: 618 Cost: 0.0169 Fidelity: 0.9760 Trace: 0.9760
Rep: 619 Cost: 0.0150 Fidelity: 0.9761 Trace: 0.9761
Rep: 620 Cost: 0.0169 Fidelity: 0.9762 Trace: 0.9762
Rep: 621 Cost: 0.0170 Fidelity: 0.9761 Trace: 0.9761
Rep: 622 Cost: 0.0145 Fidelity: 0.9761 Trace: 0.9761
Rep: 623 Cost: 0.0178 Fidelity: 0.9761 Trace: 0.9762
Rep: 624 Cost: 0.0157 Fidelity: 0.9762 Trace: 0.9762
Rep: 625 Cost: 0.0167 Fidelity: 0.9762 Trace: 0.9762
Rep: 626 Cost: 0.0167 Fidelity: 0.9762 Trace: 0.9763
Rep: 627 Cost: 0.0141 Fidelity: 0.9762 Trace: 0.9762
Rep: 628 Cost: 0.0175 Fidelity: 0.9760 Trace: 0.9760
Rep: 629 Cost: 0.0160 Fidelity: 0.9760 Trace: 0.9760
Rep: 630 Cost: 0.0167 Fidelity: 0.9763 Trace: 0.9763
Rep: 631 Cost: 0.0173 Fidelity: 0.9764 Trace: 0.9764
Rep: 632 Cost: 0.0140 Fidelity: 0.9762 Trace: 0.9762
Rep: 633 Cost: 0.0144 Fidelity: 0.9762 Trace: 0.9762
Rep: 634 Cost: 0.0150 Fidelity: 0.9763 Trace: 0.9763
Rep: 635 Cost: 0.0142 Fidelity: 0.9763 Trace: 0.9763
Rep: 636 Cost: 0.0145 Fidelity: 0.9763 Trace: 0.9763
Rep: 637 Cost: 0.0135 Fidelity: 0.9763 Trace: 0.9763
Rep: 638 Cost: 0.0146 Fidelity: 0.9763 Trace: 0.9763
Rep: 639 Cost: 0.0134 Fidelity: 0.9763 Trace: 0.9763
Rep: 640 Cost: 0.0147 Fidelity: 0.9762 Trace: 0.9762
Rep: 641 Cost: 0.0139 Fidelity: 0.9763 Trace: 0.9763
Rep: 642 Cost: 0.0138 Fidelity: 0.9762 Trace: 0.9762
Rep: 643 Cost: 0.0147 Fidelity: 0.9763 Trace: 0.9763
Rep: 644 Cost: 0.0147 Fidelity: 0.9764 Trace: 0.9764
Rep: 645 Cost: 0.0141 Fidelity: 0.9764 Trace: 0.9764
Rep: 646 Cost: 0.0143 Fidelity: 0.9764 Trace: 0.9764
Rep: 647 Cost: 0.0168 Fidelity: 0.9765 Trace: 0.9765
Rep: 648 Cost: 0.0153 Fidelity: 0.9765 Trace: 0.9765
Rep: 649 Cost: 0.0171 Fidelity: 0.9762 Trace: 0.9762
Rep: 650 Cost: 0.0167 Fidelity: 0.9763 Trace: 0.9763
Rep: 651 Cost: 0.0157 Fidelity: 0.9765 Trace: 0.9765
Rep: 652 Cost: 0.0160 Fidelity: 0.9765 Trace: 0.9765
Rep: 653 Cost: 0.0155 Fidelity: 0.9763 Trace: 0.9763
Rep: 654 Cost: 0.0178 Fidelity: 0.9762 Trace: 0.9762
Rep: 655 Cost: 0.0164 Fidelity: 0.9763 Trace: 0.9764
Rep: 656 Cost: 0.0168 Fidelity: 0.9766 Trace: 0.9766
Rep: 657 Cost: 0.0183 Fidelity: 0.9767 Trace: 0.9767
Rep: 658 Cost: 0.0177 Fidelity: 0.9766 Trace: 0.9766
Rep: 659 Cost: 0.0170 Fidelity: 0.9764 Trace: 0.9764
Rep: 660 Cost: 0.0172 Fidelity: 0.9765 Trace: 0.9765
Rep: 661 Cost: 0.0165 Fidelity: 0.9766 Trace: 0.9766
Rep: 662 Cost: 0.0171 Fidelity: 0.9765 Trace: 0.9765
Rep: 663 Cost: 0.0174 Fidelity: 0.9765 Trace: 0.9765
Rep: 664 Cost: 0.0141 Fidelity: 0.9766 Trace: 0.9766
Rep: 665 Cost: 0.0170 Fidelity: 0.9766 Trace: 0.9766
Rep: 666 Cost: 0.0141 Fidelity: 0.9765 Trace: 0.9765
Rep: 667 Cost: 0.0157 Fidelity: 0.9766 Trace: 0.9767
Rep: 668 Cost: 0.0150 Fidelity: 0.9767 Trace: 0.9767
Rep: 669 Cost: 0.0148 Fidelity: 0.9766 Trace: 0.9766
Rep: 670 Cost: 0.0150 Fidelity: 0.9766 Trace: 0.9766
Rep: 671 Cost: 0.0148 Fidelity: 0.9767 Trace: 0.9767
Rep: 672 Cost: 0.0152 Fidelity: 0.9767 Trace: 0.9767
Rep: 673 Cost: 0.0144 Fidelity: 0.9767 Trace: 0.9767
Rep: 674 Cost: 0.0151 Fidelity: 0.9766 Trace: 0.9766
Rep: 675 Cost: 0.0137 Fidelity: 0.9767 Trace: 0.9767
Rep: 676 Cost: 0.0149 Fidelity: 0.9768 Trace: 0.9768
Rep: 677 Cost: 0.0137 Fidelity: 0.9767 Trace: 0.9767
Rep: 678 Cost: 0.0157 Fidelity: 0.9766 Trace: 0.9766
Rep: 679 Cost: 0.0136 Fidelity: 0.9767 Trace: 0.9767
Rep: 680 Cost: 0.0166 Fidelity: 0.9768 Trace: 0.9768
Rep: 681 Cost: 0.0151 Fidelity: 0.9767 Trace: 0.9767
Rep: 682 Cost: 0.0167 Fidelity: 0.9766 Trace: 0.9766
Rep: 683 Cost: 0.0140 Fidelity: 0.9767 Trace: 0.9767
Rep: 684 Cost: 0.0188 Fidelity: 0.9769 Trace: 0.9769
Rep: 685 Cost: 0.0183 Fidelity: 0.9769 Trace: 0.9769
Rep: 686 Cost: 0.0147 Fidelity: 0.9767 Trace: 0.9767
Rep: 687 Cost: 0.0180 Fidelity: 0.9767 Trace: 0.9767
Rep: 688 Cost: 0.0148 Fidelity: 0.9768 Trace: 0.9768
Rep: 689 Cost: 0.0168 Fidelity: 0.9770 Trace: 0.9770
Rep: 690 Cost: 0.0159 Fidelity: 0.9769 Trace: 0.9769
Rep: 691 Cost: 0.0165 Fidelity: 0.9767 Trace: 0.9767
Rep: 692 Cost: 0.0156 Fidelity: 0.9767 Trace: 0.9767
Rep: 693 Cost: 0.0162 Fidelity: 0.9769 Trace: 0.9769
Rep: 694 Cost: 0.0167 Fidelity: 0.9770 Trace: 0.9770
Rep: 695 Cost: 0.0154 Fidelity: 0.9769 Trace: 0.9769
Rep: 696 Cost: 0.0143 Fidelity: 0.9769 Trace: 0.9769
Rep: 697 Cost: 0.0164 Fidelity: 0.9769 Trace: 0.9769
Rep: 698 Cost: 0.0147 Fidelity: 0.9769 Trace: 0.9769
Rep: 699 Cost: 0.0163 Fidelity: 0.9768 Trace: 0.9768
Rep: 700 Cost: 0.0162 Fidelity: 0.9768 Trace: 0.9769
Rep: 701 Cost: 0.0151 Fidelity: 0.9770 Trace: 0.9770
Rep: 702 Cost: 0.0151 Fidelity: 0.9771 Trace: 0.9771
Rep: 703 Cost: 0.0145 Fidelity: 0.9771 Trace: 0.9771
Rep: 704 Cost: 0.0160 Fidelity: 0.9768 Trace: 0.9768
Rep: 705 Cost: 0.0149 Fidelity: 0.9769 Trace: 0.9769
Rep: 706 Cost: 0.0161 Fidelity: 0.9771 Trace: 0.9771
Rep: 707 Cost: 0.0157 Fidelity: 0.9771 Trace: 0.9771
Rep: 708 Cost: 0.0158 Fidelity: 0.9769 Trace: 0.9769
Rep: 709 Cost: 0.0154 Fidelity: 0.9769 Trace: 0.9769
Rep: 710 Cost: 0.0160 Fidelity: 0.9771 Trace: 0.9771
Rep: 711 Cost: 0.0143 Fidelity: 0.9771 Trace: 0.9771
Rep: 712 Cost: 0.0171 Fidelity: 0.9769 Trace: 0.9769
Rep: 713 Cost: 0.0163 Fidelity: 0.9769 Trace: 0.9769
Rep: 714 Cost: 0.0154 Fidelity: 0.9772 Trace: 0.9772
Rep: 715 Cost: 0.0156 Fidelity: 0.9772 Trace: 0.9772
Rep: 716 Cost: 0.0148 Fidelity: 0.9771 Trace: 0.9771
Rep: 717 Cost: 0.0141 Fidelity: 0.9771 Trace: 0.9771
Rep: 718 Cost: 0.0144 Fidelity: 0.9772 Trace: 0.9772
Rep: 719 Cost: 0.0150 Fidelity: 0.9771 Trace: 0.9771
Rep: 720 Cost: 0.0145 Fidelity: 0.9771 Trace: 0.9771
Rep: 721 Cost: 0.0161 Fidelity: 0.9771 Trace: 0.9771
Rep: 722 Cost: 0.0154 Fidelity: 0.9771 Trace: 0.9771
Rep: 723 Cost: 0.0156 Fidelity: 0.9771 Trace: 0.9771
Rep: 724 Cost: 0.0150 Fidelity: 0.9771 Trace: 0.9771
Rep: 725 Cost: 0.0161 Fidelity: 0.9772 Trace: 0.9772
Rep: 726 Cost: 0.0154 Fidelity: 0.9771 Trace: 0.9771
Rep: 727 Cost: 0.0160 Fidelity: 0.9770 Trace: 0.9770
Rep: 728 Cost: 0.0153 Fidelity: 0.9771 Trace: 0.9771
Rep: 729 Cost: 0.0159 Fidelity: 0.9773 Trace: 0.9773
Rep: 730 Cost: 0.0149 Fidelity: 0.9772 Trace: 0.9772
Rep: 731 Cost: 0.0162 Fidelity: 0.9771 Trace: 0.9771
Rep: 732 Cost: 0.0151 Fidelity: 0.9772 Trace: 0.9772
Rep: 733 Cost: 0.0166 Fidelity: 0.9774 Trace: 0.9774
Rep: 734 Cost: 0.0163 Fidelity: 0.9774 Trace: 0.9774
Rep: 735 Cost: 0.0151 Fidelity: 0.9772 Trace: 0.9772
Rep: 736 Cost: 0.0150 Fidelity: 0.9772 Trace: 0.9772
Rep: 737 Cost: 0.0158 Fidelity: 0.9774 Trace: 0.9774
Rep: 738 Cost: 0.0153 Fidelity: 0.9774 Trace: 0.9774
Rep: 739 Cost: 0.0157 Fidelity: 0.9772 Trace: 0.9772
Rep: 740 Cost: 0.0150 Fidelity: 0.9772 Trace: 0.9772
Rep: 741 Cost: 0.0165 Fidelity: 0.9774 Trace: 0.9774
Rep: 742 Cost: 0.0165 Fidelity: 0.9774 Trace: 0.9774
Rep: 743 Cost: 0.0142 Fidelity: 0.9773 Trace: 0.9773
Rep: 744 Cost: 0.0150 Fidelity: 0.9772 Trace: 0.9772
Rep: 745 Cost: 0.0154 Fidelity: 0.9774 Trace: 0.9774
Rep: 746 Cost: 0.0146 Fidelity: 0.9775 Trace: 0.9775
Rep: 747 Cost: 0.0139 Fidelity: 0.9774 Trace: 0.9774
Rep: 748 Cost: 0.0135 Fidelity: 0.9774 Trace: 0.9774
Rep: 749 Cost: 0.0127 Fidelity: 0.9774 Trace: 0.9774
Rep: 750 Cost: 0.0150 Fidelity: 0.9773 Trace: 0.9773
Rep: 751 Cost: 0.0135 Fidelity: 0.9773 Trace: 0.9773
Rep: 752 Cost: 0.0160 Fidelity: 0.9775 Trace: 0.9775
Rep: 753 Cost: 0.0144 Fidelity: 0.9775 Trace: 0.9775
Rep: 754 Cost: 0.0167 Fidelity: 0.9773 Trace: 0.9773
Rep: 755 Cost: 0.0146 Fidelity: 0.9774 Trace: 0.9774
Rep: 756 Cost: 0.0169 Fidelity: 0.9775 Trace: 0.9775
Rep: 757 Cost: 0.0154 Fidelity: 0.9776 Trace: 0.9776
Rep: 758 Cost: 0.0165 Fidelity: 0.9774 Trace: 0.9774
Rep: 759 Cost: 0.0169 Fidelity: 0.9774 Trace: 0.9775
Rep: 760 Cost: 0.0139 Fidelity: 0.9776 Trace: 0.9776
Rep: 761 Cost: 0.0142 Fidelity: 0.9776 Trace: 0.9776
Rep: 762 Cost: 0.0151 Fidelity: 0.9775 Trace: 0.9775
Rep: 763 Cost: 0.0137 Fidelity: 0.9775 Trace: 0.9775
Rep: 764 Cost: 0.0165 Fidelity: 0.9776 Trace: 0.9777
Rep: 765 Cost: 0.0150 Fidelity: 0.9776 Trace: 0.9776
Rep: 766 Cost: 0.0166 Fidelity: 0.9775 Trace: 0.9775
Rep: 767 Cost: 0.0168 Fidelity: 0.9775 Trace: 0.9775
Rep: 768 Cost: 0.0139 Fidelity: 0.9777 Trace: 0.9777
Rep: 769 Cost: 0.0147 Fidelity: 0.9777 Trace: 0.9777
Rep: 770 Cost: 0.0151 Fidelity: 0.9775 Trace: 0.9775
Rep: 771 Cost: 0.0141 Fidelity: 0.9776 Trace: 0.9776
Rep: 772 Cost: 0.0166 Fidelity: 0.9778 Trace: 0.9778
Rep: 773 Cost: 0.0162 Fidelity: 0.9778 Trace: 0.9778
Rep: 774 Cost: 0.0151 Fidelity: 0.9776 Trace: 0.9776
Rep: 775 Cost: 0.0164 Fidelity: 0.9775 Trace: 0.9775
Rep: 776 Cost: 0.0135 Fidelity: 0.9777 Trace: 0.9777
Rep: 777 Cost: 0.0162 Fidelity: 0.9778 Trace: 0.9778
Rep: 778 Cost: 0.0126 Fidelity: 0.9777 Trace: 0.9777
Rep: 779 Cost: 0.0172 Fidelity: 0.9775 Trace: 0.9775
Rep: 780 Cost: 0.0160 Fidelity: 0.9776 Trace: 0.9776
Rep: 781 Cost: 0.0163 Fidelity: 0.9778 Trace: 0.9778
Rep: 782 Cost: 0.0165 Fidelity: 0.9778 Trace: 0.9778
Rep: 783 Cost: 0.0142 Fidelity: 0.9776 Trace: 0.9777
Rep: 784 Cost: 0.0167 Fidelity: 0.9776 Trace: 0.9776
Rep: 785 Cost: 0.0130 Fidelity: 0.9777 Trace: 0.9777
Rep: 786 Cost: 0.0172 Fidelity: 0.9779 Trace: 0.9779
Rep: 787 Cost: 0.0164 Fidelity: 0.9779 Trace: 0.9779
Rep: 788 Cost: 0.0142 Fidelity: 0.9777 Trace: 0.9778
Rep: 789 Cost: 0.0146 Fidelity: 0.9777 Trace: 0.9777
Rep: 790 Cost: 0.0148 Fidelity: 0.9778 Trace: 0.9778
Rep: 791 Cost: 0.0145 Fidelity: 0.9778 Trace: 0.9779
Rep: 792 Cost: 0.0160 Fidelity: 0.9777 Trace: 0.9777
Rep: 793 Cost: 0.0147 Fidelity: 0.9777 Trace: 0.9777
Rep: 794 Cost: 0.0150 Fidelity: 0.9779 Trace: 0.9779
Rep: 795 Cost: 0.0139 Fidelity: 0.9779 Trace: 0.9779
Rep: 796 Cost: 0.0166 Fidelity: 0.9777 Trace: 0.9777
Rep: 797 Cost: 0.0165 Fidelity: 0.9777 Trace: 0.9777
Rep: 798 Cost: 0.0135 Fidelity: 0.9778 Trace: 0.9778
Rep: 799 Cost: 0.0137 Fidelity: 0.9779 Trace: 0.9779
Rep: 800 Cost: 0.0135 Fidelity: 0.9778 Trace: 0.9778
Rep: 801 Cost: 0.0139 Fidelity: 0.9779 Trace: 0.9779
Rep: 802 Cost: 0.0126 Fidelity: 0.9779 Trace: 0.9779
Rep: 803 Cost: 0.0160 Fidelity: 0.9777 Trace: 0.9778
Rep: 804 Cost: 0.0149 Fidelity: 0.9777 Trace: 0.9777
Rep: 805 Cost: 0.0151 Fidelity: 0.9779 Trace: 0.9779
Rep: 806 Cost: 0.0133 Fidelity: 0.9779 Trace: 0.9779
Rep: 807 Cost: 0.0173 Fidelity: 0.9777 Trace: 0.9778
Rep: 808 Cost: 0.0166 Fidelity: 0.9778 Trace: 0.9778
Rep: 809 Cost: 0.0147 Fidelity: 0.9780 Trace: 0.9780
Rep: 810 Cost: 0.0172 Fidelity: 0.9781 Trace: 0.9781
Rep: 811 Cost: 0.0132 Fidelity: 0.9780 Trace: 0.9780
Rep: 812 Cost: 0.0180 Fidelity: 0.9778 Trace: 0.9778
Rep: 813 Cost: 0.0178 Fidelity: 0.9779 Trace: 0.9779
Rep: 814 Cost: 0.0143 Fidelity: 0.9781 Trace: 0.9781
Rep: 815 Cost: 0.0183 Fidelity: 0.9783 Trace: 0.9783
Rep: 816 Cost: 0.0180 Fidelity: 0.9782 Trace: 0.9782
Rep: 817 Cost: 0.0162 Fidelity: 0.9780 Trace: 0.9780
Rep: 818 Cost: 0.0167 Fidelity: 0.9778 Trace: 0.9779
Rep: 819 Cost: 0.0162 Fidelity: 0.9779 Trace: 0.9779
Rep: 820 Cost: 0.0150 Fidelity: 0.9781 Trace: 0.9781
Rep: 821 Cost: 0.0158 Fidelity: 0.9782 Trace: 0.9782
Rep: 822 Cost: 0.0154 Fidelity: 0.9780 Trace: 0.9780
Rep: 823 Cost: 0.0140 Fidelity: 0.9779 Trace: 0.9780
Rep: 824 Cost: 0.0147 Fidelity: 0.9781 Trace: 0.9781
Rep: 825 Cost: 0.0130 Fidelity: 0.9781 Trace: 0.9781
Rep: 826 Cost: 0.0137 Fidelity: 0.9781 Trace: 0.9781
Rep: 827 Cost: 0.0139 Fidelity: 0.9782 Trace: 0.9782
Rep: 828 Cost: 0.0128 Fidelity: 0.9781 Trace: 0.9781
Rep: 829 Cost: 0.0143 Fidelity: 0.9781 Trace: 0.9781
Rep: 830 Cost: 0.0146 Fidelity: 0.9781 Trace: 0.9781
Rep: 831 Cost: 0.0141 Fidelity: 0.9782 Trace: 0.9782
Rep: 832 Cost: 0.0133 Fidelity: 0.9782 Trace: 0.9782
Rep: 833 Cost: 0.0139 Fidelity: 0.9782 Trace: 0.9782
Rep: 834 Cost: 0.0135 Fidelity: 0.9782 Trace: 0.9782
Rep: 835 Cost: 0.0148 Fidelity: 0.9782 Trace: 0.9782
Rep: 836 Cost: 0.0120 Fidelity: 0.9781 Trace: 0.9781
Rep: 837 Cost: 0.0153 Fidelity: 0.9782 Trace: 0.9783
Rep: 838 Cost: 0.0149 Fidelity: 0.9783 Trace: 0.9783
Rep: 839 Cost: 0.0135 Fidelity: 0.9782 Trace: 0.9782
Rep: 840 Cost: 0.0145 Fidelity: 0.9781 Trace: 0.9781
Rep: 841 Cost: 0.0150 Fidelity: 0.9782 Trace: 0.9782
Rep: 842 Cost: 0.0142 Fidelity: 0.9783 Trace: 0.9783
Rep: 843 Cost: 0.0130 Fidelity: 0.9783 Trace: 0.9783
Rep: 844 Cost: 0.0130 Fidelity: 0.9783 Trace: 0.9783
Rep: 845 Cost: 0.0133 Fidelity: 0.9783 Trace: 0.9783
Rep: 846 Cost: 0.0132 Fidelity: 0.9782 Trace: 0.9782
Rep: 847 Cost: 0.0127 Fidelity: 0.9782 Trace: 0.9782
Rep: 848 Cost: 0.0144 Fidelity: 0.9784 Trace: 0.9784
Rep: 849 Cost: 0.0128 Fidelity: 0.9784 Trace: 0.9784
Rep: 850 Cost: 0.0146 Fidelity: 0.9782 Trace: 0.9782
Rep: 851 Cost: 0.0142 Fidelity: 0.9782 Trace: 0.9782
Rep: 852 Cost: 0.0139 Fidelity: 0.9784 Trace: 0.9784
Rep: 853 Cost: 0.0144 Fidelity: 0.9784 Trace: 0.9784
Rep: 854 Cost: 0.0145 Fidelity: 0.9783 Trace: 0.9783
Rep: 855 Cost: 0.0121 Fidelity: 0.9783 Trace: 0.9783
Rep: 856 Cost: 0.0161 Fidelity: 0.9784 Trace: 0.9784
Rep: 857 Cost: 0.0171 Fidelity: 0.9783 Trace: 0.9783
Rep: 858 Cost: 0.0143 Fidelity: 0.9783 Trace: 0.9783
Rep: 859 Cost: 0.0152 Fidelity: 0.9782 Trace: 0.9783
Rep: 860 Cost: 0.0171 Fidelity: 0.9784 Trace: 0.9784
Rep: 861 Cost: 0.0163 Fidelity: 0.9785 Trace: 0.9785
Rep: 862 Cost: 0.0132 Fidelity: 0.9785 Trace: 0.9785
Rep: 863 Cost: 0.0163 Fidelity: 0.9783 Trace: 0.9783
Rep: 864 Cost: 0.0171 Fidelity: 0.9782 Trace: 0.9782
Rep: 865 Cost: 0.0148 Fidelity: 0.9783 Trace: 0.9784
Rep: 866 Cost: 0.0143 Fidelity: 0.9786 Trace: 0.9786
Rep: 867 Cost: 0.0165 Fidelity: 0.9786 Trace: 0.9786
Rep: 868 Cost: 0.0155 Fidelity: 0.9785 Trace: 0.9785
Rep: 869 Cost: 0.0140 Fidelity: 0.9784 Trace: 0.9784
Rep: 870 Cost: 0.0146 Fidelity: 0.9785 Trace: 0.9785
Rep: 871 Cost: 0.0155 Fidelity: 0.9784 Trace: 0.9784
Rep: 872 Cost: 0.0137 Fidelity: 0.9784 Trace: 0.9784
Rep: 873 Cost: 0.0142 Fidelity: 0.9785 Trace: 0.9785
Rep: 874 Cost: 0.0153 Fidelity: 0.9785 Trace: 0.9785
Rep: 875 Cost: 0.0132 Fidelity: 0.9784 Trace: 0.9784
Rep: 876 Cost: 0.0147 Fidelity: 0.9785 Trace: 0.9786
Rep: 877 Cost: 0.0152 Fidelity: 0.9785 Trace: 0.9785
Rep: 878 Cost: 0.0131 Fidelity: 0.9785 Trace: 0.9785
Rep: 879 Cost: 0.0146 Fidelity: 0.9786 Trace: 0.9786
Rep: 880 Cost: 0.0151 Fidelity: 0.9787 Trace: 0.9787
Rep: 881 Cost: 0.0138 Fidelity: 0.9787 Trace: 0.9787
Rep: 882 Cost: 0.0135 Fidelity: 0.9786 Trace: 0.9786
Rep: 883 Cost: 0.0146 Fidelity: 0.9785 Trace: 0.9785
Rep: 884 Cost: 0.0121 Fidelity: 0.9786 Trace: 0.9786
Rep: 885 Cost: 0.0165 Fidelity: 0.9786 Trace: 0.9786
Rep: 886 Cost: 0.0169 Fidelity: 0.9787 Trace: 0.9787
Rep: 887 Cost: 0.0150 Fidelity: 0.9787 Trace: 0.9787
Rep: 888 Cost: 0.0136 Fidelity: 0.9786 Trace: 0.9786
Rep: 889 Cost: 0.0142 Fidelity: 0.9786 Trace: 0.9786
Rep: 890 Cost: 0.0143 Fidelity: 0.9786 Trace: 0.9786
Rep: 891 Cost: 0.0127 Fidelity: 0.9787 Trace: 0.9787
Rep: 892 Cost: 0.0150 Fidelity: 0.9786 Trace: 0.9786
Rep: 893 Cost: 0.0146 Fidelity: 0.9786 Trace: 0.9786
Rep: 894 Cost: 0.0126 Fidelity: 0.9787 Trace: 0.9787
Rep: 895 Cost: 0.0129 Fidelity: 0.9787 Trace: 0.9787
Rep: 896 Cost: 0.0125 Fidelity: 0.9787 Trace: 0.9787
Rep: 897 Cost: 0.0121 Fidelity: 0.9787 Trace: 0.9787
Rep: 898 Cost: 0.0120 Fidelity: 0.9787 Trace: 0.9787
Rep: 899 Cost: 0.0154 Fidelity: 0.9785 Trace: 0.9786
Rep: 900 Cost: 0.0135 Fidelity: 0.9786 Trace: 0.9786
Rep: 901 Cost: 0.0130 Fidelity: 0.9787 Trace: 0.9787
Rep: 902 Cost: 0.0156 Fidelity: 0.9788 Trace: 0.9788
Rep: 903 Cost: 0.0150 Fidelity: 0.9788 Trace: 0.9788
Rep: 904 Cost: 0.0136 Fidelity: 0.9787 Trace: 0.9787
Rep: 905 Cost: 0.0137 Fidelity: 0.9787 Trace: 0.9787
Rep: 906 Cost: 0.0165 Fidelity: 0.9788 Trace: 0.9788
Rep: 907 Cost: 0.0132 Fidelity: 0.9788 Trace: 0.9788
Rep: 908 Cost: 0.0165 Fidelity: 0.9788 Trace: 0.9788
Rep: 909 Cost: 0.0150 Fidelity: 0.9788 Trace: 0.9788
Rep: 910 Cost: 0.0146 Fidelity: 0.9790 Trace: 0.9790
Rep: 911 Cost: 0.0146 Fidelity: 0.9789 Trace: 0.9789
Rep: 912 Cost: 0.0145 Fidelity: 0.9788 Trace: 0.9788
Rep: 913 Cost: 0.0145 Fidelity: 0.9788 Trace: 0.9788
Rep: 914 Cost: 0.0144 Fidelity: 0.9789 Trace: 0.9789
Rep: 915 Cost: 0.0140 Fidelity: 0.9789 Trace: 0.9789
Rep: 916 Cost: 0.0149 Fidelity: 0.9788 Trace: 0.9788
Rep: 917 Cost: 0.0143 Fidelity: 0.9788 Trace: 0.9788
Rep: 918 Cost: 0.0147 Fidelity: 0.9789 Trace: 0.9789
Rep: 919 Cost: 0.0143 Fidelity: 0.9789 Trace: 0.9789
Rep: 920 Cost: 0.0149 Fidelity: 0.9788 Trace: 0.9788
Rep: 921 Cost: 0.0142 Fidelity: 0.9788 Trace: 0.9788
Rep: 922 Cost: 0.0149 Fidelity: 0.9790 Trace: 0.9790
Rep: 923 Cost: 0.0143 Fidelity: 0.9790 Trace: 0.9790
Rep: 924 Cost: 0.0146 Fidelity: 0.9789 Trace: 0.9789
Rep: 925 Cost: 0.0141 Fidelity: 0.9789 Trace: 0.9789
Rep: 926 Cost: 0.0149 Fidelity: 0.9790 Trace: 0.9790
Rep: 927 Cost: 0.0144 Fidelity: 0.9790 Trace: 0.9791
Rep: 928 Cost: 0.0143 Fidelity: 0.9789 Trace: 0.9789
Rep: 929 Cost: 0.0133 Fidelity: 0.9789 Trace: 0.9789
Rep: 930 Cost: 0.0160 Fidelity: 0.9791 Trace: 0.9791
Rep: 931 Cost: 0.0158 Fidelity: 0.9791 Trace: 0.9791
Rep: 932 Cost: 0.0130 Fidelity: 0.9790 Trace: 0.9790
Rep: 933 Cost: 0.0125 Fidelity: 0.9790 Trace: 0.9790
Rep: 934 Cost: 0.0163 Fidelity: 0.9791 Trace: 0.9791
Rep: 935 Cost: 0.0157 Fidelity: 0.9791 Trace: 0.9791
Rep: 936 Cost: 0.0133 Fidelity: 0.9790 Trace: 0.9790
Rep: 937 Cost: 0.0139 Fidelity: 0.9790 Trace: 0.9790
Rep: 938 Cost: 0.0140 Fidelity: 0.9791 Trace: 0.9791
Rep: 939 Cost: 0.0138 Fidelity: 0.9792 Trace: 0.9792
Rep: 940 Cost: 0.0138 Fidelity: 0.9790 Trace: 0.9790
Rep: 941 Cost: 0.0122 Fidelity: 0.9790 Trace: 0.9790
Rep: 942 Cost: 0.0158 Fidelity: 0.9792 Trace: 0.9792
Rep: 943 Cost: 0.0145 Fidelity: 0.9792 Trace: 0.9792
Rep: 944 Cost: 0.0147 Fidelity: 0.9790 Trace: 0.9790
Rep: 945 Cost: 0.0147 Fidelity: 0.9790 Trace: 0.9790
Rep: 946 Cost: 0.0143 Fidelity: 0.9792 Trace: 0.9792
Rep: 947 Cost: 0.0150 Fidelity: 0.9793 Trace: 0.9793
Rep: 948 Cost: 0.0134 Fidelity: 0.9791 Trace: 0.9791
Rep: 949 Cost: 0.0137 Fidelity: 0.9791 Trace: 0.9791
Rep: 950 Cost: 0.0138 Fidelity: 0.9791 Trace: 0.9792
Rep: 951 Cost: 0.0129 Fidelity: 0.9792 Trace: 0.9792
Rep: 952 Cost: 0.0138 Fidelity: 0.9791 Trace: 0.9791
Rep: 953 Cost: 0.0117 Fidelity: 0.9791 Trace: 0.9791
Rep: 954 Cost: 0.0137 Fidelity: 0.9792 Trace: 0.9792
Rep: 955 Cost: 0.0122 Fidelity: 0.9792 Trace: 0.9792
Rep: 956 Cost: 0.0138 Fidelity: 0.9791 Trace: 0.9791
Rep: 957 Cost: 0.0135 Fidelity: 0.9792 Trace: 0.9792
Rep: 958 Cost: 0.0132 Fidelity: 0.9792 Trace: 0.9792
Rep: 959 Cost: 0.0122 Fidelity: 0.9792 Trace: 0.9792
Rep: 960 Cost: 0.0141 Fidelity: 0.9791 Trace: 0.9791
Rep: 961 Cost: 0.0135 Fidelity: 0.9792 Trace: 0.9792
Rep: 962 Cost: 0.0136 Fidelity: 0.9793 Trace: 0.9793
Rep: 963 Cost: 0.0141 Fidelity: 0.9793 Trace: 0.9793
Rep: 964 Cost: 0.0122 Fidelity: 0.9793 Trace: 0.9793
Rep: 965 Cost: 0.0114 Fidelity: 0.9793 Trace: 0.9793
Rep: 966 Cost: 0.0150 Fidelity: 0.9793 Trace: 0.9793
Rep: 967 Cost: 0.0147 Fidelity: 0.9794 Trace: 0.9794
Rep: 968 Cost: 0.0119 Fidelity: 0.9793 Trace: 0.9793
Rep: 969 Cost: 0.0129 Fidelity: 0.9792 Trace: 0.9792
Rep: 970 Cost: 0.0125 Fidelity: 0.9793 Trace: 0.9793
Rep: 971 Cost: 0.0119 Fidelity: 0.9793 Trace: 0.9793
Rep: 972 Cost: 0.0119 Fidelity: 0.9793 Trace: 0.9793
Rep: 973 Cost: 0.0129 Fidelity: 0.9793 Trace: 0.9794
Rep: 974 Cost: 0.0130 Fidelity: 0.9793 Trace: 0.9793
Rep: 975 Cost: 0.0137 Fidelity: 0.9793 Trace: 0.9793
Rep: 976 Cost: 0.0126 Fidelity: 0.9793 Trace: 0.9793
Rep: 977 Cost: 0.0141 Fidelity: 0.9795 Trace: 0.9795
Rep: 978 Cost: 0.0141 Fidelity: 0.9794 Trace: 0.9795
Rep: 979 Cost: 0.0137 Fidelity: 0.9794 Trace: 0.9794
Rep: 980 Cost: 0.0136 Fidelity: 0.9793 Trace: 0.9793
Rep: 981 Cost: 0.0144 Fidelity: 0.9794 Trace: 0.9794
Rep: 982 Cost: 0.0139 Fidelity: 0.9794 Trace: 0.9794
Rep: 983 Cost: 0.0147 Fidelity: 0.9793 Trace: 0.9793
Rep: 984 Cost: 0.0143 Fidelity: 0.9793 Trace: 0.9793
Rep: 985 Cost: 0.0135 Fidelity: 0.9794 Trace: 0.9794
Rep: 986 Cost: 0.0134 Fidelity: 0.9795 Trace: 0.9795
Rep: 987 Cost: 0.0145 Fidelity: 0.9794 Trace: 0.9794
Rep: 988 Cost: 0.0144 Fidelity: 0.9793 Trace: 0.9793
Rep: 989 Cost: 0.0140 Fidelity: 0.9795 Trace: 0.9795
Rep: 990 Cost: 0.0124 Fidelity: 0.9795 Trace: 0.9795
Rep: 991 Cost: 0.0163 Fidelity: 0.9794 Trace: 0.9794
Rep: 992 Cost: 0.0161 Fidelity: 0.9794 Trace: 0.9794
Rep: 993 Cost: 0.0137 Fidelity: 0.9795 Trace: 0.9795
Rep: 994 Cost: 0.0144 Fidelity: 0.9795 Trace: 0.9795
Rep: 995 Cost: 0.0132 Fidelity: 0.9794 Trace: 0.9794
Rep: 996 Cost: 0.0144 Fidelity: 0.9795 Trace: 0.9795
Rep: 997 Cost: 0.0134 Fidelity: 0.9796 Trace: 0.9796
Rep: 998 Cost: 0.0146 Fidelity: 0.9796 Trace: 0.9796
Rep: 999 Cost: 0.0114 Fidelity: 0.9795 Trace: 0.9795

Fidelity before optimization:  3.8496597e-09
Fidelity after optimization:  0.9795241

Target state:  [0.+0.j 1.+0.j 0.+0.j 0.+0.j 0.+0.j 0.+0.j]
Output state:  [ 0.  +0.j  0.99-0.j -0.  +0.j -0.  -0.j  0.  -0.j -0.  -0.j]
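
As a sanity check, the reported fidelity and trace can be recovered directly from the printed kets. The short NumPy sketch below is not the tutorial's TensorFlow code; it simply recomputes the overlap with the target ket and the norm retained within the Fock-space cutoff, using the rounded amplitudes shown in the printout above.

    import numpy as np

    # Rounded values taken from the printout above (cutoff dimension 6).
    target_state = np.array([0, 1, 0, 0, 0, 0], dtype=complex)     # target ket |1>
    output_state = np.array([0, 0.99, 0, 0, 0, 0], dtype=complex)  # learnt ket (rounded)

    # Fidelity: squared overlap of the learnt ket with the target ket.
    fidelity = np.abs(np.vdot(target_state, output_state)) ** 2

    # Trace: norm of the learnt ket, i.e. the probability kept inside the cutoff.
    trace = np.abs(np.vdot(output_state, output_state))

    print("Fidelity:", fidelity)  # ~0.98, consistent with the logged fidelity
    print("Trace:", trace)        # ~0.98, consistent with the logged trace

Both quantities agree because almost all of the learnt state's weight sits on the target Fock component; a trace noticeably below the fidelity would instead indicate probability leaking past the cutoff.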

For more applications of CV quantum neural networks, see the state learning and gate synthesis demonstrations.

References

[1] Nathan Killoran, Thomas R Bromley, Juan Miguel Arrazola, Maria Schuld, Nicolás Quesada, and Seth Lloyd. Continuous-variable quantum neural networks. arXiv preprint arXiv:1806.06871, 2018.

[2] Maria Schuld, Ville Bergholm, Christian Gogolin, Josh Izaac, and Nathan Killoran. Evaluating analytic gradients on quantum hardware. Physical Review A, 99(3):032331, 2019.

[3] William R Clements, Peter C Humphreys, Benjamin J Metcalf, W Steven Kolthammer, and Ian A Walmsley. Optimal design for universal multiport interferometers. Optica, 3(12):1460–1465, 2016. doi:10.1364/OPTICA.3.001460.

Total running time of the script: ( 2 minutes 4.316 seconds)