Finding the optimal cluster state configuration. Minimization of one-way
quantum computation errors
- URL: http://arxiv.org/abs/2003.09197v1
- Date: Fri, 20 Mar 2020 10:58:14 GMT
- Title: Finding the optimal cluster state configuration. Minimization of one-way
quantum computation errors
- Authors: S. B. Korolev, T. Yu. Golubeva, Yu. M. Golubev
- Abstract summary: From all possible cluster state configurations, we choose those that give the smallest error.
We find the optimal strategy for the implementation of universal Gaussian computations with minimal errors.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we estimate the errors of Gaussian transformations implemented
using one-way quantum computations on cluster states of various configurations.
From all possible cluster state configurations, we choose those that give the
smallest computation error. Furthermore, we evaluate errors in hybrid
computational schemes, in which Gaussian operations are performed using one-way
computations with additional linear transformations. As a result, we find the
optimal strategy for the implementation of universal Gaussian computations with
minimal errors.
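As a rough intuition for why the cluster configuration matters, the toy sketch below accumulates finite-squeezing noise over the nodes a state traverses; the additive-noise model, the per-node noise formula, and the squeezing value are illustrative assumptions, not results from the paper:

```python
import numpy as np

# Hedged toy model: in continuous-variable one-way computation, teleporting a
# state through a cluster node with finite squeezing adds excess noise of
# roughly e^{-2r}/2 per quadrature (shot-noise units). Configurations that
# realize the same Gaussian gate with fewer traversed nodes then accumulate
# less error. All numbers here are illustrative.

def excess_noise_per_node(squeezing_db):
    r = squeezing_db * np.log(10) / 20.0  # dB -> natural squeezing parameter
    return 0.5 * np.exp(-2.0 * r)

def accumulated_error(n_nodes, squeezing_db):
    # Simplistic additive model: noise contributions of traversed nodes sum.
    return n_nodes * excess_noise_per_node(squeezing_db)

# Hypothetical configurations with different numbers of traversed nodes.
for n in (2, 4, 8):
    print(n, accumulated_error(n, squeezing_db=10.0))
```

Under this crude model, halving the number of traversed nodes halves the accumulated error, which is the kind of trade-off a configuration search would exploit.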
Related papers
- Stochastic Optimization for Non-convex Problem with Inexact Hessian
Matrix, Gradient, and Function [99.31457740916815]
Trust-region (TR) methods and adaptive regularization with cubics (ARC) have proven to have very appealing theoretical properties.
We show that TR and ARC methods can simultaneously allow for inexact computations of the Hessian, gradient, and function values.
arXiv Detail & Related papers (2023-10-18T10:29:58Z) - Linearization Algorithms for Fully Composite Optimization [61.20539085730636]
This paper studies first-order algorithms for solving fully composite optimization problems over convex compact sets.
We leverage the structure of the objective by handling its differentiable and non-differentiable components separately, linearizing only the smooth parts.
arXiv Detail & Related papers (2023-02-24T18:41:48Z) - Regularization and Optimization in Model-Based Clustering [4.096453902709292]
k-means algorithm variants essentially fit a mixture of identical spherical Gaussians to data that vastly deviates from such a distribution.
We develop more effective optimization algorithms for general GMMs, and we combine these algorithms with regularization strategies that avoid overfitting.
These results shed new light on the current status quo between GMM and k-means methods and suggest the more frequent use of general GMMs for data exploration.
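As a minimal illustration of the contrast the abstract draws, the sketch below fits a one-dimensional two-component GMM by EM, with a variance floor as a simple regularizer against degenerate components; the algorithm, initialization, and parameters are illustrative assumptions, not the paper's method:

```python
import numpy as np

# Minimal EM for a 1-D two-component GMM. Unlike k-means, which implicitly
# fits identical spherical components, EM estimates separate means, variances
# and weights; the variance floor `reg` guards against degenerate solutions.

def em_gmm_1d(x, n_iter=200, reg=1e-6):
    mu = np.array([x.min(), x.max()])     # deterministic, well-separated init
    var = np.full(2, x.var())
    w = np.full(2, 0.5)
    for _ in range(n_iter):
        # E-step: posterior responsibility of each component for each point
        dens = w * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) \
               / np.sqrt(2 * np.pi * var)
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: weighted updates, variance floor added as regularization
        nk = resp.sum(axis=0)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk + reg
        w = nk / len(x)
    return mu, var, w

rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-3, 0.5, 500), rng.normal(2, 2.0, 500)])
mu, var, w = em_gmm_1d(x)
print(np.sort(mu))  # component means recovered near -3 and 2
```

Because the two components have very different variances, a k-means-style fit with identical spherical components would misplace the boundary between them, which is the mismatch the abstract highlights.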
arXiv Detail & Related papers (2023-02-05T18:22:29Z) - Error of an arbitrary single-mode Gaussian transformation on a weighted
cluster state using a cubic phase gate [0.0]
We show that it is possible to minimize the error of the arbitrary single-mode Gaussian transformation by a proper choice of the weight coefficients of the cluster state.
We modify the scheme by adding a non-Gaussian state obtained using a cubic phase gate as one of the nodes of the cluster.
arXiv Detail & Related papers (2022-07-19T20:56:43Z) - Error correction of the continuous-variable quantum hybrid computation
on two-node cluster states: limit of squeezing [0.0]
In this paper, we investigate the error correction of universal Gaussian transformations obtained in continuous-variable quantum computations.
We have considered a hybrid scheme to implement the universal Gaussian transformations.
arXiv Detail & Related papers (2022-01-19T12:14:32Z) - Local optimization on pure Gaussian state manifolds [63.76263875368856]
We exploit insights into the geometry of bosonic and fermionic Gaussian states to develop an efficient local optimization algorithm.
The method is based on notions of gradient descent attuned to the local geometry.
We use the presented methods to collect numerical and analytical evidence for the conjecture that Gaussian purifications are sufficient to compute the entanglement of purification of arbitrary mixed Gaussian states.
arXiv Detail & Related papers (2020-09-24T18:00:36Z) - Global Optimization of Gaussian processes [52.77024349608834]
We propose a reduced-space formulation with Gaussian processes trained on only a few data points.
The approach also leads to significantly smaller and computationally cheaper sub-problems for lower bounding.
In total, the proposed method reduces computation time by orders of magnitude.
arXiv Detail & Related papers (2020-05-21T20:59:11Z) - Gaussianization Flows [113.79542218282282]
We propose a new type of normalizing flow model that enables both efficient computation of likelihoods and efficient inversion for sample generation.
Because of this guaranteed expressivity, they can capture multimodal target distributions without compromising the efficiency of sample generation.
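To make the underlying building block concrete, the sketch below performs one marginal-Gaussianization step (empirical CDF followed by the inverse standard-normal CDF); the paper's model is a trainable flow with rotations between such steps, so this stdlib-only version is a simplified assumption, not the paper's architecture:

```python
import numpy as np
from statistics import NormalDist

# One marginal-Gaussianization step: map each value through the empirical CDF,
# then through the inverse standard-normal CDF. Iterating this, with rotations
# in between, pushes arbitrary data toward a standard Gaussian.

def marginal_gaussianize(x):
    n = len(x)
    ranks = x.argsort().argsort()
    u = (ranks + 0.5) / n                 # empirical CDF, kept inside (0, 1)
    inv = NormalDist().inv_cdf            # inverse standard-normal CDF
    return np.array([inv(p) for p in u])

rng = np.random.default_rng(0)
x = rng.exponential(size=2000)            # strongly non-Gaussian marginal
z = marginal_gaussianize(x)
print(round(z.mean(), 2), round(z.std(), 2))  # approximately 0 and 1
```

The transform is invertible (ranks preserve order), which is what lets flow models of this kind support both likelihood evaluation and sample generation.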
arXiv Detail & Related papers (2020-03-04T08:15:06Z) - Optimization of Graph Total Variation via Active-Set-based Combinatorial
Reconditioning [48.42916680063503]
We propose a novel adaptive preconditioning strategy for proximal algorithms on this problem class.
We show that nested-forest decomposition of the inactive edges yields a guaranteed local linear convergence rate.
Our results suggest that local convergence analysis can serve as a guideline for selecting variable metrics in proximal algorithms.
arXiv Detail & Related papers (2020-02-27T16:33:09Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.