Generative model for learning quantum ensemble via optimal transport
loss
- URL: http://arxiv.org/abs/2210.10743v1
- Date: Wed, 19 Oct 2022 17:35:38 GMT
- Title: Generative model for learning quantum ensemble via optimal transport
loss
- Authors: Hiroyuki Tezuka, Shumpei Uno, Naoki Yamamoto
- Abstract summary: We propose a quantum generative model that can learn a quantum ensemble.
The proposed model paves the way for wide applications such as the health check of quantum devices.
- Score: 0.9404723842159504
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Generative modeling is an unsupervised machine learning framework that
exhibits strong performance in various machine learning tasks. Recently, several
quantum versions of generative models have been proposed, some of which are even
proven to have a quantum advantage. However, those methods are not directly
applicable to constructing a generative model for learning a set of quantum
states, i.e., an ensemble. In this paper, we propose a quantum generative model
that can learn a quantum ensemble in an unsupervised machine learning framework.
The key idea is to introduce a new loss function based on the optimal transport
loss, which has been widely used in classical machine learning due to its several
good properties; e.g., there is no need to ensure the common support of two
ensembles. We then give an in-depth analysis of this measure, such as the scaling
property of the approximation error. We also demonstrate generative modeling with
an application to the quantum anomaly detection problem, which cannot be handled
by existing methods. The proposed model paves the way for wide applications such
as the health check of quantum devices and efficient initialization of quantum
computation.
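To make the optimal transport loss concrete, the following is a minimal NumPy/SciPy sketch of one way such a loss could be computed for two equal-size ensembles of pure states. Under uniform weights, the optimal transport problem reduces to a minimum-cost bipartite matching; the infidelity cost c(i, j) = 1 - |⟨ψ_i|φ_j⟩|² used here is an illustrative assumption, not necessarily the paper's exact construction, and `random_state` and `ot_ensemble_distance` are hypothetical helper names.

```python
# Hedged sketch: an optimal-transport-style distance between two equal-size
# ensembles of pure quantum states. With uniform weights, optimal transport
# reduces to minimum-cost bipartite matching, solved via the Hungarian
# algorithm (scipy.optimize.linear_sum_assignment). The cost used here,
# one minus the fidelity |<psi_i|phi_j>|^2, is one natural choice and may
# differ from the loss defined in the paper.
import numpy as np
from scipy.optimize import linear_sum_assignment

def random_state(dim, rng):
    """Draw a Haar-random pure state vector of dimension `dim`."""
    v = rng.normal(size=dim) + 1j * rng.normal(size=dim)
    return v / np.linalg.norm(v)

def ot_ensemble_distance(ensemble_a, ensemble_b):
    """Minimum average infidelity over all one-to-one matchings."""
    cost = np.array([[1.0 - abs(np.vdot(a, b)) ** 2 for b in ensemble_b]
                     for a in ensemble_a])
    rows, cols = linear_sum_assignment(cost)
    return cost[rows, cols].mean()

rng = np.random.default_rng(0)
ensemble = [random_state(2, rng) for _ in range(4)]
# An ensemble compared against itself has zero distance under this loss
# (up to floating-point error), illustrating that the measure is well
# defined even though the two ensembles share no common sample support
# in general.
print(ot_ensemble_distance(ensemble, ensemble))
```

Because the matching is computed over state overlaps rather than over probability distributions with shared support, this kind of loss sidesteps the common-support requirement mentioned in the abstract.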
Related papers
- Efficient Learning for Linear Properties of Bounded-Gate Quantum Circuits [63.733312560668274]
Given a quantum circuit containing d tunable RZ gates and G-d Clifford gates, can a learner perform purely classical inference to efficiently predict its linear properties?
We prove that the sample complexity scaling linearly in d is necessary and sufficient to achieve a small prediction error, while the corresponding computational complexity may scale exponentially in d.
We devise a kernel-based learning model capable of trading off prediction error and computational complexity, transitioning from exponential to polynomial scaling in many practical settings.
arXiv Detail & Related papers (2024-08-22T08:21:28Z) - Quantum data learning for quantum simulations in high-energy physics [55.41644538483948]
We explore the applicability of quantum-data learning to practical problems in high-energy physics.
We make use of an ansatz based on quantum convolutional neural networks and numerically show that it is capable of recognizing quantum phases of ground states.
The observation of non-trivial learning properties demonstrated in these benchmarks will motivate further exploration of the quantum-data learning architecture in high-energy physics.
arXiv Detail & Related papers (2023-06-29T18:00:01Z) - Shadows of quantum machine learning [2.236957801565796]
We introduce a new class of quantum models where quantum resources are only required during training, while the deployment of the trained model is classical.
We prove that this class of models is universal for classically-deployed quantum machine learning.
arXiv Detail & Related papers (2023-05-31T18:00:02Z) - A Framework for Demonstrating Practical Quantum Advantage: Racing
Quantum against Classical Generative Models [62.997667081978825]
We build on a previously proposed framework for evaluating the generalization performance of generative models.
We establish the first comparative race towards practical quantum advantage (PQA) between classical and quantum generative models.
Our results suggest that QCBMs are more efficient in the data-limited regime than the other state-of-the-art classical generative models.
arXiv Detail & Related papers (2023-03-27T22:48:28Z) - A didactic approach to quantum machine learning with a single qubit [68.8204255655161]
We focus on the case of learning with a single qubit, using data re-uploading techniques.
We implement the different proposed formulations in toy and real-world datasets using the qiskit quantum computing SDK.
arXiv Detail & Related papers (2022-11-23T18:25:32Z) - Classical surrogates for quantum learning models [0.7734726150561088]
We introduce the concept of a classical surrogate, a classical model which can be efficiently obtained from a trained quantum learning model.
We show that large classes of well-analyzed re-uploading models have a classical surrogate.
arXiv Detail & Related papers (2022-06-23T14:37:02Z) - Entanglement Forging with generative neural network models [0.0]
We show that hybrid quantum-classical variational ansätze can forge entanglement to lower the quantum resource overhead.
The method is efficient in terms of the number of measurements required to achieve fixed precision on expected values of observables.
arXiv Detail & Related papers (2022-05-02T14:29:17Z) - Generalization Metrics for Practical Quantum Advantage in Generative
Models [68.8204255655161]
Generative modeling is a widely accepted natural use case for quantum computers.
We construct a simple and unambiguous approach to probe practical quantum advantage for generative modeling by measuring the algorithm's generalization performance.
Our simulation results show that our quantum-inspired models have up to a $68\times$ enhancement in generating unseen unique and valid samples.
arXiv Detail & Related papers (2022-01-21T16:35:35Z) - Quantum machine learning beyond kernel methods [0.0]
We show that parametrized quantum circuit models can exhibit a critically better generalization performance than their kernel formulations.
Our results constitute another step towards a more comprehensive theory of quantum machine learning models next to kernel formulations.
arXiv Detail & Related papers (2021-10-25T18:00:02Z) - Quantum algorithms for quantum dynamics: A performance study on the
spin-boson model [68.8204255655161]
Quantum algorithms for quantum dynamics simulations are traditionally based on implementing a Trotter-approximation of the time-evolution operator.
Variational quantum algorithms have become an indispensable alternative, enabling small-scale simulations on present-day hardware.
We show that, despite providing a clear reduction of quantum gate cost, the variational method in its current implementation is unlikely to lead to a quantum advantage.
arXiv Detail & Related papers (2021-08-09T18:00:05Z) - Enhancing Generative Models via Quantum Correlations [1.6099403809839032]
Generative modeling using samples drawn from a probability distribution constitutes a powerful approach for unsupervised machine learning.
We show theoretically that such quantum correlations provide a powerful resource for generative modeling.
We numerically test this separation on standard machine learning data sets and show that it holds for practical problems.
arXiv Detail & Related papers (2021-01-20T22:57:22Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.