Quantum versus Classical Generative Modelling in Finance
- URL: http://arxiv.org/abs/2008.00691v1
- Date: Mon, 3 Aug 2020 07:50:33 GMT
- Title: Quantum versus Classical Generative Modelling in Finance
- Authors: Brian Coyle, Maxwell Henderson, Justin Chan Jin Le, Niraj Kumar, Marco Paini, Elham Kashefi
- Abstract summary: We investigate and compare the capabilities of quantum versus classical models for the task of generative modelling in machine learning.
We find that entanglement typically plays a role in the problem instances which demonstrate an advantage over the Boltzmann machine.
- Score: 1.3212032015497979
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Finding a concrete use case for quantum computers in the near term is still
an open question, with machine learning typically touted as one of the first
fields which will be impacted by quantum technologies. In this work, we
investigate and compare the capabilities of quantum versus classical models for
the task of generative modelling in machine learning. We use a real world
financial dataset consisting of correlated currency pairs and compare two
models in their ability to learn the resulting distribution - a restricted
Boltzmann machine, and a quantum circuit Born machine. We provide extensive
numerical results indicating that the simulated Born machine always at least
matches the performance of the Boltzmann machine in this task, and demonstrates
superior performance as the model scales. We perform experiments on both
simulated and physical quantum chips using the Rigetti Forest platform, and
also are able to partially train the largest instance to date of a quantum
circuit Born machine on quantum hardware. Finally, by studying the entanglement
capacity of the trained Born machines, we find that entanglement typically
plays a role in the problem instances which demonstrate an advantage over the
Boltzmann machine.
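To make the comparison concrete, the following is a minimal, self-contained sketch of the quantum circuit Born machine idea described in the abstract: a tiny 2-qubit parameterized circuit (RY rotations plus a CNOT entangler, simulated exactly with NumPy rather than the Rigetti Forest platform the paper uses) is trained by gradient descent on a squared maximum mean discrepancy (MMD) loss against a toy "correlated pair" distribution standing in for the currency-pair data. All names, the ansatz, the kernel, and the target distribution here are illustrative assumptions, not the paper's actual setup.

```python
import numpy as np

def ry(theta):
    """Single-qubit Y-rotation (real-valued, so amplitudes stay real)."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

# CNOT with qubit 0 as control, qubit 1 as target
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

def born_probs(params):
    """Output distribution of a 2-qubit Born machine: RY layer, CNOT, RY layer."""
    state = np.zeros(4)
    state[0] = 1.0  # start in |00>
    state = np.kron(ry(params[0]), ry(params[1])) @ state
    state = CNOT @ state  # entangling gate
    state = np.kron(ry(params[2]), ry(params[3])) @ state
    return state ** 2  # Born rule; amplitudes are real here

# Gaussian kernel on integer-encoded bitstrings 00..11 -> 0..3
outcomes = np.arange(4)
K = np.exp(-0.5 * (outcomes[:, None] - outcomes[None, :]) ** 2)

def mmd2(p, q):
    """Squared maximum mean discrepancy between two exact distributions."""
    d = p - q
    return d @ K @ d

# Toy "correlated pair" target: probability mass concentrated on 00 and 11
target = np.array([0.4, 0.1, 0.1, 0.4])

params = np.full(4, 0.3)
eps, lr = 1e-4, 0.2
for _ in range(500):
    grad = np.zeros_like(params)
    for i in range(len(params)):
        shift = np.zeros(len(params))
        shift[i] = eps
        # central finite-difference gradient of the MMD loss
        grad[i] = (mmd2(born_probs(params + shift), target)
                   - mmd2(born_probs(params - shift), target)) / (2 * eps)
    params -= lr * grad  # plain gradient descent

print("learned distribution:", np.round(born_probs(params), 3))
print("final MMD^2:", mmd2(born_probs(params), target))
```

Note the role of the CNOT: without it the circuit factorizes into independent single-qubit distributions and cannot capture the 00/11 correlation, which mirrors the abstract's observation that entanglement plays a role in the instances showing an advantage.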
Related papers
- Entanglement-induced provable and robust quantum learning advantages [0.0]
We rigorously establish a noise-robust, unconditional quantum learning advantage in terms of expressivity, inference speed, and training efficiency.
Our proof is information-theoretic and pinpoints the origin of this advantage.
arXiv Detail & Related papers (2024-10-04T02:39:07Z)
- The curse of random quantum data [62.24825255497622]
We quantify the performance of quantum machine learning in the landscape of quantum data.
We find that the training efficiency and generalization capabilities in quantum machine learning will be exponentially suppressed with the increase in qubits.
Our findings apply to both the quantum kernel method and the large-width limit of quantum neural networks.
arXiv Detail & Related papers (2024-08-19T12:18:07Z)
- Quantum data learning for quantum simulations in high-energy physics [55.41644538483948]
We explore the applicability of quantum-data learning to practical problems in high-energy physics.
We make use of an ansatz based on quantum convolutional neural networks and numerically show that it is capable of recognizing quantum phases of ground states.
The observation of non-trivial learning properties demonstrated in these benchmarks will motivate further exploration of the quantum-data learning architecture in high-energy physics.
arXiv Detail & Related papers (2023-06-29T18:00:01Z)
- Shadows of quantum machine learning [2.236957801565796]
We introduce a new class of quantum models where quantum resources are only required during training, while the deployment of the trained model is classical.
We prove that this class of models is universal for classically-deployed quantum machine learning.
arXiv Detail & Related papers (2023-05-31T18:00:02Z)
- A Framework for Demonstrating Practical Quantum Advantage: Racing Quantum against Classical Generative Models [62.997667081978825]
We build on a previously proposed framework for evaluating the generalization performance of generative models.
We establish the first comparative race towards practical quantum advantage (PQA) between classical and quantum generative models.
Our results suggest that QCBMs are more efficient in the data-limited regime than the other state-of-the-art classical generative models.
arXiv Detail & Related papers (2023-03-27T22:48:28Z)
- Quantum Machine Learning: from physics to software engineering [58.720142291102135]
We show how classical machine learning approaches can help improve the capabilities of quantum computers.
We discuss how quantum algorithms and quantum computers may be useful for solving classical machine learning tasks.
arXiv Detail & Related papers (2023-01-04T23:37:45Z)
- A didactic approach to quantum machine learning with a single qubit [68.8204255655161]
We focus on the case of learning with a single qubit, using data re-uploading techniques.
We implement the different proposed formulations in toy and real-world datasets using the Qiskit quantum computing SDK.
arXiv Detail & Related papers (2022-11-23T18:25:32Z)
- Generative model for learning quantum ensemble via optimal transport loss [0.9404723842159504]
We propose a quantum generative model that can learn a quantum ensemble.
The proposed model paves the way for wide applications such as the health check of quantum devices.
arXiv Detail & Related papers (2022-10-19T17:35:38Z)
- Generative Quantum Machine Learning [0.0]
The aim of this thesis is to develop new generative quantum machine learning algorithms.
We introduce a quantum generative adversarial network and a quantum Boltzmann machine implementation, both of which can be realized with parameterized quantum circuits.
arXiv Detail & Related papers (2021-11-24T19:00:21Z)
- Information Scrambling in Computationally Complex Quantum Circuits [56.22772134614514]
We experimentally investigate the dynamics of quantum scrambling on a 53-qubit quantum processor.
We show that while operator spreading is captured by an efficient classical model, operator entanglement requires exponentially scaled computational resources to simulate.
arXiv Detail & Related papers (2021-01-21T22:18:49Z)
- Power of data in quantum machine learning [2.1012068875084964]
We show that some problems that are classically hard to compute can be easily predicted by classical machines learning from data.
We propose a projected quantum model that provides a simple and rigorous quantum speed-up for a learning problem in the fault-tolerant regime.
arXiv Detail & Related papers (2020-11-03T19:00:01Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.