Quantum Boltzmann Machines: Applications in Quantitative Finance
- URL: http://arxiv.org/abs/2301.13295v1
- Date: Mon, 30 Jan 2023 21:25:14 GMT
- Title: Quantum Boltzmann Machines: Applications in Quantitative Finance
- Authors: Cameron Perot
- Abstract summary: We explore using the D-Wave Advantage 4.1 quantum annealer to sample from quantum Boltzmann distributions and train quantum Boltzmann machines (QBMs).
Our findings indicate that QBMs trained using the Advantage 4.1 are much noisier than those trained using simulations and struggle to perform at the same level as classical RBMs.
There is potential for QBMs to outperform classical RBMs if future-generation annealers can generate samples closer to the desired theoretical distributions.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In this thesis we explore using the D-Wave Advantage 4.1 quantum annealer to
sample from quantum Boltzmann distributions and train quantum Boltzmann
machines (QBMs). We focus on the real-world problem of using QBMs as generative
models to produce synthetic foreign exchange market data and analyze how the
results stack up against classical models based on restricted Boltzmann
machines (RBMs). Additionally, we study a small 12-qubit problem which we use
to compare samples obtained from the Advantage 4.1 with theory, and in the
process gain vital insights into how well the Advantage 4.1 can sample quantum
Boltzmann random variables and be used to train QBMs. Through this, we are able
to show that the Advantage 4.1 can sample classical Boltzmann random variables
to some extent, but is limited in its ability to sample from quantum Boltzmann
distributions. Our findings indicate that QBMs trained using the Advantage 4.1
are much noisier than those trained using simulations and struggle to perform
at the same level as classical RBMs. However, there is the potential for QBMs
to outperform classical RBMs if future-generation annealers can generate
samples closer to the desired theoretical distributions.
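To make the sampling comparison concrete, here is a minimal classical sketch (not code from the thesis): it enumerates a made-up 3-spin Ising problem, draws samples from the exact Boltzmann distribution, and measures the total variation distance between empirical and theoretical frequencies — the same style of check the thesis's 12-qubit study applies to annealer output. All couplings, fields, temperatures, and sizes below are illustrative assumptions.

```python
import itertools
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical 3-spin Ising problem: local fields h and couplings J.
n = 3
h = np.array([0.1, -0.2, 0.05])
J = {(0, 1): -0.5, (1, 2): 0.3}
beta = 1.0  # inverse temperature

def energy(s):
    """Ising energy E(s) = sum_i h_i s_i + sum_{i<j} J_ij s_i s_j."""
    e = float(h @ s)
    for (i, j), J_ij in J.items():
        e += J_ij * s[i] * s[j]
    return e

# Enumerate all 2^n spin configurations and build the exact
# Boltzmann distribution p(s) ∝ exp(-beta * E(s)).
states = [np.array(s) for s in itertools.product([-1, 1], repeat=n)]
weights = np.array([np.exp(-beta * energy(s)) for s in states])
probs = weights / weights.sum()

# Draw samples (as an ideal annealer would) and compare empirical
# frequencies with theory via the total variation distance.
samples = rng.choice(len(states), size=50_000, p=probs)
empirical = np.bincount(samples, minlength=len(states)) / len(samples)
tv_distance = 0.5 * np.abs(empirical - probs).sum()
```

For a real annealer the samples come from hardware rather than `rng.choice`, and the interesting question — the one the thesis studies — is how large `tv_distance` is relative to this ideal baseline.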
Related papers
- BNEM: A Boltzmann Sampler Based on Bootstrapped Noised Energy Matching [29.531531253864753]
We learn neural samplers given energy functions instead of data sampled from the Boltzmann distribution.
We propose a diffusion-based sampler, Noised Energy Matching, which theoretically has lower variance and lower complexity.
The experimental results demonstrate that BNEM can achieve state-of-the-art performance while being more robust.
arXiv Detail & Related papers (2024-09-15T16:41:30Z)
- Training Quantum Boltzmann Machines with the $\beta$-Variational Quantum Eigensolver [0.3670008893193884]
The quantum Boltzmann machine (QBM) is a generative machine learning model for both classical data and quantum states.
We show that low-rank representations obtained by $\beta$-VQE provide an efficient way to learn low-rank target states.
We implement a trained model on a physical quantum device.
arXiv Detail & Related papers (2023-04-17T21:56:52Z)
- A hybrid quantum-classical approach for inference on restricted Boltzmann machines [1.0928470926399563]
A Boltzmann machine is a powerful machine learning model with many real-world applications.
Statistical inference on a Boltzmann machine can be carried out by sampling from its posterior distribution.
Quantum computers have the promise of solving some non-trivial problems in an efficient manner.
arXiv Detail & Related papers (2023-03-31T11:10:31Z)
- A Framework for Demonstrating Practical Quantum Advantage: Racing Quantum against Classical Generative Models [62.997667081978825]
We build on a previously proposed framework for evaluating the generalization performance of generative models.
We establish the first comparative race towards practical quantum advantage (PQA) between classical and quantum generative models.
Our results suggest that QCBMs are more efficient in the data-limited regime than the other state-of-the-art classical generative models.
arXiv Detail & Related papers (2023-03-27T22:48:28Z)
- Importance sampling for stochastic quantum simulations [68.8204255655161]
We introduce the qDrift protocol, which builds random product formulas by sampling from the Hamiltonian according to the coefficients.
We show that the simulation cost can be reduced while achieving the same accuracy, by considering the individual simulation cost during the sampling stage.
Results are confirmed by numerical simulations performed on a lattice nuclear effective field theory.
arXiv Detail & Related papers (2022-12-12T15:06:32Z)
- Do Quantum Circuit Born Machines Generalize? [58.720142291102135]
We present the first work in the literature to use the QCBM's generalization performance as an integral evaluation metric for quantum generative models.
We show that the QCBM is able to effectively learn the reweighted dataset and generate unseen samples with higher quality than those in the training set.
arXiv Detail & Related papers (2022-07-27T17:06:34Z)
- Theory of Quantum Generative Learning Models with Maximum Mean Discrepancy [67.02951777522547]
We study learnability of quantum circuit Born machines (QCBMs) and quantum generative adversarial networks (QGANs)
We first analyze the generalization ability of QCBMs and identify their superiorities when the quantum devices can directly access the target distribution.
Next, we prove how the generalization error bound of QGANs depends on the employed Ansatz, the number of qudits, and input states.
arXiv Detail & Related papers (2022-05-10T08:05:59Z)
- Learnability of the output distributions of local quantum circuits [53.17490581210575]
We investigate, within two different oracle models, the learnability of quantum circuit Born machines.
We first show a negative result, that the output distributions of super-logarithmic depth Clifford circuits are not sample-efficiently learnable.
We show that in a more powerful oracle model, namely when directly given access to samples, the output distributions of local Clifford circuits are computationally efficiently PAC learnable.
arXiv Detail & Related papers (2021-10-11T18:00:20Z)
- Assessment of image generation by quantum annealer [0.0]
A quantum annealer may also serve as a fast sampler for the Ising spin-glass problem.
In this study, we focused on the performance of a quantum annealer as a generative model.
arXiv Detail & Related papers (2021-03-15T13:24:05Z)
- Defence against adversarial attacks using classical and quantum-enhanced Boltzmann machines [64.62510681492994]
Generative models attempt to learn the distribution underlying a dataset, making them inherently more robust to small perturbations.
We find improvements ranging from 5% to 72% against attacks with Boltzmann machines on the MNIST dataset.
arXiv Detail & Related papers (2020-12-21T19:00:03Z)
- Generative and discriminative training of Boltzmann machine through Quantum annealing [0.0]
A hybrid quantum-classical method for learning Boltzmann machines (BM) is presented.
The cost function for learning BM is defined as a weighted sum of Kullback-Leibler (KL) divergence and Negative Conditional Log-Likelihood (NCLL).
A Newton-Raphson optimization scheme is presented.
arXiv Detail & Related papers (2020-02-03T14:41:52Z)
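The last entry's weighted cost can be illustrated with a toy sketch, under loudly stated simplifications: the model is a tiny Boltzmann-style distribution over four states, the conditional NCLL is replaced by a plain (unconditional) negative log-likelihood, and the optimizer is ordinary gradient descent rather than the paper's Newton-Raphson scheme. All numbers are made up for illustration.

```python
import numpy as np

def model_probs(theta):
    # Boltzmann-style model: p(s) ∝ exp(-theta_s) over a handful of states.
    w = np.exp(-theta)
    return w / w.sum()

def weighted_cost(theta, p_data, lam):
    # lam * KL(data || model) + (1 - lam) * NLL of the data under the model.
    # (The paper's NCLL is conditional; here it is simplified to a plain NLL.)
    p = model_probs(theta)
    kl = np.sum(p_data * np.log(p_data / p))
    nll = -np.sum(p_data * np.log(p))
    return lam * kl + (1.0 - lam) * nll

p_data = np.array([0.4, 0.3, 0.2, 0.1])  # illustrative data distribution
theta = np.zeros(4)                      # start from the uniform model
lam = 0.5

start = weighted_cost(theta, p_data, lam)
for _ in range(400):
    # Analytic gradient: for both terms d/dtheta = p_data - p, since an
    # unconditional NLL differs from the KL term only by the constant
    # data entropy. Gradient descent stands in for Newton-Raphson here.
    grad = p_data - model_probs(theta)
    theta -= 0.5 * grad
end = weighted_cost(theta, p_data, lam)
```

With a conditional likelihood, as in the paper, the two terms pull the parameters in genuinely different directions and the weight `lam` trades generative against discriminative performance; in this simplified unconditional version both terms share the same minimizer, `p = p_data`.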
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences of its use.