Density Matrix Emulation of Quantum Recurrent Neural Networks for
Multivariate Time Series Prediction
- URL: http://arxiv.org/abs/2310.20671v1
- Date: Tue, 31 Oct 2023 17:32:11 GMT
- Title: Density Matrix Emulation of Quantum Recurrent Neural Networks for
Multivariate Time Series Prediction
- Authors: José Daniel Viqueira, Daniel Faílde, Mariamo M. Juane, Andrés
Gómez and David Mera
- Abstract summary: Quantum Recurrent Neural Networks (QRNNs) are robust candidates to model and predict future values in multivariate time series.
We show how QRNNs can make accurate predictions of future values by capturing non-trivial patterns of input series with different complexities.
- Score: 0.07499722271664144
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Quantum Recurrent Neural Networks (QRNNs) are robust candidates to model and
predict future values in multivariate time series. However, the effective
implementation of some QRNN models is limited by the need for mid-circuit
measurements. These increase the requirements on quantum hardware, which in
the current NISQ era does not allow reliable computations. Emulation arises as
the main near-term alternative to explore the potential of QRNNs, but existing
quantum emulators are not dedicated to circuits with multiple intermediate
measurements. In this context, we design a specific emulation method that
relies on the density matrix formalism. The mathematical development is
provided explicitly as a compact formulation in tensor notation, which allows us to
show how the present and past information from a time series is transmitted
through the circuit, and how to reduce the computational cost in every time
step of the emulated network. In addition, we derive the analytical gradient
and the Hessian of the network outputs with respect to its trainable
parameters, with an eye toward gradient-based training and the noisy outputs
that would arise on real quantum processors. We finally test the presented
methods using a novel hardware-efficient ansatz and three diverse datasets that
include univariate and multivariate time series. Our results show how QRNNs can
make accurate predictions of future values by capturing non-trivial patterns of
input series with different complexities.
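The core idea of the abstract, emulating mid-circuit measurements with density matrices, can be illustrated with a minimal sketch. This is a generic illustration, not the paper's tensor-notation formulation: a non-selective measurement of one qubit maps the state as $\rho \to \sum_k P_k \rho P_k$, where the projectors $P_k$ are lifted to the full register, so the emulator never has to branch over measurement outcomes. All function names below are hypothetical.

```python
import numpy as np

# Single-qubit projectors onto |0> and |1>
P0 = np.array([[1, 0], [0, 0]], dtype=complex)
P1 = np.array([[0, 0], [0, 1]], dtype=complex)

def apply_unitary(rho, U):
    """Evolve a density matrix under a unitary: rho -> U rho U^dagger."""
    return U @ rho @ U.conj().T

def measure_qubit(rho, qubit, n_qubits):
    """Non-selective mid-circuit measurement of one qubit.

    Applies rho -> sum_k Pk rho Pk with each projector lifted to the
    full register via Kronecker products; also returns the outcome
    probabilities Tr(Pk rho Pk).
    """
    new_rho = np.zeros_like(rho)
    probs = []
    for P in (P0, P1):
        ops = [np.eye(2, dtype=complex)] * n_qubits
        ops[qubit] = P
        full = ops[0]
        for op in ops[1:]:
            full = np.kron(full, op)
        branch = full @ rho @ full  # projectors are Hermitian, so P^dagger = P
        probs.append(np.real(np.trace(branch)))
        new_rho += branch
    return new_rho, probs

# Example: 2-qubit register in |00><00|, Hadamard on qubit 0, then measure qubit 0
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
U = np.kron(H, np.eye(2, dtype=complex))
rho0 = np.zeros((4, 4), dtype=complex)
rho0[0, 0] = 1.0
rho1, probs = measure_qubit(apply_unitary(rho0, U), qubit=0, n_qubits=2)
# probs are [0.5, 0.5]; rho1 is the resulting (mixed) post-measurement state
```

Because the post-measurement state stays a single density matrix, this is why a density-matrix emulator suits circuits with many intermediate measurements: the cost does not grow with the number of measurement branches.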
Related papers
- Efficient Learning for Linear Properties of Bounded-Gate Quantum Circuits [63.733312560668274]
Given a quantum circuit containing $d$ tunable RZ gates and $G-d$ Clifford gates, can a learner perform purely classical inference to efficiently predict its linear properties?
We prove that the sample complexity scaling linearly in d is necessary and sufficient to achieve a small prediction error, while the corresponding computational complexity may scale exponentially in d.
We devise a kernel-based learning model capable of trading off prediction error and computational complexity, transitioning from exponential to polynomial scaling in many practical settings.
arXiv Detail & Related papers (2024-08-22T08:21:28Z) - A Quantum Optical Recurrent Neural Network for Online Processing of
Quantum Times Series [0.7087237546722617]
We show that a quantum optical recurrent neural network (QORNN) can enhance the transmission rate of quantum channels.
We also show that our model can counteract similar memory effects if they are unwanted.
We run a small-scale version of this last task on the photonic processor Borealis.
arXiv Detail & Related papers (2023-05-31T19:19:25Z) - TeD-Q: a tensor network enhanced distributed hybrid quantum machine
learning framework [59.07246314484875]
TeD-Q is an open-source software framework for quantum machine learning.
It seamlessly integrates classical machine learning libraries with quantum simulators.
It provides a graphical mode in which the quantum circuit and the training progress can be visualized in real-time.
arXiv Detail & Related papers (2023-01-13T09:35:05Z) - Power and limitations of single-qubit native quantum neural networks [5.526775342940154]
Quantum neural networks (QNNs) have emerged as a leading strategy to establish applications in machine learning, chemistry, and optimization.
We formulate a theoretical framework for the expressive ability of data re-uploading quantum neural networks.
arXiv Detail & Related papers (2022-05-16T17:58:27Z) - A quantum algorithm for training wide and deep classical neural networks [72.2614468437919]
We show that conditions amenable to classical trainability via gradient descent coincide with those necessary for efficiently solving quantum linear systems.
We numerically demonstrate that the MNIST image dataset satisfies such conditions.
We provide empirical evidence for $O(\log n)$ training of a convolutional neural network with pooling.
arXiv Detail & Related papers (2021-07-19T23:41:03Z) - Learning temporal data with variational quantum recurrent neural network [0.5658123802733283]
We propose a method for learning temporal data using a parametrized quantum circuit.
This work provides a way to exploit complex quantum dynamics for learning temporal data.
arXiv Detail & Related papers (2020-12-21T10:47:28Z) - Random Sampling Neural Network for Quantum Many-Body Problems [0.0]
We propose a general numerical method, Random Sampling Neural Networks (RSNN), to utilize the pattern recognition technique for the random sampling matrix elements of an interacting many-body system via a self-supervised learning approach.
Several exactly solvable 1D models, including Ising model with transverse field, Fermi-Hubbard model, and spin-$1/2$ $XXZ$ model, are used to test the applicability of RSNN.
arXiv Detail & Related papers (2020-11-10T15:52:44Z) - On the learnability of quantum neural networks [132.1981461292324]
We consider the learnability of the quantum neural network (QNN) built on the variational hybrid quantum-classical scheme.
We show that if a concept can be efficiently learned by QNN, then it can also be effectively learned by QNN even with gate noise.
arXiv Detail & Related papers (2020-07-24T06:34:34Z) - Recurrent Quantum Neural Networks [7.6146285961466]
Recurrent neural networks are the foundation of many sequence-to-sequence models in machine learning.
We construct a quantum recurrent neural network (QRNN) with demonstrable performance on non-trivial tasks.
We evaluate the QRNN on MNIST classification, both by feeding the QRNN each image pixel by pixel and by utilising modern data augmentation as a preprocessing step.
arXiv Detail & Related papers (2020-06-25T17:59:44Z) - Multipole Graph Neural Operator for Parametric Partial Differential
Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z) - Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior, yield superior expressivity within the family of neural ordinary differential equations.
arXiv Detail & Related papers (2020-06-08T09:53:35Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.