Density Matrix Emulation of Quantum Recurrent Neural Networks for
Multivariate Time Series Prediction
- URL: http://arxiv.org/abs/2310.20671v1
- Date: Tue, 31 Oct 2023 17:32:11 GMT
- Title: Density Matrix Emulation of Quantum Recurrent Neural Networks for
Multivariate Time Series Prediction
- Authors: José Daniel Viqueira, Daniel Faílde, Mariamo M. Juane, Andrés
Gómez and David Mera
- Abstract summary: Quantum Recurrent Neural Networks (QRNNs) are robust candidates to model and predict future values in multivariate time series.
We show how QRNNs can make accurate predictions of future values by capturing non-trivial patterns of input series with different complexities.
- Score: 0.07499722271664144
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Quantum Recurrent Neural Networks (QRNNs) are robust candidates to model and
predict future values in multivariate time series. However, the effective
implementation of some QRNN models is limited by the need for mid-circuit
measurements. These increase the requirements on quantum hardware, which in
the current NISQ era does not allow reliable computations. Emulation arises as
the main near-term alternative to explore the potential of QRNNs, but existing
quantum emulators are not dedicated to circuits with multiple intermediate
measurements. In this context, we design a specific emulation method that
relies on the density matrix formalism. The mathematical development is explicitly
provided as a compact formulation using tensor notation. This allows us to
show how the present and past information from a time series is transmitted
through the circuit, and how to reduce the computational cost in every time
step of the emulated network. In addition, we derive the analytical gradient
and the Hessian of the network outputs with respect to its trainable
parameters, with an eye on gradient-based training and noisy outputs that would
appear when using real quantum processors. We finally test the presented
methods using a novel hardware-efficient ansatz and three diverse datasets that
include univariate and multivariate time series. Our results show how QRNNs can
make accurate predictions of future values by capturing non-trivial patterns of
input series with different complexities.
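To illustrate the kind of emulation described above, the following is a minimal NumPy sketch of a single QRNN time step under the density matrix formalism: a memory qubit carries information between steps, a readout qubit encodes the input and is measured mid-circuit, and the measurement-plus-reset is emulated by a partial trace followed by re-appending |0><0|. The two-qubit register, RY/CNOT gates, and parameter names are illustrative assumptions and do not reproduce the paper's hardware-efficient ansatz or its tensor formulation.

```python
import numpy as np

I2 = np.eye(2)
Z = np.diag([1.0, -1.0])

def ry(theta):
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

# CNOT with control = qubit 0 (memory) and target = qubit 1 (readout)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

def step(rho, x_t, theta):
    """One emulated QRNN time step on a 2-qubit density matrix.
    Qubit 0 keeps the memory between steps; qubit 1 is measured and reset."""
    # Encode the input on the readout qubit, then entangle it with the memory qubit.
    U = CNOT @ np.kron(ry(theta), ry(x_t))
    rho = U @ rho @ U.conj().T
    # Prediction: expectation value of Z on the readout qubit.
    y_t = np.real(np.trace(np.kron(I2, Z) @ rho))
    # Mid-circuit measurement + reset of qubit 1: trace it out and re-append |0><0|.
    rho4 = rho.reshape(2, 2, 2, 2)             # indices (q0, q1, q0', q1')
    rho_mem = np.einsum('ikjk->ij', rho4)      # partial trace over qubit 1
    ket0 = np.array([[1.0, 0.0], [0.0, 0.0]])  # |0><0|
    return np.kron(rho_mem, ket0), y_t

# Toy run: feed a short input series through the emulated recurrence.
rho = np.zeros((4, 4)); rho[0, 0] = 1.0        # initial state |00><00|
for x in [0.1, 0.5, 0.9]:
    rho, y = step(rho, x, theta=0.3)
    print(f"x = {x:.1f} -> prediction {y:+.4f}")
```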
Related papers
- MultiPDENet: PDE-embedded Learning with Multi-time-stepping for Accelerated Flow Simulation [48.41289705783405]
We propose a PDE-embedded network with multiscale time stepping (MultiPDENet).
In particular, we design a convolutional filter based on the structure of finite difference with a small number of parameters to optimize.
A Physics Block with a 4th-order Runge-Kutta integrator at the fine time scale is established that embeds the structure of PDEs to guide the prediction.
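For context on the physics block mentioned above, the following is a generic classical fourth-order Runge-Kutta step; it is a textbook sketch rather than MultiPDENet's learned block, and the test ODE dy/dt = -y is chosen only for illustration.

```python
import numpy as np

def rk4_step(f, t, y, dt):
    """One classical 4th-order Runge-Kutta step for dy/dt = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + dt / 2, y + dt / 2 * k1)
    k3 = f(t + dt / 2, y + dt / 2 * k2)
    k4 = f(t + dt, y + dt * k3)
    return y + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

# Toy check on dy/dt = -y, whose exact solution is exp(-t).
y, t, dt = 1.0, 0.0, 0.1
for _ in range(10):
    y = rk4_step(lambda t, y: -y, t, y, dt)
    t += dt
print(y, np.exp(-1.0))  # the two values agree to roughly 1e-8
```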
arXiv Detail & Related papers (2025-01-27T12:15:51Z) - Efficient Learning for Linear Properties of Bounded-Gate Quantum Circuits [63.733312560668274]
Given a quantum circuit containing d tunable RZ gates and G-d Clifford gates, can a learner perform purely classical inference to efficiently predict its linear properties?
We prove that the sample complexity scaling linearly in d is necessary and sufficient to achieve a small prediction error, while the corresponding computational complexity may scale exponentially in d.
We devise a kernel-based learning model capable of trading off prediction error and computational complexity, transitioning from exponential to polynomial scaling in many practical settings.
arXiv Detail & Related papers (2024-08-22T08:21:28Z) - A Quantum Optical Recurrent Neural Network for Online Processing of
Quantum Time Series [0.7087237546722617]
We show that a quantum optical recurrent neural network (QORNN) can enhance the transmission rate of quantum channels.
We also show that our model can counteract similar memory effects if they are unwanted.
We run a small-scale version of this last task on the photonic processor Borealis.
arXiv Detail & Related papers (2023-05-31T19:19:25Z) - Importance sampling for stochastic quantum simulations [68.8204255655161]
We build on the qDrift protocol, which constructs random product formulas by sampling Hamiltonian terms with probabilities proportional to their coefficients.
We show that the simulation cost can be reduced while achieving the same accuracy, by considering the individual simulation cost during the sampling stage.
Results are confirmed by numerical simulations performed on a lattice nuclear effective field theory.
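To make the sampling idea concrete, here is a minimal sketch of a qDrift-style random product formula for a toy two-qubit Hamiltonian; the Hamiltonian, step count, and uniform-probability baseline are illustrative assumptions, not the importance-sampling scheme of the paper.

```python
import numpy as np

# Pauli operators and a toy two-qubit Hamiltonian H = sum_j h_j P_j
I2 = np.eye(2); X = np.array([[0.0, 1.0], [1.0, 0.0]]); Z = np.diag([1.0, -1.0])
terms = [np.kron(Z, Z), np.kron(X, I2), np.kron(I2, X)]
coeffs = np.array([1.0, 0.4, 0.4])

def pauli_exp(a, P):
    """exp(-i a P) for a Pauli string P, using P @ P = I."""
    return np.cos(a) * np.eye(P.shape[0]) - 1j * np.sin(a) * P

def qdrift_unitary(t, n_samples, rng):
    """One random qDrift product formula approximating exp(-i H t):
    sample terms with probability |h_j| / lambda, evolve each for lambda * t / N."""
    lam = np.sum(np.abs(coeffs))
    probs = np.abs(coeffs) / lam
    U = np.eye(4, dtype=complex)
    for _ in range(n_samples):
        j = rng.choice(len(terms), p=probs)
        U = pauli_exp(np.sign(coeffs[j]) * lam * t / n_samples, terms[j]) @ U
    return U

# Compare a single random realisation against the exact evolution.
t = 0.5
H = sum(h * P for h, P in zip(coeffs, terms))
w, V = np.linalg.eigh(H)
U_exact = V @ np.diag(np.exp(-1j * w * t)) @ V.conj().T
U_rand = qdrift_unitary(t, n_samples=200, rng=np.random.default_rng(0))
print(np.linalg.norm(U_exact - U_rand))  # stochastic error; shrinks on average as N grows
```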
arXiv Detail & Related papers (2022-12-12T15:06:32Z) - Quantum Quantile Mechanics: Solving Stochastic Differential Equations
for Generating Time-Series [19.830330492689978]
We propose a quantum algorithm for sampling from solutions of stochastic differential equations (SDEs).
We represent the quantile function for an underlying probability distribution and extract samples as expectation values.
We test the method by simulating the Ornstein-Uhlenbeck process and sampling at times different from the initial point.
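For reference, the Ornstein-Uhlenbeck process used as a test case above can be simulated classically with the Euler-Maruyama scheme; the sketch below is that standard classical baseline, not the quantum quantile-function algorithm, and all parameter values are illustrative.

```python
import numpy as np

def ou_samples(theta=1.0, mu=0.0, sigma=0.3, x0=1.0,
               dt=0.01, n_steps=500, n_paths=1000, seed=0):
    """Euler-Maruyama simulation of dX_t = theta*(mu - X_t)*dt + sigma*dW_t."""
    rng = np.random.default_rng(seed)
    x = np.full(n_paths, x0, dtype=float)
    for _ in range(n_steps):
        dw = rng.normal(0.0, np.sqrt(dt), size=n_paths)  # Brownian increments
        x = x + theta * (mu - x) * dt + sigma * dw
    return x

samples = ou_samples()
# The OU process is Gaussian at any fixed time, so the sample mean can be checked
# against the closed form mean(t) = mu + (x0 - mu) * exp(-theta * t).
t = 0.01 * 500
print(samples.mean(), 0.0 + (1.0 - 0.0) * np.exp(-1.0 * t))
```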
arXiv Detail & Related papers (2021-08-06T16:14:24Z) - Learning temporal data with variational quantum recurrent neural network [0.5658123802733283]
We propose a method for learning temporal data using a parametrized quantum circuit.
This work provides a way to exploit complex quantum dynamics for learning temporal data.
arXiv Detail & Related papers (2020-12-21T10:47:28Z) - Random Sampling Neural Network for Quantum Many-Body Problems [0.0]
We propose a general numerical method, Random Sampling Neural Networks (RSNN), to utilize the pattern recognition technique for the random sampling matrix elements of an interacting many-body system via a self-supervised learning approach.
Several exactly solvable 1D models, including the transverse-field Ising model, the Fermi-Hubbard model, and the spin-$1/2$ $XXZ$ model, are used to test the applicability of RSNN.
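As an example of the kind of exactly solvable benchmark listed above, the sketch below builds and diagonalizes a small transverse-field Ising chain by brute force; the system size and couplings are illustrative, and this is the textbook model rather than the RSNN method itself.

```python
import numpy as np

X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.diag([1.0, -1.0])
I2 = np.eye(2)

def op_on(site_ops, n):
    """Tensor a dict {site: 2x2 operator} up to an n-site operator (identity elsewhere)."""
    out = np.array([[1.0]])
    for i in range(n):
        out = np.kron(out, site_ops.get(i, I2))
    return out

def tfim_hamiltonian(n=6, J=1.0, g=0.5):
    """H = -J sum_i Z_i Z_{i+1} - g sum_i X_i with open boundary conditions."""
    H = np.zeros((2**n, 2**n))
    for i in range(n - 1):
        H -= J * op_on({i: Z, i + 1: Z}, n)
    for i in range(n):
        H -= g * op_on({i: X}, n)
    return H

energies = np.linalg.eigvalsh(tfim_hamiltonian())
print("ground-state energy per site:", energies[0] / 6)
```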
arXiv Detail & Related papers (2020-11-10T15:52:44Z) - Large-scale Neural Solvers for Partial Differential Equations [48.7576911714538]
Solving partial differential equations (PDE) is an indispensable part of many branches of science as many processes can be modelled in terms of PDEs.
Recent numerical solvers require manual discretization of the underlying equation as well as sophisticated, tailored code for distributed computing.
We examine the applicability of continuous, mesh-free neural solvers for partial differential equations, namely physics-informed neural networks (PINNs).
We discuss the accuracy of GatedPINN with respect to analytical solutions as well as state-of-the-art numerical solvers, such as spectral solvers.
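To make the PINN idea concrete, the following is a minimal PyTorch sketch of a physics-informed residual loss for the 1D heat equation u_t = u_xx; the choice of equation, network size, and collocation sampling are assumptions for illustration and do not reproduce the GatedPINN architecture.

```python
import torch

# A small fully connected network u(x, t).
net = torch.nn.Sequential(
    torch.nn.Linear(2, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)

def pde_residual(x, t):
    """Residual of the heat equation u_t - u_xx at collocation points (x, t)."""
    x = x.requires_grad_(True)
    t = t.requires_grad_(True)
    u = net(torch.cat([x, t], dim=1))
    u_t = torch.autograd.grad(u, t, torch.ones_like(u), create_graph=True)[0]
    u_x = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    u_xx = torch.autograd.grad(u_x, x, torch.ones_like(u_x), create_graph=True)[0]
    return u_t - u_xx

# One training step: PDE residual on random interior points plus the
# initial condition u(x, 0) = sin(pi x), both as mean-squared penalties.
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
x, t = torch.rand(256, 1), torch.rand(256, 1)
x0, t0 = torch.rand(256, 1), torch.zeros(256, 1)
loss = pde_residual(x, t).pow(2).mean() \
     + (net(torch.cat([x0, t0], dim=1)) - torch.sin(torch.pi * x0)).pow(2).mean()
opt.zero_grad(); loss.backward(); opt.step()
print(float(loss))
```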
arXiv Detail & Related papers (2020-09-08T13:26:51Z) - Recurrent Quantum Neural Networks [7.6146285961466]
Recurrent neural networks are the foundation of many sequence-to-sequence models in machine learning.
We construct a quantum recurrent neural network (QRNN) with demonstrable performance on non-trivial tasks.
We evaluate the QRNN on MNIST classification, both by feeding the QRNN each image pixel by pixel and by utilising modern data augmentation as a preprocessing step.
arXiv Detail & Related papers (2020-06-25T17:59:44Z) - Multipole Graph Neural Operator for Parametric Partial Differential
Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z) - Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
arXiv Detail & Related papers (2020-06-08T09:53:35Z)