Quantum Long Short-Term Memory
- URL: http://arxiv.org/abs/2009.01783v1
- Date: Thu, 3 Sep 2020 16:41:09 GMT
- Title: Quantum Long Short-Term Memory
- Authors: Samuel Yen-Chi Chen, Shinjae Yoo, and Yao-Lung L. Fang
- Abstract summary: Long short-term memory (LSTM) is a recurrent neural network (RNN) architecture for modeling sequence data with temporal dependencies.
We propose a hybrid quantum-classical model of LSTM, which we dub QLSTM.
Our work paves the way toward implementing machine learning algorithms for sequence modeling on noisy intermediate-scale quantum (NISQ) devices.
- Score: 3.675884635364471
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Long short-term memory (LSTM) is a kind of recurrent neural network (RNN)
for modeling sequence data with temporal dependencies, and its effectiveness has
been extensively established. In this work, we propose a hybrid
quantum-classical model of LSTM, which we dub QLSTM. We demonstrate that the
proposed model successfully learns several kinds of temporal data. In
particular, we show that for certain testing cases, this quantum version of
LSTM converges faster, or equivalently, reaches a better accuracy, than its
classical counterpart. Due to the variational nature of our approach, the
requirements on qubit counts and circuit depth are eased, and our work thus
paves the way toward implementing machine learning algorithms for sequence
modeling on noisy intermediate-scale quantum (NISQ) devices.
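To make the hybrid design concrete, here is a minimal sketch of a QLSTM-style cell, assuming a PennyLane + PyTorch stack: each gate's affine transform in a classical LSTM is replaced by a small variational quantum circuit (VQC). The class name QLSTMCell, the qubit count, the ansatz choice, and the classical projection layers are illustrative assumptions, not the authors' reference implementation.

```python
# Minimal QLSTM cell sketch (assumed PennyLane + PyTorch stack).
# Each LSTM gate's affine map is replaced by a small VQC; names and
# sizes are illustrative, not the paper's code.
import pennylane as qml
import torch
import torch.nn as nn

n_qubits = 4   # small register, in keeping with NISQ constraints
n_layers = 2   # shallow variational depth

dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev, interface="torch")
def vqc(inputs, weights):
    # Angle-encode the classical features, apply a shallow entangling
    # ansatz, and read out one Pauli-Z expectation per wire.
    qml.AngleEmbedding(inputs, wires=range(n_qubits))
    qml.BasicEntanglerLayers(weights, wires=range(n_qubits))
    return [qml.expval(qml.PauliZ(w)) for w in range(n_qubits)]

weight_shapes = {"weights": (n_layers, n_qubits)}

class QLSTMCell(nn.Module):
    """LSTM cell whose forget/input/output/candidate gates use VQCs."""
    def __init__(self, input_size, hidden_size):
        super().__init__()
        # Classical projections squeeze [x_t; h_{t-1}] into the qubit
        # register and stretch the readout back out; shared for brevity.
        self.proj_in = nn.Linear(input_size + hidden_size, n_qubits)
        self.proj_out = nn.Linear(n_qubits, hidden_size)
        self.gates = nn.ModuleDict(
            {g: qml.qnn.TorchLayer(vqc, weight_shapes) for g in "fioc"}
        )

    def forward(self, x, state):
        h, c = state
        v = self.proj_in(torch.cat([x, h], dim=-1))
        f = torch.sigmoid(self.proj_out(self.gates["f"](v)))  # forget
        i = torch.sigmoid(self.proj_out(self.gates["i"](v)))  # input
        o = torch.sigmoid(self.proj_out(self.gates["o"](v)))  # output
        g = torch.tanh(self.proj_out(self.gates["c"](v)))     # candidate
        c = f * c + i * g
        h = o * torch.tanh(c)
        return h, (h, c)

# Smoke test: one step on a batch of one scalar input.
cell = QLSTMCell(input_size=1, hidden_size=8)
h0 = c0 = torch.zeros(1, 8)
y, (h1, c1) = cell(torch.randn(1, 1), (h0, c0))
```

Because the quantum weights live inside TorchLayer modules, the whole cell trains end to end with an ordinary classical optimizer, consistent with the variational framing that keeps qubit counts and circuit depth modest.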
Related papers
- Fourier Neural Operators for Learning Dynamics in Quantum Spin Systems [77.88054335119074]
We use FNOs to model the evolution of random quantum spin systems.
We apply FNOs to a compact set of Hamiltonian observables instead of the entire $2^n$-dimensional quantum wavefunction.
arXiv Detail & Related papers (2024-09-05T07:18:09Z) - Efficient Learning for Linear Properties of Bounded-Gate Quantum Circuits [63.733312560668274]
Given a quantum circuit containing d tunable RZ gates and G-d Clifford gates, can a learner perform purely classical inference to efficiently predict its linear properties?
We prove that the sample complexity scaling linearly in d is necessary and sufficient to achieve a small prediction error, while the corresponding computational complexity may scale exponentially in d.
We devise a kernel-based learning model capable of trading off prediction error and computational complexity, transitioning from exponential to polynomial scaling in many practical settings.
arXiv Detail & Related papers (2024-08-22T08:21:28Z) - Unlocking the Power of LSTM for Long Term Time Series Forecasting [27.245021350821638]
We propose a simple yet efficient algorithm named P-sLSTM built upon sLSTM by incorporating patching and channel independence.
These modifications substantially enhance sLSTM's performance in TSF, achieving state-of-the-art results.
arXiv Detail & Related papers (2024-08-19T13:59:26Z) - Learning to Program Variational Quantum Circuits with Fast Weights [3.6881738506505988]
This paper introduces the Quantum Fast Weight Programmers (QFWP) as a solution to the temporal or sequential learning challenge.
The proposed QFWP model achieves learning of temporal dependencies without necessitating the use of quantum recurrent neural networks.
Numerical simulations conducted in this study showcase the efficacy of the proposed QFWP model in both time-series prediction and RL tasks.
arXiv Detail & Related papers (2024-02-27T18:53:18Z) - Federated Quantum Long Short-term Memory (FedQLSTM) [58.50321380769256]
Quantum federated learning (QFL) can facilitate collaborative learning across multiple clients using quantum machine learning (QML) models.
No prior work has focused on developing a QFL framework that utilizes temporal data to approximate functions.
A novel QFL framework that is the first to integrate quantum long short-term memory (QLSTM) models with temporal data is proposed.
arXiv Detail & Related papers (2023-12-21T21:40:47Z) - Convolutional State Space Models for Long-Range Spatiotemporal Modeling [65.0993000439043]
ConvS5 is an efficient variant for long-range spatiotemporal modeling.
It significantly outperforms Transformers and ConvLSTM on a long horizon Moving-MNIST experiment while training 3X faster than ConvLSTM and generating samples 400X faster than Transformers.
arXiv Detail & Related papers (2023-10-30T16:11:06Z) - A Novel Stochastic LSTM Model Inspired by Quantum Machine Learning [0.0]
Works in quantum machine learning (QML) over the past few years indicate that QML algorithms can function just as well as their classical counterparts.
This work aims to elucidate whether it is possible to achieve some of QML's major reported benefits on classical machines by incorporating its stochasticity.
arXiv Detail & Related papers (2023-05-17T13:44:25Z) - Simulating the Mott transition on a noisy digital quantum computer via Cartan-based fast-forwarding circuits [62.73367618671969]
Dynamical mean-field theory (DMFT) maps the local Green's function of the Hubbard model to that of the Anderson impurity model.
Quantum and hybrid quantum-classical algorithms have been proposed to efficiently solve impurity models.
This work presents the first computation of the Mott phase transition using noisy digital quantum hardware.
arXiv Detail & Related papers (2021-12-10T17:32:15Z) - Simulation of Open Quantum Dynamics with Bootstrap-Based Long Short-Term Memory Recurrent Neural Network [0.0]
The bootstrap method is applied in the LSTM-NN construction and prediction.
The bootstrap-based LSTM-NN approach is a practical and powerful tool to propagate the long-time quantum dynamics of open systems (a minimal sketch of the idea appears after this list).
arXiv Detail & Related papers (2021-08-03T05:58:54Z) - Convolutional Tensor-Train LSTM for Spatio-temporal Learning [116.24172387469994]
We propose a higher-order LSTM model that can efficiently learn long-term correlations in the video sequence.
This is accomplished through a novel tensor train module that performs prediction by combining convolutional features across time.
Our results achieve state-of-the-art performance in a wide range of applications and datasets.
arXiv Detail & Related papers (2020-02-21T05:00:01Z)
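The bootstrap-based LSTM entry above describes a general recipe worth making concrete: resample the training trajectories with replacement, fit one LSTM per resample, and use the ensemble spread as a confidence estimate during long-time propagation. The sketch below assumes a plain PyTorch setup; SeqLSTM, fit_bootstrap_ensemble, and all hyperparameters are illustrative names, not the paper's code.

```python
# Hedged sketch of a bootstrap LSTM ensemble for trajectory propagation.
import torch
import torch.nn as nn

class SeqLSTM(nn.Module):
    """One-step-ahead predictor for a reduced dynamical trajectory."""
    def __init__(self, n_features, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_features)

    def forward(self, x):              # x: (batch, time, n_features)
        out, _ = self.lstm(x)
        return self.head(out[:, -1])   # predict the next time step

def fit_bootstrap_ensemble(X, y, n_models=10, epochs=200, lr=1e-3):
    """Train n_models LSTMs, each on a bootstrap resample of (X, y)."""
    models = []
    for _ in range(n_models):
        idx = torch.randint(0, len(X), (len(X),))   # sample with replacement
        model = SeqLSTM(X.shape[-1])
        opt = torch.optim.Adam(model.parameters(), lr=lr)
        for _ in range(epochs):
            opt.zero_grad()
            loss = nn.functional.mse_loss(model(X[idx]), y[idx])
            loss.backward()
            opt.step()
        models.append(model)
    return models

def predict_with_uncertainty(models, x):
    """Ensemble mean propagates the value; std flags untrustworthy steps."""
    preds = torch.stack([m(x) for m in models])
    return preds.mean(0), preds.std(0)
```

The ensemble standard deviation gives a cheap, distribution-free error bar, which is the property that makes long-time extrapolation of the learned dynamics auditable.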