Quantum Long Short-Term Memory
- URL: http://arxiv.org/abs/2009.01783v1
- Date: Thu, 3 Sep 2020 16:41:09 GMT
- Title: Quantum Long Short-Term Memory
- Authors: Samuel Yen-Chi Chen, Shinjae Yoo, and Yao-Lung L. Fang
- Abstract summary: Long short-term memory (LSTM) is a recurrent neural network (RNN) for sequence and temporal dependency data modeling.
We propose a hybrid quantum-classical model of LSTM, which we dub QLSTM.
Our work paves the way toward implementing machine learning algorithms for sequence modeling on noisy intermediate-scale quantum (NISQ) devices.
- Score: 3.675884635364471
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Long short-term memory (LSTM) is a kind of recurrent neural network (RNN)
for sequence and temporal dependency data modeling and its effectiveness has
been extensively established. In this work, we propose a hybrid
quantum-classical model of LSTM, which we dub QLSTM. We demonstrate that the
proposed model successfully learns several kinds of temporal data. In
particular, we show that for certain testing cases, this quantum version of
LSTM converges faster, or equivalently, reaches a better accuracy, than its
classical counterpart. Due to the variational nature of our approach, the
requirements on qubit counts and circuit depth are eased, and our work thus
paves the way toward implementing machine learning algorithms for sequence
modeling on noisy intermediate-scale quantum (NISQ) devices.
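The core idea of the abstract, replacing the classical gate computations of an LSTM cell with variational quantum circuits (VQCs), can be illustrated with a toy sketch. This is a hypothetical, minimal illustration, not the paper's actual architecture: it assumes a single-qubit circuit with scalar inputs, where RY(x) angle-encodes the input, a trainable RY(theta) acts as the variational layer, and the measured Pauli-Z expectation value (which reduces analytically to cos(x + theta)) plays the role of each gate's activation.

```python
import math

def vqc_expval(x, theta):
    """Toy 1-qubit variational circuit: RY(x) angle-encodes the input,
    RY(theta) is the trainable layer. For this circuit the Pauli-Z
    expectation value is analytically <Z> = cos(x + theta)."""
    return math.cos(x + theta)

def qlstm_cell_step(x, h, c, params):
    """One QLSTM step: each classical gate activation is replaced by
    the expectation value of a (toy) variational quantum circuit.
    Gates are rescaled from [-1, 1] to [0, 1] where the classical
    LSTM would use a sigmoid."""
    v = h + x  # combine hidden state and input (scalar toy case)
    f = 0.5 * (1 + vqc_expval(v, params["f"]))  # forget gate in [0, 1]
    i = 0.5 * (1 + vqc_expval(v, params["i"]))  # input gate in [0, 1]
    g = vqc_expval(v, params["g"])              # candidate value in [-1, 1]
    o = 0.5 * (1 + vqc_expval(v, params["o"]))  # output gate in [0, 1]
    c_new = f * c + i * g                       # cell-state update
    h_new = o * math.tanh(c_new)                # hidden-state update
    return h_new, c_new

# Run the cell over a short sequence with (hypothetical) fixed parameters.
params = {"f": 0.0, "i": 0.0, "g": 0.0, "o": 0.0}
h, c = 0.0, 0.0
for x in [0.1, 0.5, -0.3]:
    h, c = qlstm_cell_step(x, h, c, params)
```

In the paper's actual setting the VQCs act on multi-qubit registers and their parameters are trained variationally, which is what eases the qubit-count and circuit-depth requirements mentioned above.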
Related papers
- Quantum Implicit Neural Representations for 3D Scene Reconstruction and Novel View Synthesis [42.138439537056954]
Implicit neural representations (INRs) have become a powerful paradigm for continuous signal modeling and 3D scene reconstruction. We present Quantum Neural Radiance Fields (Q-NeRF), the first hybrid quantum-classical framework for neural radiance field rendering.
arXiv Detail & Related papers (2025-12-14T13:24:11Z) - QKAN-LSTM: Quantum-inspired Kolmogorov-Arnold Long Short-term Memory [11.996286932948124]
Long short-term memory (LSTM) models are central to sequential modeling tasks in domains such as urban telecommunication forecasting. We propose the Quantum-inspired Kolmogorov-Arnold Long Short-Term Memory (QKAN-LSTM), which integrates Data Re-Uploading Activation modules into the gating structure of LSTMs.
arXiv Detail & Related papers (2025-12-04T18:03:23Z) - Towards Quantum Enhanced Adversarial Robustness with Rydberg Reservoir Learning [45.92935470813908]
Quantum reservoir computing (QRC) leverages the high-dimensional, nonlinear dynamics inherent in quantum many-body systems. Recent studies indicate that quantum models based on variational circuits remain susceptible to adversarial perturbations. We present the first systematic evaluation of adversarial robustness in a QRC-based learning model.
arXiv Detail & Related papers (2025-10-15T12:17:23Z) - Fast surrogate modelling of EIT in atomic quantum systems using LSTM neural networks [0.0]
We develop a Long Short-Term Memory neural network capable of replicating the output of optical quantum simulations with high accuracy and significantly reduced computational cost. We focus on applying this technique to Doppler-broadened Electromagnetically Induced Transparency in a ladder-type scheme for Rydberg-based sensing. We demonstrate the effectiveness of the LSTM model on this representative optical quantum system, establishing it as a surrogate tool capable of supporting real-time signal processing and feedback-based optimisation.
arXiv Detail & Related papers (2025-10-02T22:30:40Z) - Toward Practical Quantum Machine Learning: A Novel Hybrid Quantum LSTM for Fraud Detection [0.1398098625978622]
We present a novel hybrid quantum-classical neural network architecture for fraud detection.
By leveraging quantum phenomena such as superposition and entanglement, our model enhances the feature representation of sequential transaction data.
Results demonstrate competitive improvements in accuracy, precision, recall, and F1 score relative to a conventional LSTM baseline.
arXiv Detail & Related papers (2025-04-30T19:09:12Z) - Federated Quantum-Train Long Short-Term Memory for Gravitational Wave Signal [3.360429911727189]
We present Federated QT-LSTM, a novel framework that combines the Quantum-Train (QT) methodology with Long Short-Term Memory (LSTM) networks in a federated learning setup.
By leveraging quantum neural networks (QNNs) to generate classical LSTM model parameters during training, the framework effectively addresses challenges in model compression, scalability, and computational efficiency.
arXiv Detail & Related papers (2025-03-20T11:34:13Z) - Toward Large-Scale Distributed Quantum Long Short-Term Memory with Modular Quantum Computers [5.673361333697935]
We introduce a Distributed Quantum Long Short-Term Memory (QLSTM) framework to address scalability challenges on Noisy Intermediate-Scale Quantum (NISQ) devices.
QLSTM captures long-range temporal dependencies, while a distributed architecture partitions the underlying Variational Quantum Circuits into smaller, manageable subcircuits.
We demonstrate that the distributed QLSTM achieves stable convergence and improved training dynamics compared to classical approaches.
arXiv Detail & Related papers (2025-03-18T10:07:34Z) - Enhancing Open Quantum Dynamics Simulations Using Neural Network-Based Non-Markovian Stochastic Schrödinger Equation Method [2.9413085575648235]
We propose a scheme that combines neural network techniques with simulations of the non-Markovian stochastic Schrödinger equation.
This approach significantly reduces the number of trajectories required for long-time simulations, particularly at low temperatures.
arXiv Detail & Related papers (2024-11-24T16:57:07Z) - Quantum Kernel-Based Long Short-term Memory [0.30723404270319693]
We introduce the Quantum Kernel-Based Long Short-Term Memory (QK-LSTM) network to capture complex, non-linear patterns in sequential data.
This quantum-enhanced architecture demonstrates efficient convergence, robust loss minimization, and model compactness.
Benchmark comparisons reveal that QK-LSTM achieves performance on par with classical LSTM models, yet with fewer parameters.
arXiv Detail & Related papers (2024-11-20T11:39:30Z) - Fourier Neural Operators for Learning Dynamics in Quantum Spin Systems [77.88054335119074]
We use FNOs to model the evolution of random quantum spin systems.
We apply FNOs to a compact set of Hamiltonian observables instead of the entire $2^n$-dimensional quantum wavefunction.
arXiv Detail & Related papers (2024-09-05T07:18:09Z) - Efficient Learning for Linear Properties of Bounded-Gate Quantum Circuits [63.733312560668274]
Given a quantum circuit containing d tunable RZ gates and G-d Clifford gates, can a learner perform purely classical inference to efficiently predict its linear properties?
We prove that the sample complexity scaling linearly in d is necessary and sufficient to achieve a small prediction error, while the corresponding computational complexity may scale exponentially in d.
We devise a kernel-based learning model capable of trading off prediction error and computational complexity, transitioning from exponential to polynomial scaling in many practical settings.
arXiv Detail & Related papers (2024-08-22T08:21:28Z) - Unlocking the Power of LSTM for Long Term Time Series Forecasting [27.245021350821638]
We propose a simple yet efficient algorithm named P-sLSTM built upon sLSTM by incorporating patching and channel independence.
These modifications substantially enhance sLSTM's performance in TSF, achieving state-of-the-art results.
arXiv Detail & Related papers (2024-08-19T13:59:26Z) - Compressed-sensing Lindbladian quantum tomography with trapped ions [44.99833362998488]
Characterizing the dynamics of quantum systems is a central task for the development of quantum information processors.
We propose two different improvements of Lindbladian quantum tomography (LQT) that alleviate previous shortcomings.
arXiv Detail & Related papers (2024-03-12T09:58:37Z) - Learning to Program Variational Quantum Circuits with Fast Weights [3.6881738506505988]
This paper introduces the Quantum Fast Weight Programmers (QFWP) as a solution to the temporal or sequential learning challenge.
The proposed QFWP model achieves learning of temporal dependencies without necessitating the use of quantum recurrent neural networks.
Numerical simulations conducted in this study showcase the efficacy of the proposed QFWP model in both time-series prediction and RL tasks.
arXiv Detail & Related papers (2024-02-27T18:53:18Z) - Federated Quantum Long Short-term Memory (FedQLSTM) [58.50321380769256]
Quantum federated learning (QFL) can facilitate collaborative learning across multiple clients using quantum machine learning (QML) models.
No prior work has focused on developing a QFL framework that utilizes temporal data to approximate functions.
A novel QFL framework that is the first to integrate quantum long short-term memory (QLSTM) models with temporal data is proposed.
arXiv Detail & Related papers (2023-12-21T21:40:47Z) - Convolutional State Space Models for Long-Range Spatiotemporal Modeling [65.0993000439043]
ConvS5 is an efficient variant for long-range spatiotemporal modeling.
It significantly outperforms Transformers and ConvLSTM on a long horizon Moving-MNIST experiment while training 3X faster than ConvLSTM and generating samples 400X faster than Transformers.
arXiv Detail & Related papers (2023-10-30T16:11:06Z) - Simulating the Mott transition on a noisy digital quantum computer via Cartan-based fast-forwarding circuits [62.73367618671969]
Dynamical mean-field theory (DMFT) maps the local Green's function of the Hubbard model to that of the Anderson impurity model.
Quantum and hybrid quantum-classical algorithms have been proposed to efficiently solve impurity models.
This work presents the first computation of the Mott phase transition using noisy digital quantum hardware.
arXiv Detail & Related papers (2021-12-10T17:32:15Z) - Simulation of Open Quantum Dynamics with Bootstrap-Based Long Short-Term Memory Recurrent Neural Network [0.0]
The bootstrap method is applied in the LSTM-NN construction and prediction.
The bootstrap-based LSTM-NN approach is a practical and powerful tool for propagating the long-time quantum dynamics of open systems.
arXiv Detail & Related papers (2021-08-03T05:58:54Z) - High-Accuracy and Low-Latency Speech Recognition with Two-Head Contextual Layer Trajectory LSTM Model [46.34788932277904]
We improve conventional hybrid LSTM acoustic models for high-accuracy and low-latency automatic speech recognition.
To achieve high accuracy, we use a contextual layer trajectory LSTM (cltLSTM), which decouples the temporal modeling and target classification tasks.
We further improve the training strategy with sequence-level teacher-student learning.
arXiv Detail & Related papers (2020-03-17T00:52:11Z) - Convolutional Tensor-Train LSTM for Spatio-temporal Learning [116.24172387469994]
We propose a higher-order LSTM model that can efficiently learn long-term correlations in the video sequence.
This is accomplished through a novel tensor train module that performs prediction by combining convolutional features across time.
Our results achieve state-of-the-art performance in a wide range of applications and datasets.
arXiv Detail & Related papers (2020-02-21T05:00:01Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.