Simulation of Open Quantum Dynamics with Bootstrap-Based Long Short-Term
Memory Recurrent Neural Network
- URL: http://arxiv.org/abs/2108.01310v2
- Date: Tue, 14 Sep 2021 03:02:22 GMT
- Title: Simulation of Open Quantum Dynamics with Bootstrap-Based Long Short-Term
Memory Recurrent Neural Network
- Authors: Kunni Lin, Jiawei Peng, Feng Long Gu and Zhenggang Lan
- Abstract summary: The bootstrap method is applied in the LSTM-NN construction and prediction.
The bootstrap-based LSTM-NN approach is a practical and powerful tool for propagating the long-time quantum dynamics of open systems.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The recurrent neural network with the long short-term memory cell (LSTM-NN)
is employed to simulate the long-time dynamics of open quantum systems. The
bootstrap method is applied in the LSTM-NN construction and prediction, which
provides a Monte Carlo estimate of the forecasting confidence interval. Within
this approach, a large number of LSTM-NNs are constructed by resampling
time-series sequences that were obtained from the early-stage quantum evolution
given by the numerically exact multilayer multiconfigurational time-dependent
Hartree method. The resulting LSTM-NN ensemble is used for the reliable propagation
of the long-time quantum dynamics, and the simulated result is highly consistent
with the exact evolution. The forecasting uncertainty, which partially reflects
the reliability of the LSTM-NN prediction, is also given. This demonstrates that the
bootstrap-based LSTM-NN approach is a practical and powerful tool for propagating
the long-time quantum dynamics of open systems with high accuracy and low
computational cost.
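The abstract above describes the core recipe: resample the early-time training sequences with replacement, fit one model per resample, propagate each model forward, and read a confidence band off the ensemble spread. The stdlib-only sketch below illustrates that bootstrap logic on a hypothetical damped-oscillation signal, with a simple AR(2) predictor standing in for the LSTM-NN (the paper trains recurrent networks on ML-MCTDH data; everything here is an illustrative assumption).

```python
import math
import random

random.seed(0)

# "Early-stage dynamics": a damped oscillation standing in for the short-time
# population traces produced by a numerically exact method (hypothetical data).
series = [math.exp(-0.005 * k) * math.cos(0.1 * k) for k in range(120)]

def make_pairs(xs, order=2):
    """Sliding-window (inputs, target) pairs for an AR(order) model."""
    return [(xs[i:i + order], xs[i + order]) for i in range(len(xs) - order)]

def fit_ar2(pairs):
    """Least-squares AR(2) fit via the 2x2 normal equations."""
    s11 = s12 = s22 = b1 = b2 = 0.0
    for (x1, x2), y in pairs:
        s11 += x1 * x1; s12 += x1 * x2; s22 += x2 * x2
        b1 += x1 * y;   b2 += x2 * y
    det = s11 * s22 - s12 * s12
    return ((s22 * b1 - s12 * b2) / det, (s11 * b2 - s12 * b1) / det)

def forecast(coeffs, history, n_steps):
    """Iterative multi-step prediction, feeding each output back in."""
    a1, a2 = coeffs
    buf, out = list(history[-2:]), []
    for _ in range(n_steps):
        nxt = a1 * buf[0] + a2 * buf[1]
        out.append(nxt)
        buf = [buf[1], nxt]
    return out

pairs = make_pairs(series)
n_models, horizon = 200, 80

# Bootstrap: resample the training pairs with replacement, fit one model per
# resample, then run every ensemble member over the forecast horizon.
all_runs = []
for _ in range(n_models):
    resampled = [random.choice(pairs) for _ in pairs]
    all_runs.append(forecast(fit_ar2(resampled), series, horizon))

# Ensemble mean and a percentile confidence band at each forecast step.
mean_path = [sum(run[k] for run in all_runs) / n_models for k in range(horizon)]
lo_band = [sorted(run[k] for run in all_runs)[int(0.025 * n_models)] for k in range(horizon)]
hi_band = [sorted(run[k] for run in all_runs)[int(0.975 * n_models)] for k in range(horizon)]
```

The band width at each step is the Monte Carlo estimate of forecasting uncertainty that the abstract refers to; with a real LSTM ensemble it widens as the propagation moves further from the training window.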
Related papers
- A Novel Quantum LSTM Network [2.938337278931738]
This paper introduces the Quantum LSTM (qLSTM) model, which integrates quantum computing principles with traditional LSTM networks.
Our qLSTM model aims to address the limitations of traditional LSTMs, providing a robust framework for more efficient and effective sequential data processing.
arXiv Detail & Related papers (2024-06-13T10:26:14Z)
- Federated Quantum Long Short-term Memory (FedQLSTM) [58.50321380769256]
Quantum federated learning (QFL) can facilitate collaborative learning across multiple clients using quantum machine learning (QML) models.
No prior work has focused on developing a QFL framework that utilizes temporal data to approximate functions.
A novel QFL framework that is the first to integrate quantum long short-term memory (QLSTM) models with temporal data is proposed.
arXiv Detail & Related papers (2023-12-21T21:40:47Z)
- Disentangling Structured Components: Towards Adaptive, Interpretable and Scalable Time Series Forecasting [52.47493322446537]
We develop an adaptive, interpretable and scalable forecasting framework, which seeks to individually model each component of the spatial-temporal patterns.
SCNN works with a pre-defined generative process of MTS, which arithmetically characterizes the latent structure of the spatial-temporal patterns.
Extensive experiments are conducted to demonstrate that SCNN can achieve superior performance over state-of-the-art models on three real-world datasets.
arXiv Detail & Related papers (2023-05-22T13:39:44Z)
- Reservoir Computing via Quantum Recurrent Neural Networks [0.5999777817331317]
Existing VQC or QNN-based methods require significant computational resources to perform gradient-based optimization of quantum circuit parameters.
In this work, we approach sequential modeling by applying a reservoir computing (RC) framework to quantum recurrent neural networks (QRNN-RC).
Our numerical simulations show that the QRNN-RC can reach results comparable to fully trained QRNN models for several function approximation and time series tasks.
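Reservoir computing, which the QRNN-RC work ports to quantum recurrent networks, trains only a linear readout on top of a fixed, randomly initialized dynamical system, avoiding gradient-based optimization of the recurrent parameters. A minimal classical sketch (hypothetical sizes and weight scales; the paper's reservoir is a quantum circuit, not a tanh network):

```python
import math
import random

random.seed(1)
N = 20  # reservoir size (arbitrary choice for this sketch)

# Fixed random reservoir: recurrent weights W and input weights w_in are drawn
# once and never trained -- the defining trait of reservoir computing.
W = [[random.uniform(-0.3, 0.3) for _ in range(N)] for _ in range(N)]
w_in = [random.uniform(-1.0, 1.0) for _ in range(N)]

def run_reservoir(inputs):
    """Drive the fixed tanh reservoir with a scalar sequence; collect states."""
    x, states = [0.0] * N, []
    for u in inputs:
        x = [math.tanh(w_in[i] * u + sum(W[i][j] * x[j] for j in range(N)))
             for i in range(N)]
        states.append(list(x))
    return states

# Toy task: predict a sine wave one step ahead.
us = [math.sin(0.2 * k) for k in range(300)]
targets = us[1:]
states = run_reservoir(us)[:-1]

washout = 20  # discard the initial transient, as is customary in RC
S, Y = states[washout:], targets[washout:]

# Train only the linear readout: ridge-regularized normal equations solved
# with plain Gaussian elimination (stdlib-only, no numpy).
lam = 1e-4
A = [[sum(s[i] * s[j] for s in S) + (lam if i == j else 0.0)
      for j in range(N)] for i in range(N)]
b = [sum(s[i] * y for s, y in zip(S, Y)) for i in range(N)]

for col in range(N):  # forward elimination with partial pivoting
    piv = max(range(col, N), key=lambda r: abs(A[r][col]))
    A[col], A[piv] = A[piv], A[col]
    b[col], b[piv] = b[piv], b[col]
    for r in range(col + 1, N):
        f = A[r][col] / A[col][col]
        for c in range(col, N):
            A[r][c] -= f * A[col][c]
        b[r] -= f * b[col]
w_out = [0.0] * N
for r in range(N - 1, -1, -1):  # back substitution
    w_out[r] = (b[r] - sum(A[r][c] * w_out[c] for c in range(r + 1, N))) / A[r][r]

preds = [sum(w * s_i for w, s_i in zip(w_out, s)) for s in S]
mse = sum((p - y) ** 2 for p, y in zip(preds, Y)) / len(Y)
```

Only `w_out` is fitted; the reservoir itself is untouched, which is what makes RC far cheaper to train than a fully optimized recurrent model, quantum or classical.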
arXiv Detail & Related papers (2022-11-04T17:30:46Z)
- Automatic Evolution of Machine-Learning based Quantum Dynamics with Uncertainty Analysis [4.629634111796585]
The long short-term memory recurrent neural network (LSTM-RNN) models are used to simulate the long-time quantum dynamics.
This work builds an effective machine learning approach to simulate the dynamics evolution of open quantum systems.
arXiv Detail & Related papers (2022-05-07T08:53:55Z)
- Comparative Analysis of Interval Reachability for Robust Implicit and Feedforward Neural Networks [64.23331120621118]
We use interval reachability analysis to obtain robustness guarantees for implicit neural networks (INNs).
INNs are a class of implicit learning models that use implicit equations as layers.
We show that our approach performs at least as well as, and generally better than, applying state-of-the-art interval bound propagation methods to INNs.
arXiv Detail & Related papers (2022-04-01T03:31:27Z)
- Quantum Long Short-Term Memory [3.675884635364471]
Long short-term memory (LSTM) is a recurrent neural network (RNN) for sequence and temporal dependency data modeling.
We propose a hybrid quantum-classical model of LSTM, which we dub QLSTM.
Our work paves the way toward implementing machine learning algorithms for sequence modeling on noisy intermediate-scale quantum (NISQ) devices.
arXiv Detail & Related papers (2020-09-03T16:41:09Z)
- Automatic Remaining Useful Life Estimation Framework with Embedded Convolutional LSTM as the Backbone [5.927250637620123]
We propose a new LSTM variant called the embedded convolutional LSTM (ETM).
In ETM, a group of different 1D convolutions is embedded into the LSTM structure. Through this, temporal information is preserved both between and within windows.
We show the superiority of our proposed ETM approach over the state-of-the-art approaches on several widely used benchmark data sets for RUL Estimation.
arXiv Detail & Related papers (2020-08-10T08:34:20Z)
- Multi-Tones' Phase Coding (MTPC) of Interaural Time Difference by Spiking Neural Network [68.43026108936029]
We propose a pure spiking neural network (SNN) based computational model for precise sound localization in the noisy real-world environment.
We implement this algorithm in a real-time robotic system with a microphone array.
The experimental results show a mean azimuth error of 13 degrees, which surpasses the accuracy of other biologically plausible neuromorphic approaches for sound source localization.
arXiv Detail & Related papers (2020-07-07T08:22:56Z)
- Tensor train decompositions on recurrent networks [60.334946204107446]
Matrix product state (MPS) tensor trains have more attractive features than matrix product operators (MPOs) in terms of storage reduction and computing time at inference.
We show that MPS tensor trains should be at the forefront of LSTM network compression, through a theoretical analysis and practical experiments on NLP tasks.
arXiv Detail & Related papers (2020-06-09T18:25:39Z)
- Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
arXiv Detail & Related papers (2020-06-08T09:53:35Z)
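The liquid time-constant idea above can be sketched with a single unit built from a linear first-order dynamical system whose effective time constant depends on the input. The gate form and all parameter values below are illustrative assumptions, not the paper's exact equations:

```python
import math

def ltc_step(x, I, dt=0.01, tau=1.0, A=1.0, w=0.5, b=0.0):
    """One explicit-Euler step of a single liquid time-constant unit.

    The state obeys dx/dt = -(1/tau + f(I)) * x + f(I) * A, so the effective
    time constant 1/(1/tau + f(I)) shrinks as the input gate f(I) opens.
    """
    f = 1.0 / (1.0 + math.exp(-(w * I + b)))  # sigmoid gate (assumed form)
    return x + dt * (-(1.0 / tau + f) * x + f * A)

# Drive the unit with an arbitrary oscillating input; despite the varying
# drive, the state remains bounded, reflecting the stability claim above.
x, trace = 0.0, []
for k in range(2000):
    x = ltc_step(x, 5.0 * math.sin(0.01 * k))
    trace.append(x)
```

Because the decay rate 1/tau + f(I) is always positive and the drive term f(I) * A is bounded by A, the state is confined to [0, A] for a small enough step size, which is the "stable and bounded behavior" the blurb refers to.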
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.