A comparative study of different machine learning methods for
dissipative quantum dynamics
- URL: http://arxiv.org/abs/2207.02417v1
- Date: Wed, 6 Jul 2022 03:37:24 GMT
- Title: A comparative study of different machine learning methods for
dissipative quantum dynamics
- Authors: Luis E. Herrera Rodriguez, Arif Ullah, Kennet J. Rueda Espinosa, Pavlo
O. Dral, and Alexei A. Kananenka
- Abstract summary: We show that supervised machine learning algorithms can accurately and efficiently predict the long-time population dynamics of dissipative quantum systems.
We benchmarked 22 ML models on their ability to predict the long-time dynamics of a two-level quantum system linearly coupled to a harmonic bath.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: It has been recently shown that supervised machine learning (ML) algorithms
can accurately and efficiently predict the long-time population dynamics of
dissipative quantum systems given only short-time population dynamics. In the
present article we benchmarked 22 ML models on their ability to predict the
long-time dynamics of a two-level quantum system linearly coupled to a harmonic
bath. The models include uni- and bidirectional recurrent, convolutional, and
fully-connected feed-forward artificial neural networks (ANNs) and kernel ridge
regression (KRR) with linear and the most commonly used nonlinear kernels. Our
results suggest that KRR with nonlinear kernels can serve as an inexpensive yet
accurate way to simulate long-time dynamics in cases where the constant length
of input trajectories is appropriate. The convolutional gated recurrent unit
model is found to be the most efficient ANN model.
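As a concrete illustration of the KRR setup benchmarked here (a fixed-length short-time trajectory mapped to a later-time population), the following is a minimal sketch using scikit-learn's KernelRidge with an RBF kernel. The synthetic damped-cosine trajectories, shapes, and hyperparameters are illustrative stand-ins, not the paper's reference data or settings.

```python
# Minimal sketch, assuming the learning task described in the abstract:
# features = a fixed-length short-time population trajectory,
# target   = the population at a later time. Synthetic damped cosines
# stand in for reference quantum dynamics; all values are illustrative.
import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(0)
n_traj, n_in = 200, 50                      # trajectories, input time steps
rate = rng.uniform(0.1, 1.0, n_traj)        # random damping rates
t_in = np.linspace(0.0, 5.0, n_in)          # short-time grid

# "Short-time dynamics" inputs and a "long-time" target at t = 20.
X = np.exp(-rate[:, None] * t_in) * np.cos(2.0 * t_in)
y = np.exp(-rate * 20.0) * np.cos(2.0 * 20.0)

model = KernelRidge(kernel="rbf", alpha=1e-6, gamma=1.0)
model.fit(X[:150], y[:150])
print("test MAE:", np.abs(model.predict(X[150:]) - y[150:]).mean())
```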
Related papers
- Scalable Mechanistic Neural Networks [52.28945097811129]
We propose an enhanced neural network framework designed for scientific machine learning applications involving long temporal sequences.
By reformulating the original Mechanistic Neural Network (MNN), we reduce the computational time and space complexities from cubic and quadratic in the sequence length, respectively, to linear.
Extensive experiments demonstrate that the resulting Scalable MNN (S-MNN) matches the original MNN in precision while substantially reducing computational resources.
arXiv Detail & Related papers (2024-10-08T14:27:28Z)
- Oscillatory State-Space Models [61.923849241099184]
We propose Linear Oscillatory State-Space models (LinOSS) for efficiently learning on long sequences.
A stable discretization, integrated over time using fast associative parallel scans, yields the proposed state-space model.
We show that LinOSS is universal, i.e., it can approximate any continuous and causal operator mapping between time-varying functions.
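As a rough sketch of the mechanism named in this summary (a stably discretized forced harmonic oscillator per hidden channel), the loop below integrates one oscillatory state-space layer sequentially; the paper instead integrates with fast associative parallel scans, and all parameter names here are illustrative, not the paper's LinOSS layer.

```python
# Toy oscillatory state-space layer: each channel is a forced harmonic
# oscillator x'' = -omega^2 x + u, stepped with symplectic Euler, which
# keeps the linear oscillation energy-bounded for dt * omega < 2.
# Sequential for clarity; not the paper's parallel-scan implementation.
import numpy as np

def oscillatory_ssm(u, omega, dt=0.1):
    """u: (T, d) input sequence; omega: (d,) per-channel frequencies."""
    T, d = u.shape
    x = np.zeros(d)                 # position-like state (the output)
    v = np.zeros(d)                 # velocity-like state
    ys = np.empty((T, d))
    for t in range(T):
        v = v + dt * (-omega**2 * x + u[t])   # update velocity first...
        x = x + dt * v                        # ...then position (symplectic)
        ys[t] = x
    return ys

y = oscillatory_ssm(np.random.randn(100, 4), omega=np.ones(4))
print(y.shape)  # (100, 4)
```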
arXiv Detail & Related papers (2024-10-04T22:00:13Z)
- A short trajectory is all you need: A transformer-based model for long-time dissipative quantum dynamics [0.0]
We show that a deep artificial neural network can predict the long-time population dynamics of a quantum system coupled to a dissipative environment.
Our model is more accurate than classical forecasting models, such as recurrent neural networks.
arXiv Detail & Related papers (2024-09-17T16:17:52Z)
- Higher order quantum reservoir computing for non-intrusive reduced-order models [0.0]
The quantum reservoir computing (QRC) technique is a hybrid quantum-classical framework employing an ensemble of interconnected small quantum systems.
We show that QRC is able to predict complex nonlinear dynamical systems in a stable and accurate manner.
arXiv Detail & Related papers (2024-07-31T13:37:04Z)
- Modeling Latent Neural Dynamics with Gaussian Process Switching Linear Dynamical Systems [2.170477444239546]
We develop an approach that balances expressiveness and interpretability: the Gaussian Process Switching Linear Dynamical System (gpSLDS).
Our method builds on previous work modeling the latent state evolution via a stochastic differential equation whose nonlinear dynamics are described by a Gaussian process (GP-SDEs).
Our approach resolves key limitations of the rSLDS such as artifactual oscillations in dynamics near discrete state boundaries, while also providing posterior uncertainty estimates of the dynamics.
arXiv Detail & Related papers (2024-07-19T15:32:15Z)
- How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key for the performance of large-kernel convolutional neural network (LKCNN) models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z)
- Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which can then be applied in real time to multi-dimensional scattering data.
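The workflow described (train a differentiable surrogate once, then recover physical parameters from data by differentiating through it) can be sketched as follows; a small analytic function stands in for the trained network, the coupling J and the "scattering data" are fabricated for illustration, and PyTorch supplies the automatic differentiation.

```python
# Hedged sketch of fit-by-autodiff: gradient descent through a frozen
# differentiable surrogate recovers an unknown parameter from noisy data.
# The surrogate is an analytic stand-in for a trained network.
import torch

def surrogate(J, q):
    # Pretend this is a network trained to mimic model-Hamiltonian data.
    return torch.exp(-J * q) * torch.cos(3.0 * q)

q = torch.linspace(0.0, 3.0, 64)
data = surrogate(torch.tensor(1.7), q) + 0.01 * torch.randn(64)  # "experiment"

J = torch.tensor(1.0, requires_grad=True)     # initial parameter guess
opt = torch.optim.Adam([J], lr=0.05)
for _ in range(300):
    opt.zero_grad()
    loss = ((surrogate(J, q) - data) ** 2).mean()
    loss.backward()
    opt.step()
print("recovered J ~", J.item())              # close to the true value 1.7
```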
arXiv Detail & Related papers (2023-04-08T07:55:36Z)
- Stabilizing Machine Learning Prediction of Dynamics: Noise and Noise-inspired Regularization [58.720142291102135]
Recent work has shown that machine learning (ML) models can be trained to accurately forecast the dynamics of chaotic dynamical systems.
In the absence of mitigating techniques, this approach can result in artificially rapid error growth, leading to inaccurate predictions and/or climate instability.
We introduce Linearized Multi-Noise Training (LMNT), a regularization technique that deterministically approximates the effect of many small, independent noise realizations added to the model input during training.
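The core idea, replacing many random input-noise draws with one deterministic penalty, can be illustrated with the classical noise-as-regularization identity below; this is a generic numpy sketch of that equivalence for a tiny tanh model, not the paper's LMNT algorithm for reservoir computers.

```python
# Generic sketch: to leading order in sigma, training with small input
# noise equals adding a deterministic penalty sigma^2 * ||J||_F^2 on the
# model's input Jacobian J. Model and sizes are illustrative.
import numpy as np

def model(W, x):
    return np.tanh(W @ x)

def noise_averaged_loss(W, x, y, sigma, n_samples=2000):
    rng = np.random.default_rng(0)
    return np.mean([np.sum((model(W, x + sigma * rng.standard_normal(x.shape)) - y) ** 2)
                    for _ in range(n_samples)])

def deterministic_loss(W, x, y, sigma):
    m = model(W, x)
    J = (1.0 - m**2)[:, None] * W          # Jacobian of tanh(W x) w.r.t. x
    return np.sum((m - y) ** 2) + sigma**2 * np.sum(J**2)

W = np.random.default_rng(1).standard_normal((3, 5)) * 0.5
x, y = np.ones(5), np.zeros(3)
print(noise_averaged_loss(W, x, y, 0.01))   # approximately equal ...
print(deterministic_loss(W, x, y, 0.01))    # ... to this
```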
arXiv Detail & Related papers (2022-11-09T23:40:52Z)
- Realization of the Trajectory Propagation in the MM-SQC Dynamics by Using Machine Learning [4.629634111796585]
We apply a supervised machine learning (ML) approach to realize trajectory-based nonadiabatic dynamics.
The proposed idea is proven to be reliable and accurate in the simulations of the dynamics of several site-exciton electron-phonon coupling models.
arXiv Detail & Related papers (2022-07-11T01:23:36Z)
- Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
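A toy Euler discretization of this construction, where a bounded gate f modulates the effective time constant, is sketched below, assuming the standard liquid time-constant form dx/dt = -(1/tau + f(x, I)) x + f(x, I) A; parameter shapes and the gate are illustrative choices, not the paper's trained networks.

```python
# Toy liquid-time-constant step: the gate f in (0, 1) both speeds up the
# decay (shrinking the effective time constant) and drives x toward A,
# which keeps the state bounded. Shapes and parameters are illustrative.
import numpy as np

def ltc_step(x, I, W, A, tau, dt=0.02):
    f = 1.0 / (1.0 + np.exp(-(W @ np.concatenate([x, I]))))  # bounded gate
    return x + dt * (-(1.0 / tau + f) * x + f * A)

rng = np.random.default_rng(0)
n_units, n_inputs = 4, 2
W = rng.standard_normal((n_units, n_units + n_inputs))
A, tau = np.ones(n_units), 1.0

x = np.zeros(n_units)
for t in range(200):
    x = ltc_step(x, np.sin([0.10 * t, 0.05 * t]), W, A, tau)
print(x)   # stays within a bounded range set by A and tau
```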
arXiv Detail & Related papers (2020-06-08T09:53:35Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.