Recurrent convolutional neural networks for non-adiabatic dynamics of quantum-classical systems
- URL: http://arxiv.org/abs/2412.06631v1
- Date: Mon, 09 Dec 2024 16:23:25 GMT
- Title: Recurrent convolutional neural networks for non-adiabatic dynamics of quantum-classical systems
- Authors: Alex P. Ning, Lingyu Yang, Gia-Wei Chern,
- Abstract summary: We present an RNN model based on convolutional neural networks for modeling the nonlinear non-adiabatic dynamics of hybrid quantum-classical systems.
Validation studies show that the trained PARC model can reproduce the space-time evolution of a one-dimensional semi-classical Holstein model.
- Score: 1.2972104025246092
- License:
- Abstract: Recurrent neural networks (RNNs) have recently been extensively applied to model the time-evolution in fluid dynamics, weather prediction, and even chaotic systems thanks to their ability to capture temporal dependencies and sequential patterns in data. Here we present an RNN model based on convolutional neural networks for modeling the nonlinear non-adiabatic dynamics of hybrid quantum-classical systems. The dynamical evolution of the hybrid systems is governed by equations of motion for the classical degrees of freedom and the von Neumann equation for the electrons. The physics-aware recurrent convolutional (PARC) neural network structure incorporates a differentiator-integrator architecture that inductively models the spatiotemporal dynamics of generic physical systems. Validation studies show that the trained PARC model can reproduce the space-time evolution of a one-dimensional semi-classical Holstein model with comparable accuracy to direct numerical simulations. We also investigate the scaling of prediction errors with the size of the training dataset, prediction window, step-size, and model size.
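The differentiator-integrator recurrence at the heart of a PARC-style model can be sketched as follows. This is an illustrative toy, not the paper's implementation: the learned CNN differentiator is stood in for by a fixed 1-D convolution kernel, and the integrator is a plain explicit Euler step; all function names are hypothetical.

```python
import numpy as np

def differentiator(u, kernel):
    """Stand-in for the learned CNN: estimates du/dt from the current field."""
    pad = len(kernel) // 2
    # periodic padding so the 1-D convolution preserves the field length
    up = np.concatenate([u[-pad:], u, u[:pad]])
    return np.convolve(up, kernel, mode="valid")

def parc_rollout(u0, kernel, dt, steps):
    """Differentiator-integrator recurrence: u_{t+1} = u_t + dt * f_theta(u_t)."""
    u = u0.copy()
    traj = [u.copy()]
    for _ in range(steps):
        du = differentiator(u, kernel)  # differentiator: predict time derivative
        u = u + dt * du                 # integrator: explicit Euler update
        traj.append(u.copy())
    return np.stack(traj)

# Toy example: the kernel [1, -2, 1] mimics a discrete Laplacian, so the
# rollout behaves like a diffusion equation spreading an initial spike.
u0 = np.zeros(32)
u0[16] = 1.0
traj = parc_rollout(u0, np.array([1.0, -2.0, 1.0]), dt=0.1, steps=50)
```

In the actual model the differentiator would be a trained convolutional network and the integrator can be a higher-order scheme; the point is that the recurrence bakes the time-stepping structure of the underlying equations of motion into the architecture.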
Related papers
- Generative Modeling of Neural Dynamics via Latent Stochastic Differential Equations [1.5467259918426441]
We propose a framework for developing computational models of biological neural systems.
We employ a system of coupled differential equations with differentiable drift and diffusion functions.
We show that these hybrid models achieve competitive performance in predicting stimulus-evoked neural and behavioral responses.
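A system of coupled differential equations with drift and diffusion terms, as described above, is typically simulated with the Euler-Maruyama scheme. The sketch below is a generic illustration under that assumption, using a toy Ornstein-Uhlenbeck process rather than the paper's learned drift and diffusion functions; all names are hypothetical.

```python
import numpy as np

def euler_maruyama(z0, drift, diffusion, dt, steps, rng):
    """Simulate dz = f(z) dt + g(z) dW with the Euler-Maruyama scheme."""
    z = np.array(z0, dtype=float)
    path = [z.copy()]
    for _ in range(steps):
        dW = rng.normal(0.0, np.sqrt(dt), size=z.shape)  # Brownian increment
        z = z + drift(z) * dt + diffusion(z) * dW
        path.append(z.copy())
    return np.stack(path)

# Toy latent dynamics: mean-reverting drift, constant diffusion.
rng = np.random.default_rng(0)
path = euler_maruyama(
    z0=[2.0, -1.0],
    drift=lambda z: -0.5 * z,            # pulls the state back toward zero
    diffusion=lambda z: 0.1 * np.ones_like(z),  # small constant noise
    dt=0.01, steps=1000, rng=rng,
)
```

In a learned latent SDE the `drift` and `diffusion` callables would be differentiable neural networks trained end-to-end, which is what makes the hybrid model fit into standard gradient-based pipelines.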
arXiv Detail & Related papers (2024-12-01T09:36:03Z) - A short trajectory is all you need: A transformer-based model for long-time dissipative quantum dynamics [0.0]
We show that a deep artificial neural network can predict the long-time population dynamics of a quantum system coupled to a dissipative environment.
Our model is more accurate than classical forecasting models, such as recurrent neural networks.
arXiv Detail & Related papers (2024-09-17T16:17:52Z) - Latent Space Energy-based Neural ODEs [73.01344439786524]
This paper introduces novel deep dynamical models designed to represent continuous-time sequences.
We train the model using maximum likelihood estimation with Markov chain Monte Carlo.
Experimental results on oscillating systems, videos and real-world state sequences (MuJoCo) demonstrate that our model with the learnable energy-based prior outperforms existing counterparts.
arXiv Detail & Related papers (2024-09-05T18:14:22Z) - How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key for the performance of LKCNN models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z) - Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z) - A predictive physics-aware hybrid reduced order model for reacting flows [65.73506571113623]
A new hybrid predictive Reduced Order Model (ROM) is proposed to solve reacting flow problems.
The number of degrees of freedom is reduced from thousands of temporal points to a few POD modes with their corresponding temporal coefficients.
Two different deep learning architectures have been tested to predict the temporal coefficients.
arXiv Detail & Related papers (2023-01-24T08:39:20Z) - An advanced spatio-temporal convolutional recurrent neural network for storm surge predictions [73.4962254843935]
We study the capability of artificial neural network models to emulate storm surge based on the storm track/size/intensity history.
This study presents a neural network model that can predict storm surge, informed by a database of synthetic storm simulations.
arXiv Detail & Related papers (2022-04-18T23:42:18Z) - Cubature Kalman Filter Based Training of Hybrid Differential Equation Recurrent Neural Network Physiological Dynamic Models [13.637931956861758]
We show how we can approximate missing ordinary differential equations with known ODEs using a neural network approximation.
Results indicate that this RBSE approach to training the NN parameters yields better outcomes (measurement/state estimation accuracy) than training the neural network with backpropagation.
arXiv Detail & Related papers (2021-10-12T15:38:13Z) - Stochastic Recurrent Neural Network for Multistep Time Series Forecasting [0.0]
We leverage advances in deep generative models and the concept of state space models to propose an adaptation of the recurrent neural network for time series forecasting.
Our model preserves the architectural workings of a recurrent neural network for which all relevant information is encapsulated in its hidden states, and this flexibility allows our model to be easily integrated into any deep architecture for sequential modelling.
arXiv Detail & Related papers (2021-04-26T01:43:43Z) - Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
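The "networks of linear first-order dynamical systems" idea above can be sketched as a single recurrent cell whose state obeys dx/dt = -x/tau + f(x, I)(A - x), integrated with a semi-implicit Euler step. This is a minimal illustration, not the authors' implementation; the weight matrix, sizes, and function names are assumptions.

```python
import numpy as np

def ltc_step(x, I, tau, W, A, dt):
    """One semi-implicit Euler update of a liquid time-constant style cell:
    dx/dt = -x/tau + f(x, I) * (A - x), with f a bounded sigmoid gate."""
    f = 1.0 / (1.0 + np.exp(-(W @ np.concatenate([x, I]))))  # nonlinear gate
    # Solving the linearized update implicitly keeps the state bounded:
    # x_{t+1} = (x_t + dt * f * A) / (1 + dt * (1/tau + f))
    return (x + dt * f * A) / (1.0 + dt * (1.0 / tau + f))

rng = np.random.default_rng(1)
n, m = 4, 2                        # hidden units, input channels
W = rng.normal(size=(n, n + m))    # random (untrained) gate weights
x = np.zeros(n)
for _ in range(100):
    x = ltc_step(x, I=np.ones(m), tau=1.0, W=W, A=1.0, dt=0.05)
```

Because the gate f is bounded and the update is semi-implicit, the hidden state stays between 0 and A regardless of the input, which is the stability property the abstract highlights.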
arXiv Detail & Related papers (2020-06-08T09:53:35Z) - Physics-Incorporated Convolutional Recurrent Neural Networks for Source Identification and Forecasting of Dynamical Systems [10.689157154434499]
In this paper, we present a hybrid framework combining numerical physics-based models with deep learning for source identification.
We formulate our model PhICNet as a convolutional recurrent neural network (RNN) which is end-to-end trainable for predicting spatio-temporal evolution.
Experimental results show that the proposed model can forecast the dynamics for a relatively long time and identify the sources as well.
arXiv Detail & Related papers (2020-04-14T00:27:18Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it contains and is not responsible for any consequences of its use.