Statistical and machine learning approaches for prediction of long-time
excitation energy transfer dynamics
- URL: http://arxiv.org/abs/2210.14160v2
- Date: Thu, 27 Oct 2022 10:33:45 GMT
- Title: Statistical and machine learning approaches for prediction of long-time
excitation energy transfer dynamics
- Authors: Kimara Naicker, Ilya Sinayskiy, Francesco Petruccione
- Abstract summary: The objective here is to determine whether models such as SARIMA, CatBoost, Prophet, and convolutional and recurrent neural networks can bypass this requirement.
Our results suggest that the SARIMA model can serve as a computationally inexpensive yet accurate way to predict long-time dynamics.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: One of the approaches used to solve for the dynamics of open quantum systems
is the hierarchical equations of motion (HEOM). Although numerically exact,
this method requires immense computational resources. The objective here is to
determine whether models such as SARIMA, CatBoost, Prophet, and convolutional
and recurrent neural networks can bypass this requirement. We show this by
first solving the HEOM to generate a data set of time series depicting the
dissipative dynamics of excitation energy transfer in photosynthetic systems;
we then use this data to test the models' ability to predict the long-time
dynamics when only the initial short-time dynamics is given. Our results
suggest that the SARIMA model can serve as a computationally inexpensive yet
accurate way to predict long-time dynamics.
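The short-time-in, long-time-out setup can be sketched with a plain least-squares autoregressive fit standing in for the full SARIMA pipeline (a minimal illustration, not the paper's method; the damped oscillation below is a synthetic stand-in for an HEOM population trace, not the paper's data):

```python
import numpy as np

def fit_ar(series, p):
    """Least-squares fit of an AR(p) model: x[t] ~ sum_k a[k] * x[t-1-k]."""
    X = np.column_stack([series[p - 1 - k:len(series) - 1 - k] for k in range(p)])
    y = series[p:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

def forecast(series, coeffs, steps):
    """Roll the fitted AR model forward to extend the series."""
    hist = list(series)
    for _ in range(steps):
        hist.append(sum(c * hist[-1 - k] for k, c in enumerate(coeffs)))
    return np.array(hist[len(series):])

# Synthetic "short-time dynamics": a damped oscillation, illustrative only.
t = np.arange(200) * 0.05
x = np.exp(-0.1 * t) * np.cos(2.0 * t)

coeffs = fit_ar(x[:100], p=8)           # train on the short-time window
pred = forecast(x[:100], coeffs, 100)   # extrapolate the long-time tail
err = np.max(np.abs(pred - x[100:]))    # compare against the held-out tail
```

Because a sampled damped oscillation obeys an exact low-order linear recurrence, the AR fit extrapolates it almost perfectly; real HEOM traces are harder, which is where the seasonal and integrated terms of SARIMA come in.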
Related papers
- A short trajectory is all you need: A transformer-based model for long-time dissipative quantum dynamics [0.0]
We show that a deep artificial neural network can predict the long-time population dynamics of a quantum system coupled to a dissipative environment.
Our model is more accurate than classical forecasting models, such as recurrent neural networks.
arXiv Detail & Related papers (2024-09-17T16:17:52Z) - Latent Space Energy-based Neural ODEs [73.01344439786524]
This paper introduces a novel family of deep dynamical models designed to represent continuous-time sequence data.
We train the model using maximum likelihood estimation with Markov chain Monte Carlo.
Experiments on oscillating systems, videos and real-world state sequences (MuJoCo) illustrate that ODEs with the learnable energy-based prior outperform existing counterparts.
arXiv Detail & Related papers (2024-09-05T18:14:22Z) - Predicting Energy Budgets in Droplet Dynamics: A Recurrent Neural Network Approach [0.0]
This study applies Long Short-Term Memory (LSTM) networks to predict transient and static outputs for fluid flows under surface tension effects.
Using only dimensionless numbers and geometric time-series data from numerical simulations as input to a recurrent neural network (RNN) architecture, the study demonstrates accurate prediction of the energy budget.
arXiv Detail & Related papers (2024-03-24T13:32:42Z) - Koopman Invertible Autoencoder: Leveraging Forward and Backward Dynamics
for Temporal Modeling [13.38194491846739]
We propose a novel machine learning model based on Koopman operator theory, which we call Koopman Invertible Autoencoders (KIA).
KIA captures the inherent characteristic of the system by modeling both forward and backward dynamics in the infinite-dimensional Hilbert space.
This enables us to efficiently learn low-dimensional representations, resulting in more accurate predictions of long-term system behavior.
arXiv Detail & Related papers (2023-09-19T03:42:55Z) - Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
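The fit-a-differentiable-model-to-data idea can be illustrated with a tiny hand-differentiated example (a hypothetical setup: recovering a single frequency parameter by gradient descent, standing in for the paper's automatic differentiation through a neural surrogate of the Hamiltonian):

```python
import math

# "Experimental" data generated from a known ground truth (synthetic).
ts = [0.05 * k for k in range(60)]
data = [math.sin(2.0 * t) for t in ts]

def loss_and_grad(omega):
    """Squared error of the model sin(omega * t) against the data,
    with its analytic derivative with respect to omega."""
    loss, grad = 0.0, 0.0
    for t, y in zip(ts, data):
        r = math.sin(omega * t) - y
        loss += r * r
        grad += 2.0 * r * t * math.cos(omega * t)
    return loss, grad

# Gradient descent recovers the unknown parameter from a nearby guess,
# just as autodiff recovers Hamiltonian parameters from scattering data.
omega = 1.8
for _ in range(1000):
    _, g = loss_and_grad(omega)
    omega -= 0.002 * g
```

The point of building the surrogate once is exactly this: once gradients of the model with respect to its parameters are cheap, fitting new experimental data reduces to iterations like the loop above.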
arXiv Detail & Related papers (2023-04-08T07:55:36Z) - Physics-Inspired Temporal Learning of Quadrotor Dynamics for Accurate
Model Predictive Trajectory Tracking [76.27433308688592]
Accurately modeling a quadrotor's system dynamics is critical for guaranteeing agile, safe, and stable navigation.
We present a novel Physics-Inspired Temporal Convolutional Network (PI-TCN) approach to learning a quadrotor's system dynamics purely from robot experience.
Our approach combines the expressive power of sparse temporal convolutions and dense feed-forward connections to make accurate system predictions.
arXiv Detail & Related papers (2022-06-07T13:51:35Z) - A Note on Learning Rare Events in Molecular Dynamics using LSTM and
Transformer [4.80427355202687]
Recent successful examples of learning slow dynamics with LSTMs use simulation data of a low-dimensional reaction coordinate.
We show that three key factors significantly affect the performance of language-model learning: the dimensionality of the reaction coordinates, the temporal resolution, and the state partition.
arXiv Detail & Related papers (2021-07-14T09:26:36Z) - Stochastically forced ensemble dynamic mode decomposition for
forecasting and analysis of near-periodic systems [65.44033635330604]
We introduce a novel load forecasting method in which observed dynamics are modeled as a forced linear system.
We show that its use of intrinsic linear dynamics offers a number of desirable properties in terms of interpretability and parsimony.
Results are presented for a test case using load data from an electrical grid.
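The linear-dynamics view can be sketched with plain (unforced) dynamic mode decomposition on a delay-embedded near-periodic signal (a toy stand-in for grid load data; the paper's ensemble and stochastic-forcing terms are omitted):

```python
import numpy as np

def dmd(X, Y, rank):
    """Exact DMD: a linear operator A with Y ~ A X, via rank-truncated SVD of X."""
    U, s, Vh = np.linalg.svd(X, full_matrices=False)
    U, s, Vh = U[:, :rank], s[:rank], Vh[:rank]
    A_tilde = U.conj().T @ Y @ Vh.conj().T / s  # operator in the POD basis
    return U @ A_tilde @ U.conj().T             # lift back to state space

# Near-periodic toy signal (two incommensurate-looking tones), illustrative only.
t = np.arange(300) * 0.1
signal = np.sin(t) + 0.5 * np.sin(3 * t)

# Delay embedding turns the scalar series into state snapshots.
d = 10
snaps = np.column_stack([signal[i:i + d] for i in range(len(signal) - d)])
A = dmd(snaps[:, :-1], snaps[:, 1:], rank=4)

# One-step prediction from the last observed state.
pred = A @ snaps[:, -1]
err = np.max(np.abs(pred - signal[-d:]))
```

Because the two-tone signal lives in a four-dimensional linear-recurrence subspace, the rank-4 operator reproduces the shift map essentially exactly, which is the interpretability and parsimony the blurb refers to: the learned modes are just the signal's frequencies.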
arXiv Detail & Related papers (2020-10-08T20:25:52Z) - Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
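A single such unit can be sketched as an Euler integration of a first-order system whose effective time constant depends on the input (an illustrative liquid-time-constant-style gate; the parameter values here are arbitrary, not from the paper):

```python
import math

def ltc_step(x, inp, dt, tau=1.0, A=1.0, w=0.5, b=0.0):
    """One Euler step of a liquid-time-constant-style unit:
    dx/dt = -(1/tau + f) * x + f * A, with f a sigmoid gate on the input.
    The effective time constant tau / (1 + tau * f) varies with the input,
    and the state stays bounded in [0, A]."""
    f = 1.0 / (1.0 + math.exp(-(w * inp + b)))  # input-dependent gate
    return x + dt * (-(1.0 / tau + f) * x + f * A)

# Drive the unit with a step input and record the bounded response.
x, trace = 0.0, []
for k in range(200):
    inp = 1.0 if k >= 50 else 0.0
    x = ltc_step(x, inp, dt=0.05)
    trace.append(x)
```

The boundedness is visible directly: the fixed point f*A/(1/tau + f) is always below A, which is the stable-and-bounded behavior the blurb claims for the full networks of such units.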
arXiv Detail & Related papers (2020-06-08T09:53:35Z) - Convolutional Tensor-Train LSTM for Spatio-temporal Learning [116.24172387469994]
We propose a higher-order LSTM model that can efficiently learn long-term correlations in the video sequence.
This is accomplished through a novel tensor train module that performs prediction by combining convolutional features across time.
Our results achieve state-of-the-art performance in a wide range of applications and datasets.
arXiv Detail & Related papers (2020-02-21T05:00:01Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.