Reservoir Computing with Error Correction: Long-term Behaviors of
Stochastic Dynamical Systems
- URL: http://arxiv.org/abs/2305.00669v2
- Date: Sun, 30 Jul 2023 13:35:13 GMT
- Title: Reservoir Computing with Error Correction: Long-term Behaviors of
Stochastic Dynamical Systems
- Authors: Cheng Fang, Yubin Lu, Ting Gao, Jinqiao Duan
- Abstract summary: We propose a data-driven framework combining Reservoir Computing and Normalizing Flow to study the long-term behaviors of stochastic dynamical systems.
We verify the effectiveness of the proposed framework in several experiments, including the stochastic Van der Pol oscillator, a simplified El Niño-Southern Oscillation model, and the stochastic Lorenz system.
- Score: 5.815325960286111
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Predicting stochastic dynamical systems and capturing their dynamical
behaviors are profound problems. In this article, we propose a data-driven
framework combining Reservoir Computing and Normalizing Flow to study this
issue; the framework models the error of traditional Reservoir Computing to improve its
performance and integrates the virtues of both approaches. With few assumptions
about the underlying stochastic dynamical systems, this model-free method
successfully predicts the long-term evolution of stochastic dynamical systems
and replicates dynamical behaviors. We verify the effectiveness of the proposed
framework in several experiments, including the stochastic Van der Pol
oscillator, the simplified El Niño-Southern Oscillation model, and the stochastic
Lorenz system. These experiments consist of Markov/non-Markov and
stationary/non-stationary stochastic processes which are defined by
linear/nonlinear stochastic differential equations or stochastic delay
differential equations. Additionally, we explore the noise-induced tipping
phenomenon, relaxation oscillation, stochastic mixed-mode oscillation, and
replication of the strange attractor.
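As a rough illustration of the reservoir-computing half of this framework, the Python sketch below builds a small echo state network and trains a ridge-regression readout for one-step prediction of a noisy signal; the residual between prediction and data is the quantity that the paper's normalizing-flow error model would then learn. The reservoir size, spectral radius, leak rate, ridge penalty, and toy signal are illustrative assumptions, not the authors' settings.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_reservoir(n_in, n_res=300, spectral_radius=0.9):
    """Random input and recurrent weights, rescaled to the echo-state regime."""
    W_in = rng.uniform(-0.5, 0.5, size=(n_res, n_in))
    W = rng.uniform(-0.5, 0.5, size=(n_res, n_res))
    W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))
    return W_in, W

def run_reservoir(W_in, W, inputs, leak=0.3):
    """Drive the reservoir with an input sequence and collect its states."""
    states = np.zeros((len(inputs), W.shape[0]))
    x = np.zeros(W.shape[0])
    for t, u in enumerate(inputs):
        x = (1 - leak) * x + leak * np.tanh(W_in @ u + W @ x)
        states[t] = x
    return states

def train_readout(states, targets, ridge=1e-6):
    """Linear readout via ridge regression: targets ~ states @ W_out.T"""
    A = states.T @ states + ridge * np.eye(states.shape[1])
    return np.linalg.solve(A, states.T @ targets).T

# Toy usage: one-step prediction of a noisy scalar signal standing in for a
# sampled stochastic trajectory (illustrative data, not from the paper).
T = 2000
u = np.sin(0.05 * np.arange(T))[:, None] + 0.05 * rng.standard_normal((T, 1))
W_in, W = make_reservoir(n_in=1)
X = run_reservoir(W_in, W, u[:-1])
W_out = train_readout(X, u[1:])
prediction = X @ W_out.T   # the residual u[1:] - prediction is what an
                           # error model (e.g. a normalizing flow) would learn
```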
Related papers
- Weak Collocation Regression for Inferring Stochastic Dynamics with
Lévy Noise [8.15076267771005]
We propose a weak form of the Fokker-Planck (FP) equation for extracting dynamics with Lévy noise.
Our approach can simultaneously distinguish mixed noise types, even in multi-dimensional problems.
arXiv Detail & Related papers (2024-03-13T06:54:38Z)
- Learning minimal representations of stochastic processes with variational autoencoders [52.99137594502433]
We introduce an unsupervised machine learning approach to determine the minimal set of parameters required to describe a process.
Our approach enables the autonomous discovery of unknown parameters describing such processes.
arXiv Detail & Related papers (2023-07-21T14:25:06Z)
- Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z)
- Capturing Actionable Dynamics with Structured Latent Ordinary Differential Equations [68.62843292346813]
We propose a structured latent ODE model that captures system input variations within its latent representation.
Building on a static variable specification, our model learns factors of variation for each input to the system, thus separating the effects of the system inputs in the latent space.
arXiv Detail & Related papers (2022-02-25T20:00:56Z)
- Decimation technique for open quantum systems: a case study with driven-dissipative bosonic chains [62.997667081978825]
Unavoidable coupling of quantum systems to external degrees of freedom leads to dissipative (non-unitary) dynamics.
We introduce a method to deal with these systems based on the calculation of (dissipative) lattice Green's function.
We illustrate the power of this method with several examples of driven-dissipative bosonic chains of increasing complexity.
arXiv Detail & Related papers (2022-02-15T19:00:09Z)
- Structure-Preserving Learning Using Gaussian Processes and Variational Integrators [62.31425348954686]
We propose the combination of a variational integrator for the nominal dynamics of a mechanical system and learning residual dynamics with Gaussian process regression.
We extend our approach to systems with known kinematic constraints and provide formal bounds on the prediction uncertainty.
arXiv Detail & Related papers (2021-12-10T11:09:29Z)
- Extracting stochastic dynamical systems with $\alpha$-stable Lévy noise from data [14.230182518492311]
We propose a data-driven method to extract systems with $\alpha$-stable Lévy noise from short burst data.
More specifically, we first estimate the Lévy jump measure and noise intensity.
Then we approximate the drift coefficient by combining nonlocal Kramers-Moyal formulas with normalizing flows (a minimal increment-based sketch of the classical Kramers-Moyal idea appears after this list).
arXiv Detail & Related papers (2021-09-30T06:57:42Z)
- Extracting Governing Laws from Sample Path Data of Non-Gaussian Stochastic Dynamical Systems [4.527698247742305]
We infer equations with non-Gaussian Lévy noise from available data to reasonably predict dynamical behaviors.
We establish a theoretical framework and design a numerical algorithm to compute the asymmetric L'evy jump measure, drift and diffusion.
This method will become an effective tool in discovering the governing laws from available data sets and in understanding the mechanisms underlying complex random phenomena.
arXiv Detail & Related papers (2021-07-21T14:50:36Z)
- ImitationFlow: Learning Deep Stable Stochastic Dynamic Systems by Normalizing Flows [29.310742141970394]
We introduce ImitationFlow, a novel deep generative model that allows learning complex, globally stable, nonlinear stochastic dynamics.
We show the effectiveness of our method with both standard datasets and a real robot experiment.
arXiv Detail & Related papers (2020-10-25T14:49:46Z)
- Multiplicative noise and heavy tails in stochastic optimization [62.993432503309485]
Stochastic optimization is central to modern machine learning, yet the precise role of the stochasticity in its success remains unclear.
We show that multiplicative noise, which commonly arises due to variance in local rates of convergence, produces heavy-tailed behaviour in the parameters of discrete optimization algorithms.
A detailed analysis describes the dependence on key factors, including step size and data, and shows that state-of-the-art neural network models exhibit similar behaviour.
arXiv Detail & Related papers (2020-06-11T09:58:01Z)
- A Data-Driven Approach for Discovering Stochastic Dynamical Systems with Non-Gaussian Lévy Noise [5.17900889163564]
We develop a new data-driven approach to extract governing laws from noisy data sets.
First, we establish a feasible theoretical framework by expressing the drift coefficient, diffusion coefficient and jump measure.
We then design a numerical algorithm to compute the drift, diffusion coefficient and jump measure, and thus extract a governing equation with Gaussian and non-Gaussian noise.
arXiv Detail & Related papers (2020-05-07T21:29:17Z)
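Several of the papers above (e.g. the nonlocal Kramers-Moyal and Lévy-noise extraction works) estimate drift and diffusion terms from sample-path increments. As a minimal sketch of the classical Kramers-Moyal idea only, assuming Gaussian noise, a one-dimensional path, and illustrative bin counts and test data, the Python code below averages increments in state-space bins; the referenced papers build nonlocal formulas, weak formulations, or normalizing flows on top of such estimates.

```python
import numpy as np

def kramers_moyal_1d(x, dt, n_bins=50, min_samples=10):
    """Binned estimates of the first two Kramers-Moyal coefficients
    (drift and diffusion) from a 1-D sample path x observed every dt."""
    dx = np.diff(x)
    edges = np.linspace(x.min(), x.max(), n_bins + 1)
    bin_idx = np.clip(np.digitize(x[:-1], edges) - 1, 0, n_bins - 1)
    drift = np.full(n_bins, np.nan)
    diffusion = np.full(n_bins, np.nan)
    for k in range(n_bins):
        mask = bin_idx == k
        if mask.sum() >= min_samples:
            drift[k] = dx[mask].mean() / dt                    # E[dx | x] / dt
            diffusion[k] = (dx[mask] ** 2).mean() / (2 * dt)   # E[dx^2 | x] / (2 dt)
    centers = 0.5 * (edges[:-1] + edges[1:])
    return centers, drift, diffusion

# Test on an Ornstein-Uhlenbeck path dX = -X dt + sigma dW, where the true
# drift is -x and the true diffusion coefficient is sigma**2 / 2.
rng = np.random.default_rng(1)
dt, n, sigma = 1e-3, 200_000, 0.5
x = np.zeros(n)
for i in range(n - 1):
    x[i + 1] = x[i] - x[i] * dt + sigma * np.sqrt(dt) * rng.standard_normal()
centers, drift, diffusion = kramers_moyal_1d(x, dt)
```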