Reconstruction, forecasting, and stability of chaotic dynamics from
partial data
- URL: http://arxiv.org/abs/2305.15111v2
- Date: Fri, 18 Aug 2023 09:33:59 GMT
- Title: Reconstruction, forecasting, and stability of chaotic dynamics from
partial data
- Authors: Elise Özalp, Georgios Margazoglou, and Luca Magri
- Abstract summary: We propose data-driven methods to infer the dynamics of hidden chaotic variables from partial observations.
We show that the proposed networks can forecast the hidden variables, both time-accurately and statistically.
This work opens new opportunities for reconstructing the full state, inferring hidden variables, and computing the stability of chaotic systems from partial data.
- Score: 4.266376725904727
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The forecasting and computation of the stability of chaotic systems from
partial observations are tasks for which traditional equation-based methods may
not be suitable. In this computational paper, we propose data-driven methods to
(i) infer the dynamics of unobserved (hidden) chaotic variables (full-state
reconstruction); (ii) time forecast the evolution of the full state; and (iii)
infer the stability properties of the full state. The tasks are performed with
long short-term memory (LSTM) networks, which are trained with observations
(data) limited to only part of the state: (i) the low-to-high resolution LSTM
(LH-LSTM), which takes partial observations as training input, and requires
access to the full system state when computing the loss; and (ii) the
physics-informed LSTM (PI-LSTM), which is designed to combine partial
observations with the integral formulation of the dynamical system's evolution
equations. First, we derive the Jacobian of the LSTMs. Second, we analyse a
chaotic partial differential equation, the Kuramoto-Sivashinsky (KS), and the
Lorenz-96 system. We show that the proposed networks can forecast the hidden
variables, both time-accurately and statistically. The Lyapunov exponents and
covariant Lyapunov vectors, which characterize the stability of the chaotic
attractors, are correctly inferred from partial observations. Third, the
PI-LSTM outperforms the LH-LSTM by successfully reconstructing the hidden
chaotic dynamics when the input dimension is smaller than, or comparable to, the
Kaplan-Yorke dimension of the attractor. This work opens new opportunities for
reconstructing the full state, inferring hidden variables, and computing the
stability of chaotic systems from partial data.
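As a minimal, hypothetical sketch of the two training objectives described above (not the authors' implementation), the PyTorch snippet below contrasts an LH-LSTM-style loss, which takes partial observations as input but compares against the full state, with a PI-LSTM-style loss, which only sees the observed components and adds a physics term; the dimensions and the `physics_residual` placeholder for the integral form of the governing equations are illustrative assumptions.

```python
# Illustrative sketch of the LH-LSTM and PI-LSTM objectives described above.
# All names and sizes (N_OBS, N_FULL, physics_residual, ...) are assumptions
# for the example, not the authors' code.
import torch
import torch.nn as nn

N_OBS, N_FULL, HIDDEN = 3, 10, 64        # observed dims, full-state dims, LSTM width

class FullStateLSTM(nn.Module):
    """Maps a window of partial observations to the predicted full state."""
    def __init__(self):
        super().__init__()
        self.lstm = nn.LSTM(N_OBS, HIDDEN, batch_first=True)
        self.readout = nn.Linear(HIDDEN, N_FULL)

    def forward(self, obs_window):       # obs_window: (batch, time, N_OBS)
        h, _ = self.lstm(obs_window)
        return self.readout(h)           # predicted full state: (batch, time, N_FULL)

def lh_lstm_loss(pred_full, true_full):
    # LH-LSTM: partial observations as input, but the loss requires the full state.
    return torch.mean((pred_full - true_full) ** 2)

def pi_lstm_loss(pred_full, true_obs, physics_residual, weight=1.0):
    # PI-LSTM: data mismatch on the observed components only, plus a penalty on
    # violations of the governing equations (physics_residual is a stand-in for
    # the integral formulation used in the paper).
    data_term = torch.mean((pred_full[..., :N_OBS] - true_obs) ** 2)
    physics_term = torch.mean(physics_residual(pred_full) ** 2)
    return data_term + weight * physics_term
```

For the stability analysis, once the Jacobian of the trained network is available (e.g. via automatic differentiation), the Lyapunov exponents and the Kaplan-Yorke dimension can be estimated with the standard QR (Benettin) procedure; the sketch below assumes a precomputed sequence of one-step Jacobians and is again only illustrative.

```python
import numpy as np

def lyapunov_exponents(jacobians, dt):
    """QR (Benettin) estimate of the Lyapunov spectrum from one-step
    Jacobians of the learned map, stacked as an array of shape (T, n, n)."""
    T, n, _ = jacobians.shape
    Q = np.eye(n)
    log_growth = np.zeros(n)
    for J in jacobians:
        Q, R = np.linalg.qr(J @ Q)
        log_growth += np.log(np.abs(np.diag(R)))
    return np.sort(log_growth / (T * dt))[::-1]   # descending order

def kaplan_yorke_dimension(lyap):
    """D_KY = j + (lambda_1 + ... + lambda_j) / |lambda_{j+1}|, where j is the
    largest index whose partial sum of descending exponents is non-negative."""
    lam = np.sort(lyap)[::-1]
    csum = np.cumsum(lam)
    if csum[0] < 0:                       # even the largest exponent is negative
        return 0.0
    j = int(np.where(csum >= 0)[0][-1])   # 0-based index of the last non-negative sum
    if j == len(lam) - 1:
        return float(len(lam))
    return (j + 1) + csum[j] / abs(lam[j + 1])
```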
Related papers
- Inferring stability properties of chaotic systems on autoencoders' latent spaces [4.266376725904727]
In chaotic systems and turbulence, convolutional autoencoders and echo state networks (CAE-ESN) successfully forecast the dynamics.
We show that the CAE-ESN model infers the invariant stability properties and the geometry of the space vectors in the low-dimensional manifold.
This work opens up new opportunities for inferring the stability of high-dimensional chaotic systems in latent spaces.
arXiv Detail & Related papers (2024-10-23T16:25:36Z) - Stability analysis of chaotic systems in latent spaces [4.266376725904727]
We show that a latent-space approach can infer the solution of a chaotic partial differential equation.
It can also predict the stability properties of the physical system.
arXiv Detail & Related papers (2024-10-01T08:09:14Z) - Machine learning in and out of equilibrium [58.88325379746631]
Our study uses a Fokker-Planck approach, adapted from statistical physics, to explore these parallels.
We focus in particular on the stationary state of the system in the long-time limit, which in conventional SGD is out of equilibrium.
We propose a new variation of stochastic gradient Langevin dynamics (SGLD) that harnesses without-replacement minibatching.
arXiv Detail & Related papers (2023-06-06T09:12:49Z) - Physics-Informed Long Short-Term Memory for Forecasting and
Reconstruction of Chaos [5.8010446129208155]
We present the Physics-Informed Long Short-Term Memory (PI-LSTM) network to reconstruct and predict the evolution of unmeasured variables in a chaotic system.
The training is constrained by a regularization term, which penalizes solutions that violate the system's governing equations.
This work opens up new opportunities for state reconstruction and learning of the dynamics of nonlinear systems.
arXiv Detail & Related papers (2023-02-03T18:27:59Z) - Ensemble Reservoir Computing for Dynamical Systems: Prediction of
Phase-Space Stable Region for Hadron Storage Rings [0.0]
Echo State Networks (ESNs) are a class of recurrent neural networks that are computationally efficient.
We present the performance achieved by the ESN in predicting the phase-space stability region.
We observe that the proposed ESN approach is capable of effectively predicting the time evolution of the extent of the dynamic aperture.
arXiv Detail & Related papers (2023-01-17T10:29:07Z) - Formal Controller Synthesis for Markov Jump Linear Systems with
Uncertain Dynamics [64.72260320446158]
We propose a method for synthesising controllers for Markov jump linear systems.
Our method is based on a finite-state abstraction that captures both the discrete (mode-jumping) and continuous (stochastic linear) behaviour of the MJLS.
We apply our method to multiple realistic benchmark problems, in particular, a temperature control and an aerial vehicle delivery problem.
arXiv Detail & Related papers (2022-12-01T17:36:30Z) - Likelihood-Free Inference in State-Space Models with Unknown Dynamics [71.94716503075645]
We introduce a method for inferring and predicting latent states in state-space models where observations can only be simulated, and transition dynamics are unknown.
We propose a way of doing likelihood-free inference (LFI) of states and state prediction with a limited number of simulations.
arXiv Detail & Related papers (2021-11-02T12:33:42Z) - Initializing LSTM internal states via manifold learning [0.6524460254566904]
We argue that the converged, "mature" internal states constitute a function on a learned data manifold.
We show that learning this data manifold enables the transformation of partially observed dynamics into fully observed ones.
arXiv Detail & Related papers (2021-04-27T10:54:53Z) - Stability and Identification of Random Asynchronous Linear
Time-Invariant Systems [81.02274958043883]
We show the additional benefits of randomization and asynchrony on the stability of linear dynamical systems.
For unknown randomized LTI systems, we propose a systematic identification method to recover the underlying dynamics.
arXiv Detail & Related papers (2020-12-08T02:00:04Z) - Stochastically forced ensemble dynamic mode decomposition for
forecasting and analysis of near-periodic systems [65.44033635330604]
We introduce a novel load forecasting method in which observed dynamics are modeled as a forced linear system.
We show that its use of intrinsic linear dynamics offers a number of desirable properties in terms of interpretability and parsimony.
Results are presented for a test case using load data from an electrical grid.
arXiv Detail & Related papers (2020-10-08T20:25:52Z) - Lipschitz Recurrent Neural Networks [100.72827570987992]
We show that our Lipschitz recurrent unit is more robust with respect to input and parameter perturbations as compared to other continuous-time RNNs.
Our experiments demonstrate that the Lipschitz RNN can outperform existing recurrent units on a range of benchmark tasks.
arXiv Detail & Related papers (2020-06-22T08:44:52Z)