Inferring stability properties of chaotic systems on autoencoders' latent spaces
- URL: http://arxiv.org/abs/2410.18003v1
- Date: Wed, 23 Oct 2024 16:25:36 GMT
- Title: Inferring stability properties of chaotic systems on autoencoders' latent spaces
- Authors: Elise Özalp, Luca Magri
- Abstract summary: In chaotic systems and turbulence, convolutional autoencoders and echo state networks (CAE-ESN) successfully forecast the dynamics.
We show that the CAE-ESN model infers the invariant stability properties and the geometry of the tangent space in the low-dimensional manifold.
This work opens up new opportunities for inferring the stability of high-dimensional chaotic systems in latent spaces.
- Score: 4.266376725904727
- Abstract: The data-driven learning of solutions of partial differential equations can be based on a divide-and-conquer strategy. First, the high dimensional data is compressed to a latent space with an autoencoder; and, second, the temporal dynamics are inferred on the latent space with a form of recurrent neural network. In chaotic systems and turbulence, convolutional autoencoders and echo state networks (CAE-ESN) successfully forecast the dynamics, but little is known about whether the stability properties can also be inferred. We show that the CAE-ESN model infers the invariant stability properties and the geometry of the tangent space in the low-dimensional manifold (i.e. the latent space) through Lyapunov exponents and covariant Lyapunov vectors. This work opens up new opportunities for inferring the stability of high-dimensional chaotic systems in latent spaces.
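The divide-and-conquer strategy reduces stability analysis to the learned latent map: once the latent dynamics are available, Lyapunov exponents can be estimated from latent trajectories alone. The sketch below is a minimal illustration, not the paper's CAE-ESN implementation: it uses the logistic map as a stand-in for a learned one-dimensional latent map (its largest Lyapunov exponent is known analytically, ln 2) and estimates that exponent by Benettin-style two-trajectory renormalisation.

```python
import numpy as np

def latent_step(z):
    # Stand-in for a learned latent map: the logistic map at r = 4,
    # whose largest Lyapunov exponent is ln 2 (fully chaotic regime).
    return 4.0 * z * (1.0 - z)

def leading_lyapunov(step, z0, n_steps=10000, eps=1e-8):
    """Benettin-style estimate of the largest Lyapunov exponent:
    evolve a reference and a slightly perturbed trajectory, renormalise
    the separation after every step, and average the log growth rates."""
    z, z_pert = z0, z0 + eps
    log_sum = 0.0
    for _ in range(n_steps):
        z, z_pert = step(z), step(z_pert)
        d = abs(z_pert - z)
        log_sum += np.log(d / eps)
        z_pert = z + eps * (z_pert - z) / d  # rescale separation back to eps
    return log_sum / n_steps

lam = leading_lyapunov(latent_step, z0=0.2)
```

In the CAE-ESN setting, `latent_step` would be replaced by the trained echo state network acting on latent states, and recovering the full spectrum and covariant Lyapunov vectors would require Jacobians with QR re-orthonormalisation rather than a single perturbed trajectory.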
Related papers
- Stability analysis of chaotic systems in latent spaces [4.266376725904727]
We show that a latent-space approach can infer the solution of a chaotic partial differential equation.
It can also predict the stability properties of the physical system.
arXiv Detail & Related papers (2024-10-01T08:09:14Z)
- Input-to-State Stable Coupled Oscillator Networks for Closed-form Model-based Control in Latent Space [2.527926867319859]
We argue that a promising avenue is to leverage powerful and well-understood closed-form strategies from control theory literature.
We identify three fundamental shortcomings in existing latent-space models that have so far prevented this powerful combination.
We propose a novel Coupled Network (CON) model that simultaneously tackles all these issues.
arXiv Detail & Related papers (2024-09-13T00:11:09Z)
- Latent Space Energy-based Neural ODEs [73.01344439786524]
This paper introduces a novel family of deep dynamical models designed to represent continuous-time sequence data.
We train the model using maximum likelihood estimation with Markov chain Monte Carlo.
Experiments on oscillating systems, videos and real-world state sequences (MuJoCo) illustrate that ODEs with the learnable energy-based prior outperform existing counterparts.
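Maximum-likelihood training of such latent energy-based models relies on MCMC draws from the prior; a common choice is short-run unadjusted Langevin dynamics. The snippet below is a hypothetical sketch with a hand-written double-well energy in place of a learned one; the energy function and step sizes are illustrative assumptions, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(1)

def grad_energy(z):
    # Gradient of a hypothetical 1-D double-well latent energy
    # E(z) = (z^2 - 1)^2, with minima at z = -1 and z = +1.
    return 4.0 * z * (z**2 - 1.0)

def langevin_sample(n_chains=2000, n_steps=500, step=1e-2):
    """Unadjusted Langevin dynamics targeting p(z) ~ exp(-E(z)):
    z <- z - step * dE/dz + sqrt(2 * step) * noise, run in parallel
    over many chains."""
    z = rng.normal(size=n_chains)
    for _ in range(n_steps):
        z += -step * grad_energy(z) + np.sqrt(2.0 * step) * rng.normal(size=n_chains)
    return z

samples = langevin_sample()
```

The chains spread out around the two wells near ±1; in the full model, such samples would feed the maximum-likelihood gradient for the energy parameters.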
arXiv Detail & Related papers (2024-09-05T18:14:22Z)
- Generative Modeling with Phase Stochastic Bridges [49.4474628881673]
Diffusion models (DMs) represent state-of-the-art generative models for continuous inputs.
We introduce a novel generative modeling framework grounded in phase space dynamics.
Our framework demonstrates the capability to generate realistic data points at an early stage of dynamics propagation.
arXiv Detail & Related papers (2023-10-11T18:38:28Z)
- Machine learning in and out of equilibrium [58.88325379746631]
Our study uses a Fokker-Planck approach, adapted from statistical physics, to explore these parallels.
We focus in particular on the stationary state of the system in the long-time limit, which in conventional SGD is out of equilibrium.
We propose a new variation of Langevin dynamics (SGLD) that harnesses without replacement minibatching.
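The distinguishing ingredient is the minibatch schedule: samples are swept without replacement within each epoch instead of being drawn independently. A toy sketch with a one-parameter least-squares objective (an illustrative assumption, not the paper's setup): each epoch shuffles the data once and visits disjoint chunks, so every sample is used exactly once per epoch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y = 2x + small noise; the Langevin chain should settle near 2.
X = rng.normal(size=200)
y = 2.0 * X + 0.1 * rng.normal(size=200)

def minibatch_grad(theta, idx):
    # Gradient of the mean-squared error on the minibatch idx.
    xb, yb = X[idx], y[idx]
    return 2.0 * np.mean((theta * xb - yb) * xb)

def sgld_without_replacement(theta=0.0, lr=1e-2, temp=1e-4,
                             batch=20, epochs=50):
    """Langevin dynamics with without-replacement minibatching: each
    epoch shuffles the data once and sweeps it in disjoint chunks."""
    n = len(X)
    for _ in range(epochs):
        perm = rng.permutation(n)
        for start in range(0, n, batch):
            idx = perm[start:start + batch]
            noise = np.sqrt(2.0 * lr * temp) * rng.standard_normal()
            theta = theta - lr * minibatch_grad(theta, idx) + noise
    return theta

theta = sgld_without_replacement()
```

Standard SGLD would instead draw each minibatch index set independently with replacement; only the sampling of `idx` changes.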
arXiv Detail & Related papers (2023-06-06T09:12:49Z)
- Reconstruction, forecasting, and stability of chaotic dynamics from partial data [4.266376725904727]
We propose data-driven methods to infer the dynamics of hidden chaotic variables from partial observations.
We show that the proposed networks can forecast the hidden variables, both time-accurately and statistically.
This work opens new opportunities for reconstructing the full state, inferring hidden variables, and computing the stability of chaotic systems from partial data.
arXiv Detail & Related papers (2023-05-24T13:01:51Z)
- ConCerNet: A Contrastive Learning Based Framework for Automated Conservation Law Discovery and Trustworthy Dynamical System Prediction [82.81767856234956]
This paper proposes a new learning framework named ConCerNet to improve the trustworthiness of the DNN based dynamics modeling.
We show that our method consistently outperforms the baseline neural networks in both coordinate error and conservation metrics.
arXiv Detail & Related papers (2023-02-11T21:07:30Z)
- Learning High-Dimensional Distributions with Latent Neural Fokker-Planck Kernels [67.81799703916563]
We introduce new techniques to formulate the problem as solving Fokker-Planck equation in a lower-dimensional latent space.
Our proposed model consists of latent-distribution morphing, a generator and a parameterized Fokker-Planck kernel function.
arXiv Detail & Related papers (2021-05-10T17:42:01Z)
- Stochastic embeddings of dynamical phenomena through variational autoencoders [1.7205106391379026]
We use a recognition network to increase the observed space dimensionality during the reconstruction of the phase space.
Our validation shows that this approach not only recovers a state space that resembles the original one, but is also able to synthesize new time series.
arXiv Detail & Related papers (2020-10-13T10:10:24Z)
- Phase space learning with neural networks [0.0]
This work proposes an autoencoder neural network as a non-linear generalization of projection-based methods for solving Partial Differential Equations (PDEs).
The proposed deep learning architecture generates the dynamics of PDEs by integrating them entirely within a very reduced latent space, without intermediate reconstructions, and then decodes the latent solution back to the original space.
Properly regularized neural networks are shown to reliably learn the global characteristics of a dynamical system's phase space from the sample data of a single path, and to predict unseen bifurcations.
arXiv Detail & Related papers (2020-06-22T20:28:07Z)
- Deep Variational Luenberger-type Observer for Stochastic Video Prediction [46.82873654555665]
We study the problem of video prediction by combining interpretability of state space models and representation of deep neural networks.
Our model builds upon a variational encoder which transforms the input video into a latent feature space and a Luenberger-type observer which captures the dynamic evolution of the latent features.
arXiv Detail & Related papers (2020-02-12T06:59:04Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information it contains and is not responsible for any consequences of its use.