Autoencoders for discovering manifold dimension and coordinates in data
from complex dynamical systems
- URL: http://arxiv.org/abs/2305.01090v3
- Date: Wed, 6 Dec 2023 16:23:35 GMT
- Title: Autoencoders for discovering manifold dimension and coordinates in data
from complex dynamical systems
- Authors: Kevin Zeng, Carlos E. Pérez De Jesús, Andrew J. Fox, Michael D. Graham
- Abstract summary: Autoencoder framework combines implicit regularization with internal linear layers and $L_2$ regularization (weight decay).
We show that this framework can be naturally extended for applications of state-space modeling and forecasting.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: While many phenomena in physics and engineering are formally
high-dimensional, their long-time dynamics often live on a lower-dimensional
manifold. The present work introduces an autoencoder framework that combines
implicit regularization with internal linear layers and $L_2$ regularization
(weight decay) to automatically estimate the underlying dimensionality of a
data set, produce an orthogonal manifold coordinate system, and provide the
mapping functions between the ambient space and manifold space, allowing for
out-of-sample projections. We validate our framework's ability to estimate the
manifold dimension for a series of datasets from dynamical systems of varying
complexities and compare to other state-of-the-art estimators. We analyze the
training dynamics of the network to glean insight into the mechanism of
low-rank learning and find that collectively each of the implicit regularizing
layers compound the low-rank representation and even self-correct during
training. Analysis of gradient descent dynamics for this architecture in the
linear case reveals the role of the internal linear layers in leading to faster
decay of a "collective weight variable" incorporating all layers, and the role
of weight decay in breaking degeneracies and thus driving convergence along
directions in which no decay would occur in its absence. We show that this
framework can be naturally extended for applications of state-space modeling
and forecasting by generating a data-driven dynamic model of a spatiotemporally
chaotic partial differential equation using only the manifold coordinates.
Finally, we demonstrate that our framework is robust to hyperparameter choices.
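The two mechanisms the abstract describes for the linear case can be illustrated with a minimal pure-Python toy (an assumed scalar caricature, not the paper's implementation): gradient descent on a product of scalar "internal linear layers" drives the collective weight toward the target, while a degenerate direction that receives no loss gradient at all is driven to zero only by weight decay.

```python
# Toy sketch (assumed, not the paper's code): a scalar "deep linear network"
# W = w1*w2*w3 is fit to a target slope by gradient descent with weight decay.
# The collective weight W converges near the target, while a weight v with no
# loss gradient (a degenerate direction) decays purely through weight decay.

def train(layers=3, target=0.5, lr=0.1, wd=0.01, steps=3000):
    w = [1.0] * layers      # internal linear layers (scalar toy)
    v = 1.0                 # degenerate direction: loss is independent of v
    for _ in range(steps):
        W = 1.0
        for wi in w:
            W *= wi
        err = W - target                 # d(0.5*(W - target)^2)/dW
        # chain rule: dW/dw_i = W / w_i; weight decay adds wd * w_i
        w = [wi - lr * (err * W / wi + wd * wi) for wi in w]
        v -= lr * wd * v                 # decays only because of weight decay
    W = 1.0
    for wi in w:
        W *= wi
    return W, v

W, v = train()
# W ends close to the target 0.5; v is driven toward 0 by decay alone
```

Without the `wd` term, `v` would never move: the loss is flat along that direction, which is the degeneracy that weight decay breaks.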
Related papers
- Automated Global Analysis of Experimental Dynamics through Low-Dimensional Linear Embeddings [3.825457221275617]
We introduce a data-driven computational framework to derive low-dimensional linear models for nonlinear dynamical systems.
This framework enables global stability analysis through interpretable linear models that capture the underlying system structure.
Our method offers a promising pathway to analyze complex dynamical behaviors across fields such as physics, climate science, and engineering.
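The general idea of fitting an interpretable linear model and reading stability off its spectrum can be sketched in a scalar toy (a generic illustration under assumed data, not this paper's algorithm):

```python
# Generic sketch: fit a linear model x_{t+1} ~ a * x_t to a scalar trajectory
# by least squares, then assess stability from the spectral radius (here |a|).

def fit_linear_dynamics(xs):
    num = sum(x1 * x0 for x0, x1 in zip(xs, xs[1:]))  # sum of x_{t+1} * x_t
    den = sum(x0 * x0 for x0 in xs[:-1])              # sum of x_t^2
    return num / den

# synthetic trajectory of a stable linear system x_{t+1} = 0.9 * x_t
traj = [0.9 ** t for t in range(50)]
a = fit_linear_dynamics(traj)
stable = abs(a) < 1.0   # global stability criterion for a linear model
```

In higher dimensions the same least-squares fit yields a matrix whose eigenvalues play the role of `a`.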
arXiv Detail & Related papers (2024-11-01T19:27:47Z)
- eXponential FAmily Dynamical Systems (XFADS): Large-scale nonlinear Gaussian state-space modeling [9.52474299688276]
We introduce a low-rank structured variational autoencoder framework for nonlinear state-space graphical models.
We show that our approach consistently demonstrates the ability to learn a more predictive generative model.
arXiv Detail & Related papers (2024-03-03T02:19:49Z)
- Dynamics Harmonic Analysis of Robotic Systems: Application in Data-Driven Koopman Modelling [24.738444847113232]
We introduce the use of harmonic analysis to decompose the state space of symmetric robotic systems into isotypic subspaces.
For linear dynamics, we characterize how this decomposition leads to a subdivision of the dynamics into independent linear systems on each subspace.
Our architecture, validated on synthetic systems and the dynamics of locomotion of a quadrupedal robot, exhibits enhanced generalization, sample efficiency, and interpretability.
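The simplest instance of such an isotypic decomposition is a reflection symmetry, under which any state splits into symmetric and antisymmetric components that evolve independently for equivariant linear dynamics. A minimal sketch (an assumed toy, not the paper's code):

```python
# Isotypic decomposition under a reflection symmetry P that swaps the
# left/right components of the state: x = sym + anti, where P sym = sym
# and P anti = -anti. (Assumed toy example, not the paper's implementation.)

def reflect(x):
    return list(reversed(x))  # P: left/right swap

def isotypic_components(x):
    Px = reflect(x)
    sym = [(a + b) / 2 for a, b in zip(x, Px)]   # projection onto P = +1
    anti = [(a - b) / 2 for a, b in zip(x, Px)]  # projection onto P = -1
    return sym, anti

x = [1.0, 2.0, 3.0, 4.0]
sym, anti = isotypic_components(x)
recombined = [s + a for s, a in zip(sym, anti)]  # recovers the original state
```

For richer symmetry groups the same projection idea generalizes, with one subspace per irreducible representation.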
arXiv Detail & Related papers (2023-12-12T17:34:42Z)
- Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
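The core idea of recovering unknown parameters by differentiating through a forward model can be shown with a toy scalar problem (a hypothetical setup with a hand-written gradient standing in for automatic differentiation, not the paper's model Hamiltonian):

```python
import math

# Toy sketch: a differentiable forward model y = sin(theta * x) is fit to
# "experimental" data by gradient descent on the mean squared misfit. The
# analytic gradient below stands in for automatic differentiation.

xs = [0.1 * i for i in range(1, 11)]
theta_true = 2.0
data = [math.sin(theta_true * x) for x in xs]  # synthetic "experiment"

theta = 1.5                  # initial guess for the unknown parameter
lr = 2.0
for _ in range(2000):
    # d/dtheta of (1/n) * sum (sin(theta*x) - y)^2
    grad = sum(2 * (math.sin(theta * x) - y) * x * math.cos(theta * x)
               for x, y in zip(xs, data)) / len(xs)
    theta -= lr * grad
# theta converges to the value used to generate the data
```

Once the differentiable model exists, the same descent loop can be re-run on new data without retraining the model itself, which is the "build once, apply in real time" point above.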
arXiv Detail & Related papers (2023-04-08T07:55:36Z)
- VTAE: Variational Transformer Autoencoder with Manifolds Learning [144.0546653941249]
Deep generative models have demonstrated successful applications in learning non-linear data distributions through a number of latent variables.
The nonlinearity of the generator implies that the latent space provides an unsatisfactory projection of the data space, which results in poor representation learning.
We show that geodesics and accurate computation can substantially improve the performance of deep generative models.
arXiv Detail & Related papers (2023-04-03T13:13:19Z)
- DIFFormer: Scalable (Graph) Transformers Induced by Energy Constrained Diffusion [66.21290235237808]
We introduce an energy constrained diffusion model which encodes a batch of instances from a dataset into evolutionary states.
We provide rigorous theory that implies closed-form optimal estimates for the pairwise diffusion strength among arbitrary instance pairs.
Experiments highlight the wide applicability of our model as a general-purpose encoder backbone with superior performance in various tasks.
arXiv Detail & Related papers (2023-01-23T15:18:54Z)
- Capturing Actionable Dynamics with Structured Latent Ordinary Differential Equations [68.62843292346813]
We propose a structured latent ODE model that captures system input variations within its latent representation.
Building on a static variable specification, our model learns factors of variation for each input to the system, thus separating the effects of the system inputs in the latent space.
arXiv Detail & Related papers (2022-02-25T20:00:56Z)
- Linearization and Identification of Multiple-Attractors Dynamical System through Laplacian Eigenmaps [8.161497377142584]
We propose a Graph-based spectral clustering method that takes advantage of a velocity-augmented kernel to connect data-points belonging to the same dynamics.
We prove that there always exists a set of 2-dimensional embedding spaces in which the sub-dynamics are linear, and an n-dimensional embedding space in which they are quasi-linear.
We learn a diffeomorphism from the Laplacian embedding space to the original space and show that the Laplacian embedding leads to good reconstruction accuracy and a faster training time.
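A velocity-augmented kernel of this kind can be sketched as follows (an assumed Gaussian form for illustration, not the paper's exact kernel): two points are considered similar only when both their positions and their velocities agree, so trajectories from different sub-dynamics that cross the same region stay disconnected.

```python
import math

# Sketch of a velocity-augmented kernel (assumed Gaussian form): similarity
# depends on distance in position AND in velocity, so nearby points moving
# in opposite directions (different sub-dynamics) get a small weight.

def velocity_kernel(p, q, sigma=1.0):
    (x1, v1), (x2, v2) = p, q
    d2 = (x1 - x2) ** 2 + (v1 - v2) ** 2   # position + velocity distance
    return math.exp(-d2 / (2 * sigma ** 2))

# two trajectories passing through the same position
a = (0.0, 1.0)    # same dynamics as b: close position, close velocity
b = (0.1, 0.9)
c = (0.0, -1.0)   # different sub-dynamics: same position, opposite velocity

k_same = velocity_kernel(a, b)   # large weight: points stay connected
k_diff = velocity_kernel(a, c)   # small weight: dynamics stay separated
```

The resulting weights define the graph whose Laplacian eigenmaps produce the embedding discussed above.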
arXiv Detail & Related papers (2022-02-18T12:43:25Z)
- Data-Driven Reduced-Order Modeling of Spatiotemporal Chaos with Neural Ordinary Differential Equations [0.0]
We present a data-driven reduced order modeling method that capitalizes on the chaotic dynamics of partial differential equations.
We find that dimension reduction improves performance relative to predictions in the ambient space.
With the low-dimensional model, we find excellent short- and long-time statistical recreation of the true dynamics for widely spaced data.
arXiv Detail & Related papers (2021-08-31T20:00:33Z)
- S2RMs: Spatially Structured Recurrent Modules [105.0377129434636]
We take a step towards models that are capable of simultaneously exploiting both modular and temporal structures.
We find our models to be robust to the number of available views and better capable of generalization to novel tasks without additional training.
arXiv Detail & Related papers (2020-07-13T17:44:30Z)
- Euclideanizing Flows: Diffeomorphic Reduction for Learning Stable Dynamical Systems [74.80320120264459]
We present an approach to learn such motions from a limited number of human demonstrations.
The complex motions are encoded as rollouts of a stable dynamical system.
The efficacy of this approach is demonstrated through validation on an established benchmark as well as demonstrations collected on a real-world robotic system.
arXiv Detail & Related papers (2020-05-27T03:51:57Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.