Linearization and Identification of Multiple-Attractors Dynamical System
through Laplacian Eigenmaps
- URL: http://arxiv.org/abs/2202.09171v1
- Date: Fri, 18 Feb 2022 12:43:25 GMT
- Title: Linearization and Identification of Multiple-Attractors Dynamical System
through Laplacian Eigenmaps
- Authors: Bernardo Fichera and Aude Billard
- Abstract summary: We propose a Graph-based spectral clustering method that takes advantage of a velocity-augmented kernel to connect data-points belonging to the same dynamics.
We prove that there always exists a set of 2-dimensional embedding spaces in which the sub-dynamics are linear, and an n-dimensional embedding in which they are quasi-linear.
We learn a diffeomorphism from the Laplacian embedding space to the original space and show that the Laplacian embedding leads to good reconstruction accuracy and a faster training time.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Dynamical Systems (DS) are fundamental to the modeling and understanding of
time evolving phenomena, and find application in physics, biology and control.
As determining an analytical description of the dynamics is often difficult,
data-driven approaches are preferred for identifying and controlling nonlinear
DS with multiple equilibrium points. Identification of such DS has been treated
largely as a supervised learning problem. Instead, we focus on an unsupervised
learning scenario where we know neither the number nor the type of dynamics. We
propose a Graph-based spectral clustering method that takes advantage of a
velocity-augmented kernel to connect data-points belonging to the same
dynamics, while preserving the natural temporal evolution. We study the
eigenvectors and eigenvalues of the Graph Laplacian and show that they form a
set of orthogonal embedding spaces, one for each sub-dynamics. We prove that
there always exists a set of 2-dimensional embedding spaces in which the
sub-dynamics are linear, and an n-dimensional embedding in which they are
quasi-linear. We compare the clustering performance of our algorithm to Kernel
K-Means, Spectral Clustering and Gaussian Mixtures and show that, even when
these algorithms are provided with the true number of sub-dynamics, they fail
to cluster them correctly. We learn a diffeomorphism from the Laplacian
embedding space to the original space and show that the Laplacian embedding
leads to good reconstruction accuracy and a faster training time through an
exponentially decaying loss, compared to state-of-the-art
diffeomorphism-based approaches.
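The pipeline in the abstract (velocity-augmented kernel, graph Laplacian, eigenvector embedding) can be sketched as follows. The abstract does not give the exact kernel, so the Gaussian-position-times-velocity-alignment affinity below, and all names and parameters, are illustrative assumptions rather than the authors' implementation.

```python
import numpy as np

def velocity_augmented_kernel(X, V, sigma=1.0, kappa=4.0):
    """Hypothetical affinity: Gaussian in position, scaled by velocity
    alignment, so nearby points moving in similar directions connect strongly."""
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K_pos = np.exp(-sq / (2.0 * sigma ** 2))
    Vn = V / (np.linalg.norm(V, axis=1, keepdims=True) + 1e-12)
    K_vel = np.exp(kappa * (Vn @ Vn.T - 1.0))   # in (0, 1], peaks when aligned
    return K_pos * K_vel

def laplacian_embedding(K, k=2):
    """First k nontrivial eigenvectors of the symmetric normalized Laplacian."""
    d_inv_sqrt = np.diag(1.0 / np.sqrt(K.sum(axis=1)))
    L = np.eye(len(K)) - d_inv_sqrt @ K @ d_inv_sqrt
    w, U = np.linalg.eigh(L)                    # eigenvalues in ascending order
    return U[:, 1:k + 1], w                     # skip the trivial 0-eigenvector

# Toy data: two noisy trajectories converging to two different attractors.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 50)[:, None]
a1, a2 = np.array([1.0, 1.0]), np.array([-1.0, -1.0])
X1 = a1 + (1.0 - t) * (np.array([3.0, 3.0]) - a1) + rng.normal(0.0, 0.02, (50, 2))
X2 = a2 + (1.0 - t) * (np.array([-3.0, -3.0]) - a2) + rng.normal(0.0, 0.02, (50, 2))
X = np.vstack([X1, X2])
V = np.vstack([np.gradient(X1, axis=0), np.gradient(X2, axis=0)])  # finite-difference velocities
K = velocity_augmented_kernel(X, V, sigma=0.5)
Y, w = laplacian_embedding(K, k=2)
# The sign of the first nontrivial eigenvector separates the two sub-dynamics.
```

On this toy set the sign of the first nontrivial eigenvector already splits the two sub-dynamics; the paper's contribution beyond such clustering is proving that per-sub-dynamics embeddings exist in which each dynamics becomes linear or quasi-linear.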
Related papers
- SEGNO: Generalizing Equivariant Graph Neural Networks with Physical Inductive Biases [66.61789780666727]
We show how the second-order continuity can be incorporated into GNNs while maintaining the equivariant property.
We also offer theoretical insights into SEGNO, highlighting that it can learn a unique trajectory between adjacent states.
Our model yields a significant improvement over the state-of-the-art baselines.
arXiv Detail & Related papers (2023-08-25T07:15:58Z)
- Tensor Decompositions Meet Control Theory: Learning General Mixtures of Linear Dynamical Systems [19.47235707806519]
We give a new approach to learning mixtures of linear dynamical systems based on tensor decompositions.
Our algorithm succeeds without strong separation conditions on the components, and can be used to compete with the Bayes optimal clustering of the trajectories.
arXiv Detail & Related papers (2023-07-13T03:00:01Z)
- Local Convergence of Gradient Descent-Ascent for Training Generative Adversarial Networks [20.362912591032636]
We study the local dynamics of gradient descent-ascent (GDA) for training a GAN with a kernel-based discriminator.
We show phase transitions that indicate when the system converges, oscillates, or diverges.
arXiv Detail & Related papers (2023-05-14T23:23:08Z)
- Autoencoders for discovering manifold dimension and coordinates in data from complex dynamical systems [0.0]
The autoencoder framework combines the implicit regularization of internal linear layers with $L_2$ regularization (weight decay).
We show that this framework can be naturally extended for applications of state-space modeling and forecasting.
arXiv Detail & Related papers (2023-05-01T21:14:47Z)
- Propagating Kernel Ambiguity Sets in Nonlinear Data-driven Dynamics Models [3.743859059772078]
Given a nonlinear data-driven dynamical system model, how can one propagate the ambiguity sets forward for multiple steps?
This problem is the key to solving distributionally robust control and learning-based control of such learned system models under a data-distribution shift.
We propose an algorithm that exactly propagates ambiguity sets through nonlinear data-driven models using the Koopman operator and CME, via the kernel maximum mean discrepancy geometry.
arXiv Detail & Related papers (2023-04-27T09:38:49Z)
- Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z)
- Dynamic Bayesian Learning and Calibration of Spatiotemporal Mechanistic System [0.0]
We develop an approach for fully Bayesian learning and calibration of mechanistic models based on noisy observations.
We demonstrate this flexibility through solving problems arising in the analysis of ordinary and partial nonlinear differential equations.
arXiv Detail & Related papers (2022-08-12T23:17:46Z)
- Dynamic Mode Decomposition in Adaptive Mesh Refinement and Coarsening Simulations [58.720142291102135]
Dynamic Mode Decomposition (DMD) is a powerful data-driven method used to extract coherent spatiotemporal structures.
This paper proposes a strategy that enables DMD to extract them from observations with different mesh topologies and dimensions.
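For readers unfamiliar with DMD itself, here is a minimal exact-DMD sketch on a fixed mesh (the paper's contribution, handling changing mesh topologies, is not reproduced here):

```python
import numpy as np

def dmd(X, Xp, r):
    """Exact DMD: best-fit linear operator with Xp ~ A X, reduced to rank r."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    U, s, Vt = U[:, :r], s[:r], Vt[:r]
    A_tilde = U.conj().T @ Xp @ Vt.conj().T @ np.diag(1.0 / s)  # r x r reduced operator
    eigvals, W = np.linalg.eig(A_tilde)
    modes = Xp @ Vt.conj().T @ np.diag(1.0 / s) @ W             # exact DMD modes
    return eigvals, modes

# Toy snapshots: one oscillating pattern carried by two fixed spatial profiles.
x = np.linspace(-5.0, 5.0, 64)
t = np.linspace(0.0, 4.0 * np.pi, 129)                          # step dt = pi/32
f = np.outer(np.cos(x), np.sin(t)) + np.outer(np.exp(-x ** 2), np.cos(t))
eigvals, modes = dmd(f[:, :-1], f[:, 1:], r=2)
# Pure oscillation: eigenvalues lie on the unit circle at angles +/- dt.
```

Because the toy dynamics are exactly linear on a rank-2 subspace, the reduced operator recovers the rotation eigenvalues e^(±i·dt) to machine precision.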
arXiv Detail & Related papers (2021-04-28T22:14:25Z)
- Euclideanizing Flows: Diffeomorphic Reduction for Learning Stable Dynamical Systems [74.80320120264459]
We present an approach to learning complex motions from a limited number of human demonstrations.
The motions are encoded as rollouts of a stable dynamical system.
The efficacy of this approach is demonstrated through validation on an established benchmark as well as demonstrations collected on a real-world robotic system.
arXiv Detail & Related papers (2020-05-27T03:51:57Z)
- Kernel and Rich Regimes in Overparametrized Models [69.40899443842443]
We show that gradient descent on overparametrized multilayer networks can induce rich implicit biases that are not RKHS norms.
We also demonstrate the transition between the kernel and rich regimes empirically for more complex matrix factorization models and multilayer non-linear networks.
arXiv Detail & Related papers (2020-02-20T15:43:02Z)
- Learning Bijective Feature Maps for Linear ICA [73.85904548374575]
We show that existing probabilistic deep generative models (DGMs) which are tailor-made for image data, underperform on non-linear ICA tasks.
To address this, we propose a DGM which combines bijective feature maps with a linear ICA model to learn interpretable latent structures for high-dimensional data.
We create models that converge quickly, are easy to train, and achieve better unsupervised latent factor discovery than flow-based models, linear ICA, and Variational Autoencoders on images.
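The linear ICA ingredient can be illustrated with a plain deflationary FastICA in NumPy. This is an assumed stand-in: the paper builds a deep generative model around linear ICA rather than using this classical estimator.

```python
import numpy as np

def fastica(X, n_components, iters=200, seed=0):
    """Deflationary FastICA with a tanh nonlinearity on whitened data."""
    Xc = X - X.mean(axis=1, keepdims=True)
    d, E = np.linalg.eigh(Xc @ Xc.T / Xc.shape[1])
    Z = (E @ np.diag(d ** -0.5) @ E.T) @ Xc        # whitened: identity covariance
    rng = np.random.default_rng(seed)
    W = np.zeros((n_components, Z.shape[0]))
    for i in range(n_components):
        w = rng.normal(size=Z.shape[0])
        w /= np.linalg.norm(w)
        for _ in range(iters):
            g = np.tanh(w @ Z)
            w = (Z * g).mean(axis=1) - (1.0 - g ** 2).mean() * w  # fixed-point step
            w -= W[:i].T @ (W[:i] @ w)             # deflate directions already found
            w /= np.linalg.norm(w)
        W[i] = w
    return W @ Z                                   # sources, up to sign/permutation

# Recover two independent non-Gaussian sources from a linear mixture.
rng = np.random.default_rng(1)
S = np.vstack([rng.laplace(size=5000), rng.uniform(-1.0, 1.0, size=5000)])
X = np.array([[2.0, 1.0], [1.0, 2.0]]) @ S
S_hat = fastica(X, 2)
```

Each recovered row correlates strongly with one true source; linear ICA is identifiable only up to sign, scale, and permutation, which is why the correlation check below takes the best match per row.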
arXiv Detail & Related papers (2020-02-18T17:58:07Z)