Learning normal form autoencoders for data-driven discovery of
universal, parameter-dependent governing equations
- URL: http://arxiv.org/abs/2106.05102v1
- Date: Wed, 9 Jun 2021 14:25:18 GMT
- Title: Learning normal form autoencoders for data-driven discovery of
universal, parameter-dependent governing equations
- Authors: Manu Kalia, Steven L. Brunton, Hil G.E. Meijer, Christoph Brune, J.
Nathan Kutz
- Abstract summary: Complex systems manifest a small number of instabilities and bifurcations that are canonical in nature.
Such parametric instabilities are mathematically characterized by their universal unfoldings, or normal form dynamics.
We introduce deep learning autoencoders to discover coordinate transformations that capture the underlying parametric dependence.
- Score: 3.769860395223177
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Complex systems manifest a small number of instabilities and bifurcations
that are canonical in nature, resulting in universal pattern forming
characteristics as a function of some parametric dependence. Such parametric
instabilities are mathematically characterized by their universal unfoldings,
or normal form dynamics, whereby a parsimonious model can be used to represent
the dynamics. Although center manifold theory guarantees the existence of such
low-dimensional normal forms, finding them has remained a long-standing
challenge. In this work, we introduce deep learning autoencoders to discover
coordinate transformations that capture the underlying parametric dependence of
a dynamical system in terms of its canonical normal form, allowing for a simple
representation of the parametric dependence and bifurcation structure. The
autoencoder constrains the latent variable to adhere to a given normal form,
thus allowing it to learn the appropriate coordinate transformation. We
demonstrate the method on a number of example problems, showing that it can
capture a diverse set of normal forms associated with Hopf, pitchfork,
transcritical and/or saddle node bifurcations. This method shows how normal
forms can be leveraged as canonical and universal building blocks in deep
learning approaches for model discovery and reduced-order modeling.
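For reference, the canonical normal forms named in the abstract are textbook low-dimensional ODEs:

```latex
\dot{x} = \mu - x^2              % saddle-node
\dot{x} = \mu x - x^2            % transcritical
\dot{x} = \mu x - x^3            % supercritical pitchfork
\dot{r} = \mu r - r^3, \quad \dot{\theta} = \omega   % supercritical Hopf (polar form)
```

A minimal sketch of the central idea, assuming an encoder/decoder pair and a pitchfork target dynamics; the class and function names are illustrative, not the authors' code. The latent time derivative is obtained by the chain rule from observed state derivatives and penalized against the chosen normal form:

```python
import torch
import torch.nn as nn

class NormalFormAE(nn.Module):
    """Hypothetical sketch: autoencoder whose 1-D latent z is constrained
    to follow the pitchfork normal form dz/dt = mu*z - z**3."""
    def __init__(self, n_obs, hidden=64):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(n_obs, hidden), nn.Tanh(),
                                 nn.Linear(hidden, 1))
        self.dec = nn.Sequential(nn.Linear(1, hidden), nn.Tanh(),
                                 nn.Linear(hidden, n_obs))

def loss(model, x, x_dot, mu):
    """x, x_dot: (batch, n_obs) states and their time derivatives; mu: (batch, 1)."""
    x = x.requires_grad_(True)
    z = model.enc(x)
    # chain rule: dz/dt = (dz/dx) . dx/dt, via a vector-Jacobian product
    dz_dx = torch.autograd.grad(z, x, grad_outputs=torch.ones_like(z),
                                create_graph=True)[0]
    z_dot = (dz_dx * x_dot).sum(dim=1, keepdim=True)
    recon = ((model.dec(z) - x) ** 2).mean()                 # reconstruction error
    normal_form = ((z_dot - (mu * z - z ** 3)) ** 2).mean()  # latent dynamics residual
    return recon + normal_form
```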
Related papers
- Latent Space Energy-based Neural ODEs [73.01344439786524]
This paper introduces a novel family of deep dynamical models designed to represent continuous-time sequence data.
We train the model using maximum likelihood estimation with Markov chain Monte Carlo.
Experiments on oscillating systems, videos and real-world state sequences (MuJoCo) illustrate that ODEs with the learnable energy-based prior outperform existing counterparts.
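The "maximum likelihood estimation with Markov chain Monte Carlo" is typically realized by sampling the energy-based latent prior with short-run Langevin dynamics; a generic sketch, with the energy callable and step sizes as assumptions rather than the paper's settings:

```python
import torch

def langevin_sample(energy, z0, n_steps=20, step=0.1):
    """Short-run Langevin MCMC: z <- z - (step/2)*dE/dz + sqrt(step)*noise."""
    z = z0.detach().clone()
    for _ in range(n_steps):
        z.requires_grad_(True)
        grad = torch.autograd.grad(energy(z).sum(), z)[0]
        z = (z - 0.5 * step * grad
             + step ** 0.5 * torch.randn_like(z)).detach()
    return z
```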
arXiv Detail & Related papers (2024-09-05T18:14:22Z)
- Data-driven discovery of self-similarity using neural networks [0.0]
We present a novel neural network-based approach that discovers self-similarity directly from observed data.
The presence of self-similar solutions in a physical problem signals that the governing law contains a function whose arguments are given by power-law exponents.
We train the neural network model using the observed data, and when the training is successful, we can extract the power exponents that characterize scale-transformation symmetries of the physical problem.
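One common way to realize this idea is to fit a self-similar ansatz with trainable exponents; a sketch under that assumption (the ansatz u(x, t) ~ t**alpha * F(x / t**beta) and all names are illustrative, not necessarily the paper's parameterization):

```python
import torch
import torch.nn as nn

class SelfSimilarFit(nn.Module):
    """Sketch: fit u(x, t) ~ t**alpha * F(x / t**beta) with trainable
    exponents; a successful fit yields the scaling exponents (alpha, beta)."""
    def __init__(self, hidden=32):
        super().__init__()
        self.alpha = nn.Parameter(torch.zeros(1))
        self.beta = nn.Parameter(torch.zeros(1))
        self.F = nn.Sequential(nn.Linear(1, hidden), nn.Tanh(),
                               nn.Linear(hidden, 1))

    def forward(self, x, t):                      # x, t: 1-D tensors, t > 0
        xi = (x / t ** self.beta).unsqueeze(-1)   # similarity variable
        return (t ** self.alpha) * self.F(xi).squeeze(-1)

# train by minimizing ((model(x, t) - u_obs) ** 2).mean() over observations
```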
arXiv Detail & Related papers (2024-06-06T09:36:05Z)
- Shape Arithmetic Expressions: Advancing Scientific Discovery Beyond Closed-Form Equations [56.78271181959529]
Generalized Additive Models (GAMs) can capture non-linear relationships between variables and targets, but they cannot capture intricate feature interactions.
We propose Shape Expressions Arithmetic (SHAREs), which fuses GAMs' flexible shape functions with the complex feature interactions found in mathematical expressions.
We also design a set of rules for constructing SHAREs that guarantee transparency of the found expressions beyond the standard constraints.
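A toy illustration of the idea, not the paper's construction: shape functions are small 1-D networks (as in a neural GAM) composed through a fixed arithmetic expression; in the paper the expression structure itself is part of the search.

```python
import torch
import torch.nn as nn

def shape_fn(hidden=16):
    """One learnable 1-D shape function, as in a neural GAM."""
    return nn.Sequential(nn.Linear(1, hidden), nn.Tanh(), nn.Linear(hidden, 1))

class ToySHARE(nn.Module):
    """Toy: f(x) = g1(x1) + g2(x2) * g3(x3) mixes additive shape
    functions with a multiplicative feature interaction."""
    def __init__(self):
        super().__init__()
        self.g1, self.g2, self.g3 = shape_fn(), shape_fn(), shape_fn()

    def forward(self, x):                     # x: (batch, 3)
        g = lambda net, c: net(x[:, c:c + 1])
        return g(self.g1, 0) + g(self.g2, 1) * g(self.g3, 2)
```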
arXiv Detail & Related papers (2024-04-15T13:44:01Z)
- Stochastic parameter reduced-order model based on hybrid machine learning approaches [4.378407481656902]
This paper constructs a Convolutional Autoencoder-Reservoir Computing-Normalizing Flow algorithm framework.
The framework is used to characterize the evolution of latent state variables.
In this way, a data-driven reduced-order model is constructed to describe the complex system and its dynamic behavior.
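Of the three components, the reservoir-computing stage that evolves the latent variables is the easiest to sketch; below is a generic echo-state update, with all sizes and scalings as assumptions (the convolutional autoencoder and normalizing flow are not shown):

```python
import numpy as np

rng = np.random.default_rng(0)
n_latent, n_res = 8, 400                      # latent dim, reservoir size

W_in = rng.uniform(-0.5, 0.5, (n_res, n_latent))
W = rng.normal(size=(n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius < 1

def step(h, z):
    """One echo-state update of reservoir state h driven by latent z."""
    return np.tanh(W_in @ z + W @ h)

# A linear readout fit by ridge regression maps reservoir states to the
# next latent state; a normalizing flow could model the residual noise.
```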
arXiv Detail & Related papers (2024-03-24T06:52:37Z)
- Gradient-Based Feature Learning under Structured Data [57.76552698981579]
In the anisotropic setting, the commonly used spherical gradient dynamics may fail to recover the true direction.
We show that appropriate weight normalization that is reminiscent of batch normalization can alleviate this issue.
In particular, under the spiked model with a suitably large spike, the sample complexity of gradient-based training can be made independent of the information exponent.
arXiv Detail & Related papers (2023-09-07T16:55:50Z)
- DIFFormer: Scalable (Graph) Transformers Induced by Energy Constrained Diffusion [66.21290235237808]
We introduce an energy constrained diffusion model which encodes a batch of instances from a dataset into evolutionary states.
We provide rigorous theory that implies closed-form optimal estimates for the pairwise diffusion strength among arbitrary instance pairs.
Experiments highlight the wide applicability of our model as a general-purpose encoder backbone with superior performance in various tasks.
arXiv Detail & Related papers (2023-01-23T15:18:54Z)
- Why neural networks find simple solutions: the many regularizers of geometric complexity [13.729491571993163]
We develop the notion of geometric complexity, which is a measure of the variability of the model function, computed using a discrete Dirichlet energy.
We show that many common training heuristics, such as parameter norm regularization, spectral norm regularization, flatness regularization, gradient regularization, and noise regularization, all act to control geometric complexity.
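This quantity is concrete enough to compute directly; a sketch for scalar-output models (the function name and batching are mine, but the quantity, the mean squared input-gradient norm over the data, is the discrete Dirichlet energy the summary describes):

```python
import torch

def geometric_complexity(f, x):
    """Discrete Dirichlet energy: mean squared norm of the input
    gradient of f over the batch x (scalar-output case)."""
    x = x.requires_grad_(True)
    y = f(x)                                   # shape (batch,) or (batch, 1)
    grad = torch.autograd.grad(y.sum(), x, create_graph=True)[0]
    return (grad ** 2).sum(dim=1).mean()
```

For vector-valued outputs one would instead sum the squared entries of the full Jacobian.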
arXiv Detail & Related papers (2022-09-27T00:16:38Z)
- Augmenting Implicit Neural Shape Representations with Explicit Deformation Fields [95.39603371087921]
Implicit neural representation is a recent approach to learning shape collections as zero level-sets of neural networks.
We advocate deformation-aware regularization for implicit neural representations, aiming at producing plausible deformations as latent code changes.
arXiv Detail & Related papers (2021-08-19T22:07:08Z)
- Coupling-based Invertible Neural Networks Are Universal Diffeomorphism Approximators [72.62940905965267]
Invertible neural networks based on coupling flows (CF-INNs) have various machine learning applications such as image synthesis and representation learning.
Are CF-INNs universal approximators for invertible functions?
We prove a general theorem to show the equivalence of the universality for certain diffeomorphism classes.
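For context, a single affine coupling layer, the building block of the CF-INNs in question, is invertible by construction; a standard sketch (layer sizes are arbitrary):

```python
import torch
import torch.nn as nn

class AffineCoupling(nn.Module):
    """Affine coupling: transform one half of x conditioned on the other;
    the inverse is available in closed form."""
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.d = dim // 2
        self.net = nn.Sequential(nn.Linear(self.d, hidden), nn.ReLU(),
                                 nn.Linear(hidden, 2 * (dim - self.d)))

    def forward(self, x):
        x1, x2 = x[:, :self.d], x[:, self.d:]
        s, t = self.net(x1).chunk(2, dim=1)
        return torch.cat([x1, x2 * torch.exp(s) + t], dim=1)

    def inverse(self, y):
        y1, y2 = y[:, :self.d], y[:, self.d:]
        s, t = self.net(y1).chunk(2, dim=1)
        return torch.cat([y1, (y2 - t) * torch.exp(-s)], dim=1)
```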
arXiv Detail & Related papers (2020-06-20T02:07:37Z)
- Self-Supervised Learning of Generative Spin-Glasses with Normalizing Flows [0.0]
We develop continuous spin-glass distributions with normalizing flows to model correlations in generic discrete problems.
We demonstrate that key physical and computational properties of the spin-glass phase can be successfully learned.
Remarkably, we observe that the learning itself corresponds to a spin-glass phase transition within the layers of the trained normalizing flows.
arXiv Detail & Related papers (2020-01-02T19:00:01Z)