OnsagerNet: Learning Stable and Interpretable Dynamics using a
Generalized Onsager Principle
- URL: http://arxiv.org/abs/2009.02327v3
- Date: Mon, 18 Oct 2021 02:35:51 GMT
- Title: OnsagerNet: Learning Stable and Interpretable Dynamics using a
Generalized Onsager Principle
- Authors: Haijun Yu, Xinyuan Tian, Weinan E and Qianxiao Li
- Abstract summary: We learn stable and physically interpretable dynamical models using sampled trajectory data from physical processes based on a generalized Onsager principle.
We further apply this method to study Rayleigh-Bénard convection and learn Lorenz-like low-dimensional autonomous reduced-order models.
- Score: 19.13913681239968
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose a systematic method for learning stable and physically
interpretable dynamical models using sampled trajectory data from physical
processes based on a generalized Onsager principle. The learned dynamics are
autonomous ordinary differential equations parameterized by neural networks
that retain clear physical structure information, such as free energy,
diffusion, conservative motion and external forces. For high-dimensional
problems with a low-dimensional slow manifold, an autoencoder with metric-preserving
regularization is introduced to find the low-dimensional generalized
coordinates on which we learn the generalized Onsager dynamics. Our method
exhibits clear advantages over existing methods on benchmark problems for
learning ordinary differential equations. We further apply this method to study
Rayleigh-Bénard convection and learn Lorenz-like low-dimensional autonomous
reduced-order models that capture both qualitative and quantitative properties
of the underlying dynamics. This provides a general approach to building
reduced-order models for forced dissipative systems.
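The abstract specifies the structural ingredients of the learned ODE (a free energy, a dissipative part, a conservative part and an external force) but not an implementation. The sketch below shows one way such a structured right-hand side could be parameterized in PyTorch; the module name `OnsagerRHS`, the layer sizes and the exact parameterizations of the potential and the matrices are illustrative assumptions, not the authors' code.

```python
# Minimal PyTorch sketch (not the authors' code) of a right-hand side with the
# structure described in the abstract:
#   dh/dt = -(M(h) + W(h)) grad V(h) + f(h)
# with M(h) symmetric positive semi-definite (diffusion), W(h) antisymmetric
# (conservative motion), V(h) a learned potential (free energy), and f(h) an
# external force. Names, layer sizes and parameterizations are assumptions.
import torch
import torch.nn as nn


class OnsagerRHS(nn.Module):
    def __init__(self, dim: int, hidden: int = 64):
        super().__init__()
        self.dim = dim
        # Raw entries used to build M(h) and W(h).
        self.mat_net = nn.Sequential(nn.Linear(dim, hidden), nn.Tanh(),
                                     nn.Linear(hidden, 2 * dim * dim))
        # External force f(h).
        self.force_net = nn.Sequential(nn.Linear(dim, hidden), nn.Tanh(),
                                       nn.Linear(hidden, dim))
        # Potential network; squaring its output keeps V bounded below.
        self.pot_net = nn.Sequential(nn.Linear(dim, hidden), nn.Tanh(),
                                     nn.Linear(hidden, hidden))

    def potential(self, h: torch.Tensor) -> torch.Tensor:
        # V(h) >= 0 by construction: sum of squares plus a small quadratic term.
        return 0.5 * (self.pot_net(h) ** 2).sum(-1) + 0.01 * (h ** 2).sum(-1)

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        if not h.requires_grad:          # needed to take grad V with autograd
            h = h.requires_grad_(True)
        grad_v = torch.autograd.grad(self.potential(h).sum(), h,
                                     create_graph=True)[0]

        raw = self.mat_net(h).view(-1, 2, self.dim, self.dim)
        L, A = raw[:, 0], raw[:, 1]
        M = L @ L.transpose(-1, -2)      # symmetric positive semi-definite
        W = A - A.transpose(-1, -2)      # antisymmetric
        f = self.force_net(h)

        # Dissipative + conservative drift along -grad V, plus external forcing.
        return -torch.einsum('bij,bj->bi', M + W, grad_v) + f


rhs = OnsagerRHS(dim=3)
print(rhs(torch.randn(5, 3)).shape)      # torch.Size([5, 3])
```

The factorization M = L L^T and the antisymmetrization W = A - A^T enforce the dissipative/conservative split by construction, so that structural property does not have to be learned from data. A similarly hedged sketch of the dimension-reduction step follows: the abstract only states that an autoencoder with metric-preserving regularization supplies the low-dimensional generalized coordinates, so the pairwise-distance penalty below is one plausible realization, not necessarily the paper's exact term.

```python
# Hedged sketch of an autoencoder with a metric-preserving penalty. Matching
# pairwise distances between inputs and latent codes is an assumed, simple
# realization of such a regularizer, not necessarily the paper's exact term.
import torch
import torch.nn as nn


class MetricAE(nn.Module):
    def __init__(self, in_dim: int, latent_dim: int, hidden: int = 128):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU(),
                                 nn.Linear(hidden, latent_dim))
        self.dec = nn.Sequential(nn.Linear(latent_dim, hidden), nn.ReLU(),
                                 nn.Linear(hidden, in_dim))

    def forward(self, x):
        z = self.enc(x)
        return z, self.dec(z)


def ae_loss(model: MetricAE, x: torch.Tensor, lam: float = 0.1) -> torch.Tensor:
    z, x_rec = model(x)
    recon = ((x - x_rec) ** 2).mean()
    # Metric-preserving term: latent pairwise distances should track ambient ones.
    metric = ((torch.cdist(z, z) - torch.cdist(x, x)) ** 2).mean()
    return recon + lam * metric
```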
Related papers
- Latent Space Energy-based Neural ODEs [73.01344439786524]
This paper introduces a novel family of deep dynamical models designed to represent continuous-time sequence data.
We train the model using maximum likelihood estimation with Markov chain Monte Carlo.
Experiments on oscillating systems, videos and real-world state sequences (MuJoCo) illustrate that ODEs with the learnable energy-based prior outperform existing counterparts.
arXiv Detail & Related papers (2024-09-05T18:14:22Z) - Modeling Latent Neural Dynamics with Gaussian Process Switching Linear Dynamical Systems [2.170477444239546]
We develop an approach that balances flexibility and interpretability: the Gaussian Process Switching Linear Dynamical System (gpSLDS).
Our method builds on previous work modeling the latent state evolution via a differential equation whose nonlinear dynamics are described by a Gaussian process (GP-SDEs).
Our approach resolves key limitations of the rSLDS such as artifactual oscillations in dynamics near discrete state boundaries, while also providing posterior uncertainty estimates of the dynamics.
arXiv Detail & Related papers (2024-07-19T15:32:15Z) - Symmetry-regularized neural ordinary differential equations [0.0]
This paper introduces new conservation relations in Neural ODEs using Lie symmetries in both the hidden-state dynamics and the backpropagation dynamics.
These conservation laws are then incorporated into the loss function as additional regularization terms, potentially enhancing the physical interpretability and generalizability of the model.
New loss functions are constructed from these conservation relations, demonstrating the applicability of symmetry-regularized Neural ODEs in typical modeling tasks.
arXiv Detail & Related papers (2023-11-28T09:27:44Z) - Discovering Interpretable Physical Models using Symbolic Regression and
Discrete Exterior Calculus [55.2480439325792]
We propose a framework that combines Symbolic Regression (SR) and Discrete Exterior Calculus (DEC) for the automated discovery of physical models.
DEC provides building blocks for the discrete analogue of field theories, which are beyond the state-of-the-art applications of SR to physical problems.
We demonstrate the effectiveness of our methodology by re-discovering three models of continuum physics from synthetic experimental data.
arXiv Detail & Related papers (2023-10-10T13:23:05Z) - Learning Neural Constitutive Laws From Motion Observations for
Generalizable PDE Dynamics [97.38308257547186]
Many NN approaches learn an end-to-end model that implicitly captures both the governing PDE and the material model.
We argue that the governing PDEs are often well known and should be explicitly enforced rather than learned.
We introduce a new framework termed "Neural Constitutive Laws" (NCLaw), which utilizes a network architecture that strictly guarantees standard constitutive priors.
arXiv Detail & Related papers (2023-04-27T17:42:24Z) - Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z) - ConCerNet: A Contrastive Learning Based Framework for Automated
Conservation Law Discovery and Trustworthy Dynamical System Prediction [82.81767856234956]
This paper proposes a new learning framework named ConCerNet to improve the trustworthiness of DNN-based dynamics modeling.
We show that our method consistently outperforms the baseline neural networks in both coordinate error and conservation metrics.
arXiv Detail & Related papers (2023-02-11T21:07:30Z) - Stretched and measured neural predictions of complex network dynamics [2.1024950052120417]
Data-driven approximations of differential equations present a promising alternative to traditional methods for uncovering models of dynamical systems.
Neural networks are a recently employed machine-learning tool for studying dynamics; they can be used for data-driven solution or discovery of differential equations.
We show that extending the model's generalizability beyond traditional statistical learning theory limits is feasible.
arXiv Detail & Related papers (2023-01-12T09:44:59Z) - Learning Low-Dimensional Quadratic-Embeddings of High-Fidelity Nonlinear
Dynamics using Deep Learning [9.36739413306697]
Learning dynamical models from data plays a vital role in engineering design, optimization, and predictions.
We use deep learning to identify low-dimensional embeddings for high-fidelity dynamical systems.
arXiv Detail & Related papers (2021-11-25T10:09:00Z) - Euclideanizing Flows: Diffeomorphic Reduction for Learning Stable
Dynamical Systems [74.80320120264459]
We present an approach to learning complex motions from a limited number of human demonstrations.
The complex motions are encoded as rollouts of a stable dynamical system.
The efficacy of this approach is demonstrated through validation on an established benchmark as well as on demonstrations collected on a real-world robotic system.
arXiv Detail & Related papers (2020-05-27T03:51:57Z) - Symplectic ODE-Net: Learning Hamiltonian Dynamics with Control [14.24939133094439]
We introduce Symplectic ODE-Net (SymODEN), a deep learning framework which can infer the dynamics of a physical system.
In particular, we enforce Hamiltonian dynamics with control to learn the underlying dynamics in a transparent way.
This framework, by offering interpretable, physically consistent models for physical systems, opens up new possibilities for synthesizing model-based control strategies.
arXiv Detail & Related papers (2019-09-26T13:13:16Z)
This list is automatically generated from the titles and abstracts of the papers on this site.