Symplectic Momentum Neural Networks -- Using Discrete Variational
Mechanics as a prior in Deep Learning
- URL: http://arxiv.org/abs/2201.08281v2
- Date: Fri, 21 Jan 2022 13:52:03 GMT
- Title: Symplectic Momentum Neural Networks -- Using Discrete Variational
Mechanics as a prior in Deep Learning
- Authors: Saul Santos, Monica Ekal, Rodrigo Ventura
- Abstract summary: This paper introduces Symplectic Momentum Neural Networks (SyMo) as models derived from a discrete formulation of mechanics for non-separable mechanical systems.
We show that this combination not only allows these models to learn from limited data but also provides them with the capability of preserving the symplectic form, yielding better long-term behaviour.
- Score: 7.090165638014331
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: With deep learning gaining attention from the research community for
prediction and control of real physical systems, learning important
representations is becoming more important than ever. It is of extreme
importance that deep learning representations are coherent with physics. When
learning from discrete data this can be guaranteed by including some sort of
prior in the learning; however, not all discretization priors preserve
important structures from the physics. In this paper we introduce Symplectic
Momentum Neural Networks (SyMo) as models derived from a discrete formulation of
mechanics for non-separable mechanical systems. This formulation constrains
SyMos towards preserving important geometric structures, such as momentum and
the symplectic form, and enables learning from limited data. Furthermore, it
allows learning dynamics from poses alone as training data. We extend SyMos to
include variational integrators within the learning framework by developing an
implicit root-finding layer, which leads to End-to-End Symplectic Momentum
Neural Networks (E2E-SyMo). Through experimental results on the pendulum and
cartpole, we show that this combination not only allows these models to learn
from limited data but also provides them with the capability of preserving the
symplectic form and exhibiting better long-term behaviour.
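The abstract's core idea, learning dynamics from poses alone via the discrete Euler-Lagrange equations and an implicit root-finding step, can be sketched numerically. The following is a minimal illustrative example for a simple pendulum, not the authors' implementation: the discrete Lagrangian, the midpoint rule, and all function names here are assumptions chosen for illustration.

```python
# Sketch: one step of an implicit variational (midpoint-rule) integrator,
# the kind of structure a root-finding layer would solve inside a SyMo-style
# model. Hypothetical names and a hand-written pendulum Lagrangian.
import numpy as np
from scipy.optimize import newton

def discrete_lagrangian(q0, q1, h, m=1.0, l=1.0, g=9.81):
    """Midpoint-rule discrete Lagrangian L_d(q0, q1) ~ h * L(q_mid, v_mid)."""
    q_mid = 0.5 * (q0 + q1)
    v_mid = (q1 - q0) / h
    kinetic = 0.5 * m * (l * v_mid) ** 2
    potential = -m * g * l * np.cos(q_mid)
    return h * (kinetic - potential)

def del_residual(q2, q0, q1, h, eps=1e-6):
    """Discrete Euler-Lagrange residual: D2 L_d(q0, q1) + D1 L_d(q1, q2) = 0.
    Derivatives are taken by central finite differences for simplicity."""
    d2 = (discrete_lagrangian(q0, q1 + eps, h)
          - discrete_lagrangian(q0, q1 - eps, h)) / (2 * eps)
    d1 = (discrete_lagrangian(q1 + eps, q2, h)
          - discrete_lagrangian(q1 - eps, q2, h)) / (2 * eps)
    return d2 + d1

def step(q0, q1, h):
    """Solve the implicit DEL equation for the next pose q2 (root-finding)."""
    guess = 2 * q1 - q0  # linear extrapolation as the initial guess
    return newton(lambda q2: del_residual(q2, q0, q1, h), guess)

# Roll out from two initial poses only -- no velocities needed.
h = 0.01
qs = [0.5, 0.5]
for _ in range(1000):
    qs.append(step(qs[-2], qs[-1], h))
```

Because the update comes from a discrete Lagrangian, the map is symplectic and momentum-consistent by construction, which is why the resulting trajectory stays bounded over long horizons instead of gaining or losing energy the way an explicit Euler rollout would.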
Related papers
- Stretched and measured neural predictions of complex network dynamics [2.1024950052120417]
Data-driven approximations of differential equations present a promising alternative to traditional methods for uncovering a model of dynamical systems.
Neural networks are a machine learning tool recently employed for studying dynamics; they can be used for data-driven solution finding or for the discovery of differential equations.
We show that extending the model's generalizability beyond traditional statistical learning theory limits is feasible.
arXiv Detail & Related papers (2023-01-12T09:44:59Z) - Physics-Inspired Temporal Learning of Quadrotor Dynamics for Accurate
Model Predictive Trajectory Tracking [76.27433308688592]
Accurately modeling a quadrotor's system dynamics is critical for guaranteeing agile, safe, and stable navigation.
We present a novel Physics-Inspired Temporal Convolutional Network (PI-TCN) approach to learning a quadrotor's system dynamics purely from robot experience.
Our approach combines the expressive power of sparse temporal convolutions and dense feed-forward connections to make accurate system predictions.
arXiv Detail & Related papers (2022-06-07T13:51:35Z) - EINNs: Epidemiologically-Informed Neural Networks [75.34199997857341]
We introduce a new class of physics-informed neural networks, EINNs, crafted for epidemic forecasting.
We investigate how to leverage both the theoretical flexibility provided by mechanistic models and the data-driven expressibility afforded by AI models.
arXiv Detail & Related papers (2022-02-21T18:59:03Z) - Leveraging the structure of dynamical systems for data-driven modeling [111.45324708884813]
We consider the impact of the training set and its structure on the quality of the long-term prediction.
We show how an informed design of the training set, based on invariants of the system and the structure of the underlying attractor, significantly improves the resulting models.
arXiv Detail & Related papers (2021-12-15T20:09:20Z) - Constructing Neural Network-Based Models for Simulating Dynamical
Systems [59.0861954179401]
Data-driven modeling is an alternative paradigm that seeks to learn an approximation of the dynamics of a system using observations of the true system.
This paper provides a survey of the different ways to construct models of dynamical systems using neural networks.
In addition to the basic overview, we review the related literature and outline the most significant challenges from numerical simulations that this modeling paradigm must overcome.
arXiv Detail & Related papers (2021-11-02T10:51:42Z) - Physics-guided Deep Markov Models for Learning Nonlinear Dynamical
Systems with Uncertainty [6.151348127802708]
We propose a physics-guided framework, termed the Physics-guided Deep Markov Model (PgDMM).
The proposed framework takes advantage of the expressive power of deep learning, while retaining the driving physics of the dynamical system.
arXiv Detail & Related papers (2021-10-16T16:35:12Z) - Physics-Coupled Spatio-Temporal Active Learning for Dynamical Systems [15.923190628643681]
One of the major challenges is to infer the underlying causes, which generate the perceived data stream.
Success of machine learning based predictive models requires massive annotated data for model training.
Our experiments on both synthetic and real-world datasets exhibit that the proposed ST-PCNN with active learning converges to optimal accuracy with substantially fewer instances.
arXiv Detail & Related papers (2021-08-11T18:05:55Z) - Physics-Integrated Variational Autoencoders for Robust and Interpretable
Generative Modeling [86.9726984929758]
We focus on the integration of incomplete physics models into deep generative models.
We propose a VAE architecture in which a part of the latent space is grounded by physics.
We demonstrate generative performance improvements over a set of synthetic and real-world datasets.
arXiv Detail & Related papers (2021-02-25T20:28:52Z) - Deep learning of contagion dynamics on complex networks [0.0]
We propose a complementary approach based on deep learning to build effective models of contagion dynamics on networks.
By allowing simulations on arbitrary network structures, our approach makes it possible to explore the properties of the learned dynamics beyond the training data.
Our results demonstrate how deep learning offers a new and complementary perspective to build effective models of contagion dynamics on networks.
arXiv Detail & Related papers (2020-06-09T17:18:34Z) - Learning Stable Deep Dynamics Models [91.90131512825504]
We propose an approach for learning dynamical systems that are guaranteed to be stable over the entire state space.
We show that such learning systems are able to model simple dynamical systems and can be combined with additional deep generative models to learn complex dynamics.
arXiv Detail & Related papers (2020-01-17T00:04:45Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.