Learning reversible symplectic dynamics
- URL: http://arxiv.org/abs/2204.12323v1
- Date: Tue, 26 Apr 2022 14:07:40 GMT
- Title: Learning reversible symplectic dynamics
- Authors: Riccardo Valperga, Kevin Webster, Victoria Klein, Dmitry Turaev and
Jeroen S. W. Lamb
- Abstract summary: We propose a new neural network architecture for learning time-reversible dynamical systems from data.
We focus on an adaptation to symplectic systems, because of their importance in physics-informed learning.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Time-reversal symmetry arises naturally as a structural property in many
dynamical systems of interest. While the importance of hard-wiring symmetry is
increasingly recognized in machine learning, to date this has eluded
time-reversibility. In this paper we propose a new neural network architecture
for learning time-reversible dynamical systems from data. We focus in
particular on an adaptation to symplectic systems, because of their importance
in physics-informed learning.
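To make the two structural properties concrete: a map Phi_h is time-reversible with respect to an involution R when R ∘ Phi_h ∘ R = Phi_h^{-1}, and symplectic (for one degree of freedom) when it preserves phase-space area. The sketch below is purely illustrative and is not the architecture proposed in the paper; it numerically verifies both properties for the classical leapfrog integrator of a pendulum, with R(q, p) = (q, -p). All names and constants in it are our own choices.
```python
# Illustrative check (not the paper's architecture): the leapfrog map for a
# separable Hamiltonian H(q, p) = p^2 / 2 + V(q) is both time-reversible
# under R(q, p) = (q, -p) and symplectic (area-preserving in one degree of
# freedom). The potential and constants below are arbitrary choices.
import numpy as np

def grad_V(q):
    return np.sin(q)  # pendulum potential V(q) = -cos(q)

def leapfrog(q, p, h):
    """One step of the leapfrog (velocity Verlet) map Phi_h."""
    p_half = p - 0.5 * h * grad_V(q)
    q_new = q + h * p_half
    p_new = p_half - 0.5 * h * grad_V(q_new)
    return q_new, p_new

def R(q, p):
    """Momentum-reversing involution."""
    return q, -p

h = 0.1
q0, p0 = 0.3, 1.2
q1, p1 = leapfrog(q0, p0, h)

# Reversibility: Phi_h(R(Phi_h(x))) = R(x), equivalent to R o Phi_h o R = Phi_h^{-1}.
assert np.allclose(leapfrog(*R(q1, p1), h), R(q0, p0))

# Symplecticity: the Jacobian of Phi_h has determinant 1 (finite differences).
eps = 1e-6
J = np.array(
    [[(leapfrog(q0 + eps, p0, h)[i] - leapfrog(q0 - eps, p0, h)[i]) / (2 * eps)
      for i in range(2)],
     [(leapfrog(q0, p0 + eps, h)[i] - leapfrog(q0, p0 - eps, h)[i]) / (2 * eps)
      for i in range(2)]]).T
assert np.isclose(np.linalg.det(J), 1.0)
print("leapfrog is R-reversible and area-preserving (up to round-off)")
```
The leapfrog map satisfies both properties exactly, which is why symplectic, reversible integrators are a natural discrete-time target for architectures of the kind this paper proposes.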
Related papers
- Learning System Dynamics without Forgetting [60.08612207170659]
Predicting trajectories of systems with unknown dynamics is crucial in various research fields, including physics and biology.
We present a novel framework of Mode-switching Graph ODE (MS-GODE), which can continually learn varying dynamics.
We construct a novel benchmark of biological dynamic systems, featuring diverse systems with disparate dynamics.
arXiv Detail & Related papers (2024-06-30T14:55:18Z)
- On instabilities in neural network-based physics simulators [0.0]
Long-time dynamics produced by neural networks are often unphysical or unstable.
We show that the rate of convergence of the training dynamics is uneven and depends on the distribution of energy in the data.
Injecting synthetic noise into the data during training adds damping to the training dynamics and can stabilize the learned simulator.
arXiv Detail & Related papers (2024-06-18T23:25:14Z)
- Persistent learning signals and working memory without continuous attractors [6.135577623169029]
We show that quasi-periodic attractors can support learning arbitrarily long temporal relationships.
Our theory has broad implications for the design of artificial learning systems.
arXiv Detail & Related papers (2023-08-24T06:12:41Z)
- Critical Learning Periods for Multisensory Integration in Deep Networks [112.40005682521638]
We show that the ability of a neural network to integrate information from diverse sources hinges critically on being exposed to properly correlated signals during the early phases of training.
We show that critical periods arise from the complex and unstable early transient dynamics, which are decisive for the final performance of the trained system and its learned representations.
arXiv Detail & Related papers (2022-10-06T23:50:38Z)
- Symplectic Momentum Neural Networks -- Using Discrete Variational Mechanics as a prior in Deep Learning [7.090165638014331]
This paper introduces Symplectic Momentum Networks (SyMo) as models derived from a discrete formulation of mechanics for non-separable mechanical systems.
We show that such a combination not only allows these models to learn from limited data but also endows them with the capability of preserving the symplectic form, yielding better long-term behaviour.
arXiv Detail & Related papers (2022-01-20T16:33:19Z)
- Leveraging the structure of dynamical systems for data-driven modeling [111.45324708884813]
We consider the impact of the training set and its structure on the quality of the long-term prediction.
We show how an informed design of the training set, based on invariants of the system and the structure of the underlying attractor, significantly improves the resulting models.
arXiv Detail & Related papers (2021-12-15T20:09:20Z)
- Port-Hamiltonian Neural Networks for Learning Explicit Time-Dependent Dynamical Systems [2.6084034060847894]
Accurately learning the temporal behavior of dynamical systems requires models with well-chosen learning biases.
Recent innovations embed the Hamiltonian and Lagrangian formalisms into neural networks.
We show that the proposed port-Hamiltonian neural network can efficiently learn the dynamics of nonlinear physical systems of practical interest.
arXiv Detail & Related papers (2021-07-16T17:31:54Z)
- Learning Contact Dynamics using Physically Structured Neural Networks [81.73947303886753]
We use connections between deep neural networks and differential equations to design a family of deep network architectures for representing contact dynamics between objects.
We show that these networks can learn discontinuous contact events in a data-efficient manner from noisy observations.
Our results indicate that an idealised form of touch feedback is a key component of making this learning problem tractable.
arXiv Detail & Related papers (2021-02-22T17:33:51Z)
- Time-Reversal Symmetric ODE Network [138.02741983098454]
Time-reversal symmetry is a fundamental property that frequently holds in classical and quantum mechanics.
We propose a novel loss function that measures how well our ordinary differential equation (ODE) networks comply with this time-reversal symmetry; a schematic of such a penalty is sketched after this list.
We show that, even for systems that do not possess the full time-reversal symmetry, these time-reversal symmetric ODE networks (TRS-ODENs) achieve better predictive performance than baselines.
arXiv Detail & Related papers (2020-07-22T12:19:40Z)
- Learning Stable Deep Dynamics Models [91.90131512825504]
We propose an approach for learning dynamical systems that are guaranteed to be stable over the entire state space.
We show that such learning systems are able to model simple dynamical systems and can be combined with additional deep generative models to learn complex dynamics.
arXiv Detail & Related papers (2020-01-17T00:04:45Z)
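As referenced in the Time-Reversal Symmetric ODE Network entry above, one way to encourage (rather than hard-wire) time-reversal symmetry is a soft penalty on the learned flow. The sketch below is an assumption-laden illustration, not that paper's definition: it penalizes deviation from the discrete-time reversibility condition Phi_h ∘ R ∘ Phi_h = R, with a toy vector field and Euler steps standing in for an ODE network.
```python
# Hedged sketch of a soft time-reversal penalty in the spirit of the
# TRS-ODEN entry above; the vector field, integrator, and exact form of
# the loss are illustrative assumptions, not the paper's definitions.
import numpy as np

def f(x, theta):
    """Toy stand-in for a learned ODE network's vector field."""
    return np.tanh(theta @ x)

def step(x, theta, h):
    """One explicit Euler step of the learned flow Phi_h."""
    return x + h * f(x, theta)

def R(x):
    """Reversing operator on states x = (q, p): negate the momenta."""
    q, p = np.split(x, 2)
    return np.concatenate([q, -p])

def trs_penalty(x, theta, h):
    """Penalize deviation from Phi_h(R(Phi_h(x))) = R(x), a discrete-time
    statement of R-reversibility of the learned flow."""
    x_back = step(R(step(x, theta, h)), theta, h)
    return np.sum((x_back - R(x)) ** 2)

rng = np.random.default_rng(0)
theta = rng.normal(size=(4, 4))   # hypothetical parameters
x0 = rng.normal(size=4)           # state (q1, q2, p1, p2)
# A full objective would combine a data-fitting term with
# lambda * trs_penalty(...); here we only evaluate the penalty.
print("TRS penalty:", trs_penalty(x0, theta, h=0.05))
```
In training, such a penalty would be weighted against a data-fitting term; this contrasts with the main paper's approach, which hard-wires reversibility into the architecture itself.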
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.