Discrete Lagrangian Neural Networks with Automatic Symmetry Discovery
- URL: http://arxiv.org/abs/2211.10830v1
- Date: Sun, 20 Nov 2022 00:46:33 GMT
- Title: Discrete Lagrangian Neural Networks with Automatic Symmetry Discovery
- Authors: Yana Lishkova, Paul Scherer, Steffen Ridderbusch, Mateja Jamnik,
Pietro Liò, Sina Ober-Blöbaum, Christian Offen
- Abstract summary: We introduce a framework to learn a discrete Lagrangian along with its symmetry group from discrete observations of motions.
The learning process does not restrict the form of the Lagrangian, does not require velocity or momentum observations or predictions, and incorporates a cost term.
- Score: 3.06483729892265
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: By one of the most fundamental principles in physics, a dynamical system will
exhibit those motions which extremise an action functional. This leads to the
formulation of the Euler-Lagrange equations, which serve as a model of how the
system will behave in time. If the dynamics exhibit additional symmetries, then
the motion fulfils additional conservation laws, such as conservation of energy
(time invariance), momentum (translation invariance), or angular momentum
(rotational invariance). To learn a system representation, one could learn the
discrete Euler-Lagrange equations, or alternatively, learn the discrete
Lagrangian function $\mathcal{L}_d$ which defines them. Based on ideas from Lie
group theory, in this work we introduce a framework to learn a discrete
Lagrangian along with its symmetry group from discrete observations of motions
and, therefore, identify conserved quantities. The learning process does not
restrict the form of the Lagrangian, does not require velocity or momentum
observations or predictions, and incorporates a cost term that safeguards
against unwanted solutions and against potential numerical issues in forward
simulations. The learnt discrete quantities are related to their continuous
analogues using variational backward error analysis, and numerical results
demonstrate the qualitative and quantitative improvements such models can
achieve, even in the presence of noise.
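To make the learning objective concrete, the following is a minimal, illustrative sketch (not the authors' code) of fitting a neural discrete Lagrangian so that observed position triples satisfy the discrete Euler-Lagrange equations $D_2\mathcal{L}_d(q_{k-1}, q_k) + D_1\mathcal{L}_d(q_k, q_{k+1}) = 0$. The network architecture and the degeneracy-avoiding regulariser here are assumptions, and the paper's symmetry-group learning and specific cost term are omitted.

```python
# Minimal, illustrative sketch (not the authors' implementation): fit a neural
# discrete Lagrangian L_d(q_k, q_{k+1}) so that observed position triples
# satisfy the discrete Euler-Lagrange residual
#   D2 L_d(q_{k-1}, q_k) + D1 L_d(q_k, q_{k+1}) = 0.
# The MLP, the regulariser, and its weight are assumptions made for the sketch.
import jax
import jax.numpy as jnp

def init_params(key, dim, hidden=64):
    k1, k2, k3 = jax.random.split(key, 3)
    return {
        "W1": jax.random.normal(k1, (2 * dim, hidden)) / jnp.sqrt(2.0 * dim),
        "b1": jnp.zeros(hidden),
        "W2": jax.random.normal(k2, (hidden, hidden)) / jnp.sqrt(1.0 * hidden),
        "b2": jnp.zeros(hidden),
        "W3": jax.random.normal(k3, (hidden, 1)) / jnp.sqrt(1.0 * hidden),
    }

def L_d(params, q0, q1):
    """Scalar neural discrete Lagrangian on one step (q_k, q_{k+1})."""
    h = jnp.tanh(jnp.concatenate([q0, q1]) @ params["W1"] + params["b1"])
    h = jnp.tanh(h @ params["W2"] + params["b2"])
    return (h @ params["W3"]).squeeze()

D1 = jax.grad(L_d, argnums=1)  # derivative w.r.t. the first slot
D2 = jax.grad(L_d, argnums=2)  # derivative w.r.t. the second slot

def del_residual(params, q_prev, q, q_next):
    """Discrete Euler-Lagrange residual on one observed triple."""
    return D2(params, q_prev, q) + D1(params, q, q_next)

def loss(params, q_prev, q, q_next, reg=1e-2):
    res = jax.vmap(del_residual, (None, 0, 0, 0))(params, q_prev, q, q_next)
    # Illustrative safeguard against the trivial (constant/degenerate) L_d:
    # keep the slot derivative D2 L_d bounded away from zero.
    d2 = jax.vmap(D2, (None, 0, 0))(params, q_prev, q)
    return jnp.mean(res ** 2) + reg * jnp.mean(jax.nn.relu(1.0 - jnp.abs(d2)))
```

Once trained, forward simulation would solve the same residual equation for $q_{k+1}$ given $(q_{k-1}, q_k)$, which is where the conditioning issues guarded against by the paper's cost term become relevant.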
Related papers
- Lagrangian Neural Networks for Reversible Dissipative Evolution [0.04681661603096333]
Most commonly, conservative systems are modeled, in which there are no frictional losses, so the system may be run forward and backward in time without requiring regularization.
This work addresses systems in which the reverse direction is ill-posed because of the dissipation that occurs in forward evolution.
The novelty is the use of the Morse-Feshbach Lagrangian, which models dissipative dynamics by doubling the number of dimensions of the system.
arXiv Detail & Related papers (2024-05-23T14:47:07Z)
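For context, a standard textbook instance of this dimension-doubling idea (the Bateman dual system treated by Morse and Feshbach; the paper's exact construction may differ) pairs the damped coordinate $x$ with a mirror coordinate $y$ that absorbs the dissipated energy:

```latex
% Illustrative Bateman / Morse-Feshbach dual Lagrangian for a damped oscillator
\[
  L(x, y, \dot{x}, \dot{y})
    = m\,\dot{x}\,\dot{y}
    + \tfrac{\gamma}{2}\,\bigl(x\,\dot{y} - \dot{x}\,y\bigr)
    - k\,x\,y .
\]
% Varying y gives the damped oscillator   m\ddot{x} + \gamma\dot{x} + k x = 0,
% varying x gives its time-reversed mirror m\ddot{y} - \gamma\dot{y} + k y = 0.
```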
- Machine learning of hidden variables in multiscale fluid simulation [77.34726150561087]
Solving fluid dynamics equations often requires the use of closure relations that account for missing microphysics.
In our study, a partial differential equation simulator that is end-to-end differentiable is used to train judiciously placed neural networks.
We show that this method enables an equation-based approach to reproduce non-linear, large-Knudsen-number plasma physics.
arXiv Detail & Related papers (2023-06-19T06:02:53Z)
- Correspondence between open bosonic systems and stochastic differential equations [77.34726150561087]
We show that there can also be an exact correspondence at finite $n$ when the bosonic system is generalized to include interactions with the environment.
A particular system with the form of a discrete nonlinear Schrödinger equation is analyzed in more detail.
arXiv Detail & Related papers (2023-02-03T19:17:37Z)
- Physics Informed RNN-DCT Networks for Time-Dependent Partial Differential Equations [62.81701992551728]
We present a physics-informed framework for solving time-dependent partial differential equations.
Our model utilizes discrete cosine transforms to encode spatial frequencies and recurrent neural networks to process the time evolution.
We show experimental results on the Taylor-Green vortex solution to the Navier-Stokes equations.
arXiv Detail & Related papers (2022-02-24T20:46:52Z)
- Decimation technique for open quantum systems: a case study with driven-dissipative bosonic chains [62.997667081978825]
Unavoidable coupling of quantum systems to external degrees of freedom leads to dissipative (non-unitary) dynamics.
We introduce a method to deal with these systems based on the calculation of (dissipative) lattice Green's function.
We illustrate the power of this method with several examples of driven-dissipative bosonic chains of increasing complexity.
arXiv Detail & Related papers (2022-02-15T19:00:09Z)
- Structure-Preserving Learning Using Gaussian Processes and Variational Integrators [62.31425348954686]
We propose the combination of a variational integrator for the nominal dynamics of a mechanical system and learning residual dynamics with Gaussian process regression.
We extend our approach to systems with known kinematic constraints and provide formal bounds on the prediction uncertainty.
arXiv Detail & Related papers (2021-12-10T11:09:29Z)
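A minimal sketch of the recipe described in this entry, with the pendulum, kernel, and noise level chosen purely for illustration: a variational (Störmer-Verlet) integrator provides the nominal one-step map, and a Gaussian process is fitted to the residual between observed and nominal next states.

```python
# Illustrative sketch only: nominal dynamics from a variational (Stormer-Verlet)
# integrator for an ideal pendulum, plus a GP fitted to the one-step residuals.
# The pendulum, kernel, and noise level are assumptions, not the paper's setup.
import jax.numpy as jnp

def nominal_step(q, p, dt, g_over_l=9.81):
    """Stormer-Verlet step for an ideal pendulum (a variational integrator)."""
    p_half = p - 0.5 * dt * g_over_l * jnp.sin(q)
    q_next = q + dt * p_half
    p_next = p_half - 0.5 * dt * g_over_l * jnp.sin(q_next)
    return jnp.array([q_next, p_next])

def rbf(A, B, length=1.0):
    """Squared-exponential kernel between rows of A (N, d) and B (M, d)."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return jnp.exp(-0.5 * d2 / length ** 2)

def fit_residual_gp(X, X_next, dt, noise=1e-4):
    """Fit a GP to the residual between observed and nominal one-step maps."""
    nominal = jnp.stack([nominal_step(q, p, dt) for q, p in X])
    R = X_next - nominal                            # residual the GP explains
    alpha = jnp.linalg.solve(rbf(X, X) + noise * jnp.eye(len(X)), R)
    def corrected_step(x):
        k_star = rbf(x[None, :], X)                 # (1, N)
        return nominal_step(x[0], x[1], dt) + (k_star @ alpha).ravel()
    return corrected_step
```

The GP posterior variance, omitted here for brevity, is what would supply the prediction-uncertainty bounds mentioned in the entry.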
- Lagrangian Neural Network with Differential Symmetries and Relational Inductive Bias [5.017136256232997]
We present a momentum conserving Lagrangian neural network (MCLNN) that learns the Lagrangian of a system.
We also show that the developed model can generalize to systems of arbitrary size.
arXiv Detail & Related papers (2021-10-07T08:49:57Z)
- Learning non-stationary Langevin dynamics from stochastic observations of latent trajectories [0.0]
Inferring Langevin equations from data can reveal how transient dynamics of such systems give rise to their function.
Here we present a non-stationary framework for inferring the Langevin equation, which explicitly models the observation process and non-stationary latent dynamics.
Omitting any of these non-stationary components results in incorrect inference, in which erroneous features arise in the dynamics due to the non-stationary data distribution.
arXiv Detail & Related papers (2020-12-29T21:22:21Z)
- LagNetViP: A Lagrangian Neural Network for Video Prediction [12.645753197663584]
We introduce a video prediction model where the equations of motion are explicitly constructed from learned representations of the underlying physical quantities.
We demonstrate the efficacy of this approach for video prediction on image sequences rendered in modified OpenAI gym Pendulum-v0 and Acrobot environments.
arXiv Detail & Related papers (2020-10-24T16:50:14Z)
- The role of boundary conditions in quantum computations of scattering observables [58.720142291102135]
Quantum computing may offer the opportunity to simulate strongly-interacting field theories, such as quantum chromodynamics, with physical time evolution.
As with present-day calculations, quantum computation strategies still require the restriction to a finite system size.
We quantify the volume effects for various $1+1$D Minkowski-signature quantities and show that these can be a significant source of systematic uncertainty.
arXiv Detail & Related papers (2020-07-01T17:43:11Z)
- Lagrangian Neural Networks [3.0059120458540383]
We propose Lagrangian Neural Networks (LNNs), which can parameterize arbitrary Lagrangians using neural networks.
In contrast to models that learn Hamiltonians, LNNs do not require canonical coordinates.
We show how this model can be applied to graphs and continuous systems using a Lagrangian Graph Network.
arXiv Detail & Related papers (2020-03-10T10:55:25Z)
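As a point of comparison with the discrete approach of the main paper, here is a minimal sketch of the continuous Lagrangian Neural Network idea from the entry above (not the reference implementation; the `lagrangian` model is assumed to be any differentiable scalar-valued network, e.g. an MLP like the one sketched earlier): accelerations are obtained from the Euler-Lagrange equations by automatic differentiation.

```python
# Minimal sketch of the Lagrangian-Neural-Network idea from the entry above
# (not the reference implementation): `lagrangian(params, q, qdot)` returns a
# scalar L, and accelerations follow from the Euler-Lagrange equations
#   (d2L/dqdot2) qddot + (d2L/dq dqdot) qdot = dL/dq.
import jax
import jax.numpy as jnp

def accelerations(lagrangian, params, q, qdot):
    """Solve the Euler-Lagrange equations for qddot given a learned L(q, qdot)."""
    dLdq = jax.grad(lagrangian, argnums=1)(params, q, qdot)    # (d,)
    H = jax.hessian(lagrangian, argnums=2)(params, q, qdot)    # (d, d)
    mixed = jax.jacfwd(jax.grad(lagrangian, argnums=2), argnums=1)(params, q, qdot)  # (d, d)
    return jnp.linalg.solve(H, dLdq - mixed @ qdot)
```

The resulting accelerations can be passed to any ODE solver; by contrast, the discrete framework of the main paper works directly on position sequences and needs no velocity data.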
This list is automatically generated from the titles and abstracts of the papers on this site.