Lagrangian Neural Networks
- URL: http://arxiv.org/abs/2003.04630v2
- Date: Thu, 30 Jul 2020 05:22:58 GMT
- Title: Lagrangian Neural Networks
- Authors: Miles Cranmer, Sam Greydanus, Stephan Hoyer, Peter Battaglia, David
Spergel, Shirley Ho
- Abstract summary: We propose Lagrangian Neural Networks (LNNs), which can parameterize arbitrary Lagrangians using neural networks.
In contrast to models that learn Hamiltonians, LNNs do not require canonical coordinates.
We show how this model can be applied to graphs and continuous systems using a Lagrangian Graph Network.
- Score: 3.0059120458540383
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Accurate models of the world are built upon notions of its underlying
symmetries. In physics, these symmetries correspond to conservation laws, such
as for energy and momentum. Yet even though neural network models see
increasing use in the physical sciences, they struggle to learn these
symmetries. In this paper, we propose Lagrangian Neural Networks (LNNs), which
can parameterize arbitrary Lagrangians using neural networks. In contrast to
models that learn Hamiltonians, LNNs do not require canonical coordinates, and
thus perform well in situations where canonical momenta are unknown or
difficult to compute. Unlike previous approaches, our method does not restrict
the functional form of learned energies and will produce energy-conserving
models for a variety of tasks. We test our approach on a double pendulum and a
relativistic particle, demonstrating energy conservation where a baseline
approach incurs dissipation and modeling relativity without canonical
coordinates where a Hamiltonian approach fails. Finally, we show how this model
can be applied to graphs and continuous systems using a Lagrangian Graph
Network, and demonstrate it on the 1D wave equation.
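To make the core mechanism concrete, here is a minimal sketch, written in JAX, of how an LNN turns a learned scalar Lagrangian L(q, q_dot) into accelerations by solving the Euler-Lagrange equations, q_ddot = (d2L/dq_dot2)^(-1) [dL/dq - (d2L/dq dq_dot) q_dot]. The two-layer MLP, parameter names, and shapes below are illustrative placeholders, not the authors' released code.

```python
import jax
import jax.numpy as jnp

def lagrangian(q, q_dot, params):
    # Stand-in for the learned network: a tiny MLP mapping (q, q_dot) to a scalar.
    x = jnp.concatenate([q, q_dot])
    h = jnp.tanh(params["W1"] @ x + params["b1"])
    return (params["W2"] @ h + params["b2"])[0]

def acceleration(q, q_dot, params):
    # Euler-Lagrange: (d/dt) dL/dq_dot = dL/dq, solved for q_ddot.
    grad_q = jax.grad(lagrangian, argnums=0)(q, q_dot, params)    # dL/dq
    hess = jax.hessian(lagrangian, argnums=1)(q, q_dot, params)   # d2L/dq_dot2
    mixed = jax.jacobian(jax.grad(lagrangian, argnums=1), argnums=0)(
        q, q_dot, params)                                         # d2L/(dq dq_dot)
    return jnp.linalg.solve(hess, grad_q - mixed @ q_dot)

# Illustrative usage on a 2-DoF system (hypothetical 16-unit hidden layer).
key1, key2 = jax.random.split(jax.random.PRNGKey(0))
params = {"W1": jax.random.normal(key1, (16, 4)), "b1": jnp.zeros(16),
          "W2": jax.random.normal(key2, (1, 16)), "b2": jnp.zeros(1)}
q, q_dot = jnp.array([0.1, 0.2]), jnp.array([0.3, -0.1])
print(acceleration(q, q_dot, params))  # shape (2,): q_ddot
```

Because the dynamics are derived from a single learned scalar function, trajectories integrated from these accelerations conserve the corresponding energy, which is the behavior the abstract contrasts with the dissipative baseline.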
Related papers
- Latent Space Energy-based Neural ODEs [73.01344439786524]
This paper introduces a novel family of deep dynamical models designed to represent continuous-time sequence data.
We train the model using maximum likelihood estimation with Markov chain Monte Carlo.
Experiments on oscillating systems, videos and real-world state sequences (MuJoCo) illustrate that ODEs with the learnable energy-based prior outperform existing counterparts.
arXiv Detail & Related papers (2024-09-05T18:14:22Z)
- Learning Neural Constitutive Laws From Motion Observations for Generalizable PDE Dynamics [97.38308257547186]
Many neural-network approaches learn an end-to-end model that implicitly captures both the governing PDE and the material model.
We argue that the governing PDEs are often well-known and should be explicitly enforced rather than learned.
We introduce a new framework termed "Neural Constitutive Laws" (NCLaw), which utilizes a network architecture that strictly guarantees standard constitutive priors.
arXiv Detail & Related papers (2023-04-27T17:42:24Z)
- Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which can then be applied in real time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z)
- Hamiltonian Neural Networks with Automatic Symmetry Detection [0.0]
Hamiltonian neural networks (HNNs) have been introduced to incorporate prior physical knowledge.
We enhance HNNs with a Lie algebra framework to detect and embed symmetries in the neural network.
arXiv Detail & Related papers (2023-01-19T07:34:57Z)
- Discrete Lagrangian Neural Networks with Automatic Symmetry Discovery [3.06483729892265]
We introduce a framework to learn a discrete Lagrangian along with its symmetry group from discrete observations of motions.
The learning process does not restrict the form of the Lagrangian, does not require velocity or momentum observations or predictions, and incorporates a cost term.
arXiv Detail & Related papers (2022-11-20T00:46:33Z)
- Learning Trajectories of Hamiltonian Systems with Neural Networks [81.38804205212425]
We propose to enhance Hamiltonian neural networks with an estimation of a continuous-time trajectory of the modeled system.
We demonstrate that the proposed integration scheme works well for HNNs, especially at low sampling rates and with noisy, irregular observations.
arXiv Detail & Related papers (2022-04-11T13:25:45Z)
- Physics Informed RNN-DCT Networks for Time-Dependent Partial Differential Equations [62.81701992551728]
We present a physics-informed framework for solving time-dependent partial differential equations.
Our model utilizes discrete cosine transforms to encode spatial frequencies and recurrent neural networks to process the time evolution.
We show experimental results on the Taylor-Green vortex solution to the Navier-Stokes equations.
arXiv Detail & Related papers (2022-02-24T20:46:52Z)
- Ab-Initio Potential Energy Surfaces by Pairing GNNs with Neural Wave Functions [2.61072980439312]
In this work, we combine a Graph Neural Network (GNN) with a neural wave function to simultaneously solve the Schrödinger equation for multiple geometries via variational Monte Carlo (VMC).
Compared to existing state-of-the-art networks, our Potential Energy Surface Network (PESNet) speeds up training for multiple geometries by up to 40 times while matching or surpassing their accuracy.
arXiv Detail & Related papers (2021-10-11T07:58:31Z)
- Lagrangian Neural Network with Differential Symmetries and Relational Inductive Bias [5.017136256232997]
We present a momentum conserving Lagrangian neural network (MCLNN) that learns the Lagrangian of a system.
We also show that the developed model can generalize to systems of arbitrary size.
arXiv Detail & Related papers (2021-10-07T08:49:57Z)
- Hyperbolic Variational Graph Neural Network for Modeling Dynamic Graphs [77.33781731432163]
We learn dynamic graph representations in hyperbolic space, for the first time, with the aim of inferring node representations.
We present a novel Hyperbolic Variational Graph Network, referred to as HVGNN.
In particular, to model the dynamics, we introduce a Temporal GNN (TGNN) based on a theoretically grounded time encoding approach.
arXiv Detail & Related papers (2021-04-06T01:44:15Z)
- Sparse Symplectically Integrated Neural Networks [15.191984347149667]
We introduce Sparse Symplectically Integrated Neural Networks (SSINNs), a novel model for learning Hamiltonian dynamical systems from data.
We evaluate SSINNs on four classical Hamiltonian dynamical problems (a minimal sketch of this shared Hamiltonian setup follows the list below).
arXiv Detail & Related papers (2020-06-10T03:33:37Z)
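Several entries above (the HNN variants and SSINNs) build on the Hamiltonian rather than the Lagrangian formalism. As a point of comparison with the LNN sketch earlier, here is a minimal sketch of that setup, not taken from any of the listed papers: the harmonic-oscillator energy stands in for a learned network, and the leapfrog step assumes a separable Hamiltonian.

```python
import jax
import jax.numpy as jnp

def hamiltonian(q, p):
    # Placeholder energy (harmonic oscillator); an HNN would use a network here.
    return 0.5 * jnp.sum(p ** 2) + 0.5 * jnp.sum(q ** 2)

def vector_field(q, p):
    # Hamilton's equations: dq/dt = dH/dp, dp/dt = -dH/dq.
    dH_dq = jax.grad(hamiltonian, argnums=0)(q, p)
    dH_dp = jax.grad(hamiltonian, argnums=1)(q, p)
    return dH_dp, -dH_dq

def leapfrog_step(q, p, dt):
    # Symplectic (leapfrog) update of the kind SSINNs exploit; correct as
    # written when H is separable, H(q, p) = T(p) + V(q).
    p = p - 0.5 * dt * jax.grad(hamiltonian, argnums=0)(q, p)
    q = q + dt * jax.grad(hamiltonian, argnums=1)(q, p)
    p = p - 0.5 * dt * jax.grad(hamiltonian, argnums=0)(q, p)
    return q, p

# One step from an initial state; energy drift stays bounded for small dt.
q, p = jnp.array([1.0]), jnp.array([0.0])
print(leapfrog_step(q, p, 0.01))
```

Note the contrast the main abstract draws: this formulation requires canonical coordinates (q, p), whereas the LNN works directly with positions and velocities (q, q_dot).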
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.