Lagrangian Neural Network with Differential Symmetries and Relational
Inductive Bias
- URL: http://arxiv.org/abs/2110.03266v1
- Date: Thu, 7 Oct 2021 08:49:57 GMT
- Title: Lagrangian Neural Network with Differential Symmetries and Relational
Inductive Bias
- Authors: Ravinder Bhattoo, Sayan Ranu and N. M. Anoop Krishnan
- Abstract summary: We present a momentum conserving Lagrangian neural network (MCLNN) that learns the Lagrangian of a system.
We also show that the model developed can generalize to systems of arbitrary size.
- Score: 5.017136256232997
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: Realistic models of the physical world rely on differentiable symmetries that, in
turn, correspond to conservation laws. Recent works on Lagrangian and
Hamiltonian neural networks show that the underlying symmetries of a system can
be easily learned by a neural network when provided with an appropriate
inductive bias. However, these models still suffer from issues such as the
inability to generalize to arbitrary system sizes, poor interpretability, and,
most importantly, the inability to learn translational and rotational symmetries,
which lead to the conservation laws of linear and angular momentum,
respectively. Here, we present a momentum conserving Lagrangian neural network
(MCLNN) that learns the Lagrangian of a system, while also preserving the
translational and rotational symmetries. We test our approach on linear and
non-linear spring systems, and a gravitational system, demonstrating the energy
and momentum conservation. We also show that the developed model can generalize
to systems of arbitrary size. Finally, we discuss the interpretability of
the MCLNN, which directly provides physical insights into the interactions of
multi-particle systems.
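The core idea behind momentum conservation can be illustrated with a small sketch: if a learned potential depends only on pairwise inter-particle distances, it is automatically invariant under rigid translations and rotations, which by Noether's theorem implies conservation of linear and angular momentum. The stand-in function `phi` below is illustrative, not the paper's actual trained network.

```python
import numpy as np

def pairwise_energy(positions, phi):
    # Sum a learned scalar function phi over all inter-particle distances.
    # Distances are unchanged by rigid translations and rotations, so any
    # potential built this way inherits both symmetries automatically.
    n = len(positions)
    return sum(phi(np.linalg.norm(positions[i] - positions[j]))
               for i in range(n) for j in range(i + 1, n))

phi = lambda r: np.tanh(r) + 0.1 * r**2  # stand-in for a trained MLP

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 2))              # four particles in the plane
t = np.array([1.3, -0.7])                # an arbitrary translation
c, s = np.cos(0.9), np.sin(0.9)
R = np.array([[c, -s], [s, c]])          # an arbitrary rotation

v0 = pairwise_energy(x, phi)
v_shifted = pairwise_energy(x + t, phi)
v_rotated = pairwise_energy(x @ R.T, phi)
print(abs(v0 - v_shifted) < 1e-9, abs(v0 - v_rotated) < 1e-9)  # prints: True True
```

Any network constrained to consume only relative distances would exhibit the same invariance, up to floating-point rounding.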
Related papers
- Enhancing lattice kinetic schemes for fluid dynamics with Lattice-Equivariant Neural Networks [79.16635054977068]
We present a new class of equivariant neural networks, dubbed Lattice-Equivariant Neural Networks (LENNs).
Our approach develops within a recently introduced framework aimed at learning neural network-based surrogate models of Lattice Boltzmann collision operators.
Our work opens towards practical utilization of machine learning-augmented Lattice Boltzmann CFD in real-world simulations.
arXiv Detail & Related papers (2024-05-22T17:23:15Z) - Unification of Symmetries Inside Neural Networks: Transformer,
Feedforward and Neural ODE [2.002741592555996]
This study introduces a novel approach by applying the principles of gauge symmetries, a key concept in physics, to neural network architectures.
We mathematically formulate the parametric redundancies in neural ODEs, and find that their gauge symmetries are given by spacetime diffeomorphisms.
Viewing neural ODEs as a continuum version of feedforward neural networks, we show that the parametric redundancies in feedforward neural networks are indeed lifted to diffeomorphisms in neural ODEs.
arXiv Detail & Related papers (2024-02-04T06:11:54Z) - Symmetry-regularized neural ordinary differential equations [0.0]
This paper introduces new conservation relations in Neural ODEs using Lie symmetries in both the hidden state dynamics and the back propagation dynamics.
These conservation laws are then incorporated into the loss function as additional regularization terms, potentially enhancing the physical interpretability and generalizability of the model.
New loss functions are constructed from these conservation relations, demonstrating the applicability of symmetry-regularized Neural ODEs in typical modeling tasks.
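The general recipe of turning a conservation relation into a regularization term can be sketched as follows. The conserved quantity chosen here (total linear momentum) and all function names are illustrative assumptions, not the paper's specific construction:

```python
import numpy as np

def momentum(vel):
    # Conserved quantity used for the penalty: total linear momentum
    # of one predicted frame (sum over particles).
    return vel.sum(axis=0)

def symmetry_regularized_loss(pred_pos, true_pos, pred_vel, lam=0.1):
    # Data term: ordinary trajectory mean-squared error.
    mse = np.mean((pred_pos - true_pos) ** 2)
    # Symmetry term: penalize drift of the conserved quantity along the rollout.
    q0 = momentum(pred_vel[0])
    drift = np.mean([np.sum((momentum(v) - q0) ** 2) for v in pred_vel])
    return mse + lam * drift

# Two frames whose per-frame momentum is identical: the penalty vanishes.
vel = np.array([[[1., 0.], [-1., 0.]],
                [[2., 0.], [-2., 0.]]])
loss = symmetry_regularized_loss(np.zeros((2, 2, 2)), np.zeros((2, 2, 2)), vel)
print(loss)  # prints: 0.0
```

When the predicted dynamics conserve the chosen quantity exactly, the loss reduces to the plain MSE; otherwise the drift term pushes the model toward trajectories respecting the symmetry.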
arXiv Detail & Related papers (2023-11-28T09:27:44Z) - SEGNO: Generalizing Equivariant Graph Neural Networks with Physical
Inductive Biases [66.61789780666727]
We show how the second-order continuity can be incorporated into GNNs while maintaining the equivariant property.
We also offer theoretical insights into SEGNO, highlighting that it can learn a unique trajectory between adjacent states.
Our model yields a significant improvement over the state-of-the-art baselines.
arXiv Detail & Related papers (2023-08-25T07:15:58Z) - Machine learning in and out of equilibrium [58.88325379746631]
Our study uses a Fokker-Planck approach, adapted from statistical physics, to explore these parallels.
We focus in particular on the stationary state of the system in the long-time limit, which in conventional SGD is out of equilibrium.
We propose a new variation of stochastic gradient Langevin dynamics (SGLD) that harnesses without-replacement minibatching.
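Without-replacement minibatching means shuffling the data once per epoch and visiting each sample exactly once, rather than sampling batches independently. A minimal sketch of one SGLD epoch under that scheme (the `temperature` parameter and function names are assumptions for illustration, not the paper's notation):

```python
import numpy as np

def sgld_epoch(theta, grad_fn, data, batch_size, lr, rng, temperature=1.0):
    # One pass of stochastic gradient Langevin dynamics where minibatches are
    # drawn WITHOUT replacement: shuffle once, then visit every sample exactly
    # once this epoch.
    idx = rng.permutation(len(data))
    for start in range(0, len(data), batch_size):
        batch = data[idx[start:start + batch_size]]
        noise = rng.normal(size=np.shape(theta))  # injected Langevin noise
        theta = (theta
                 - lr * grad_fn(theta, batch)
                 + np.sqrt(2.0 * lr * temperature) * noise)
    return theta

# Setting temperature=0 recovers plain without-replacement SGD, which for a
# quadratic objective converges to the data mean.
rng = np.random.default_rng(0)
data = np.arange(10.0)
theta = 0.0
for _ in range(200):
    theta = sgld_epoch(theta, lambda th, b: th - b.mean(),
                       data, batch_size=10, lr=0.1, rng=rng, temperature=0.0)
print(float(theta))  # prints a value very close to 4.5, the mean of data
```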
arXiv Detail & Related papers (2023-06-06T09:12:49Z) - Learning Neural Constitutive Laws From Motion Observations for
Generalizable PDE Dynamics [97.38308257547186]
Many NN approaches learn an end-to-end model that implicitly models both the governing PDE and material models.
We argue that the governing PDEs are often well-known and should be explicitly enforced rather than learned.
We introduce a new framework termed "Neural Constitutive Laws" (NCLaw) which utilizes a network architecture that strictly guarantees standard priors.
arXiv Detail & Related papers (2023-04-27T17:42:24Z) - Hamiltonian Neural Networks with Automatic Symmetry Detection [0.0]
Hamiltonian neural networks (HNN) have been introduced to incorporate prior physical knowledge.
We enhance HNN with a Lie algebra framework to detect and embed symmetries in the neural network.
arXiv Detail & Related papers (2023-01-19T07:34:57Z) - Discrete Lagrangian Neural Networks with Automatic Symmetry Discovery [3.06483729892265]
We introduce a framework to learn a discrete Lagrangian along with its symmetry group from discrete observations of motions.
The learning process does not restrict the form of the Lagrangian, does not require velocity or momentum observations or predictions and incorporates a cost term.
arXiv Detail & Related papers (2022-11-20T00:46:33Z) - Exact solutions of interacting dissipative systems via weak symmetries [77.34726150561087]
We analytically diagonalize the Liouvillian of a class of Markovian dissipative systems with arbitrarily strong interactions or nonlinearity.
This enables an exact description of the full dynamics and dissipative spectrum.
Our method is applicable to a variety of other systems, and could provide a powerful new tool for the study of complex driven-dissipative quantum systems.
arXiv Detail & Related papers (2021-09-27T17:45:42Z) - Machine Learning S-Wave Scattering Phase Shifts Bypassing the Radial
Schrödinger Equation [77.34726150561087]
We present a proof-of-concept machine learning model resting on a convolutional neural network capable of yielding accurate s-wave scattering phase shifts.
We discuss how the Hamiltonian can serve as a guiding principle in the construction of a physically-motivated descriptor.
arXiv Detail & Related papers (2021-06-25T17:25:38Z) - Lagrangian Neural Networks [3.0059120458540383]
We propose Lagrangian Neural Networks (LNNs), which can parameterize arbitrary Lagrangians using neural networks.
In contrast to models that learn Hamiltonians, LNNs do not require canonical coordinates.
We show how this model can be applied to graphs and continuous systems using a Lagrangian Graph Network.
arXiv Detail & Related papers (2020-03-10T10:55:25Z)
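The LNN mechanism can be sketched in one dimension: the network outputs a scalar L(q, q̇), and accelerations are recovered by solving the Euler-Lagrange equation for q̈. Here a closed-form harmonic-oscillator Lagrangian stands in for the neural network, and finite differences stand in for the automatic differentiation an actual LNN would use:

```python
# Lagrangian L(q, qdot) for a unit-mass harmonic oscillator; in an LNN this
# scalar would be the output of a neural network instead of a closed form.
def L(q, qdot):
    return 0.5 * qdot**2 - 0.5 * q**2

def accel(q, qdot, h=1e-4):
    # Euler-Lagrange: d/dt (dL/dqdot) = dL/dq, solved for qddot:
    #   qddot = (d2L/dqdot2)^-1 * (dL/dq - d2L/(dq dqdot) * qdot)
    dL_dq = (L(q + h, qdot) - L(q - h, qdot)) / (2 * h)
    d2L_dqdot2 = (L(q, qdot + h) - 2 * L(q, qdot) + L(q, qdot - h)) / h**2
    d2L_dqdqdot = (L(q + h, qdot + h) - L(q + h, qdot - h)
                   - L(q - h, qdot + h) + L(q - h, qdot - h)) / (4 * h**2)
    return (dL_dq - d2L_dqdqdot * qdot) / d2L_dqdot2

print(accel(1.0, 0.0))  # ≈ -1.0, matching qddot = -q for the oscillator
```

Because only the scalar L is parameterized and the equations of motion are derived from it, any Lagrangian the network can represent yields dynamics consistent with a variational principle, without requiring canonical coordinates.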
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented (including all content) and is not responsible for any consequences.