Discovering Symbolic Laws Directly from Trajectories with Hamiltonian
Graph Neural Networks
- URL: http://arxiv.org/abs/2307.05299v1
- Date: Tue, 11 Jul 2023 14:43:25 GMT
- Title: Discovering Symbolic Laws Directly from Trajectories with Hamiltonian
Graph Neural Networks
- Authors: Suresh Bishnoi, Ravinder Bhattoo, Jayadeva, Sayan Ranu, N M Anoop
Krishnan
- Abstract summary: We present a Hamiltonian graph neural network (HGNN) that learns the dynamics of systems directly from their trajectory.
We demonstrate the performance of HGNN on n-springs, n-pendulums, gravitational systems, and binary Lennard-Jones systems.
- Score: 5.824034325431987
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: The time evolution of physical systems is described by differential
equations, which depend on abstract quantities like energy and force.
Traditionally, these quantities are derived as functionals based on observables
such as positions and velocities. Discovering these governing symbolic laws is
the key to comprehending the interactions in nature. Here, we present a
Hamiltonian graph neural network (HGNN), a physics-enforced GNN that learns the
dynamics of systems directly from their trajectory. We demonstrate the
performance of HGNN on n-springs, n-pendulums, gravitational systems, and
binary Lennard-Jones systems; HGNN learns the dynamics in excellent agreement
with the ground truth from small amounts of data. We also evaluate the ability
of HGNN to generalize to larger system sizes, and to a hybrid spring-pendulum
system that combines two of the original systems (spring and pendulum) on
which the models are trained independently. Finally, employing symbolic
regression on the learned HGNN, we infer the underlying equations relating the
energy functionals, even for complex systems such as the binary Lennard-Jones
liquid. Our framework facilitates the interpretable discovery of interaction
laws directly from physical system trajectories. Furthermore, this approach can
be extended to other systems with topology-dependent dynamics, such as cells,
polydisperse gels, or deformable bodies.
Related papers
- Fourier Neural Operators for Learning Dynamics in Quantum Spin Systems [77.88054335119074]
We use FNOs to model the evolution of random quantum spin systems.
We apply FNOs to a compact set of Hamiltonian observables instead of the entire $2^n$-dimensional quantum wavefunction.
arXiv Detail & Related papers (2024-09-05T07:18:09Z)
- SEGNO: Generalizing Equivariant Graph Neural Networks with Physical
Inductive Biases [66.61789780666727]
We show how the second-order continuity can be incorporated into GNNs while maintaining the equivariant property.
We also offer theoretical insights into SEGNO, highlighting that it can learn a unique trajectory between adjacent states.
Our model yields a significant improvement over the state-of-the-art baselines.
arXiv Detail & Related papers (2023-08-25T07:15:58Z)
- Graph Neural Stochastic Differential Equations for Learning Brownian
Dynamics [6.362339104761225]
We propose a framework, Brownian graph neural networks (BROGNET), to learn Brownian dynamics directly from trajectories.
We show that BROGNET conserves the linear momentum of the system, which, in turn, provides superior performance in learning dynamics.
arXiv Detail & Related papers (2023-06-20T10:30:46Z)
- Learning Neural Constitutive Laws From Motion Observations for
Generalizable PDE Dynamics [97.38308257547186]
Many NN approaches learn end-to-end models that implicitly capture both the governing PDE and the material model.
We argue that the governing PDEs are often well-known and should be explicitly enforced rather than learned.
We introduce a new framework termed "Neural Constitutive Laws" (NCLaw) which utilizes a network architecture that strictly guarantees standard priors.
arXiv Detail & Related papers (2023-04-27T17:42:24Z)
- Unravelling the Performance of Physics-informed Graph Neural Networks
for Dynamical Systems [5.787429262238507]
We evaluate the performance of graph neural networks (GNNs) and their variants with explicit constraints and different architectures.
Our study demonstrates that GNNs with additional inductive biases, such as explicit constraints and decoupling of kinetic and potential energies, exhibit significantly enhanced performance.
All the physics-informed GNNs exhibit zero-shot generalizability to system sizes an order of magnitude larger than the training system, thus providing a promising route to simulate large-scale realistic systems.
arXiv Detail & Related papers (2022-11-10T12:29:30Z)
- Learning Rigid Body Dynamics with Lagrangian Graph Neural Network [5.560715621814096]
We present a Lagrangian graph neural network (LGNN) that can learn the dynamics of rigid bodies by exploiting their topology.
We show that the LGNN can be used to model the dynamics of complex real-world structures such as the stability of tensegrity structures.
arXiv Detail & Related papers (2022-09-23T13:41:54Z)
- Learning the Dynamics of Particle-based Systems with Lagrangian Graph
Neural Networks [5.560715621814096]
We present a framework, namely, Lagrangian graph neural network (LGnn), that provides a strong inductive bias to learn the Lagrangian of a particle-based system directly from the trajectory.
We show the zero-shot generalizability of the system by simulating systems two orders of magnitude larger than the trained one and also hybrid systems that are unseen by the model.
arXiv Detail & Related papers (2022-09-03T18:38:17Z)
- Learning Trajectories of Hamiltonian Systems with Neural Networks [81.38804205212425]
We propose to enhance Hamiltonian neural networks with an estimation of a continuous-time trajectory of the modeled system.
We demonstrate that the proposed integration scheme works well for HNNs, especially with low sampling rates and noisy, irregular observations.
arXiv Detail & Related papers (2022-04-11T13:25:45Z)
- Equivariant Graph Mechanics Networks with Constraints [83.38709956935095]
We propose Graph Mechanics Network (GMN) which is efficient, equivariant and constraint-aware.
GMN represents, by generalized coordinates, the forward kinematics information (positions and velocities) of a structural object.
Extensive experiments support the advantages of GMN compared to the state-of-the-art GNNs in terms of prediction accuracy, constraint satisfaction and data efficiency.
arXiv Detail & Related papers (2022-03-12T14:22:14Z)
- Lagrangian Neural Network with Differential Symmetries and Relational
Inductive Bias [5.017136256232997]
We present a momentum conserving Lagrangian neural network (MCLNN) that learns the Lagrangian of a system.
We also show that the model can generalize to systems of arbitrary size.
arXiv Detail & Related papers (2021-10-07T08:49:57Z)
- Parsimonious neural networks learn interpretable physical laws [77.34726150561087]
We propose parsimonious neural networks (PNNs) that combine neural networks with evolutionary optimization to find models that balance accuracy with parsimony.
The power and versatility of the approach are demonstrated by developing models for classical mechanics and for predicting the melting temperature of materials from fundamental properties.
arXiv Detail & Related papers (2020-05-08T16:15:47Z)
This list is automatically generated from the titles and abstracts of the papers in this site.