Lie-Poisson Neural Networks (LPNets): Data-Based Computing of
Hamiltonian Systems with Symmetries
- URL: http://arxiv.org/abs/2308.15349v1
- Date: Tue, 29 Aug 2023 14:45:23 GMT
- Title: Lie-Poisson Neural Networks (LPNets): Data-Based Computing of
Hamiltonian Systems with Symmetries
- Authors: Christopher Eldred, François Gay-Balmaz, Sofiia Huraka, Vakhtang
Putkaradze
- Abstract summary: An accurate data-based prediction of the long-term evolution of Hamiltonian systems requires a network that preserves the appropriate structure under each time step.
We present two flavors of such systems: one, where the parameters of transformations are computed from data using a dense neural network (LPNets), and another, where the composition of transformations is used as building blocks (G-LPNets).
The resulting methods are important for the construction of accurate data-based methods for simulating the long-term dynamics of physical systems.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: An accurate data-based prediction of the long-term evolution of Hamiltonian
systems requires a network that preserves the appropriate structure under each
time step. Every Hamiltonian system contains two essential ingredients: the
Poisson bracket and the Hamiltonian. Hamiltonian systems with symmetries, whose
paradigm examples are the Lie-Poisson systems, have been shown to describe a
broad category of physical phenomena, from satellite motion to underwater
vehicles, fluids, geophysical applications, complex fluids, and plasma physics.
The Poisson bracket in these systems comes from the symmetries, while the
Hamiltonian comes from the underlying physics. We view the symmetry of the
system as primary, hence the Lie-Poisson bracket is known exactly, whereas the
Hamiltonian is regarded as coming from physics and is considered not known, or
known approximately. Using this approach, we develop a network based on
transformations that exactly preserve the Poisson bracket and the special
functions of the Lie-Poisson systems (Casimirs) to machine precision. We
present two flavors of such systems: one, where the parameters of
transformations are computed from data using a dense neural network (LPNets),
and another, where the composition of transformations is used as building
blocks (G-LPNets). We also show how to adapt these methods to a larger class of
Poisson brackets. We apply the resulting methods to several examples, such as
rigid body (satellite) motion, underwater vehicles, a particle in a magnetic
field, and others. The methods developed in this paper are important for the
construction of accurate data-based methods for simulating the long-term
dynamics of physical systems.
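The core idea of the abstract, composing transformations that each exactly preserve the Poisson bracket and the Casimirs, can be sketched for the free rigid body, one of the paper's examples. The following is a minimal illustration and not the authors' code: every sub-step rotates the angular momentum m about a coordinate axis, so the Casimir ||m||^2 is conserved to machine precision no matter how the rotation angles are chosen. In LPNets the angles would be predicted by a dense network trained on data; here, purely as a stand-in, they come from a classical splitting of H = sum_i m_i^2/(2 I_i).

```python
import numpy as np

def axis_rotation(axis, theta):
    """3x3 rotation matrix about coordinate axis 0, 1, or 2."""
    c, s = np.cos(theta), np.sin(theta)
    R = np.eye(3)
    j, k = (axis + 1) % 3, (axis + 2) % 3
    R[j, j], R[j, k] = c, -s
    R[k, j], R[k, k] = s, c
    return R

def lie_poisson_step(m, I, dt):
    """One composed step for the free rigid body on so(3)*.

    Each sub-step is the exact flow of H_i = m_i^2 / (2 I_i):
    a rotation about axis i, which leaves m_i itself unchanged.
    Orthogonality of the rotations preserves ||m|| exactly.
    In an LPNets-style scheme the angles would instead be the
    output of a trained dense network (hypothetical here).
    """
    for i in range(3):
        theta = dt * m[i] / I[i]
        m = axis_rotation(i, theta) @ m
    return m

# Illustrative moments of inertia and initial momentum (made up).
I = np.array([1.0, 2.0, 3.0])
m = np.array([1.0, 0.5, -0.3])
casimir0 = np.linalg.norm(m)

for _ in range(1000):
    m = lie_poisson_step(m, I, dt=0.01)

# The Casimir ||m|| is preserved to machine precision over the
# whole trajectory, independent of step size or angle source.
drift = abs(np.linalg.norm(m) - casimir0)
```

The design point is that structure preservation is built into the transformation family itself, so the learned (or, above, splitting-derived) parameters can be arbitrary without ever violating the bracket or the Casimirs.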
Related papers
- Coarse-Graining Hamiltonian Systems Using WSINDy [0.0]
We show that WSINDy can successfully identify a reduced Hamiltonian system in the presence of large intrinsic oscillations.
WSINDy naturally preserves the Hamiltonian structure by restricting to a trial basis of Hamiltonian vector fields.
We also provide a contribution to averaging theory by proving that first-order averaging at the level of vector fields preserves Hamiltonian structure in nearly-periodic Hamiltonian systems.
arXiv Detail & Related papers (2023-10-09T17:20:04Z)
- Discovering Symbolic Laws Directly from Trajectories with Hamiltonian Graph Neural Networks [5.824034325431987]
We present a Hamiltonian graph neural network (HGNN) that learns the dynamics of systems directly from their trajectory.
We demonstrate the performance of HGNN on n-springs, n-pendulums, gravitational systems, and binary Lennard-Jones systems.
arXiv Detail & Related papers (2023-07-11T14:43:25Z)
- Machine learning in and out of equilibrium [58.88325379746631]
Our study uses a Fokker-Planck approach, adapted from statistical physics, to explore these parallels.
We focus in particular on the stationary state of the system in the long-time limit, which in conventional SGD is out of equilibrium.
We propose a new variation of stochastic gradient Langevin dynamics (SGLD) that harnesses without-replacement minibatching.
arXiv Detail & Related papers (2023-06-06T09:12:49Z)
- Gaussian Process Port-Hamiltonian Systems: Bayesian Learning with Physics Prior [17.812064311297117]
Data-driven approaches achieve remarkable results for the modeling of complex dynamics based on collected data.
These models often neglect basic physical principles which determine the behavior of any real-world system.
We propose a physics-informed Bayesian learning approach with uncertainty quantification.
arXiv Detail & Related papers (2023-05-15T20:59:41Z)
- Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z)
- On discrete symmetries of robotics systems: A group-theoretic and data-driven analysis [38.92081817503126]
We study discrete morphological symmetries of dynamical systems.
These symmetries arise from the presence of one or more planes/axis of symmetry in the system's morphology.
We exploit these symmetries using data augmentation and $G$-equivariant neural networks.
arXiv Detail & Related papers (2023-02-21T04:10:16Z)
- Hamiltonian Neural Networks with Automatic Symmetry Detection [0.0]
Hamiltonian neural networks (HNN) have been introduced to incorporate prior physical knowledge.
We enhance HNN with a Lie algebra framework to detect and embed symmetries in the neural network.
arXiv Detail & Related papers (2023-01-19T07:34:57Z)
- Port-Hamiltonian Neural Networks with State Dependent Ports [58.720142291102135]
We stress-test the method on both simple mass-spring systems and more complex and realistic systems with several internal and external forces.
Port-Hamiltonian neural networks can be extended to larger dimensions with state-dependent ports.
We propose a symmetric high-order integrator for improved training on sparse and noisy data.
arXiv Detail & Related papers (2022-06-06T14:57:25Z)
- Learning Hamiltonians of constrained mechanical systems [0.0]
Hamiltonian systems are an elegant and compact formalism in classical mechanics.
We propose new approaches for the accurate approximation of the Hamiltonian function of constrained mechanical systems.
arXiv Detail & Related papers (2022-01-31T14:03:17Z)
- Neural-Network Quantum States for Periodic Systems in Continuous Space [66.03977113919439]
We introduce a family of neural quantum states for the simulation of strongly interacting systems in the presence of periodicity.
For one-dimensional systems we find very precise estimations of the ground-state energies and the radial distribution functions of the particles.
In two dimensions we obtain good estimations of the ground-state energies, comparable to results obtained from more conventional methods.
arXiv Detail & Related papers (2021-12-22T15:27:30Z)
- Autoencoder-driven Spiral Representation Learning for Gravitational Wave Surrogate Modelling [47.081318079190595]
We investigate the existence of underlying structures in the empirical coefficients using autoencoders.
We design a spiral module with learnable parameters, that is used as the first layer in a neural network, which learns to map the input space to the coefficients.
The spiral module is evaluated on multiple neural network architectures and consistently achieves better speed-accuracy trade-off than baseline models.
arXiv Detail & Related papers (2021-07-09T09:03:08Z)
This list is automatically generated from the titles and abstracts of the papers in this site.