Dissipative Hamiltonian Neural Networks: Learning Dissipative and
Conservative Dynamics Separately
- URL: http://arxiv.org/abs/2201.10085v2
- Date: Wed, 26 Jan 2022 02:21:57 GMT
- Title: Dissipative Hamiltonian Neural Networks: Learning Dissipative and
Conservative Dynamics Separately
- Authors: Andrew Sosanya and Sam Greydanus
- Abstract summary: Recent work has shown that neural networks can learn natural symmetries directly from data using Hamiltonian Neural Networks (HNNs).
In this paper, we ask whether it is possible to identify and decompose conservative and dissipative dynamics simultaneously.
We propose Dissipative Hamiltonian Neural Networks (D-HNNs), which parameterize both a Hamiltonian and a Rayleigh dissipation function. Taken together, they represent an implicit Helmholtz decomposition which can separate dissipative effects such as friction from symmetries such as conservation of energy.
- Score: 1.52292571922932
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Understanding natural symmetries is key to making sense of our complex and
ever-changing world. Recent work has shown that neural networks can learn such
symmetries directly from data using Hamiltonian Neural Networks (HNNs). But
HNNs struggle when trained on datasets where energy is not conserved. In this
paper, we ask whether it is possible to identify and decompose conservative and
dissipative dynamics simultaneously. We propose Dissipative Hamiltonian Neural
Networks (D-HNNs), which parameterize both a Hamiltonian and a Rayleigh
dissipation function. Taken together, they represent an implicit Helmholtz
decomposition which can separate dissipative effects such as friction from
symmetries such as conservation of energy. We train our model to decompose a
damped mass-spring system into its friction and inertial terms and then show
that this decomposition can be used to predict dynamics for unseen friction
coefficients. Then we apply our model to real-world data, including a large,
noisy ocean current dataset where decomposing the velocity field yields useful
scientific insights.
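The abstract describes the model only at a high level, so what follows is a minimal sketch, not the authors' implementation, of how a D-HNN-style vector field could be assembled. It assumes one small network parameterizing a scalar Hamiltonian H(q, p) and another parameterizing a Rayleigh-style dissipation function D(q, p); the predicted phase-space velocity is the symplectic gradient of H (the divergence-free, conservative part) plus the gradient of D (the curl-free, dissipative part), which together act as an implicit Helmholtz decomposition. The helper names (init_mlp, vector_field), layer sizes, sign convention on the dissipative term, and the plain L2 regression loss are illustrative assumptions, not details taken from the paper.

```python
import jax
import jax.numpy as jnp

def init_mlp(key, sizes):
    """Initialize a small MLP that maps a phase-space point to a scalar potential."""
    keys = jax.random.split(key, len(sizes) - 1)
    return [(jax.random.normal(k, (m, n)) / jnp.sqrt(m), jnp.zeros(n))
            for k, m, n in zip(keys, sizes[:-1], sizes[1:])]

def mlp(params, x):
    for w, b in params[:-1]:
        x = jnp.tanh(x @ w + b)
    w, b = params[-1]
    return (x @ w + b).squeeze()  # scalar output

def vector_field(params, z):
    """Predicted (dq/dt, dp/dt) at a phase-space point z = [q, p]."""
    h_params, d_params = params
    dH = jax.grad(lambda z_: mlp(h_params, z_))(z)  # gradient of the learned Hamiltonian
    dD = jax.grad(lambda z_: mlp(d_params, z_))(z)  # gradient of the learned dissipation fn
    n = z.shape[0] // 2
    J = jnp.block([[jnp.zeros((n, n)), jnp.eye(n)],   # symplectic matrix
                   [-jnp.eye(n), jnp.zeros((n, n))]])
    # conservative (rotational) part + dissipative (irrotational) part
    return J @ dH + dD

def loss(params, z_batch, dzdt_batch):
    """L2 regression of predicted phase-space velocities onto observed ones."""
    pred = jax.vmap(lambda z: vector_field(params, z))(z_batch)
    return jnp.mean((pred - dzdt_batch) ** 2)

key = jax.random.PRNGKey(0)
k_h, k_d = jax.random.split(key)
params = (init_mlp(k_h, [2, 64, 64, 1]),   # H(q, p) for a 1-D damped mass-spring system
          init_mlp(k_d, [2, 64, 64, 1]))   # Rayleigh-style D(q, p)
grads = jax.grad(loss)(params, jnp.ones((8, 2)), jnp.zeros((8, 2)))  # one gradient evaluation
```

At test time one could integrate the learned field with an ODE solver, or drop the gradient-of-D term to keep only the conservative component; something along those lines is presumably how the decomposition supports predictions for unseen friction coefficients, though the abstract does not spell out the procedure.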
Related papers
- Hamiltonian Neural Networks approach to fuzzball geodesics [39.58317527488534]
Hamiltonian Neural Networks (HNNs) are tools that minimize a loss function to solve the Hamilton equations of motion.
In this work, we implement several HNNs trained to solve, with high accuracy, the Hamilton equations for a massless probe moving inside a smooth and horizonless geometry known as the D1-D5 circular fuzzball.
arXiv Detail & Related papers (2025-02-28T09:25:49Z) - Injecting Hamiltonian Architectural Bias into Deep Graph Networks for Long-Range Propagation [55.227976642410766]
The dynamics of information diffusion within graphs is a critical open issue that heavily influences graph representation learning.
Motivated by this, we introduce (port-)Hamiltonian Deep Graph Networks.
We reconcile under a single theoretical and practical framework both non-dissipative long-range propagation and non-conservative behaviors.
arXiv Detail & Related papers (2024-05-27T13:36:50Z) - NeuralClothSim: Neural Deformation Fields Meet the Thin Shell Theory [70.10550467873499]
We propose NeuralClothSim, a new quasistatic cloth simulator using thin shells.
Our memory-efficient solver operates on a new continuous coordinate-based surface representation called neural deformation fields.
arXiv Detail & Related papers (2023-08-24T17:59:54Z) - Applications of Machine Learning to Modelling and Analysing Dynamical
Systems [0.0]
We propose an architecture which combines existing Hamiltonian Neural Network structures into Adaptable Symplectic Recurrent Neural Networks.
This architecture is found to significantly outperform previously proposed neural networks when predicting Hamiltonian dynamics.
We show that this method works efficiently for single parameter potentials and provides accurate predictions even over long periods of time.
arXiv Detail & Related papers (2023-07-22T19:04:17Z) - Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which can then be applied in real time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z) - Hamiltonian Neural Networks with Automatic Symmetry Detection [0.0]
Hamiltonian neural networks (HNNs) have been introduced to incorporate prior physical knowledge.
We enhance HNN with a Lie algebra framework to detect and embed symmetries in the neural network.
arXiv Detail & Related papers (2023-01-19T07:34:57Z) - Learning Neural Hamiltonian Dynamics: A Methodological Overview [109.40968389896639]
Hamiltonian dynamics endows neural networks with accurate long-term prediction, interpretability, and data-efficient learning.
We systematically survey recently proposed Hamiltonian neural network models, with a special emphasis on methodologies.
arXiv Detail & Related papers (2022-02-28T22:54:39Z) - Lagrangian Neural Network with Differential Symmetries and Relational
Inductive Bias [5.017136256232997]
We present a momentum-conserving Lagrangian neural network (MCLNN) that learns the Lagrangian of a system.
We also show that the model can generalize to systems of arbitrary size.
arXiv Detail & Related papers (2021-10-07T08:49:57Z) - Simultaneous boundary shape estimation and velocity field de-noising in
Magnetic Resonance Velocimetry using Physics-informed Neural Networks [70.7321040534471]
Magnetic resonance velocimetry (MRV) is a non-invasive technique widely used in medicine and engineering to measure the velocity field of a fluid.
Previous studies have required the shape of the boundary (for example, a blood vessel) to be known a priori.
We present a physics-informed neural network that instead uses the noisy MRV data alone to infer the most likely boundary shape and de-noised velocity field.
arXiv Detail & Related papers (2021-07-16T12:56:09Z) - Machine Learning S-Wave Scattering Phase Shifts Bypassing the Radial
Schr\"odinger Equation [77.34726150561087]
We present a proof-of-concept machine learning model resting on a convolutional neural network capable of yielding accurate scattering s-wave phase shifts.
We discuss how the Hamiltonian can serve as a guiding principle in the construction of a physically-motivated descriptor.
arXiv Detail & Related papers (2021-06-25T17:25:38Z) - Sparse Symplectically Integrated Neural Networks [15.191984347149667]
We introduce Sparse Symplectically Integrated Neural Networks (SSINNs).
SSINNs are a novel model for learning Hamiltonian dynamical systems from data.
We evaluate SSINNs on four classical Hamiltonian dynamical problems.
arXiv Detail & Related papers (2020-06-10T03:33:37Z) - Parsimonious neural networks learn interpretable physical laws [77.34726150561087]
We propose parsimonious neural networks (PNNs) that combine neural networks with evolutionary optimization to find models that balance accuracy with parsimony.
The power and versatility of the approach is demonstrated by developing models for classical mechanics and predicting the melting temperature of materials from fundamental properties.
arXiv Detail & Related papers (2020-05-08T16:15:47Z) - Lagrangian Neural Networks [3.0059120458540383]
We propose Lagrangian Neural Networks (LNNs), which can parameterize arbitrary Lagrangians using neural networks.
In contrast to models that learn Hamiltonians, LNNs do not require canonical coordinates.
We show how this model can be applied to graphs and continuous systems using a Lagrangian Graph Network.
arXiv Detail & Related papers (2020-03-10T10:55:25Z)
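The Lagrangian Neural Networks entry above says only that the Lagrangian is learned and that canonical coordinates are not required. As a general clarification of that technique (an illustrative sketch, not the LNN authors' code), the snippet below shows how accelerations follow from any differentiable Lagrangian L(q, qdot) via the Euler-Lagrange equations, qddot = (∂²L/∂qdot²)⁻¹ (∂L/∂q − (∂²L/∂q ∂qdot) qdot), which works in arbitrary generalized coordinates; the quadratic spring Lagrangian here stands in for what would be a neural network.

```python
import jax
import jax.numpy as jnp

def lagrangian(q, qdot):
    # Placeholder unit-mass spring Lagrangian; an LNN would learn this function instead.
    return 0.5 * jnp.dot(qdot, qdot) - 0.5 * jnp.dot(q, q)

def acceleration(q, qdot):
    """Solve the Euler-Lagrange equations for qddot using automatic differentiation."""
    dL_dq = jax.grad(lagrangian, argnums=0)(q, qdot)
    d2L_dqdot2 = jax.hessian(lagrangian, argnums=1)(q, qdot)
    d2L_dq_dqdot = jax.jacobian(jax.grad(lagrangian, argnums=1), argnums=0)(q, qdot)
    return jnp.linalg.solve(d2L_dqdot2, dL_dq - d2L_dq_dqdot @ qdot)

q, qdot = jnp.array([1.0]), jnp.array([0.0])
print(acceleration(q, qdot))  # approximately [-1.0] for the spring example
```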
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.