Learning Trajectories of Hamiltonian Systems with Neural Networks
- URL: http://arxiv.org/abs/2204.05077v1
- Date: Mon, 11 Apr 2022 13:25:45 GMT
- Title: Learning Trajectories of Hamiltonian Systems with Neural Networks
- Authors: Katsiaryna Haitsiukevich and Alexander Ilin
- Abstract summary: We propose to enhance Hamiltonian neural networks with an estimation of a continuous-time trajectory of the modeled system.
We demonstrate that the proposed integration scheme works well for HNNs, especially at low sampling rates and with noisy, irregular observations.
- Score: 81.38804205212425
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Modeling of conservative systems with neural networks is an area of active
research. A popular approach is to use Hamiltonian neural networks (HNNs) which
rely on the assumption that a conservative system is described by Hamilton's
equations of motion. Many recent works focus on improving the integration
schemes used when training HNNs. In this work, we propose to enhance HNNs with
an estimation of a continuous-time trajectory of the modeled system using an
additional neural network, called a deep hidden physics model in the
literature. We demonstrate that the proposed integration scheme works well for
HNNs, especially at low sampling rates and with noisy, irregular observations.
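To make the setup concrete, below is a minimal sketch of the two-network idea described in the abstract, assuming a PyTorch implementation; the class names (`HNN`, `TrajectoryNet`), architectures, collocation scheme, and equal loss weighting are illustrative assumptions, not the authors' code.

```python
import torch
import torch.nn as nn

class HNN(nn.Module):
    """Learns a scalar Hamiltonian H(q, p); the vector field
    (dq/dt, dp/dt) = (dH/dp, -dH/dq) follows from Hamilton's equations."""
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.dim = dim
        self.H = nn.Sequential(
            nn.Linear(2 * dim, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, 1),
        )

    def vector_field(self, x):          # x = concat(q, p), shape [N, 2*dim]
        if not x.requires_grad:
            x = x.requires_grad_(True)
        dH = torch.autograd.grad(self.H(x).sum(), x, create_graph=True)[0]
        dHdq, dHdp = dH[:, :self.dim], dH[:, self.dim:]
        return torch.cat([dHdp, -dHdq], dim=1)

class TrajectoryNet(nn.Module):
    """Deep hidden physics model: a smooth map t -> x(t) interpolating
    the observed trajectory in continuous time."""
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(1, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, 2 * dim),
        )

    def forward(self, t):               # t: shape [N, 1]
        return self.net(t)

def total_loss(hnn, traj, t_obs, x_obs, t_coll):
    # (1) the trajectory network must fit the (noisy, irregular) samples
    data_loss = ((traj(t_obs) - x_obs) ** 2).mean()
    # (2) its time derivative must match the HNN vector field at
    # collocation points -- this replaces a numerical ODE integrator
    t = t_coll if t_coll.requires_grad else t_coll.requires_grad_(True)
    x = traj(t)
    dxdt = torch.stack(
        [torch.autograd.grad(x[:, i].sum(), t, create_graph=True)[0][:, 0]
         for i in range(x.shape[1])], dim=1)
    physics_loss = ((dxdt - hnn.vector_field(x)) ** 2).mean()
    return data_loss + physics_loss     # equal weighting is an assumption
```

Both networks are fit jointly; after training, the learned Hamiltonian alone defines the dynamics and can be rolled out with any ODE solver.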
Related papers
- How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key to the performance of large-kernel convolutional neural network (LKCNN) models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z)
- ConCerNet: A Contrastive Learning Based Framework for Automated Conservation Law Discovery and Trustworthy Dynamical System Prediction [82.81767856234956]
This paper proposes a new learning framework, named ConCerNet, to improve the trustworthiness of DNN-based dynamics modeling.
We show that our method consistently outperforms the baseline neural networks in both coordinate error and conservation metrics; a generic sketch of a contrastive conservation objective follows this entry.
arXiv Detail & Related papers (2023-02-11T21:07:30Z)
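The blurb above does not specify the objective, so the following is only a generic sketch of contrastive conservation-law learning, under the assumption that state snapshots from the same trajectory should share the value of a learned invariant; the embedding network `g`, batch layout, temperature, and SupCon-style loss are illustrative, not ConCerNet's actual formulation.

```python
import torch
import torch.nn.functional as F

def contrastive_conservation_loss(g, batch, temperature=0.1):
    """batch: [n_traj, n_snap, dim] state snapshots. Embeddings of
    snapshots from the same trajectory are pulled together (they should
    share the conserved quantity); others are pushed apart."""
    n_traj, n_snap, dim = batch.shape
    z = F.normalize(g(batch.reshape(-1, dim)), dim=-1)   # [n_traj*n_snap, k]
    sim = z @ z.T / temperature
    labels = torch.arange(n_traj).repeat_interleave(n_snap)
    pos = labels[:, None] == labels[None, :]
    pos.fill_diagonal_(False)
    # SupCon-style log-likelihood: self-similarities are excluded
    # from the normalizing denominator
    self_mask = torch.eye(len(z), dtype=torch.bool)
    log_prob = sim - torch.logsumexp(
        sim.masked_fill(self_mask, float("-inf")), dim=1, keepdim=True)
    return -log_prob[pos].mean()
```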
- Hamiltonian Neural Networks with Automatic Symmetry Detection [0.0]
Hamiltonian neural networks (HNN) have been introduced to incorporate prior physical knowledge.
We enhance HNN with a Lie algebra framework to detect and embed symmetries in the neural network.
arXiv Detail & Related papers (2023-01-19T07:34:57Z)
- Physics-Informed Machine Learning of Dynamical Systems for Efficient Bayesian Inference [0.0]
The No-U-Turn Sampler (NUTS) is a widely adopted method for performing Bayesian inference.
Hamiltonian neural networks (HNNs) are a noteworthy architecture.
We propose the use of HNNs to perform Bayesian inference efficiently without requiring numerous posterior gradients; a sketch of the mechanism follows this entry.
arXiv Detail & Related papers (2022-09-19T21:17:23Z)
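A sketch of the general mechanism behind the entry above, assuming the standard HMC/leapfrog scheme: the costly gradient of the negative log-posterior is replaced by a trained surrogate (e.g. the gradient field of an HNN), while a Metropolis correction with the exact log-posterior keeps the chain unbiased. `grad_U`, the step size, and the trajectory length are placeholders, not the paper's configuration.

```python
import torch

def leapfrog(q, p, grad_U, step, n_steps):
    """Leapfrog integrator for HMC. grad_U returns the gradient of the
    negative log-posterior; here it can be an HNN surrogate, so no
    posterior gradients are computed while sampling."""
    p = p - 0.5 * step * grad_U(q)
    for _ in range(n_steps - 1):
        q = q + step * p
        p = p - step * grad_U(q)
    q = q + step * p
    p = p - 0.5 * step * grad_U(q)
    return q, -p  # negate momentum for reversibility

def hmc_step(q, log_post, grad_U, step=0.05, n_steps=20):
    p = torch.randn_like(q)
    q_new, p_new = leapfrog(q.clone(), p, grad_U, step, n_steps)
    # Metropolis correction with the *exact* log-posterior keeps the
    # chain valid even when the surrogate gradient is imperfect.
    h_old = -log_post(q) + 0.5 * (p ** 2).sum()
    h_new = -log_post(q_new) + 0.5 * (p_new ** 2).sum()
    if torch.rand(()) < torch.exp(h_old - h_new):
        return q_new
    return q
```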
- Extrapolation and Spectral Bias of Neural Nets with Hadamard Product: a Polynomial Net Study [55.12108376616355]
Existing analysis of the neural tangent kernel (NTK) has focused on typical neural network architectures and is incomplete for neural networks with Hadamard products (NNs-Hp).
In this work, we derive the finite-width NTK formulation for a special class of NNs-Hp, i.e., polynomial neural networks.
We prove their equivalence to the kernel regression predictor with the associated NTK, which broadens the application scope of the NTK; the standard NTK definition is recalled after this entry.
arXiv Detail & Related papers (2022-09-16T06:36:06Z)
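The entry above revolves around the neural tangent kernel; for reference, a minimal statement of the standard definition and the kernel-regression predictor it induces (the notation is generic, not the paper's):

```latex
% NTK of a network f(x; \theta), and the kernel-regression predictor
% to which wide networks trained with squared loss are equivalent
% (training data (X, y)):
\[
  K_\theta(x, x') = \left\langle \nabla_\theta f(x;\theta),\;
                                 \nabla_\theta f(x';\theta) \right\rangle,
  \qquad
  \hat f_{\mathrm{NTK}}(x) = K(x, X)\, K(X, X)^{-1}\, y .
\]
```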
- The Spectral Bias of Polynomial Neural Networks [63.27903166253743]
Polynomial neural networks (PNNs) have been shown to be particularly effective at image generation and face recognition, where high-frequency information is critical.
Previous studies have revealed that neural networks exhibit a spectral bias towards low-frequency functions, which results in faster learning of low-frequency components during training.
Inspired by such studies, we conduct a spectral analysis of the neural tangent kernel (NTK) of PNNs.
We find that the Π-Net family, i.e., a recently proposed parametrization of PNNs, speeds up the learning of the higher frequencies.
arXiv Detail & Related papers (2022-02-27T23:12:43Z)
- A unified framework for Hamiltonian deep neural networks [3.0934684265555052]
Training deep neural networks (DNNs) can be difficult due to vanishing/exploding gradients during weight optimization.
We propose a class of DNNs stemming from the time discretization of Hamiltonian systems.
The proposed Hamiltonian framework, besides encompassing existing networks inspired by marginally stable ODEs, allows one to derive new and more expressive architectures; a sketch of one such layer follows this entry.
arXiv Detail & Related papers (2021-04-27T13:20:24Z)
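As an illustration of "DNNs from the time discretization of Hamiltonian systems", here is a minimal sketch of one such layer, assuming a forward-Euler discretization and a log-cosh Hamiltonian; the paper's framework covers more general (and more carefully discretized) choices.

```python
import torch
import torch.nn as nn

class HamiltonianLayer(nn.Module):
    """One discretized step of a Hamiltonian system used as a DNN layer:
    y_{k+1} = y_k + h * J * grad H(y_k), with J the canonical symplectic
    matrix. The structure keeps layer Jacobians close to norm-preserving,
    which relates to the non-vanishing-gradient property claimed above."""
    def __init__(self, dim, h=0.1):
        super().__init__()
        assert dim % 2 == 0
        self.K = nn.Linear(dim, dim)   # H(y) = sum log cosh(K y + b)
        self.h = h
        n = dim // 2
        J = torch.zeros(dim, dim)
        J[:n, n:] = torch.eye(n)
        J[n:, :n] = -torch.eye(n)
        self.register_buffer("J", J)

    def forward(self, y):
        # grad of H(y) = sum log cosh(K y + b)  is  K^T tanh(K y + b)
        grad_H = torch.tanh(self.K(y)) @ self.K.weight
        return y + self.h * grad_H @ self.J.T
```

Forward Euler is used here only for brevity; stacking such layers gives a residual-style network whose depth plays the role of integration time.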
- Adaptable Hamiltonian neural networks [0.0]
Hamiltonian Neural Networks (HNNs) represent a major class of physics-enhanced neural networks.
We introduce a class of HNNs capable of adaptable prediction of nonlinear physical systems.
We show that our parameter-cognizant HNN can successfully predict the route of transition to chaos.
arXiv Detail & Related papers (2021-02-25T23:53:51Z)
- Progressive Tandem Learning for Pattern Recognition with Deep Spiking Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z)
- Sparse Symplectically Integrated Neural Networks [15.191984347149667]
We introduce Sparse Symplectically Integrated Neural Networks (SSINNs).
SSINNs are a novel model for learning Hamiltonian dynamical systems from data.
We evaluate SSINNs on four classical Hamiltonian dynamical problems; a sketch of the idea follows this entry.
arXiv Detail & Related papers (2020-06-10T03:33:37Z)