Adaptable Hamiltonian neural networks
- URL: http://arxiv.org/abs/2102.13235v1
- Date: Thu, 25 Feb 2021 23:53:51 GMT
- Title: Adaptable Hamiltonian neural networks
- Authors: Chen-Di Han, Bryan Glaz, Mulugeta Haile, and Ying-Cheng Lai
- Abstract summary: Hamiltonian Neural Networks (HNNs) represent a major class of physics-enhanced neural networks.
We introduce a class of HNNs capable of adaptable prediction of nonlinear physical systems.
We show that our parameter-cognizant HNN can successfully predict the route of transition to chaos.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The rapid growth of research in exploiting machine learning to predict
chaotic systems has revived recent interest in Hamiltonian Neural Networks
(HNNs), a major class of physics-enhanced neural networks whose physical
constraints are defined by Hamilton's equations of motion. We introduce
a class of HNNs capable of adaptable prediction of nonlinear physical systems:
by training the neural network based on time series from a small number of
bifurcation-parameter values of the target Hamiltonian system, the HNN can
predict the dynamical states at other parameter values to which the network
has never been exposed.
The architecture of the HNN differs from previous ones in that we incorporate
an input parameter channel, rendering the HNN parameter-cognizant.
We demonstrate, using paradigmatic Hamiltonian systems, that training the HNN
using time series from as few as four parameter values endows the neural
machine with the ability to predict the state of the target system over an entire
parameter interval. Utilizing the ensemble maximum Lyapunov exponent and the
alignment index as indicators, we show that our parameter-cognizant HNN can
successfully predict the route of transition to chaos. Physics-enhanced machine
learning is a forefront research area, and our adaptable HNNs provide an
approach with broad applications.
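The core architectural idea is concrete enough to sketch. Below is a minimal PyTorch illustration, assuming only what the abstract states: a network maps (q, p, beta) to a scalar H, with the bifurcation parameter beta fed through an extra input channel, and is trained so that its automatic-differentiation gradients obey Hamilton's equations, dq/dt = dH/dp and dp/dt = -dH/dq. The class name, layer sizes, and derivative-matching loss are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class ParamHNN(nn.Module):
    """Sketch of a parameter-cognizant HNN: the learned scalar Hamiltonian
    H(q, p; beta) takes the bifurcation parameter beta through an extra
    input channel (names and sizes are assumptions)."""

    def __init__(self, dim=1, hidden=200):
        super().__init__()
        # input = 2*dim phase-space coordinates plus one parameter channel
        self.net = nn.Sequential(
            nn.Linear(2 * dim + 1, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, 1),
        )

    def forward(self, q, p, beta):
        return self.net(torch.cat([q, p, beta], dim=-1))

    def time_derivatives(self, q, p, beta):
        # Hamilton's equations via autograd: dq/dt = dH/dp, dp/dt = -dH/dq.
        q = q.requires_grad_(True)
        p = p.requires_grad_(True)
        H = self.forward(q, p, beta).sum()
        dHdq, dHdp = torch.autograd.grad(H, (q, p), create_graph=True)
        return dHdp, -dHdq

def loss_fn(model, q, p, beta, dq_dt, dp_dt):
    # Match the learned vector field to finite-difference derivatives
    # estimated from time series at a few training values of beta.
    dq_pred, dp_pred = model.time_derivatives(q, p, beta)
    return ((dq_pred - dq_dt) ** 2 + (dp_pred - dp_dt) ** 2).mean()
```

Once trained on a handful of beta values, the same network can be integrated at unseen beta and the resulting trajectories fed to chaos indicators. A minimal two-trajectory (Benettin-style) estimate of the maximum Lyapunov exponent over the learned dynamics might look as follows; the explicit-Euler integrator, step size, and renormalization scheme are again illustrative choices, not the ensemble procedure used in the paper.

```python
def euler_step(model, q, p, beta, dt):
    # One explicit-Euler step under the learned Hamiltonian vector field
    # (a symplectic integrator would be preferable; kept simple here).
    dq, dp = model.time_derivatives(q.detach(), p.detach(), beta)
    return (q + dt * dq).detach(), (p + dt * dp).detach()

def max_lyapunov_exponent(model, q0, p0, beta, dt=1e-3, steps=50_000, eps=1e-6):
    # Track a reference and a perturbed trajectory, renormalizing their
    # separation to eps after each step and averaging the log growth rate.
    q, p = q0, p0
    qd, pd = q0 + eps, p0
    total = 0.0
    for _ in range(steps):
        q, p = euler_step(model, q, p, beta, dt)
        qd, pd = euler_step(model, qd, pd, beta, dt)
        d = torch.sqrt(((qd - q) ** 2 + (pd - p) ** 2).sum())
        total += torch.log(d / eps).item()
        qd = q + (qd - q) * (eps / d)
        pd = p + (pd - p) * (eps / d)
    return total / (steps * dt)
```

A positive estimate at a given beta signals chaos, so sweeping beta across the interval traces out the route of transition the abstract refers to.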
Related papers
- Scalable Mechanistic Neural Networks [52.28945097811129]
We propose Scalable Mechanistic Neural Networks (S-MNN), an enhanced neural network framework designed for scientific machine learning applications involving long temporal sequences.
By reformulating the original Mechanistic Neural Network (MNN), we reduce the time and space complexities from cubic and quadratic in the sequence length, respectively, to linear.
Extensive experiments demonstrate that S-MNN matches the original MNN in precision while substantially reducing computational resources.
arXiv Detail & Related papers (2024-10-08T14:27:28Z)
- Applications of Machine Learning to Modelling and Analysing Dynamical Systems [0.0]
We propose an architecture which combines existing Hamiltonian Neural Network structures into Adaptable Symplectic Recurrent Neural Networks.
This architecture is found to significantly outperform previously proposed neural networks when predicting Hamiltonian dynamics.
We show that this method works efficiently for single-parameter potentials and provides accurate predictions even over long periods of time.
arXiv Detail & Related papers (2023-07-22T19:04:17Z)
- How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key to the performance of large-kernel convolutional neural network (LKCNN) models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z)
- Physics-Informed Learning Using Hamiltonian Neural Networks with Output Error Noise Models [0.0]
Hamiltonian Neural Networks (HNNs) implement Hamiltonian theory in deep learning.
This paper introduces an Output Error Hamiltonian Neural Network (OE-HNN) approach to modeling physical systems.
arXiv Detail & Related papers (2023-05-02T11:34:53Z)
- ConCerNet: A Contrastive Learning Based Framework for Automated Conservation Law Discovery and Trustworthy Dynamical System Prediction [82.81767856234956]
This paper proposes a new learning framework, ConCerNet, to improve the trustworthiness of DNN-based dynamics modeling.
We show that our method consistently outperforms the baseline neural networks in both coordinate error and conservation metrics.
arXiv Detail & Related papers (2023-02-11T21:07:30Z)
- Hamiltonian Neural Networks with Automatic Symmetry Detection [0.0]
Hamiltonian neural networks (HNNs) have been introduced to incorporate prior physical knowledge.
We enhance HNN with a Lie algebra framework to detect and embed symmetries in the neural network.
arXiv Detail & Related papers (2023-01-19T07:34:57Z)
- Learning to Learn with Generative Models of Neural Network Checkpoints [71.06722933442956]
We construct a dataset of neural network checkpoints and train a generative model on the parameters.
We find that our approach successfully generates parameters for a wide range of loss prompts.
We apply our method to different neural network architectures and tasks in supervised and reinforcement learning.
arXiv Detail & Related papers (2022-09-26T17:59:58Z)
- Learning Trajectories of Hamiltonian Systems with Neural Networks [81.38804205212425]
We propose to enhance Hamiltonian neural networks with an estimation of a continuous-time trajectory of the modeled system.
We demonstrate that the proposed integration scheme works well for HNNs, especially with low sampling rates and noisy, irregular observations.
arXiv Detail & Related papers (2022-04-11T13:25:45Z)
- Modeling from Features: a Mean-field Framework for Over-parameterized Deep Neural Networks [54.27962244835622]
This paper proposes a new mean-field framework for over-parameterized deep neural networks (DNNs).
In this framework, a DNN is represented by probability measures and functions over its features in the continuous limit.
We illustrate the framework via the standard DNN and the Residual Network (Res-Net) architectures.
arXiv Detail & Related papers (2020-07-03T01:37:16Z)
- Sparse Symplectically Integrated Neural Networks [15.191984347149667]
We introduce Sparse Symplectically Integrated Neural Networks (SSINNs).
SSINNs are a novel model for learning Hamiltonian dynamical systems from data.
We evaluate SSINNs on four classical Hamiltonian dynamical problems.
arXiv Detail & Related papers (2020-06-10T03:33:37Z)
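To make the SSINN entry above more concrete: the idea of representing the Hamiltonian as a sparse combination of candidate basis functions and fitting it through a symplectic integration step can be sketched as below. The basis dictionary, leapfrog step, and L1 weight are assumptions chosen for illustration, not details taken from that paper.

```python
import torch

# Candidate terms for a 1-D Hamiltonian H(q, p); the true Hamiltonian is
# assumed to be a sparse combination of these (an illustrative dictionary).
BASIS = [
    lambda q, p: p ** 2,
    lambda q, p: q ** 2,
    lambda q, p: q ** 4,
    lambda q, p: torch.cos(q),
]
coeffs = (0.1 * torch.randn(len(BASIS))).requires_grad_(True)

def H(q, p):
    return sum(c * f(q, p) for c, f in zip(coeffs, BASIS))

def leapfrog(q, p, dt):
    # One symplectic (leapfrog) step under the learned Hamiltonian.
    q = q.requires_grad_(True)
    p = p.requires_grad_(True)
    grad = lambda out, x: torch.autograd.grad(out.sum(), x, create_graph=True)[0]
    p_half = p - 0.5 * dt * grad(H(q, p), q)
    q_next = q + dt * grad(H(q, p_half), p_half)
    p_next = p_half - 0.5 * dt * grad(H(q_next, p_half), q_next)
    return q_next, p_next

def step_loss(q, p, q_obs, p_obs, dt, l1=1e-3):
    # One-step prediction error plus an L1 penalty that drives most
    # basis coefficients to zero (the sparsity prior).
    q_pred, p_pred = leapfrog(q, p, dt)
    mse = ((q_pred - q_obs) ** 2 + (p_pred - p_obs) ** 2).mean()
    return mse + l1 * coeffs.abs().sum()
```

A training loop would minimize step_loss over observed trajectory pairs with, e.g., torch.optim.Adam([coeffs]), then prune near-zero coefficients to read off a closed-form Hamiltonian.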
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.