Applications of Machine Learning to Modelling and Analysing Dynamical
Systems
- URL: http://arxiv.org/abs/2308.03763v1
- Date: Sat, 22 Jul 2023 19:04:17 GMT
- Title: Applications of Machine Learning to Modelling and Analysing Dynamical
Systems
- Authors: Vedanta Thapar
- Abstract summary: We propose an architecture which combines existing Hamiltonian Neural Network structures into Adaptable Symplectic Recurrent Neural Networks.
This architecture is found to significantly outperform previously proposed neural networks when predicting Hamiltonian dynamics.
We show that this method works efficiently for single parameter potentials and provides accurate predictions even over long periods of time.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We explore the use of Physics-Informed Neural Networks to analyse nonlinear
Hamiltonian Dynamical Systems with a first integral of motion. In this work, we
propose an architecture which combines existing Hamiltonian Neural Network
structures into Adaptable Symplectic Recurrent Neural Networks which preserve
Hamilton's equations as well as the symplectic structure of phase space while
predicting dynamics for the entire parameter space. This architecture is found
to significantly outperform previously proposed neural networks when predicting
Hamiltonian dynamics, especially in potentials that contain multiple
parameters. We demonstrate its robustness using the nonlinear Hénon-Heiles
potential under chaotic, quasiperiodic and periodic conditions.
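As a rough illustration of the core Hamiltonian-network idea (not the paper's Adaptable Symplectic RNN), a model outputs a scalar H(q, p) and the dynamics follow from Hamilton's equations, dq/dt = dH/dp and dp/dt = -dH/dq. In this sketch a toy closed-form H stands in for a learned network, and the gradient is taken by finite differences where a real HNN would use automatic differentiation.

```python
import numpy as np

def H(state):
    # Harmonic oscillator Hamiltonian as a stand-in for a learned network.
    q, p = state
    return 0.5 * p**2 + 0.5 * q**2

def hamiltonian_vector_field(H, state, eps=1e-6):
    # Central differences approximate the gradient of H at `state`.
    grad = np.zeros_like(state)
    for i in range(len(state)):
        dx = np.zeros_like(state)
        dx[i] = eps
        grad[i] = (H(state + dx) - H(state - dx)) / (2 * eps)
    dH_dq, dH_dp = grad
    # Hamilton's equations give the phase-space velocity (dq/dt, dp/dt).
    return np.array([dH_dp, -dH_dq])

state = np.array([1.0, 0.0])
print(hamiltonian_vector_field(H, state))  # approximately [0., -1.]
```

Because the vector field is derived from a single scalar H rather than fitted directly, trajectories generated this way conserve the learned Hamiltonian by construction, which is the property the symplectic architectures above build on.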
The second problem we tackle is whether the high-dimensional nonlinear
capabilities of neural networks can be used to predict the dynamics of a
Hamiltonian system given only partial observations of it. We therefore use
Long Short-Term Memory networks to implement Takens' embedding theorem: we
construct a delay embedding of the system and then map the topologically
invariant attractor back to its true form. This
architecture is then layered with Adaptable Symplectic nets to allow for
predictions which preserve the structure of Hamilton's equations. We show that
this method works efficiently for single parameter potentials and provides
accurate predictions even over long periods of time.
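The delay-embedding step behind Takens' theorem can be sketched as follows: a scalar observation series is lifted into vectors of m time-shifted copies spaced tau steps apart, whose attractor is topologically equivalent to that of the full system. The LSTM that maps the embedding back to true coordinates is omitted here, and the values of m and tau are illustrative.

```python
import numpy as np

def delay_embed(series, m=3, tau=2):
    """Return the m-dimensional delay embedding of a 1-D series.

    Row j is [s[j], s[j + tau], ..., s[j + (m - 1) * tau]].
    """
    n = len(series) - (m - 1) * tau  # number of complete embedding vectors
    return np.stack([series[i * tau : i * tau + n] for i in range(m)], axis=1)

s = np.sin(0.1 * np.arange(100))   # partial observation of some system
X = delay_embed(s, m=3, tau=2)
print(X.shape)                      # (96, 3)
```

In practice m and tau are chosen from the data (e.g. via false-nearest-neighbour and mutual-information heuristics) so that the embedded attractor unfolds without self-intersections.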
Related papers
- How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key for the performance of LKCNN models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z)
- Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z)
- ConCerNet: A Contrastive Learning Based Framework for Automated
Conservation Law Discovery and Trustworthy Dynamical System Prediction [82.81767856234956]
This paper proposes a new learning framework named ConCerNet to improve the trustworthiness of the DNN based dynamics modeling.
We show that our method consistently outperforms the baseline neural networks in both coordinate error and conservation metrics.
arXiv Detail & Related papers (2023-02-11T21:07:30Z)
- Hamiltonian Neural Networks with Automatic Symmetry Detection [0.0]
Hamiltonian neural networks (HNN) have been introduced to incorporate prior physical knowledge.
We enhance HNN with a Lie algebra framework to detect and embed symmetries in the neural network.
arXiv Detail & Related papers (2023-01-19T07:34:57Z)
- Learning Trajectories of Hamiltonian Systems with Neural Networks [81.38804205212425]
We propose to enhance Hamiltonian neural networks with an estimation of a continuous-time trajectory of the modeled system.
We demonstrate that the proposed integration scheme works well for HNNs, especially with low sampling rates, noisy and irregular observations.
arXiv Detail & Related papers (2022-04-11T13:25:45Z)
- Learning Neural Hamiltonian Dynamics: A Methodological Overview [109.40968389896639]
Hamiltonian dynamics endows neural networks with accurate long-term prediction, interpretability, and data-efficient learning.
We systematically survey recently proposed Hamiltonian neural network models, with a special emphasis on methodologies.
arXiv Detail & Related papers (2022-02-28T22:54:39Z)
- Locally-symplectic neural networks for learning volume-preserving
dynamics [0.0]
We propose locally-symplectic neural networks LocSympNets for learning volume-preserving dynamics.
The construction of LocSympNets stems from the theorem of local Hamiltonian description of the vector field of a volume-preserving dynamical system.
arXiv Detail & Related papers (2021-09-19T15:58:09Z)
- Nonseparable Symplectic Neural Networks [23.77058934710737]
We propose a novel neural network architecture, Nonseparable Symplectic Neural Networks (NSSNNs).
NSSNNs uncover and embed the symplectic structure of a nonseparable Hamiltonian system from limited observation data.
We show the unique computational merits of our approach to yield long-term, accurate, and robust predictions for large-scale Hamiltonian systems.
arXiv Detail & Related papers (2020-10-23T19:50:13Z)
- Sparse Symplectically Integrated Neural Networks [15.191984347149667]
We introduce Sparse Symplectically Integrated Neural Networks (SSINNs).
SSINNs are a novel model for learning Hamiltonian dynamical systems from data.
We evaluate SSINNs on four classical Hamiltonian dynamical problems.
arXiv Detail & Related papers (2020-06-10T03:33:37Z)
- Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
arXiv Detail & Related papers (2020-06-08T09:53:35Z)
- Symplectic Neural Networks in Taylor Series Form for Hamiltonian Systems [15.523425139375226]
We propose an effective and lightweight learning algorithm, Symplectic Taylor Neural Networks (Taylor-nets).
We conduct continuous, long-term predictions of a complex Hamiltonian dynamic system based on sparse, short-term observations.
We demonstrate the efficacy of our Taylor-net in predicting a broad spectrum of Hamiltonian dynamic systems, including the pendulum, the Lotka-Volterra, the Kepler, and the Hénon-Heiles systems.
arXiv Detail & Related papers (2020-05-11T10:32:29Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it contains and is not responsible for any consequences of its use.