Nonseparable Symplectic Neural Networks
- URL: http://arxiv.org/abs/2010.12636v3
- Date: Sat, 19 Feb 2022 22:35:35 GMT
- Title: Nonseparable Symplectic Neural Networks
- Authors: Shiying Xiong, Yunjin Tong, Xingzhe He, Shuqi Yang, Cheng Yang, Bo Zhu
- Abstract summary: We propose a novel neural network architecture, Nonseparable Symplectic Neural Networks (NSSNNs).
NSSNNs uncover and embed the symplectic structure of a nonseparable Hamiltonian system from limited observation data.
We show the unique computational merits of our approach to yield long-term, accurate, and robust predictions for large-scale Hamiltonian systems.
- Score: 23.77058934710737
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Predicting the behaviors of Hamiltonian systems has been drawing increasing
attention in scientific machine learning. However, the vast majority of the
literature has focused on predicting separable Hamiltonian systems, whose
kinetic and potential energy terms are explicitly decoupled, while data-driven
paradigms for predicting nonseparable Hamiltonian systems, which are ubiquitous
in fluid dynamics and quantum mechanics, have rarely been explored. The main
computational challenge lies in effectively embedding symplectic priors that
describe the inherently coupled evolution of position and momentum, which
typically exhibits intricate dynamics. To address this problem, we propose a
novel neural network architecture, Nonseparable Symplectic Neural Networks
(NSSNNs), to uncover and embed the symplectic structure of a nonseparable
Hamiltonian system from limited observation data. The enabling mechanism of our
approach is an augmented symplectic time integrator that decouples the position
and momentum energy terms and facilitates their evolution. We demonstrate the
efficacy and versatility of our method by predicting a wide range of
Hamiltonian systems, both separable and nonseparable, including chaotic
vortical flows. We show the unique computational merits of our approach in
yielding long-term, accurate, and robust predictions for large-scale
Hamiltonian systems by rigorously enforcing symplectomorphism.
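A short sketch may help make the "augmented symplectic time integrator" concrete. One common way to obtain an explicit symplectic scheme for a nonseparable H(q, p), and the route NSSNNs build on, is to extend the phase space with auxiliary copies (x, y) of the coordinates, evolve the sub-Hamiltonians H(q, y), H(x, p), and an artificial coupling term of strength omega in a symmetric splitting, and read the prediction off (q, p). The code below is a minimal sketch of one such step, assuming a Tao-style splitting, PyTorch autograd for the gradients of a generic differentiable H (for example a neural network), and illustrative function names; it is not the authors' released implementation.

```python
import math
import torch

def grads(H, a, b):
    """Gradients (dH/da, dH/db) of a scalar Hamiltonian H(a, b) via autograd."""
    a = a.detach().requires_grad_(True)
    b = b.detach().requires_grad_(True)
    dHda, dHdb = torch.autograd.grad(H(a, b).sum(), (a, b))
    return dHda, dHdb

def augmented_step(H, q, p, x, y, dt, omega=10.0):
    """One second-order augmented symplectic step for a possibly nonseparable
    Hamiltonian H(q, p), using auxiliary variables (x, y) and an artificial
    coupling of strength omega (Tao-style splitting; an illustrative sketch)."""
    def flow_A(q, p, x, y, d):          # exact flow of the sub-Hamiltonian H(q, y)
        dHdq, dHdy = grads(H, q, y)
        return q, p - d * dHdq, x + d * dHdy, y

    def flow_B(q, p, x, y, d):          # exact flow of the sub-Hamiltonian H(x, p)
        dHdx, dHdp = grads(H, x, p)
        return q + d * dHdp, p, x, y - d * dHdx

    def flow_C(q, p, x, y, d):          # exact flow of the omega-coupling term
        c, s = math.cos(2.0 * omega * d), math.sin(2.0 * omega * d)
        sq, sp = q + x, p + y           # invariant sums
        dq, dp = q - x, p - y           # differences rotate at rate 2*omega
        return (0.5 * (sq + c * dq + s * dp),
                0.5 * (sp - s * dq + c * dp),
                0.5 * (sq - c * dq - s * dp),
                0.5 * (sp + s * dq - c * dp))

    # Strang composition: A(dt/2) B(dt/2) C(dt) B(dt/2) A(dt/2)
    q, p, x, y = flow_A(q, p, x, y, dt / 2)
    q, p, x, y = flow_B(q, p, x, y, dt / 2)
    q, p, x, y = flow_C(q, p, x, y, dt)
    q, p, x, y = flow_B(q, p, x, y, dt / 2)
    q, p, x, y = flow_A(q, p, x, y, dt / 2)
    return q, p, x, y

# Example with a nonseparable Hamiltonian H(q, p) = 0.5 * (q^2 + 1) * (p^2 + 1).
if __name__ == "__main__":
    H = lambda q, p: 0.5 * (q ** 2 + 1.0) * (p ** 2 + 1.0)
    q, p = torch.tensor([-3.0]), torch.tensor([0.0])
    x, y = q.clone(), p.clone()
    for _ in range(2000):
        q, p, x, y = augmented_step(H, q, p, x, y, dt=0.01)
    print(float(H(q, p)))  # should stay close to the initial energy H(-3, 0) = 5.0
```

In an NSSNN-style training loop one would parameterize H with a network, differentiate through a few of these steps, and fit observed (q, p) trajectories; each composed sub-flow above is an exact symplectic map, so the overall update is a symplectomorphism on the augmented space.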
Related papers
- Artificial Kuramoto Oscillatory Neurons [65.16453738828672]
We introduce Artificial Kuramoto Oscillatory Neurons (AKOrN) as a dynamical alternative to threshold units.
We show that this idea provides performance improvements across a wide spectrum of tasks.
We believe that these empirical results show the importance of our assumptions at the most basic neuronal level of neural representation.
arXiv Detail & Related papers (2024-10-17T17:47:54Z)
- Learning Generalized Hamiltonians using fully Symplectic Mappings [0.32985979395737786]
Hamiltonian systems have the important property of being conservative, that is, energy is conserved throughout the evolution.
In particular, Hamiltonian Neural Networks have emerged as a mechanism to incorporate structural inductive bias into the NN model.
We show that symplectic schemes are robust to noise and provide a good approximation of the system Hamiltonian when the state variables are sampled from a noisy observation.
arXiv Detail & Related papers (2024-09-17T12:45:49Z)
- Injecting Hamiltonian Architectural Bias into Deep Graph Networks for Long-Range Propagation [55.227976642410766]
The dynamics of information diffusion within graphs is a critical open issue that heavily influences graph representation learning.
Motivated by this, we introduce (port-)Hamiltonian Deep Graph Networks.
We reconcile both non-dissipative long-range propagation and non-conservative behaviors under a single theoretical and practical framework.
arXiv Detail & Related papers (2024-05-27T13:36:50Z)
- Coarse-Graining Hamiltonian Systems Using WSINDy [0.0]
We show that WSINDy can successfully identify a reduced Hamiltonian system in the presence of large intrinsics.
WSINDy naturally preserves the Hamiltonian structure by restricting to a trial basis of Hamiltonian vector fields.
We also provide a contribution to averaging theory by proving that first-order averaging at the level of vector fields preserves Hamiltonian structure in nearly-periodic Hamiltonian systems.
arXiv Detail & Related papers (2023-10-09T17:20:04Z)
- Applications of Machine Learning to Modelling and Analysing Dynamical Systems [0.0]
We propose an architecture which combines existing Hamiltonian Neural Network structures into Adaptable Symplectic Recurrent Neural Networks.
This architecture is found to significantly outperform previously proposed neural networks when predicting Hamiltonian dynamics.
We show that this method works efficiently for single parameter potentials and provides accurate predictions even over long periods of time.
arXiv Detail & Related papers (2023-07-22T19:04:17Z)
- Robust Hamiltonian Engineering for Interacting Qudit Systems [50.591267188664666]
We develop a formalism for the robust dynamical decoupling and Hamiltonian engineering of strongly interacting qudit systems.
We experimentally demonstrate these techniques in a strongly-interacting, disordered ensemble of spin-1 nitrogen-vacancy centers.
arXiv Detail & Related papers (2023-05-16T19:12:41Z)
- Learning Trajectories of Hamiltonian Systems with Neural Networks [81.38804205212425]
We propose to enhance Hamiltonian neural networks with an estimation of a continuous-time trajectory of the modeled system.
We demonstrate that the proposed integration scheme works well for HNNs, especially with low sampling rates and noisy, irregular observations.
arXiv Detail & Related papers (2022-04-11T13:25:45Z)
- Learning Neural Hamiltonian Dynamics: A Methodological Overview [109.40968389896639]
Hamiltonian dynamics endows neural networks with accurate long-term prediction, interpretability, and data-efficient learning.
We systematically survey recently proposed Hamiltonian neural network models, with a special emphasis on methodologies.
arXiv Detail & Related papers (2022-02-28T22:54:39Z)
- Learning Hamiltonians of constrained mechanical systems [0.0]
Hamiltonian systems are an elegant and compact formalism in classical mechanics.
We propose new approaches for the accurate approximation of the Hamiltonian function of constrained mechanical systems.
arXiv Detail & Related papers (2022-01-31T14:03:17Z)
- Sparse Symplectically Integrated Neural Networks [15.191984347149667]
We introduce Sparse Symplectically Integrated Neural Networks (SSINNs).
SSINNs are a novel model for learning Hamiltonian dynamical systems from data.
We evaluate SSINNs on four classical Hamiltonian dynamical problems.
arXiv Detail & Related papers (2020-06-10T03:33:37Z)
- On dissipative symplectic integration with applications to gradient-based optimization [77.34726150561087]
We propose a geometric framework in which discretizations can be realized systematically.
We show that a generalization of symplectic integrators to nonconservative and, in particular, dissipative Hamiltonian systems is able to preserve rates of convergence up to a controlled error (a generic dissipative step is sketched below).
arXiv Detail & Related papers (2020-04-15T00:36:49Z)
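For the last entry above, a rough sense of how dissipative (conformal) symplectic integration connects to gradient-based optimization can be given with a hedged sketch: discretizing the damped system q' = p, p' = -grad f(q) - gamma * p with a conformal-symplectic Euler step yields a heavy-ball-style momentum update. The scheme, the quadratic test function, and all parameter values below are illustrative assumptions, not taken from that paper.

```python
import numpy as np

def conformal_symplectic_step(grad_f, q, p, dt, gamma):
    """One conformal-symplectic Euler step for the damped Hamiltonian system
    q' = p, p' = -grad f(q) - gamma * p (a generic sketch)."""
    p = np.exp(-gamma * dt) * p - dt * grad_f(q)   # damp momentum, then kick
    q = q + dt * p                                  # drift
    return q, p

# Example: minimizing f(q) = 0.5 * ||q||^2 resembles heavy-ball momentum descent.
grad_f = lambda q: q
q, p = np.array([5.0, -3.0]), np.zeros(2)
for _ in range(500):
    q, p = conformal_symplectic_step(grad_f, q, p, dt=0.1, gamma=1.0)
print(q)  # approaches the minimizer at the origin
```

The exp(-gamma * dt) factor contracts phase-space volume at a constant rate, which is the conformal-symplectic property that such analyses rely on when relating discrete updates to continuous-time convergence rates.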
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.