SympNets: Intrinsic structure-preserving symplectic networks for
identifying Hamiltonian systems
- URL: http://arxiv.org/abs/2001.03750v3
- Date: Wed, 19 Aug 2020 06:14:49 GMT
- Authors: Pengzhan Jin, Zhen Zhang, Aiqing Zhu, Yifa Tang and George Em
Karniadakis
- Score: 2.6016814327894466
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose new symplectic networks (SympNets) for identifying Hamiltonian
systems from data based on a composition of linear, activation and gradient
modules. In particular, we define two classes of SympNets: the LA-SympNets
composed of linear and activation modules, and the G-SympNets composed of
gradient modules. Correspondingly, we prove two new universal approximation
theorems that demonstrate that SympNets can approximate arbitrary symplectic
maps based on appropriate activation functions. We then perform several
experiments including the pendulum, double pendulum and three-body problems to
investigate the expressivity and the generalization ability of SympNets. The
simulation results show that even very small SympNets can generalize well,
and are able to handle both separable and non-separable Hamiltonian systems
with data points resulting from short or long time steps. In all the test
cases, SympNets outperform the baseline models, and are much faster in training
and prediction. We also develop an extended version of SympNets to learn the
dynamics from irregularly sampled data. This extended version of SympNets can
be thought of as a universal model representing the solution to an arbitrary
Hamiltonian system.
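The gradient modules above admit a compact illustration. Below is a minimal NumPy sketch of one "up" gradient module, under the assumption that it takes the shear form (p, q) -> (p, q + K^T(a * sigma(Kp + b))), i.e. q is updated by the gradient of a scalar potential in p; the parameter names K, a, b and the helper `g_sympnet_up` are illustrative, not the authors' code:

```python
import numpy as np

def g_sympnet_up(p, q, K, a, b, sigma=np.tanh):
    """One 'up' gradient module (illustrative sketch): p is left
    untouched and q is sheared by K.T @ (a * sigma(K @ p + b)),
    which is the gradient in p of the scalar potential
    V(p) = sum(a * antiderivative_of_sigma(K @ p + b)).
    A map (p, q) -> (p, q + grad V(p)) is symplectic by construction,
    for any values of the trainable parameters K, a, b."""
    return p, q + K.T @ (a * sigma(K @ p + b))

# toy usage: push a 2-degree-of-freedom phase point through one module
rng = np.random.default_rng(0)
d, width = 2, 8
K = rng.standard_normal((width, d))
a = rng.standard_normal(width)
b = rng.standard_normal(width)
p0, q0 = rng.standard_normal(d), rng.standard_normal(d)
p1, q1 = g_sympnet_up(p0, q0, K, a, b)
```

Because the q-update is a gradient of a function of p alone, the Jacobian is block unit-triangular with a symmetric off-diagonal block, so the map satisfies M^T J M = J exactly; stacking such modules (alternating with their "low" counterparts that shear p instead) keeps the composition symplectic.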
Related papers
- SympGNNs: Symplectic Graph Neural Networks for identifying high-dimensional Hamiltonian systems and node classification [4.275204859038151]
Symplectic Graph Neural Networks (SympGNNs) can effectively handle system identification in high-dimensional Hamiltonian systems.
We show that SympGNN can overcome the oversmoothing and heterophily problems, two key challenges in the field of graph neural networks.
arXiv Detail & Related papers (2024-08-29T16:47:58Z)
- CLPNets: Coupled Lie-Poisson Neural Networks for Multi-Part Hamiltonian Systems with Symmetries [0.0]
We develop a novel method of data-based computation and complete phase space learning of Hamiltonian systems.
We derive a novel system of mappings that are built into neural networks for coupled systems.
Our method shows good resistance to the curse of dimensionality, requiring only a few thousand data points for all cases studied.
arXiv Detail & Related papers (2024-08-28T22:45:15Z)
- Symplectic Neural Networks Based on Dynamical Systems [0.0]
We present and analyze a framework for symplectic neural networks (SympNets) based on geometric integrators for Hamiltonian differential equations.
The SympNets are universal approximators in the space of Hamiltonian diffeomorphisms, are interpretable, and have a non-vanishing gradient property.
arXiv Detail & Related papers (2024-08-19T09:18:28Z)
- Enhancing lattice kinetic schemes for fluid dynamics with Lattice-Equivariant Neural Networks [79.16635054977068]
We present a new class of equivariant neural networks, dubbed Lattice-Equivariant Neural Networks (LENNs).
Our approach develops within a recently introduced framework aimed at learning neural network-based surrogate models of Lattice Boltzmann collision operators.
Our work opens the way towards practical utilization of machine learning-augmented Lattice Boltzmann CFD in real-world simulations.
arXiv Detail & Related papers (2024-05-22T17:23:15Z)
- RU-Net: Regularized Unrolling Network for Scene Graph Generation [92.95032610978511]
Scene graph generation (SGG) aims to detect objects and predict the relationships between each pair of objects.
Existing SGG methods usually suffer from several issues, including 1) ambiguous object representations, and 2) low diversity in relationship predictions.
We propose a regularized unrolling network (RU-Net) to address both problems.
arXiv Detail & Related papers (2022-05-03T04:21:15Z)
- Locally-symplectic neural networks for learning volume-preserving dynamics [0.0]
We propose locally-symplectic neural networks (LocSympNets) for learning volume-preserving dynamics.
The construction of LocSympNets stems from the theorem of local Hamiltonian description of the vector field of a volume-preserving dynamical system.
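A volume-preserving module of this kind can be motivated by a simple fact: updating one coordinate by a function of the remaining coordinates gives a map whose Jacobian is unit-triangular, hence has determinant one. A hedged sketch of that generic building block follows (the function name and the test map are illustrative assumptions, not the paper's actual construction):

```python
import numpy as np

def triangular_update(x, i, g):
    """Update coordinate i of x by a function g of the remaining
    coordinates.  The Jacobian of this map is triangular with ones on
    the diagonal, so its determinant is exactly 1 and the map preserves
    phase-space volume.  (A generic illustration only, not the actual
    LocSympNets modules.)"""
    y = x.astype(float).copy()
    y[i] = x[i] + g(np.delete(x, i))
    return y

# toy usage on a 3-dimensional state
rng = np.random.default_rng(1)
x0 = rng.standard_normal(3)
x1 = triangular_update(x0, 0, lambda u: np.sin(u).sum())
```

Composing several such updates, each acting on a different coordinate, keeps the overall map volume-preserving, since a product of unit-determinant Jacobians still has determinant one.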
arXiv Detail & Related papers (2021-09-19T15:58:09Z)
- The Separation Capacity of Random Neural Networks [78.25060223808936]
We show that a sufficiently large two-layer ReLU network with standard Gaussian weights and uniformly distributed biases can solve the separation problem with high probability.
We quantify the relevant structure of the data in terms of a novel notion of mutual complexity.
arXiv Detail & Related papers (2021-07-31T10:25:26Z)
- PredRNN: A Recurrent Neural Network for Spatiotemporal Predictive Learning [109.84770951839289]
We present PredRNN, a new recurrent network for learning visual dynamics from historical context.
We show that our approach obtains highly competitive results on three standard datasets.
arXiv Detail & Related papers (2021-03-17T08:28:30Z)
- A Multi-Semantic Metapath Model for Large Scale Heterogeneous Network Representation Learning [52.83948119677194]
We propose a multi-semantic metapath (MSM) model for large-scale heterogeneous network representation learning.
Specifically, we generate multi-semantic metapath-based random walks to construct heterogeneous neighborhoods that handle the unbalanced distributions.
We conduct systematic evaluations of the proposed framework on two challenging datasets: Amazon and Alibaba.
arXiv Detail & Related papers (2020-07-19T22:50:20Z)
- Multipole Graph Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences of its use.