Learning Poisson systems and trajectories of autonomous systems via
Poisson neural networks
- URL: http://arxiv.org/abs/2012.03133v1
- Date: Sat, 5 Dec 2020 22:18:29 GMT
- Title: Learning Poisson systems and trajectories of autonomous systems via
Poisson neural networks
- Authors: Pengzhan Jin, Zhen Zhang, Ioannis G. Kevrekidis and George Em
Karniadakis
- Abstract summary: We propose the Poisson neural networks (PNNs) to learn Poisson systems and trajectories of autonomous systems from data.
Based on the Darboux-Lie theorem, the phase flow of a Poisson system can be written as the composition of (1) a coordinate transformation, (2) an extended symplectic map and (3) the inverse of the transformation.
- Score: 3.225972620435058
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose the Poisson neural networks (PNNs) to learn Poisson systems and
trajectories of autonomous systems from data. Based on the Darboux-Lie theorem,
the phase flow of a Poisson system can be written as the composition of (1) a
coordinate transformation, (2) an extended symplectic map and (3) the inverse
of the transformation. In this work, we extend this result to the unknotted
trajectories of autonomous systems. We employ structured neural networks with
physical priors to approximate the three aforementioned maps. We demonstrate
through several simulations that PNNs are capable of handling very accurately
several challenging tasks, including the motion of a particle in the
electromagnetic potential, the nonlinear Schrödinger equation, and pixel
observations of the two-body problem.
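The three-map composition described in the abstract can be sketched numerically. In the sketch below, the linear coordinate transformation, the toy Hamiltonian, and the step size are all our own illustrative stand-ins for the learned networks, not the authors' architecture:

```python
import numpy as np

# Illustrative sketch of the PNN composition phi^{-1} o S o phi from the
# abstract. In the paper, phi and S are learned networks; here we use a
# fixed invertible linear map and a symplectic Euler step as stand-ins.

def phi(z, A):
    """Invertible coordinate transformation (here: a linear map)."""
    return A @ z

def phi_inv(z, A):
    """Inverse of the coordinate transformation."""
    return np.linalg.solve(A, z)

def symplectic_step(z, h=0.1):
    """Symplectic map in Darboux coordinates (q, p): symplectic Euler
    for the illustrative Hamiltonian H = p^2/2 + q^2/2."""
    q, p = z
    p_new = p - h * q          # momentum update uses the old position
    q_new = q + h * p_new      # position update uses the new momentum
    return np.array([q_new, p_new])

def pnn_flow(z, A, h=0.1):
    """One time step: transform, symplectic update, transform back."""
    return phi_inv(symplectic_step(phi(z, A), h), A)
```

The symplectic Euler stand-in has Jacobian determinant exactly 1, so the composed map inherits a volume-preserving structure even though the outer transformation is arbitrary (as long as it is invertible).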
Related papers
- Fourier Neural Operators for Learning Dynamics in Quantum Spin Systems [77.88054335119074]
We use FNOs to model the evolution of random quantum spin systems.
We apply FNOs to a compact set of Hamiltonian observables instead of the entire $2^n$-dimensional quantum wavefunction.
arXiv Detail & Related papers (2024-09-05T07:18:09Z)
- CLPNets: Coupled Lie-Poisson Neural Networks for Multi-Part Hamiltonian Systems with Symmetries [0.0]
We develop a novel method of data-based computation and complete phase space learning of Hamiltonian systems.
We derive a novel system of mappings that are built into neural networks for coupled systems.
Our method shows good resistance to the curse of dimensionality, requiring only a few thousand data points for all cases studied.
arXiv Detail & Related papers (2024-08-28T22:45:15Z)
- Enhancing lattice kinetic schemes for fluid dynamics with Lattice-Equivariant Neural Networks [79.16635054977068]
We present a new class of equivariant neural networks, dubbed Lattice-Equivariant Neural Networks (LENNs).
Our approach develops within a recently introduced framework aimed at learning neural network-based surrogate models of Lattice Boltzmann collision operators.
Our work opens towards practical utilization of machine learning-augmented Lattice Boltzmann CFD in real-world simulations.
arXiv Detail & Related papers (2024-05-22T17:23:15Z)
- Learning Governing Equations of Unobserved States in Dynamical Systems [0.0]
We employ a hybrid neural ODE structure to learn governing equations of partially-observed dynamical systems.
We demonstrate that the method is capable of successfully learning the true underlying governing equations of unobserved states within these systems.
arXiv Detail & Related papers (2024-04-29T10:28:14Z)
- Lie-Poisson Neural Networks (LPNets): Data-Based Computing of Hamiltonian Systems with Symmetries [0.0]
An accurate data-based prediction of the long-term evolution of Hamiltonian systems requires a network that preserves the appropriate structure under each time step.
We present two flavors of such systems: one, where the parameters of transformations are computed from data using a dense neural network (LPNets), and another, where the composition of transformations is used as building blocks (G-LPNets).
The resulting methods are important for the construction of accurate data-based methods for simulating the long-term dynamics of physical systems.
arXiv Detail & Related papers (2023-08-29T14:45:23Z)
- Pseudo-Hamiltonian neural networks for learning partial differential equations [0.0]
Pseudo-Hamiltonian neural networks (PHNN) were recently introduced for learning dynamical systems that can be modelled by ordinary differential equations.
In this paper, we extend the method to partial differential equations.
The resulting model is comprised of up to three neural networks, modelling terms representing conservation, dissipation and external forces, and discrete convolution operators that can either be learned or be given as input.
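The three-term decomposition described above can be illustrated with a small sketch. Everything here (the Burgers-type flux, the viscosity coefficient, and the finite-difference stencils) is an assumption standing in for the paper's learned networks and learned convolution operators:

```python
import numpy as np

# Illustrative decomposition in the spirit of PHNN for PDEs: a conservative
# term, a dissipative term, and an external forcing term, combined via
# discrete convolution stencils. The concrete terms below are our own
# stand-ins, not the learned networks from the paper.

def conv1d(u, stencil):
    """Periodic discrete convolution, e.g. a finite-difference operator."""
    n, k = len(u), len(stencil)
    return np.array([sum(stencil[j] * u[(i + j - k // 2) % n]
                         for j in range(k)) for i in range(n)])

def rhs(u, dx, f_ext=0.0):
    """du/dt = conservative + dissipative + external forcing (sketch)."""
    d_dx = np.array([-1.0, 0.0, 1.0]) / (2 * dx)  # central difference
    lap = np.array([1.0, -2.0, 1.0]) / dx**2       # discrete Laplacian
    conservative = -conv1d(0.5 * u**2, d_dx)       # Burgers-type flux term
    dissipative = 0.01 * conv1d(u, lap)            # viscous dissipation
    return conservative + dissipative + f_ext
```

In the actual method, the stencils and the three terms would be parameterized by neural networks and fitted to data; the fixed stencils above only show how the pieces compose.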
arXiv Detail & Related papers (2023-04-27T17:46:00Z)
- Deep learning of spatial densities in inhomogeneous correlated quantum systems [0.0]
We show that we can learn to predict densities using convolutional neural networks trained on random potentials.
We show that our approach can handle well the interplay of interference and interactions and the behaviour of models with phase transitions in inhomogeneous situations.
arXiv Detail & Related papers (2022-11-16T17:10:07Z)
- Data-driven emergence of convolutional structure in neural networks [83.4920717252233]
We show how fully-connected neural networks solving a discrimination task can learn a convolutional structure directly from their inputs.
By carefully designing data models, we show that the emergence of this pattern is triggered by the non-Gaussian, higher-order local structure of the inputs.
arXiv Detail & Related papers (2022-02-01T17:11:13Z)
- A Predictive Coding Account for Chaotic Itinerancy [68.8204255655161]
We show how a recurrent neural network implementing predictive coding can generate neural trajectories similar to chaotic itinerancy in the presence of input noise.
We propose two scenarios generating random and past-independent attractor switching trajectories using our model.
arXiv Detail & Related papers (2021-06-16T16:48:14Z)
- Linear embedding of nonlinear dynamical systems and prospects for efficient quantum algorithms [74.17312533172291]
We describe a method for mapping any finite nonlinear dynamical system to an infinite linear dynamical system (embedding).
We then explore an approach for approximating the resulting infinite linear system with finite linear systems (truncation).
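The embed-then-truncate idea can be sketched with a Carleman-style lifting. The scalar ODE, the truncation order, and the explicit Euler integrator below are our own illustration, not the paper's construction:

```python
import numpy as np

# Sketch of the linear-embedding idea: the nonlinear scalar ODE
# dx/dt = -x^2 is lifted via y_k = x^k, which gives the linear but
# infinite system dy_k/dt = -k * y_{k+1}. Truncating at order N yields
# a finite linear ODE dy/dt = A y that approximates the original flow.

def carleman_matrix(N):
    """Truncated Carleman matrix A for dy/dt = A y, with y_k = x^k."""
    A = np.zeros((N, N))
    for k in range(1, N):       # dy_k/dt = -k * y_{k+1}
        A[k - 1, k] = -k
    return A                     # last row zero: y_{N+1} is dropped

def truncated_flow(x0, t, N=8, steps=1000):
    """Integrate the truncated linear system with explicit Euler."""
    y = np.array([x0 ** k for k in range(1, N + 1)], dtype=float)
    A = carleman_matrix(N)
    dt = t / steps
    for _ in range(steps):
        y = y + dt * (A @ y)
    return y[0]                  # first component approximates x(t)
```

For small initial data the truncation error shrinks rapidly with N, since the dropped component y_{N+1} = x^{N+1} is small; the exact solution x(t) = x0 / (1 + x0 t) can be used to check the approximation.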
arXiv Detail & Related papers (2020-12-12T00:01:10Z)
- Multipole Graph Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data in a structure suitable for neural networks.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.