Learning Hamiltonian neural Koopman operator and simultaneously sustaining and discovering conservation law
- URL: http://arxiv.org/abs/2406.02154v1
- Date: Tue, 4 Jun 2024 09:42:34 GMT
- Title: Learning Hamiltonian neural Koopman operator and simultaneously sustaining and discovering conservation law
- Authors: Jingdong Zhang, Qunxi Zhu, Wei Lin
- Abstract summary: We propose the Hamiltonian Neural Koopman Operator (HNKO), which integrates knowledge from mathematical physics into learning the Koopman operator.
We demonstrate the superior performance of the HNKO and its extension on a number of representative physical systems.
Our results suggest that appropriately feeding prior knowledge of the underlying system and the relevant mathematical theory into the learning framework can reinforce the capability of machine learning to solve physical problems.
- Score: 13.310284460452918
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Accurately identifying and predicting dynamics from noisy observational data is of paramount importance, yet it remains a major challenge. Here, for Hamiltonian mechanics, we propose the Hamiltonian Neural Koopman Operator (HNKO), which integrates knowledge from mathematical physics into learning the Koopman operator so that it automatically sustains, and can even discover, conservation laws. We demonstrate the superior performance of the HNKO and its extension on a number of representative physical systems, including systems with hundreds or thousands of degrees of freedom. Our results suggest that appropriately feeding prior knowledge of the underlying system and the relevant mathematical theory into the learning framework can reinforce the capability of machine learning to solve physical problems.
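As a reading aid, the core idea can be sketched in a few lines of PyTorch: lift the observed state with an autoencoder and constrain the learned Koopman matrix to be orthogonal (here via the matrix exponential of a skew-symmetric matrix), so that the norm of the lifted state, standing in for a conserved quantity, is preserved exactly under every predicted step. The network sizes, the particular orthogonal parameterization, and the loss weighting below are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of an orthogonality-constrained Koopman learner (assumed
# details; not the HNKO reference implementation).
import torch
import torch.nn as nn


class NeuralKoopmanSketch(nn.Module):
    def __init__(self, state_dim=2, lift_dim=16):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(state_dim, 64), nn.Tanh(),
                                     nn.Linear(64, lift_dim))
        self.decoder = nn.Sequential(nn.Linear(lift_dim, 64), nn.Tanh(),
                                     nn.Linear(64, state_dim))
        # Unconstrained parameters; the Koopman matrix is derived from them.
        self.raw = nn.Parameter(0.01 * torch.randn(lift_dim, lift_dim))

    def koopman(self):
        # exp(S) with S skew-symmetric is orthogonal, hence norm-preserving:
        # ||K z|| = ||z|| for every lifted state z.
        skew = self.raw - self.raw.T
        return torch.matrix_exp(skew)

    def forward(self, x_t):
        z_t = self.encoder(x_t)
        z_next = z_t @ self.koopman().T       # one Koopman step in lifted space
        return self.decoder(z_next), self.decoder(z_t)


def training_loss(model, x_t, x_next):
    pred_next, recon = model(x_t)
    # Reconstruction keeps the autoencoder faithful; prediction fits the dynamics.
    return (nn.functional.mse_loss(recon, x_t)
            + nn.functional.mse_loss(pred_next, x_next))
```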
Related papers
- Physics-informed neural networks viewpoint for solving the Dyson-Schwinger equations of quantum electrodynamics [0.0]
Physics-informed neural networks (PINNs) are employed to solve the Dyson--Schwinger equations of quantum electrodynamics (QED) in Euclidean space.
By inserting the integral equation directly into the loss function, our PINN framework enables a single neural network to learn a continuous and differentiable representation of the mass function over a spectrum of momenta.
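Concretely, "inserting the integral equation into the loss" amounts to penalizing the residual of that equation at sampled momenta. The sketch below illustrates the pattern with a gap-equation-like form M(p) = m0 + integral over q of K(p, q) M(q) / (q^2 + M(q)^2); the kernel K, the bare mass m0, the momentum range, and the quadrature rule are placeholders rather than the QED Dyson-Schwinger kernel actually used in the paper.

```python
# Hedged sketch of a PINN-style loss for an integral (gap) equation; kernel
# and constants are placeholders, not the Dyson-Schwinger kernel of the paper.
import torch
import torch.nn as nn

mass_net = nn.Sequential(nn.Linear(1, 64), nn.Tanh(),
                         nn.Linear(64, 64), nn.Tanh(),
                         nn.Linear(64, 1))          # p -> M(p)

def integral_residual(p, q_nodes, q_weights, m0=0.01):
    # Residual of M(p) = m0 + integral_q K(p, q) * M(q) / (q^2 + M(q)^2).
    M_p = mass_net(p)                               # (B, 1)
    M_q = mass_net(q_nodes).T                       # (1, N)
    kernel = 1.0 / (p ** 2 + q_nodes.T ** 2 + 1.0)  # (B, N) placeholder kernel
    integrand = M_q / (q_nodes.T ** 2 + M_q ** 2)   # (1, N), broadcast over batch
    integral = (kernel * integrand) @ q_weights     # (B, 1) quadrature sum
    return M_p - m0 - integral

# Fixed quadrature grid for the loop momentum and random collocation momenta.
q_nodes = torch.linspace(1e-3, 10.0, 200).unsqueeze(1)
q_weights = torch.full((200, 1), 10.0 / 200)
p = 10.0 * torch.rand(128, 1)
loss = integral_residual(p, q_nodes, q_weights).pow(2).mean()
loss.backward()   # gradients flow into mass_net for an optimizer step
```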
arXiv Detail & Related papers (2024-11-04T15:36:17Z)
- Fourier Neural Operators for Learning Dynamics in Quantum Spin Systems [77.88054335119074]
We use FNOs to model the evolution of random quantum spin systems.
We apply FNOs to a compact set of Hamiltonian observables instead of the entire $2^n$-dimensional quantum wavefunction.
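For context, the basic building block of an FNO is a spectral convolution: transform the discretized signal to frequency space, apply a learned linear map to the lowest modes, and transform back. The self-contained 1D layer below is a generic sketch of that block (channel and mode counts are arbitrary choices), not the paper's code; in the setting above it would act on trajectories of the chosen observables rather than on the wavefunction.

```python
# Generic 1D spectral convolution in the FNO spirit (illustrative sketch).
import torch
import torch.nn as nn

class SpectralConv1d(nn.Module):
    def __init__(self, channels=8, modes=12):
        super().__init__()
        self.modes = modes            # number of low Fourier modes kept
        scale = 1.0 / channels
        self.weight = nn.Parameter(
            scale * torch.randn(channels, channels, modes, dtype=torch.cfloat))

    def forward(self, x):             # x: (batch, channels, grid), grid//2+1 >= modes
        x_ft = torch.fft.rfft(x)                          # to frequency space
        out_ft = torch.zeros_like(x_ft)
        out_ft[:, :, :self.modes] = torch.einsum(         # mix channels per kept mode
            "bim,iom->bom", x_ft[:, :, :self.modes], self.weight)
        return torch.fft.irfft(out_ft, n=x.size(-1))      # back to the grid

# Example: one layer mapping an 8-channel signal on a 64-point grid to itself.
layer = SpectralConv1d()
y = layer(torch.randn(4, 8, 64))      # y has shape (4, 8, 64)
```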
arXiv Detail & Related papers (2024-09-05T07:18:09Z)
- Addressing the Non-perturbative Regime of the Quantum Anharmonic Oscillator by Physics-Informed Neural Networks [0.9374652839580183]
In the quantum realm, this approach paves the way to solving the Schroedinger equation for non-integrable systems.
We investigate systems with real and imaginary frequency, laying the foundation for novel numerical methods to tackle problems emerging in quantum field theory.
arXiv Detail & Related papers (2024-05-22T08:34:52Z)
- ShadowNet for Data-Centric Quantum System Learning [188.683909185536]
We propose a data-centric learning paradigm combining the strength of neural-network protocols and classical shadows.
Capitalizing on the generalization power of neural networks, this paradigm can be trained offline and excel at predicting previously unseen systems.
We present the instantiation of our paradigm in quantum state tomography and direct fidelity estimation tasks and conduct numerical analysis up to 60 qubits.
arXiv Detail & Related papers (2023-08-22T09:11:53Z)
- ConCerNet: A Contrastive Learning Based Framework for Automated Conservation Law Discovery and Trustworthy Dynamical System Prediction [82.81767856234956]
This paper proposes a new learning framework named ConCerNet to improve the trustworthiness of DNN-based dynamics modeling.
We show that our method consistently outperforms the baseline neural networks in both coordinate error and conservation metrics.
arXiv Detail & Related papers (2023-02-11T21:07:30Z)
- On Robust Numerical Solver for ODE via Self-Attention Mechanism [82.95493796476767]
We explore training efficient and robust AI-enhanced numerical solvers with a small data size by mitigating intrinsic noise disturbances.
We first analyze the ability of the self-attention mechanism to regulate noise in supervised learning and then propose a simple-yet-effective numerical solver, Attr, which introduces an additive self-attention mechanism to the numerical solution of differential equations.
arXiv Detail & Related papers (2023-02-05T01:39:21Z)
- Learning dynamical systems: an example from open quantum system dynamics [0.0]
We study the dynamics of a small spin chain coupled with dephasing gates.
We show how Koopman operator learning is an approach to efficiently learn not only the evolution of the density matrix, but also that of every physical observable associated with the system.
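The generic recipe behind such Koopman-operator learning can be stated very compactly: collect consecutive snapshots of the (possibly lifted) state and fit the best linear map between them. The NumPy sketch below shows that plain least-squares, DMD-style step; the choice of state representation (e.g. a vectorized density matrix or a stack of observable expectation values) and any dictionary of lifting functions are assumptions left open here, and the paper's treatment of dissipation is not reproduced.

```python
# Plain least-squares (DMD-style) fit of a finite-dimensional Koopman matrix.
import numpy as np

def fit_koopman(snapshots):
    """snapshots: (T, d) array of consecutive states x_0, ..., x_{T-1}."""
    X, Y = snapshots[:-1].T, snapshots[1:].T   # (d, T-1) data and shifted data
    return Y @ np.linalg.pinv(X)               # K minimizing ||Y - K X||_F

def rollout(K, x0, steps):
    """Iterate the learned linear map to predict a trajectory."""
    traj = [np.asarray(x0)]
    for _ in range(steps):
        traj.append(K @ traj[-1])
    return np.stack(traj)
```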
arXiv Detail & Related papers (2022-11-12T14:36:13Z)
- Constants of motion network [0.0]
We present a neural network that can simultaneously learn the dynamics of the system and the constants of motion from data.
By exploiting the discovered constants of motion, it produces better predictions of the dynamics.
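The summary above does not spell out the mechanism, but one natural way to exploit discovered constants of motion is to project a learned vector field onto the directions along which the learned constants do not change, so that their time derivatives vanish by construction. The sketch below illustrates that projection with arbitrary network sizes; it is an assumed reading of the idea, not the paper's exact formulation.

```python
# Hedged sketch: learned dynamics projected onto the null space of the learned
# constants' gradients (assumed mechanism, illustrative sizes).
import torch
import torch.nn as nn

state_dim, n_constants = 4, 1
f_net = nn.Sequential(nn.Linear(state_dim, 64), nn.Tanh(), nn.Linear(64, state_dim))
c_net = nn.Sequential(nn.Linear(state_dim, 64), nn.Tanh(), nn.Linear(64, n_constants))

def constrained_field(x):
    # x: (batch, state_dim); returns dx/dt with d(constants)/dt == 0.
    x = x.detach().requires_grad_(True)
    c = c_net(x)
    grads = [torch.autograd.grad(c[:, i].sum(), x, create_graph=True)[0]
             for i in range(n_constants)]
    J = torch.stack(grads, dim=1)                       # (batch, k, d) Jacobian of constants
    f = f_net(x).unsqueeze(-1)                          # (batch, d, 1) raw vector field
    gram = J @ J.transpose(1, 2) + 1e-6 * torch.eye(n_constants)   # (batch, k, k)
    coef = torch.linalg.solve(gram, J @ f)              # least-squares coefficients
    return (f - J.transpose(1, 2) @ coef).squeeze(-1)   # J @ result is (near) zero

field = constrained_field(torch.randn(16, state_dim))   # shape (16, 4)
```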
arXiv Detail & Related papers (2022-08-22T15:07:48Z)
- Learning Neural Hamiltonian Dynamics: A Methodological Overview [109.40968389896639]
Hamiltonian dynamics endows neural networks with accurate long-term prediction, interpretability, and data-efficient learning.
We systematically survey recently proposed Hamiltonian neural network models, with a special emphasis on methodologies.
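The common thread of the surveyed models fits in a few lines: a network outputs a scalar Hamiltonian H(q, p) and the vector field is its symplectic gradient (dq/dt, dp/dt) = (dH/dp, -dH/dq), so the learned dynamics conserve the learned H by construction. The sketch below is that generic pattern with illustrative sizes, not any single paper's architecture.

```python
# Generic Hamiltonian neural network vector field (illustrative sketch).
import torch
import torch.nn as nn

h_net = nn.Sequential(nn.Linear(2, 64), nn.Tanh(), nn.Linear(64, 1))   # (q, p) -> H

def hamiltonian_field(x):
    # x: (batch, 2) with columns (q, p); returns (dq/dt, dp/dt) = (dH/dp, -dH/dq).
    x = x.detach().requires_grad_(True)
    H = h_net(x).sum()
    dH = torch.autograd.grad(H, x, create_graph=True)[0]
    dHdq, dHdp = dH[:, :1], dH[:, 1:]
    return torch.cat([dHdp, -dHdq], dim=1)

# Training regresses hamiltonian_field(x) onto observed time derivatives of (q, p).
```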
arXiv Detail & Related papers (2022-02-28T22:54:39Z)
- Robust and Efficient Hamiltonian Learning [2.121963121603413]
We present a robust and efficient Hamiltonian learning method that, under mild assumptions, circumvents the limitations of previous approaches.
The proposed method can efficiently learn any Hamiltonian that is sparse on the Pauli basis using only short-time dynamics and local operations.
We numerically test the scaling and the estimation accuracy of the method for transverse-field Ising Hamiltonians with random interaction strengths and for molecular Hamiltonians.
arXiv Detail & Related papers (2022-01-01T13:48:15Z)
- A Free Lunch from the Noise: Provable and Practical Exploration for Representation Learning [55.048010996144036]
We show that, under some noise assumption, the linear spectral features of the corresponding Markov transition operator can be obtained in closed form for free.
We propose Spectral Dynamics Embedding (SPEDE), which breaks the trade-off and completes optimistic exploration for representation learning by exploiting the structure of the noise.
arXiv Detail & Related papers (2021-11-22T19:24:57Z)
- Symplectic Learning for Hamiltonian Neural Networks [0.0]
Hamiltonian Neural Networks (HNNs) took a first step towards a unified "gray box" approach.
We exploit the symplectic structure of Hamiltonian systems with a different loss function.
We mathematically guarantee the existence of an exact Hamiltonian function which the HNN can learn.
arXiv Detail & Related papers (2021-06-22T13:33:12Z)
- Measuring and modeling the motor system with machine learning [117.44028458220427]
The utility of machine learning for understanding the motor system promises a revolution in how data are collected, measured, and analyzed.
We discuss the growing use of machine learning: from pose estimation, kinematic analyses, dimensionality reduction, and closed-loop feedback, to its use in understanding neural correlates and untangling sensorimotor systems.
arXiv Detail & Related papers (2021-03-22T12:42:16Z)