Neural Hamilton: Can A.I. Understand Hamiltonian Mechanics?
- URL: http://arxiv.org/abs/2410.20951v1
- Date: Mon, 28 Oct 2024 12:10:44 GMT
- Title: Neural Hamilton: Can A.I. Understand Hamiltonian Mechanics?
- Authors: Tae-Geun Kim, Seong Chan Park
- Abstract summary: We propose a novel framework based on neural networks that reformulates classical mechanics as an operator learning problem.
A machine directly maps a potential function to its corresponding trajectory in phase space without solving the Hamilton equations.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We propose a novel framework based on neural networks that reformulates classical mechanics as an operator learning problem. A machine directly maps a potential function to its corresponding trajectory in phase space without solving the Hamilton equations. Most notably, while conventional methods tend to accumulate errors over time through iterative time integration, our approach prevents error propagation. Two newly developed neural network architectures are introduced: VaRONet, which adapts the Variational LSTM sequence-to-sequence model, and MambONet, which leverages the Mamba model for efficient temporal dynamics processing. We tested our approach on various 1D physics problems: harmonic oscillation, double-well potentials, the Morse potential, and other potential models outside the training data. Compared to traditional numerical methods based on the fourth-order Runge-Kutta (RK4) algorithm, our model demonstrates improved computational efficiency and accuracy. Code is available at: https://github.com/Axect/Neural_Hamilton
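As a concrete illustration of the operator-learning framing, here is a minimal PyTorch sketch; a plain MLP stands in for the paper's actual VaRONet/MambONet architectures, and the grid sizes are made-up placeholders. The network maps a potential sampled on a fixed position grid to the full phase-space trajectory in a single forward pass, which is why per-step integration error cannot accumulate.

```python
# Minimal sketch of the operator-learning setup (NOT the paper's VaRONet or
# MambONet): a plain MLP maps a potential V, sampled on a fixed grid of
# positions, directly to the phase-space trajectory (q(t), p(t)) on a fixed
# grid of times -- no iterative time stepping is involved.
import torch
import torch.nn as nn

class PotentialToTrajectory(nn.Module):
    def __init__(self, n_grid: int = 100, n_times: int = 200, hidden: int = 256):
        super().__init__()
        self.n_times = n_times
        self.net = nn.Sequential(
            nn.Linear(n_grid, hidden), nn.GELU(),
            nn.Linear(hidden, hidden), nn.GELU(),
            nn.Linear(hidden, 2 * n_times),  # q(t) and p(t), concatenated
        )

    def forward(self, v_samples: torch.Tensor):
        # v_samples: (batch, n_grid) values of the potential on the q-grid
        out = self.net(v_samples)
        return out[:, : self.n_times], out[:, self.n_times :]  # q(t), p(t)
```

Training would regress the outputs against reference trajectories (e.g. from a high-accuracy integrator); since the whole trajectory comes out of one forward pass, per-step integration error has no way to accumulate.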
Related papers
- Hamiltonian Neural Networks approach to fuzzball geodesics [39.58317527488534]
Hamiltonian Neural Networks (HNNs) are tools that minimize a loss function to solve Hamilton equations of motion.
In this work, we implement several HNNs trained to solve, with high accuracy, the Hamilton equations for a massless probe moving inside a smooth and horizonless geometry known as D1-D5 circular fuzzball.
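For context, a minimal sketch of the generic HNN objective (not the fuzzball-specific setup of this paper): a scalar network H(q, p) is trained so that its automatic-differentiation gradients satisfy Hamilton's equations on observed data.

```python
# Generic HNN sketch: learn a scalar H(q, p) whose autograd gradients
# reproduce Hamilton's equations dq/dt = dH/dp, dp/dt = -dH/dq.
import torch
import torch.nn as nn

class HNN(nn.Module):
    def __init__(self, hidden: int = 128):
        super().__init__()
        self.h = nn.Sequential(
            nn.Linear(2, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, 1),
        )

    def time_derivatives(self, q, p):
        qp = torch.stack([q, p], dim=-1).detach().requires_grad_(True)
        H = self.h(qp).sum()
        dHdq, dHdp = torch.autograd.grad(H, qp, create_graph=True)[0].unbind(-1)
        return dHdp, -dHdq  # (dq/dt, dp/dt) by Hamilton's equations

def hnn_loss(model, q, p, dq_dt, dp_dt):
    # Penalize deviation of the predicted derivatives from observed ones.
    q_dot, p_dot = model.time_derivatives(q, p)
    return ((q_dot - dq_dt) ** 2 + (p_dot - dp_dt) ** 2).mean()
```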
arXiv Detail & Related papers (2025-02-28T09:25:49Z)
- Promises and Pitfalls of Generative Masked Language Modeling: Theoretical Framework and Practical Guidelines [74.42485647685272]
We focus on Generative Masked Language Models (GMLMs).
We train a model to fit conditional probabilities of the data distribution via masking, which are subsequently used as inputs to a Markov Chain to draw samples from the model.
We adapt the T5 model for iteratively-refined parallel decoding, achieving 2-3x speedup in machine translation with minimal sacrifice in quality.
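A toy sketch of the sampling loop described above, with a uniform stand-in for the learned conditionals (a real GMLM would query a trained masked LM such as the adapted T5): repeatedly re-mask positions and resample them, so successive sequences form a Markov chain.

```python
# Toy GMLM-style sampler: the conditional p(x_i | x_{-i}) is a uniform
# placeholder here; a trained masked language model would go in its place.
import random

VOCAB = ["a", "b", "c", "d"]

def conditional_sample(seq: list[str], i: int) -> str:
    # Placeholder for the model's conditional distribution at position i.
    return random.choice(VOCAB)

def gmlm_sample(length: int = 8, steps: int = 100, mask_frac: float = 0.25) -> list[str]:
    seq = [random.choice(VOCAB) for _ in range(length)]
    for _ in range(steps):  # each step is one Markov-chain transition
        masked = random.sample(range(length), max(1, int(mask_frac * length)))
        for i in masked:  # resample masked positions from the conditionals
            seq[i] = conditional_sample(seq, i)
    return seq

print("".join(gmlm_sample()))
```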
arXiv Detail & Related papers (2024-07-22T18:00:00Z)
- Foundational Inference Models for Dynamical Systems [5.549794481031468]
We offer a fresh perspective on the classical problem of imputing missing time series data, whose underlying dynamics are assumed to be determined by ODEs.
We propose a novel supervised learning framework for zero-shot time series imputation, through parametric functions satisfying some (hidden) ODEs.
We empirically demonstrate that one and the same (pretrained) recognition model can perform zero-shot imputation across 63 distinct time series with missing values.
arXiv Detail & Related papers (2024-02-12T11:48:54Z)
- Equivariant Graph Neural Operator for Modeling 3D Dynamics [148.98826858078556]
We propose the Equivariant Graph Neural Operator (EGNO), which directly models dynamics as trajectories instead of just next-step predictions.
EGNO explicitly learns the temporal evolution of 3D dynamics where we formulate the dynamics as a function over time and learn neural operators to approximate it.
Comprehensive experiments in multiple domains, including particle simulations, human motion capture, and molecular dynamics, demonstrate the significantly superior performance of EGNO against existing methods.
arXiv Detail & Related papers (2024-01-19T21:50:32Z)
- Human Trajectory Prediction via Neural Social Physics [63.62824628085961]
Trajectory prediction has been widely pursued in many fields, and many model-based and model-free methods have been explored.
We propose a new method combining both methodologies based on a new Neural Differential Equation model.
Our new model (Neural Social Physics or NSP) is a deep neural network within which we use an explicit physics model with learnable parameters.
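A hedged sketch of the design described above, i.e. an explicit physics term with learnable parameters plus a learned residual inside one network; the simple goal-attraction force used here is illustrative, not NSP's actual physics model.

```python
# Sketch: a differentiable physics term (goal attraction with a learnable
# relaxation time tau) combined with a neural correction, in the spirit of
# NSP. The specific force model is illustrative, not the paper's.
import torch
import torch.nn as nn

class NeuralSocialPhysics(nn.Module):
    def __init__(self, hidden: int = 64):
        super().__init__()
        self.log_tau = nn.Parameter(torch.zeros(1))   # learnable physics parameter
        self.correction = nn.Sequential(              # learned residual force
            nn.Linear(6, hidden), nn.ReLU(), nn.Linear(hidden, 2),
        )

    def acceleration(self, pos, vel, goal):
        # pos, vel, goal: (batch, 2) tensors
        tau = self.log_tau.exp()
        physics = ((goal - pos) - vel) / tau          # relaxation toward the goal
        residual = self.correction(torch.cat([pos, vel, goal], dim=-1))
        return physics + residual
```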
arXiv Detail & Related papers (2022-07-21T12:11:18Z)
- On the balance between the training time and interpretability of neural ODE for time series modelling [77.34726150561087]
The paper shows that modern neural ODEs cannot be reduced to simpler models for time-series modelling applications.
The complexity of neural ODEs is comparable to, or exceeds, that of conventional time-series modelling tools.
We propose a new view on time-series modelling using combined neural networks and an ODE system approach.
arXiv Detail & Related papers (2022-06-07T13:49:40Z)
- Learning Neural Hamiltonian Dynamics: A Methodological Overview [109.40968389896639]
Hamiltonian dynamics endows neural networks with accurate long-term prediction, interpretability, and data-efficient learning.
We systematically survey recently proposed Hamiltonian neural network models, with a special emphasis on methodologies.
arXiv Detail & Related papers (2022-02-28T22:54:39Z)
- Symplectic Learning for Hamiltonian Neural Networks [0.0]
Hamiltonian Neural Networks (HNNs) took a first step towards a unified "gray box" approach.
We exploit the symplectic structure of Hamiltonian systems with a different loss function.
We mathematically guarantee the existence of an exact Hamiltonian function which the HNN can learn.
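One plausible reading of such a symplectic loss (a sketch under that assumption, not necessarily the paper's exact objective): train H(q, p) so that a symplectic Euler step generated from it matches the observed next state, rather than matching time derivatives directly.

```python
# Sketch: loss on a symplectic Euler step generated by a learned H(q, p).
# The step below is exact symplectic Euler for separable H; illustrative only.
import torch

def symplectic_step_loss(H, q, p, q_next, p_next, dt: float):
    # H: callable mapping tensors (q, p) to per-sample energies.
    q = q.detach().clone().requires_grad_(True)
    p = p.detach().clone().requires_grad_(True)
    # Kick: p_new = p - dt * dH/dq at (q, p).
    dHdq = torch.autograd.grad(H(q, p).sum(), q, create_graph=True)[0]
    p_new = p - dt * dHdq
    # Drift: q_new = q + dt * dH/dp at (q, p_new).
    dHdp = torch.autograd.grad(H(q, p_new).sum(), p_new, create_graph=True)[0]
    q_new = q + dt * dHdp
    return ((q_new - q_next) ** 2 + (p_new - p_next) ** 2).mean()
```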
arXiv Detail & Related papers (2021-06-22T13:33:12Z)
- Sparse Symplectically Integrated Neural Networks [15.191984347149667]
We introduce Sparse Symplectically Integrated Neural Networks (SSINNs).
SSINNs are a novel model for learning Hamiltonian dynamical systems from data.
We evaluate SSINNs on four classical Hamiltonian dynamical problems.
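A minimal sketch of the sparse-Hamiltonian idea, assuming H is represented as a sparse linear combination of candidate basis terms with an L1 penalty; the particular basis shown is illustrative, not the paper's.

```python
# Sketch: H(q, p) as a sparse linear combination of candidate basis terms,
# with an L1 penalty encouraging most coefficients to vanish.
import torch
import torch.nn as nn

class SparseHamiltonian(nn.Module):
    def __init__(self):
        super().__init__()
        self.basis = [  # candidate terms; the true H should use only a few
            lambda q, p: p ** 2,
            lambda q, p: q ** 2,
            lambda q, p: q ** 4,
            lambda q, p: torch.cos(q),
        ]
        self.coef = nn.Parameter(torch.zeros(len(self.basis)))

    def forward(self, q, p):
        terms = torch.stack([f(q, p) for f in self.basis], dim=-1)
        return (terms * self.coef).sum(-1)  # per-sample energy

def l1_penalty(model: SparseHamiltonian, lam: float = 1e-3):
    return lam * model.coef.abs().sum()
```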
arXiv Detail & Related papers (2020-06-10T03:33:37Z)
- Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
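A simplified single-cell sketch of this construction (Euler-discretized; the actual LTC formulation and solver differ in details): a linear first-order system whose decay rate and equilibrium are modulated by a nonlinear gate.

```python
# Sketch of a liquid time-constant (LTC) update: a linear first-order system
# whose effective time constant is modulated by a nonlinear gate f(x, u).
import torch
import torch.nn as nn

class LTCCell(nn.Module):
    def __init__(self, n_in: int, n_hidden: int):
        super().__init__()
        self.gate = nn.Linear(n_in + n_hidden, n_hidden)
        self.tau = nn.Parameter(torch.ones(n_hidden))   # base time constants
        self.A = nn.Parameter(torch.zeros(n_hidden))    # gated equilibrium

    def forward(self, x, u, dt: float = 0.1):
        # x: (batch, n_hidden) hidden state; u: (batch, n_in) input
        f = torch.sigmoid(self.gate(torch.cat([u, x], dim=-1)))
        # dx/dt = -(1/tau + f) * x + f * A  (bounded, stable dynamics)
        dxdt = -(1.0 / self.tau + f) * x + f * self.A
        return x + dt * dxdt  # one explicit Euler step
```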
arXiv Detail & Related papers (2020-06-08T09:53:35Z)
- Hamiltonian Simulation Algorithms for Near-Term Quantum Hardware [6.445605125467574]
We develop quantum algorithms for Hamiltonian simulation "one level below" the circuit model.
We analyse the impact of these techniques under the standard error model.
We derive analytic circuit identities for efficiently synthesising multi-qubit evolutions from two-qubit interactions.
arXiv Detail & Related papers (2020-03-15T18:22:02Z)
- Hamiltonian neural networks for solving equations of motion [3.1498833540989413]
We present a Hamiltonian neural network that solves the differential equations that govern dynamical systems.
A symplectic Euler integrator requires two orders of magnitude more evaluation points than the Hamiltonian network to achieve the same order of numerical error.
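For reference, a minimal symplectic Euler integrator for a separable Hamiltonian H(q, p) = p^2/2 + V(q), the kind of baseline the comparison above refers to (the harmonic potential here is just an example):

```python
# Symplectic Euler for H(q, p) = p^2/2 + V(q): kick with V'(q), then drift.
def symplectic_euler(dV, q0: float, p0: float, dt: float, n_steps: int):
    q, p, traj = q0, p0, []
    for _ in range(n_steps):
        p = p - dt * dV(q)   # kick: p_{n+1} = p_n - dt * V'(q_n)
        q = q + dt * p       # drift: q_{n+1} = q_n + dt * p_{n+1}
        traj.append((q, p))
    return traj

# Example: harmonic oscillator, V(q) = q^2/2, so V'(q) = q.
orbit = symplectic_euler(dV=lambda q: q, q0=1.0, p0=0.0, dt=0.01, n_steps=1000)
```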
arXiv Detail & Related papers (2020-01-29T21:48:35Z)