Learning Hamiltonian Dynamics at Scale: A Differential-Geometric Approach
- URL: http://arxiv.org/abs/2509.24627v1
- Date: Mon, 29 Sep 2025 11:36:35 GMT
- Title: Learning Hamiltonian Dynamics at Scale: A Differential-Geometric Approach
- Authors: Katharina Friedl, Noémie Jaquier, Mika Liao, Danica Kragic
- Abstract summary: This paper introduces a novel physics-inspired neural network that combines the conservation laws of Hamiltonian mechanics with the scalability of model order reduction. Our experiments demonstrate that RO-HNN provides physically-consistent, stable, and generalizable predictions of complex high-dimensional dynamics.
- Score: 15.500592651570384
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: By embedding physical intuition, network architectures enforce fundamental properties, such as energy conservation laws, leading to plausible predictions. Yet, scaling these models to intrinsically high-dimensional systems remains a significant challenge. This paper introduces Geometric Reduced-order Hamiltonian Neural Network (RO-HNN), a novel physics-inspired neural network that combines the conservation laws of Hamiltonian mechanics with the scalability of model order reduction. RO-HNN is built on two core components: a novel geometrically-constrained symplectic autoencoder that learns a low-dimensional, structure-preserving symplectic submanifold, and a geometric Hamiltonian neural network that models the dynamics on the submanifold. Our experiments demonstrate that RO-HNN provides physically-consistent, stable, and generalizable predictions of complex high-dimensional dynamics, thereby effectively extending the scope of Hamiltonian neural networks to high-dimensional physical systems.
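The core prior shared by HNN-style models such as RO-HNN is that trajectories follow the symplectic gradient of a learned energy function. The following is a minimal illustrative sketch of that idea, not the paper's implementation: the quadratic `hamiltonian` below stands in for a learned network, and in RO-HNN it would be evaluated on latent coordinates produced by the symplectic autoencoder.

```python
import numpy as np

# Illustrative stand-in for a learned Hamiltonian network: here a fixed
# quadratic energy H(q, p) = p^2/2 + q^2/2 (a harmonic oscillator).
def hamiltonian(q, p):
    return 0.5 * p**2 + 0.5 * q**2

def symplectic_gradient(q, p, eps=1e-6):
    # Hamilton's equations: dq/dt = dH/dp, dp/dt = -dH/dq.
    # Derivatives are taken by central finite differences; an actual
    # HNN would use automatic differentiation instead.
    dH_dq = (hamiltonian(q + eps, p) - hamiltonian(q - eps, p)) / (2 * eps)
    dH_dp = (hamiltonian(q, p + eps) - hamiltonian(q, p - eps)) / (2 * eps)
    return dH_dp, -dH_dq

def rollout(q, p, dt=0.01, steps=1000):
    # Semi-implicit (symplectic) Euler: update p first, then q with the
    # new p. This keeps the energy error bounded over long horizons,
    # which is the "physically-consistent, stable" behavior the
    # abstract refers to.
    for _ in range(steps):
        _, dp = symplectic_gradient(q, p)
        p = p + dt * dp
        dq, _ = symplectic_gradient(q, p)
        q = q + dt * dq
    return q, p

q0, p0 = 1.0, 0.0
qT, pT = rollout(q0, p0)
energy_drift = abs(hamiltonian(qT, pT) - hamiltonian(q0, p0))
print(energy_drift)
```

Because the vector field is derived from a single scalar energy rather than learned freely, the rollout conserves that energy up to integrator error, which is what distinguishes this family of models from unconstrained dynamics networks.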
Related papers
- Reduced-order modeling of Hamiltonian dynamics based on symplectic neural networks [9.050817345496709]
We introduce a novel data-driven symplectic reduced-order modeling (ROM) framework for high-dimensional Hamiltonian systems. The ROM unifies latent-space discovery and dynamics learning within a single, end-to-end neural architecture.
arXiv Detail & Related papers (2025-08-16T05:09:28Z) - GeoHNNs: Geometric Hamiltonian Neural Networks [3.0846824529023382]
We introduce Geometric Hamiltonian Neural Networks (GeoHNN), a framework that learns dynamics by explicitly encoding the geometric priors inherent to physical laws. We demonstrate through experiments on systems ranging from coupled oscillators to high-dimensional deformable objects that GeoHNN significantly outperforms existing models.
arXiv Detail & Related papers (2025-07-21T14:42:39Z) - Langevin Flows for Modeling Neural Latent Dynamics [81.81271685018284]
We introduce LangevinFlow, a sequential Variational Auto-Encoder where the time evolution of latent variables is governed by the underdamped Langevin equation. Our approach incorporates physical priors -- such as inertia, damping, a learned potential function, and forces -- to represent both autonomous and non-autonomous processes in neural systems. Our method outperforms state-of-the-art baselines on synthetic neural populations generated by a Lorenz attractor.
arXiv Detail & Related papers (2025-07-15T17:57:48Z) - Random Matrix Theory for Deep Learning: Beyond Eigenvalues of Linear Models [51.85815025140659]
Modern Machine Learning (ML) and Deep Neural Networks (DNNs) often operate on high-dimensional data. In particular, the proportional regime where the data dimension, sample size, and number of model parameters are all large gives rise to novel and sometimes counterintuitive behaviors. This paper extends traditional Random Matrix Theory (RMT) beyond eigenvalue-based analysis of linear models to address the challenges posed by nonlinear ML models.
arXiv Detail & Related papers (2025-06-16T06:54:08Z) - Generalized Factor Neural Network Model for High-dimensional Regression [50.554377879576066]
We tackle the challenges of modeling high-dimensional data sets with latent low-dimensional structures hidden within complex, non-linear, and noisy relationships. Our approach enables a seamless integration of concepts from non-parametric regression, factor models, and neural networks for high-dimensional regression.
arXiv Detail & Related papers (2025-02-16T23:13:55Z) - Recurrent convolutional neural networks for modeling non-adiabatic dynamics of quantum-classical systems [1.23088383881821]
We present an RNN model based on convolutional neural networks for modeling the non-adiabatic dynamics of hybrid quantum-classical systems. We demonstrate that the PARC-CNN architecture can effectively learn the statistical behavior of the Holstein model under deep-quench conditions.
arXiv Detail & Related papers (2024-12-09T16:23:25Z) - A Riemannian Framework for Learning Reduced-order Lagrangian Dynamics [18.151022395233152]
We propose a novel geometric network architecture to learn physically-consistent reduced-order dynamic parameters. Our approach enables accurate long-term predictions of the high-dimensional dynamics of rigid and deformable systems.
arXiv Detail & Related papers (2024-10-24T15:53:21Z) - Applications of Machine Learning to Modelling and Analysing Dynamical Systems [0.0]
We propose an architecture which combines existing Hamiltonian Neural Network structures into Adaptable Symplectic Recurrent Neural Networks.
This architecture is found to significantly outperform previously proposed neural networks when predicting Hamiltonian dynamics.
We show that this method works efficiently for single parameter potentials and provides accurate predictions even over long periods of time.
arXiv Detail & Related papers (2023-07-22T19:04:17Z) - ConCerNet: A Contrastive Learning Based Framework for Automated Conservation Law Discovery and Trustworthy Dynamical System Prediction [82.81767856234956]
This paper proposes a new learning framework named ConCerNet to improve the trustworthiness of DNN-based dynamics modeling.
We show that our method consistently outperforms the baseline neural networks in both coordinate error and conservation metrics.
arXiv Detail & Related papers (2023-02-11T21:07:30Z) - Learning Trajectories of Hamiltonian Systems with Neural Networks [81.38804205212425]
We propose to enhance Hamiltonian neural networks with an estimation of a continuous-time trajectory of the modeled system.
We demonstrate that the proposed integration scheme works well for HNNs, especially with low sampling rates, noisy and irregular observations.
arXiv Detail & Related papers (2022-04-11T13:25:45Z) - Hyperbolic Neural Networks++ [66.16106727715061]
We generalize the fundamental components of neural networks in a single hyperbolic geometry model, namely, the Poincaré ball model.
Experiments show the superior parameter efficiency of our methods compared to conventional hyperbolic components, as well as greater stability and performance than their Euclidean counterparts.
arXiv Detail & Related papers (2020-06-15T08:23:20Z)
This list is automatically generated from the titles and abstracts of the papers in this site.