GeoHNNs: Geometric Hamiltonian Neural Networks
- URL: http://arxiv.org/abs/2507.15678v1
- Date: Mon, 21 Jul 2025 14:42:39 GMT
- Title: GeoHNNs: Geometric Hamiltonian Neural Networks
- Authors: Amine Mohamed Aboussalah, Abdessalam Ed-dib
- Abstract summary: We introduce Geometric Hamiltonian Neural Networks (GeoHNN), a framework that learns dynamics by explicitly encoding the geometric priors inherent to physical laws. We demonstrate through experiments on systems ranging from coupled oscillators to high-dimensional deformable objects that GeoHNN significantly outperforms existing models.
- Score: 3.0846824529023382
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The fundamental laws of physics are intrinsically geometric, dictating the evolution of systems through principles of symmetry and conservation. While modern machine learning offers powerful tools for modeling complex dynamics from data, common methods often ignore this underlying geometric fabric. Physics-informed neural networks, for instance, can violate fundamental physical principles, leading to predictions that are unstable over long periods, particularly for high-dimensional and chaotic systems. Here, we introduce \textit{Geometric Hamiltonian Neural Networks (GeoHNN)}, a framework that learns dynamics by explicitly encoding the geometric priors inherent to physical laws. Our approach enforces two fundamental structures: the Riemannian geometry of inertia, by parameterizing inertia matrices in their natural mathematical space of symmetric positive-definite matrices, and the symplectic geometry of phase space, using a constrained autoencoder to ensure the preservation of phase space volume in a reduced latent space. We demonstrate through experiments on systems ranging from coupled oscillators to high-dimensional deformable objects that GeoHNN significantly outperforms existing models. It achieves superior long-term stability, accuracy, and energy conservation, confirming that embedding the geometry of physics is not just a theoretical appeal but a practical necessity for creating robust and generalizable models of the physical world.
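The abstract names two geometric priors that are easy to illustrate in isolation: keeping the learned inertia matrix symmetric positive-definite by construction, and deriving the phase-space flow from a learned Hamiltonian through the canonical symplectic form. The sketch below is only a minimal illustration in PyTorch under those assumptions; `InertiaNet`, `hamiltonian_vector_field`, and all hyperparameters are hypothetical names, not the authors' implementation, and the constrained symplectic autoencoder for model reduction is omitted.

```python
import torch
import torch.nn as nn

class InertiaNet(nn.Module):
    """Outputs a symmetric positive-definite inertia matrix M(q) by
    parameterizing its Cholesky factor, so positive-definiteness holds
    by construction rather than via a penalty term."""
    def __init__(self, q_dim, hidden=64, eps=1e-4):
        super().__init__()
        self.q_dim, self.eps = q_dim, eps
        self.net = nn.Sequential(
            nn.Linear(q_dim, hidden), nn.Tanh(),
            nn.Linear(hidden, q_dim * q_dim),
        )

    def forward(self, q):                            # q: (batch, q_dim)
        L = self.net(q).view(-1, self.q_dim, self.q_dim)
        L = torch.tril(L)                            # lower-triangular factor
        M = L @ L.transpose(-1, -2)                  # symmetric PSD by construction
        return M + self.eps * torch.eye(self.q_dim, device=q.device)  # strictly SPD

def hamiltonian_vector_field(H, z):
    """Canonical phase-space flow dz/dt = J * grad H(z) with z = (q, p);
    the symplectic matrix J encodes the phase-space geometry."""
    z = z.detach().requires_grad_(True)
    dH = torch.autograd.grad(H(z).sum(), z, create_graph=True)[0]
    n = z.shape[-1] // 2
    dHdq, dHdp = dH[..., :n], dH[..., n:]
    return torch.cat([dHdp, -dHdq], dim=-1)          # (dq/dt, dp/dt)
```

A standard Hamiltonian-neural-network training loop would regress this vector field onto observed time derivatives, or roll it out with a symplectic integrator; the constrained autoencoder described in the abstract, which reduces dimension while preserving phase-space volume, is not shown here.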
Related papers
- Learning Physical Systems: Symplectification via Gauge Fixing in Dirac Structures [8.633430288397376]
We introduce Presymplectification Networks (PSNs), the first framework to learn the symplectification lift via Dirac structures.
Our architecture combines a recurrent encoder with a flow-matching objective to learn the augmented phase-space dynamics end-to-end.
We then attach a lightweight Symplectic Network (SympNet) to forecast constrained trajectories while preserving energy, momentum, and constraint satisfaction.
arXiv Detail & Related papers (2025-06-23T16:23:37Z)
- Random Matrix Theory for Deep Learning: Beyond Eigenvalues of Linear Models [51.85815025140659]
Modern Machine Learning (ML) and Deep Neural Networks (DNNs) often operate on high-dimensional data.
In particular, the proportional regime where the data dimension, sample size, and number of model parameters are all large gives rise to novel and sometimes counterintuitive behaviors.
This paper extends traditional Random Matrix Theory (RMT) beyond eigenvalue-based analysis of linear models to address the challenges posed by nonlinear ML models.
arXiv Detail & Related papers (2025-06-16T06:54:08Z)
- Geometric Principles for Machine Learning of Dynamical Systems [0.0]
This paper proposes leveraging structure-rich geometric spaces for machine learning to achieve structural generalization.
We illustrate this view through the machine learning of linear time-invariant dynamical systems.
arXiv Detail & Related papers (2025-02-19T17:28:40Z)
- Conservation-informed Graph Learning for Spatiotemporal Dynamics Prediction [84.26340606752763]
In this paper, we introduce the conservation-informed GNN (CiGNN), an end-to-end explainable learning framework.
The network is designed to conform to the general conservation law via symmetry, where conservative and non-conservative information passes over a multiscale space, advanced by a latent temporal marching strategy.
Results demonstrate that CiGNN exhibits remarkable accuracy and generalizability, and is readily applicable to learning the prediction of various spatiotemporal dynamics.
arXiv Detail & Related papers (2024-12-30T13:55:59Z)
- Geometric Trajectory Diffusion Models [58.853975433383326]
Generative models have shown great promise in generating 3D geometric systems.
Existing approaches only operate on static structures, neglecting the fact that physical systems are always dynamic in nature.
We propose geometric trajectory diffusion models (GeoTDM), the first diffusion model for modeling the temporal distribution of 3D geometric trajectories.
arXiv Detail & Related papers (2024-10-16T20:36:41Z)
- Form-Finding and Physical Property Predictions of Tensegrity Structures Using Deep Neural Networks [39.19016806159609]
We develop a deep neural network (DNN) approach to predict the geometric configurations and physical properties of tensegrity structures.
For validation, we analyze three tensegrity structures, including a tensegrity D-bar, prism, and lander.
arXiv Detail & Related papers (2024-06-15T16:39:53Z)
- Geometry-aware framework for deep energy method: an application to structural mechanics with hyperelastic materials [2.271910267215261]
We introduce a physics-informed framework named the Geometry-Aware Deep Energy Method (GADEM) for solving structural mechanics problems.
Different ways to represent the geometric information and to encode the geometric latent vectors are investigated in this work.
We present some applications of GADEM to solve solid mechanics problems, including a loading simulation of a toy tire.
arXiv Detail & Related papers (2024-05-06T12:47:16Z)
- A quantum inspired neural network for geometric modeling [14.214656118952178]
We introduce an innovative equivariant Matrix Product State (MPS)-based message-passing strategy.
Our method effectively models complex many-body relationships, suppressing mean-field approximations.
It seamlessly replaces the standard message-passing and layer-aggregation modules intrinsic to geometric GNNs.
arXiv Detail & Related papers (2024-01-03T15:59:35Z)
- A Hitchhiker's Guide to Geometric GNNs for 3D Atomic Systems [87.30652640973317]
Recent advances in computational modelling of atomic systems represent them as geometric graphs with atoms embedded as nodes in 3D Euclidean space.
Geometric Graph Neural Networks have emerged as the preferred machine learning architecture powering applications ranging from protein structure prediction to molecular simulations and material generation.
This paper provides a comprehensive and self-contained overview of the field of Geometric GNNs for 3D atomic systems.
arXiv Detail & Related papers (2023-12-12T18:44:19Z)
- PAC-NeRF: Physics Augmented Continuum Neural Radiance Fields for Geometry-Agnostic System Identification [64.61198351207752]
Existing approaches to system identification (estimating the physical parameters of an object) from videos assume known object geometries.
In this work, we aim to identify parameters characterizing a physical system from a set of multi-view videos without any assumption on object geometry or topology.
We propose "Physics Augmented Continuum Neural Radiance Fields" (PAC-NeRF), to estimate both the unknown geometry and physical parameters of highly dynamic objects from multi-view videos.
arXiv Detail & Related papers (2023-03-09T18:59:50Z)
- Hyperbolic Neural Networks++ [66.16106727715061]
We generalize the fundamental components of neural networks in a single hyperbolic geometry model, namely, the Poincaré ball model.
Experiments show the superior parameter efficiency of our methods compared to conventional hyperbolic components, as well as greater stability and better performance than their Euclidean counterparts.
arXiv Detail & Related papers (2020-06-15T08:23:20Z)
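The Hyperbolic Neural Networks++ entry above builds its layers on the Poincaré ball model; the basic operation behind such layers is Möbius addition. Below is a minimal, illustrative sketch for the unit ball (curvature -1); the function name and the omission of a curvature parameter are simplifying assumptions, not the paper's exact formulation.

```python
import torch

def mobius_add(x, y, eps=1e-7):
    """Möbius addition x (+) y of points inside the unit Poincaré ball."""
    xy = (x * y).sum(dim=-1, keepdim=True)   # inner product <x, y>
    x2 = (x * x).sum(dim=-1, keepdim=True)   # squared norm ||x||^2
    y2 = (y * y).sum(dim=-1, keepdim=True)   # squared norm ||y||^2
    num = (1 + 2 * xy + y2) * x + (1 - x2) * y
    den = 1 + 2 * xy + x2 * y2
    return num / den.clamp_min(eps)          # stays inside the ball
```

Hyperbolic layers typically replace Euclidean affine maps with a Möbius matrix-vector product followed by this addition; the paper generalizes such components and, per its summary, improves their parameter efficiency and stability.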
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences of its use.