Graph neural network force fields for adiabatic dynamics of lattice Hamiltonians
- URL: http://arxiv.org/abs/2603.02039v1
- Date: Mon, 02 Mar 2026 16:23:25 GMT
- Title: Graph neural network force fields for adiabatic dynamics of lattice Hamiltonians
- Authors: Yunhao Fan, Gia-Wei Chern
- Abstract summary: We develop a graph neural network (GNN)-based force-field framework for the adiabatic dynamics of lattice Hamiltonians. Trained on exact-diagonalization data, the GNN achieves high force accuracy, strict linear scaling with system size, and direct transferability to large lattices. These results establish GNNs as an elegant and efficient architecture for symmetry-aware, large-scale dynamical simulations of correlated lattice systems.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Scalable and symmetry-consistent force-field models are essential for extending quantum-accurate simulations to large spatiotemporal scales. While descriptor-based neural networks can incorporate lattice symmetries through carefully engineered features, we show that graph neural networks (GNNs) provide a conceptually simpler and more unified alternative in which discrete lattice translation and point-group symmetries are enforced directly through local message passing and weight sharing. We develop a GNN-based force-field framework for the adiabatic dynamics of lattice Hamiltonians and demonstrate it for the semiclassical Holstein model. Trained on exact-diagonalization data, the GNN achieves high force accuracy, strict linear scaling with system size, and direct transferability to large lattices. Enabled by this scalability, we perform large-scale Langevin simulations of charge-density-wave ordering following thermal quenches, revealing dynamical scaling and anomalously slow sub--Allen--Cahn coarsening. These results establish GNNs as an elegant and efficient architecture for symmetry-aware, large-scale dynamical simulations of correlated lattice systems.
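The abstract's central point, that lattice translation and point-group symmetries follow automatically from local message passing with shared weights, can be illustrated with a minimal sketch. This is not the authors' code: the layer widths, the mirror-symmetric neighbor aggregation, and the force readout are illustrative assumptions for a periodic 1D lattice.

```python
import numpy as np

rng = np.random.default_rng(0)
L, d = 16, 8                            # lattice sites, feature width
W_self = rng.normal(0, 0.1, (d, d))
W_nbr = rng.normal(0, 0.1, (d, d))      # shared for left and right neighbor -> mirror symmetry
w_out = rng.normal(0, 0.1, d)

def mp_layer(h):
    """One message-passing step on site features h of shape (L, d).

    The same weights act at every site, so the layer commutes with
    lattice translations by construction.
    """
    nbr = np.roll(h, 1, axis=0) + np.roll(h, -1, axis=0)  # periodic neighbors
    return np.tanh(h @ W_self.T + nbr @ W_nbr.T)

def forces(x):
    """Map scalar site variables x (shape (L,)) to per-site forces."""
    h = np.tanh(np.outer(x, np.ones(d)))  # trivial site embedding
    for _ in range(3):                    # 3 layers -> local receptive field
        h = mp_layer(h)
    return h @ w_out                      # linear per-site readout

x = rng.normal(size=L)
f = forces(x)
f_shift = forces(np.roll(x, 5))
# Translating the input translates the forces: equivariance from weight sharing.
assert np.allclose(np.roll(f, 5), f_shift)
```

Because every operation here is local and site-independent, the cost per step is linear in `L`, and the same weights apply unchanged to a larger lattice, which is the scalability and transferability the abstract describes.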
Related papers
- Equivariant Neural Networks for Force-Field Models of Lattice Systems [0.0]
We introduce a symmetry-preserving framework based on equivariant neural networks (ENNs). Our approach aims to embed the discrete point-group and internal symmetries intrinsic to lattice models directly into the neural-network representation of the force field. The resulting ML-enabled large-scale dynamical simulations faithfully capture mesoscale evolution of the symmetry-breaking phase.
arXiv Detail & Related papers (2026-01-07T17:09:04Z) - Tensor Network Framework for Forecasting Nonlinear and Chaotic Dynamics [1.790605517028706]
We present a tensor network model (TNM) for forecasting nonlinear and chaotic dynamics. We show that the TNM accurately reconstructs short-term trajectories and faithfully captures the attractor geometry.
arXiv Detail & Related papers (2025-11-12T11:49:38Z) - Learning Hamiltonian Dynamics at Scale: A Differential-Geometric Approach [15.500592651570384]
This paper introduces a novel physics-inspired neural network that combines the conservation laws of Hamiltonian mechanics with the scalability of model order reduction. Our experiments demonstrate that RO-HNN provides physically-consistent, stable, and generalizable predictions of complex high-dimensional dynamics.
arXiv Detail & Related papers (2025-09-29T11:36:35Z) - The GINN framework: a stochastic QED correspondence for stability and chaos in deep neural networks [0.0]
We develop a Euclidean field-theoretic approach that maps deep neural networks (DNNs) to quantum electrodynamics (QED). Neural activations and weights are represented by fermionic matter and gauge fields. We validate the theoretical predictions through numerical simulations of standard multilayer perceptrons.
arXiv Detail & Related papers (2025-08-26T11:41:11Z) - Spatiotemporal Graph Learning with Direct Volumetric Information Passing and Feature Enhancement [62.91536661584656]
We propose a dual-module framework, the Cell-embedded and Feature-enhanced Graph Neural Network (CeFeGNN), for spatiotemporal graph learning. We embed learnable cell attributions into the common node-edge message passing process, which better captures the spatial dependency of regional features. Experiments on various PDE systems and one real-world dataset demonstrate that CeFeGNN achieves superior performance compared with other baselines.
arXiv Detail & Related papers (2024-09-26T16:22:08Z) - Neural P$^3$M: A Long-Range Interaction Modeling Enhancer for Geometric GNNs [66.98487644676906]
We introduce Neural P$^3$M, a versatile enhancer of geometric GNNs that expands the scope of their capabilities.
It exhibits flexibility across a wide range of molecular systems and demonstrates remarkable accuracy in predicting energies and forces.
It also achieves an average improvement of 22% on the OE62 dataset while integrating with various architectures.
arXiv Detail & Related papers (2024-09-26T08:16:59Z) - The Role of Fibration Symmetries in Geometric Deep Learning [0.0]
Geometric Deep Learning (GDL) unifies a broad class of machine learning techniques from the perspectives of symmetries.
We propose to relax GDL to allow for local symmetries, specifically fibration symmetries in graphs, to leverage regularities of realistic instances.
We show that GNNs equipped with the inductive bias of fibration symmetries admit a tighter upper bound on their expressive power.
arXiv Detail & Related papers (2024-08-28T16:04:40Z) - Enhancing lattice kinetic schemes for fluid dynamics with Lattice-Equivariant Neural Networks [79.16635054977068]
We present a new class of equivariant neural networks, dubbed Lattice-Equivariant Neural Networks (LENNs).
Our approach develops within a recently introduced framework aimed at learning neural network-based surrogate models of Lattice Boltzmann collision operators.
Our work opens towards practical utilization of machine learning-augmented Lattice Boltzmann CFD in real-world simulations.
arXiv Detail & Related papers (2024-05-22T17:23:15Z) - Message-Passing Neural Quantum States for the Homogeneous Electron Gas [41.94295877935867]
We introduce a message-passing-neural-network-based wave function Ansatz to simulate extended, strongly interacting fermions in continuous space.
We demonstrate its accuracy by simulating the ground state of the homogeneous electron gas in three spatial dimensions.
arXiv Detail & Related papers (2023-05-12T04:12:04Z) - Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z) - Slow semiclassical dynamics of a two-dimensional Hubbard model in disorder-free potentials [77.34726150561087]
We show that introducing harmonic and spin-dependent linear potentials extends the validity of fTWA (the fermionic truncated Wigner approximation) to longer times.
In particular, we focus on a finite two-dimensional system and show that at intermediate linear potential strength, the addition of a harmonic potential and spin dependence of the tilt, results in subdiffusive dynamics.
arXiv Detail & Related papers (2022-10-03T16:51:25Z) - Hyperbolic Variational Graph Neural Network for Modeling Dynamic Graphs [77.33781731432163]
We learn, for the first time, dynamic graph representations in hyperbolic space, with the aim of inferring node representations.
We present a novel Hyperbolic Variational Graph Network, referred to as HVGNN.
In particular, to model the dynamics, we introduce a Temporal GNN (TGNN) based on a theoretically grounded time encoding approach.
arXiv Detail & Related papers (2021-04-06T01:44:15Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.