Graph Attention Hamiltonian Neural Networks: A Lattice System Analysis Model Based on Structural Learning
- URL: http://arxiv.org/abs/2412.10821v1
- Date: Sat, 14 Dec 2024 13:03:15 GMT
- Title: Graph Attention Hamiltonian Neural Networks: A Lattice System Analysis Model Based on Structural Learning
- Authors: Ru Geng, Yixian Gao, Jian Zu, Hong-Kun Zhang
- Abstract summary: We propose Graph Attention Hamiltonian Neural Network (GAHN), a neural network method that can understand the underlying structure of lattice Hamiltonian systems.
We can determine which particles in the system interact with each other, the proportion of interactions between different particles, and whether the potential energy of interactions between particles exhibits even symmetry or not.
The obtained structure helps the neural network model to continue predicting the trajectory of the system and further understand the dynamic properties of the system.
- Score: 0.0
- Abstract: A deep understanding of the intricate interactions between particles within a system is a key approach to revealing the essential characteristics of the system, whether it is an in-depth analysis of molecular properties in the field of chemistry or the design of new materials for specific performance requirements in materials science. To this end, we propose Graph Attention Hamiltonian Neural Network (GAHN), a neural network method that can understand the underlying structure of lattice Hamiltonian systems solely through the dynamic trajectories of particles. We can determine which particles in the system interact with each other, the proportion of interactions between different particles, and whether the potential energy of interactions between particles exhibits even symmetry or not. The obtained structure helps the neural network model to continue predicting the trajectory of the system and further understand the dynamic properties of the system. In addition to understanding the underlying structure of the system, it can be used for detecting lattice structural abnormalities, such as link defects, abnormal interactions, etc. These insights benefit system optimization, design, and detection of aging or damage. Moreover, this approach can integrate other components to deduce the link structure needed for specific parts, showcasing its scalability and potential. We tested it on a challenging molecular dynamics dataset, and the results proved its ability to accurately infer molecular bond connectivity, highlighting its scientific research potential.
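The listing does not include code; the following is a minimal PyTorch sketch of the general idea described in the abstract, assuming a 1D lattice with unit masses. An attention score over learnable particle embeddings yields a soft adjacency (which particles interact and how strongly), a small MLP models the pairwise potential, and trajectories follow Hamilton's equations via automatic differentiation, in the spirit of Hamiltonian neural networks. All names here (GraphAttentionHamiltonian, adjacency, hamiltonian, time_derivatives) are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only, not the authors' GAHN code.
# Assumptions: 1D lattice displacements q and momenta p of shape (n,), unit masses.
import torch
import torch.nn as nn


class GraphAttentionHamiltonian(nn.Module):
    def __init__(self, n_particles: int, hidden: int = 64):
        super().__init__()
        # Learnable per-particle embeddings used to score pairwise attention.
        self.embed = nn.Parameter(torch.randn(n_particles, hidden))
        self.score = nn.Linear(2 * hidden, 1)
        # Small MLP for the pairwise interaction potential V(q_i, q_j).
        self.potential = nn.Sequential(
            nn.Linear(2, hidden), nn.Tanh(), nn.Linear(hidden, 1)
        )

    def adjacency(self) -> torch.Tensor:
        """Soft adjacency from attention scores: entry (i, j) is the inferred
        interaction strength between particles i and j (diagonal self-terms
        could be masked out in practice)."""
        n = self.embed.shape[0]
        pairs = torch.cat(
            [self.embed.unsqueeze(1).expand(n, n, -1),
             self.embed.unsqueeze(0).expand(n, n, -1)], dim=-1)
        return torch.sigmoid(self.score(pairs)).squeeze(-1)

    def hamiltonian(self, q: torch.Tensor, p: torch.Tensor) -> torch.Tensor:
        """H = kinetic energy + attention-weighted pairwise potential energy."""
        kinetic = 0.5 * (p ** 2).sum()
        a = self.adjacency()
        n = q.shape[0]
        qi = q.unsqueeze(1).expand(n, n)
        qj = q.unsqueeze(0).expand(n, n)
        v = self.potential(torch.stack([qi, qj], dim=-1)).squeeze(-1)
        return kinetic + (a * v).sum()

    def time_derivatives(self, q: torch.Tensor, p: torch.Tensor):
        """Hamilton's equations via autograd: dq/dt = dH/dp, dp/dt = -dH/dq."""
        q = q.clone().requires_grad_(True)
        p = p.clone().requires_grad_(True)
        h = self.hamiltonian(q, p)
        dh_dp, dh_dq = torch.autograd.grad(h, (p, q), create_graph=True)
        return dh_dp, -dh_dq
```

Under these assumptions, training would regress the predicted time derivatives against finite-difference derivatives of observed trajectories; the learned adjacency then exposes which particles interact and with what relative strength, and the even-symmetry question from the abstract could be probed, for example, by comparing the learned potential at swapped arguments (q_i, q_j) versus (q_j, q_i).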
Related papers
- Decomposing heterogeneous dynamical systems with graph neural networks [0.16492989697868887]
We show that graph neural networks can be designed to jointly learn the interaction rules and the structure of the heterogeneous system.
The learned latent structure and dynamics can be used to virtually decompose the complex system.
arXiv Detail & Related papers (2024-07-27T04:03:12Z)
- Inferring Relational Potentials in Interacting Systems [56.498417950856904]
We propose Neural Interaction Inference with Potentials (NIIP) as an alternative approach to discover such interactions.
NIIP assigns low energy to the subset of trajectories which respect the relational constraints observed.
It allows trajectory manipulation, such as interchanging interaction types across separately trained models, as well as trajectory forecasting.
arXiv Detail & Related papers (2023-10-23T00:44:17Z)
- Imaginary components of out-of-time correlators and information scrambling for navigating the learning landscape of a quantum machine learning model [0.0]
We analytically illustrate that hitherto unexplored imaginary components of out-of-time correlators can provide unprecedented insight into the information scrambling capacity of a graph neural network.
Such an analysis demystifies the training of quantum machine learning models by unraveling how quantum information is scrambled through such a network.
arXiv Detail & Related papers (2022-08-29T06:17:28Z)
- Graph neural networks for the prediction of molecular structure-property relationships [59.11160990637615]
Graph neural networks (GNNs) are a machine learning method that works directly on the molecular graph.
GNNs make it possible to learn properties in an end-to-end fashion, thereby avoiding the need for informative descriptors.
We describe the fundamentals of GNNs and demonstrate the application of GNNs via two examples for molecular property prediction.
arXiv Detail & Related papers (2022-07-25T11:30:44Z)
- Transferring Chemical and Energetic Knowledge Between Molecular Systems with Machine Learning [5.27145343046974]
We propose a novel methodology for transferring knowledge obtained from simple molecular systems to a more complex one.
We focus on the classification of high and low free-energy states.
Our results show a remarkable AUC of 0.92 for transfer learning from tri-alanine to the deca-alanine system.
arXiv Detail & Related papers (2022-05-06T16:21:00Z)
- NeuroFluid: Fluid Dynamics Grounding with Particle-Driven Neural Radiance Fields [65.07940731309856]
Deep learning has shown great potential for modeling the physical dynamics of complex particle systems such as fluids.
In this paper, we consider a partially observable scenario known as fluid dynamics grounding.
We propose a differentiable two-stage network named NeuroFluid.
It is shown to reasonably estimate the underlying physics of fluids with different initial shapes, viscosities, and densities.
arXiv Detail & Related papers (2022-03-03T15:13:29Z)
- Equivariant Graph Attention Networks for Molecular Property Prediction [0.34376560669160383]
Learning about 3D molecular structures with varying size is an emerging challenge in machine learning and especially in drug discovery.
We propose an equivariant Graph Neural Network (GNN) that operates on Cartesian coordinates to incorporate directionality.
We demonstrate the efficacy of our architecture in predicting quantum mechanical properties of small molecules and its benefit for problems concerning macromolecular structures such as protein complexes.
arXiv Detail & Related papers (2022-02-20T19:07:29Z)
- Structural Landmarking and Interaction Modelling: on Resolution Dilemmas in Graph Classification [50.83222170524406]
We study the intrinsic difficulty in graph classification under the unified concept of "resolution dilemmas".
We propose SLIM, an inductive neural network model for Structural Landmarking and Interaction Modelling.
arXiv Detail & Related papers (2020-06-29T01:01:42Z)
- Graph Neural Network for Hamiltonian-Based Material Property Prediction [56.94118357003096]
We present and compare several different graph convolution networks that are able to predict the band gap for inorganic materials.
The models are developed to incorporate two different features: the information of each orbital itself and the interactions between orbitals.
The results show that our models achieve promising prediction accuracy under cross-validation.
arXiv Detail & Related papers (2020-05-27T13:32:10Z)
- Multi-View Graph Neural Networks for Molecular Property Prediction [67.54644592806876]
We present Multi-View Graph Neural Network (MV-GNN), a multi-view message passing architecture.
In MV-GNN, we introduce a shared self-attentive readout component and disagreement loss to stabilize the training process.
We further boost the expressive power of MV-GNN by proposing a cross-dependent message passing scheme.
arXiv Detail & Related papers (2020-05-17T04:46:07Z)