SE(3)-Equivariant Graph Neural Networks for Data-Efficient and Accurate
Interatomic Potentials
- URL: http://arxiv.org/abs/2101.03164v1
- Date: Fri, 8 Jan 2021 18:49:10 GMT
- Title: SE(3)-Equivariant Graph Neural Networks for Data-Efficient and Accurate
Interatomic Potentials
- Authors: Simon Batzner, Tess E. Smidt, Lixin Sun, Jonathan P. Mailoa, Mordechai
Kornbluth, Nicola Molinari, and Boris Kozinsky
- Abstract summary: NequIP is an SE(3)-equivariant neural network approach for learning interatomic potentials from ab-initio calculations for molecular dynamics simulations.
The method achieves state-of-the-art accuracy on a challenging set of diverse molecules and materials while exhibiting remarkable data efficiency.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This work presents Neural Equivariant Interatomic Potentials (NequIP), an
SE(3)-equivariant neural network approach for learning interatomic potentials
from ab-initio calculations for molecular dynamics simulations. While most
contemporary symmetry-aware models use invariant convolutions and only act on
scalars, NequIP employs SE(3)-equivariant convolutions for interactions of
geometric tensors, resulting in a more information-rich and faithful
representation of atomic environments. The method achieves state-of-the-art
accuracy on a challenging set of diverse molecules and materials while
exhibiting remarkable data efficiency. NequIP outperforms existing models with
up to three orders of magnitude fewer training data, challenging the widely
held belief that deep neural networks require massive training sets. The high
data efficiency of the method allows for the construction of accurate
potentials using high-order quantum chemical level of theory as reference and
enables high-fidelity molecular dynamics simulations over long time scales.
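The equivariance property described above can be illustrated with a minimal, self-contained sketch (plain Python, not the actual NequIP code): a message built from relative position vectors weighted by a rotation-invariant radial function transforms exactly as the atomic environment does. The `radial_weight` function here is a hypothetical stand-in for a learned radial network.

```python
import math

def rotate_z(v, theta):
    # Rotate a 3-vector about the z-axis by angle theta.
    c, s = math.cos(theta), math.sin(theta)
    x, y, z = v
    return (c * x - s * y, s * x + c * y, z)

def radial_weight(r):
    # Hypothetical learned radial function; a fixed Gaussian for illustration.
    return math.exp(-r * r)

def equivariant_message(center, neighbors):
    # Aggregate relative position vectors weighted by an invariant radial
    # function of the distance: m_i = sum_j w(|r_ij|) * r_ij.
    # Since |r_ij| is unchanged by rotations, m_i rotates with the inputs.
    mx = my = mz = 0.0
    for n in neighbors:
        dx, dy, dz = n[0] - center[0], n[1] - center[1], n[2] - center[2]
        r = math.sqrt(dx * dx + dy * dy + dz * dz)
        w = radial_weight(r)
        mx += w * dx
        my += w * dy
        mz += w * dz
    return (mx, my, mz)
```

Rotating all atomic positions first and then computing the message gives the same result as computing the message and rotating it afterwards, which is the defining property of an equivariant layer.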
Related papers
- Equivariant Graph Neural Operator for Modeling 3D Dynamics [148.98826858078556]
We propose the Equivariant Graph Neural Operator (EGNO) to directly model dynamics as trajectories instead of just next-step predictions.
EGNO explicitly learns the temporal evolution of 3D dynamics where we formulate the dynamics as a function over time and learn neural operators to approximate it.
Comprehensive experiments in multiple domains, including particle simulations, human motion capture, and molecular dynamics, demonstrate the significantly superior performance of EGNO against existing methods.
arXiv Detail & Related papers (2024-01-19T21:50:32Z) - Synthetic pre-training for neural-network interatomic potentials [0.0]
We show that synthetic atomistic data, themselves obtained at scale with an existing machine learning potential, constitute a useful pre-training task for neural-network interatomic potential models.
Once pre-trained with a large synthetic dataset, these models can be fine-tuned on a much smaller, quantum-mechanical one, improving numerical accuracy and stability in computational practice.
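The two-stage recipe above (pre-train on abundant synthetic labels, then fine-tune on a small accurate dataset) can be sketched with a toy one-parameter model standing in for a neural-network potential. All data, targets, and hyperparameters here are made up for illustration; the surrogate label y = 1.8x plays the role of an existing machine learning potential and y = 2.0x the role of the quantum-mechanical reference.

```python
import random

def sgd_fit(w, data, lr=0.05, epochs=200):
    # Fit y ≈ w * x by plain stochastic gradient descent on squared error.
    for _ in range(epochs):
        for x, y in data:
            grad = 2.0 * (w * x - y) * x
            w -= lr * grad
    return w

random.seed(0)
# Stage 1: large synthetic dataset labeled by a cheap surrogate model (y = 1.8 x).
synthetic = [(x, 1.8 * x) for x in (random.uniform(-1, 1) for _ in range(500))]
# Stage 2: small, expensive "quantum-mechanical" dataset (y = 2.0 x).
reference = [(x, 2.0 * x) for x in (random.uniform(-1, 1) for _ in range(10))]

w_pre = sgd_fit(0.0, synthetic)       # pre-training on synthetic labels
w_fine = sgd_fit(w_pre, reference)    # fine-tuning on the accurate labels
```

Starting fine-tuning from the pre-trained parameter rather than from scratch mirrors the claimed benefit: the small accurate dataset only needs to correct the model, not train it from zero.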
arXiv Detail & Related papers (2023-07-24T17:16:24Z) - Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z) - Denoise Pretraining on Nonequilibrium Molecules for Accurate and
Transferable Neural Potentials [8.048439531116367]
We propose denoise pretraining on nonequilibrium molecular conformations to achieve more accurate and transferable GNN potential predictions.
Our models pretrained on small molecules demonstrate remarkable transferability, improving performance when fine-tuned on diverse molecular systems.
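The denoising pretext task described above can be sketched as a simple data-generation step (a hypothetical illustration, not the paper's code): each atomic coordinate is perturbed with Gaussian noise, and the noise itself becomes the regression target the GNN is pre-trained to predict.

```python
import random

def denoise_pretraining_sample(coords, sigma=0.1, rng=random):
    # Pretext task: perturb each atomic coordinate with Gaussian noise and
    # use the noise as the regression target, so a model learns to map a
    # nonequilibrium conformation back toward the clean one.
    noisy, target = [], []
    for (x, y, z) in coords:
        n = (rng.gauss(0, sigma), rng.gauss(0, sigma), rng.gauss(0, sigma))
        noisy.append((x + n[0], y + n[1], z + n[2]))
        target.append(n)
    return noisy, target
```

A model trained on many such (noisy conformation, noise) pairs learns local potential-energy-surface structure without any quantum-mechanical labels, which is what makes the pre-training transferable.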
arXiv Detail & Related papers (2023-03-03T21:15:22Z) - Learning Physical Dynamics with Subequivariant Graph Neural Networks [99.41677381754678]
Graph Neural Networks (GNNs) have become a prevailing tool for learning physical dynamics.
Physical laws abide by symmetry, which is a vital inductive bias accounting for model generalization.
Our model achieves on average over 3% enhancement in contact prediction accuracy across 8 scenarios on Physion and 2X lower rollout MSE on RigidFall.
arXiv Detail & Related papers (2022-10-13T10:00:30Z) - Physically constrained neural networks to solve the inverse problem for
neuron models [0.29005223064604074]
Systems biology and systems neurophysiology are powerful tools for a number of key applications in the biomedical sciences.
Recent developments in the field of deep neural networks have demonstrated the possibility of formulating nonlinear, universal approximators.
arXiv Detail & Related papers (2022-09-24T12:51:15Z) - Multi-fidelity Hierarchical Neural Processes [79.0284780825048]
Multi-fidelity surrogate modeling reduces the computational cost by fusing different simulation outputs.
We propose Multi-fidelity Hierarchical Neural Processes (MF-HNP), a unified neural latent variable model for multi-fidelity surrogate modeling.
We evaluate MF-HNP on epidemiology and climate modeling tasks, achieving competitive performance in terms of accuracy and uncertainty estimation.
arXiv Detail & Related papers (2022-06-10T04:54:13Z) - EINNs: Epidemiologically-Informed Neural Networks [75.34199997857341]
We introduce a new class of physics-informed neural networks, EINNs, crafted for epidemic forecasting.
We investigate how to leverage both the theoretical flexibility provided by mechanistic models and the data-driven expressivity afforded by AI models.
arXiv Detail & Related papers (2022-02-21T18:59:03Z) - Equivariant vector field network for many-body system modeling [65.22203086172019]
Equivariant Vector Field Network (EVFN) is built on a novel equivariant basis and the associated scalarization and vectorization layers.
We evaluate our method on predicting trajectories of simulated Newton mechanics systems with both full and partially observed data.
arXiv Detail & Related papers (2021-10-26T14:26:25Z) - Fast and Sample-Efficient Interatomic Neural Network Potentials for
Molecules and Materials Based on Gaussian Moments [3.1829446824051195]
We present an improved NN architecture based on the previous GM-NN model.
The improved methodology is a prerequisite for training-heavy workflows such as active learning or learning-on-the-fly.
arXiv Detail & Related papers (2021-09-20T14:23:34Z) - Learning the exchange-correlation functional from nature with fully
differentiable density functional theory [0.0]
We train a neural network to replace the exchange-correlation functional within a fully-differentiable three-dimensional Kohn-Sham density functional theory framework.
Our trained exchange-correlation network provided improved prediction of atomization and ionization energies across a collection of 110 molecules.
arXiv Detail & Related papers (2021-02-08T14:25:10Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.