ForceNet: A Graph Neural Network for Large-Scale Quantum Calculations
- URL: http://arxiv.org/abs/2103.01436v1
- Date: Tue, 2 Mar 2021 03:09:06 GMT
- Title: ForceNet: A Graph Neural Network for Large-Scale Quantum Calculations
- Authors: Weihua Hu, Muhammed Shuaibi, Abhishek Das, Siddharth Goyal, Anuroop
Sriram, Jure Leskovec, Devi Parikh, C. Lawrence Zitnick
- Abstract summary: We develop a scalable and expressive Graph Neural Networks model, ForceNet, to approximate atomic forces.
Our proposed ForceNet is able to predict atomic forces more accurately than state-of-the-art physics-based GNNs.
- Score: 86.41674945012369
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: With massive amounts of atomic simulation data available, there is a huge
opportunity to develop fast and accurate machine learning models to approximate
expensive physics-based calculations. The key quantity to estimate is atomic
forces, where the state-of-the-art Graph Neural Networks (GNNs) explicitly
enforce basic physical constraints such as rotation-covariance. However, to
strictly satisfy the physical constraints, existing models have to make
tradeoffs between computational efficiency and model expressiveness. Here we
explore an alternative approach. By not imposing explicit physical constraints,
we can flexibly design expressive models while maintaining their computational
efficiency. Physical constraints are implicitly imposed by training the models
using physics-based data augmentation. To evaluate the approach, we carefully
design a scalable and expressive GNN model, ForceNet, and apply it to OC20
(Chanussot et al., 2020), an unprecedentedly large dataset of quantum physics
calculations. Our proposed ForceNet is able to predict atomic forces more
accurately than state-of-the-art physics-based GNNs while being faster both in
training and inference. Overall, our promising and counter-intuitive results
open up an exciting avenue for future research.
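The core idea above is to impose rotation covariance implicitly: rather than baking the symmetry into the architecture, the training data is augmented with random rotations applied jointly to atomic positions and force labels. A minimal sketch of such physics-based augmentation (illustrative only; the function names and NumPy-based pipeline are assumptions, not the authors' actual implementation) might look like:

```python
import numpy as np

def random_rotation_matrix(rng):
    """Sample a uniformly random proper rotation in 3D via QR decomposition."""
    q, r = np.linalg.qr(rng.normal(size=(3, 3)))
    q *= np.sign(np.diag(r))  # fix column signs for a canonical decomposition
    if np.linalg.det(q) < 0:
        q[:, 0] *= -1  # ensure det = +1 (rotation, not reflection)
    return q

def augment(positions, forces, rng):
    """Rotate an atomic structure and its force labels by the same random
    rotation. Training on such pairs encourages the model to learn rotation
    covariance from data instead of enforcing it architecturally."""
    rot = random_rotation_matrix(rng)
    return positions @ rot.T, forces @ rot.T
```

Because positions and forces are rotated by the same matrix, the augmented sample is physically equivalent to the original: interatomic distances and force magnitudes are unchanged, only the global orientation differs.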
Related papers
- The Importance of Being Scalable: Improving the Speed and Accuracy of Neural Network Interatomic Potentials Across Chemical Domains [4.340917737559795]
We study scaling in Neural Network Interatomic Potentials (NNIPs)
NNIPs act as surrogate models for ab initio quantum mechanical calculations.
We develop an NNIP architecture designed for scaling: the Efficiently Scaled Attention Interatomic Potential (EScAIP)
arXiv Detail & Related papers (2024-10-31T17:35:57Z)
- Physics-Informed Neural Networks with Hard Linear Equality Constraints [9.101849365688905]
This work proposes a novel physics-informed neural network, KKT-hPINN, which rigorously guarantees hard linear equality constraints.
Experiments on Aspen models of a stirred-tank reactor unit, an extractive distillation subsystem, and a chemical plant demonstrate that this model can further enhance the prediction accuracy.
arXiv Detail & Related papers (2024-02-11T17:40:26Z)
- NeuralStagger: Accelerating Physics-constrained Neural PDE Solver with Spatial-temporal Decomposition [67.46012350241969]
This paper proposes a general acceleration methodology called NeuralStagger.
It decomposes the original learning task into several coarser-resolution subtasks.
We demonstrate the successful application of NeuralStagger on 2D and 3D fluid dynamics simulations.
arXiv Detail & Related papers (2023-02-20T19:36:52Z)
- Learning Physical Dynamics with Subequivariant Graph Neural Networks [99.41677381754678]
Graph Neural Networks (GNNs) have become a prevailing tool for learning physical dynamics.
Physical laws obey symmetries, which provide a vital inductive bias for model generalization.
Our model achieves on average over 3% enhancement in contact prediction accuracy across 8 scenarios on Physion and 2X lower rollout MSE on RigidFall.
arXiv Detail & Related papers (2022-10-13T10:00:30Z)
- Human Trajectory Prediction via Neural Social Physics [63.62824628085961]
Trajectory prediction has been widely pursued in many fields, and many model-based and model-free methods have been explored.
We propose a new method combining both methodologies based on a new Neural Differential Equation model.
Our new model (Neural Social Physics or NSP) is a deep neural network within which we use an explicit physics model with learnable parameters.
arXiv Detail & Related papers (2022-07-21T12:11:18Z)
- Physics-Embedded Neural Networks: Graph Neural PDE Solvers with Mixed Boundary Conditions [3.04585143845864]
Graph neural network (GNN) is a promising approach to learning and predicting physical phenomena.
We present a physics-embedded GNN that considers boundary conditions and predicts the state after a long time.
Our model can be a useful standard for realizing reliable, fast, and accurate GNN-based PDE solvers.
arXiv Detail & Related papers (2022-05-24T09:17:27Z)
- Towards Quantum Graph Neural Networks: An Ego-Graph Learning Approach [47.19265172105025]
We propose a novel hybrid quantum-classical algorithm for graph-structured data, which we refer to as the Ego-graph based Quantum Graph Neural Network (egoQGNN)
egoQGNN implements the GNN theoretical framework using the tensor product and unitary matrix representation, which greatly reduces the number of model parameters required.
The architecture is based on a novel mapping from real-world data to Hilbert space.
arXiv Detail & Related papers (2022-01-13T16:35:45Z)
- Fast and Sample-Efficient Interatomic Neural Network Potentials for Molecules and Materials Based on Gaussian Moments [3.1829446824051195]
We present an improved NN architecture based on the previous GM-NN model.
The improved methodology is a prerequisite for training-heavy workflows such as active learning or learning-on-the-fly.
arXiv Detail & Related papers (2021-09-20T14:23:34Z)
- Physics-Integrated Variational Autoencoders for Robust and Interpretable Generative Modeling [86.9726984929758]
We focus on the integration of incomplete physics models into deep generative models.
We propose a VAE architecture in which a part of the latent space is grounded by physics.
We demonstrate generative performance improvements over a set of synthetic and real-world datasets.
arXiv Detail & Related papers (2021-02-25T20:28:52Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.