Rotation Invariant Graph Neural Networks using Spin Convolutions
- URL: http://arxiv.org/abs/2106.09575v1
- Date: Thu, 17 Jun 2021 14:59:34 GMT
- Title: Rotation Invariant Graph Neural Networks using Spin Convolutions
- Authors: Muhammed Shuaibi, Adeesh Kolluru, Abhishek Das, Aditya Grover, Anuroop
Sriram, Zachary Ulissi, C. Lawrence Zitnick
- Abstract summary: Machine learning approaches have the potential to approximate Density Functional Theory (DFT) in a computationally efficient manner.
We introduce a novel approach to modeling angular information between sets of neighboring atoms in a graph neural network.
Results are demonstrated on the large-scale Open Catalyst 2020 dataset.
- Score: 28.4962005849904
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Progress towards the energy breakthroughs needed to combat climate change can
be significantly accelerated through the efficient simulation of atomic
systems. Simulation techniques based on first principles, such as Density
Functional Theory (DFT), are limited in their practical use due to their high
computational expense. Machine learning approaches have the potential to
approximate DFT in a computationally efficient manner, which could dramatically
increase the impact of computational simulations on real-world problems.
Approximating DFT poses several challenges. These include accurately modeling
the subtle changes in the relative positions and angles between atoms, and
enforcing constraints such as rotation invariance or energy conservation. We
introduce a novel approach to modeling angular information between sets of
neighboring atoms in a graph neural network. Rotation invariance is achieved
for the network's edge messages through the use of a per-edge local coordinate
frame and a novel spin convolution over the remaining degree of freedom. Two
model variants are proposed for the applications of structure relaxation and
molecular dynamics. State-of-the-art results are demonstrated on the
large-scale Open Catalyst 2020 dataset. Comparisons are also performed on the
MD17 and QM9 datasets.
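The per-edge local coordinate frame idea can be illustrated with a short sketch. The helper below is hypothetical (the paper's exact construction differs in detail): it builds an orthonormal frame aligned with an edge and expresses neighbor offsets in that frame. The component along the edge axis and all distances are then rotation invariant, while the leftover rotation about the edge axis is the remaining degree of freedom that the paper's spin convolution is designed to handle.

```python
import numpy as np

def edge_local_frame(d):
    """Build an orthonormal frame whose first axis is the edge direction d.

    Hypothetical helper illustrating a per-edge local coordinate frame;
    the paper's actual construction may differ.
    """
    e1 = d / np.linalg.norm(d)
    # Pick any vector not parallel to e1 to complete the frame.
    helper = np.array([1.0, 0.0, 0.0])
    if abs(e1 @ helper) > 0.9:
        helper = np.array([0.0, 1.0, 0.0])
    e2 = np.cross(e1, helper)
    e2 /= np.linalg.norm(e2)
    e3 = np.cross(e1, e2)
    return np.stack([e1, e2, e3])  # rows e1, e2, e3 form the frame

def invariant_edge_features(pos_i, pos_j, neighbors):
    """Neighbor offsets of atom i expressed in the (i, j) edge's local frame."""
    frame = edge_local_frame(pos_j - pos_i)
    return (neighbors - pos_i) @ frame.T
```

Rotating the whole system rotates the frame's first axis along with it, so the projection onto the edge axis and all offset norms are unchanged; only the in-plane components spin about the edge, which is exactly where a convolution over that angle becomes useful.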
Related papers
- Physics aware machine learning for micromagnetic energy minimization: recent algorithmic developments [0.0]
Building on Brown's bounds for magnetostatic self-energy, we revisit their application in the context of variational formulations of transmission problems.
We reformulate these bounds on a finite domain, making the method more efficient and scalable for numerical simulation.
Results highlight the potential of mesh-free Physics-Informed Neural Networks (PINNs) and Extreme Learning Machines (ELMs) when integrated with hard constraints.
arXiv Detail & Related papers (2024-09-19T16:22:40Z)
- Aero-Nef: Neural Fields for Rapid Aircraft Aerodynamics Simulations [1.1932047172700866]
This paper presents a methodology to learn surrogate models of steady state fluid dynamics simulations on meshed domains.
The proposed models can be applied directly to unstructured domains for different flow conditions.
Remarkably, the method can perform inference five orders of magnitude faster than the high-fidelity solver on the RANS transonic airfoil dataset.
arXiv Detail & Related papers (2024-07-29T11:48:44Z) - Predicting Energy Budgets in Droplet Dynamics: A Recurrent Neural Network Approach [0.0]
This study applies Long Short-Term Memory to predict transient and static outputs for fluid flows under surface tension effects.
Using only dimensionless numbers and geometric time-series data from numerical simulations, the LSTM-based recurrent architecture accurately predicts the energy budget.
arXiv Detail & Related papers (2024-03-24T13:32:42Z) - Equivariant Graph Neural Operator for Modeling 3D Dynamics [148.98826858078556]
We propose the Equivariant Graph Neural Operator (EGNO) to model dynamics directly as trajectories instead of just next-step predictions.
EGNO explicitly learns the temporal evolution of 3D dynamics where we formulate the dynamics as a function over time and learn neural operators to approximate it.
Comprehensive experiments in multiple domains, including particle simulations, human motion capture, and molecular dynamics, demonstrate the significantly superior performance of EGNO against existing methods.
arXiv Detail & Related papers (2024-01-19T21:50:32Z) - Gradual Optimization Learning for Conformational Energy Minimization [69.36925478047682]
The Gradual Optimization Learning Framework (GOLF) for energy minimization with neural networks significantly reduces the amount of additional data required.
Our results demonstrate that the neural network trained with GOLF performs on par with the oracle on a benchmark of diverse drug-like molecules.
arXiv Detail & Related papers (2023-11-05T11:48:08Z) - Accelerating Particle and Fluid Simulations with Differentiable Graph
Networks for Solving Forward and Inverse Problems [2.153852088624324]
We leverage physics-embedded differentiable graph network simulators to solve particulate and fluid simulations.
GNS represents the domain as a graph with particles as nodes and learned interactions as edges.
GNS achieves over 165x speedup for granular flow prediction compared to parallel CPU numerical simulations.
arXiv Detail & Related papers (2023-09-23T11:52:43Z) - Geometry-Informed Neural Operator for Large-Scale 3D PDEs [76.06115572844882]
We propose the geometry-informed neural operator (GINO) to learn the solution operator of large-scale partial differential equations.
We successfully trained GINO to predict the pressure on car surfaces using only five hundred data points.
arXiv Detail & Related papers (2023-09-01T16:59:21Z) - NeuralStagger: Accelerating Physics-constrained Neural PDE Solver with
Spatial-temporal Decomposition [67.46012350241969]
This paper proposes a general acceleration methodology called NeuralStagger.
It decomposes the original learning task into several coarser-resolution subtasks.
We demonstrate the successful application of NeuralStagger on 2D and 3D fluid dynamics simulations.
arXiv Detail & Related papers (2023-02-20T19:36:52Z) - Molecular Geometry-aware Transformer for accurate 3D Atomic System
modeling [51.83761266429285]
We propose a novel Transformer architecture that takes nodes (atoms) and edges (bonds and nonbonding atom pairs) as inputs and models the interactions among them.
Moleformer achieves state-of-the-art on the initial state to relaxed energy prediction of OC20 and is very competitive in QM9 on predicting quantum chemical properties.
arXiv Detail & Related papers (2023-02-02T03:49:57Z) - NeuralNEB -- Neural Networks can find Reaction Paths Fast [7.7365628406567675]
Quantum mechanical methods like Density Functional Theory (DFT) are used with great success alongside efficient search algorithms for studying kinetics of reactive systems.
Machine Learning (ML) models have turned out to be excellent emulators of small molecule DFT calculations and could possibly replace DFT in such tasks.
In this paper we train state-of-the-art equivariant Graph Neural Network (GNN)-based models on around 10,000 elementary reactions from the Transition1x dataset.
arXiv Detail & Related papers (2022-07-20T15:29:45Z) - Flexible Transmitter Network [84.90891046882213]
Current neural networks are mostly built upon the McCulloch-Pitts (MP) model, which formulates the neuron as an activation function applied to the real-valued weighted aggregation of signals received from other neurons.
We propose the Flexible Transmitter (FT) model, a novel bio-plausible neuron model with flexible synaptic plasticity.
We present the Flexible Transmitter Network (FTNet), which is built on the most common fully-connected feed-forward architecture.
arXiv Detail & Related papers (2020-04-08T06:55:12Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it provides and is not responsible for any consequences arising from its use.