Equivariant vector field network for many-body system modeling
- URL: http://arxiv.org/abs/2110.14811v1
- Date: Tue, 26 Oct 2021 14:26:25 GMT
- Title: Equivariant vector field network for many-body system modeling
- Authors: Weitao Du, He Zhang, Yuanqi Du, Qi Meng, Wei Chen, Bin Shao, Tie-Yan
Liu
- Abstract summary: Equivariant Vector Field Network (EVFN) is built on a novel equivariant basis and the associated scalarization and vectorization layers.
We evaluate our method on predicting trajectories of simulated Newton mechanics systems with both full and partially observed data.
- Score: 65.22203086172019
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Modeling many-body systems has been a long-standing challenge in science,
from classical and quantum physics to computational biology. Equivariance is a
critical physical symmetry for many-body dynamic systems, which enables robust
and accurate prediction under arbitrary reference transformations. In light of
this, great effort has been put into encoding this symmetry into deep neural
networks, which significantly boosts the prediction performance of
downstream tasks. Some computationally efficient general equivariant models
have been proposed; however, these models offer no guarantee on their
approximation power and may suffer information loss. In this paper, we leverage
insights from the scalarization technique in differential geometry to model
many-body systems by learning the gradient vector fields, which are SE(3) and
permutation equivariant. Specifically, we propose the Equivariant Vector Field
Network (EVFN), which is built on a novel tuple of equivariant basis and the
associated scalarization and vectorization layers. Since our tuple equivariant
basis forms a complete basis, learning the dynamics with our EVFN has no
information loss and no tensor operations are involved before the final
vectorization, which reduces the complex optimization on tensors to a minimum.
We evaluate our method on predicting trajectories of simulated Newton mechanics
systems with both full and partially observed data, as well as the equilibrium
state of small molecules (molecular conformation) evolving as a statistical
mechanics system. Experimental results across multiple tasks demonstrate that
our model achieves the best or competitive performance compared with baseline
models across various types of datasets.
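As a rough illustration of the scalarization/vectorization idea described in the abstract, the sketch below builds an equivariant basis from two relative position vectors, extracts rotation-invariant scalars from the basis's Gram matrix, and recombines them into an output vector. This is a hypothetical minimal version, not the authors' EVFN implementation; the toy `mlp` stands in for a learned network.

```python
import numpy as np

def scalarize_vectorize(r1, r2, mlp):
    """One scalarization/vectorization step on two relative position
    vectors (translation is already removed by using relative coordinates).

    r1, r2: shape (3,) relative position vectors.
    mlp: callable mapping invariant scalars to 3 basis coefficients.
    """
    # Equivariant basis: the two relative vectors and their cross product
    # (cross product is equivariant under proper rotations, det R = +1).
    basis = np.stack([r1, r2, np.cross(r1, r2)])   # shape (3, 3)

    # Scalarization: Gram matrix entries are rotation-invariant scalars.
    gram = basis @ basis.T                         # shape (3, 3)

    # Invariant coefficients from the (toy) learned map.
    coeffs = mlp(gram.ravel())                     # shape (3,)

    # Vectorization: recombine coefficients with the equivariant basis.
    return coeffs @ basis                          # shape (3,), equivariant
```

Because the coefficients depend only on invariants and the basis rotates with the input, the output rotates with the input as well, which is the equivariance property the abstract describes.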
Related papers
- Deconstructing equivariant representations in molecular systems [6.841858294458366]
We report on experiments using a simple equivariant graph convolution model on the QM9 dataset.
Our key finding is that, for a scalar prediction task, many of the irreducible representations are simply ignored during training.
We empirically show that removing some unused orders of spherical harmonics improves model performance.
arXiv Detail & Related papers (2024-10-10T17:15:46Z) - Enhancing lattice kinetic schemes for fluid dynamics with Lattice-Equivariant Neural Networks [79.16635054977068]
We present a new class of equivariant neural networks, dubbed Lattice-Equivariant Neural Networks (LENNs)
Our approach develops within a recently introduced framework aimed at learning neural network-based surrogate models of Lattice Boltzmann collision operators.
Our work opens the way towards practical use of machine learning-augmented Lattice Boltzmann CFD in real-world simulations.
arXiv Detail & Related papers (2024-05-22T17:23:15Z) - Similarity Equivariant Graph Neural Networks for Homogenization of Metamaterials [3.6443770850509423]
Soft, porous mechanical metamaterials exhibit pattern transformations that may have important applications in soft robotics, sound reduction and biomedicine.
We develop a machine learning-based approach that scales favorably to serve as a surrogate model.
We show that this network is more accurate and data-efficient than graph neural networks with fewer symmetries.
arXiv Detail & Related papers (2024-04-26T12:30:32Z) - Equivariant Graph Neural Operator for Modeling 3D Dynamics [148.98826858078556]
We propose the Equivariant Graph Neural Operator (EGNO), which directly models dynamics as trajectories instead of just next-step prediction.
EGNO explicitly learns the temporal evolution of 3D dynamics where we formulate the dynamics as a function over time and learn neural operators to approximate it.
Comprehensive experiments in multiple domains, including particle simulations, human motion capture, and molecular dynamics, demonstrate the significantly superior performance of EGNO against existing methods.
arXiv Detail & Related papers (2024-01-19T21:50:32Z) - E($3$) Equivariant Graph Neural Networks for Particle-Based Fluid Mechanics [2.1401663582288144]
We demonstrate that equivariant graph neural networks have the potential to learn more accurate dynamic-interaction models.
We benchmark two well-studied fluid flow systems, namely the 3D decaying Taylor-Green vortex and the 3D reverse Poiseuille flow.
arXiv Detail & Related papers (2023-03-31T21:56:35Z) - Lorentz group equivariant autoencoders [6.858459233149096]
Lorentz group autoencoder (LGAE)
We develop an autoencoder model equivariant with respect to the proper, orthochronous Lorentz group $\mathrm{SO}^+(3,1)$, with a latent space living in the representations of the group.
We present our architecture and several experimental results on jets at the LHC and find it outperforms graph and convolutional neural network baseline models on several compression, reconstruction, and anomaly detection metrics.
arXiv Detail & Related papers (2022-12-14T17:19:46Z) - Learning Physical Dynamics with Subequivariant Graph Neural Networks [99.41677381754678]
Graph Neural Networks (GNNs) have become a prevailing tool for learning physical dynamics.
Physical laws abide by symmetry, which is a vital inductive bias accounting for model generalization.
Our model achieves on average over 3% enhancement in contact prediction accuracy across 8 scenarios on Physion and 2X lower rollout MSE on RigidFall.
arXiv Detail & Related papers (2022-10-13T10:00:30Z) - Equivariant Deep Dynamical Model for Motion Prediction [0.0]
Deep generative modeling is a powerful approach for dynamical modeling, discovering a simplified, compressed underlying description of the data.
Most learning tasks have intrinsic symmetries: input transformations either leave the output unchanged or transform the output correspondingly.
We propose an SO(3) equivariant deep dynamical model (EqDDM) for motion prediction that learns a structured representation of the input space in the sense that the embedding varies with symmetry transformations.
arXiv Detail & Related papers (2021-11-02T21:01:43Z) - Frame Averaging for Invariant and Equivariant Network Design [50.87023773850824]
We introduce Frame Averaging (FA), a framework for adapting known (backbone) architectures to become invariant or equivariant to new symmetry types.
We show that FA-based models have maximal expressive power in a broad setting.
We propose a new class of universal Graph Neural Networks (GNNs), universal Euclidean motion invariant point cloud networks, and Euclidean motion invariant Message Passing (MP) GNNs.
arXiv Detail & Related papers (2021-10-07T11:05:23Z) - E(n) Equivariant Graph Neural Networks [86.75170631724548]
This paper introduces a new model to learn graph neural networks equivariant to rotations, translations, reflections and permutations, called E(n)-Equivariant Graph Neural Networks (EGNNs).
In contrast with existing methods, our work does not require computationally expensive higher-order representations in intermediate layers while it still achieves competitive or better performance.
arXiv Detail & Related papers (2021-02-19T10:25:33Z)
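The E(n)-equivariant update summarized in the last entry can be sketched with a single message-passing layer: invariant messages are computed from node features and squared distances, and coordinates are updated along relative position vectors. This is a hypothetical minimal illustration with fixed random weights, not the EGNN authors' code; real EGNN uses learned MLPs in place of `w_e` and `w_x`.

```python
import numpy as np

def egnn_layer(h, x, w_e, w_x):
    """Minimal E(n)-equivariant layer in the spirit of EGNN.

    h: (n, d) invariant node features.
    x: (n, 3) node coordinates.
    w_e: (2d + 1, d) toy message weights; w_x: (d, 1) toy coordinate weights.
    """
    n, d = h.shape
    diff = x[:, None, :] - x[None, :, :]            # (n, n, 3) relative coords
    dist2 = (diff ** 2).sum(-1, keepdims=True)      # (n, n, 1) invariant distances

    # Invariant pairwise messages from sender/receiver features + distance.
    pair = np.concatenate([np.broadcast_to(h[:, None], (n, n, d)),
                           np.broadcast_to(h[None, :], (n, n, d)),
                           dist2], axis=-1)
    m = np.tanh(pair @ w_e)                         # (n, n, d) invariant messages

    # Coordinates move along relative vectors, scaled by invariant weights,
    # so the update is equivariant to rotations/reflections/translations.
    coef = m @ w_x                                  # (n, n, 1)
    x_new = x + (diff * coef).sum(axis=1) / (n - 1)

    # Features aggregate invariant messages, so they stay invariant.
    h_new = h + m.sum(axis=1)
    return h_new, x_new
```

Since every learned quantity depends only on squared distances and features, the layer avoids the higher-order intermediate representations the summary mentions, while coordinate updates remain equivariant by construction.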
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.