Equivariant Graph Mechanics Networks with Constraints
- URL: http://arxiv.org/abs/2203.06442v1
- Date: Sat, 12 Mar 2022 14:22:14 GMT
- Title: Equivariant Graph Mechanics Networks with Constraints
- Authors: Wenbing Huang, Jiaqi Han, Yu Rong, Tingyang Xu, Fuchun Sun, Junzhou
Huang
- Abstract summary: We propose Graph Mechanics Network (GMN) which is efficient, equivariant and constraint-aware.
GMN represents, by generalized coordinates, the forward kinematics information (positions and velocities) of a structural object.
Extensive experiments support the advantages of GMN compared to the state-of-the-art GNNs in terms of prediction accuracy, constraint satisfaction and data efficiency.
- Score: 83.38709956935095
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Learning to reason about relations and dynamics over multiple interacting
objects is a challenging topic in machine learning. The challenges mainly stem
from that the interacting systems are exponentially-compositional, symmetrical,
and commonly geometrically-constrained. Current methods, particularly those
based on equivariant Graph Neural Networks (GNNs), have targeted the first
two challenges but remain immature for constrained systems. In this paper, we
propose Graph Mechanics Network (GMN) which is combinatorially efficient,
equivariant and constraint-aware. The core of GMN is that it represents, by
generalized coordinates, the forward kinematics information (positions and
velocities) of a structural object. In this manner, the geometrical constraints
are implicitly and naturally encoded in the forward kinematics. Moreover, to
allow equivariant message passing in GMN, we have developed a general form of
orthogonality-equivariant functions, given that the dynamics of constrained
systems are more complicated than their unconstrained counterparts.
Theoretically, the proposed equivariant formulation is proved to be universally
expressive under certain conditions. Extensive experiments support the
advantages of GMN compared to the state-of-the-art GNNs in terms of prediction
accuracy, constraint satisfaction and data efficiency on the simulated systems
consisting of particles, sticks and hinges, as well as two real-world datasets
for molecular dynamics prediction and human motion capture.
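The orthogonality-equivariant functions mentioned in the abstract can be illustrated with a minimal sketch (not the authors' GMN implementation): a map of the form f(X) = X * mlp(|X|^2), which scales each vector by a function of its rotation-invariant squared norm and therefore commutes with any orthogonal transformation. The MLP sizes and parameters below are hypothetical, chosen only to demonstrate the equivariance property.

```python
import numpy as np

rng = np.random.default_rng(0)

def equivariant_map(X, weights):
    """Toy orthogonality-equivariant function: scale each vector by a
    learned function of its invariant squared norm, f(X) = X * mlp(|X|^2).
    X: (n, 3) array of vectors; weights: a tiny one-hidden-layer MLP."""
    w1, b1, w2, b2 = weights
    norms = np.sum(X**2, axis=1, keepdims=True)   # rotation-invariant scalars
    h = np.tanh(norms @ w1 + b1)                  # hidden features
    scale = h @ w2 + b2                           # one scalar per vector
    return X * scale                              # equivariant output

# Random parameters for the toy MLP (hypothetical sizes: 1 -> 8 -> 1).
weights = (rng.normal(size=(1, 8)), rng.normal(size=8),
           rng.normal(size=(8, 1)), rng.normal(size=1))

X = rng.normal(size=(5, 3))
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))      # random orthogonal matrix

# Equivariance check: f(X Q^T) == f(X) Q^T, because |x Q^T| == |x|.
lhs = equivariant_map(X @ Q.T, weights)
rhs = equivariant_map(X, weights) @ Q.T
assert np.allclose(lhs, rhs)
```

Because the scaling factor depends only on invariant norms, rotating the inputs and then applying the map gives the same result as applying the map and then rotating, which is the defining property the paper generalizes to constrained multi-vector dynamics.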
Related papers
- Relaxing Continuous Constraints of Equivariant Graph Neural Networks for Physical Dynamics Learning [39.25135680793105]
We propose a general Discrete Equivariant Graph Neural Network (DEGNN) that guarantees equivariance to a given discrete point group.
Specifically, we show that such discrete equivariant message passing could be constructed by transforming geometric features into permutation-invariant embeddings.
We show that DEGNN is data efficient, learning with less data, and can generalize across scenarios such as unobserved orientation.
arXiv Detail & Related papers (2024-06-24T03:37:51Z) - Graph Neural PDE Solvers with Conservation and Similarity-Equivariance [6.077284832583712]
This study introduces a novel machine-learning architecture that is highly generalizable and adheres to conservation laws and physical symmetries.
The foundation of this architecture is graph neural networks (GNNs), which are adept at accommodating a variety of shapes and forms.
arXiv Detail & Related papers (2024-05-25T11:18:27Z) - SEGNO: Generalizing Equivariant Graph Neural Networks with Physical
Inductive Biases [66.61789780666727]
We show how the second-order continuity can be incorporated into GNNs while maintaining the equivariant property.
We also offer theoretical insights into SEGNO, highlighting that it can learn a unique trajectory between adjacent states.
Our model yields a significant improvement over the state-of-the-art baselines.
arXiv Detail & Related papers (2023-08-25T07:15:58Z) - Spatial Attention Kinetic Networks with E(n)-Equivariance [0.951828574518325]
Neural networks that are equivariant to rotations, translations, reflections, and permutations on n-dimensional geometric space have shown promise in physical modeling.
We propose a simple alternative functional form that uses neurally parametrized linear combinations of edge vectors to achieve equivariance.
We design spatial attention kinetic networks with E(n)-equivariance, or SAKE, which are competitive in many-body system modeling tasks while being significantly faster.
arXiv Detail & Related papers (2023-01-21T05:14:29Z) - Learning Physical Dynamics with Subequivariant Graph Neural Networks [99.41677381754678]
Graph Neural Networks (GNNs) have become a prevailing tool for learning physical dynamics.
Physical laws abide by symmetry, which is a vital inductive bias accounting for model generalization.
Our model achieves on average over 3% enhancement in contact prediction accuracy across 8 scenarios on Physion and 2X lower rollout MSE on RigidFall.
arXiv Detail & Related papers (2022-10-13T10:00:30Z) - Equivariant vector field network for many-body system modeling [65.22203086172019]
Equivariant Vector Field Network (EVFN) is built on a novel equivariant basis and the associated scalarization and vectorization layers.
We evaluate our method on predicting trajectories of simulated Newton mechanics systems with both full and partially observed data.
arXiv Detail & Related papers (2021-10-26T14:26:25Z) - Simplifying Hamiltonian and Lagrangian Neural Networks via Explicit
Constraints [49.66841118264278]
We introduce a series of challenging chaotic and extended-body systems to push the limits of current approaches.
Our experiments show that Cartesian coordinates with explicit constraints lead to a 100x improvement in accuracy and data efficiency.
arXiv Detail & Related papers (2020-10-26T13:35:16Z) - Multipole Graph Neural Operator for Parametric Partial Differential
Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data in the desired structure for neural networks.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences of its use.