Geometric and Physical Quantities improve E(3) Equivariant Message
Passing
- URL: http://arxiv.org/abs/2110.02905v1
- Date: Wed, 6 Oct 2021 16:34:26 GMT
- Title: Geometric and Physical Quantities improve E(3) Equivariant Message
Passing
- Authors: Johannes Brandstetter, Rob Hesselink, Elise van der Pol, Erik Bekkers,
Max Welling
- Abstract summary: We introduce Steerable E(3) Equivariant Graph Neural Networks (SEGNNs) that generalise equivariant graph networks.
This model, composed of steerable MLPs, is able to incorporate geometric and physical information in both the message and update functions.
We demonstrate the effectiveness of our method on several tasks in computational physics and chemistry.
- Score: 59.98327062664975
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Including covariant information, such as position, force, velocity or spin is
important in many tasks in computational physics and chemistry. We introduce
Steerable E(3) Equivariant Graph Neural Networks (SEGNNs) that generalise
equivariant graph networks, such that node and edge attributes are not
restricted to invariant scalars, but can contain covariant information, such as
vectors or tensors. This model, composed of steerable MLPs, is able to
incorporate geometric and physical information in both the message and update
functions. Through the definition of steerable node attributes, the MLPs
provide a new class of activation functions for general use with steerable
feature fields. We discuss ours and related work through the lens of
equivariant non-linear convolutions, which further allows us to pin-point the
successful components of SEGNNs: non-linear message aggregation improves upon
classic linear (steerable) point convolutions; steerable messages improve upon
recent equivariant graph networks that send invariant messages. We demonstrate
the effectiveness of our method on several tasks in computational physics and
chemistry and provide extensive ablation studies.
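The defining property of the covariant features described above can be checked numerically: invariant scalars derived from distances may be transformed freely, while vector outputs must rotate with the inputs. The sketch below is a toy stand-in for a steerable message function (the weights and combination rule are illustrative assumptions, not the SEGNN architecture), verifying equivariance under a random orthogonal transformation:

```python
import numpy as np

def message(pos_i, pos_j, vec_j):
    # Toy equivariant message: invariant scalar weights (functions of the
    # distance) scale the relative position and the neighbour's vector
    # feature. Hypothetical stand-in for a steerable MLP.
    rel = pos_i - pos_j
    d = np.linalg.norm(rel)
    w1, w2 = np.tanh(d), np.exp(-d)   # invariant scalar weights
    return w1 * rel + w2 * vec_j      # covariant (vector) output

rng = np.random.default_rng(0)
pos_i, pos_j, vec_j = rng.normal(size=(3, 3))

Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))   # random orthogonal matrix

m = message(pos_i, pos_j, vec_j)
m_rot = message(Q @ pos_i, Q @ pos_j, Q @ vec_j)
assert np.allclose(Q @ m, m_rot)   # transforming inputs transforms the message
```

Because the weights depend only on the distance, the message transforms exactly like its vector inputs under any element of O(3).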
Related papers
- Relaxing Continuous Constraints of Equivariant Graph Neural Networks for Physical Dynamics Learning [39.25135680793105]
We propose a general Discrete Equivariant Graph Neural Network (DEGNN) that guarantees equivariance to a given discrete point group.
Specifically, we show that such discrete equivariant message passing could be constructed by transforming geometric features into permutation-invariant embeddings.
We show that DEGNN is data efficient, learning with less data, and can generalize across scenarios such as unobserved orientation.
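The target property here, equivariance to a given discrete point group, can be illustrated with a small example. The construction below is not the paper's permutation-invariant embedding; it is a generic symmetrization over the cyclic group C4 of 90-degree planar rotations (all names are illustrative assumptions):

```python
import numpy as np

# The cyclic group C4 of 90-degree rotations in the plane.
def rot(k):
    c, s = np.cos(k * np.pi / 2), np.sin(k * np.pi / 2)
    return np.array([[c, -s], [s, c]])

C4 = [rot(k) for k in range(4)]

def backbone(x):
    # Arbitrary non-equivariant map R^2 -> R^2.
    return np.array([x[0] ** 2, np.sin(x[1])])

def sym(x):
    # Symmetrizing over the group makes the map equivariant to every
    # element of C4: sym(g @ x) == g @ sym(x).
    return np.mean([g.T @ backbone(g @ x) for g in C4], axis=0)

x = np.array([0.3, -1.2])
g = C4[1]
assert np.allclose(sym(g @ x), g @ sym(x))
```

For a discrete group the symmetrization is an exact finite sum, which is what makes discrete equivariance cheap compared to continuous groups.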
arXiv Detail & Related papers (2024-06-24T03:37:51Z) - A Characterization Theorem for Equivariant Networks with Point-wise
Activations [13.00676132572457]
We prove that rotation-equivariant networks with point-wise activations can only be invariant, as is the case for any network equivariant with respect to a connected compact group.
We show that feature spaces of disentangled steerable convolutional neural networks are trivial representations.
arXiv Detail & Related papers (2024-01-17T14:30:46Z) - An Exploration of Conditioning Methods in Graph Neural Networks [8.532288965425805]
In computational tasks in physics and chemistry, the use of edge attributes such as relative position or distance has proved essential.
We consider three types of conditioning: weak, strong, and pure, which respectively relate to concatenation-based conditioning, gating, and transformations that are causally dependent on the attributes.
This categorization provides a unifying viewpoint on different classes of GNNs, from separable convolutions to various forms of message passing networks.
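The three conditioning types can be sketched as follows; the weight shapes and nonlinearities are illustrative assumptions, not the paper's exact parametrization:

```python
import numpy as np

rng = np.random.default_rng(1)
h = rng.normal(size=4)            # node features
a = rng.normal(size=2)            # edge/node attribute

W  = rng.normal(size=(4, 6))      # acts on the concatenation [h; a]
Wg = rng.normal(size=(4, 2))      # gate weights
W1, W2 = rng.normal(size=(2, 4, 4))

def weak(h, a):
    # Weak: the attribute is simply concatenated to the features.
    return np.tanh(W @ np.concatenate([h, a]))

def strong(h, a):
    # Strong: an attribute-derived gate multiplies the features.
    gate = 1.0 / (1.0 + np.exp(-(Wg @ a)))   # sigmoid gate
    return gate * np.tanh(W @ np.concatenate([h, a]))

def pure(h, a):
    # Pure: the linear transformation itself is a function of the attribute.
    return np.tanh((a[0] * W1 + a[1] * W2) @ h)

for f in (weak, strong, pure):
    assert f(h, a).shape == (4,)
```

Pure conditioning subsumes the other two in expressive power, since the whole map, not just an additive or multiplicative term, depends on the attribute.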
arXiv Detail & Related papers (2023-05-03T07:14:12Z) - Equivariant Graph Mechanics Networks with Constraints [83.38709956935095]
We propose Graph Mechanics Network (GMN) which is efficient, equivariant and constraint-aware.
GMN uses generalized coordinates to represent the forward kinematics information (positions and velocities) of a structural object.
Extensive experiments support the advantages of GMN compared to the state-of-the-art GNNs in terms of prediction accuracy, constraint satisfaction and data efficiency.
arXiv Detail & Related papers (2022-03-12T14:22:14Z) - Equivariant vector field network for many-body system modeling [65.22203086172019]
The Equivariant Vector Field Network (EVFN) is built on a novel equivariant basis and the associated scalarization and vectorization layers.
We evaluate our method on predicting trajectories of simulated Newton mechanics systems with both full and partially observed data.
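The scalarization/vectorization idea can be sketched in a few lines: project vectors onto an equivariant basis to obtain invariant scalars, transform those freely, then expand back along the basis. This toy version (the basis construction and nonlinearity are assumptions, not the EVFN layers) is equivariant to proper rotations:

```python
import numpy as np

def equiv_basis(r1, r2):
    # Equivariant frame built from two relative-position vectors.
    # The cross product requires proper rotations (det = +1).
    return np.stack([r1, r2, np.cross(r1, r2)])

def evfn_like(v, r1, r2):
    # Scalarize v against the basis (rotation-invariant coefficients),
    # apply any nonlinearity, then vectorize back (equivariant output).
    B = equiv_basis(r1, r2)
    s = np.tanh(B @ v)            # invariant scalars, safe to transform
    return B.T @ s                # equivariant vector output

rng = np.random.default_rng(4)
v, r1, r2 = rng.normal(size=(3, 3))
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
Q *= np.sign(np.linalg.det(Q))    # force a proper rotation

out = evfn_like(v, r1, r2)
out_rot = evfn_like(Q @ v, Q @ r1, Q @ r2)
assert np.allclose(Q @ out, out_rot)
```

Because the nonlinearity only ever touches invariant scalars, equivariance holds regardless of which nonlinearity is chosen.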
arXiv Detail & Related papers (2021-10-26T14:26:25Z) - Frame Averaging for Invariant and Equivariant Network Design [50.87023773850824]
We introduce Frame Averaging (FA), a framework for adapting known (backbone) architectures to become invariant or equivariant to new symmetry types.
We show that FA-based models have maximal expressive power in a broad setting.
We propose a new class of universal Graph Neural Networks (GNNs), universal Euclidean motion invariant point cloud networks, and Euclidean motion invariant Message Passing (MP) GNNs.
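The averaging principle behind FA can be demonstrated with a finite frame. Real FA uses an input-dependent frame (for example PCA-based) to keep the average small; here, for illustration only, a non-invariant backbone is averaged over the full group of coordinate sign flips:

```python
import itertools
import numpy as np

def backbone(x):
    # Arbitrary non-invariant function of a point cloud of shape (n, 3).
    return np.sum(x[:, 0] * np.sin(x[:, 1]) + x[:, 2] ** 3)

# A finite frame: the group of coordinate sign flips, a subgroup of O(3).
flips = [np.diag(s) for s in itertools.product([1.0, -1.0], repeat=3)]

def averaged(x):
    # Averaging the backbone over the frame makes it invariant
    # to every element of the frame.
    return np.mean([backbone(x @ g) for g in flips])

rng = np.random.default_rng(2)
x = rng.normal(size=(5, 3))
g = flips[3]                                      # flips the y and z axes
assert not np.isclose(backbone(x), backbone(x @ g))   # backbone: not invariant
assert np.isclose(averaged(x), averaged(x @ g))       # averaged: invariant
```

The same averaging with the group action applied to the output instead yields equivariant rather than invariant models.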
arXiv Detail & Related papers (2021-10-07T11:05:23Z) - Equivariant Point Network for 3D Point Cloud Analysis [17.689949017410836]
We propose an effective and practical SE(3) (3D translation and rotation) equivariant network for point cloud analysis.
First, we present SE(3) separable point convolution, a novel framework that breaks down the 6D convolution into two separable convolutional operators.
Second, we introduce an attention layer to effectively harness the expressiveness of the equivariant features.
arXiv Detail & Related papers (2021-03-25T21:57:10Z) - E(n) Equivariant Graph Neural Networks [86.75170631724548]
This paper introduces a new model to learn graph neural networks equivariant to rotations, translations, reflections and permutations, called E(n)-Equivariant Graph Neural Networks (EGNNs).
In contrast with existing methods, our work does not require computationally expensive higher-order representations in intermediate layers while it still achieves competitive or better performance.
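The core EGNN idea, invariant messages computed from squared distances and coordinate updates directed along relative positions, can be sketched as follows; the toy message function stands in for the paper's learned MLPs:

```python
import numpy as np

def egnn_layer(h, x, phi=np.tanh):
    # One simplified EGNN-style step on a fully connected graph:
    # messages depend only on invariant quantities (scalar features and
    # squared distances), while positions are updated along relative
    # positions, which preserves E(n) equivariance.
    n = len(x)
    h_new, x_new = h.copy(), x.copy()
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            rel = x[i] - x[j]
            m = phi(h[i] + h[j] + rel @ rel)   # invariant message
            h_new[i] += m / (n - 1)            # invariant feature update
            x_new[i] += rel * m / (n - 1)      # equivariant coordinate update
    return h_new, x_new

rng = np.random.default_rng(3)
h = rng.normal(size=4)          # scalar node features
x = rng.normal(size=(4, 3))     # node coordinates
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
t = rng.normal(size=3)

h1, x1 = egnn_layer(h, x)
h2, x2 = egnn_layer(h, x @ Q.T + t)
assert np.allclose(h1, h2)              # features are invariant
assert np.allclose(x1 @ Q.T + t, x2)    # coordinates are equivariant
```

No higher-order representations appear anywhere: every learned quantity is a scalar, which is exactly the efficiency trade-off the summary above points to.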
arXiv Detail & Related papers (2021-02-19T10:25:33Z) - Building powerful and equivariant graph neural networks with structural
message-passing [74.93169425144755]
We propose a powerful and equivariant message-passing framework based on two ideas.
First, we propagate a one-hot encoding of the nodes, in addition to the features, in order to learn a local context matrix around each node.
Second, we propose methods for the parametrization of the message and update functions that ensure permutation equivariance.
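The one-hot propagation idea can be sketched as follows; the update rule is a simplified illustration, not the paper's parametrization:

```python
import numpy as np

def node_context(A, steps=2):
    # Propagate one-hot node identities over the edges: C starts as the
    # identity, and each step aggregates the neighbours' encodings, so
    # row i becomes node i's local context over node identities.
    C = np.eye(len(A))
    for _ in range(steps):
        C = A @ C + C     # aggregate neighbours, keep self
    return C

# Permutation equivariance: relabelling the nodes permutes the context
# matrix consistently, node_context(P A P^T) == P node_context(A) P^T.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
P = np.eye(4)[[2, 0, 3, 1]]               # a permutation matrix
assert np.allclose(node_context(P @ A @ P.T), P @ node_context(A) @ P.T)
```

Each row of the result encodes walk counts from that node, which is the local structural context the summary refers to.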
arXiv Detail & Related papers (2020-06-26T17:15:16Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.