E(n) Equivariant Message Passing Simplicial Networks
- URL: http://arxiv.org/abs/2305.07100v2
- Date: Sun, 22 Oct 2023 16:36:55 GMT
- Title: E(n) Equivariant Message Passing Simplicial Networks
- Authors: Floor Eijkelboom, Rob Hesselink, Erik Bekkers
- Abstract summary: We present $\mathrm{E}(n)$ Equivariant Message Passing Simplicial Networks (EMPSNs).
EMPSNs learn high-dimensional simplex features in graphs (e.g. triangles).
We show that EMPSNs are on par with state-of-the-art approaches for learning on geometric graphs.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper presents $\mathrm{E}(n)$ Equivariant Message Passing Simplicial
Networks (EMPSNs), a novel approach to learning on geometric graphs and point
clouds that is equivariant to rotations, translations, and reflections. EMPSNs
can learn high-dimensional simplex features in graphs (e.g. triangles), and use
the increase of geometric information of higher-dimensional simplices in an
$\mathrm{E}(n)$ equivariant fashion. EMPSNs simultaneously generalize
$\mathrm{E}(n)$ Equivariant Graph Neural Networks to a topologically more
elaborate counterpart and provide an approach for including geometric
information in Message Passing Simplicial Networks. The results indicate that
EMPSNs can leverage the benefits of both approaches, leading to a general
increase in performance when compared to either method. Furthermore, the
results suggest that incorporating geometric information serves as an effective
measure against over-smoothing in message passing networks, especially when
operating on high-dimensional simplicial structures. Last, we show that EMPSNs
are on par with state-of-the-art approaches for learning on geometric graphs.
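The core mechanism described above, messages that depend on coordinates only through E(n)-invariant quantities, with positions updated along difference vectors, can be sketched in a few lines. This is a minimal illustration with hypothetical weight matrices, not the authors' implementation:

```python
import numpy as np

def equivariant_mp_layer(x, h, edges, w_msg, w_pos):
    """One E(n)-equivariant message-passing step (illustrative sketch).

    x : (N, n) node coordinates in R^n
    h : (N, d) invariant node features
    edges : iterable of (i, j) index pairs

    Messages see coordinates only through squared distances, so they are
    invariant to rotations, translations, and reflections; position updates
    are scalar multiples of difference vectors, so they transform
    equivariantly with the input.
    """
    h_new = h.copy()
    x_new = x.copy()
    for i, j in edges:
        diff = x[i] - x[j]                  # rotates with the input
        dist2 = float(diff @ diff)          # E(n)-invariant
        feat = np.concatenate([h[i], h[j], [dist2]])
        m = np.tanh(feat @ w_msg)           # invariant message
        h_new[i] = h_new[i] + m             # invariant feature update
        x_new[i] = x_new[i] + float(m @ w_pos) * diff  # equivariant update
    return x_new, h_new
```

Applying the layer to a rotated and translated copy of the input rotates and translates the output positions identically while leaving the features unchanged, which is the equivariance property the abstract refers to.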
Related papers
- On the Expressive Power of Sparse Geometric MPNNs
We study the expressive power of message-passing neural networks for geometric graphs.
We show that generic pairs of non-isomorphic geometric graphs can be separated by message-passing networks.
(arXiv: 2024-07-02)
- E(n) Equivariant Message Passing Cellular Networks
We introduce E(n) Equivariant Message Passing Cellular Networks (EMPCNs).
EMPCNs are an extension of E(n) Equivariant Graph Networks to CW-complexes.
We show that EMPCNs achieve close to state-of-the-art performance on multiple tasks without the need for steerability.
(arXiv: 2024-06-05)
- A Survey of Geometric Graph Neural Networks: Data Structures, Models and Applications
This paper presents a survey of data structures, models, and applications related to geometric GNNs.
We provide a unified view of existing models from the geometric message passing perspective.
We also summarize the applications as well as the related datasets to facilitate later research for methodology development and experimental evaluation.
(arXiv: 2024-03-01)
- A quantum inspired neural network for geometric modeling
We introduce an innovative equivariant Matrix Product State (MPS)-based message-passing strategy.
Our method effectively models complex many-body relationships, going beyond mean-field approximations.
It seamlessly replaces the standard message-passing and layer-aggregation modules intrinsic to geometric GNNs.
(arXiv: 2024-01-03)
- Simplicial Representation Learning with Neural $k$-Forms
This paper focuses on leveraging geometric information from simplicial complexes embedded in $\mathbb{R}^n$ using node coordinates.
We use differential $k$-forms in $\mathbb{R}^n$ to create representations of simplices, offering interpretability and geometric consistency without message passing.
Our method is efficient, versatile, and applicable to various input complexes, including graphs, simplicial complexes, and cell complexes.
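For the $k = 1$ case, the idea of representing a simplex by integrating a differential form over it can be sketched numerically. The helper below is a hypothetical illustration (midpoint-rule quadrature along a straight edge), not the paper's code:

```python
import numpy as np

def integrate_1_form(omega, a, b, steps=200):
    """Integrate a differential 1-form along the straight edge from a to b.

    omega maps a point p in R^n to its coefficient vector
    (f_1(p), ..., f_n(p)), i.e. omega = sum_k f_k dx_k.
    The edge is parametrised as x(t) = a + t (b - a), t in [0, 1],
    and the integral of f(x(t)) . x'(t) is approximated by the
    midpoint rule with `steps` subintervals.
    """
    ts = (np.arange(steps) + 0.5) / steps           # midpoints in [0, 1]
    pts = a[None, :] + ts[:, None] * (b - a)[None, :]
    vals = np.array([omega(p) for p in pts])        # (steps, n)
    return float(np.mean(vals @ (b - a)))           # mean of f(x(t)) . x'(t)
```

Each learned 1-form then yields one scalar per edge of the embedded complex; stacking several forms gives an edge representation computed without any message passing, which is the point the summary makes.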
(arXiv: 2023-12-13)
- A Hitchhiker's Guide to Geometric GNNs for 3D Atomic Systems
Recent advances in computational modelling of atomic systems represent them as geometric graphs with atoms embedded as nodes in 3D Euclidean space.
Geometric Graph Neural Networks have emerged as the preferred machine learning architecture powering applications ranging from protein structure prediction to molecular simulations and material generation.
This paper provides a comprehensive and self-contained overview of the field of Geometric GNNs for 3D atomic systems.
(arXiv: 2023-12-12)
- Modeling Graphs Beyond Hyperbolic: Graph Neural Networks in Symmetric Positive Definite Matrices
Real-world graph data is characterized by multiple types of geometric and topological features.
We construct graph neural networks that can robustly handle complex graphs.
(arXiv: 2023-06-24)
- Dist2Cycle: A Simplicial Neural Network for Homology Localization
Simplicial complexes can be viewed as high dimensional generalizations of graphs that explicitly encode multi-way ordered relations.
We propose a graph convolutional model for learning functions parametrized by the $k$-homological features of simplicial complexes.
(arXiv: 2021-10-28)
- E(n) Equivariant Graph Neural Networks
This paper introduces a new model to learn graph neural networks equivariant to rotations, translations, reflections, and permutations, called E(n)-Equivariant Graph Neural Networks (EGNNs).
In contrast with existing methods, our work does not require computationally expensive higher-order representations in intermediate layers while it still achieves competitive or better performance.
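For reference, the EGNN layer of the cited paper (Satorras et al., 2021) updates coordinates and features as

```latex
\begin{align*}
m_{ij}    &= \phi_e\big(h_i^l,\, h_j^l,\, \lVert x_i^l - x_j^l\rVert^2,\, a_{ij}\big) \\
x_i^{l+1} &= x_i^l + C \sum_{j \neq i} \big(x_i^l - x_j^l\big)\, \phi_x(m_{ij}) \\
h_i^{l+1} &= \phi_h\Big(h_i^l,\, \sum_{j \neq i} m_{ij}\Big)
\end{align*}
```

where $\phi_e$, $\phi_x$, $\phi_h$ are learned MLPs and $C$ is a normalization constant. Coordinates enter the messages only through the invariant squared distance, and coordinate updates lie in the span of difference vectors, which is what yields E(n) equivariance without higher-order representations.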
(arXiv: 2021-02-19)
- Geometrically Principled Connections in Graph Neural Networks
We argue geometry should remain the primary driving force behind innovation in the emerging field of geometric deep learning.
We relate graph neural networks to widely successful computer graphics and data approximation models: radial basis functions (RBFs).
We introduce affine skip connections, a novel building block formed by combining a fully connected layer with any graph convolution operator.
(arXiv: 2020-04-06)
- Gauge Equivariant Mesh CNNs: Anisotropic convolutions on geometric graphs
A common approach to define convolutions on meshes is to interpret them as a graph and apply graph convolutional networks (GCNs).
We propose Gauge Equivariant Mesh CNNs which generalize GCNs to apply anisotropic gauge equivariant kernels.
Our experiments validate the significantly improved expressivity of the proposed model over conventional GCNs and other methods.
(arXiv: 2020-03-11)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.