E(n) Equivariant Message Passing Cellular Networks
- URL: http://arxiv.org/abs/2406.03145v3
- Date: Thu, 18 Jul 2024 08:21:34 GMT
- Title: E(n) Equivariant Message Passing Cellular Networks
- Authors: Veljko Kovač, Erik J. Bekkers, Pietro Liò, Floor Eijkelboom
- Abstract summary: We introduce E(n) Equivariant Message Passing Cellular Networks (EMPCNs).
EMPCNs are an extension of E(n) Equivariant Graph Neural Networks to CW-complexes.
We show that EMPCNs achieve close to state-of-the-art performance on multiple tasks without the need for steerability.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper introduces E(n) Equivariant Message Passing Cellular Networks (EMPCNs), an extension of E(n) Equivariant Graph Neural Networks to CW-complexes. Our approach addresses two aspects of geometric message passing networks: 1) enhancing their expressiveness by incorporating arbitrary cells, and 2) achieving this in a computationally efficient way with a decoupled EMPCNs technique. We demonstrate that EMPCNs achieve close to state-of-the-art performance on multiple tasks without the need for steerability, including many-body predictions and motion capture. Moreover, ablation studies confirm that decoupled EMPCNs exhibit stronger generalization capabilities than their non-topologically informed counterparts. These findings show that EMPCNs can be used as a scalable and expressive framework for higher-order message passing in geometric and topological graphs.
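The E(n) equivariant message passing that EMPCNs build on can be illustrated with a minimal numpy sketch. This is not the paper's implementation: the learned MLPs of the real model are replaced by fixed toy functions, and only node-level (0-cell) messages are shown. The point is the mechanism — features are updated from E(n)-invariant scalars (squared distances), while positions are updated along edge directions, which keeps the layer equivariant to rotations, reflections, and translations.

```python
import numpy as np

def egnn_layer(pos, feat, edges):
    """One E(n)-equivariant message passing step (EGNN-style sketch).

    The learned message/update MLPs of the real model are replaced by
    fixed toy functions; shapes and names here are assumptions.
    """
    new_pos = pos.astype(float).copy()
    agg = np.zeros_like(feat, dtype=float)
    for i, j in edges:
        d2 = np.sum((pos[i] - pos[j]) ** 2)   # E(n)-invariant scalar
        m = np.tanh(feat[i] + feat[j] + d2)   # stand-in for the message MLP
        agg[i] += m
        # Position update along the edge direction is E(n)-equivariant:
        new_pos[i] += (pos[i] - pos[j]) * np.mean(m)
    new_feat = feat + agg                      # stand-in for the update MLP
    return new_pos, new_feat
```

Because distances are invariant and position updates are linear combinations of edge vectors, applying any rotation/reflection plus translation to the input positions yields exactly the transformed output positions, with feature outputs unchanged.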
Related papers
- Mew: Multiplexed Immunofluorescence Image Analysis through an Efficient Multiplex Network [84.88767228835928]
We introduce Mew, a novel framework designed to efficiently process mIF images through the lens of a multiplex network.
Mew innovatively constructs a multiplex network comprising two distinct layers: a Voronoi network for geometric information and a Cell-type network for capturing cell-wise homogeneity.
This framework is equipped with a scalable and efficient Graph Neural Network (GNN) capable of processing the entire graph during training.
arXiv Detail & Related papers (2024-07-25T08:22:30Z) - Topological Neural Networks go Persistent, Equivariant, and Continuous [6.314000948709255]
We introduce TopNets as a broad framework that subsumes and unifies various methods in the intersection of GNNs/TNNs and PH.
TopNets achieve strong performance across diverse tasks, including antibody design, molecular dynamics simulation, and drug property prediction.
arXiv Detail & Related papers (2024-06-05T11:56:54Z) - E(n) Equivariant Topological Neural Networks [10.603892843083173]
Graph neural networks excel at modeling pairwise interactions, but they cannot flexibly accommodate higher-order interactions and features.
Topological deep learning (TDL) has emerged recently as a promising tool for addressing this issue.
This paper introduces E(n)-Equivariant Topological Neural Networks (ETNNs).
ETNNs incorporate geometric node features while respecting rotation, reflection, and translation.
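A model is E(n)-invariant by construction if it consumes only scalars that are unchanged by rotations, reflections, and translations, such as pairwise squared distances within a cell. The sketch below (an illustration of this general principle, not ETNNs' actual feature set) computes such a scalar summary for an arbitrary cell.

```python
import numpy as np
from itertools import combinations

def cell_invariants(pos, cell):
    """Sorted pairwise squared distances between the nodes of a cell.

    These scalars are unchanged by any rotation, reflection, or
    translation of the positions, so any network consuming them is
    E(n)-invariant by construction (a generic sketch, not ETNNs' API).
    """
    return sorted(
        float(np.sum((pos[a] - pos[b]) ** 2))
        for a, b in combinations(cell, 2)
    )
```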
arXiv Detail & Related papers (2024-05-24T10:55:38Z) - E(n) Equivariant Message Passing Simplicial Networks [1.6243562700235228]
We present $\mathrm{E}(n)$ Equivariant Message Passing Simplicial Networks (EMPSNs).
EMPSNs learn features on high-dimensional simplices in graphs (e.g., triangles).
We show that EMPSNs are on par with state-of-the-art approaches for learning on geometric graphs.
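Working with simplices first requires lifting a graph to its higher-order cells. A common lift (an assumption here, not necessarily the paper's exact procedure) treats each 3-clique as a 2-simplex and attaches an E(n)-invariant geometric feature to it; the real model learns such features rather than fixing them.

```python
import numpy as np
from itertools import combinations

def lift_to_triangles(pos, edges):
    """Lift a geometric graph to its 2-simplices (triangles) and attach
    an E(n)-invariant feature (the triangle's area) to each.

    A sketch of clique-based simplex lifting; names are assumptions.
    """
    adj = {i: set() for i in range(len(pos))}
    for i, j in edges:
        adj[i].add(j)
        adj[j].add(i)
    triangles = []
    for i, j, k in combinations(range(len(pos)), 3):
        if j in adj[i] and k in adj[i] and k in adj[j]:
            a, b = pos[j] - pos[i], pos[k] - pos[i]
            # The Gram determinant gives the area in any ambient dimension n
            gram = np.array([[a @ a, a @ b], [a @ b, b @ b]])
            area = 0.5 * np.sqrt(max(np.linalg.det(gram), 0.0))
            triangles.append(((i, j, k), area))
    return triangles
```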
arXiv Detail & Related papers (2023-05-11T19:10:26Z) - Relation Embedding based Graph Neural Networks for Handling Heterogeneous Graph [58.99478502486377]
We propose a simple yet efficient framework to make the homogeneous GNNs have adequate ability to handle heterogeneous graphs.
Specifically, we propose Relation Embedding based Graph Neural Networks (RE-GNNs), which employ only one parameter per relation to embed the importance of edge type relations and self-loop connections.
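The one-parameter-per-relation idea can be sketched in a few lines: each edge type contributes messages scaled by a single learned scalar, plus a weighted self-loop. This is an illustrative sketch of the mechanism described in the abstract; the function name, shapes, and the absence of the usual feature transformation are assumptions, not RE-GNNs' actual API.

```python
import numpy as np

def re_gnn_layer(feat, typed_edges, rel_weight, self_weight):
    """Aggregate neighbours with one scalar weight per edge type plus a
    weighted self-loop (a sketch of the one-parameter-per-relation idea;
    names and shapes are assumptions)."""
    out = self_weight * feat.astype(float)
    for i, j, r in typed_edges:   # message from j to i via relation r
        out[i] += rel_weight[r] * feat[j]
    return out
```

In a full model this layer would be composed with a shared feature transformation, so the per-relation scalars only encode the *importance* of each edge type, which is what keeps the parameter count per relation at one.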
arXiv Detail & Related papers (2022-09-23T05:24:18Z) - Equivariant Graph Hierarchy-Based Neural Networks [53.60804845045526]
We propose Equivariant Hierarchy-based Graph Networks (EGHNs).
EGHNs consist of three key components: generalized Equivariant Matrix Message Passing (EMMP), E-Pool, and E-UpPool.
Extensive experimental evaluations verify the effectiveness of EGHNs on several applications, including multi-object dynamics simulation, motion capture, and protein dynamics modeling.
arXiv Detail & Related papers (2022-02-22T03:11:47Z) - Lightweight, Dynamic Graph Convolutional Networks for AMR-to-Text Generation [56.73834525802723]
We propose Lightweight Dynamic Graph Convolutional Networks (LDGCNs).
LDGCNs capture richer non-local interactions by synthesizing higher order information from the input graphs.
We develop two novel parameter-saving strategies, based on group graph convolutions and weight-tied convolutions, to reduce memory usage and model complexity.
arXiv Detail & Related papers (2020-10-09T06:03:46Z) - Permutation-equivariant and Proximity-aware Graph Neural Networks with Stochastic Message Passing [88.30867628592112]
Graph neural networks (GNNs) are emerging machine learning models on graphs.
Permutation-equivariance and proximity-awareness are two important properties highly desirable for GNNs.
We show that existing GNNs, mostly based on the message-passing mechanism, cannot simultaneously preserve the two properties.
In order to preserve node proximities, we augment the existing GNNs with node representations.
arXiv Detail & Related papers (2020-09-05T16:46:56Z) - Interpretable and Efficient Heterogeneous Graph Convolutional Network [27.316334213279973]
We propose an interpretable and efficient Heterogeneous Graph Convolutional Network (ie-HGCN) to learn the representations of objects in Heterogeneous Information Networks (HINs).
ie-HGCN can automatically extract useful meta-paths for each object from all possible meta-paths within a length limit.
It can also reduce the computational cost by avoiding intermediate HIN transformation and neighborhood attention.
arXiv Detail & Related papers (2020-05-27T06:06:00Z) - Multi-View Graph Neural Networks for Molecular Property Prediction [67.54644592806876]
We present Multi-View Graph Neural Network (MV-GNN), a multi-view message passing architecture.
In MV-GNN, we introduce a shared self-attentive readout component and disagreement loss to stabilize the training process.
We further boost the expressive power of MV-GNN by proposing a cross-dependent message passing scheme.
arXiv Detail & Related papers (2020-05-17T04:46:07Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.