Going Deeper into Permutation-Sensitive Graph Neural Networks
- URL: http://arxiv.org/abs/2205.14368v1
- Date: Sat, 28 May 2022 08:22:10 GMT
- Title: Going Deeper into Permutation-Sensitive Graph Neural Networks
- Authors: Zhongyu Huang, Yingheng Wang, Chaozhuo Li, Huiguang He
- Abstract summary: We devise an efficient permutation-sensitive aggregation mechanism via permutation groups.
We prove that our approach is strictly more powerful than the 2-dimensional Weisfeiler-Lehman (2-WL) graph isomorphism test.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The invariance to permutations of the adjacency matrix, i.e., graph
isomorphism, is an overarching requirement for Graph Neural Networks (GNNs).
Conventionally, this prerequisite is satisfied by permutation-invariant operations
when aggregating messages. However, such invariant aggregation may ignore the
relationships among neighboring nodes, thereby hindering
the expressivity of GNNs. In this work, we devise an efficient
permutation-sensitive aggregation mechanism via permutation groups, capturing
pairwise correlations between neighboring nodes. We prove that our approach is
strictly more powerful than the 2-dimensional Weisfeiler-Lehman (2-WL) graph
isomorphism test and no less powerful than the 3-WL test. Moreover, we prove
that our approach achieves linear sampling complexity. Comprehensive
experiments on multiple synthetic and real-world datasets demonstrate the
superiority of our model.
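The core idea of the abstract can be illustrated with a minimal sketch. This is a hypothetical simplification, not the paper's exact mechanism: the RNN-style update, the weight matrices `W`/`U`, and the choice of the cyclic group as the permutation group are all illustrative. Running an order-sensitive aggregator over each permutation in a small permutation group and averaging captures pairwise orderings of neighbors while remaining invariant to that group:

```python
import numpy as np

def perm_sensitive_aggregate(h_neighbors, W, U, group_perms):
    """Run an order-sensitive RNN over each permutation in a small
    permutation group, then average so the result is invariant to
    any reordering contained in that group."""
    outs = []
    for perm in group_perms:
        s = np.zeros(W.shape[0])
        for idx in perm:                     # order of neighbors matters here
            s = np.tanh(W @ s + U @ h_neighbors[idx])
        outs.append(s)
    return np.mean(outs, axis=0)             # average over the group

rng = np.random.default_rng(0)
H = rng.normal(size=(4, 8))                  # 4 neighbors, 8-dim features
W = 0.1 * rng.normal(size=(8, 8))
U = 0.1 * rng.normal(size=(8, 8))
n = H.shape[0]
# cyclic group: all rotations of the neighbor ordering
perms = [[(i + r) % n for i in range(n)] for r in range(n)]
out = perm_sensitive_aggregate(H, W, U, perms)
```

Because the sequences visited over all rotations form the same multiset, the output is unchanged if the neighbor list is cyclically shifted, while arbitrary orderings within one pass still influence the recurrent state.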
Related papers
- Degree-based stratification of nodes in Graph Neural Networks [66.17149106033126]
We modify the Graph Neural Network (GNN) architecture so that the weight matrices are learned separately for the nodes in each degree-based group.
This simple-to-implement modification seems to improve performance across datasets and GNN methods.
arXiv Detail & Related papers (2023-12-16T14:09:23Z)
- Discrete Graph Auto-Encoder [52.50288418639075]
We introduce a new framework named Discrete Graph Auto-Encoder (DGAE).
We first use a permutation-equivariant auto-encoder to convert graphs into sets of discrete latent node representations.
In the second step, we sort the sets of discrete latent representations and learn their distribution with a specifically designed auto-regressive model.
arXiv Detail & Related papers (2023-06-13T12:40:39Z)
- Learnable Commutative Monoids for Graph Neural Networks [0.0]
Graph neural networks (GNNs) are highly sensitive to the choice of aggregation function.
We show that GNNs equipped with recurrent aggregators are competitive with state-of-the-art permutation-invariant aggregators.
We propose a framework for constructing learnable, commutative, associative binary operators.
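A toy sketch of the aggregation idea, with hypothetical names and parameterization: a learnable binary operator made commutative by construction (it sees only the symmetric combination `a + b`) reduces a neighbor set over a balanced binary tree. Note this sketch enforces only commutativity; the framework described in the abstract also requires associativity for the result to be independent of tree shape.

```python
import numpy as np

def commutative_op(a, b, W):
    """Hypothetical learnable binary operator; the symmetric input
    a + b makes it commutative by construction."""
    return np.tanh(W @ (a + b))

def tree_reduce(xs, W):
    """Reduce a set of vectors with the binary operator over a
    balanced binary tree: O(log n) sequential depth instead of O(n)."""
    xs = list(xs)
    while len(xs) > 1:
        nxt = [commutative_op(xs[i], xs[i + 1], W)
               for i in range(0, len(xs) - 1, 2)]
        if len(xs) % 2:          # odd element carries to the next level
            nxt.append(xs[-1])
        xs = nxt
    return xs[0]

rng = np.random.default_rng(1)
W = 0.1 * rng.normal(size=(5, 5))
xs = [rng.normal(size=5) for _ in range(6)]
agg = tree_reduce(xs, W)
```

Swapping the two elements of any leaf pair leaves the result unchanged, which is exactly the commutativity the operator guarantees.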
arXiv Detail & Related papers (2022-12-16T15:43:41Z)
- EDEN: A Plug-in Equivariant Distance Encoding to Beyond the 1-WL Test [17.027460848621434]
We propose a plug-in Equivariant Distance ENcoding (EDEN) for message-passing neural networks (MPNNs).
EDEN is derived from a series of interpretable transformations on the graph's distance matrix.
Experiments on real-world datasets show that combining EDEN with conventional GNNs surpasses recent advanced GNNs.
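The raw material of such an encoding is the graph's all-pairs distance matrix, which can be sketched as follows (Floyd-Warshall is chosen here for brevity; the transformations EDEN applies on top of this matrix are not reproduced):

```python
import numpy as np

def distance_matrix(A):
    """All-pairs shortest-path distances via Floyd-Warshall,
    starting from unit edge lengths on the adjacency matrix."""
    n = A.shape[0]
    D = np.where(A > 0, 1.0, np.inf)   # direct edges cost 1, else unreachable
    np.fill_diagonal(D, 0.0)
    for k in range(n):
        # relax every pair (i, j) through intermediate node k
        D = np.minimum(D, D[:, [k]] + D[[k], :])
    return D

A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)   # path graph 0-1-2-3
D = distance_matrix(A)
```

The resulting matrix is symmetric and permutation-equivariant: relabeling the nodes permutes its rows and columns the same way.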
arXiv Detail & Related papers (2022-11-19T16:36:28Z)
- E(n) Equivariant Graph Neural Networks [86.75170631724548]
This paper introduces E(n)-Equivariant Graph Neural Networks (EGNNs), a new class of graph neural networks equivariant to rotations, translations, reflections, and permutations.
In contrast with existing methods, our work does not require computationally expensive higher-order representations in intermediate layers while still achieving competitive or better performance.
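A simplified sketch of the kind of layer this describes (the weight shapes and nonlinearities here are illustrative, not the published parameterization): messages depend only on invariants such as features and squared distances, and coordinates move along relative vectors, which keeps the update equivariant to rotations and translations.

```python
import numpy as np

def egnn_layer(h, x, Wm, Wx, Wh):
    """One sketch EGNN-style layer: invariant messages, equivariant
    coordinate updates along relative position vectors."""
    n, d = h.shape
    k = Wm.shape[0]
    m = np.zeros((n, n, k))
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            d2 = np.sum((x[i] - x[j]) ** 2)            # rotation-invariant
            m[i, j] = np.tanh(Wm @ np.concatenate([h[i], h[j], [d2]]))
    x_new = x.copy()
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            # move along the relative vector, scaled by an invariant weight
            x_new[i] = x_new[i] + (x[i] - x[j]) * float(Wx @ m[i, j]) / (n - 1)
    msg = m.sum(axis=1)                                # aggregate messages
    h_new = np.tanh(np.concatenate([h, msg], axis=1) @ Wh.T)
    return h_new, x_new

rng = np.random.default_rng(2)
n, d, k = 5, 4, 6
h = rng.normal(size=(n, d))
x = rng.normal(size=(n, 3))
Wm = 0.1 * rng.normal(size=(k, 2 * d + 1))
Wx = 0.1 * rng.normal(size=k)
Wh = 0.1 * rng.normal(size=(d, d + k))
h1, x1 = egnn_layer(h, x, Wm, Wx, Wh)
```

Rotating the input coordinates rotates the output coordinates identically and leaves the node features unchanged, which is the equivariance property the paper's title refers to.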
arXiv Detail & Related papers (2021-02-19T10:25:33Z)
- A Unified View on Graph Neural Networks as Graph Signal Denoising [49.980783124401555]
Graph Neural Networks (GNNs) have risen to prominence in learning representations for graph structured data.
In this work, we establish mathematically that the aggregation processes in a group of representative GNN models can be regarded as solving a graph denoising problem.
We instantiate a novel GNN model, ADA-UGNN, derived from UGNN, to handle graphs with adaptive smoothness across nodes.
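The denoising view can be illustrated on a toy graph (the regularization weight and step size below are illustrative, not values from the paper): one gradient step on the denoising objective, started from the observed signal, is exactly a Laplacian smoothing step of the kind GCN-style aggregation performs.

```python
import numpy as np

# Toy path graph on 3 nodes with a "noisy" signal on the endpoints.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A          # combinatorial graph Laplacian
X = np.array([[1.0], [0.0], [1.0]])     # observed node signal

# One gradient step on  min_F ||F - X||_F^2 + c * tr(F^T L F),
# starting from F = X: the ||F - X||^2 term has zero gradient there,
# so the step is pure Laplacian smoothing.
c, eta = 1.0, 0.25
F = X - eta * (2 * c * L @ X)
```

After the step the signal is pulled toward neighboring values, i.e. the smoothness penalty tr(F^T L F) = sum over edges of (f_i - f_j)^2 decreases, which is the sense in which aggregation "denoises" the graph signal.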
arXiv Detail & Related papers (2020-10-05T04:57:18Z)
- Graph Neural Networks with Heterophily [40.23690407583509]
We propose a novel framework called CPGNN that generalizes GNNs for graphs with either homophily or heterophily.
We show that replacing the compatibility matrix in our framework with the identity matrix (which represents pure homophily) recovers GCN.
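A toy illustration of that special case (the matrices here are hypothetical, not from the paper): propagating class beliefs through a compatibility matrix `H`, where `H[p, q]` is the affinity between classes `p` and `q` of linked nodes, collapses to plain adjacency aggregation when `H` is the identity.

```python
import numpy as np

A = np.array([[0, 1, 1],
              [1, 0, 0],
              [1, 0, 0]], dtype=float)   # toy star graph
B = np.array([[0.9, 0.1],
              [0.2, 0.8],
              [0.7, 0.3]])               # prior class beliefs per node
H = np.eye(2)                            # identity = pure homophily

propagated = A @ B @ H                   # with H = I this is just A @ B,
                                         # i.e. a GCN-style aggregation step
```

Swapping in an off-diagonal `H` (e.g. `[[0, 1], [1, 0]]`) would instead propagate the belief that linked nodes belong to different classes, which is the heterophilous case the framework generalizes to.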
arXiv Detail & Related papers (2020-09-28T18:29:36Z)
- Permutation-equivariant and Proximity-aware Graph Neural Networks with Stochastic Message Passing [88.30867628592112]
Graph neural networks (GNNs) are an emerging class of machine learning models on graphs.
Permutation-equivariance and proximity-awareness are two properties highly desirable for GNNs.
We show that existing GNNs, mostly based on the message-passing mechanism, cannot simultaneously preserve both properties.
To preserve node proximities, we augment existing GNNs with stochastic node representations.
arXiv Detail & Related papers (2020-09-05T16:46:56Z)
- Permutation Invariant Graph Generation via Score-Based Generative Modeling [114.12935776726606]
We propose a permutation invariant approach to modeling graphs, using the recent framework of score-based generative modeling.
In particular, we design a permutation equivariant, multi-channel graph neural network to model the gradient of the data distribution at the input graph.
For graph generation, we find that our learning approach achieves better or comparable results to existing models on benchmark datasets.
arXiv Detail & Related papers (2020-03-02T03:06:14Z)
This list is automatically generated from the titles and abstracts of the papers in this site.