EDEN: A Plug-in Equivariant Distance Encoding to Beyond the 1-WL Test
- URL: http://arxiv.org/abs/2211.10739v1
- Date: Sat, 19 Nov 2022 16:36:28 GMT
- Title: EDEN: A Plug-in Equivariant Distance Encoding to Beyond the 1-WL Test
- Authors: Chang Liu, Yuwen Yang, Yue Ding, Hongtao Lu
- Abstract summary: We propose a plug-in Equivariant Distance ENcoding (EDEN) for message-passing graph neural networks (MPNNs)
EDEN is derived from a series of interpretable transformations on the graph's distance matrix.
Experiments on real-world datasets show that combining EDEN with conventional GNNs surpasses recent advanced GNNs.
- Score: 17.027460848621434
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The message-passing scheme is the core of graph representation learning.
While most existing message-passing graph neural networks (MPNNs) are
permutation-invariant in graph-level representation learning and
permutation-equivariant in node- and edge-level representation learning, their
expressive power is commonly limited by the 1-Weisfeiler-Lehman (1-WL) graph
isomorphism test. Recently proposed expressive graph neural networks (GNNs)
with specially designed complex message-passing mechanisms are not practical.
To bridge the gap, we propose a plug-in Equivariant Distance ENcoding (EDEN)
for MPNNs. EDEN is derived from a series of interpretable transformations on
the graph's distance matrix. We theoretically prove that EDEN is
permutation-equivariant for all levels of graph representation learning, and we
empirically illustrate that EDEN's expressive power can reach up to the 3-WL
test. Extensive experiments on real-world datasets show that combining EDEN
with conventional GNNs surpasses recent advanced GNNs.
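The abstract does not spell out EDEN's exact transformations, but the key ingredients it names (the graph's distance matrix and permutation-equivariance) can be illustrated with a minimal sketch. The function names below are illustrative, not from the paper; sorting each row of the shortest-path distance matrix is one simple way to obtain a node encoding that commutes with node relabeling:

```python
from collections import deque

def distance_matrix(adj):
    """All-pairs shortest-path distances via BFS (unweighted graph).
    adj: dict mapping node index -> list of neighbor indices."""
    n = len(adj)
    dist = [[float("inf")] * n for _ in range(n)]
    for s in range(n):
        dist[s][s] = 0
        q = deque([s])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if dist[s][v] == float("inf"):
                    dist[s][v] = dist[s][u] + 1
                    q.append(v)
    return dist

def equivariant_distance_features(adj):
    """Per-node feature: the sorted multiset of distances to all nodes.
    Relabeling the nodes permutes the rows of the distance matrix (and
    the entries within each row), so the sorted rows are permuted the
    same way: the map is permutation-equivariant at the node level."""
    dist = distance_matrix(adj)
    return [sorted(row) for row in dist]

# 4-cycle: every node sees the same distance multiset {0, 1, 1, 2}
adj = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
print(equivariant_distance_features(adj))
```

Features like these can simply be concatenated to the input node features of a conventional MPNN, which is what "plug-in" suggests here.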
Related papers
- Generalization Limits of Graph Neural Networks in Identity Effects Learning [12.302336258860116]
Graph Neural Networks (GNNs) have emerged as a powerful tool for data-driven learning on various graph domains.
We establish new generalization properties and fundamental limits of GNNs in the context of learning so-called identity effects.
Our study is motivated by the need to understand the capabilities of GNNs when performing simple cognitive tasks.
arXiv Detail & Related papers (2023-06-30T20:56:38Z)
- Seq-HGNN: Learning Sequential Node Representation on Heterogeneous Graph [57.2953563124339]
We propose a novel heterogeneous graph neural network with sequential node representation, namely Seq-HGNN.
We conduct extensive experiments on four widely used datasets from Heterogeneous Graph Benchmark (HGB) and Open Graph Benchmark (OGB)
arXiv Detail & Related papers (2023-05-18T07:27:18Z)
- Going Deeper into Permutation-Sensitive Graph Neural Networks [6.685139672294716]
We devise an efficient permutation-sensitive aggregation mechanism via permutation groups.
We prove that our approach is strictly more powerful than the 2-dimensional Weisfeiler-Lehman (2-WL) graph isomorphism test.
arXiv Detail & Related papers (2022-05-28T08:22:10Z)
- Representation Power of Graph Neural Networks: Improved Expressivity via Algebraic Analysis [124.97061497512804]
We show that standard Graph Neural Networks (GNNs) produce more discriminative representations than the Weisfeiler-Lehman (WL) algorithm.
We also show that simple convolutional architectures with white inputs produce equivariant features that count the closed paths in the graph.
arXiv Detail & Related papers (2022-05-19T18:40:25Z)
- Graph Representation Learning with Individualization and Refinement [19.436520792345064]
Graph Neural Networks (GNNs) have emerged as prominent models for representation learning on graph structured data.
In this work, we follow the classical approach of Individualization and Refinement (IR)
Our technique lets us learn richer node embeddings while keeping the computational complexity manageable.
arXiv Detail & Related papers (2022-03-17T07:50:48Z)
- A Unified View on Graph Neural Networks as Graph Signal Denoising [49.980783124401555]
Graph Neural Networks (GNNs) have risen to prominence in learning representations for graph structured data.
In this work, we establish mathematically that the aggregation processes in a group of representative GNN models can be regarded as solving a graph denoising problem.
We instantiate a novel GNN model, ADA-UGNN, derived from UGNN, to handle graphs with adaptive smoothness across nodes.
arXiv Detail & Related papers (2020-10-05T04:57:18Z)
- Permutation-equivariant and Proximity-aware Graph Neural Networks with Stochastic Message Passing [88.30867628592112]
Graph neural networks (GNNs) are emerging machine learning models on graphs.
Permutation-equivariance and proximity-awareness are two important properties highly desirable for GNNs.
We show that existing GNNs, mostly based on the message-passing mechanism, cannot simultaneously preserve the two properties.
In order to preserve node proximities, we augment the existing GNNs with stochastic node representations.
arXiv Detail & Related papers (2020-09-05T16:46:56Z)
- Distance Encoding: Design Provably More Powerful Neural Networks for Graph Representation Learning [63.97983530843762]
Graph Neural Networks (GNNs) have achieved great success in graph representation learning.
GNNs generate identical representations for graph substructures that may in fact be very different.
More powerful GNNs, proposed recently by mimicking higher-order tests, are inefficient as they cannot exploit the sparsity of the underlying graph structure.
We propose Distance Encoding (DE) as a new class of features for graph representation learning.
arXiv Detail & Related papers (2020-08-31T23:15:40Z)
- Expressive Power of Invariant and Equivariant Graph Neural Networks [10.419350129060598]
We show that Folklore Graph Neural Networks (FGNN) are the most expressive architectures proposed so far for a given tensor order.
FGNNs are able to learn how to solve the problem, leading to much better average performances than existing algorithms.
arXiv Detail & Related papers (2020-06-28T16:35:45Z)
- Improving Graph Neural Network Expressivity via Subgraph Isomorphism Counting [63.04999833264299]
"Graph Substructure Networks" (GSN) is a topologically-aware message passing scheme based on substructure encoding.
We show that it is strictly more expressive than the Weisfeiler-Leman (WL) graph isomorphism test.
We perform an extensive evaluation on graph classification and regression tasks and obtain state-of-the-art results in diverse real-world settings.
arXiv Detail & Related papers (2020-06-16T15:30:31Z)
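The Distance Encoding paper listed above describes extra node features defined relative to a target node set. Its precise construction is not given in the summary, so the sketch below is only a plausible illustration (all names are hypothetical): each node receives the multiset of its shortest-path distances to a chosen set of target nodes.

```python
from collections import deque

def bfs_dist(adj, src):
    """Single-source shortest-path distances (unweighted), via BFS.
    adj: dict mapping node index -> list of neighbor indices."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def distance_encoding(adj, target_set):
    """Hypothetical sketch: each node's extra feature is the sorted
    multiset of its shortest-path distances to the nodes in target_set.
    Unreachable nodes get infinity."""
    per_target = [bfs_dist(adj, t) for t in target_set]
    n = len(adj)
    return [sorted(d.get(v, float("inf")) for d in per_target)
            for v in range(n)]

# Path graph 0-1-2-3 with target set {0, 3}: the endpoints get [0, 3],
# the two interior nodes get [1, 2].
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
print(distance_encoding(adj, [0, 3]))
```

Such set-dependent features can distinguish nodes that plain message passing treats identically, which is the motivation the summary gives for DE.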
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the generated summaries (including all information) and is not responsible for any consequences.