Weisfeiler and Leman go Hyperbolic: Learning Distance Preserving Node Representations
- URL: http://arxiv.org/abs/2211.02501v1
- Date: Fri, 4 Nov 2022 15:03:41 GMT
- Title: Weisfeiler and Leman go Hyperbolic: Learning Distance Preserving Node Representations
- Authors: Giannis Nikolentzos, Michail Chatzianastasis, Michalis Vazirgiannis
- Abstract summary: Graph neural networks (GNNs) have emerged as a promising tool for solving machine learning problems on graphs.
In this paper, we define a distance function between nodes which is based on the hierarchy produced by the Weisfeiler-Leman (WL) algorithm.
We propose a model that learns representations which preserve those distances between nodes.
- Score: 26.77596449192451
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: In recent years, graph neural networks (GNNs) have emerged as a promising
tool for solving machine learning problems on graphs. Most GNNs are members of
the family of message passing neural networks (MPNNs). There is a close
connection between these models and the Weisfeiler-Leman (WL) test of
isomorphism, an algorithm that can successfully test isomorphism for a broad
class of graphs. Recently, much research has focused on measuring the
expressive power of GNNs. For instance, it has been shown that standard MPNNs
are at most as powerful as WL in terms of distinguishing non-isomorphic graphs.
However, these studies have largely ignored the distances between the
representations of nodes/graphs which are of paramount importance for learning
tasks. In this paper, we define a distance function between nodes which is
based on the hierarchy produced by the WL algorithm, and propose a model that
learns representations which preserve those distances between nodes. Since the
emerging hierarchy corresponds to a tree, to learn these representations, we
capitalize on recent advances in the field of hyperbolic neural networks. We
empirically evaluate the proposed model on standard node and graph
classification datasets where it achieves competitive performance with
state-of-the-art models.
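To make the abstract's two ingredients concrete, here is a minimal sketch, not the authors' implementation: WL color refinement that records every node's color class at each iteration (these nested classes form the tree hierarchy), a plain unweighted tree distance over that hierarchy (the paper's exact weighting may differ), and the Poincare-ball distance used by hyperbolic neural networks. All function names are illustrative.

```python
import math

import networkx as nx


def wl_color_history(G, num_iters=3):
    """Weisfeiler-Leman color refinement, keeping every node's color at
    every iteration; the nested color classes form a tree hierarchy."""
    colors = {v: 0 for v in G}                    # iteration 0: all alike
    history = {v: [0] for v in G}
    for _ in range(num_iters):
        # New color = own color plus the multiset of neighbor colors.
        sig = {v: (colors[v], tuple(sorted(colors[u] for u in G[v])))
               for v in G}
        palette = {s: i for i, s in enumerate(sorted(set(sig.values())))}
        colors = {v: palette[sig[v]] for v in G}
        for v in G:
            history[v].append(colors[v])
    return history


def wl_tree_distance(hist_u, hist_v):
    """Distance in the WL hierarchy tree: climb from each leaf up to the
    deepest iteration at which the two nodes still share a color class.
    (An assumed instantiation; the paper may weight levels differently.)"""
    shared = 0
    for cu, cv in zip(hist_u, hist_v):
        if cu != cv:
            break
        shared += 1
    return 2 * (len(hist_u) - shared)


def poincare_distance(x, y, eps=1e-9):
    """Geodesic distance in the Poincare ball, the standard model used by
    hyperbolic neural networks."""
    sq = lambda a: sum(t * t for t in a)
    num = 2 * sq([a - b for a, b in zip(x, y)])
    den = max((1 - sq(x)) * (1 - sq(y)), eps)
    return math.acosh(1 + num / den)


G = nx.karate_club_graph()
hist = wl_color_history(G, num_iters=3)
print(wl_tree_distance(hist[0], hist[33]))         # two structurally distinct hubs
print(poincare_distance([0.1, 0.2], [-0.3, 0.4]))  # toy embedded points
```

Hyperbolic space is the natural target here because tree metrics embed into it with low distortion, whereas Euclidean space distorts trees badly as they grow.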
Related papers
- Automatic Relation-aware Graph Network Proliferation [182.30735195376792]
We propose Automatic Relation-aware Graph Network Proliferation (ARGNP) for efficiently searching GNNs.
Its search space comprises node and relation learning operations that can extract hierarchical node/relational information and provide anisotropic guidance for message passing on a graph.
Experiments on six datasets for four graph learning tasks demonstrate that GNNs produced by our method are superior to the current state-of-the-art hand-crafted and search-based GNNs.
arXiv Detail & Related papers (2022-05-31T10:38:04Z)
- Graph-level Neural Networks: Current Progress and Future Directions [61.08696673768116]
Graph-level Neural Networks (GLNNs, i.e., deep learning-based graph-level learning methods) have attracted attention due to their strength in modeling high-dimensional data.
We propose a systematic taxonomy covering GLNNs upon deep neural networks, graph neural networks, and graph pooling.
arXiv Detail & Related papers (2022-05-31T06:16:55Z)
- Discovering the Representation Bottleneck of Graph Neural Networks from Multi-order Interactions [51.597480162777074]
Graph neural networks (GNNs) rely on the message passing paradigm to propagate node features and build interactions.
Recent works point out that different graph learning tasks require different ranges of interactions between nodes.
We study two common graph construction methods in scientific domains, i.e., K-nearest neighbor (KNN) graphs and fully-connected (FC) graphs (a construction sketch follows this entry).
arXiv Detail & Related papers (2022-05-15T11:38:14Z)
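For context, the two graph constructions compared above can be built in a few lines; this generic sketch uses scikit-learn's kneighbors_graph and is not taken from the paper:

```python
import numpy as np
from sklearn.neighbors import kneighbors_graph

rng = np.random.default_rng(0)
points = rng.normal(size=(100, 3))        # e.g. atom coordinates

# KNN graph: each node connects to its k nearest neighbors; sparse, local.
knn_adj = kneighbors_graph(points, n_neighbors=8, mode="connectivity")

# FC graph: every pair of nodes is connected; dense, all interaction orders.
fc_adj = np.ones((len(points), len(points))) - np.eye(len(points))

print(knn_adj.nnz, int(fc_adj.sum()))     # 800 vs 9900 edges
```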
- Graph Neural Networks for Graphs with Heterophily: A Survey [98.45621222357397]
We provide a comprehensive review of graph neural networks (GNNs) for heterophilic graphs.
Specifically, we propose a systematic taxonomy that essentially governs existing heterophilic GNN models.
We discuss the correlation between graph heterophily and various graph research domains, aiming to facilitate the development of more effective GNNs.
arXiv Detail & Related papers (2022-02-14T23:07:47Z)
- Graph Neural Networks with Parallel Neighborhood Aggregations for Graph Classification [14.112444998191698]
We focus on graph classification using a graph neural network (GNN) model that precomputes the node features using a bank of neighborhood aggregation graph operators arranged in parallel.
These GNN models have a natural advantage of reduced training and inference time due to the precomputations.
We demonstrate via numerical experiments that the developed model achieves state-of-the-art performance on many diverse real-world datasets (a sketch of the precomputation idea follows this entry).
arXiv Detail & Related papers (2021-11-22T19:19:40Z)
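The precomputation idea referenced above can be sketched as follows, assuming the operator bank is simply powers of the row-normalized adjacency (the paper's actual bank of operators may differ):

```python
import numpy as np

def precompute_feature_bank(adj, X, num_hops=3):
    """Apply a bank of neighborhood-aggregation operators (here: powers of
    the row-normalized adjacency) to the node features once, before
    training; only a plain feed-forward net is trained afterwards, which
    is where the reduced training/inference time comes from."""
    P = adj / np.maximum(adj.sum(axis=1, keepdims=True), 1)
    feats, cur = [X], X
    for _ in range(num_hops):
        cur = P @ cur                       # one additional hop per operator
        feats.append(cur)
    return np.concatenate(feats, axis=1)    # shape: (nodes, (num_hops+1)*d)

# Toy check on a 4-node path graph with scalar features.
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
X = np.arange(4, dtype=float).reshape(-1, 1)
print(precompute_feature_bank(adj, X).shape)    # (4, 4)
```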
- Graph Neural Networks with Feature and Structure Aware Random Walk [7.143879014059894]
We show that in typical heterophilous graphs, the edges may be directed, and whether to treat the edges as is or simply make them undirected greatly affects the performance of GNN models.
We develop a model that adaptively learns the directionality of the graph, and exploits the underlying long-distance correlations between nodes.
arXiv Detail & Related papers (2021-11-19T08:54:21Z)
- Distance Encoding: Design Provably More Powerful Neural Networks for Graph Representation Learning [63.97983530843762]
Graph Neural Networks (GNNs) have achieved great success in graph representation learning.
GNNs generate identical representations for graph substructures that may in fact be very different.
More powerful GNNs, proposed recently by mimicking higher-order tests, are inefficient as they cannot exploit the sparsity of the underlying graph structure.
We propose Distance Encoding (DE) as a new class of graph representation learning techniques (a minimal sketch follows this entry).
arXiv Detail & Related papers (2020-08-31T23:15:40Z)
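A minimal sketch of the distance-encoding idea, here instantiated with one-hot shortest-path distances via networkx (the paper also covers other distance measures, e.g. based on random-walk landing probabilities); the function name is illustrative:

```python
import networkx as nx
import numpy as np

def distance_encoding(G, target_set, cutoff=5):
    """One-hot shortest-path-distance features measured from a target node
    set; appending them lets message passing separate nodes that vanilla
    MPNNs would embed identically."""
    enc = np.zeros((G.number_of_nodes(), cutoff + 2))   # 0..cutoff, plus 'far'
    dist = {v: cutoff + 1 for v in G}
    for t in target_set:
        lengths = nx.single_source_shortest_path_length(G, t, cutoff=cutoff)
        for v, d in lengths.items():
            dist[v] = min(dist[v], d)
    for v in G:
        enc[v, dist[v]] = 1.0
    return enc

# On an 8-cycle, every node looks identical to WL (all have degree 2),
# but nodes 1 and 3 get different encodings relative to target node 0.
print(distance_encoding(nx.cycle_graph(8), target_set={0}))
```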
- Improving Graph Neural Network Expressivity via Subgraph Isomorphism Counting [63.04999833264299]
"Graph Substructure Networks" (GSN) is a topologically-aware message passing scheme based on substructure encoding.
We show that it is strictly more expressive than the Weisfeiler-Leman (WL) graph isomorphism test.
We perform an extensive evaluation on graph classification and regression tasks and obtain state-of-the-art results in diverse real-world settings (a substructure-counting sketch follows this entry).
arXiv Detail & Related papers (2020-06-16T15:30:31Z)
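The substructure-encoding idea can be illustrated with its simplest instance, per-node triangle counts; this hedged sketch is not GSN's actual feature pipeline:

```python
import networkx as nx
import numpy as np

def substructure_features(G):
    """Degree plus per-node triangle count: the simplest instance of
    GSN-style substructure encoding."""
    tri = nx.triangles(G)                  # {node: #triangles through node}
    return np.array([[G.degree(v), tri[v]] for v in G], dtype=float)

# The 6-cycle and two disjoint triangles are 1-WL-equivalent (all nodes
# have degree 2), but triangle counts tell them apart immediately.
c6 = nx.cycle_graph(6)
cc3 = nx.disjoint_union(nx.cycle_graph(3), nx.cycle_graph(3))
print(substructure_features(c6)[:, 1])     # all zeros
print(substructure_features(cc3)[:, 1])    # all ones
```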
- How hard is to distinguish graphs with graph neural networks? [32.09819774228997]
This study derives hardness results for the classification variant of graph isomorphism in the message-passing model (MPNN).
MPNN encompasses the majority of graph neural networks used today and is universal when nodes are given unique features.
An empirical study involving 12 graph classification tasks and 420 networks reveals strong alignment between actual performance and theoretical predictions.
arXiv Detail & Related papers (2020-05-13T22:28:46Z)
- Let's Agree to Degree: Comparing Graph Convolutional Networks in the Message-Passing Framework [5.835421173589977]
We cast neural networks defined on graphs as message-passing neural networks (MPNNs).
We consider two variants of MPNNs: anonymous MPNNs and degree-aware MPNNs.
We obtain lower and upper bounds on the distinguishing power of MPNNs in terms of the distinguishing power of the Weisfeiler-Lehman (WL) algorithm.
arXiv Detail & Related papers (2020-04-06T12:14:00Z)