Graph Representation Learning with Individualization and Refinement
- URL: http://arxiv.org/abs/2203.09141v1
- Date: Thu, 17 Mar 2022 07:50:48 GMT
- Title: Graph Representation Learning with Individualization and Refinement
- Authors: Mohammed Haroon Dupty, Wee Sun Lee
- Abstract summary: Graph Neural Networks (GNNs) have emerged as prominent models for representation learning on graph-structured data.
In this work, we follow the classical approach of Individualization and Refinement (IR), the technique used by most practical isomorphism solvers.
Our technique lets us learn richer node embeddings while keeping the computational complexity manageable.
- Score: 19.436520792345064
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph Neural Networks (GNNs) have emerged as prominent models for
representation learning on graph-structured data. GNNs follow a
message-passing approach analogous to the 1-dimensional Weisfeiler-Lehman
(1-WL) test for graph isomorphism and are consequently limited by the
distinguishing power of 1-WL. More expressive higher-order GNNs, which operate
on k-tuples of nodes, require increased computational resources to process
higher-order tensors.
Instead of the WL approach, in this work, we follow the classical approach of
Individualization and Refinement (IR), a technique followed by most practical
isomorphism solvers. Individualization refers to artificially distinguishing a
node in the graph and refinement is the propagation of this information to
other nodes through message passing. We learn to adaptively select nodes to
individualize and to aggregate the resulting graphs after refinement to help
handle the complexity. Our technique lets us learn richer node embeddings while
keeping the computational complexity manageable. Theoretically, we show that
our procedure is more expressive than the 1-WL test. Experiments show that our
method outperforms prominent 1-WL GNN models as well as competitive
higher-order baselines on several benchmark synthetic and real datasets.
Furthermore, our method opens new doors for exploring the paradigm of learning
on graph structures with individualization and refinement.
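To make the individualization-and-refinement idea concrete, below is a minimal, self-contained sketch (not the authors' implementation): where the paper learns which nodes to individualize and how to aggregate the refined graphs, this toy version individualizes every node in turn and aggregates with a sorted multiset. The test pair, two disjoint triangles versus a 6-cycle, is a standard example that plain 1-WL cannot distinguish.

```python
from collections import Counter

def wl_refine(adj, colors):
    """1-WL colour refinement: repeatedly replace each node's colour with a
    compressed hash of (own colour, sorted multiset of neighbour colours)."""
    n = len(adj)
    for _ in range(n):  # n rounds always suffice for the partition to stabilize
        sigs = {v: (colors[v], tuple(sorted(colors[u] for u in adj[v])))
                for v in range(n)}
        palette = {s: i for i, s in enumerate(sorted(set(sigs.values())))}
        colors = {v: palette[sigs[v]] for v in range(n)}
    return colors

def histogram(colors):
    """Permutation-invariant summary: sorted sizes of the colour classes."""
    return tuple(sorted(Counter(colors.values()).values()))

def ir_invariant(adj):
    """Individualization and Refinement: give one node an artificial unique
    colour (individualization), propagate it by refinement, and aggregate the
    per-choice results.  The paper *learns* the selection and aggregation;
    here we enumerate all choices and sort."""
    n = len(adj)
    runs = []
    for v in range(n):
        colors = {u: 0 for u in range(n)}
        colors[v] = 1                                   # individualize node v
        runs.append(histogram(wl_refine(adj, colors)))  # refine
    return tuple(sorted(runs))                          # aggregate branches

two_triangles = [[1, 2], [0, 2], [0, 1], [4, 5], [3, 5], [3, 4]]
hexagon = [[1, 5], [0, 2], [1, 3], [2, 4], [3, 5], [0, 4]]
plain = {v: 0 for v in range(6)}

# Plain 1-WL ties: both graphs are 2-regular, so all nodes share one colour.
assert histogram(wl_refine(two_triangles, plain)) == \
       histogram(wl_refine(hexagon, plain))
# One level of individualization separates them.
assert ir_invariant(two_triangles) != ir_invariant(hexagon)
print("1-WL: indistinguishable; IR: distinguished")
```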
Related papers
- Graph as a feature: improving node classification with non-neural graph-aware logistic regression [2.952177779219163]
Graph-aware Logistic Regression (GLR) is a non-neural model designed for node classification tasks.
Unlike traditional graph algorithms that use only a fraction of the information accessible to GNNs, our proposed model simultaneously leverages both node features and the relationships between entities (a minimal sketch of this idea follows this entry).
arXiv Detail & Related papers (2024-11-19T08:32:14Z)
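As a hedged illustration of the idea (not GLR's actual formulation), one way a non-neural, graph-aware logistic regression can use both node features and relations is to concatenate each node's features with an aggregate of its neighbours' features before fitting a standard classifier; the helper name below is hypothetical.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def graph_aware_features(X, adj):
    """Concatenate each node's features with the mean of its neighbours'
    features, so a plain logistic regression can see graph structure."""
    nbr_mean = np.stack([X[nbrs].mean(axis=0) if nbrs else np.zeros(X.shape[1])
                         for nbrs in adj])
    return np.hstack([X, nbr_mean])

# Toy data (hypothetical): 4 nodes, 2 features, a path graph 0-1-2-3.
X = np.array([[1.0, 0.0], [0.9, 0.1], [0.1, 0.9], [0.0, 1.0]])
adj = [[1], [0, 2], [1, 3], [2]]
y = np.array([0, 0, 1, 1])

clf = LogisticRegression().fit(graph_aware_features(X, adj), y)
print(clf.predict(graph_aware_features(X, adj)))  # expected [0 0 1 1] here
```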
- Enhanced Expressivity in Graph Neural Networks with Lanczos-Based Linear Constraints [7.605749412696919]
Graph Neural Networks (GNNs) excel in handling graph-structured data but often underperform in link prediction tasks.
We present a novel method to enhance the expressivity of GNNs by embedding induced subgraphs into the graph Laplacian matrix's eigenbasis.
Our method achieves 20x and 10x speedups while requiring only 5% and 10% of the data from the PubMed and OGBL-Vessel datasets, respectively.
arXiv Detail & Related papers (2024-08-22T12:22:00Z)
- SPGNN: Recognizing Salient Subgraph Patterns via Enhanced Graph Convolution and Pooling [25.555741218526464]
Graph neural networks (GNNs) have revolutionized the field of machine learning on non-Euclidean data such as graphs and networks.
We propose a concatenation-based graph convolution mechanism that injectively updates node representations.
We also design a novel graph pooling module, called WL-SortPool, to learn important subgraph patterns in a deep-learning manner.
arXiv Detail & Related papers (2024-04-21T13:11:59Z)
- Uplifting the Expressive Power of Graph Neural Networks through Graph Partitioning [3.236774847052122]
We study the expressive power of graph neural networks through the lens of graph partitioning.
We introduce a novel GNN architecture, namely Graph Partitioning Neural Networks (GPNNs).
arXiv Detail & Related papers (2023-12-14T06:08:35Z)
- Beyond 1-WL with Local Ego-Network Encodings [0.42970700836450487]
We show that ego-networks can produce a structural encoding scheme for arbitrary graphs with greater expressivity than the Weisfeiler-Lehman (1-WL) test.
We introduce IGEL, a preprocessing step to produce features that augment node representations by encoding ego-networks into sparse vectors (one such encoding is sketched after this entry).
arXiv Detail & Related papers (2022-11-27T18:14:03Z)
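Here is a sketch of one way to encode a node's ego-network as a sparse vector, in the spirit of this entry (the exact encoding used by IGEL may differ): histogram the (hop distance, degree within the ego-network) pairs over the k-hop neighbourhood.

```python
from collections import Counter, deque

def ego_network_encoding(adj, root, k=2):
    """Sparse ego-network feature: counts of (hop distance from root,
    degree inside the ego-network) pairs over the k-hop neighbourhood."""
    dist = {root: 0}
    frontier = deque([root])
    while frontier:                       # BFS out to k hops
        v = frontier.popleft()
        if dist[v] == k:
            continue
        for u in adj[v]:
            if u not in dist:
                dist[u] = dist[v] + 1
                frontier.append(u)
    ego = set(dist)
    degree_in_ego = {v: sum(u in ego for u in adj[v]) for v in ego}
    return Counter((dist[v], degree_in_ego[v]) for v in ego)

# 6-cycle: every node's 2-hop ego-network is a 5-node path.
hexagon = [[1, 5], [0, 2], [1, 3], [2, 4], [3, 5], [0, 4]]
print(ego_network_encoding(hexagon, 0))
# Counter({(1, 2): 2, (2, 1): 2, (0, 2): 1})
```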
- Representation Power of Graph Neural Networks: Improved Expressivity via Algebraic Analysis [124.97061497512804]
We show that standard Graph Neural Networks (GNNs) produce more discriminative representations than the Weisfeiler-Lehman (WL) algorithm.
We also show that simple convolutional architectures with white inputs produce equivariant features that count the closed paths in the graph (the counting identity is sketched after this entry).
arXiv Detail & Related papers (2022-05-19T18:40:25Z)
- GraphSVX: Shapley Value Explanations for Graph Neural Networks [81.83769974301995]
Graph Neural Networks (GNNs) achieve strong performance on various learning tasks over geometric data.
In this paper, we propose a unified framework satisfied by most existing GNN explainers.
We introduce GraphSVX, a post hoc local model-agnostic explanation method specifically designed for GNNs.
arXiv Detail & Related papers (2021-04-18T10:40:37Z)
- Uniting Heterogeneity, Inductiveness, and Efficiency for Graph Representation Learning [68.97378785686723]
Graph neural networks (GNNs) have greatly advanced the performance of node representation learning on graphs.
The majority of GNNs are designed only for homogeneous graphs, leading to inferior adaptivity to the more informative heterogeneous graphs.
We propose a novel inductive, meta-path-free message passing scheme that packs up heterogeneous node features with their associated edges from both low- and high-order neighbor nodes.
arXiv Detail & Related papers (2021-04-04T23:31:39Z)
- Distance Encoding: Design Provably More Powerful Neural Networks for Graph Representation Learning [63.97983530843762]
Graph Neural Networks (GNNs) have achieved great success in graph representation learning.
GNNs generate identical representations for graph substructures that may in fact be very different.
More powerful GNNs, proposed recently by mimicking higher-order WL tests, are inefficient as they cannot exploit the sparsity of the underlying graph structure.
We propose Distance Encoding (DE) as a new class of structural features for graph representation learning (the simplest variant is sketched after this entry).
arXiv Detail & Related papers (2020-08-31T23:15:40Z)
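As a hedged sketch of the simplest such feature (the paper's DE family is more general): augment every node with its shortest-path distance to a target node set, which already breaks symmetries that 1-WL message passing cannot.

```python
from collections import deque

def spd_encoding(adj, targets):
    """Shortest-path distance from every node to the nearest target node,
    computed with a multi-source BFS; None means unreachable."""
    dist = {v: None for v in range(len(adj))}
    queue = deque()
    for t in targets:
        dist[t] = 0
        queue.append(t)
    while queue:
        v = queue.popleft()
        for u in adj[v]:
            if dist[u] is None:
                dist[u] = dist[v] + 1
                queue.append(u)
    return [dist[v] for v in range(len(adj))]

# On a 6-cycle all nodes look alike to 1-WL; distances to node 0 differ.
hexagon = [[1, 5], [0, 2], [1, 3], [2, 4], [3, 5], [0, 4]]
print(spd_encoding(hexagon, {0}))  # [0, 1, 2, 3, 2, 1]
```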
- Iterative Deep Graph Learning for Graph Neural Networks: Better and Robust Node Embeddings [53.58077686470096]
We propose an end-to-end graph learning framework, namely Iterative Deep Graph Learning (IDGL), for jointly and iteratively learning graph structure and graph embedding.
Our experiments show that our proposed IDGL models can consistently outperform or match the state-of-the-art baselines.
arXiv Detail & Related papers (2020-06-21T19:49:15Z)
- Improving Graph Neural Network Expressivity via Subgraph Isomorphism Counting [63.04999833264299]
"Graph Substructure Networks" (GSN) is a topologically-aware message passing scheme based on substructure encoding.
We show that it is strictly more expressive than the Weisfeiler-Leman (WL) graph isomorphism test.
We perform an extensive evaluation on graph classification and regression tasks and obtain state-of-the-art results in diverse real-world settings (a minimal sketch of substructure-count features follows this entry's link).
arXiv Detail & Related papers (2020-06-16T15:30:31Z)
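A minimal sketch of substructure-count features in the spirit of GSN (not the authors' code): count the triangles each node participates in and append that count to the node's input features before standard message passing.

```python
def triangle_counts(adj):
    """Number of triangles each node lies in, for an adjacency list.
    Each triangle {v, u, w} is found twice from v (via u and via w)."""
    nbrs = [set(a) for a in adj]
    return [sum(len(nbrs[v] & nbrs[u]) for u in nbrs[v]) // 2
            for v in range(len(adj))]

# A 4-clique: every node lies in 3 triangles.
clique4 = [[1, 2, 3], [0, 2, 3], [0, 1, 3], [0, 1, 2]]
print(triangle_counts(clique4))  # [3, 3, 3, 3]

# Appending such counts to node features gives a 1-WL message-passing
# network access to substructure information it cannot compute on its own.
```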
This list is automatically generated from the titles and abstracts of the papers on this site.