PF-GNN: Differentiable particle filtering based approximation of
universal graph representations
- URL: http://arxiv.org/abs/2401.17752v1
- Date: Wed, 31 Jan 2024 11:26:03 GMT
- Title: PF-GNN: Differentiable particle filtering based approximation of
universal graph representations
- Authors: Mohammed Haroon Dupty, Yanfei Dong, Wee Sun Lee
- Abstract summary: Graph Neural Networks (GNNs) are known to be limited in expressive power by the 1-WL color-refinement test for graph isomorphism.
In this work, we propose to make GNNs universal by guiding the learning process with exact isomorphism solver techniques.
Our algorithm is end-to-end differentiable, can be applied with any GNN as the backbone, and learns richer graph representations with only a linear increase in runtime.
- Score: 20.544160526945234
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Message passing Graph Neural Networks (GNNs) are known to be limited in
expressive power by the 1-WL color-refinement test for graph isomorphism. Other
more expressive models either are computationally expensive or need
preprocessing to extract structural features from the graph. In this work, we
propose to make GNNs universal by guiding the learning process with exact
isomorphism solver techniques which operate on the paradigm of
Individualization and Refinement (IR), a method to artificially introduce
asymmetry and further refine the coloring when 1-WL stops. Isomorphism solvers
generate a search tree of colorings whose leaves uniquely identify the graph.
However, the tree grows exponentially large and needs hand-crafted pruning
techniques which are not desirable from a learning perspective. We take a
probabilistic view and approximate the search tree of colorings (i.e.
embeddings) by sampling multiple paths from root to leaves of the search tree.
To learn more discriminative representations, we guide the sampling process
with particle filter updates, a principled approach for sequential state
estimation. Our algorithm is end-to-end differentiable, can be applied with
any GNN as the backbone, and learns richer graph representations with only a
linear increase in runtime. Experimental evaluation shows that our approach
consistently outperforms leading GNN models both on synthetic benchmarks for
isomorphism detection and on real-world datasets.
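To make the pipeline concrete, below is a minimal sketch of the sampling loop the abstract describes, with plain 1-WL color refinement standing in for the learned GNN refinement and a hand-crafted particle weight (number of color classes) standing in for the learned score; all names and the pooling step are illustrative assumptions, not the paper's implementation.

```python
import numpy as np
from collections import Counter

def wl_refine(adj, colors):
    """Run 1-WL color refinement to a fixed point: a node's new color is
    determined by its own color plus the multiset of its neighbors' colors."""
    while True:
        sigs = []
        for v in range(len(colors)):
            neigh = tuple(sorted(colors[u] for u in np.nonzero(adj[v])[0]))
            sigs.append((colors[v], neigh))
        table = {s: i for i, s in enumerate(sorted(set(sigs)))}
        new = np.array([table[s] for s in sigs])
        if np.array_equal(new, colors):
            return colors
        colors = new

def individualize(colors, v):
    """Artificially break symmetry: give node v a fresh, unique color."""
    out = colors.copy()
    out[v] = colors.max() + 1
    return out

def is_discrete(colors):
    return len(set(colors)) == len(colors)

def pf_gnn_sketch(adj, num_particles=8, rng=np.random.default_rng(0)):
    """Approximate the IR search tree by sampling root-to-leaf paths and
    resampling particles in proportion to a weight. The weight used here
    is a stand-in for PF-GNN's learned score (an assumption)."""
    n = adj.shape[0]
    particles = [wl_refine(adj, np.zeros(n, dtype=int)) for _ in range(num_particles)]
    while not all(is_discrete(c) for c in particles):
        candidates = []
        for colors in particles:
            if is_discrete(colors):
                candidates.append(colors)
                continue
            counts = Counter(colors)                       # pick a cell with > 1 node
            cell = next(c for c in counts if counts[c] > 1)
            v = rng.choice(np.nonzero(colors == cell)[0])  # individualize a member
            candidates.append(wl_refine(adj, individualize(colors, v)))
        weights = np.array([len(set(c)) for c in candidates], dtype=float)
        weights /= weights.sum()
        idx = rng.choice(num_particles, size=num_particles, p=weights)
        particles = [candidates[i] for i in idx]           # particle-filter resampling
    # Leaves are discrete colorings; pool them into one graph-level summary
    # (sorting makes the summary invariant to node ordering).
    return np.mean([np.sort(c) for c in particles], axis=0)

# Toy usage: a 6-cycle, whose nodes plain 1-WL cannot tell apart.
adj = np.zeros((6, 6), dtype=int)
for i in range(6):
    adj[i, (i + 1) % 6] = adj[(i + 1) % 6, i] = 1
print(pf_gnn_sketch(adj))
```

Because each particle follows a single root-to-leaf path, runtime grows linearly in the number of particles rather than exponentially in the depth of the IR search tree, which matches the linear-overhead claim above.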
Related papers
- SPGNN: Recognizing Salient Subgraph Patterns via Enhanced Graph Convolution and Pooling [25.555741218526464]
Graph neural networks (GNNs) have revolutionized the field of machine learning on non-Euclidean data such as graphs and networks.
We propose a concatenation-based graph convolution mechanism that injectively updates node representations.
We also design a novel graph pooling module, called WL-SortPool, to learn important subgraph patterns in a deep-learning manner.
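One plausible reading of the concatenation-based update (a sketch under assumptions; the names and single-layer MLP are illustrative, not SPGNN's architecture): concatenating a node's own state with the aggregated neighbor state, rather than summing the two, keeps the sources separable and makes an injective update easier to realize.

```python
import numpy as np

def concat_conv(adj, h, w, b):
    """One concatenation-based graph convolution layer: join the node's own
    state with the sum-aggregated neighbor state, then apply a ReLU MLP."""
    agg = adj @ h                          # sum-aggregate neighbor features
    z = np.concatenate([h, agg], axis=1)   # [self ; neighbors], not self + neighbors
    return np.maximum(0.0, z @ w + b)

# Toy usage on a 4-node path graph with 3-dimensional features.
rng = np.random.default_rng(0)
n, d = 4, 3
adj = np.zeros((n, n))
for i in range(n - 1):
    adj[i, i + 1] = adj[i + 1, i] = 1
h = rng.standard_normal((n, d))
w = rng.standard_normal((2 * d, d)) * 0.1
b = np.zeros(d)
print(concat_conv(adj, h, w, b).shape)     # (4, 3)
```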
arXiv Detail & Related papers (2024-04-21T13:11:59Z) - How Graph Neural Networks Learn: Lessons from Training Dynamics [80.41778059014393]
We study the training dynamics in function space of graph neural networks (GNNs)
We find that the gradient descent optimization of GNNs implicitly leverages the graph structure to update the learned function.
This finding offers new interpretable insights into when and why the learned GNN functions generalize.
arXiv Detail & Related papers (2023-10-08T10:19:56Z) - Structural Node Embeddings with Homomorphism Counts [2.0131144893314232]
Homomorphism counts capture local structural information.
We experimentally show the effectiveness of homomorphism-count-based node embeddings.
Our approach capitalises on the efficient computability of graph homomorphism counts for bounded treewidth graph classes.
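As a concrete, efficiently computable special case (an illustration, not the paper's algorithm): for k >= 3, the number of homomorphisms from a rooted k-cycle sending the root to node v equals the closed-walk count (A^k)_{vv}, so diagonals of adjacency-matrix powers already yield a small structural node embedding.

```python
import numpy as np

def closed_walk_embedding(adj, max_len=5):
    """Structural node embedding from closed-walk counts: (A^k)_{vv} counts
    closed walks of length k at v; for k >= 3 this equals the number of
    homomorphisms from a rooted k-cycle with the root mapped to v."""
    feats = []
    ak = adj.copy()
    for _ in range(2, max_len + 1):
        ak = ak @ adj
        feats.append(np.diag(ak))
    return np.stack(feats, axis=1)   # shape (n, max_len - 1)

# Toy usage: a triangle with a pendant tail; nodes on the triangle and on
# the tail receive clearly different embeddings.
adj = np.array([[0, 1, 1, 0],
                [1, 0, 1, 0],
                [1, 1, 0, 1],
                [0, 0, 1, 0]])
print(closed_walk_embedding(adj))
```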
arXiv Detail & Related papers (2023-08-29T13:14:53Z) - Seq-HGNN: Learning Sequential Node Representation on Heterogeneous Graph [57.2953563124339]
We propose a novel heterogeneous graph neural network with sequential node representation, namely Seq-HGNN.
We conduct extensive experiments on four widely used datasets from the Heterogeneous Graph Benchmark (HGB) and the Open Graph Benchmark (OGB).
arXiv Detail & Related papers (2023-05-18T07:27:18Z) - Representation Power of Graph Neural Networks: Improved Expressivity via
Algebraic Analysis [124.97061497512804]
We show that standard Graph Neural Networks (GNNs) produce more discriminative representations than the Weisfeiler-Lehman (WL) algorithm.
We also show that simple convolutional architectures with white inputs produce equivariant features that count the closed paths in the graph.
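The identity behind this claim can be checked numerically: for white inputs x (zero-mean, unit-variance, independent entries), E[x_v (A^k x)_v] = (A^k)_{vv}, the closed-walk count at v. The sketch below verifies this on a 6-cycle; the paper's construction is more general than this check.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 6, 4

adj = np.zeros((n, n))                 # the 6-cycle
for i in range(n):
    adj[i, (i + 1) % n] = adj[(i + 1) % n, i] = 1

ak = np.linalg.matrix_power(adj, k)    # symmetric, so ak @ x == x @ ak

# White inputs: zero mean, unit variance, independent entries.
X = rng.standard_normal((200_000, n))
estimate = np.mean(X * (X @ ak), axis=0)

print(estimate)                        # Monte Carlo, roughly [6 6 6 6 6 6]
print(np.diag(ak))                     # exact closed-walk counts of length 4
```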
arXiv Detail & Related papers (2022-05-19T18:40:25Z) - Graph Representation Learning with Individualization and Refinement [19.436520792345064]
Graph Neural Networks (GNNs) have emerged as prominent models for representation learning on graph structured data.
In this work, we follow the classical approach of Individualization and Refinement (IR).
Our technique lets us learn richer node embeddings while keeping the computational complexity manageable.
arXiv Detail & Related papers (2022-03-17T07:50:48Z) - Learning to Learn Graph Topologies [27.782971146122218]
We learn a mapping from node data to the graph structure based on the idea of learning to optimise (L2O).
The model is trained in an end-to-end fashion with pairs of node data and graph samples.
Experiments on both synthetic and real-world data demonstrate that our model is more efficient than classic iterative algorithms in learning a graph with specific topological properties.
arXiv Detail & Related papers (2021-10-19T08:42:38Z) - Scaling up graph homomorphism for classification via sampling [1.662966122370634]
We study the use of graph homomorphism density features as a scalable alternative to homomorphism numbers.
We propose a high-performance implementation of a simple sampling algorithm which computes additive approximations of homomorphism densities.
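A minimal version of such a sampler (an illustration of the general idea, not the paper's implementation): the homomorphism density t(F, G) is the probability that a uniformly random map V(F) -> V(G) sends every edge of F to an edge of G, so an empirical mean over random maps gives an additive approximation.

```python
import numpy as np

def hom_density(F_edges, nF, adj_G, samples=200_000, rng=np.random.default_rng(0)):
    """Additive approximation of the homomorphism density t(F, G): the
    fraction of uniformly random maps V(F) -> V(G) that send every edge of
    F to an edge of G. By Hoeffding's inequality, +/- eps additive error
    needs only O(1 / eps^2) samples, independent of the size of G."""
    nG = adj_G.shape[0]
    phi = rng.integers(0, nG, size=(samples, nF))   # random vertex maps
    ok = np.ones(samples, dtype=bool)
    for u, v in F_edges:
        ok &= adj_G[phi[:, u], phi[:, v]].astype(bool)
    return ok.mean()

# Toy usage: the triangle density of K5 is 5*4*3 / 5^3 = 0.48 exactly.
triangle = [(0, 1), (1, 2), (2, 0)]
adj_K5 = np.ones((5, 5), dtype=int) - np.eye(5, dtype=int)
print(hom_density(triangle, 3, adj_K5))             # ~0.48
```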
arXiv Detail & Related papers (2021-04-08T20:25:37Z) - Distance Encoding: Design Provably More Powerful Neural Networks for
Graph Representation Learning [63.97983530843762]
Graph Neural Networks (GNNs) have achieved great success in graph representation learning.
GNNs generate identical representations for graph substructures that may in fact be very different.
More powerful GNNs, proposed recently by mimicking higher-order tests, are inefficient as they cannot exploit the sparsity of the underlying graph structure.
We propose Distance Encoding (DE) as a new class of features for graph representation learning.
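At its simplest, a distance encoding is the distance from each node to a designated target node set, used as an extra node feature; the sketch below computes the shortest-path (BFS) instance. This is an illustrative reading of DE, not the paper's full feature family.

```python
import numpy as np
from collections import deque

def distance_feature(adj, targets):
    """Shortest-path (BFS) distance from each node to a target node set,
    usable as an extra node feature -- the simplest instance of a
    distance-based encoding (an illustration)."""
    n = adj.shape[0]
    dist = np.full(n, -1, dtype=int)        # -1 marks "unreached"
    q = deque(targets)
    for t in targets:
        dist[t] = 0
    while q:
        u = q.popleft()
        for v in np.nonzero(adj[u])[0]:
            if dist[v] < 0:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

# Toy usage: distances from node 0 on a 5-node path graph.
n = 5
adj = np.zeros((n, n), dtype=int)
for i in range(n - 1):
    adj[i, i + 1] = adj[i + 1, i] = 1
print(distance_feature(adj, [0]))           # [0 1 2 3 4]
```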
arXiv Detail & Related papers (2020-08-31T23:15:40Z) - Improving Graph Neural Network Expressivity via Subgraph Isomorphism
Counting [63.04999833264299]
"Graph Substructure Networks" (GSN) is a topologically-aware message passing scheme based on substructure encoding.
We show that it is strictly more expressive than the Weisfeiler-Leman (WL) graph isomorphism test.
We perform an extensive evaluation on graph classification and regression tasks and obtain state-of-the-art results in diverse real-world settings.
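A sketch of the substructure-encoding idea in general (assumptions: triangles as the pattern and a single ReLU layer; this is not GSN's exact architecture): count how many copies of a small pattern each node belongs to, and let message passing see those counts alongside the ordinary features.

```python
import numpy as np

def triangle_memberships(adj):
    """Number of triangle subgraphs containing each node: (A^3)_{vv} / 2,
    since each triangle at v is traversed as two closed walks of length 3."""
    return np.diag(np.linalg.matrix_power(adj, 3)) // 2

def structure_aware_layer(adj, h, w):
    """One message-passing layer whose messages also carry the structural
    counts -- a sketch of substructure encoding, not GSN itself."""
    counts = triangle_memberships(adj)[:, None].astype(float)
    z = np.concatenate([h, counts], axis=1)   # features ++ structural counts
    return np.maximum(0.0, adj @ z @ w)       # aggregate, then ReLU

# Toy usage: a triangle with a pendant tail and 2-dimensional features.
adj = np.array([[0, 1, 1, 0],
                [1, 0, 1, 0],
                [1, 1, 0, 1],
                [0, 0, 1, 0]])
h = np.ones((4, 2))
w = np.full((3, 2), 0.1)
print(structure_aware_layer(adj, h, w))
```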
arXiv Detail & Related papers (2020-06-16T15:30:31Z) - Block-Approximated Exponential Random Graphs [77.4792558024487]
An important challenge in the field of exponential random graphs (ERGs) is the fitting of non-trivial ERGs on large graphs.
We propose an approximative framework for fitting such non-trivial ERGs that results in dyadic-independence (i.e., edge-independent) distributions.
Our methods are scalable to sparse graphs consisting of millions of nodes.
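For concreteness, a dyadic-independence (edge-independent) distribution is one in which each potential edge (i, j) appears independently with its own probability p_ij; the sketch below samples from such a model (the paper's block-approximation machinery is not shown).

```python
import numpy as np

def sample_dyadic_independent(p, rng=np.random.default_rng(0)):
    """Sample an undirected graph in which each edge (i, j) is present
    independently with probability p[i, j] -- the edge-independent family
    that the approximation targets."""
    n = p.shape[0]
    upper = np.triu(rng.random((n, n)) < p, k=1)   # decide each dyad once
    return (upper | upper.T).astype(int)           # symmetrize

# Toy usage: a two-block stochastic block model, itself edge-independent.
n = 6
p = np.full((n, n), 0.1)
p[:3, :3] = p[3:, 3:] = 0.8
print(sample_dyadic_independent(p))
```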
arXiv Detail & Related papers (2020-02-14T11:42:16Z)
This list is automatically generated from the titles and abstracts of the papers on this site.