Pointer Graph Networks
- URL: http://arxiv.org/abs/2006.06380v2
- Date: Sun, 18 Oct 2020 20:00:59 GMT
- Title: Pointer Graph Networks
- Authors: Petar Veličković, Lars Buesing, Matthew C. Overlan, Razvan Pascanu, Oriol Vinyals, Charles Blundell
- Abstract summary: Graph neural networks (GNNs) are typically applied to static graphs that are assumed to be known upfront.
Pointer Graph Networks (PGNs) augment sets or graphs with additional inferred edges for improved model generalisation ability.
PGNs allow each node to dynamically point to another node, followed by message passing over these pointers.
- Score: 48.44209547013781
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph neural networks (GNNs) are typically applied to static graphs that are
assumed to be known upfront. This static input structure is often informed
purely by insight of the machine learning practitioner, and might not be
optimal for the actual task the GNN is solving. In the absence of reliable domain
expertise, one might resort to inferring the latent graph structure, which is
often difficult due to the vast search space of possible graphs. Here we
introduce Pointer Graph Networks (PGNs) which augment sets or graphs with
additional inferred edges for improved model generalisation ability. PGNs allow
each node to dynamically point to another node, followed by message passing
over these pointers. The sparsity of this adaptable graph structure makes
learning tractable while still being sufficiently expressive to simulate
complex algorithms. Critically, the pointing mechanism is directly supervised
to model long-term sequences of operations on classical data structures,
incorporating useful structural inductive biases from theoretical computer
science. Qualitatively, we demonstrate that PGNs can learn parallelisable
variants of pointer-based data structures, namely disjoint set unions and
link/cut trees. PGNs generalise out-of-distribution to 5x larger test inputs on
dynamic graph connectivity tasks, outperforming unrestricted GNNs and Deep
Sets.
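To make the pointing mechanism concrete, here is a minimal NumPy sketch of one PGN-style step, assuming dot-product attention with linear query/key projections and a hard argmax pointer choice. The real model additionally masks pointer candidates, supervises the pointers against ground-truth data-structure operations (e.g. disjoint-set parent pointers), and uses learned update functions; all names below are illustrative, not from the authors' code.

```python
import numpy as np

def pgn_pointer_step(h, W_q, W_k):
    """Illustrative PGN-style step: every node attends over all nodes,
    "points" at its argmax target, and messages are then passed only
    over those n pointer edges."""
    n, d = h.shape
    q, k = h @ W_q, h @ W_k              # query/key projections (assumed linear)
    scores = (q @ k.T) / np.sqrt(d)      # dense attention logits
    ptr = scores.argmax(axis=1)          # hard pointer: one target per node
    A = np.zeros((n, n))
    A[np.arange(n), ptr] = 1.0           # pointer adjacency: only n edges
    A = np.maximum(A, A.T)               # treat pointer edges as undirected
    return A @ h, ptr                    # message passing restricted to pointers

rng = np.random.default_rng(0)
h = rng.standard_normal((8, 16))         # 8 nodes with 16-dim states
W_q, W_k = rng.standard_normal((16, 16)), rng.standard_normal((16, 16))
messages, pointers = pgn_pointer_step(h, W_q, W_k)
```

The sparsity claimed in the abstract is visible here: although attention scores are dense, message passing touches only the n pointer edges.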
Related papers
- NodeFormer: A Scalable Graph Structure Learning Transformer for Node Classification [70.51126383984555]
We introduce a novel all-pair message passing scheme for efficiently propagating node signals between arbitrary nodes.
The efficient computation is enabled by a kernelized Gumbel-Softmax operator.
Experiments demonstrate the promising efficacy of the method in various tasks including node classification on graphs.
arXiv Detail & Related papers (2023-06-14T09:21:15Z)
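The NodeFormer entry above relies on a Gumbel-Softmax operator to sample edges differentiably. The sketch below shows only the plain (non-kernelized) operator; the paper's actual contribution, a kernelized variant that avoids materialising the quadratic all-pair matrix, is not reproduced here.

```python
import numpy as np

def gumbel_softmax(logits, tau=0.5, rng=None):
    """Plain Gumbel-Softmax: differentiable, near-discrete samples from the
    categorical distribution given by `logits` (lower tau = more discrete)."""
    rng = rng or np.random.default_rng()
    y = (logits + rng.gumbel(size=logits.shape)) / tau  # perturb with Gumbel noise
    y = np.exp(y - y.max(axis=-1, keepdims=True))       # numerically stable softmax
    return y / y.sum(axis=-1, keepdims=True)

logits = np.random.default_rng(0).standard_normal((5, 5))  # all-pair edge scores
edge_weights = gumbel_softmax(logits, tau=0.5)             # each row sums to 1
```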
- Discovering the Representation Bottleneck of Graph Neural Networks from Multi-order Interactions [51.597480162777074]
Graph neural networks (GNNs) rely on the message passing paradigm to propagate node features and build interactions.
Recent works point out that different graph learning tasks require different ranges of interactions between nodes.
We study two common graph construction methods in scientific domains, i.e., K-nearest neighbor (KNN) graphs and fully-connected (FC) graphs.
arXiv Detail & Related papers (2022-05-15T11:38:14Z)
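For concreteness, a sketch of the two graph constructions compared in the entry above, assuming the nodes are points in Euclidean space (typical for the scientific domains mentioned):

```python
import numpy as np

def knn_adjacency(x, k):
    """K-nearest-neighbour graph over points x of shape (n, d)."""
    d = np.linalg.norm(x[:, None] - x[None, :], axis=-1)  # pairwise distances
    np.fill_diagonal(d, np.inf)                           # exclude self-loops
    nbrs = np.argsort(d, axis=1)[:, :k]                   # k closest per node
    A = np.zeros((len(x), len(x)))
    np.put_along_axis(A, nbrs, 1.0, axis=1)
    return np.maximum(A, A.T)                             # symmetrise

def fc_adjacency(n):
    """Fully-connected graph: every pair of distinct nodes is an edge."""
    return np.ones((n, n)) - np.eye(n)

x = np.random.default_rng(0).standard_normal((10, 3))     # e.g. 10 atoms in 3D
A_knn, A_fc = knn_adjacency(x, k=3), fc_adjacency(len(x))
```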
- Inferential SIR-GN: Scalable Graph Representation Learning [0.4699313647907615]
Graph representation learning methods generate numerical vector representations for the nodes in a network.
In this work, we propose Inferential SIR-GN, a model which is pre-trained on random graphs, then computes node representations rapidly.
We demonstrate that the model is able to capture a node's structural role information, and show excellent performance at node and graph classification tasks on unseen networks.
arXiv Detail & Related papers (2021-11-08T20:56:37Z)
- Hierarchical graph neural nets can capture long-range interactions [8.067880298298185]
We study hierarchical message passing models that leverage a multi-resolution representation of a given graph.
This facilitates learning of features that span large receptive fields without loss of local information.
We introduce Hierarchical Graph Net (HGNet), which for any two connected nodes guarantees existence of message-passing paths of at most logarithmic length.
arXiv Detail & Related papers (2021-07-15T16:24:22Z)
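A rough sketch of the hierarchical idea behind HGNet; the pairwise pooling and mean aggregation below are illustrative assumptions rather than the paper's architecture, but they show why any two nodes end up connected by a message-passing path of logarithmic length:

```python
import numpy as np

def hierarchical_pass(h):
    """Toy multi-resolution message pass: pool node pairs into coarser levels,
    then broadcast coarse summaries back down. Any two of the n nodes exchange
    information through at most O(log n) pooling/unpooling hops."""
    levels = [h.copy()]
    while len(levels[-1]) > 1:                        # build the hierarchy
        cur = levels[-1]
        if len(cur) % 2:                              # pad odd levels with a zero row
            cur = np.vstack([cur, np.zeros((1, cur.shape[1]))])
        levels.append((cur[0::2] + cur[1::2]) / 2)    # parent = mean of its children
    rev = levels[::-1]                                # coarsest (root) level first
    for up, down in zip(rev, rev[1:]):                # top-down refinement
        down += np.repeat(up, 2, axis=0)[:len(down)]  # add each parent's summary
    return levels[0]

h = np.random.default_rng(0).standard_normal((7, 4))  # 7 nodes, 4-dim features
h_mixed = hierarchical_pass(h)                        # same shape, long-range context
```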
- SLAPS: Self-Supervision Improves Structure Learning for Graph Neural Networks [14.319159694115655]
We propose the Simultaneous Learning of Adjacency and GNN Parameters with Self-supervision, or SLAPS, a method that provides more supervision for inferring a graph structure through self-supervision.
A comprehensive experimental study demonstrates that SLAPS scales to large graphs with hundreds of thousands of nodes and outperforms several models that have been proposed to learn a task-specific graph structure on established benchmarks.
arXiv Detail & Related papers (2021-02-09T18:56:01Z)
- Scalable Graph Neural Networks for Heterogeneous Graphs [12.44278942365518]
Graph neural networks (GNNs) are a popular class of parametric model for learning over graph-structured data.
Recent work has argued that GNNs primarily use the graph for feature smoothing, and has shown competitive results on benchmark tasks.
In this work, we ask whether these results can be extended to heterogeneous graphs, which encode multiple types of relationship between different entities.
arXiv Detail & Related papers (2020-11-19T06:03:35Z)
- Distance Encoding: Design Provably More Powerful Neural Networks for Graph Representation Learning [63.97983530843762]
Graph Neural Networks (GNNs) have achieved great success in graph representation learning.
GNNs generate identical representations for graph substructures that may in fact be very different.
More powerful GNNs, proposed recently by mimicking higher-order Weisfeiler-Leman tests, are inefficient as they cannot exploit the sparsity of the underlying graph structure.
We propose Distance Encoding (DE) as a new class of features for graph representation learning.
arXiv Detail & Related papers (2020-08-31T23:15:40Z)
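The distance-encoding idea in the entry above can be illustrated with a multi-source BFS that attaches each node's shortest-path distance to a target node set as an extra feature; the paper's framework is more general (other distance measures, and DE used to control aggregation), so treat this as a minimal sketch:

```python
import numpy as np
from collections import deque

def distance_encoding(adj, targets):
    """Shortest-path distance from every node to the target set; appending
    these values to node features breaks the symmetry that makes plain
    message passing assign identical embeddings to distinct structures."""
    n = len(adj)
    dist = np.full(n, np.inf)
    queue = deque(targets)
    for t in targets:
        dist[t] = 0
    while queue:                                   # multi-source BFS
        u = queue.popleft()
        for v in np.nonzero(adj[u])[0]:
            if dist[v] == np.inf:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

adj = np.array([[0,1,0,0],[1,0,1,0],[0,1,0,1],[0,0,1,0]])  # path graph 0-1-2-3
print(distance_encoding(adj, targets=[0]))                  # [0. 1. 2. 3.]
```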
- Improving Graph Neural Network Expressivity via Subgraph Isomorphism Counting [63.04999833264299]
"Graph Substructure Networks" (GSN) is a topologically-aware message passing scheme based on substructure encoding.
We show that it is strictly more expressive than the Weisfeiler-Leman (WL) graph isomorphism test.
We perform an extensive evaluation on graph classification and regression tasks and obtain state-of-the-art results in diverse real-world settings.
arXiv Detail & Related papers (2020-06-16T15:30:31Z)
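A minimal example of the substructure encoding that GSN builds on, using triangle counts, the simplest non-trivial pattern; GSN itself handles general subgraph isomorphism counts and injects them into message passing rather than only into input features:

```python
import numpy as np

def triangle_counts(adj):
    """Per-node triangle counts from the adjacency matrix: (A^3)_ii counts
    closed walks of length 3 from node i, and each triangle contributes two
    such walks (one per orientation), hence the division by 2."""
    a3 = np.linalg.matrix_power(adj, 3)
    return np.diag(a3) // 2

adj = np.array([[0,1,1,0],
                [1,0,1,0],
                [1,1,0,1],
                [0,0,1,0]])          # one triangle on nodes {0, 1, 2}
print(triangle_counts(adj))          # [1 1 1 0]
```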
- Eigen-GNN: A Graph Structure Preserving Plug-in for GNNs [95.63153473559865]
Graph Neural Networks (GNNs) are emerging machine learning models on graphs.
Most existing GNN models in practice are shallow and essentially feature-centric.
We show empirically and analytically that the existing shallow GNNs cannot preserve graph structures well.
We propose Eigen-GNN, a plug-in module to boost GNNs ability in preserving graph structures.
arXiv Detail & Related papers (2020-06-08T02:47:38Z)
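A sketch of the plug-in idea described above: concatenate dominant eigenvectors of the adjacency matrix to the node features before running any GNN, so that global structure survives even a shallow model. The eigenvector selection and lack of normalisation below are assumptions, not the paper's exact recipe:

```python
import numpy as np

def eigen_features(adj, x, k=4):
    """Augment node features x with the k dominant eigenvectors of a
    symmetric adjacency matrix as a structure-preserving plug-in."""
    vals, vecs = np.linalg.eigh(adj)                  # symmetric adjacency assumed
    top = vecs[:, np.argsort(-np.abs(vals))[:k]]      # k largest-magnitude modes
    return np.hstack([x, top])

adj = np.array([[0,1,1],[1,0,0],[1,0,0]], dtype=float)
x = np.eye(3)                                         # dummy node features
x_aug = eigen_features(adj, x, k=2)                   # shape (3, 5)
```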
This list is automatically generated from the titles and abstracts of the papers on this site.