Asynchronous Algorithmic Alignment with Cocycles
- URL: http://arxiv.org/abs/2306.15632v3
- Date: Fri, 12 Jan 2024 16:13:26 GMT
- Title: Asynchronous Algorithmic Alignment with Cocycles
- Authors: Andrew Dudzik, Tamara von Glehn, Razvan Pascanu, Petar Veličković
- Abstract summary: State-of-the-art neural algorithmic reasoners make use of message passing in graph neural networks (GNNs).
Typical GNNs blur the distinction between the definition and invocation of the message function, forcing a node to send messages to its neighbours at every layer, synchronously.
In this work, we explicitly separate the concepts of node state update and message function invocation.
- Score: 22.993659485873245
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: State-of-the-art neural algorithmic reasoners make use of message passing in
graph neural networks (GNNs). But typical GNNs blur the distinction between the
definition and invocation of the message function, forcing a node to send
messages to its neighbours at every layer, synchronously. When applying GNNs to
learn to execute dynamic programming algorithms, however, on most steps only a
handful of the nodes would have meaningful updates to send. One, hence, runs
the risk of inefficiencies by sending too much irrelevant data across the
graph. But more importantly, many intermediate GNN steps have to learn the
identity functions, which is a non-trivial learning problem. In this work, we
explicitly separate the concepts of node state update and message function
invocation. With this separation, we obtain a mathematical formulation that
allows us to reason about asynchronous computation in both algorithms and
neural networks. Our analysis yields several practical implementations of
synchronous scalable GNN layers that are provably invariant under various forms
of asynchrony.
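To make the separation concrete, below is a minimal Python sketch (our own illustration, not the paper's implementation; the linear message map and the max aggregator are assumptions). Messages are defined once from the layer's input states but may be invoked, i.e. delivered, in any order; because the state update is a commutative, associative and idempotent aggregation, the resulting node states do not depend on the delivery schedule.

```python
# Illustrative sketch only: separating message definition from message invocation.
import numpy as np

def message(sender_state):
    # Hypothetical message function; a fixed linear map stands in for learned weights.
    W = np.array([[1.0, 0.5], [0.0, 1.0]])
    return W @ sender_state

def update(receiver_state, incoming):
    # Node state update via elementwise max: commutative, associative and idempotent,
    # so reordered or repeated deliveries cannot change the result.
    return np.maximum(receiver_state, incoming)

def layer(edges, states, schedule):
    # Messages are defined once, from the layer's input states ...
    msgs = {(u, v): message(states[u]) for (u, v) in edges}
    # ... but may be invoked (delivered) in any order.
    new_states = {k: v.copy() for k, v in states.items()}
    for (u, v) in schedule:
        new_states[v] = update(new_states[v], msgs[(u, v)])
    return new_states

edges = [(0, 1), (1, 2), (0, 2)]
states = {i: np.array([float(i), 1.0]) for i in range(3)}
out_a = layer(edges, states, schedule=edges)
out_b = layer(edges, states, schedule=list(reversed(edges)))
assert all(np.allclose(out_a[i], out_b[i]) for i in range(3))  # order-invariant
```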
Related papers
- Towards Dynamic Message Passing on Graphs [104.06474765596687]
We propose a novel dynamic message-passing mechanism for graph neural networks (GNNs).
It projects graph nodes and learnable pseudo nodes into a common space with measurable spatial relations between them.
With nodes moving in the space, their evolving relations facilitate flexible pathway construction for a dynamic message-passing process.
arXiv Detail & Related papers (2024-10-31T07:20:40Z)
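A loose, hypothetical sketch of the pseudo-node mechanism summarized in the entry above (the variable names, Gaussian distance weighting and residual update are our assumptions, not the paper's exact design): real nodes and learnable pseudo nodes share a common space, node features are gathered into the pseudo nodes with distance-based weights, and the result is scattered back along the same pathways.

```python
# Hypothetical illustration of message passing mediated by pseudo nodes.
import numpy as np

rng = np.random.default_rng(0)
n, m, d = 8, 2, 4                                        # nodes, pseudo nodes, feature dim
X = rng.standard_normal((n, d))                          # node features
P = rng.standard_normal((n, 2))                          # node positions in the common space
Q = rng.standard_normal((m, 2))                          # learnable pseudo-node positions
W = rng.standard_normal((d, d))                          # stand-in for learned weights

dist2 = ((P[:, None, :] - Q[None, :, :]) ** 2).sum(-1)   # (n, m) squared distances
w = np.exp(-dist2)
w /= w.sum(axis=1, keepdims=True)                        # closer pseudo nodes get more weight
pseudo = w.T @ X                                         # gather: node features -> pseudo nodes
X_new = X + np.tanh(w @ pseudo @ W)                      # scatter back to the nodes
```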
- Graph Neural Networks Gone Hogwild [14.665528337423249]
Message passing graph neural networks (GNNs) can generate catastrophically incorrect predictions when nodes update asynchronously during inference.
In this work we identify "implicitly-defined" GNNs as a class of architectures which is provably robust to partially asynchronous "hogwild" inference.
We then propose a novel implicitly-defined GNN architecture, which we call an energy GNN.
arXiv Detail & Related papers (2024-06-29T17:11:09Z)
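The robustness claim above rests on node states being defined implicitly, e.g. as the fixed point of a contraction, rather than by a fixed stack of synchronous layers. A toy sketch under that assumption (a generic contraction, not the authors' energy GNN): any update schedule reaches essentially the same fixed point.

```python
# Toy illustration: implicitly-defined node states as the fixed point of a contraction,
# so asynchronous ("hogwild"-style) update schedules agree on the answer.
import numpy as np

rng = np.random.default_rng(0)
n, d = 5, 3
A = (rng.random((n, n)) < 0.5).astype(float)
A /= np.maximum(A.sum(axis=1, keepdims=True), 1.0)   # row-normalise the neighbourhoods
W = 0.1 * rng.standard_normal((d, d))                # small norm keeps the map contractive
b = rng.standard_normal((n, d))

def node_update(X, v):
    # Recompute node v from its neighbours' current states.
    return np.tanh(A[v] @ X @ W + b[v])

def solve(order_fn, sweeps=300):
    X = np.zeros((n, d))
    for _ in range(sweeps):
        for v in order_fn():
            X[v] = node_update(X, v)
    return X

fixed_sweep = solve(lambda: range(n))                # deterministic schedule
hogwild = solve(lambda: rng.permutation(n))          # random, "asynchronous" schedule
print(np.abs(fixed_sweep - hogwild).max())           # ~0: both reach the same fixed point
```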
- Temporal Aggregation and Propagation Graph Neural Networks for Dynamic Representation [67.26422477327179]
Temporal graphs exhibit dynamic interactions between nodes over continuous time.
We propose a novel method of temporal graph convolution with the whole neighborhood.
Our proposed TAP-GNN outperforms existing temporal graph methods by a large margin in terms of both predictive performance and online inference latency.
arXiv Detail & Related papers (2023-04-15T08:17:18Z)
- Graph Ordering Attention Networks [22.468776559433614]
Graph Neural Networks (GNNs) have been successfully used in many problems involving graph-structured data.
We introduce the Graph Ordering Attention (GOAT) layer, a novel GNN component that captures interactions between nodes in a neighborhood.
The GOAT layer demonstrates increased performance in modeling graph metrics that capture complex information.
arXiv Detail & Related papers (2022-04-11T18:13:19Z)
- AEGNN: Asynchronous Event-based Graph Neural Networks [54.528926463775946]
Asynchronous Event-based Graph Neural Networks (AEGNNs) generalize standard GNNs to process events as "evolving" spatio-temporal graphs.
AEGNNs are easily trained on synchronous inputs and can be converted to efficient, "asynchronous" networks at test time.
arXiv Detail & Related papers (2022-03-31T16:21:12Z)
- Graph Neural Networks are Dynamic Programmers [0.0]
Graph neural networks (GNNs) are claimed to align with dynamic programming (DP).
Here we show, using methods from category theory and abstract algebra, that there exists an intricate connection between GNNs and DP.
arXiv Detail & Related papers (2022-03-29T13:27:28Z)
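One textbook instance of this alignment (our illustration, not an example taken from the paper): Bellman-Ford shortest paths is message passing in which every edge sends "distance plus edge weight" and every node aggregates with a min, one relaxation round per layer.

```python
# Bellman-Ford as (min, +) message passing: messages are dist[u] + w(u, v),
# and the node update is a min over the node's own value and its incoming messages.
import math

def bellman_ford(num_nodes, edges, source):
    # edges: list of (u, v, weight) triples
    dist = [math.inf] * num_nodes
    dist[source] = 0.0
    for _ in range(num_nodes - 1):                    # one "layer" per relaxation round
        dist = [
            min([dist[v]] + [dist[u] + w for (u, t, w) in edges if t == v])
            for v in range(num_nodes)
        ]
    return dist

print(bellman_ford(4, [(0, 1, 1.0), (1, 2, 2.0), (0, 2, 5.0), (2, 3, 1.0)], 0))
# -> [0.0, 1.0, 3.0, 4.0]
```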
- VQ-GNN: A Universal Framework to Scale up Graph Neural Networks using Vector Quantization [70.8567058758375]
VQ-GNN is a universal framework for scaling up any convolution-based GNN using Vector Quantization (VQ) without compromising performance.
Our framework avoids the "neighbor explosion" problem of GNNs by using quantized representations combined with a low-rank version of the graph convolution matrix.
arXiv Detail & Related papers (2021-10-27T11:48:50Z)
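A rough numpy sketch of the ingredients named above (the codebook size, hard nearest-centroid assignment and random "convolution" matrix are our assumptions, not the paper's training procedure): quantizing node features into a small codebook turns the dense propagation C @ X @ W into a product through the much thinner matrix C @ R, so no node needs its neighbours' exact features.

```python
# Illustrative only: low-rank message passing through vector-quantized features.
import numpy as np

rng = np.random.default_rng(0)
n, d, K = 200, 8, 4
X = rng.standard_normal((n, d))                      # node features
C = rng.random((n, n))
C /= C.sum(axis=1, keepdims=True)                    # a row-normalised "convolution" matrix
W = rng.standard_normal((d, d))

V = X[rng.choice(n, size=K, replace=False)]          # codebook: K centroids
assign = ((X[:, None, :] - V[None, :, :]) ** 2).sum(-1).argmin(axis=1)
R = np.eye(K)[assign]                                # one-hot assignments, shape (n, K)

exact = C @ X @ W                                    # full message passing
approx = (C @ R) @ (V @ W)                           # only K "virtual" senders are needed
print(np.linalg.norm(exact - approx) / np.linalg.norm(exact))   # relative error of the surrogate
```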
- Identity-aware Graph Neural Networks [63.6952975763946]
We develop a class of message passing Graph Neural Networks (ID-GNNs) with greater expressive power than the 1-WL test.
ID-GNN extends existing GNN architectures by inductively considering nodes' identities during message passing.
We show that transforming existing GNNs to ID-GNNs yields on average 40% accuracy improvement on challenging node, edge, and graph property prediction tasks.
arXiv Detail & Related papers (2021-01-25T18:59:01Z)
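A small illustrative sketch of "considering node identities during message passing" (the two-weight-matrix trick and every name below are our assumptions, not the authors' architecture): when embedding a target node, messages originating from the target itself pass through separate weights, so cycles returning to the root become distinguishable from other paths.

```python
# Illustrative identity-aware message passing (not the ID-GNN reference code).
import numpy as np

rng = np.random.default_rng(0)
n, d = 6, 4
A = (rng.random((n, n)) < 0.5).astype(float)
np.fill_diagonal(A, 0.0)
X = rng.standard_normal((n, d))
W_other = rng.standard_normal((d, d))                # weights for ordinary senders
W_self = rng.standard_normal((d, d))                 # extra weights for the "identity" node

def embed(target, layers=2):
    H = X.copy()
    for _ in range(layers):
        new_H = np.zeros_like(H)
        for v in range(n):
            for u in np.nonzero(A[:, v])[0]:         # every edge u -> v
                W = W_self if u == target else W_other
                new_H[v] += np.tanh(H[u] @ W)
        H = new_H
    return H[target]

print(embed(target=0))                               # identity-aware embedding of node 0
```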
- Towards Deeper Graph Neural Networks with Differentiable Group Normalization [61.20639338417576]
Graph neural networks (GNNs) learn the representation of a node by aggregating its neighbors.
Over-smoothing is one of the key issues which limit the performance of GNNs as the number of layers increases.
We introduce two over-smoothing metrics and a novel technique, differentiable group normalization (DGN).
arXiv Detail & Related papers (2020-06-12T07:18:02Z)
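A loose sketch of the normalization idea in the entry above (the soft group assignment, per-group statistics and residual scale are illustrative assumptions, not the paper's exact layer): nodes are softly assigned to groups and normalized within each group, which keeps distinct clusters of nodes from being smoothed onto each other as depth grows.

```python
# Illustrative group-wise normalization of node embeddings.
import numpy as np

rng = np.random.default_rng(0)
n, d, G = 10, 4, 3
H = rng.standard_normal((n, d))                      # node embeddings after a GNN layer
U = rng.standard_normal((d, G))                      # projection used for group assignment

S = np.exp(H @ U)
S /= S.sum(axis=1, keepdims=True)                    # soft group assignments, shape (n, G)

out = np.zeros_like(H)
for g in range(G):
    Hg = S[:, [g]] * H                               # embeddings weighted by group membership
    out += (Hg - Hg.mean(axis=0)) / (Hg.std(axis=0) + 1e-5)   # normalise within the group

H_new = H + 0.1 * out                                # residual mix-in; the 0.1 scale is arbitrary
```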
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this content (including all information) and is not responsible for any consequences of its use.