Temporal Graph Networks for Deep Learning on Dynamic Graphs
- URL: http://arxiv.org/abs/2006.10637v3
- Date: Fri, 9 Oct 2020 11:39:32 GMT
- Title: Temporal Graph Networks for Deep Learning on Dynamic Graphs
- Authors: Emanuele Rossi, Ben Chamberlain, Fabrizio Frasca, Davide Eynard,
Federico Monti, Michael Bronstein
- Abstract summary: We present Temporal Graph Networks (TGNs), a generic, efficient framework for deep learning on dynamic graphs represented as sequences of timed events.
Thanks to a novel combination of memory modules and graph-based operators, TGNs are able to significantly outperform previous approaches while being more computationally efficient.
- Score: 4.5158585619109495
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph Neural Networks (GNNs) have recently become increasingly popular due to
their ability to learn complex systems of relations or interactions arising in
a broad spectrum of problems ranging from biology and particle physics to
social networks and recommendation systems. Despite the plethora of different
models for deep learning on graphs, few approaches have been proposed thus far
for dealing with graphs that present some sort of dynamic nature (e.g. evolving
features or connectivity over time). In this paper, we present Temporal Graph
Networks (TGNs), a generic, efficient framework for deep learning on dynamic
graphs represented as sequences of timed events. Thanks to a novel combination
of memory modules and graph-based operators, TGNs are able to significantly
outperform previous approaches while being more computationally efficient. We
furthermore show that several previous models for learning on
dynamic graphs can be cast as specific instances of our framework. We perform a
detailed ablation study of different components of our framework and devise the
best configuration that achieves state-of-the-art performance on several
transductive and inductive prediction tasks for dynamic graphs.
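To make the recipe concrete, here is a minimal sketch of the memory mechanism the abstract describes, assuming a GRU-based memory updater and a simple concatenation message function. The class and names (TGNSketch, update_on_event, memory_dim) are illustrative, not the authors' implementation:

```python
import torch
import torch.nn as nn

class TGNSketch(nn.Module):
    """Toy TGN-style memory: one vector per node, updated on timed events."""

    def __init__(self, num_nodes, memory_dim=100, msg_dim=100):
        super().__init__()
        # Persistent per-node memory, updated as interaction events arrive.
        self.register_buffer("memory", torch.zeros(num_nodes, memory_dim))
        # GRU cell compresses each incoming message into the node's memory.
        self.updater = nn.GRUCell(msg_dim, memory_dim)
        # Message function: [source memory | destination memory | event time].
        self.msg_fn = nn.Linear(2 * memory_dim + 1, msg_dim)

    def update_on_event(self, src, dst, t):
        """Update both endpoints' memories for an interaction event at time t."""
        ts = torch.full((src.numel(), 1), float(t))
        m_src = self.msg_fn(torch.cat([self.memory[src], self.memory[dst], ts], -1))
        m_dst = self.msg_fn(torch.cat([self.memory[dst], self.memory[src], ts], -1))
        # Detach: memory is state carried across events, not a gradient path here.
        self.memory[src] = self.updater(m_src, self.memory[src]).detach()
        self.memory[dst] = self.updater(m_dst, self.memory[dst]).detach()

model = TGNSketch(num_nodes=1000)
model.update_on_event(torch.tensor([0]), torch.tensor([42]), t=1.5)
```

In the full framework, node embeddings are then produced by a graph-based operator (e.g. temporal graph attention) over each node's memory and its temporal neighbourhood; these components are what the paper's ablation study varies.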
Related papers
- Information propagation dynamics in Deep Graph Networks [1.8130068086063336]
Deep Graph Networks (DGNs) have emerged as a family of deep learning models that can process and learn from structured data.
This thesis investigates the dynamics of information propagation within DGNs for static and dynamic graphs, focusing on their design as dynamical systems.
arXiv Detail & Related papers (2024-10-14T12:55:51Z)
- Node-Time Conditional Prompt Learning In Dynamic Graphs [14.62182210205324]
We propose DYGPROMPT, a novel pre-training and prompt learning framework for dynamic graph modeling.
We recognize that node and time features mutually characterize each other, and propose dual condition-nets to model the evolving node-time patterns in downstream tasks.
arXiv Detail & Related papers (2024-05-22T19:10:24Z)
- Decoupled Graph Neural Networks for Large Dynamic Graphs [14.635923016087503]
We propose a decoupled graph neural network for large dynamic graphs.
We show that our algorithm achieves state-of-the-art performance in both kinds of dynamic graphs.
arXiv Detail & Related papers (2023-05-14T23:00:10Z)
- Dynamic Graph Representation Learning with Neural Networks: A Survey [0.0]
Dynamic graph representations have emerged as a new machine learning problem.
This paper aims at providing a review of problems and models related to dynamic graph learning.
arXiv Detail & Related papers (2023-04-12T09:39:17Z)
- Time-aware Dynamic Graph Embedding for Asynchronous Structural Evolution [60.695162101159134]
Existing works merely view a dynamic graph as a sequence of changes.
We formulate dynamic graphs as temporal edge sequences associated with the joining time of vertices and the timespan of edges (ToEs).
A time-aware Transformer is proposed to embed vertices' dynamic connections and ToEs into the learned vertex representations.
arXiv Detail & Related papers (2022-07-01T15:32:56Z)
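As a rough illustration of what such a time-aware module can look like, here is a generic sketch in the spirit of learnable functional time encodings; it is not this paper's code, and TimeEncoding and TimeAwareAttention are hypothetical names:

```python
import math
import torch
import torch.nn as nn

class TimeEncoding(nn.Module):
    """phi(dt) = cos(dt * w + b), a common learnable functional time encoding."""
    def __init__(self, dim):
        super().__init__()
        self.w = nn.Parameter(torch.randn(dim))
        self.b = nn.Parameter(torch.zeros(dim))

    def forward(self, dt):                        # dt: (n_neighbours,)
        return torch.cos(dt.unsqueeze(-1) * self.w + self.b)

class TimeAwareAttention(nn.Module):
    def __init__(self, feat_dim, time_dim):
        super().__init__()
        self.time_enc = TimeEncoding(time_dim)
        d = feat_dim + time_dim
        self.q, self.k, self.v = nn.Linear(d, d), nn.Linear(d, d), nn.Linear(d, d)

    def forward(self, x_self, x_nbrs, dt):
        # Concatenate each neighbour's features with an encoding of the
        # elapsed time of its interaction, then attend from the target vertex.
        h_nbrs = torch.cat([x_nbrs, self.time_enc(dt)], dim=-1)
        h_self = torch.cat([x_self, self.time_enc(torch.zeros(1))], dim=-1)
        q = self.q(h_self)                        # (1, d)
        k, v = self.k(h_nbrs), self.v(h_nbrs)     # (n, d)
        attn = torch.softmax(q @ k.t() / math.sqrt(k.size(-1)), dim=-1)
        return attn @ v                           # (1, d) updated representation

att = TimeAwareAttention(feat_dim=32, time_dim=16)
out = att(torch.randn(1, 32), torch.randn(5, 32),
          dt=torch.tensor([0.5, 1.0, 2.0, 4.0, 8.0]))
```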
- Learning Graph Structure from Convolutional Mixtures [119.45320143101381]
We propose a graph convolutional relationship between the observed and latent graphs, and formulate the graph learning task as a network inverse (deconvolution) problem.
In lieu of eigendecomposition-based spectral methods, we unroll and truncate proximal gradient iterations to arrive at a parameterized neural network architecture that we call a Graph Deconvolution Network (GDN).
GDNs can learn a distribution of graphs in a supervised fashion, perform link prediction or edge-weight regression tasks by adapting the loss function, and they are inherently inductive.
arXiv Detail & Related papers (2022-05-19T14:08:15Z)
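The unrolling idea reads roughly as follows. This is a hedged sketch from the abstract alone: the gradient term and the ReLU "prox" are illustrative stand-ins for whatever fidelity term and proximal operator the paper actually uses:

```python
import torch
import torch.nn as nn

class GDNLayer(nn.Module):
    """One unrolled proximal-gradient step with learned scalars."""
    def __init__(self):
        super().__init__()
        self.tau = nn.Parameter(torch.tensor(0.1))    # learned step size
        self.alpha = nn.Parameter(torch.tensor(0.1))  # learned mixing weight

    def forward(self, S, A_obs):
        # Gradient-like correction pulling the latent estimate S toward
        # explaining the observed graph, then a ReLU "prox" keeping
        # edge weights nonnegative.
        grad = S - A_obs + self.alpha * (S @ S)
        return torch.relu(S - self.tau * grad)

class GDN(nn.Module):
    def __init__(self, n_layers=8):
        super().__init__()
        self.layers = nn.ModuleList(GDNLayer() for _ in range(n_layers))

    def forward(self, A_obs):
        S = A_obs.clone()            # initialize the latent estimate at the observation
        for layer in self.layers:    # truncated, unrolled iterations
            S = layer(S, A_obs)
        return S

latent = GDN()(torch.rand(10, 10))
```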
- Hyperbolic Graph Neural Networks: A Review of Methods and Applications [55.5502008501764]
Graph neural networks generalize conventional neural networks to graph-structured data.
The performance of Euclidean models in graph-related learning is still limited by the representational ability of Euclidean geometry.
Recently, hyperbolic space has gained increasing popularity in processing graph data with tree-like structure and power-law distribution.
arXiv Detail & Related papers (2022-02-28T15:08:48Z)
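For context, the shared ingredient of such models is a pair of maps between the Poincare ball and its tangent space at the origin. A standard, self-contained sketch follows; the curvature value and the tangent-space aggregation pattern are common choices, not specific to this survey:

```python
import torch

def expmap0(v, c=1.0, eps=1e-8):
    """Exponential map at the origin of the Poincare ball with curvature -c."""
    sqrt_c = c ** 0.5
    norm = v.norm(dim=-1, keepdim=True).clamp_min(eps)
    return torch.tanh(sqrt_c * norm) * v / (sqrt_c * norm)

def logmap0(x, c=1.0, eps=1e-8):
    """Logarithmic map at the origin (inverse of expmap0)."""
    sqrt_c = c ** 0.5
    norm = x.norm(dim=-1, keepdim=True).clamp_min(eps)
    return torch.atanh((sqrt_c * norm).clamp(max=1 - 1e-5)) * x / (sqrt_c * norm)

# A common pattern in hyperbolic GNN layers: aggregate in the tangent space,
# then map the result back onto the ball.
x = expmap0(torch.randn(5, 16) * 0.1)                 # node features on the ball
agg = expmap0(logmap0(x).mean(dim=0, keepdim=True))   # tangent-space mean
```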
- Efficient Dynamic Graph Representation Learning at Scale [66.62859857734104]
We propose Efficient Dynamic Graph lEarning (EDGE), which selectively expresses certain temporal dependencies via the training loss to improve parallelism in computations.
We show that EDGE can scale to dynamic graphs with millions of nodes and hundreds of millions of temporal events and achieve new state-of-the-art (SOTA) performance.
arXiv Detail & Related papers (2021-12-14T22:24:53Z)
- GCC: Graph Contrastive Coding for Graph Neural Network Pre-Training [62.73470368851127]
Graph representation learning has emerged as a powerful technique for addressing real-world problems.
We design Graph Contrastive Coding -- a self-supervised graph neural network pre-training framework.
We conduct experiments on three graph learning tasks and ten graph datasets.
arXiv Detail & Related papers (2020-06-17T16:18:35Z)
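Contrastive pre-training objectives of this kind are typically InfoNCE-style losses. A generic sketch, not GCC's exact recipe (which builds its instances from sampled subgraphs):

```python
import torch
import torch.nn.functional as F

def info_nce(z1, z2, temperature=0.07):
    """z1, z2: (batch, dim) encodings of two augmented views of the same instances.

    Matching rows are positives; all other rows in the batch act as negatives.
    """
    z1, z2 = F.normalize(z1, dim=-1), F.normalize(z2, dim=-1)
    logits = z1 @ z2.t() / temperature        # (batch, batch) similarity matrix
    labels = torch.arange(z1.size(0))         # positives sit on the diagonal
    return F.cross_entropy(logits, labels)

loss = info_nce(torch.randn(32, 64), torch.randn(32, 64))
```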
- Tensor Graph Convolutional Networks for Multi-relational and Robust Learning [74.05478502080658]
This paper introduces a tensor-graph convolutional network (TGCN) for scalable semi-supervised learning (SSL) from data associated with a collection of graphs, which are represented by a tensor.
The proposed architecture achieves markedly improved performance relative to standard GCNs, copes with state-of-the-art adversarial attacks, and leads to remarkable SSL performance over protein-to-protein interaction networks.
arXiv Detail & Related papers (2020-03-15T02:33:21Z)
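A minimal sketch of what a convolution over such an adjacency tensor could look like; this is an illustrative reading of the abstract, and the mixing scheme and names are assumptions:

```python
import torch
import torch.nn as nn

class TensorGraphConv(nn.Module):
    """Propagate features through each relation's adjacency slice, then mix."""

    def __init__(self, n_relations, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)
        # One learned mixing coefficient per relation (adjacency tensor slice).
        self.mix = nn.Parameter(torch.ones(n_relations) / n_relations)

    def forward(self, A, X):
        # A: (n_relations, n_nodes, n_nodes) adjacency tensor, X: (n_nodes, in_dim).
        # Relation-weighted propagation, summed over relations.
        H = torch.einsum("r,rij,jd->id", self.mix, A, X)
        return torch.relu(self.lin(H))

layer = TensorGraphConv(n_relations=3, in_dim=16, out_dim=32)
out = layer(torch.rand(3, 10, 10), torch.randn(10, 16))
```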