Graph Neural Networks for temporal graphs: State of the art, open
challenges, and opportunities
- URL: http://arxiv.org/abs/2302.01018v4
- Date: Sat, 8 Jul 2023 12:43:42 GMT
- Title: Graph Neural Networks for temporal graphs: State of the art, open
challenges, and opportunities
- Authors: Antonio Longa, Veronica Lachi, Gabriele Santin, Monica Bianchini,
Bruno Lepri, Pietro Lio, Franco Scarselli and Andrea Passerini
- Abstract summary: Graph Neural Networks (GNNs) have become the leading paradigm for learning on (static) graph-structured data.
In recent years, GNN-based models for temporal graphs have emerged as a promising area of research that extends the capabilities of GNNs.
We provide the first comprehensive overview of the current state of the art of temporal GNNs, introducing a rigorous formalization of learning settings and tasks.
We conclude the survey with a discussion of the most relevant open challenges for the field, from both research and application perspectives.
- Score: 15.51428011794213
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph Neural Networks (GNNs) have become the leading paradigm for learning on
(static) graph-structured data. However, many real-world systems are dynamic in
nature, since the graph and node/edge attributes change over time. In recent
years, GNN-based models for temporal graphs have emerged as a promising area of
research to extend the capabilities of GNNs. In this work, we provide the first
comprehensive overview of the current state of the art of temporal GNNs,
introducing a rigorous formalization of learning settings and tasks and a novel
taxonomy categorizing existing approaches in terms of how the temporal aspect
is represented and processed. We conclude the survey with a discussion of the
most relevant open challenges for the field, from both research and application
perspectives.
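The survey's taxonomy distinguishes approaches by how the temporal aspect is represented: roughly, discrete-time models that operate on a sequence of static snapshots, and continuous-time models that consume a stream of timestamped edge events. A minimal sketch of the two representations (the class and function names are illustrative, not taken from the survey):

```python
from dataclasses import dataclass

# Discrete-time view: the temporal graph is a list of static snapshots,
# one per time step, each holding the edge set observed at that step.
snapshots = [
    {"edges": {(0, 1), (1, 2)}},          # graph at t = 0
    {"edges": {(0, 1), (1, 2), (2, 3)}},  # graph at t = 1: edge (2, 3) appears
]

@dataclass
class EdgeEvent:
    """Continuous-time view: one timestamped interaction between two nodes."""
    src: int
    dst: int
    t: float

events = [EdgeEvent(0, 1, 0.5), EdgeEvent(1, 2, 0.9), EdgeEvent(2, 3, 1.7)]

def edges_up_to(events, t):
    """Recover the edge set observed up to time t from the event stream."""
    return {(e.src, e.dst) for e in events if e.t <= t}

print(edges_up_to(events, 1.0))  # edges seen by t = 1.0
```

The event-based view is strictly more fine-grained: any snapshot sequence can be recovered from the event stream by thresholding on time, as `edges_up_to` shows, while the reverse loses intra-step ordering.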
Related papers
- A survey of dynamic graph neural networks [26.162035361191805]
Graph neural networks (GNNs) have emerged as a powerful tool for effectively mining and learning from graph-structured data.
This paper provides a comprehensive review of the fundamental concepts, key techniques, and state-of-the-art dynamic GNN models.
arXiv Detail & Related papers (2024-04-28T15:07:48Z)
- On The Temporal Domain of Differential Equation Inspired Graph Neural Networks [14.779420473274737]
We show that our model, called TDE-GNN, can capture a wide range of temporal dynamics that go beyond typical first or second-order methods.
We demonstrate the benefit of learning the temporal dependencies using our method rather than using pre-defined temporal dynamics on several graph benchmarks.
arXiv Detail & Related papers (2024-01-20T01:12:57Z)
- LasTGL: An Industrial Framework for Large-Scale Temporal Graph Learning [61.4707298969173]
We introduce LasTGL, an industrial framework that integrates unified implementations of common temporal graph learning algorithms.
LasTGL provides comprehensive temporal graph datasets, TGNN models and utilities along with well-documented tutorials.
arXiv Detail & Related papers (2023-11-28T08:45:37Z)
- DyExplainer: Explainable Dynamic Graph Neural Networks [37.16783248212211]
We present DyExplainer, a novel approach to explaining dynamic Graph Neural Networks (GNNs) on the fly.
DyExplainer trains a dynamic GNN backbone to extract representations of the graph at each snapshot.
We also augment our approach with contrastive learning techniques to provide prior-guided regularization.
arXiv Detail & Related papers (2023-10-25T05:26:33Z)
- Towards Graph Foundation Models: A Survey and Beyond [66.37994863159861]
Foundation models have emerged as critical components in a variety of artificial intelligence applications.
The capabilities of foundation models to generalize and adapt motivate graph machine learning researchers to discuss the potential of developing a new graph learning paradigm.
This article introduces the concept of Graph Foundation Models (GFMs), and offers an exhaustive explanation of their key characteristics and underlying technologies.
arXiv Detail & Related papers (2023-10-18T09:31:21Z)
- Deep learning for dynamic graphs: models and benchmarks [16.851689741256912]
Recent progress in research on Deep Graph Networks (DGNs) has led to a maturation of the domain of learning on graphs.
Despite the growth of this research field, there are still important challenges that are yet unsolved.
arXiv Detail & Related papers (2023-07-12T12:02:36Z)
- A Survey on Graph Neural Networks for Time Series: Forecasting, Classification, Imputation, and Anomaly Detection [98.41798478488101]
Time series analytics is crucial to unlocking the wealth of information implicit in available data.
Recent advancements in graph neural networks (GNNs) have led to a surge in GNN-based approaches for time series analysis.
This survey brings together a vast array of knowledge on GNN-based time series research, highlighting foundations, practical applications, and opportunities of graph neural networks for time series analysis.
arXiv Detail & Related papers (2023-07-07T08:05:03Z)
- DEGREE: Decomposition Based Explanation For Graph Neural Networks [55.38873296761104]
We propose DEGREE to provide a faithful explanation for GNN predictions.
By decomposing the information generation and aggregation mechanism of GNNs, DEGREE allows tracking the contributions of specific components of the input graph to the final prediction.
We also design a subgraph level interpretation algorithm to reveal complex interactions between graph nodes that are overlooked by previous methods.
arXiv Detail & Related papers (2023-05-22T10:29:52Z)
- An Explainer for Temporal Graph Neural Networks [27.393641343203363]
Temporal graph neural networks (TGNNs) have been widely used for modeling time-evolving graph-related tasks.
We propose a novel explainer framework for TGNN models.
arXiv Detail & Related papers (2022-09-02T04:12:40Z)
- Discovering the Representation Bottleneck of Graph Neural Networks from Multi-order Interactions [51.597480162777074]
Graph neural networks (GNNs) rely on the message passing paradigm to propagate node features and build interactions.
Recent works point out that different graph learning tasks require different ranges of interactions between nodes.
We study two common graph construction methods in scientific domains, i.e., K-nearest neighbor (KNN) graphs and fully-connected (FC) graphs.
arXiv Detail & Related papers (2022-05-15T11:38:14Z)
- Towards Deeper Graph Neural Networks [63.46470695525957]
Graph convolutions perform neighborhood aggregation and represent one of the most important graph operations.
Several recent studies attribute the performance deterioration observed in deeper models to the over-smoothing issue.
We propose Deep Adaptive Graph Neural Network (DAGNN) to adaptively incorporate information from large receptive fields.
arXiv Detail & Related papers (2020-07-18T01:11:14Z)