TPGNN: Learning High-order Information in Dynamic Graphs via Temporal
Propagation
- URL: http://arxiv.org/abs/2210.01171v2
- Date: Thu, 13 Apr 2023 23:41:39 GMT
- Title: TPGNN: Learning High-order Information in Dynamic Graphs via Temporal
Propagation
- Authors: Zehong Wang, Qi Li, Donghua Yu
- Abstract summary: We propose a temporal propagation-based graph neural network, namely TPGNN.
The propagator propagates messages from the anchor node to its temporal neighbors within $k$ hops and simultaneously updates the states of the neighborhoods.
To prevent over-smoothing, the model compels the messages from $n$-hop neighbors to update the $n$-hop memory vector preserved on the anchor.
- Score: 7.616789069832552
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A temporal graph is an abstraction for modeling dynamic systems that
consist of evolving interaction elements. In this paper, we aim to solve an
important yet neglected problem -- how to learn information from high-order
neighbors in temporal graphs? -- to enhance the informativeness and
discriminativeness of the learned node representations. We argue that learning
high-order information from temporal graphs raises two challenges, i.e.,
computational inefficiency and over-smoothing, that cannot be solved by the
conventional techniques applied to static graphs. To remedy these deficiencies,
we propose a temporal propagation-based graph neural network, namely TPGNN.
Specifically, the model consists of two distinct components, i.e., a propagator
and a node-wise encoder. The propagator propagates messages from the anchor
node to its temporal neighbors within $k$ hops and simultaneously updates the
states of these neighborhoods, which enables efficient computation, especially
for a deep model. In addition, to prevent over-smoothing, the model compels
messages from $n$-hop neighbors to update only the $n$-hop memory vector
preserved on the anchor. The node-wise encoder adopts a transformer
architecture to learn node representations by explicitly weighting the memory
vectors preserved on the node itself, thereby implicitly modeling the
importance of messages from neighbors at different layers and thus mitigating
over-smoothing. Since the encoding process never queries temporal neighbors,
inference time is reduced dramatically. Extensive experiments on temporal link
prediction and node classification demonstrate the superiority of TPGNN over
state-of-the-art baselines in both efficiency and robustness.
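To make the propagator/encoder split concrete, below is a minimal PyTorch sketch of hop-indexed memory updates and node-wise self-attention over those memories. The GRU-style update and all names and shapes are illustrative assumptions, not the authors' reference implementation.

```python
# Minimal sketch of TPGNN's two components; the GRU-style memory update and
# all shapes/names are assumptions, not the authors' implementation.
import torch
import torch.nn as nn

class TPGNNSketch(nn.Module):
    def __init__(self, dim: int, k: int, n_heads: int = 2):
        super().__init__()
        self.k = k                        # hops = number of memory slots per node
        self.update = nn.GRUCell(dim, dim)
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=1)

    def propagate(self, memory, message, hop):
        """Messages from n-hop neighbors may update only the n-hop memory slot,
        which is how the paper confines information to avoid over-smoothing."""
        # memory: (num_nodes, k, dim); message: (num_nodes, dim); 0 <= hop < k
        memory = memory.clone()
        memory[:, hop] = self.update(message, memory[:, hop])
        return memory

    def encode(self, memory):
        """Node-wise encoding: self-attention over the node's own k memory
        vectors; no temporal neighbors are queried at inference time."""
        return self.encoder(memory).mean(dim=1)   # (num_nodes, dim)

# Usage with toy shapes: 5 nodes, k=3 hop-wise memories of dimension 8.
model = TPGNNSketch(dim=8, k=3)
mem = torch.zeros(5, 3, 8)
mem = model.propagate(mem, torch.randn(5, 8), hop=1)
z = model.encode(mem)                             # (5, 8) node representations
```

Because `encode` touches only the $k$ memory vectors already stored on each node, inference requires no neighborhood queries, matching the efficiency claim above.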
Related papers
- Higher-order Spatio-temporal Physics-incorporated Graph Neural Network for Multivariate Time Series Imputation [9.450743095412896]
Imputing missing values is an essential but challenging problem due to the complex latent spatio-temporal correlations and the dynamic nature of time series.
We propose a Higher-order Spatio-temporal Physics-incorporated Graph Neural Network (HSPGNN) to address this problem.
HSPGNN provides better dynamic analysis and explanation than traditional data-driven models.
arXiv Detail & Related papers (2024-05-16T16:35:43Z)
- CTRL: Continuous-Time Representation Learning on Temporal Heterogeneous Information Network [32.42051167404171]
We propose a Continuous-Time Representation Learning model on temporal HINs.
We train the model with a future event (a subgraph) prediction task to capture the evolution of the high-order network structure; a hedged sketch of such an objective follows this entry.
The results demonstrate that our model significantly boosts performance and outperforms various state-of-the-art approaches.
arXiv Detail & Related papers (2024-05-11T03:39:22Z)
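A hedged sketch of a future-event prediction objective of this kind, with an illustrative dot-product scorer and negative sampling (the names and scoring function are assumptions, not CTRL's exact loss):

```python
import torch
import torch.nn.functional as F

def future_event_loss(z_src, z_dst, z_neg):
    # z_src, z_dst: (B, d) embeddings of nodes in observed future events;
    # z_neg: (B, d) embeddings of negatively sampled destinations.
    pos = (z_src * z_dst).sum(dim=-1)      # scores of true future interactions
    neg = (z_src * z_neg).sum(dim=-1)      # scores of sampled non-events
    logits = torch.cat([pos, neg])
    labels = torch.cat([torch.ones_like(pos), torch.zeros_like(neg)])
    return F.binary_cross_entropy_with_logits(logits, labels)
```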
- NodeFormer: A Scalable Graph Structure Learning Transformer for Node Classification [70.51126383984555]
We introduce a novel all-pair message passing scheme for efficiently propagating node signals between arbitrary nodes.
The efficient computation is enabled by a kernelized Gumbel-Softmax operator (sketched after this entry).
Experiments demonstrate the promising efficacy of the method in various tasks including node classification on graphs.
arXiv Detail & Related papers (2023-06-14T09:21:15Z)
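A rough sketch of the kernelized Gumbel-Softmax idea: positive feature maps let all-pair propagation be computed in O(N·d²) rather than O(N²). The elementwise-exponential feature map and temperature below are stand-ins for NodeFormer's random-feature operator.

```python
import torch

def kernelized_all_pair_mp(q, k, v, tau=0.25, eps=1e-6):
    # q, k, v: (N, d). Gumbel noise turns the kernelized softmax into a
    # differentiable sample over latent edges (Gumbel-Softmax style).
    g = -torch.log(-torch.log(torch.rand_like(k) + eps) + eps)  # Gumbel(0, 1)
    phi_q = torch.exp(q / tau)          # positive feature maps: the similarity
    phi_k = torch.exp((k + g) / tau)    # <phi_q, phi_k> replaces exp(q.k / tau)
    kv = phi_k.t() @ v                  # (d, d), computed once for all nodes
    normalizer = phi_q @ phi_k.sum(dim=0)               # (N,)
    return (phi_q @ kv) / (normalizer + eps)[:, None]   # (N, d)
```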
- Temporal Aggregation and Propagation Graph Neural Networks for Dynamic Representation [67.26422477327179]
Temporal graphs exhibit dynamic interactions between nodes over continuous time.
We propose a novel method of temporal graph convolution over the whole neighborhood.
Our proposed TAP-GNN outperforms existing temporal graph methods by a large margin in terms of both predictive performance and online inference latency.
arXiv Detail & Related papers (2023-04-15T08:17:18Z)
- Self-Supervised Temporal Graph Learning with Temporal and Structural Intensity Alignment [53.72873672076391]
Temporal graph learning aims to generate high-quality representations for graph-based tasks with dynamic information.
We propose a self-supervised method called S2T for temporal graph learning, which extracts both temporal and structural information.
S2T achieves up to a 10.13% performance improvement over state-of-the-art competitors on several datasets.
arXiv Detail & Related papers (2023-02-15T06:36:04Z)
- Dynamic Graph Message Passing Networks for Visual Recognition [112.49513303433606]
Modelling long-range dependencies is critical for scene understanding tasks in computer vision.
A fully-connected graph is beneficial for such modelling, but its computational overhead is prohibitive.
We propose a dynamic graph message passing network that significantly reduces the computational complexity (a sampling-based sketch follows this entry).
arXiv Detail & Related papers (2022-09-20T14:41:37Z)
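A minimal sketch of the underlying idea of passing messages over a sampled node subset rather than a fully-connected graph; the uniform sampling and dot-product weighting are illustrative, not the paper's learned dynamic scheme.

```python
import torch

def sampled_message_passing(x: torch.Tensor, k: int) -> torch.Tensor:
    # x: (N, d) node/pixel features; k: number of sampled message sources.
    n, d = x.shape
    idx = torch.randperm(n)[:k]        # sample k source nodes (uniform here)
    keys = x[idx]                      # (k, d)
    attn = torch.softmax(x @ keys.t() / d ** 0.5, dim=-1)   # (N, k) affinities
    return x + attn @ keys             # residual update: O(N*k), not O(N^2)
```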
- Efficient-Dyn: Dynamic Graph Representation Learning via Event-based Temporal Sparse Attention Network [2.0047096160313456]
Dynamic graph neural networks have received increasing attention from researchers.
We propose a novel dynamic graph neural network, Efficient-Dyn.
It adaptively encodes temporal information into a sequence of patches with an equal amount of temporal-topological structure.
arXiv Detail & Related papers (2022-01-04T23:52:24Z)
- Uniting Heterogeneity, Inductiveness, and Efficiency for Graph Representation Learning [68.97378785686723]
Graph neural networks (GNNs) have greatly advanced the performance of node representation learning on graphs.
The majority of GNNs are designed only for homogeneous graphs, which limits their adaptivity to the more informative heterogeneous graphs.
We propose a novel inductive, meta-path-free message passing scheme that packs heterogeneous node features together with their associated edges from both low- and high-order neighbor nodes (a sketch follows this entry).
arXiv Detail & Related papers (2021-04-04T23:31:39Z)
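A hedged sketch of meta-path-free message passing that packs neighbor node features with their connecting edge features before a type-agnostic aggregation; the shapes and the mean aggregator are assumptions.

```python
import torch
import torch.nn as nn

class PackedMessagePassing(nn.Module):
    """Packs neighbor node features with their connecting edge features,
    then aggregates without meta paths or per-type parameters."""
    def __init__(self, node_dim: int, edge_dim: int, out_dim: int):
        super().__init__()
        self.proj = nn.Linear(node_dim + edge_dim, out_dim)

    def forward(self, x_nbr: torch.Tensor, e_nbr: torch.Tensor) -> torch.Tensor:
        # x_nbr: (N, M, node_dim) features of M neighbors (any node type);
        # e_nbr: (N, M, edge_dim) features of the edges connecting them.
        msg = self.proj(torch.cat([x_nbr, e_nbr], dim=-1))   # packed messages
        return msg.mean(dim=1)         # (N, out_dim), type-agnostic aggregation
```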
- Spatio-Temporal Inception Graph Convolutional Networks for Skeleton-Based Action Recognition [126.51241919472356]
We design a simple and highly modularized graph convolutional network architecture for skeleton-based action recognition.
Our network is constructed by repeating a building block that aggregates multi-granularity information from both the spatial and temporal paths (a block sketch follows this entry).
arXiv Detail & Related papers (2020-11-26T14:43:04Z)
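A hedged sketch of an inception-style block with a spatial (graph) path and temporal convolutions at two granularities; channel counts, kernel sizes, and the concatenation layout are illustrative.

```python
import torch
import torch.nn as nn

class STInceptionBlock(nn.Module):
    def __init__(self, c: int, adj: torch.Tensor):
        super().__init__()
        self.register_buffer("adj", adj)   # (V, V) normalized skeleton graph
        self.spatial = nn.Conv2d(c, c, 1)                      # spatial path
        self.temp3 = nn.Conv2d(c, c, (3, 1), padding=(1, 0))   # short-range time
        self.temp7 = nn.Conv2d(c, c, (7, 1), padding=(3, 0))   # long-range time

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (B, C, T, V) = batch, channels, frames, joints
        s = torch.einsum("bctv,vw->bctw", self.spatial(x), self.adj)
        return torch.cat([s, self.temp3(x), self.temp7(x)], dim=1)  # (B, 3C, T, V)
```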
- Inductive Representation Learning on Temporal Graphs [33.44276155380476]
Temporal dynamic graphs require handling new nodes as well as capturing temporal patterns.
We propose the temporal graph attention (TGAT) layer to efficiently aggregate temporal-topological neighborhood features (sketched after this entry).
By stacking TGAT layers, the network treats node embeddings as functions of time and can inductively infer embeddings for both new and observed nodes.
arXiv Detail & Related papers (2020-02-19T02:05:37Z)
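A minimal sketch of the TGAT idea: a Bochner-style functional time encoding concatenated to node features, followed by attention over temporal neighbors. Shapes and module choices are illustrative, not the paper's exact layer.

```python
import torch
import torch.nn as nn

class TimeEncoding(nn.Module):
    """Bochner-style functional time encoding: dt -> cos(dt * w + b)."""
    def __init__(self, dim: int):
        super().__init__()
        self.w = nn.Parameter(torch.randn(dim))
        self.b = nn.Parameter(torch.zeros(dim))

    def forward(self, dt):
        # dt: (..., 1) time deltas; output: (..., dim)
        return torch.cos(dt * self.w + self.b)

class TGATLayerSketch(nn.Module):
    # feat_dim + time_dim must be divisible by n_heads
    def __init__(self, feat_dim: int, time_dim: int, n_heads: int = 2):
        super().__init__()
        self.time_enc = TimeEncoding(time_dim)
        self.attn = nn.MultiheadAttention(feat_dim + time_dim, n_heads, batch_first=True)

    def forward(self, x_tgt, t_tgt, x_nbr, t_nbr):
        # x_tgt: (B, feat_dim), t_tgt: (B,); x_nbr: (B, M, feat_dim), t_nbr: (B, M)
        q = torch.cat([x_tgt, self.time_enc(torch.zeros_like(t_tgt)[:, None])], dim=-1)
        dt = (t_tgt[:, None] - t_nbr)[..., None]      # (B, M, 1) elapsed times
        kv = torch.cat([x_nbr, self.time_enc(dt)], dim=-1)
        out, _ = self.attn(q[:, None], kv, kv)        # attention over M neighbors
        return out.squeeze(1)                         # (B, feat_dim + time_dim)
```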