Efficient-Dyn: Dynamic Graph Representation Learning via Event-based Temporal Sparse Attention Network
- URL: http://arxiv.org/abs/2201.01384v1
- Date: Tue, 4 Jan 2022 23:52:24 GMT
- Title: Efficient-Dyn: Dynamic Graph Representation Learning via Event-based Temporal Sparse Attention Network
- Authors: Yan Pang, Chao Liu
- Abstract summary: Dynamic graph neural networks have received increasing attention from researchers.
We propose a novel dynamic graph neural network, Efficient-Dyn.
It adaptively encodes temporal information into a sequence of patches with an equal amount of temporal-topological structure.
- Score: 2.0047096160313456
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Static graph neural networks have been widely used for modeling and representation learning of graph-structured data. However, many real-world problems, such as social networks, financial transactions, and recommendation systems, are dynamic: nodes and edges are added or deleted over time. Dynamic graph neural networks have therefore received increasing attention in recent years. In this work, we propose a novel dynamic graph neural network, Efficient-Dyn. It adaptively encodes temporal information into a sequence of patches, each carrying an equal amount of temporal-topological structure. It thereby avoids the information loss caused by snapshots while achieving a finer time granularity, close to what continuous-time methods provide. In addition, we design a lightweight module, the Sparse Temporal Transformer, which computes node representations from both structural neighborhoods and temporal dynamics. Because the fully-connected attention is replaced with a sparse pattern, the computation cost is far lower than that of current state-of-the-art methods. Link prediction experiments are conducted on both continuous and discrete graph datasets. Comparisons with several state-of-the-art graph embedding baselines show that Efficient-Dyn achieves faster inference while maintaining competitive performance.
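To make the two ideas in the abstract concrete, here is a minimal sketch in plain NumPy of (a) splitting an event stream into patches that each contain the same number of interaction events, and (b) attention restricted by a sparsity mask. Every name in it (make_event_patches, sparse_temporal_attention, the toy shapes) is our own illustrative assumption, not the paper's actual implementation.

```python
# Illustrative sketch only -- names and shapes are our assumptions, not the
# authors' code. Events are (src, dst, timestamp) triples.
import numpy as np

def make_event_patches(events, events_per_patch):
    """Split a time-ordered event stream into patches that each hold the
    same number of events, so every patch carries a comparable amount of
    temporal-topological structure (unlike fixed-width snapshots, whose
    event counts can vary wildly)."""
    events = sorted(events, key=lambda e: e[2])          # order by timestamp
    return [events[i:i + events_per_patch]
            for i in range(0, len(events), events_per_patch)]

def sparse_temporal_attention(q, k, v, neighbor_mask):
    """Masked attention: each query node attends only to its structural and
    temporal neighbors, so cost scales with the number of allowed pairs
    instead of the full N x N pattern."""
    scores = q @ k.T / np.sqrt(q.shape[-1])              # raw (N, N) scores
    scores = np.where(neighbor_mask, scores, -np.inf)    # drop non-neighbors
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v                                   # (N, d) node outputs

# Toy usage: six events split into patches of three; four 8-dim nodes.
events = [(0, 1, 0.1), (0, 2, 0.2), (1, 2, 0.4),
          (1, 3, 0.5), (0, 3, 0.7), (2, 3, 0.9)]
patches = make_event_patches(events, events_per_patch=3)
rng = np.random.default_rng(0)
q = k = v = rng.standard_normal((4, 8))
mask = np.eye(4, dtype=bool) | (rng.random((4, 4)) < 0.5)  # self + neighbors
out = sparse_temporal_attention(q, k, v, mask)
```

The equal-event patching is what lets time granularity adapt to activity: bursty periods produce many short patches, while quiet periods produce a few long ones.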
Related papers
- TimeGraphs: Graph-based Temporal Reasoning [64.18083371645956]
TimeGraphs is a novel approach that characterizes dynamic interactions as a hierarchical temporal graph.
Our approach models the interactions using a compact graph-based representation, enabling adaptive reasoning across diverse time scales.
We evaluate TimeGraphs on multiple datasets with complex, dynamic agent interactions, including a football simulator, the Resistance game, and the MOMA human activity dataset.
arXiv Detail & Related papers (2024-01-06T06:26:49Z)
- Dynamic Causal Explanation Based Diffusion-Variational Graph Neural Network for Spatio-temporal Forecasting [60.03169701753824]
We propose a novel Dynamic Diffusion-Variational Graph Neural Network (DVGNN) for spatio-temporal forecasting.
The proposed DVGNN model outperforms state-of-the-art approaches and achieves an outstanding Root Mean Squared Error (RMSE) result.
arXiv Detail & Related papers (2023-05-16T11:38:19Z)
- Temporal Aggregation and Propagation Graph Neural Networks for Dynamic Representation [67.26422477327179]
Temporal graphs exhibit dynamic interactions between nodes over continuous time.
We propose a novel method of temporal graph convolution over the whole neighborhood.
Our proposed TAP-GNN outperforms existing temporal graph methods by a large margin in terms of both predictive performance and online inference latency.
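As a loose illustration of what convolving over the whole temporal neighborhood (rather than a sampled subset) can look like, here is a sketch under our own assumptions; the function name and the mean-pooling choice are hypothetical, not TAP-GNN's design.

```python
# Hypothetical sketch, not TAP-GNN's code: aggregate over *all* neighbors
# that interacted with a node up to time t, instead of a sampled subset.
import numpy as np

def whole_neighborhood_conv(node, t, events, feats):
    """Mean-pool the features of every historical neighbor of `node` with
    interaction timestamp <= t, then mix with the node's own features
    (one simplistic temporal graph convolution step)."""
    nbrs = [dst if src == node else src
            for (src, dst, ts) in events
            if ts <= t and node in (src, dst)]
    if not nbrs:
        return feats[node]
    pooled = feats[nbrs].mean(axis=0)
    return 0.5 * (feats[node] + pooled)      # naive self/neighbor mixing

events = [(0, 1, 0.2), (0, 2, 0.5), (1, 2, 0.9)]   # (src, dst, time)
feats = np.random.default_rng(0).standard_normal((3, 4))
h0 = whole_neighborhood_conv(0, t=0.6, events=events, feats=feats)
```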
arXiv Detail & Related papers (2023-04-15T08:17:18Z)
- Dynamic Graph Message Passing Networks for Visual Recognition [112.49513303433606]
Modelling long-range dependencies is critical for scene understanding tasks in computer vision.
A fully-connected graph is beneficial for such modelling, but its computational overhead is prohibitive.
We propose a dynamic graph message passing network, that significantly reduces the computational complexity.
arXiv Detail & Related papers (2022-09-20T14:41:37Z)
- Scalable Spatiotemporal Graph Neural Networks [14.415967477487692]
Graph neural networks (GNNs) are often the core component of the forecasting architecture.
In most spatiotemporal GNNs, the computational complexity scales up to a quadratic factor with the length of the sequence times the number of links in the graph.
We propose a scalable architecture that exploits an efficient encoding of both temporal and spatial dynamics.
arXiv Detail & Related papers (2022-09-14T09:47:38Z)
- Scaling Up Dynamic Graph Representation Learning via Spiking Neural Networks [23.01100055999135]
We present a scalable framework, namely SpikeNet, to efficiently capture the temporal and structural patterns of temporal graphs.
As a low-power alternative to RNNs, SNNs explicitly model graph dynamics as spike trains of neuron populations.
SpikeNet generalizes to a large temporal graph with significantly fewer parameters and computation overheads.
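The "spike train" framing can be pictured with a textbook leaky integrate-and-fire (LIF) update; the sketch below is generic SNN machinery under our own parameter choices, not SpikeNet's implementation.

```python
# Generic leaky integrate-and-fire (LIF) sketch; the decay and threshold
# values are illustrative, not SpikeNet's.
import numpy as np

def lif_step(v, input_current, decay=0.9, threshold=1.0):
    """One discrete LIF update: leak the membrane potential, inject input,
    emit binary spikes where the threshold is crossed, then reset."""
    v = decay * v + input_current
    spikes = (v >= threshold).astype(np.float32)
    v = v * (1.0 - spikes)                   # reset neurons that fired
    return v, spikes

rng = np.random.default_rng(0)
v = np.zeros(8, dtype=np.float32)            # membrane potentials, 8 neurons
train = []
for _ in range(5):                           # a few timesteps of drive
    v, s = lif_step(v, rng.random(8).astype(np.float32))
    train.append(s)                          # the emitted spike train
```

Because activations are binary spikes rather than dense floats, downstream updates reduce to masked additions, which is where the low-power appeal of SNNs generally comes from.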
arXiv Detail & Related papers (2022-08-15T09:22:15Z)
- Continuous Temporal Graph Networks for Event-Based Graph Data [41.786721257905555]
We propose Continuous Temporal Graph Networks (CTGNs) to capture the continuous dynamics of temporal graph data.
The key idea is to use neural ordinary differential equations (ODEs) to characterize the continuous dynamics of node representations over dynamic graphs.
Experimental results on both transductive and inductive tasks demonstrate the effectiveness of the proposed approach.
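For context, the neural-ODE idea can be caricatured as integrating a learned derivative to evolve node states between events. The sketch below uses a fixed-step Euler solver in PyTorch; every name in it is a hypothetical illustration, not the CTGN implementation.

```python
# Hypothetical sketch of the neural-ODE idea, not CTGN's actual code: node
# states evolve continuously between events via dh/dt = f(h), integrated
# here with a fixed-step Euler solver.
import torch

class ODEFunc(torch.nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.net = torch.nn.Sequential(
            torch.nn.Linear(dim, dim), torch.nn.Tanh(),
            torch.nn.Linear(dim, dim))

    def forward(self, h):
        return self.net(h)                   # learned time derivative of h

def evolve(h, func, t0, t1, steps=10):
    """Euler-integrate node states h from t0 to t1."""
    dt = (t1 - t0) / steps
    for _ in range(steps):
        h = h + dt * func(h)
    return h

func = ODEFunc(16)
h = torch.zeros(4, 16)                       # 4 nodes, 16-dim states
h_next = evolve(h, func, t0=0.0, t1=1.0)     # state just before the next event
```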
arXiv Detail & Related papers (2022-05-31T16:17:02Z)
- Deep Dynamic Effective Connectivity Estimation from Multivariate Time Series [0.0]
We develop dynamic effective connectivity estimation via neural network training (DECENNT).
DECENNT outperforms state-of-the-art (SOTA) methods on five different tasks and infers interpretable task-specific dynamic graphs.
arXiv Detail & Related papers (2022-02-04T21:14:21Z)
- Temporal Graph Network Embedding with Causal Anonymous Walks Representations [54.05212871508062]
We propose a novel approach for dynamic network representation learning based on Temporal Graph Network.
We provide a benchmark pipeline for the evaluation of temporal network embeddings.
We show the applicability and superior performance of our model in the real-world downstream graph machine learning task provided by one of the top European banks.
arXiv Detail & Related papers (2021-08-19T15:39:52Z)
- Graph Structure of Neural Networks [104.33754950606298]
We show how the graph structure of neural networks affects their predictive performance.
A "sweet spot" of relational graphs leads to neural networks with significantly improved predictive performance.
Top-performing neural networks have graph structure surprisingly similar to those of real biological neural networks.
arXiv Detail & Related papers (2020-07-13T17:59:31Z)
This list is automatically generated from the titles and abstracts of the papers on this site.