Tensor Graph Convolutional Network for Dynamic Graph Representation
Learning
- URL: http://arxiv.org/abs/2401.07065v1
- Date: Sat, 13 Jan 2024 12:49:56 GMT
- Title: Tensor Graph Convolutional Network for Dynamic Graph Representation
Learning
- Authors: Ling Wang, Ye Yuan
- Abstract summary: Dynamic graphs (DG) describe dynamic interactions between entities in many practical scenarios.
Most existing DG representation learning models combine a graph convolutional network with a sequence neural network.
We propose a tensor graph convolutional network to learn DG representations in one convolution framework.
- Score: 12.884025972321316
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Dynamic graphs (DG) describe dynamic interactions between entities in many
practical scenarios. Most existing DG representation learning models combine a
graph convolutional network with a sequence neural network, modeling
spatial-temporal dependencies through two different types of neural networks.
However, this hybrid design cannot fully capture the spatial-temporal continuity
of a DG. In this paper, we propose a tensor graph convolutional network that
learns DG representations in a single convolution framework based on the tensor
product, built on the following two-fold ideas: a) representing the information
of a DG in tensor form; b) adopting the tensor product to design a tensor graph
convolutional network that models spatial-temporal features simultaneously.
Experiments on real-world DG datasets demonstrate that our model obtains
state-of-the-art performance.
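The two ideas above can be made concrete with a short sketch. Below is a hedged NumPy illustration that assumes the tensor product in question is an M-product along the temporal mode (a common choice in tensor graph convolution work); the function names `mode3_product`, `m_product`, and `tensor_gcn_layer`, and the exact tensor shapes, are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

def mode3_product(T, M):
    # Multiply tensor T (a x b x t) along its third (temporal) mode by matrix M (t x t).
    return np.einsum('ijk,lk->ijl', T, M)

def m_product(A, X, M):
    # Tensor M-product: transform along the temporal mode, multiply the frontal
    # slices facewise, then transform back with the inverse of M.
    A_hat = mode3_product(A, M)
    X_hat = mode3_product(X, M)
    C_hat = np.einsum('ijk,jlk->ilk', A_hat, X_hat)  # slice-wise matrix products
    return mode3_product(C_hat, np.linalg.inv(M))

def tensor_gcn_layer(A, X, W, M):
    # One tensor graph convolution layer, roughly ReLU(A *_M X *_M W):
    # A is an (n x n x t) adjacency tensor, X an (n x f x t) feature tensor,
    # W an (f x f' x t) weight tensor, so spatial and temporal mixing happen
    # inside a single convolution operation.
    AX = m_product(A, X, M)
    Z = m_product(AX, W, M)
    return np.maximum(Z, 0.0)
```

With `M` set to the identity, the M-product reduces to independent per-timestep graph convolutions; a non-trivial `M` is what couples the time steps inside the same operation.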
Related papers
- Spatial-temporal Graph Convolutional Networks with Diversified Transformation for Dynamic Graph Representation Learning [6.9243139068960895]
This study proposes a spatial-temporal graph convolutional networks with diversified transformation (STGCNDT)
It includes three aspects: a) constructing a unified graph tensor convolutional network (GTCN) using tensor M-products, without the need to represent temporal information separately; b) introducing three transformation schemes in the GTCN to model complex temporal patterns and aggregate temporal information; and c) constructing an ensemble of diversified transformations to obtain higher representation capabilities.
arXiv Detail & Related papers (2024-08-05T09:40:47Z)
- DTFormer: A Transformer-Based Method for Discrete-Time Dynamic Graph Representation Learning [38.53424185696828]
The representation learning of Discrete-Time Dynamic Graphs (DTDGs) has been extensively applied to model the dynamics of temporally changing entities and their evolving connections.
This paper introduces a novel representation learning method DTFormer for DTDGs, pivoting from the traditional GNN+RNN framework to a Transformer-based architecture.
arXiv Detail & Related papers (2024-07-26T05:46:23Z)
- A survey of dynamic graph neural networks [26.162035361191805]
Graph neural networks (GNNs) have emerged as a powerful tool for effectively mining and learning from graph-structured data.
This paper provides a comprehensive review of the fundamental concepts, key techniques, and state-of-the-art dynamic GNN models.
arXiv Detail & Related papers (2024-04-28T15:07:48Z)
- Learning Dynamic Graph Embeddings with Neural Controlled Differential Equations [21.936437653875245]
This paper focuses on representation learning for dynamic graphs with temporal interactions.
We propose a generic differential model for dynamic graphs that characterises the continuously dynamic evolution of node embedding trajectories.
Our framework exhibits several desirable characteristics, including the ability to express dynamics on evolving graphs without integration by segments.
arXiv Detail & Related papers (2023-02-22T12:59:38Z)
- TCL: Transformer-based Dynamic Graph Modelling via Contrastive Learning [87.38675639186405]
We propose a novel graph neural network approach, called TCL, which deals with the dynamically-evolving graph in a continuous-time fashion.
To the best of our knowledge, this is the first attempt to apply contrastive learning to representation learning on dynamic graphs.
arXiv Detail & Related papers (2021-05-17T15:33:25Z)
- Hyperbolic Variational Graph Neural Network for Modeling Dynamic Graphs [77.33781731432163]
For the first time, we learn dynamic graph representations in hyperbolic space, aiming to infer node representations.
We present a novel Hyperbolic Variational Graph Network, referred to as HVGNN.
In particular, to model the dynamics, we introduce a Temporal GNN (TGNN) based on a theoretically grounded time encoding approach.
arXiv Detail & Related papers (2021-04-06T01:44:15Z)
- Pre-Training on Dynamic Graph Neural Networks [26.139844652756334]
This paper proposes a pre-training method on dynamic graph neural networks (PT-DGNN).
It uses dynamic attributed graph generation tasks to simultaneously learn the structure, semantics, and evolution features of the graph.
arXiv Detail & Related papers (2021-02-24T16:06:32Z)
- E(n) Equivariant Graph Neural Networks [86.75170631724548]
This paper introduces E(n)-Equivariant Graph Neural Networks (EGNNs), a new model for learning graph neural networks equivariant to rotations, translations, reflections, and permutations.
In contrast with existing methods, our work does not require computationally expensive higher-order representations in intermediate layers while it still achieves competitive or better performance.
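The equivariance claim is easy to check numerically. Below is a minimal NumPy sketch of an E(n)-style layer: messages depend only on rotation- and translation-invariant squared distances, and coordinates update along relative difference vectors. Linear maps (`We`, `Wx`, `Wh`) stand in for the paper's learned MLPs, so the names and shapes here are illustrative assumptions:

```python
import numpy as np

def egnn_layer(h, x, edges, We, Wx, Wh):
    # Simplified E(n)-equivariant update, with linear maps in place of MLPs:
    #   m_ij = We @ [h_i, h_j, ||x_i - x_j||^2]        (invariant message)
    #   x_i' = x_i + mean_j (x_i - x_j) * (Wx @ m_ij)  (equivariant coordinate update)
    #   h_i' = Wh @ [h_i, sum_j m_ij]                  (invariant feature update)
    n = h.shape[0]
    msg = np.zeros((n, We.shape[0]))
    coord = np.zeros_like(x)
    deg = np.full(n, 1e-9)  # avoids division by zero for isolated nodes
    for i, j in edges:
        d2 = np.sum((x[i] - x[j]) ** 2)
        m = We @ np.concatenate([h[i], h[j], [d2]])
        msg[i] += m
        coord[i] += (x[i] - x[j]) * (Wx @ m)
        deg[i] += 1.0
    x_new = x + coord / deg[:, None]
    h_new = np.concatenate([h, msg], axis=1) @ Wh.T
    return h_new, x_new
```

Because only invariant quantities enter the messages and coordinates move along difference vectors, rotating and translating the input coordinates rotates and translates the output coordinates identically while leaving the features unchanged.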
arXiv Detail & Related papers (2021-02-19T10:25:33Z)
- GCC: Graph Contrastive Coding for Graph Neural Network Pre-Training [62.73470368851127]
Graph representation learning has emerged as a powerful technique for addressing real-world problems.
We design Graph Contrastive Coding -- a self-supervised graph neural network pre-training framework.
We conduct experiments on three graph learning tasks and ten graph datasets.
arXiv Detail & Related papers (2020-06-17T16:18:35Z)
- Binarized Graph Neural Network [65.20589262811677]
We develop a binarized graph neural network to learn the binary representations of the nodes with binary network parameters.
Our proposed method can be seamlessly integrated into the existing GNN-based embedding approaches.
Experiments indicate that the proposed binarized graph neural network, namely BGN, is orders of magnitude more efficient in terms of both time and space.
arXiv Detail & Related papers (2020-04-19T09:43:14Z)
- Tensor Graph Convolutional Networks for Multi-relational and Robust Learning [74.05478502080658]
This paper introduces a tensor-graph convolutional network (TGCN) for scalable semi-supervised learning (SSL) from data associated with a collection of graphs that are represented by a tensor.
The proposed architecture achieves markedly improved performance relative to standard GCNs, copes with state-of-the-art adversarial attacks, and leads to remarkable SSL performance over protein-to-protein interaction networks.
arXiv Detail & Related papers (2020-03-15T02:33:21Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.