Learning Dynamic Graphs via Tensorized and Lightweight Graph Convolutional Networks
- URL: http://arxiv.org/abs/2504.15613v1
- Date: Tue, 22 Apr 2025 06:13:32 GMT
- Title: Learning Dynamic Graphs via Tensorized and Lightweight Graph Convolutional Networks
- Authors: Minglian Han
- Abstract summary: A dynamic graph convolutional network (DGCN) has been successfully applied to perform precise representation learning on a dynamic graph. This study proposes a novel Tensorized Lightweight Graph Convolutional Network (TLGCN) for accurate dynamic graph learning.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A dynamic graph (DG) is frequently encountered in numerous real-world scenarios. Consequently, a dynamic graph convolutional network (DGCN) has been successfully applied to perform precise representation learning on a DG. However, conventional DGCNs typically consist of a static GCN coupled with a sequence neural network (SNN) to model spatial and temporal patterns separately. This decoupled modeling mechanism inherently disrupts the intricate spatio-temporal dependencies. To address this issue, this study proposes a novel Tensorized Lightweight Graph Convolutional Network (TLGCN) for accurate dynamic graph learning. It mainly contains two key concepts: a) designing a novel method for joint propagation of spatio-temporal information based on the tensor M-product framework; b) proposing a tensorized lightweight graph convolutional network based on the above method, which significantly reduces the memory occupation of the model by omitting complex feature transformation and nonlinear activation. Numerical experiments on four real-world datasets demonstrate that the proposed TLGCN outperforms state-of-the-art models in the weight estimation task on DGs.
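The joint propagation idea in point a) and the lightweight design in point b) can be illustrated with a small NumPy sketch. This is a reconstruction from the abstract only, not the authors' code: the helper names (`mode3_transform`, `m_product`, `lightweight_propagate`), the tensor shapes, and the causal-averaging transform matrix `M` are assumptions. The sketch shows spatial and temporal mixing happening in a single M-product operation, with no per-layer weight matrices or nonlinear activations.

```python
# Minimal NumPy sketch of joint spatio-temporal propagation with a tensor
# M-product, in the spirit of TLGCN. Names and shapes are assumptions.
import numpy as np

def mode3_transform(tensor, M):
    """Apply a transform matrix M along the third (temporal) mode."""
    # tensor: (n, m, T), M: (T, T); result[..., k] = sum_t M[k, t] * tensor[..., t]
    return np.einsum('kt,nmt->nmk', M, tensor)

def m_product(A, X, M):
    """Tensor M-product: transform, multiply frontal slices, transform back."""
    A_hat = mode3_transform(A, M)                     # (n, n, T) in transformed domain
    X_hat = mode3_transform(X, M)                     # (n, f, T)
    Z_hat = np.einsum('ijt,jft->ift', A_hat, X_hat)   # slice-wise matrix product
    return mode3_transform(Z_hat, np.linalg.inv(M))   # back to the original domain

def lightweight_propagate(A, X, M, num_layers=2):
    """Stack propagation steps with no feature transformation and no
    nonlinear activation, as in the lightweight design described above."""
    Z = X
    for _ in range(num_layers):
        Z = m_product(A, Z, M)
    return Z

# Toy dynamic graph: n nodes, f features, T snapshots.
n, f, T = 5, 3, 4
rng = np.random.default_rng(0)
A = rng.random((n, n, T))    # (possibly normalized) adjacency per snapshot
X = rng.random((n, f, T))    # node features per snapshot
M = np.tril(np.ones((T, T))) / np.arange(1, T + 1)[:, None]  # causal averaging transform
Z = lightweight_propagate(A, X, M)
print(Z.shape)               # (5, 3, 4): joint spatio-temporal embeddings
```

Because the propagation only reuses the adjacency and feature tensors plus one transform matrix, the sketch is consistent with the memory savings the abstract attributes to omitting feature transformation and nonlinear activation.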
Related papers
- Spatial-temporal Graph Convolutional Networks with Diversified Transformation for Dynamic Graph Representation Learning [6.9243139068960895]
This study proposes spatial-temporal graph convolutional networks with diversified transformation (STGCNDT).
It includes three aspects: a) constructing a unified graph tensor convolutional network (GTCN) using tensor M-products without the need to represent temporal information separately; b) introducing three transformation schemes in GTCN to model complex temporal patterns and aggregate temporal information; and c) constructing an ensemble of diversified transformations to obtain higher representation capabilities.
arXiv Detail & Related papers (2024-08-05T09:40:47Z) - DTFormer: A Transformer-Based Method for Discrete-Time Dynamic Graph Representation Learning [38.53424185696828]
The representation learning of Discrete-Time Dynamic Graphs (DTDGs) has been extensively applied to model the dynamics of temporally changing entities and their evolving connections.
This paper introduces a novel representation learning method DTFormer for DTDGs, pivoting from the traditional GNN+RNN framework to a Transformer-based architecture.
arXiv Detail & Related papers (2024-07-26T05:46:23Z) - Dynamic Graph Unlearning: A General and Efficient Post-Processing Method via Gradient Transformation [24.20087360102464]
We study dynamic graph unlearning for the first time and propose an effective, efficient, and general post-processing method to implement DGNN unlearning. Our method has the potential to handle future unlearning requests with significant performance gains.
arXiv Detail & Related papers (2024-05-23T10:26:18Z) - A survey of dynamic graph neural networks [26.162035361191805]
Graph neural networks (GNNs) have emerged as a powerful tool for effectively mining and learning from graph-structured data.
This paper provides a comprehensive review of the fundamental concepts, key techniques, and state-of-the-art dynamic GNN models.
arXiv Detail & Related papers (2024-04-28T15:07:48Z) - Tensor Graph Convolutional Network for Dynamic Graph Representation Learning [12.884025972321316]
Dynamic graphs (DG) describe dynamic interactions between entities in many practical scenarios.
Most existing DG representation learning models combine graph convolutional network and sequence neural network.
We propose a tensor graph convolutional network to learn DG representations in one convolution framework.
arXiv Detail & Related papers (2024-01-13T12:49:56Z) - Dynamic Causal Explanation Based Diffusion-Variational Graph Neural Network for Spatio-temporal Forecasting [60.03169701753824]
We propose a novel Dynamic Diffusion-Variational Graph Neural Network (DVGNN) for spatio-temporal forecasting.
The proposed DVGNN model outperforms state-of-the-art approaches and achieves outstanding Root Mean Squared Error results.
arXiv Detail & Related papers (2023-05-16T11:38:19Z) - Dynamic Graph Representation Learning via Edge Temporal States Modeling and Structure-reinforced Transformer [5.093187534912688]
We introduce the Recurrent Structure-reinforced Graph Transformer (RSGT), a novel framework for dynamic graph representation learning.
RSGT captures temporal node representations encoding both graph topology and evolving dynamics through a recurrent learning paradigm.
We show RSGT's superior performance in discrete dynamic graph representation learning, consistently outperforming existing methods in dynamic link prediction tasks.
arXiv Detail & Related papers (2023-04-20T04:12:50Z) - DG-STGCN: Dynamic Spatial-Temporal Modeling for Skeleton-based Action Recognition [77.87404524458809]
We propose a new framework for skeleton-based action recognition, namely Dynamic Group Spatio-Temporal GCN (DG-STGCN).
It consists of two modules, DG-GCN and DG-TCN, respectively, for spatial and temporal modeling.
DG-STGCN consistently outperforms state-of-the-art methods, often by a notable margin.
arXiv Detail & Related papers (2022-10-12T03:17:37Z) - Hyperbolic Variational Graph Neural Network for Modeling Dynamic Graphs [77.33781731432163]
We learn dynamic graph representations in hyperbolic space, for the first time, aiming to infer node representations.
We present a novel Hyperbolic Variational Graph Neural Network, referred to as HVGNN.
In particular, to model the dynamics, we introduce a Temporal GNN (TGNN) based on a theoretically grounded time encoding approach.
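The summary does not spell out which time encoding the TGNN uses; below is a minimal sketch of one common theoretically grounded choice, Bochner-style random cosine features over timestamps, given purely as an illustration. The function name, dimension, and frequency initialization are assumptions and may differ from HVGNN's actual encoding.

```python
# Hedged sketch of a Bochner-style functional time encoding, a common
# "theoretically grounded" choice in temporal GNNs (not necessarily HVGNN's).
import numpy as np

def time_encode(timestamps, dim, rng=None):
    """Map scalar timestamps to dim-dimensional cosine features.
    Random cosine features of this form approximate a translation-invariant
    kernel over time differences (Bochner's theorem)."""
    rng = rng or np.random.default_rng(0)
    omega = rng.standard_normal(dim)          # sampled frequencies
    phi = rng.uniform(0, 2 * np.pi, dim)      # random phases
    t = np.asarray(timestamps, dtype=float)[:, None]
    return np.sqrt(2.0 / dim) * np.cos(t * omega[None, :] + phi[None, :])

# Events close in time receive similar codes, which the GNN can attend over.
codes = time_encode([0.0, 0.5, 10.0], dim=8)
print(codes.shape)   # (3, 8)
```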
arXiv Detail & Related papers (2021-04-06T01:44:15Z) - On the spatial attention in Spatio-Temporal Graph Convolutional Networks for skeleton-based human action recognition [97.14064057840089]
Graph convolutional networks (GCNs) achieve promising performance in skeleton-based human action recognition by modeling a sequence of skeletons as a graph.
Most of the recently proposed spatio-temporal GCN-based methods improve the performance by learning the graph structure at each layer of the network.
arXiv Detail & Related papers (2020-11-07T19:03:04Z) - Lightweight, Dynamic Graph Convolutional Networks for AMR-to-Text Generation [56.73834525802723]
Lightweight Dynamic Graph Convolutional Networks (LDGCNs) are proposed.
LDGCNs capture richer non-local interactions by synthesizing higher order information from the input graphs.
We develop two novel parameter saving strategies based on the group graph convolutions and weight tied convolutions to reduce memory usage and model complexity.
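A minimal NumPy sketch of the two parameter-saving ideas named here, group graph convolutions and weight-tied convolutions, follows. It is illustrative rather than the LDGCN implementation: the group split, row normalization, ReLU, and layer count are assumptions.

```python
# Hedged sketch: weight-tied convolutions (one set of weights reused across
# layers) plus group graph convolutions (channels split into groups, each
# with a smaller weight matrix). Details are assumptions, not LDGCN's code.
import numpy as np

def group_graph_conv(A_norm, X, W_groups):
    """Split features into groups; convolve each group with its own small weight."""
    groups = np.split(X, len(W_groups), axis=1)
    return np.concatenate(
        [np.maximum(A_norm @ g @ W, 0.0) for g, W in zip(groups, W_groups)], axis=1)

def weight_tied_stack(A_norm, X, W_groups, num_layers=3):
    """Reuse the SAME group weights at every layer instead of new ones per layer."""
    H = X
    for _ in range(num_layers):
        H = group_graph_conv(A_norm, H, W_groups)
    return H

n, f, num_groups = 6, 8, 2
rng = np.random.default_rng(2)
A = (rng.random((n, n)) > 0.6).astype(float)
A_norm = A / np.maximum(A.sum(axis=1, keepdims=True), 1.0)   # simple row normalization
X = rng.random((n, f))
gf = f // num_groups
W_groups = [rng.standard_normal((gf, gf)) for _ in range(num_groups)]  # two 4x4 blocks instead of one 8x8
H = weight_tied_stack(A_norm, X, W_groups)
print(H.shape)   # (6, 8): extra depth without extra parameters
```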
arXiv Detail & Related papers (2020-10-09T06:03:46Z) - Tensor Graph Convolutional Networks for Multi-relational and Robust Learning [74.05478502080658]
This paper introduces a tensor-graph convolutional network (TGCN) for scalable semi-supervised learning (SSL) from data associated with a collection of graphs that are represented by a tensor.
The proposed architecture achieves markedly improved performance relative to standard GCNs, copes with state-of-the-art adversarial attacks, and leads to remarkable SSL performance over protein-to-protein interaction networks.
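A small NumPy sketch of the core pattern, propagating node features over a tensor whose frontal slices are relation-specific adjacency matrices, is given below. The uniform relation-mixing weights, normalization, and two-layer classifier are illustrative assumptions, not the TGCN architecture from the paper.

```python
# Hedged sketch: semi-supervised propagation over a tensor of relation-specific
# adjacency slices. Mixing weights and layer structure are assumptions.
import numpy as np

def normalize(A):
    """Symmetric degree normalization of one adjacency slice."""
    d = A.sum(axis=1)
    d_inv_sqrt = np.zeros_like(d)
    nz = d > 0
    d_inv_sqrt[nz] = d[nz] ** -0.5
    return d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]

def tensor_gcn_layer(A_tensor, X, W, alpha):
    """Propagate over each relation slice, then mix slices with weights alpha."""
    R = A_tensor.shape[2]
    mixed = sum(alpha[r] * normalize(A_tensor[:, :, r]) @ X for r in range(R))
    return np.maximum(mixed @ W, 0.0)   # ReLU

n, f, h, R = 6, 4, 8, 3
rng = np.random.default_rng(1)
A_tensor = (rng.random((n, n, R)) > 0.5).astype(float)   # one slice per relation
X = rng.random((n, f))
W1, W2 = rng.standard_normal((f, h)), rng.standard_normal((h, 2))
alpha = np.ones(R) / R                                    # uniform relation mixing
H = tensor_gcn_layer(A_tensor, X, W1, alpha)
logits = tensor_gcn_layer(A_tensor, H, W2, alpha)
print(logits.shape)   # (6, 2): per-node class scores for SSL
```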
arXiv Detail & Related papers (2020-03-15T02:33:21Z)