Condensation-Concatenation Framework for Dynamic Graph Continual Learning
- URL: http://arxiv.org/abs/2512.11317v1
- Date: Fri, 12 Dec 2025 06:32:16 GMT
- Title: Condensation-Concatenation Framework for Dynamic Graph Continual Learning
- Authors: Tingxu Yan, Ye Yuan
- Abstract summary: We propose a novel framework for continual learning on dynamic graphs, named Condensation-Concatenation-based Continual Learning (CCC). CCC first condenses historical graph snapshots into compact representations while aiming to preserve the original label distribution and topological properties. CCC demonstrates superior performance over state-of-the-art baselines across four real-world datasets.
- Score: 5.183200614613901
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Dynamic graphs are prevalent in real-world scenarios, where continuous structural changes induce catastrophic forgetting in graph neural networks (GNNs). While continual learning has been extended to dynamic graphs, existing methods overlook the effects of topological changes on existing nodes. To address this, we propose a novel framework for continual learning on dynamic graphs, named Condensation-Concatenation-based Continual Learning (CCC). Specifically, CCC first condenses historical graph snapshots into compact semantic representations while aiming to preserve the original label distribution and topological properties. It then selectively concatenates these historical embeddings with current graph representations. Moreover, we refine the forgetting measure (FM) to better suit dynamic graph scenarios by quantifying the predictive performance degradation that structural updates cause for existing nodes. In extensive experiments, CCC demonstrates superior performance over state-of-the-art baselines across four real-world datasets.
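A minimal sketch of the condense-then-concatenate idea described above, with an assumed form of the refined forgetting measure. All names (`condense_snapshot`, `concat_selectively`, `forgetting_measure`), the class-prototype condensation, and the similarity-based selection rule are illustrative assumptions, not the authors' implementation:

```python
import torch
import torch.nn.functional as F

def condense_snapshot(emb: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
    """Condense a historical snapshot into per-class prototypes.

    Hypothetical stand-in for CCC's condensation step: class-wise mean
    pooling crudely preserves the label distribution; preserving
    topological properties, as the paper intends, is omitted here.
    """
    return torch.stack([emb[labels == c].mean(dim=0) for c in labels.unique()])

def concat_selectively(current: torch.Tensor, prototypes: torch.Tensor,
                       k: int = 1) -> torch.Tensor:
    """Concatenate each current node embedding with its k most similar
    historical prototypes (an assumed selection criterion)."""
    sim = F.normalize(current, dim=-1) @ F.normalize(prototypes, dim=-1).T
    idx = sim.topk(k, dim=-1).indices                 # (N, k)
    picked = prototypes[idx].flatten(start_dim=1)     # (N, k * d)
    return torch.cat([current, picked], dim=-1)       # (N, (1 + k) * d)

def forgetting_measure(acc_before: torch.Tensor, acc_after: torch.Tensor) -> float:
    """Assumed form of the refined FM: mean accuracy drop, over nodes that
    existed before a structural update, caused by that update."""
    return (acc_before - acc_after).clamp(min=0).mean().item()

# Toy usage: 8 historical nodes in up to 3 classes, 5 current nodes, dim 16.
hist_emb, hist_y = torch.randn(8, 16), torch.randint(0, 3, (8,))
protos = condense_snapshot(hist_emb, hist_y)
fused = concat_selectively(torch.randn(5, 16), protos, k=1)  # (5, 32)
```

Here condensation is approximated by class-wise mean pooling, which at best preserves the label distribution; the paper's condensation additionally aims to preserve topological properties of the historical snapshots.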
Related papers
- Dynamic Graph Condensation [40.099854631984556]
The temporal extension in dynamic graphs poses significant data efficiency challenges. We propose DyGC, a framework that condenses the real dynamic graph into a compact version. Our method retains up to 96.2% of DGNN performance with only 0.5% of the original graph size, and achieves up to 1846 times training speedup.
arXiv Detail & Related papers (2025-06-16T05:11:29Z) - Retrieval Augmented Generation for Dynamic Graph Modeling [15.09162213134372]
We propose a novel framework, Retrieval-Augmented Generation for Dynamic Graph modeling (RAG4DyG). RAG4DyG enhances dynamic graph predictions by incorporating contextually and temporally relevant examples from broader graph structures. The proposed framework is designed to be effective in both transductive and inductive scenarios.
arXiv Detail & Related papers (2024-08-26T09:23:35Z) - Graph Condensation for Open-World Graph Learning [48.38802327346445]
Graph condensation (GC) has emerged as a promising acceleration solution for efficiently training graph neural networks (GNNs).
Existing GC methods are limited to aligning the condensed graph with merely the observed static graph distribution.
In real-world scenarios, however, graphs are dynamic and constantly evolving, with new nodes and edges being continually integrated.
We propose OpenGC, a robust GC framework that integrates structure-aware distribution shift to simulate evolving graph patterns.
arXiv Detail & Related papers (2024-05-27T09:47:09Z) - Deep Temporal Graph Clustering [77.02070768950145]
We propose a general framework for deep temporal graph clustering (TGC).
TGC introduces deep clustering techniques to suit the interaction sequence-based batch-processing pattern of temporal graphs.
Our framework can effectively improve the performance of existing temporal graph learning methods.
arXiv Detail & Related papers (2023-05-18T06:17:50Z) - Dynamic Graph Representation Learning via Edge Temporal States Modeling and Structure-reinforced Transformer [5.093187534912688]
We introduce the Recurrent Structure-reinforced Graph Transformer (RSGT), a novel framework for dynamic graph representation learning.
RSGT captures temporal node representations encoding both graph topology and evolving dynamics through a recurrent learning paradigm.
We show RSGT's superior performance in discrete dynamic graph representation learning, consistently outperforming existing methods in dynamic link prediction tasks.
arXiv Detail & Related papers (2023-04-20T04:12:50Z) - Self-Supervised Temporal Graph learning with Temporal and Structural Intensity Alignment [53.72873672076391]
Temporal graph learning aims to generate high-quality representations for graph-based tasks with dynamic information.
We propose a self-supervised method called S2T for temporal graph learning, which extracts both temporal and structural information.
S2T achieves at most 10.13% performance improvement compared with the state-of-the-art competitors on several datasets.
arXiv Detail & Related papers (2023-02-15T06:36:04Z) - Scene Graph Modification as Incremental Structure Expanding [61.84291817776118]
We focus on scene graph modification (SGM), where the system is required to learn how to update an existing scene graph based on a natural language query.
We frame SGM as a graph expansion task by introducing incremental structure expanding (ISE).
We construct a challenging dataset that contains more complicated queries and larger scene graphs than existing datasets.
arXiv Detail & Related papers (2022-09-15T16:26:14Z) - Time-aware Dynamic Graph Embedding for Asynchronous Structural Evolution [60.695162101159134]
Existing works merely view a dynamic graph as a sequence of changes.
We formulate dynamic graphs as temporal edge sequences associated with the joining time of vertices and the timespan of edges (ToEs).
A time-aware Transformer is proposed to embed vertices' dynamic connections and ToEs into the learned vertex representations.
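As a concrete illustration of this formulation, below is a minimal record type for one event in a temporal edge sequence; the class name and fields are assumptions for illustration, not the paper's actual data structures:

```python
from typing import NamedTuple

class TemporalEdge(NamedTuple):
    """One event in a temporal edge sequence (illustrative schema)."""
    u: int        # source vertex id
    v: int        # target vertex id
    tov_u: float  # joining time of vertex u (ToV)
    tov_v: float  # joining time of vertex v (ToV)
    toe: float    # timespan of the edge (ToE)

# A dynamic graph as a time-ordered sequence of such events.
edges = [TemporalEdge(0, 1, tov_u=0.0, tov_v=1.0, toe=2.5),
         TemporalEdge(1, 2, tov_u=1.0, tov_v=3.0, toe=0.5)]
```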
arXiv Detail & Related papers (2022-07-01T15:32:56Z) - Towards Unsupervised Deep Graph Structure Learning [67.58720734177325]
We propose an unsupervised graph structure learning paradigm, where the learned graph topology is optimized by data itself without any external guidance.
Specifically, we generate a learning target from the original data as an "anchor graph", and use a contrastive loss to maximize the agreement between the anchor graph and the learned graph.
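A minimal sketch of such an agreement objective, written here as a generic node-level InfoNCE loss between embeddings computed on the anchor graph and on the learned graph; this is an illustrative assumption, not necessarily the paper's exact contrastive loss:

```python
import torch
import torch.nn.functional as F

def agreement_loss(z_anchor: torch.Tensor, z_learned: torch.Tensor,
                   tau: float = 0.5) -> torch.Tensor:
    """Each node's embedding under the learned graph should agree with the
    same node's embedding under the anchor graph (diagonal positives) and
    disagree with every other node's (off-diagonal negatives)."""
    za = F.normalize(z_anchor, dim=-1)
    zl = F.normalize(z_learned, dim=-1)
    logits = za @ zl.T / tau                    # (N, N) cosine similarities
    targets = torch.arange(za.size(0))          # positives on the diagonal
    return F.cross_entropy(logits, targets)
```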
arXiv Detail & Related papers (2022-01-17T11:57:29Z) - TCL: Transformer-based Dynamic Graph Modelling via Contrastive Learning [87.38675639186405]
We propose a novel graph neural network approach, called TCL, which deals with the dynamically evolving graph in a continuous-time fashion.
To the best of our knowledge, this is the first attempt to apply contrastive learning to representation learning on dynamic graphs.
arXiv Detail & Related papers (2021-05-17T15:33:25Z) - DyGCN: Dynamic Graph Embedding with Graph Convolutional Network [25.02329024926518]
We propose an efficient dynamic graph embedding approach, Dynamic Graph Convolutional Network (DyGCN).
Our model can update the node embeddings in a time-saving and performance-preserving way.
arXiv Detail & Related papers (2021-04-07T07:28:44Z) - K-Core based Temporal Graph Convolutional Network for Dynamic Graphs [19.237377882738063]
We propose a novel k-core based temporal graph convolutional network, the CTGCN, to learn node representations for dynamic graphs.
In contrast to previous dynamic graph embedding methods, CTGCN can preserve both local connective proximity and global structural similarity.
Experimental results on 7 real-world graphs demonstrate that the CTGCN outperforms existing state-of-the-art graph embedding methods in several tasks.
arXiv Detail & Related papers (2020-03-22T14:15:27Z)