Representation Integrity in Temporal Graph Learning Methods
- URL: http://arxiv.org/abs/2511.20873v1
- Date: Tue, 25 Nov 2025 21:37:00 GMT
- Title: Representation Integrity in Temporal Graph Learning Methods
- Authors: Elahe Kooshafar
- Abstract summary: Real-world systems are naturally modelled as dynamic graphs whose topology changes over time. We formalize this requirement as representation integrity and derive a family of indexes that measure how closely embedding changes follow graph changes. We then use this index to conduct a comparative study of the representation integrity of common dynamic graph learning models.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Real-world systems ranging from airline routes to cryptocurrency transfers are naturally modelled as dynamic graphs whose topology changes over time. Conventional benchmarks judge dynamic-graph learners by a handful of task-specific scores, yet seldom ask whether the embeddings themselves remain a truthful, interpretable reflection of the evolving network. We formalize this requirement as representation integrity and derive a family of indexes that measure how closely embedding changes follow graph changes. Three synthetic scenarios, Gradual Merge, Abrupt Move, and Periodic Re-wiring, are used to screen forty-two candidate indexes, from which we recommend the one index that passes all of our theoretical and empirical tests. In particular, this validated metric consistently ranks the provably stable UASE and IPP models highest. We then use this index to conduct a comparative study of the representation integrity of common dynamic-graph learning models. This study exposes the scenario-specific strengths of neural methods, and shows a strong positive rank correlation with one-step link-prediction AUC. The proposed integrity framework, therefore, offers a task-agnostic and interpretable evaluation tool for dynamic-graph representation quality, providing more explicit guidance for model selection and future architecture design.
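The abstract does not spell out how an integrity index is constructed, but its core idea, that embedding changes should track graph changes across snapshots, can be sketched as a rank correlation between the two change sequences. The function names and the Spearman-style construction below are illustrative assumptions, not the paper's actual metric:

```python
import numpy as np

def graph_change(A_prev, A_next):
    # Number of undirected edges added or removed between snapshots
    # (assumes symmetric 0/1 adjacency matrices)
    return np.abs(A_next - A_prev).sum() / 2

def embedding_change(X_prev, X_next):
    # Mean per-node displacement in embedding space
    return np.linalg.norm(X_next - X_prev, axis=1).mean()

def integrity_index(adjs, embs):
    """Illustrative integrity score: Spearman-style rank correlation
    between graph-change and embedding-change magnitudes over
    consecutive snapshot pairs. Close to 1 means embedding drift
    mirrors topological drift."""
    g = np.array([graph_change(a, b) for a, b in zip(adjs, adjs[1:])])
    e = np.array([embedding_change(x, y) for x, y in zip(embs, embs[1:])])
    # Rank correlation without SciPy: Pearson correlation of the ranks
    rg = g.argsort().argsort()
    re = e.argsort().argsort()
    return np.corrcoef(rg, re)[0, 1]
```

On a toy sequence where larger topology edits coincide with larger embedding shifts, the index is 1.0; if the embeddings drifted erratically while the graph stayed still, it would drop toward 0 or below.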
Related papers
- DyGnROLE: Modeling Asymmetry in Dynamic Graphs with Node-Role-Oriented Latent Encoding [9.701178273182691]
Real-world dynamic graphs are often directed, with source and destination nodes exhibiting asymmetrical behavioral patterns and temporal dynamics. We propose DyGnROLE, a transformer-based architecture that explicitly disentangles source and destination representations. By using separate embedding vocabularies and role-semantic positional encodings, the model captures the distinct structural and temporal contexts unique to each role.
arXiv Detail & Related papers (2026-02-26T15:51:51Z) - Combating Spurious Correlations in Graph Interpretability via Self-Reflection [4.81017678027464]
Interpretable graph learning is a popular research topic in machine learning. One of the most challenging benchmarks is Spurious-Motif, introduced at ICLR 2022. We propose a self-reflection framework that can be integrated with existing interpretable graph learning methods.
arXiv Detail & Related papers (2026-01-16T06:31:16Z) - Retrieval Augmented Generation for Dynamic Graph Modeling [15.09162213134372]
We propose a novel framework, Retrieval-Augmented Generation for Dynamic Graph modeling (RAG4DyG). RAG4DyG enhances dynamic graph predictions by incorporating contextually and temporally relevant examples from broader graph structures. The proposed framework is designed to be effective in both transductive and inductive scenarios.
arXiv Detail & Related papers (2024-08-26T09:23:35Z) - DyTed: Disentangled Representation Learning for Discrete-time Dynamic Graph [59.583555454424]
We propose a novel disenTangled representation learning framework for discrete-time Dynamic graphs, namely DyTed.
We specially design a temporal-clips contrastive learning task together with a structure contrastive learning to effectively identify the time-invariant and time-varying representations respectively.
arXiv Detail & Related papers (2022-10-19T14:34:12Z) - Similarity-aware Positive Instance Sampling for Graph Contrastive Pre-training [82.68805025636165]
We propose to select positive graph instances directly from existing graphs in the training set.
Our selection is based on certain domain-specific pair-wise similarity measurements.
Besides, we develop an adaptive node-level pre-training method to dynamically mask nodes to distribute them evenly in the graph.
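The selection step described above, choosing positive instances from existing training graphs by pairwise similarity rather than by augmentation, can be sketched as a top-k lookup over graph-level feature vectors. The cosine measure, function names, and feature representation here are assumptions for illustration; the paper uses its own domain-specific similarity measurements:

```python
import numpy as np

def cosine_sim(a, b):
    # Cosine similarity between two graph-level feature vectors
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def select_positives(features, anchor_idx, k=2):
    """Return indices of the k training graphs most similar to the
    anchor graph, excluding the anchor itself. These serve as
    positive instances for contrastive pre-training."""
    sims = [(j, cosine_sim(features[anchor_idx], f))
            for j, f in enumerate(features) if j != anchor_idx]
    sims.sort(key=lambda t: t[1], reverse=True)
    return [j for j, _ in sims[:k]]
```

The design point is that positives come from real graphs in the training set, so they stay on the data manifold, unlike perturbation-based augmentations that may distort graph semantics.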
arXiv Detail & Related papers (2022-06-23T20:12:51Z) - Joint Graph Learning and Matching for Semantic Feature Correspondence [69.71998282148762]
We propose a joint graph learning and matching network, named GLAM, to explore reliable graph structures for boosting graph matching.
The proposed method is evaluated on three popular visual matching benchmarks (Pascal VOC, Willow Object and SPair-71k).
It outperforms previous state-of-the-art graph matching methods by significant margins on all benchmarks.
arXiv Detail & Related papers (2021-09-01T08:24:02Z) - Temporal Graph Network Embedding with Causal Anonymous Walks Representations [54.05212871508062]
We propose a novel approach for dynamic network representation learning based on Temporal Graph Network.
For evaluation, we provide a benchmark pipeline for the evaluation of temporal network embeddings.
We show the applicability and superior performance of our model in the real-world downstream graph machine learning task provided by one of the top European banks.
arXiv Detail & Related papers (2021-08-19T15:39:52Z) - TCL: Transformer-based Dynamic Graph Modelling via Contrastive Learning [87.38675639186405]
We propose a novel graph neural network approach, called TCL, which deals with the dynamically-evolving graph in a continuous-time fashion.
To the best of our knowledge, this is the first attempt to apply contrastive learning to representation learning on dynamic graphs.
arXiv Detail & Related papers (2021-05-17T15:33:25Z) - Learning the Implicit Semantic Representation on Graph-Structured Data [57.670106959061634]
Existing representation learning methods in graph convolutional networks are mainly designed by describing the neighborhood of each node as a perceptual whole.
We propose a Semantic Graph Convolutional Networks (SGCN) that explores the implicit semantics by learning latent semantic-paths in graphs.
arXiv Detail & Related papers (2021-01-16T16:18:43Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.