Event-Aware Prompt Learning for Dynamic Graphs
- URL: http://arxiv.org/abs/2510.11339v1
- Date: Mon, 13 Oct 2025 12:37:53 GMT
- Title: Event-Aware Prompt Learning for Dynamic Graphs
- Authors: Xingtong Yu, Ruijuan Liang, Xinming Zhang, Yuan Fang
- Abstract summary: We propose EVP, an event-aware dynamic graph prompt learning framework. We introduce an event adaptation mechanism to align the fine-grained characteristics of historical events with downstream tasks, and an event aggregation mechanism to effectively integrate historical knowledge into node representations.
- Score: 14.17492028221679
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Real-world graphs typically evolve via a series of events, modeling dynamic interactions between objects across various domains. For dynamic graph learning, dynamic graph neural networks (DGNNs) have emerged as popular solutions. Recently, prompt learning methods have been explored on dynamic graphs. However, existing methods generally focus on capturing the relationship between nodes and time, while overlooking the impact of historical events. In this paper, we propose EVP, an event-aware dynamic graph prompt learning framework that can serve as a plug-in to existing methods, enhancing their ability to leverage historical event knowledge. First, we extract a series of historical events for each node and introduce an event adaptation mechanism to align the fine-grained characteristics of these events with downstream tasks. Second, we propose an event aggregation mechanism to effectively integrate historical knowledge into node representations. Finally, we conduct extensive experiments on four public datasets to evaluate and analyze EVP.
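The two mechanisms described in the abstract can be illustrated with a minimal sketch. This is a hypothetical interpretation, not the paper's actual code: all module names, shapes, and the choice of element-wise prompt alignment plus attention pooling are assumptions made for illustration.

```python
import torch
import torch.nn as nn

class EventPrompt(nn.Module):
    """Hypothetical sketch in the spirit of EVP: for each node, take its K most
    recent event embeddings, align them with a learnable task prompt (the
    'event adaptation' step), and pool them into the node representation
    (the 'event aggregation' step)."""

    def __init__(self, dim: int):
        super().__init__()
        self.prompt = nn.Parameter(torch.randn(dim))  # task-specific prompt vector
        self.score = nn.Linear(dim, 1)                # attention scorer for aggregation

    def forward(self, node_repr: torch.Tensor, event_embs: torch.Tensor) -> torch.Tensor:
        # node_repr: (N, dim) backbone output; event_embs: (N, K, dim) historical events
        adapted = event_embs * self.prompt                  # adaptation: prompt alignment
        weights = torch.softmax(self.score(adapted), dim=1) # (N, K, 1) attention over events
        history = (weights * adapted).sum(dim=1)            # aggregation over K events
        return node_repr + history                          # plug-in: augment node representation

node_repr = torch.randn(8, 16)
event_embs = torch.randn(8, 5, 16)
out = EventPrompt(16)(node_repr, event_embs)
```

Because the module only adds a residual correction on top of the backbone's node representations, it matches the paper's claim of being a plug-in for existing DGNNs.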
Related papers
- Node-Time Conditional Prompt Learning In Dynamic Graphs [14.62182210205324]
We propose DYGPROMPT, a novel pre-training and prompt learning framework for dynamic graph modeling. We recognize that node and time features mutually characterize each other, and propose dual condition-nets to model the evolving node-time patterns in downstream tasks.
arXiv Detail & Related papers (2024-05-22T19:10:24Z) - TimeGraphs: Graph-based Temporal Reasoning [64.18083371645956]
TimeGraphs is a novel approach that characterizes dynamic interactions as a hierarchical temporal graph.
Our approach models the interactions using a compact graph-based representation, enabling adaptive reasoning across diverse time scales.
We evaluate TimeGraphs on multiple datasets with complex, dynamic agent interactions, including a football simulator, the Resistance game, and the MOMA human activity dataset.
arXiv Detail & Related papers (2024-01-06T06:26:49Z) - Exploring the Limits of Historical Information for Temporal Knowledge Graph Extrapolation [59.417443739208146]
We propose a new event forecasting model based on a novel training framework of historical contrastive learning.
CENET learns both the historical and non-historical dependency to distinguish the most potential entities.
We evaluate our proposed model on five benchmark graphs.
arXiv Detail & Related papers (2023-08-29T03:26:38Z) - Local-Global Information Interaction Debiasing for Dynamic Scene Graph Generation [51.92419880088668]
We propose a novel DynSGG model based on multi-task learning, DynSGG-MTL, which introduces the local interaction information and global human-action interaction information.
Long-temporal human actions supervise the model to generate multiple scene graphs that conform to the global constraints and avoid the model being unable to learn the tail predicates.
arXiv Detail & Related papers (2023-08-10T01:24:25Z) - Deep graph kernel point processes [17.74234892097879]
This paper presents a novel point process model for discrete event data over graphs, where the event interaction occurs within a latent graph structure.
The key idea is to represent the influence kernel by Graph Neural Networks (GNN) to capture the underlying graph structure.
Compared with prior works that directly model the conditional intensity function using neural networks, our kernel representation captures repeated event-influence patterns more effectively.
arXiv Detail & Related papers (2023-06-20T06:15:19Z) - Learning Action-Effect Dynamics from Pairs of Scene-graphs [50.72283841720014]
We propose a novel method that leverages scene-graph representation of images to reason about the effects of actions described in natural language.
Our proposed approach is effective in terms of performance, data efficiency, and generalization capability compared to existing models.
arXiv Detail & Related papers (2022-12-07T03:36:37Z) - Temporal Knowledge Graph Reasoning with Historical Contrastive Learning [24.492458924487863]
We propose a new event forecasting model called Contrastive Event Network (CENET).
CENET learns both the historical and non-historical dependency to distinguish the most potential entities that can best match the given query.
During the inference process, CENET employs a mask-based strategy to generate the final results.
arXiv Detail & Related papers (2022-11-20T08:32:59Z) - Semi-Supervised Graph Attention Networks for Event Representation Learning [0.0]
This paper presents GNEE (GAT Neural Event Embeddings), a method that combines Graph Attention Networks and Graph Regularization.
A statistical analysis of experimental results with five real-world event graphs and six graph embedding methods shows that our GNEE outperforms state-of-the-art semi-supervised graph embedding methods.
arXiv Detail & Related papers (2022-01-02T14:38:28Z) - Efficient Dynamic Graph Representation Learning at Scale [66.62859857734104]
We propose Efficient Dynamic Graph lEarning (EDGE), which selectively expresses certain temporal dependency via training loss to improve the parallelism in computations.
We show that EDGE can scale to dynamic graphs with millions of nodes and hundreds of millions of temporal events and achieve new state-of-the-art (SOTA) performance.
arXiv Detail & Related papers (2021-12-14T22:24:53Z) - Event Detection on Dynamic Graphs [4.128347119808724]
Event detection is a critical task for timely decision-making in graph analytics applications.
We propose DyGED, a simple yet novel deep learning model for event detection on dynamic graphs.
arXiv Detail & Related papers (2021-10-23T05:52:03Z) - TCL: Transformer-based Dynamic Graph Modelling via Contrastive Learning [87.38675639186405]
We propose a novel graph neural network approach, called TCL, which deals with the dynamically-evolving graph in a continuous-time fashion.
To the best of our knowledge, this is the first attempt to apply contrastive learning to representation learning on dynamic graphs.
arXiv Detail & Related papers (2021-05-17T15:33:25Z) - Graph Hawkes Neural Network for Forecasting on Temporal Knowledge Graphs [38.56057203198837]
Hawkes process has become a standard method for modeling self-exciting event sequences with different event types.
We propose the Graph Hawkes Neural Network that can capture the dynamics of evolving graph sequences and can predict the occurrence of a fact in a future time instance.
arXiv Detail & Related papers (2020-03-30T12:56:50Z)
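The Hawkes process mentioned in the last entry is the standard model of self-exciting event sequences that the Graph Hawkes Neural Network generalizes. A minimal sketch of its conditional intensity, assuming the common exponential kernel (parameter names and values here are illustrative, not from any paper above):

```python
import numpy as np

def hawkes_intensity(t, event_times, mu=0.2, alpha=0.8, beta=1.0):
    """lambda(t) = mu + alpha * sum_{t_i < t} exp(-beta * (t - t_i)).

    mu is the base rate; each past event t_i adds an excitation that
    decays exponentially at rate beta.
    """
    past = np.asarray([ti for ti in event_times if ti < t])
    return mu + alpha * np.exp(-beta * (t - past)).sum()

base = hawkes_intensity(5.0, [])            # no history: just the base rate mu
excited = hawkes_intensity(5.0, [4.5, 4.9]) # recent events raise the intensity
```

The self-exciting property is what makes the process family suitable for forecasting on temporal knowledge graphs: observing an event makes related events more likely in the near future.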
This list is automatically generated from the titles and abstracts of the papers in this site.