Deep graph kernel point processes
- URL: http://arxiv.org/abs/2306.11313v4
- Date: Mon, 11 Nov 2024 06:12:24 GMT
- Title: Deep graph kernel point processes
- Authors: Zheng Dong, Matthew Repasky, Xiuyuan Cheng, Yao Xie
- Abstract summary: This paper presents a novel point process model for discrete event data over graphs, where the event interaction occurs within a latent graph structure.
The key idea is to represent the influence kernel by Graph Neural Networks (GNN) to capture the underlying graph structure.
Compared with prior works that directly model the conditional intensity function with neural networks, our kernel representation captures the repeated event influence patterns more effectively.
- Score: 17.74234892097879
- License:
- Abstract: Point process models are widely used for continuous asynchronous event data, where each data point includes time and additional information called "marks", which can be locations, nodes, or event types. This paper presents a novel point process model for discrete event data over graphs, where the event interaction occurs within a latent graph structure. Our model builds upon Hawkes's classic influence kernel-based formulation in the original self-exciting point processes work to capture the influence of historical events on future events' occurrence. The key idea is to represent the influence kernel by Graph Neural Networks (GNNs) to capture the underlying graph structure while harvesting the strong representation power of GNNs. Compared with prior works that directly model the conditional intensity function with neural networks, our kernel representation captures the repeated event influence patterns more effectively by combining statistical and deep models, achieving better model estimation/learning efficiency and superior predictive performance. Our work significantly extends the existing deep spatio-temporal kernel for point process data, which is inapplicable to our setting due to a fundamental difference: that kernel assumes a Euclidean observation space, whereas ours is a graph. We present comprehensive experiments on synthetic and real-world data to show the superior performance of the proposed approach against the state-of-the-art in predicting future events and uncovering the relational structure among data.
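The abstract's central idea, a Hawkes-style conditional intensity whose influence kernel depends on the graph, can be sketched in a few lines. This is a minimal toy illustration, not the authors' architecture: the GNN-based kernel is replaced by a positive function of fixed node embeddings, the temporal part is an exponential decay as in the classic Hawkes model, and all names (`influence`, `intensity`, `node_emb`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for GNN node embeddings: the paper parameterizes the kernel
# with a Graph Neural Network; here we use fixed random vectors.
n_nodes, dim = 5, 8
node_emb = rng.normal(size=(n_nodes, dim))

def influence(u, v, dt, decay=1.0):
    """Influence of a past event at node u on node v after time gap dt.

    Graph dependence enters through an embedding inner product (a toy
    proxy for a GNN-based kernel); time dependence is an exponential
    decay, as in the classic self-exciting Hawkes formulation.
    """
    graph_term = np.exp(node_emb[u] @ node_emb[v] / dim)  # strictly positive
    return graph_term * decay * np.exp(-decay * dt)

def intensity(t, v, history, mu=0.1):
    """Conditional intensity lambda(t, v) = mu + sum of past-event influences."""
    return mu + sum(influence(u, v, t - s) for s, u in history if s < t)

history = [(0.2, 0), (0.5, 3)]   # (event time, node) pairs
lam = intensity(1.0, 2, history)
```

Because every influence term is positive, the intensity stays above the baseline rate `mu`; the paper's actual kernel is learned jointly with the GNN rather than fixed as here.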
Related papers
- DPCL-Diff: The Temporal Knowledge Graph Reasoning based on Graph Node Diffusion Model with Dual-Domain Periodic Contrastive Learning [3.645855411897217]
We propose a graph node diffusion model with dual-domain periodic contrastive learning (DPCL-Diff).
GNDiff introduces noise into sparsely related events to simulate new events, generating high-quality data that better conforms to the actual distribution.
DPCL-Diff maps periodic and non-periodic event entities to Poincaré and Euclidean spaces, leveraging their characteristics to distinguish similar periodic events effectively.
arXiv Detail & Related papers (2024-11-03T08:30:29Z) - Revealing Decurve Flows for Generalized Graph Propagation [108.80758541147418]
This study addresses the limitations of the traditional analysis of message-passing, central to graph learning, by defining generalized propagation with directed and weighted graphs.
We include a preliminary exploration of learned propagation patterns in datasets, a first in the field.
arXiv Detail & Related papers (2024-02-13T14:13:17Z) - Exploring the Limits of Historical Information for Temporal Knowledge Graph Extrapolation [59.417443739208146]
We propose a new event forecasting model based on a novel training framework of historical contrastive learning.
CENET learns both the historical and non-historical dependency to distinguish the most potential entities.
We evaluate our proposed model on five benchmark graphs.
arXiv Detail & Related papers (2023-08-29T03:26:38Z) - A Graph Regularized Point Process Model For Event Propagation Sequence [2.9093633827040724]
Point process is the dominant paradigm for modeling event sequences occurring at irregular intervals.
We propose a Graph Regularized Point Process that characterizes the event interactions across nodes with neighbors.
By applying a graph regularization method, GRPP provides model interpretability by uncovering influence strengths between nodes.
arXiv Detail & Related papers (2022-11-21T04:49:59Z) - Spatio-temporal point processes with deep non-stationary kernels [18.10670233156497]
We develop a new deep non-stationary influence kernel that can model non-stationary spatio-temporal point processes.
The main idea is to approximate the influence kernel with a novel and general low-rank decomposition.
We also take a new approach to maintain the non-negativity constraint of the conditional intensity by introducing a log-barrier penalty.
arXiv Detail & Related papers (2022-11-21T04:49:39Z) - CEP3: Community Event Prediction with Neural Point Process on Graph [59.434777403325604]
We propose a novel model combining Graph Neural Networks and Marked Temporal Point Processes (MTPP).
Our experiments demonstrate the superior performance of our model in terms of both model accuracy and training efficiency.
arXiv Detail & Related papers (2022-05-21T15:30:25Z) - Neural Spectral Marked Point Processes [18.507050473968985]
We introduce a novel and general neural network-based non-stationary influence kernel for handling complex discrete events.
We demonstrate the superior performance of our proposed method compared with the state-of-the-art on synthetic and real data.
arXiv Detail & Related papers (2021-06-20T23:00:37Z) - CatGCN: Graph Convolutional Networks with Categorical Node Features [99.555850712725]
CatGCN is tailored for graph learning when the node features are categorical.
We train CatGCN in an end-to-end fashion and demonstrate it on semi-supervised node classification.
arXiv Detail & Related papers (2020-09-11T09:25:17Z) - Towards Deeper Graph Neural Networks [63.46470695525957]
Graph convolutions perform neighborhood aggregation and represent one of the most important graph operations. Stacking many such layers, however, can degrade performance; several recent studies attribute this deterioration to the over-smoothing issue.
We propose Deep Adaptive Graph Neural Network (DAGNN) to adaptively incorporate information from large receptive fields.
arXiv Detail & Related papers (2020-07-18T01:11:14Z) - A Multi-Channel Neural Graphical Event Model with Negative Evidence [76.51278722190607]
Event datasets are sequences of events of various types occurring irregularly over the time-line.
We propose a non-parametric deep neural network approach in order to estimate the underlying intensity functions.
arXiv Detail & Related papers (2020-02-21T23:10:50Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.