An Empirical Study: Extensive Deep Temporal Point Process
- URL: http://arxiv.org/abs/2110.09823v2
- Date: Thu, 21 Oct 2021 14:09:31 GMT
- Title: An Empirical Study: Extensive Deep Temporal Point Process
- Authors: Haitao Lin, Cheng Tan, Lirong Wu, Zhangyang Gao, and Stan Z. Li
- Abstract summary: We first review recent research emphasis and difficulties in modeling asynchronous event sequences with deep temporal point processes.
We propose a Granger causality discovery framework for exploiting the relations among multi-types of events.
- Score: 23.9359814366167
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Temporal point processes, as stochastic processes on a continuous time domain, are commonly used to model asynchronous event sequences featuring occurrence timestamps. Because of the strong expressivity of deep neural networks, they are emerging as a promising choice for capturing the patterns in asynchronous sequences, in the context of temporal point processes. In this paper, we first review recent research emphases and difficulties in modeling asynchronous event sequences with deep temporal point processes, which can be grouped into four fields: encoding of the history sequence, formulation of the conditional intensity function, relational discovery of events, and learning approaches for optimization. We introduce most recently proposed models by dismantling them into these four parts, and conduct experiments by re-modularizing the first three parts with the same learning strategy for a fair empirical evaluation. In addition, we extend the family of history encoders and conditional intensity functions, and propose a Granger causality discovery framework for exploiting the relations among multiple types of events. Because Granger causality can be represented by a Granger causality graph, discrete graph structure learning in the framework of variational inference is employed to reveal latent structures of the graph, and further experiments show that the proposed framework with the learned latent graph can both capture the relations and achieve improved fitting and prediction performance.
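The four-part decomposition in the abstract (history encoding, conditional intensity, relational discovery, learning) can be illustrated with a minimal sketch. The snippet below uses a classical exponential-kernel Hawkes intensity in place of the paper's neural history encoders, with a binary adjacency matrix `A` standing in for the Granger causality graph; the function names, kernel choice, and parameters are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def intensity(t, history, mu, alpha, beta, A):
    """Conditional intensity for each of K event types at time t.

    Exponential-kernel Hawkes form:
        lambda_k(t) = mu_k + sum over past events (t_i, k_i) of
                      A[k_i, k] * alpha * exp(-beta * (t - t_i))
    The matrix A plays the role of a Granger causality graph:
    A[i, j] = 0 means events of type i do not excite type j.
    """
    lam = mu.copy()
    for t_i, k_i in history:
        if t_i < t:  # only strictly past events influence t
            lam += A[k_i] * alpha * np.exp(-beta * (t - t_i))
    return lam

def log_likelihood(events, T, mu, alpha, beta, A):
    """TPP log-likelihood on [0, T]:
        sum_i log lambda_{k_i}(t_i) - integral_0^T sum_k lambda_k(s) ds
    The compensator integral has a closed form for the exponential kernel.
    """
    ll = 0.0
    for i, (t_i, k_i) in enumerate(events):
        lam = intensity(t_i, events[:i], mu, alpha, beta, A)
        ll += np.log(lam[k_i])
    # closed-form compensator: baseline part + decayed excitation part
    comp = mu.sum() * T
    for t_i, k_i in events:
        comp += A[k_i].sum() * (alpha / beta) * (1.0 - np.exp(-beta * (T - t_i)))
    return ll - comp
```

Setting a row of `A` to zero removes the influence of that event type on all others, which is exactly the non-causality statement a learned Granger graph encodes; the paper instead learns such a discrete graph via variational inference and replaces the fixed kernel with neural encoders.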
Related papers
- Decoupled Marked Temporal Point Process using Neural Ordinary Differential Equations [14.828081841581296]
A Marked Temporal Point Process (MTPP) is a process whose realization is a set of event-time data.
Recent studies have utilized deep neural networks to capture complex temporal dependencies of events.
We propose a Decoupled MTPP framework that disentangles characterization of a process into a set of evolving influences from different events.
arXiv Detail & Related papers (2024-06-10T10:15:32Z)
- Enhancing Asynchronous Time Series Forecasting with Contrastive Relational Inference [21.51753838306655]
Temporal point processes (TPPs) are the standard method for modeling such asynchronous event sequences.
Existing TPP models have focused on the conditional distribution of future events instead of explicitly modeling event interactions, which poses challenges for event prediction.
We propose a novel approach that leverages Neural Relational Inference (NRI) to learn a graph that infers interactions while simultaneously learning dynamics patterns from observational data.
arXiv Detail & Related papers (2023-09-06T09:47:03Z)
- Exploring the Limits of Historical Information for Temporal Knowledge Graph Extrapolation [59.417443739208146]
We propose a new event forecasting model based on a novel training framework of historical contrastive learning.
CENET learns both the historical and non-historical dependency to distinguish the most likely entities.
We evaluate our proposed model on five benchmark graphs.
arXiv Detail & Related papers (2023-08-29T03:26:38Z)
- Deep graph kernel point processes [17.74234892097879]
This paper presents a novel point process model for discrete event data over graphs, where the event interaction occurs within a latent graph structure.
The key idea is to represent the influence kernel by Graph Neural Networks (GNN) to capture the underlying graph structure.
Compared with prior works focusing on directly modeling the conditional intensity function using neural networks, our kernel presentation captures the repeated event influence patterns more effectively.
arXiv Detail & Related papers (2023-06-20T06:15:19Z)
- Intensity Profile Projection: A Framework for Continuous-Time Representation Learning for Dynamic Networks [50.2033914945157]
We present a representation learning framework, Intensity Profile Projection, for continuous-time dynamic network data.
The framework consists of three stages, including estimating pairwise intensity functions and learning a projection which minimises a notion of intensity reconstruction error.
Moreover, we develop estimation theory providing tight control on the error of any estimated trajectory, indicating that the representations could even be used in quite noise-sensitive follow-on analyses.
arXiv Detail & Related papers (2023-06-09T15:38:25Z)
- Towards Out-of-Distribution Sequential Event Prediction: A Causal Treatment [72.50906475214457]
The goal of sequential event prediction is to estimate the next event based on a sequence of historical events.
In practice, the next-event prediction models are trained with sequential data collected at one time.
We propose a framework with hierarchical branching structures for learning context-specific representations.
arXiv Detail & Related papers (2022-10-24T07:54:13Z)
- Learning the Evolutionary and Multi-scale Graph Structure for Multivariate Time Series Forecasting [50.901984244738806]
We show how to model the evolutionary and multi-scale interactions of time series.
In particular, we first provide a hierarchical graph structure combined with dilated convolution to capture the scale-specific correlations.
A unified neural network is provided to integrate the components above to get the final prediction.
arXiv Detail & Related papers (2022-06-28T08:11:12Z)
- Deep Recurrent Modelling of Granger Causality with Latent Confounding [0.0]
We propose a deep learning-based approach to model non-linear Granger causality by directly accounting for latent confounders.
We demonstrate the model performance on non-linear time series for which the latent confounder influences the cause and effect with different time lags.
arXiv Detail & Related papers (2022-02-23T03:26:22Z)
- Extracting Temporal Event Relation with Syntactic-Guided Temporal Graph Transformer [17.850316385809617]
We propose a new Temporal Graph Transformer network to explicitly find the connection between two events from a syntactic graph constructed from one or two continuous sentences.
Experiments on MATRES and TB-Dense datasets show that our approach significantly outperforms previous state-of-the-art methods on both end-to-end temporal relation extraction and temporal relation classification.
arXiv Detail & Related papers (2021-04-19T19:00:45Z)
- Synergetic Learning of Heterogeneous Temporal Sequences for Multi-Horizon Probabilistic Forecasting [48.8617204809538]
We propose Variational Synergetic Multi-Horizon Network (VSMHN), a novel deep conditional generative model.
To learn complex correlations across heterogeneous sequences, a tailored encoder is devised to combine the advances in deep point process models and variational recurrent neural networks.
Our model can be trained effectively using variational inference and generates predictions with Monte-Carlo simulation.
arXiv Detail & Related papers (2021-01-31T11:00:55Z)
- A Multi-Channel Neural Graphical Event Model with Negative Evidence [76.51278722190607]
Event datasets are sequences of events of various types occurring irregularly over the time-line.
We propose a non-parametric deep neural network approach in order to estimate the underlying intensity functions.
arXiv Detail & Related papers (2020-02-21T23:10:50Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the accuracy of the information above and is not responsible for any consequences of its use.