Variational Neural Temporal Point Process
- URL: http://arxiv.org/abs/2202.10585v1
- Date: Thu, 17 Feb 2022 13:34:30 GMT
- Title: Variational Neural Temporal Point Process
- Authors: Deokjun Eom, Sehyun Lee, Jaesik Choi
- Abstract summary: A temporal point process is a stochastic process that predicts which type of event is likely to happen and when it will occur.
We introduce inference and generative networks, and train a latent variable distribution to handle the stochastic properties of event sequences in a deep neural network.
We empirically demonstrate that our model can generalize the representations of various event types.
- Score: 22.396329275957996
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A temporal point process is a stochastic process that predicts which type of
event is likely to happen and when it will occur, given a history of past events.
Occurrence dynamics of this kind appear throughout daily life, and it is important to
learn these temporal dynamics and solve two distinct prediction problems: time
prediction and type prediction. In particular, deep neural network based models have
outperformed statistical models such as Hawkes processes and Poisson processes.
However, many existing approaches overfit to specific events instead of learning and
predicting various event types. As a result, such approaches cannot cope with changed
relationships between events and fail to predict the intensity functions of temporal
point processes well. In this paper, to solve these problems, we propose a variational
neural temporal point process (VNTPP). We introduce an inference network and a
generative network, and train a latent variable distribution to capture the stochastic
properties of event sequences within a deep neural network. The intensity functions
are computed from the latent variable distribution, so we can predict event types and
the arrival times of events more accurately. We empirically demonstrate that our model
can generalize the representations of various event types. Moreover, we show
quantitatively and qualitatively that our model outperforms other deep neural network
based models and statistical processes on synthetic and real-world datasets.
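To make the abstract's recipe concrete, here is a minimal sketch of how a latent-variable neural temporal point process along these lines might be wired: a recurrent encoder summarizes the event history, an inference network produces a latent distribution, and a generative head maps the sampled latent to positive per-type intensities. The module name, layer sizes, and loss terms are illustrative assumptions, not the authors' released implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LatentTPPSketch(nn.Module):
    """Hypothetical VAE-style temporal point process (not the VNTPP authors' code)."""

    def __init__(self, num_types: int, hidden: int = 64, latent: int = 16):
        super().__init__()
        self.type_emb = nn.Embedding(num_types, hidden)
        # +1 input feature for the inter-event time
        self.rnn = nn.GRU(hidden + 1, hidden, batch_first=True)
        self.q_mu = nn.Linear(hidden, latent)         # inference network: q(z | history)
        self.q_logvar = nn.Linear(hidden, latent)
        self.decoder = nn.Linear(hidden + latent, num_types)  # generative network

    def forward(self, types: torch.Tensor, dts: torch.Tensor):
        # types: (B, T) integer event types; dts: (B, T) inter-event times
        x = torch.cat([self.type_emb(types), dts.unsqueeze(-1)], dim=-1)
        h, _ = self.rnn(x)                             # per-step history embeddings
        mu, logvar = self.q_mu(h), self.q_logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterization trick
        # Positive per-type intensities computed from the latent variable.
        lam = F.softplus(self.decoder(torch.cat([h, z], dim=-1)))
        kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp(), dim=-1)
        return lam, kl  # lam: (B, T, num_types); kl: (B, T), for an ELBO-style loss
```

In such a setup, the next event type can be read off the per-type intensities and the arrival time obtained by integrating or sampling from the total intensity; a variational objective would combine the event log-likelihood under `lam` with the KL term returned here, though the paper's exact loss may differ.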
Related papers
- Decoupled Marked Temporal Point Process using Neural Ordinary Differential Equations [14.828081841581296]
A Marked Temporal Point Process (MTPP) is a stochastic process whose realization is a set of marked event-time data.
Recent studies have utilized deep neural networks to capture complex temporal dependencies of events.
We propose a Decoupled MTPP framework that disentangles the characterization of a process into a set of evolving influences from different events.
arXiv Detail & Related papers (2024-06-10T10:15:32Z) - Neural multi-event forecasting on spatio-temporal point processes using probabilistically enriched transformers [18.66217537327045]
Predicting discrete events in time and space has many scientific applications, such as predicting hazardous earthquakes and outbreaks of infectious diseases.
We propose a new neural architecture for multi-event forecasting over spatio-temporal point processes, utilizing transformers augmented with normalizing flows and probabilistic layers.
Our network makes batched predictions of complex future discrete events, achieving state-of-the-art performance on a variety of benchmark datasets.
arXiv Detail & Related papers (2022-11-05T14:55:36Z) - Towards Out-of-Distribution Sequential Event Prediction: A Causal Treatment [72.50906475214457]
The goal of sequential event prediction is to estimate the next event based on a sequence of historical events.
In practice, next-event prediction models are trained on sequential data collected at a single point in time.
We propose a framework with hierarchical branching structures for learning context-specific representations.
arXiv Detail & Related papers (2022-10-24T07:54:13Z) - Multi-scale Attention Flow for Probabilistic Time Series Forecasting [68.20798558048678]
We propose a novel non-autoregressive deep learning model, called Multi-scale Attention Normalizing Flow (MANF).
Our model avoids the influence of cumulative error and does not increase the time complexity.
Our model achieves state-of-the-art performance on many popular multivariate datasets.
arXiv Detail & Related papers (2022-05-16T07:53:42Z) - Bayesian Neural Hawkes Process for Event Uncertainty Prediction [0.2148535041822524]
Models for predicting time of occurrence play a significant role in a diverse set of applications like social networks, financial transactions, healthcare, and human mobility.
Recent works have introduced neural network based point processes for modeling event times, and these have been shown to provide state-of-the-art performance in predicting event times.
We propose a novel point process model, the Bayesian Neural Hawkes process, which leverages the uncertainty modelling capability of Bayesian models and the generalization capability of neural networks.
arXiv Detail & Related papers (2021-12-29T09:47:22Z) - Dynamic Hawkes Processes for Discovering Time-evolving Communities' States behind Diffusion Processes [57.22860407362061]
We propose a novel Hawkes process model that is able to capture the underlying dynamics of community states behind the diffusion processes.
The proposed method, termed DHP, offers a flexible way to learn complex representations of the time-evolving communities' states.
arXiv Detail & Related papers (2021-05-24T08:35:48Z) - Synergetic Learning of Heterogeneous Temporal Sequences for Multi-Horizon Probabilistic Forecasting [48.8617204809538]
We propose Variational Synergetic Multi-Horizon Network (VSMHN), a novel deep conditional generative model.
To learn complex correlations across heterogeneous sequences, a tailored encoder is devised to combine advances in deep point process models and variational recurrent neural networks.
Our model can be trained effectively using variational inference and generates predictions with Monte-Carlo simulation (a thinning-based sampling sketch follows this list).
arXiv Detail & Related papers (2021-01-31T11:00:55Z) - User-Dependent Neural Sequence Models for Continuous-Time Event Data [27.45413274751265]
Continuous-time event data are common in applications such as individual behavior data, financial transactions, and medical health records.
Recurrent neural networks that parameterize time-varying intensity functions are the current state-of-the-art for predictive modeling with such data.
In this paper, we extend the broad class of neural marked point process models to mixtures of latent embeddings.
arXiv Detail & Related papers (2020-11-06T08:32:57Z) - Neural Conditional Event Time Models [11.920908437656413]
Event time models predict occurrence times of an event of interest based on known features.
We develop a conditional event time model that distinguishes between a) the probability of event occurrence, and b) the predicted time of occurrence.
Results demonstrate superior event occurrence and event time predictions on synthetic data, medical events (MIMIC-III), and social media posts.
arXiv Detail & Related papers (2020-04-03T05:08:13Z) - A Multi-Channel Neural Graphical Event Model with Negative Evidence [76.51278722190607]
Event datasets are sequences of events of various types occurring irregularly over the timeline.
We propose a non-parametric deep neural network approach in order to estimate the underlying intensity functions.
arXiv Detail & Related papers (2020-02-21T23:10:50Z) - Transformer Hawkes Process [79.16290557505211]
We propose a Transformer Hawkes Process (THP) model, which leverages the self-attention mechanism to capture long-term dependencies.
THP outperforms existing models in terms of both likelihood and event prediction accuracy by a notable margin.
We provide a concrete example where THP achieves improved prediction performance for learning multiple point processes when incorporating their relational information (a minimal self-attention sketch follows this list).
arXiv Detail & Related papers (2020-02-21T13:48:13Z)
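As a companion to the Transformer Hawkes Process entry above, here is a minimal sketch of self-attention over an event sequence: type embeddings plus a sinusoidal encoding of event times, a causally masked attention layer, and a softplus head producing positive per-type intensities. The class name, the time encoding, and the intensity head are simplified assumptions, not the THP authors' implementation.

```python
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentiveIntensitySketch(nn.Module):
    """Toy self-attention intensity model (not the THP authors' code)."""

    def __init__(self, num_types: int, d_model: int = 64, heads: int = 4):
        super().__init__()
        self.type_emb = nn.Embedding(num_types, d_model)
        self.attn = nn.MultiheadAttention(d_model, heads, batch_first=True)
        self.head = nn.Linear(d_model, num_types)

    def time_encoding(self, t: torch.Tensor) -> torch.Tensor:
        # Sinusoidal encoding of absolute event times (an assumed design choice).
        d = self.type_emb.embedding_dim
        freqs = torch.exp(-math.log(10000.0) * torch.arange(0, d, 2, device=t.device) / d)
        ang = t.unsqueeze(-1) * freqs                  # (B, T, d/2)
        return torch.cat([torch.sin(ang), torch.cos(ang)], dim=-1)

    def forward(self, types: torch.Tensor, times: torch.Tensor) -> torch.Tensor:
        # types: (B, T) event-type ids; times: (B, T) absolute timestamps
        x = self.type_emb(types) + self.time_encoding(times)
        T = types.size(1)
        causal = torch.triu(torch.ones(T, T, dtype=torch.bool, device=x.device), diagonal=1)
        h, _ = self.attn(x, x, x, attn_mask=causal)    # each event attends only to its past
        return F.softplus(self.head(h))                # positive per-type intensities
```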
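Several entries above (VSMHN's Monte-Carlo predictions, the Hawkes-based models) turn a learned intensity into sampled event times. A standard way to do that is Ogata's thinning algorithm; the sketch below is simplified to use a constant upper bound, and `intensity_fn` and `lam_max` are assumptions: any positive intensity function and a valid bound on it over the horizon.

```python
import math
import random

def sample_next_time(intensity_fn, t0: float, lam_max: float) -> float:
    """Ogata-style thinning with a constant upper bound (simplified)."""
    t = t0
    while True:
        t += random.expovariate(lam_max)           # candidate from a rate-lam_max process
        if random.random() < intensity_fn(t) / lam_max:
            return t                               # accept with prob intensity(t) / lam_max

# Example: one past event at t = 0 under a Hawkes-style kernel,
# lambda(t) = mu + alpha * exp(-beta * t), which is bounded by mu + alpha.
mu, alpha, beta = 0.5, 0.8, 1.0
lam = lambda t: mu + alpha * math.exp(-beta * t)
print(sample_next_time(lam, t0=0.0, lam_max=mu + alpha))
```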
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences.