Non-Autoregressive Diffusion-based Temporal Point Processes for
Continuous-Time Long-Term Event Prediction
- URL: http://arxiv.org/abs/2311.01033v1
- Date: Thu, 2 Nov 2023 06:52:44 GMT
- Title: Non-Autoregressive Diffusion-based Temporal Point Processes for
Continuous-Time Long-Term Event Prediction
- Authors: Wang-Tao Zhou, Zhao Kang, Ling Tian
- Abstract summary: We propose a diffusion-based non-autoregressive temporal point process model for long-term event prediction in continuous time.
In order to perform diffusion processes on event sequences, we develop a bidirectional map between target event sequences and the Euclidean vector space.
Experiments are conducted to prove the superiority of our proposed model over state-of-the-art methods on long-term event prediction in continuous time.
- Score: 8.88485011274486
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Continuous-time long-term event prediction plays an important role in many
application scenarios. Most existing works rely on autoregressive frameworks to
predict event sequences, which suffer from error accumulation, thus
compromising prediction quality. Inspired by the success of denoising diffusion
probabilistic models, we propose a diffusion-based non-autoregressive temporal
point process model for long-term event prediction in continuous time. Instead
of generating events one at a time in an autoregressive way, our model predicts
the future event sequence entirely as a whole. In order to perform diffusion
processes on event sequences, we develop a bidirectional map between target
event sequences and the Euclidean vector space. Furthermore, we design a novel
denoising network to capture both sequential and contextual features for better
sample quality. Extensive experiments are conducted to prove the superiority of
our proposed model over state-of-the-art methods on long-term event prediction
in continuous time. To the best of our knowledge, this is the first work to
apply diffusion methods to long-term event prediction problems.
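The abstract names two components, a bidirectional map from event sequences to a Euclidean vector space and a denoising network, without detailing either. Below is a minimal sketch of how such a map and the standard DDPM forward noising could look, assuming inter-event times are log-transformed and a linear noise schedule is used; these choices, the omission of categorical marks, and every name in the snippet are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

EPS = 1e-8  # keeps log() finite for near-zero inter-event gaps

def encode(event_times):
    """Forward direction of the sequence<->vector map: absolute event times ->
    unconstrained Euclidean vector of log inter-event times."""
    times = np.asarray(event_times, dtype=np.float64)
    gaps = np.diff(times, prepend=0.0)      # positive inter-event times
    return np.log(gaps + EPS)

def decode(x):
    """Backward direction: Euclidean vector -> absolute, increasing event times."""
    gaps = np.maximum(np.exp(x) - EPS, 0.0)
    return np.cumsum(gaps)

def forward_noise(x0, t, betas, rng):
    """DDPM forward process: q(x_t | x_0) = N(sqrt(a_bar_t) * x_0, (1 - a_bar_t) * I)."""
    a_bar = np.cumprod(1.0 - betas)[t]
    noise = rng.standard_normal(x0.shape)
    x_t = np.sqrt(a_bar) * x0 + np.sqrt(1.0 - a_bar) * noise
    return x_t, noise

# Usage: encode a short future window, then noise it at a middle diffusion step.
rng = np.random.default_rng(0)
betas = np.linspace(1e-4, 0.02, 1000)       # assumed linear schedule, 1000 steps
x0 = encode([0.4, 1.1, 1.9, 3.2])
x_t, noise = forward_noise(x0, t=500, betas=betas, rng=rng)
assert np.allclose(decode(x0), [0.4, 1.1, 1.9, 3.2])  # the map round-trips the times
```

A denoising network conditioned on the observed history would then be trained to predict `noise` from `x_t` and the step `t`; running the learned reverse process generates the whole future window at once, which `decode` maps back to event times, in contrast to autoregressive one-event-at-a-time generation.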
Related papers
- EventFlow: Forecasting Continuous-Time Event Data with Flow Matching [12.976042923229466]
We propose EventFlow, a non-autoregressive generative model for temporal point processes.
Our model builds on the flow matching framework to directly learn joint distributions over event times, side-stepping the autoregressive process (a generic flow-matching sketch appears after this list).
arXiv Detail & Related papers (2024-10-09T20:57:00Z)
- Diffusion Forcing: Next-token Prediction Meets Full-Sequence Diffusion [61.03681839276652]
Diffusion Forcing is a new training paradigm where a diffusion model is trained to denoise a set of tokens with independent per-token noise levels.
We apply Diffusion Forcing to sequence generative modeling by training a causal next-token prediction model to generate one or several future tokens.
arXiv Detail & Related papers (2024-07-01T15:43:25Z)
- Meta-Learning for Neural Network-based Temporal Point Processes [36.31950058651308]
The point process is widely used to predict events related to human activities.
Recent high-performance point process models require the input of sufficient numbers of events collected over a long period.
We propose a novel meta-learning approach for periodicity-aware prediction of future events given short sequences.
arXiv Detail & Related papers (2024-01-29T02:42:22Z)
- Interacting Diffusion Processes for Event Sequence Forecasting [20.380620709345898]
We introduce a novel approach that incorporates a diffusion generative model.
The model facilitates sequence-to-sequence prediction, allowing multi-step predictions based on historical event sequences.
We demonstrate that our approach outperforms state-of-the-art baselines for long-horizon forecasting of TPPs.
arXiv Detail & Related papers (2023-10-26T22:17:25Z)
- Performative Time-Series Forecasting [71.18553214204978]
We formalize performative time-series forecasting (PeTS) from a machine-learning perspective.
We propose a novel approach, Feature Performative-Shifting (FPS), which leverages the concept of delayed response to anticipate distribution shifts.
We conduct comprehensive experiments using multiple time-series models on COVID-19 and traffic forecasting tasks.
arXiv Detail & Related papers (2023-10-09T18:34:29Z)
- Generative Time Series Forecasting with Diffusion, Denoise, and Disentanglement [51.55157852647306]
Time series forecasting has been a widely explored task of great importance in many applications.
It is common that real-world time series data are recorded over a short time period, which results in a large gap between the capacity of deep models and the limited, noisy time series available.
We propose to address the time series forecasting problem with generative modeling and propose a bidirectional variational auto-encoder equipped with diffusion, denoise, and disentanglement.
arXiv Detail & Related papers (2023-01-08T12:20:46Z)
- Towards Out-of-Distribution Sequential Event Prediction: A Causal Treatment [72.50906475214457]
The goal of sequential event prediction is to estimate the next event based on a sequence of historical events.
In practice, the next-event prediction models are trained with sequential data collected at one time.
We propose a framework with hierarchical branching structures for learning context-specific representations.
arXiv Detail & Related papers (2022-10-24T07:54:13Z)
- HyperHawkes: Hypernetwork based Neural Temporal Point Process [5.607676459156789]
Temporal point processes serve as an essential tool for modeling time-to-event data in continuous time.
However, such models do not generalize well to predicting events from unseen sequences in a dynamic environment.
We propose HyperHawkes, a hypernetwork-based temporal point process framework.
arXiv Detail & Related papers (2022-10-01T07:14:19Z)
- Ambiguity in Sequential Data: Predicting Uncertain Futures with Recurrent Models [110.82452096672182]
We propose an extension of the Multiple Hypothesis Prediction (MHP) model to handle ambiguous predictions with sequential data.
We also introduce a novel metric for ambiguous problems, which is better suited to account for uncertainties.
arXiv Detail & Related papers (2020-03-10T09:15:42Z)
- A Multi-Channel Neural Graphical Event Model with Negative Evidence [76.51278722190607]
Event datasets are sequences of events of various types occurring irregularly over the timeline.
We propose a non-parametric deep neural network approach in order to estimate the underlying intensity functions.
arXiv Detail & Related papers (2020-02-21T23:10:50Z)
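The EventFlow entry above mentions learning joint distributions over event times via flow matching. The sketch below shows only the generic conditional flow-matching regression target with a linear noise-to-data path, applied to an encoded vector of event times as in the earlier sketch; it is an illustration under those assumptions, not EventFlow's actual parameterization, and `v_theta` is a hypothetical velocity network.

```python
import numpy as np

def flow_matching_target(x1, rng):
    """Generic conditional flow matching with a linear path: sample a point on the
    straight line from noise x0 to data x1 and return the velocity regression target."""
    x0 = rng.standard_normal(x1.shape)   # source sample: Gaussian noise
    t = rng.uniform()                    # path time, t in [0, 1]
    x_t = (1.0 - t) * x0 + t * x1        # point on the interpolation path
    v_target = x1 - x0                   # d x_t / d t along that path
    return x_t, t, v_target

# Usage: one training target for an encoded future window (log inter-event times,
# as in the earlier sketch). A hypothetical velocity network v_theta(x_t, t, history)
# would be fit by MSE against v_target; integrating it from t=0 to t=1 at inference
# maps noise to an entire future sequence in one non-autoregressive pass.
rng = np.random.default_rng(0)
x1 = np.log(np.diff([0.4, 1.1, 1.9, 3.2], prepend=0.0))
x_t, t, v_target = flow_matching_target(x1, rng)
```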
This list is automatically generated from the titles and abstracts of the papers on this site.