Interacting Diffusion Processes for Event Sequence Forecasting
- URL: http://arxiv.org/abs/2310.17800v2
- Date: Sat, 20 Jul 2024 02:52:55 GMT
- Title: Interacting Diffusion Processes for Event Sequence Forecasting
- Authors: Mai Zeng, Florence Regol, Mark Coates
- Abstract summary: We introduce a novel approach that incorporates a diffusion generative model.
The model facilitates sequence-to-sequence prediction, allowing multi-step predictions based on historical event sequences.
We demonstrate that our proposal outperforms state-of-the-art baselines for long-horizon forecasting of TPPs.
- Score: 20.380620709345898
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Neural Temporal Point Processes (TPPs) have emerged as the primary framework for predicting sequences of events that occur at irregular time intervals, but their sequential nature can hamper performance for long-horizon forecasts. To address this, we introduce a novel approach that incorporates a diffusion generative model. The model facilitates sequence-to-sequence prediction, allowing multi-step predictions based on historical event sequences. In contrast to previous approaches, our model directly learns the joint probability distribution of types and inter-arrival times for multiple events. This allows us to fully leverage the high-dimensional modeling capability of modern generative models. Our model is composed of two diffusion processes, one for the time intervals and one for the event types. These processes interact through their respective denoising functions, which can take as input intermediate representations from both processes, allowing the model to learn complex interactions. We demonstrate that our proposal outperforms state-of-the-art baselines for long-horizon forecasting of TPPs.
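To make the interaction between the two processes concrete, the following is a minimal PyTorch sketch of a pair of coupled denoisers, one for the inter-arrival times and one for the event-type embeddings, each conditioned on a history encoding and on intermediate features from the other process. All module names, dimensions, and the simple concatenation-based feature exchange are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of two interacting denoisers: one for inter-arrival times and
# one for event-type embeddings, conditioned on an encoding of the event
# history. Architecture details here are illustrative assumptions only.
import torch
import torch.nn as nn


class InteractingDenoisers(nn.Module):
    def __init__(self, horizon: int, num_types: int, hidden: int = 64):
        super().__init__()
        self.type_embed = nn.Embedding(num_types, hidden)
        # History encoder summarising past (time, type) pairs into one vector.
        self.history_enc = nn.GRU(input_size=1 + hidden, hidden_size=hidden,
                                  batch_first=True)
        # Per-process feature extractors producing intermediate representations.
        self.time_feat = nn.Linear(horizon, hidden)
        self.type_feat = nn.Linear(horizon * hidden, hidden)
        self.step_embed = nn.Embedding(1000, hidden)
        # Denoising heads: each sees its own features, the other process's
        # features, the history summary, and a diffusion-step embedding.
        self.time_head = nn.Sequential(
            nn.Linear(4 * hidden, hidden), nn.SiLU(), nn.Linear(hidden, horizon))
        self.type_head = nn.Sequential(
            nn.Linear(4 * hidden, hidden), nn.SiLU(),
            nn.Linear(hidden, horizon * hidden))

    def forward(self, noisy_times, noisy_type_emb, hist_times, hist_types, t):
        # Encode the observed history of inter-arrival times and event types.
        hist = torch.cat([hist_times.unsqueeze(-1),
                          self.type_embed(hist_types)], dim=-1)
        _, h = self.history_enc(hist)
        h = h.squeeze(0)
        # Intermediate representations of the two noisy targets.
        f_time = self.time_feat(noisy_times)
        f_type = self.type_feat(noisy_type_emb.flatten(1))
        s = self.step_embed(t)
        # Each denoiser consumes both representations, coupling the processes.
        joint_time = torch.cat([f_time, f_type, h, s], dim=-1)
        joint_type = torch.cat([f_type, f_time, h, s], dim=-1)
        eps_time = self.time_head(joint_time)                        # (B, horizon)
        eps_type = self.type_head(joint_type).view_as(noisy_type_emb)
        return eps_time, eps_type


# Shape check only; a full model would wrap this in a DDPM-style training loop.
model = InteractingDenoisers(horizon=20, num_types=10)
eps_t, eps_k = model(torch.randn(4, 20), torch.randn(4, 20, 64),
                     torch.rand(4, 30), torch.randint(0, 10, (4, 30)),
                     torch.randint(0, 1000, (4,)))
print(eps_t.shape, eps_k.shape)
```

Concatenating the two processes' intermediate features is only the simplest form of coupling; cross-attention between the two denoisers would be a natural alternative within the same interacting-diffusion framing.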
Related papers
- On the Efficient Marginalization of Probabilistic Sequence Models [3.5897534810405403]
This dissertation focuses on using autoregressive models to answer complex probabilistic queries.
We develop a class of novel and efficient approximation techniques for marginalization in sequential models that are model-agnostic.
arXiv Detail & Related papers (2024-03-06T19:29:08Z)
- Cumulative Distribution Function based General Temporal Point Processes [49.758080415846884]
The CuFun model represents a novel approach to TPPs that revolves around the Cumulative Distribution Function (CDF).
Our approach addresses several critical issues inherent in traditional TPP modeling.
Our contributions include a pioneering CDF-based TPP model and a methodology for incorporating past event information into future event prediction (a minimal CDF-parameterization sketch is given after this list).
arXiv Detail & Related papers (2024-02-01T07:21:30Z)
- Add and Thin: Diffusion for Temporal Point Processes [24.4686728569167]
ADD-THIN is a principled probabilistic denoising diffusion model for temporal point processes (TPPs) that operates on entire event sequences.
In experiments on synthetic and real-world datasets, it matches state-of-the-art TPP models in density estimation and strongly outperforms them in forecasting.
arXiv Detail & Related papers (2023-11-02T10:42:35Z)
- Non-Autoregressive Diffusion-based Temporal Point Processes for Continuous-Time Long-Term Event Prediction [8.88485011274486]
We propose a diffusion-based non-autoregressive temporal point process model for long-term event prediction in continuous time.
In order to perform diffusion processes on event sequences, we develop a bidirectional map between target event sequences and a Euclidean vector space.
Experiments demonstrate the superiority of our proposed model over state-of-the-art methods for long-term event prediction in continuous time.
arXiv Detail & Related papers (2023-11-02T06:52:44Z)
- Enhancing Asynchronous Time Series Forecasting with Contrastive Relational Inference [21.51753838306655]
Temporal point processes (TPPs) are the standard method for modeling such asynchronous event sequences.
Existing TPP models focus on the conditional distribution of future events instead of explicitly modeling event interactions, which poses challenges for event prediction.
We propose a novel approach that leverages Neural Relational Inference (NRI) to learn a graph of inferred interactions while simultaneously learning dynamics patterns from observational data.
arXiv Detail & Related papers (2023-09-06T09:47:03Z)
- Generative Time Series Forecasting with Diffusion, Denoise, and Disentanglement [51.55157852647306]
Time series forecasting has been a widely explored task of great importance in many applications.
Real-world time series data are often recorded over short time periods, leaving a large gap between deep models and the limited, noisy observations.
We address the time series forecasting problem with generative modeling, proposing a bidirectional variational auto-encoder equipped with diffusion, denoising, and disentanglement.
arXiv Detail & Related papers (2023-01-08T12:20:46Z)
- Multi-scale Attention Flow for Probabilistic Time Series Forecasting [68.20798558048678]
We propose a novel non-autoregressive deep learning model called Multi-scale Attention Normalizing Flow (MANF).
Our model avoids the influence of cumulative error and does not increase the time complexity.
Our model achieves state-of-the-art performance on many popular multivariate datasets.
arXiv Detail & Related papers (2022-05-16T07:53:42Z)
- Complex Event Forecasting with Prediction Suffix Trees: Extended Technical Report [70.7321040534471]
Complex Event Recognition (CER) systems have become popular in the past two decades due to their ability to "instantly" detect patterns on real-time streams of events.
There is a lack of methods for forecasting when a pattern might occur before such an occurrence is actually detected by a CER engine.
We present a formal framework that attempts to address the issue of Complex Event Forecasting.
arXiv Detail & Related papers (2021-09-01T09:52:31Z)
- Interval-censored Hawkes processes [82.87738318505582]
We propose a model to estimate the parameters of a Hawkes process in interval-censored settings.
We show how a non-homogeneous approximation to the Hawkes process admits a tractable likelihood in the interval-censored setting.
arXiv Detail & Related papers (2021-04-16T07:29:04Z)
- Synergetic Learning of Heterogeneous Temporal Sequences for Multi-Horizon Probabilistic Forecasting [48.8617204809538]
We propose Variational Synergetic Multi-Horizon Network (VSMHN), a novel deep conditional generative model.
To learn complex correlations across heterogeneous sequences, a tailored encoder is devised to combine the advances in deep point processes models and variational recurrent neural networks.
Our model can be trained effectively using variational inference and generates predictions with Monte-Carlo simulation.
arXiv Detail & Related papers (2021-01-31T11:00:55Z)
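As promised in the CuFun entry above, here is a minimal sketch of a CDF-parameterized next-event-time head: a network whose weights on the inter-arrival time are kept positive, so its output is monotonically non-decreasing in that time and can serve as a conditional CDF, with the density recovered by differentiation. The architecture, names, and dimensions are illustrative assumptions, not the CuFun model itself.

```python
# Toy CDF-based TPP head: monotone in tau via positive weights; the density
# f(tau | history) is obtained as dF/dtau through autograd. Illustrative only.
import torch
import torch.nn as nn
import torch.nn.functional as F


class CDFHead(nn.Module):
    def __init__(self, hist_dim: int = 32, hidden: int = 32):
        super().__init__()
        self.hist_proj = nn.Linear(hist_dim, hidden)
        # Softplus keeps the weights applied to tau positive, so the output is
        # monotonically non-decreasing in tau, as a CDF must be.
        self.w_tau = nn.Parameter(torch.randn(hidden))
        self.out = nn.Parameter(torch.randn(hidden))

    def forward(self, tau, hist):
        # tau: (B, 1) candidate inter-arrival time; hist: (B, hist_dim) history summary.
        z = torch.tanh(F.softplus(self.w_tau) * tau + self.hist_proj(hist))
        logits = z @ F.softplus(self.out)
        # Note: this toy head is monotone but not normalised to hit exactly 0
        # at tau = 0 and 1 as tau grows; a full model would enforce that.
        return torch.sigmoid(logits)


head = CDFHead()
tau = torch.tensor([[0.5], [1.5]], requires_grad=True)
cdf = head(tau, torch.randn(2, 32))
# The conditional density is the derivative of the CDF with respect to tau.
density, = torch.autograd.grad(cdf.sum(), tau)
print(cdf.detach(), density)
```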
This list is automatically generated from the titles and abstracts of the papers on this site.