Long Horizon Forecasting With Temporal Point Processes
- URL: http://arxiv.org/abs/2101.02815v2
- Date: Sun, 7 Mar 2021 16:24:37 GMT
- Title: Long Horizon Forecasting With Temporal Point Processes
- Authors: Prathamesh Deshpande, Kamlesh Marathe, Abir De, Sunita Sarawagi
- Abstract summary: Marked temporal point processes (MTPPs) have emerged as a powerful modeling machinery to characterize asynchronous events.
In this paper, we develop DualTPP which is specifically well-suited to long horizon event forecasting.
Experiments with a diverse set of real datasets show that DualTPP outperforms existing MTPP methods on long horizon forecasting by substantial margins.
- Score: 22.896152113905856
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: In recent years, marked temporal point processes (MTPPs) have emerged as a
powerful modeling machinery to characterize asynchronous events in a wide
variety of applications. MTPPs have demonstrated significant potential in
predicting event timings, especially for events arriving in the near future.
However, due to current design choices, MTPPs often show poor predictive
performance when forecasting event arrivals in the distant future. To ameliorate this
limitation, in this paper, we design DualTPP which is specifically well-suited
to long horizon event forecasting. DualTPP has two components. The first
component is an intensity free MTPP model, which captures microscopic or
granular level signals of the event dynamics by modeling the time of future
events. The second component takes a different dual perspective of modeling
aggregated counts of events in a given time-window, thus encapsulating
macroscopic event dynamics. Then we develop a novel inference framework jointly
over the two models for efficiently forecasting long horizon events by
solving a sequence of constrained quadratic optimization problems. Experiments
with a diverse set of real datasets show that DualTPP outperforms existing MTPP
methods on long horizon forecasting by substantial margins, achieving almost an
order of magnitude reduction in Wasserstein distance between actual events and
forecasts.
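The joint inference step can be illustrated with a deliberately simplified sketch. DualTPP's actual formulation solves a sequence of constrained quadratic optimization problems over event times; the toy version below only reconciles the event-level model's per-window counts with the count model's predicted total, via the quadratic program min ||c − ĉ||² subject to Σc = N, whose closed-form solution shifts every window equally. The function name and the equal-shift objective are illustrative assumptions, not the paper's method.

```python
import numpy as np

def reconcile_counts(mtpp_counts, total_count):
    """Project per-window counts from an event-level (MTPP) model onto
    the count model's constraint that they sum to total_count.

    Solves  min ||c - c_hat||^2  s.t.  sum(c) = N.
    With a single equality constraint, the Lagrangian solution shifts
    every window by the same amount: c_i = c_hat_i + (N - sum(c_hat)) / K.
    NOTE: this is an illustrative simplification; DualTPP's inference
    operates on event times with additional constraints.
    """
    c_hat = np.asarray(mtpp_counts, dtype=float)
    shift = (total_count - c_hat.sum()) / len(c_hat)
    return c_hat + shift

# Example: the event-level model expects 12 events over 4 windows,
# while the count model predicts 16 events over the whole horizon.
adjusted = reconcile_counts([2.0, 4.0, 3.0, 3.0], 16.0)
print(adjusted)        # [3. 5. 4. 4.]
print(adjusted.sum())  # 16.0
```

The same shape of reasoning — trust the macroscopic count model for totals, the microscopic event model for relative structure — underlies the full constrained-optimization inference described in the abstract.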
Related papers
- Timer-XL: Long-Context Transformers for Unified Time Series Forecasting [67.83502953961505]
We present Timer-XL, a generative Transformer for unified time series forecasting.
Timer-XL achieves state-of-the-art performance across challenging forecasting benchmarks through a unified approach.
arXiv Detail & Related papers (2024-10-07T07:27:39Z)
- DeTPP: Leveraging Object Detection for Robust Long-Horizon Event Prediction [1.534667887016089]
We introduce DeTPP, a novel approach inspired by object detection techniques from computer vision.
DeTPP employs a unique matching-based loss function that selectively prioritizes reliably predictable events.
The proposed hybrid approach enhances the accuracy of next event prediction by up to 2.7% on a large transactional dataset.
arXiv Detail & Related papers (2024-08-23T14:57:46Z)
- HoTPP Benchmark: Are We Good at the Long Horizon Events Forecasting? [1.3654846342364308]
Accurately forecasting multiple future events within a given time horizon is crucial for finance, retail, social networks, and healthcare applications.
We propose a novel evaluation method inspired by object detection techniques from computer vision.
To support further research, we release HoTPP, the first benchmark designed explicitly for evaluating long-horizon MTPP predictions.
arXiv Detail & Related papers (2024-06-20T14:09:00Z)
- Adapting to Length Shift: FlexiLength Network for Trajectory Prediction [53.637837706712794]
Trajectory prediction plays an important role in various applications, including autonomous driving, robotics, and scene understanding.
Existing approaches mainly focus on developing compact neural networks to increase prediction precision on public datasets, typically employing a standardized input duration.
We introduce a general and effective framework, the FlexiLength Network (FLN), to enhance the robustness of existing trajectory prediction against varying observation periods.
arXiv Detail & Related papers (2024-03-31T17:18:57Z)
- Cumulative Distribution Function based General Temporal Point Processes [49.758080415846884]
The CuFun model represents a novel approach to TPPs that revolves around the Cumulative Distribution Function (CDF).
Our approach addresses several critical issues inherent in traditional TPP modeling.
Our contributions encompass the introduction of a pioneering CDF-based TPP model, the development of a methodology for incorporating past event information into future event prediction.
arXiv Detail & Related papers (2024-02-01T07:21:30Z)
- Interacting Diffusion Processes for Event Sequence Forecasting [20.380620709345898]
We introduce a novel approach that incorporates a diffusion generative model.
The model facilitates sequence-to-sequence prediction, allowing multi-step predictions based on historical event sequences.
We demonstrate that our proposal outperforms state-of-the-art baselines for long-horizon forecasting of TPPs.
arXiv Detail & Related papers (2023-10-26T22:17:25Z)
- CEP3: Community Event Prediction with Neural Point Process on Graph [59.434777403325604]
We propose a novel model combining Graph Neural Networks and Marked Temporal Point Processes (MTPP).
Our experiments demonstrate the superior performance of our model in terms of both model accuracy and training efficiency.
arXiv Detail & Related papers (2022-05-21T15:30:25Z)
- Joint Forecasting of Panoptic Segmentations with Difference Attention [72.03470153917189]
We study a new panoptic segmentation forecasting model that jointly forecasts all object instances in a scene.
We evaluate the proposed model on the Cityscapes and AIODrive datasets.
arXiv Detail & Related papers (2022-04-14T17:59:32Z)
- MUSE-VAE: Multi-Scale VAE for Environment-Aware Long Term Trajectory Prediction [28.438787700968703]
Conditional MUSE offers diverse and simultaneously more accurate predictions compared to the current state-of-the-art.
We demonstrate these assertions through a comprehensive set of experiments on nuScenes and SDD benchmarks as well as PFSD, a new synthetic dataset.
arXiv Detail & Related papers (2022-01-18T18:40:03Z)
- Transformer Hawkes Process [79.16290557505211]
We propose a Transformer Hawkes Process (THP) model, which leverages the self-attention mechanism to capture long-term dependencies.
THP outperforms existing models in terms of both likelihood and event prediction accuracy by a notable margin.
We provide a concrete example, where THP achieves improved prediction performance for learning multiple point processes when incorporating their relational information.
arXiv Detail & Related papers (2020-02-21T13:48:13Z)