Speculative Sampling for Parametric Temporal Point Processes
- URL: http://arxiv.org/abs/2510.20031v1
- Date: Wed, 22 Oct 2025 21:20:26 GMT
- Title: Speculative Sampling for Parametric Temporal Point Processes
- Authors: Marin Biloš, Anderson Schneider, Yuriy Nevmyvaka
- Abstract summary: Temporal point processes are powerful generative models for event sequences. They are commonly specified using autoregressive models that learn the distribution of the next event from the previous events. We propose a novel algorithm based on rejection sampling that enables exact sampling of multiple future values from existing TPP models.
- Score: 9.15731236208975
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Temporal point processes are powerful generative models for event sequences that capture complex dependencies in time-series data. They are commonly specified using autoregressive models that learn the distribution of the next event from the previous events. This makes sampling inherently sequential, limiting efficiency. In this paper, we propose a novel algorithm based on rejection sampling that enables exact sampling of multiple future values from existing TPP models, in parallel, and without requiring any architectural changes or retraining. Besides theoretical guarantees, our method demonstrates empirical speedups on real-world datasets, bridging the gap between expressive modeling and efficient parallel generation for large-scale TPP applications.
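The abstract does not spell out the algorithm, but the accept/reject rule of speculative decoding transfers naturally to TPPs: a cheap draft model proposes several future inter-event times sequentially, and the expensive target model then scores them (in one parallel batch) and accepts or rejects each. A minimal sketch, assuming hypothetical `draft_sample`, `draft_density`, and `target_density` callables for the conditional next-inter-event-time distribution; none of these names come from the paper:

```python
import numpy as np

def speculative_tpp_step(history, draft_sample, draft_density,
                         target_density, num_draft=4, rng=None):
    """Propose num_draft inter-event times with a cheap draft model, then
    accept each with probability min(1, p_target / p_draft), mirroring the
    speculative-decoding acceptance rule. Assumes the draft density is
    positive wherever the target density is (the standard rejection-sampling
    support condition)."""
    rng = np.random.default_rng() if rng is None else rng

    # Draft phase: sequential but cheap.
    hist, proposals = list(history), []
    for _ in range(num_draft):
        tau = draft_sample(hist, rng)
        proposals.append(tau)
        hist.append(tau)

    # Verification phase: the target densities of all proposals could be
    # evaluated in one parallel batch; shown as a loop for clarity.
    hist, accepted = list(history), []
    for tau in proposals:
        ratio = target_density(hist, tau) / draft_density(hist, tau)
        if rng.uniform() < min(1.0, ratio):
            accepted.append(tau)
            hist.append(tau)
        else:
            break  # first rejection invalidates the remaining drafts
    return accepted
```

In speculative decoding, the first rejected draft is replaced by a sample from the residual distribution so that the overall scheme stays exact; that correction step is omitted here, so this sketch illustrates only the acceptance test, not the paper's full exact procedure.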
Related papers
- Edit-Based Flow Matching for Temporal Point Processes [51.33476564706644]
Temporal point processes (TPPs) are a fundamental tool for modeling event sequences in continuous time. Recent non-autoregressive, diffusion-style models mitigate the limitations of sequential generation by jointly interpolating between noise and data. We introduce an Edit Flow process for TPPs that transports noise to data via insert, delete, and substitute edit operations.
arXiv Detail & Related papers (2025-10-07T15:44:12Z)
- TIMED: Adversarial and Autoregressive Refinement of Diffusion-Based Time Series Generation [0.31498833540989407]
TIMED is a unified generative framework that captures global structure via a forward-reverse diffusion process. To further align the real and synthetic distributions in feature space, TIMED incorporates a Maximum Mean Discrepancy (MMD) loss. We show that TIMED generates more realistic and temporally coherent sequences than state-of-the-art generative models.
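As an illustration of the MMD term only (the kernel, bandwidth, and feature shapes below are assumptions, not TIMED's actual configuration), a standard biased RBF-kernel MMD estimator looks like this:

```python
import numpy as np

def rbf_mmd2(x, y, bandwidth=1.0):
    """Biased estimator of squared MMD between two samples x, y of shape
    (n, d) and (m, d), under an RBF kernel with the given bandwidth."""
    def gram(a, b):
        # Pairwise squared distances -> RBF kernel Gram matrix.
        sq_dists = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
        return np.exp(-sq_dists / (2.0 * bandwidth ** 2))
    return gram(x, x).mean() + gram(y, y).mean() - 2.0 * gram(x, y).mean()
```

Minimizing this quantity over batches of real and synthetic features pulls the two distributions together in the kernel's feature space.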
arXiv Detail & Related papers (2025-09-23T23:05:40Z)
- Point processes with event time uncertainty [16.64005584511643]
We introduce a framework to model time-uncertain point processes, possibly on a network.
We experimentally show that the proposed approach outperforms previous generalized linear model (GLM) baselines on simulated and real data.
arXiv Detail & Related papers (2024-11-05T00:46:09Z)
- EventFlow: Forecasting Temporal Point Processes with Flow Matching [12.976042923229466]
In machine learning it is common to model temporal point processes in an autoregressive fashion using a neural network. We propose EventFlow, a non-autoregressive generative model for temporal point processes.
arXiv Detail & Related papers (2024-10-09T20:57:00Z)
- Cumulative Distribution Function based General Temporal Point Processes [49.758080415846884]
The CuFun model represents a novel approach to TPPs that revolves around the Cumulative Distribution Function (CDF).
Our approach addresses several critical issues inherent in traditional TPP modeling.
Our contributions encompass the introduction of a pioneering CDF-based TPP model and the development of a methodology for incorporating past event information into future event prediction.
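The abstract does not give CuFun's parametrization, but sampling from any model that exposes a strictly increasing inter-event-time CDF reduces to inverse transform sampling. A generic sketch using bisection; the exponential CDF in the example is a placeholder, not the paper's model:

```python
import numpy as np

def sample_from_cdf(cdf, n, lo=0.0, hi=1e3, tol=1e-8, rng=None):
    """Inverse transform sampling: draw u ~ U(0, 1) and solve cdf(tau) = u
    by bisection. Requires cdf to be increasing on [lo, hi]."""
    rng = np.random.default_rng() if rng is None else rng
    samples = []
    for u in rng.uniform(size=n):
        a, b = lo, hi
        while b - a > tol:
            mid = 0.5 * (a + b)
            a, b = (mid, b) if cdf(mid) < u else (a, mid)
        samples.append(0.5 * (a + b))
    return np.array(samples)

# Example: inter-event times with an exponential(rate=2) CDF.
taus = sample_from_cdf(lambda t: 1.0 - np.exp(-2.0 * t), n=5)
```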
arXiv Detail & Related papers (2024-02-01T07:21:30Z)
- Continuous-time convolutions model of event sequences [46.3471121117337]
Event sequences are non-uniform and sparse, making traditional models unsuitable.
We propose COTIC, a method based on an efficient convolution neural network designed to handle the non-uniform occurrence of events over time.
COTIC outperforms existing models in predicting the next event time and type, achieving an average rank of 1.5 compared to 3.714 for the nearest competitor.
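COTIC's architecture is not detailed in the abstract, but the core operation, convolving irregularly spaced event times with a continuous kernel, can be illustrated as follows (the exponential kernel is an arbitrary choice for this sketch):

```python
import numpy as np

def continuous_conv(event_times, query_times, kernel):
    """Evaluate h(t) = sum over past events t_i < t of kernel(t - t_i),
    i.e. a causal continuous-time convolution over irregular events."""
    event_times = np.asarray(event_times)
    out = np.zeros(len(query_times))
    for j, t in enumerate(query_times):
        lags = t - event_times[event_times < t]  # only past events count
        out[j] = kernel(lags).sum()
    return out

# Example: exponentially decaying influence of three past events.
h = continuous_conv([0.2, 0.9, 1.4], np.linspace(0.0, 2.0, 5),
                    kernel=lambda s: np.exp(-s))
```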
arXiv Detail & Related papers (2023-02-13T10:34:51Z)
- FaDIn: Fast Discretized Inference for Hawkes Processes with General Parametric Kernels [82.53569355337586]
This work offers an efficient solution to temporal point process inference using general parametric kernels with finite support.
The method's effectiveness is evaluated by modeling the occurrence of stimuli-induced patterns from brain signals recorded with magnetoencephalography (MEG).
Results show that the proposed approach yields better estimates of pattern latency than the state-of-the-art.
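FaDIn's discretized solver is beyond an abstract-level sketch, but the object it estimates, a Hawkes intensity with a parametric finite-support kernel, is easy to write down. The truncated exponential kernel and parameter values below are assumptions for illustration:

```python
import numpy as np

def hawkes_intensity(t, events, mu=0.5, alpha=0.8, beta=2.0, support=1.0):
    """lambda(t) = mu + sum over past events t_i of
    alpha * beta * exp(-beta * (t - t_i)), with the kernel truncated to
    lags in (0, support] so that it has finite support."""
    lags = t - np.asarray(events, dtype=float)
    lags = lags[(lags > 0.0) & (lags <= support)]  # finite-support kernel
    return mu + (alpha * beta * np.exp(-beta * lags)).sum()

print(hawkes_intensity(1.2, events=[0.3, 0.7, 1.1]))
```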
arXiv Detail & Related papers (2022-10-10T12:35:02Z)
- Synergetic Learning of Heterogeneous Temporal Sequences for Multi-Horizon Probabilistic Forecasting [48.8617204809538]
We propose Variational Synergetic Multi-Horizon Network (VSMHN), a novel deep conditional generative model.
To learn complex correlations across heterogeneous sequences, a tailored encoder is devised to combine advances in deep point process models and variational recurrent neural networks.
Our model can be trained effectively using variational inference and generates predictions with Monte Carlo simulation.
arXiv Detail & Related papers (2021-01-31T11:00:55Z)
- Fast and Flexible Temporal Point Processes with Triangular Maps [24.099464487795274]
We propose a new class of non-recurrent TPP models in which both sampling and likelihood computation can be done in parallel.
We demonstrate the advantages of the proposed framework on synthetic and real-world datasets.
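A toy instance of the parallel-sampling idea: for a renewal process, the triangular map from iid uniforms to arrival times is an element-wise inverse CDF followed by a cumulative sum, and both steps vectorize. This is a deliberate simplification of the paper's general construction:

```python
import numpy as np

def sample_renewal(n, inv_cdf, rng=None):
    """Draw n arrival times at once: iid uniforms -> inter-event times via
    the inverse CDF -> arrival times via a prefix sum. No sequential
    event-by-event loop is needed."""
    rng = np.random.default_rng() if rng is None else rng
    u = rng.uniform(size=n)
    taus = inv_cdf(u)        # element-wise, embarrassingly parallel
    return np.cumsum(taus)   # prefix sum, also parallelizable

# Example: unit-rate Poisson process via the exponential inverse CDF.
arrivals = sample_renewal(10, inv_cdf=lambda u: -np.log1p(-u))
```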
arXiv Detail & Related papers (2020-06-22T21:34:10Z)
- A Multi-Channel Neural Graphical Event Model with Negative Evidence [76.51278722190607]
Event datasets are sequences of events of various types occurring irregularly over the timeline.
We propose a non-parametric deep neural network approach in order to estimate the underlying intensity functions.
arXiv Detail & Related papers (2020-02-21T23:10:50Z)
- Transformer Hawkes Process [79.16290557505211]
We propose a Transformer Hawkes Process (THP) model, which leverages the self-attention mechanism to capture long-term dependencies.
THP outperforms existing models in terms of both likelihood and event prediction accuracy by a notable margin.
We provide a concrete example, where THP achieves improved prediction performance for learning multiple point processes when incorporating their relational information.
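The summary omits the decoder; in THP-style models the conditional intensity between events is a positive function of the last attention hidden state plus a term that lets the intensity evolve in time. A hedged sketch in that spirit, with placeholder parameters rather than the paper's exact parametrization:

```python
import numpy as np

def softplus(x):
    # Numerically stable softplus: max(x, 0) + log(1 + exp(-|x|)).
    return np.maximum(x, 0.0) + np.log1p(np.exp(-np.abs(x)))

def thp_intensity(t, t_last, h_last, w, b, alpha):
    """Conditional intensity lambda(t) for t > t_last > 0: a softplus of a
    linear readout of the last hidden state h_last plus a time term that
    interpolates between events."""
    return softplus(alpha * (t - t_last) / t_last + w @ h_last + b)

# Example with a 4-dimensional hidden state.
lam = thp_intensity(t=1.5, t_last=1.0, h_last=np.ones(4),
                    w=np.full(4, 0.1), b=-0.5, alpha=0.2)
```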
arXiv Detail & Related papers (2020-02-21T13:48:13Z)