Unified Flow Matching for Long Horizon Event Forecasting
- URL: http://arxiv.org/abs/2508.04843v1
- Date: Wed, 06 Aug 2025 19:42:49 GMT
- Title: Unified Flow Matching for Long Horizon Event Forecasting
- Authors: Xiao Shou,
- Abstract summary: We propose a unified flow matching framework for marked temporal point processes. By learning continuous-time flows for both components, our method generates coherent long horizon event trajectories without sequential decoding. We evaluate our model on six real-world benchmarks and demonstrate significant improvements over autoregressive and diffusion-based baselines in both accuracy and generation efficiency.
- Score: 3.0639815065447036
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Modeling long horizon marked event sequences is a fundamental challenge in many real-world applications, including healthcare, finance, and user behavior modeling. Existing neural temporal point process models are typically autoregressive, predicting the next event one step at a time, which limits their efficiency and leads to error accumulation in long-range forecasting. In this work, we propose a unified flow matching framework for marked temporal point processes that enables non-autoregressive, joint modeling of inter-event times and event types, via continuous and discrete flow matching. By learning continuous-time flows for both components, our method generates coherent long horizon event trajectories without sequential decoding. We evaluate our model on six real-world benchmarks and demonstrate significant improvements over autoregressive and diffusion-based baselines in both accuracy and generation efficiency.
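The core mechanic behind (continuous) flow matching — regressing a velocity field along a straight-line path from noise to data, then integrating that field to sample a whole trajectory at once — can be sketched as below. This is an illustrative toy, not the paper's architecture: the data, the horizon length, and the oracle velocity stand in for a trained network v_theta(x_t, t).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "data": a batch of inter-event time vectors (horizon of 5 events),
# drawn from an exponential distribution as a stand-in for a real dataset.
x1 = rng.exponential(scale=1.0, size=(256, 5))   # data samples
x0 = rng.standard_normal(size=(256, 5))          # noise samples (prior)

# Conditional flow matching: along the straight-line interpolation
#   x_t = (1 - t) * x0 + t * x1,
# the regression target for the velocity field is simply x1 - x0.
t = rng.uniform(size=(256, 1))
x_t = (1.0 - t) * x0 + t * x1
target_velocity = x1 - x0   # what a network v_theta(x_t, t) would be trained to match

# Sampling: integrate dx/dt = v(x, t) from t=0 (noise) to t=1 (data) with a
# few Euler steps -- the whole horizon is generated jointly, with no
# event-by-event decoding. Here we use the oracle velocity for one pair
# just to show the integrator recovers the data endpoint.
def euler_sample(x, velocity_fn, n_steps=10):
    dt = 1.0 / n_steps
    for k in range(n_steps):
        x = x + dt * velocity_fn(x, k * dt)
    return x

oracle = lambda x, t: target_velocity[:1]        # constant along a straight path
x_hat = euler_sample(x0[:1], oracle)
print(np.allclose(x_hat, x1[:1]))                # straight-line ODE hits the data
```

In the paper's setting a second, discrete flow would be learned jointly for event types; this sketch covers only the continuous (inter-event time) component.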
Related papers
- FELLE: Autoregressive Speech Synthesis with Token-Wise Coarse-to-Fine Flow Matching [51.32059240975148]
FELLE is an autoregressive model that integrates language modeling with token-wise flow matching. For each continuous-valued token, FELLE modifies the general prior distribution in flow matching by incorporating information from the previous step. FELLE generates continuous-valued tokens hierarchically, conditioned on the language model's output.
arXiv Detail & Related papers (2025-02-16T13:54:32Z)
- EventFlow: Forecasting Temporal Point Processes with Flow Matching [12.976042923229466]
In machine learning, it is common to model temporal point processes autoregressively using a neural network. We propose EventFlow, a non-autoregressive generative model for temporal point processes.
arXiv Detail & Related papers (2024-10-09T20:57:00Z)
- Non-autoregressive Sequence-to-Sequence Vision-Language Models [59.445765313094434]
We propose a parallel decoding sequence-to-sequence vision-language model that marginalizes over multiple inference paths in the decoder. The model achieves performance on par with its state-of-the-art autoregressive counterpart, but is faster at inference time.
arXiv Detail & Related papers (2024-03-04T17:34:59Z)
- Non-Autoregressive Diffusion-based Temporal Point Processes for Continuous-Time Long-Term Event Prediction [8.88485011274486]
We propose a diffusion-based non-autoregressive temporal point process model for long-term event prediction in continuous time.
In order to perform diffusion processes on event sequences, we develop a bidirectional map between target event sequences and the Euclidean vector space.
Experiments demonstrate the superiority of the proposed model over state-of-the-art methods for long-term event prediction in continuous time.
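The abstract's "bidirectional map" between event sequences and Euclidean space is what lets diffusion run in an unconstrained flat space. The transform below is a hypothetical choice (log of inter-event gaps), not necessarily the paper's exact map, but it shows the key property: noise can be added freely in R^n, and the inverse always decodes to a valid, strictly increasing event sequence.

```python
import numpy as np

# Hypothetical bidirectional map: arrival times -> positive inter-event
# gaps -> log-gaps in R^n (encode), and back via exp + cumulative sum
# (decode). exp(.) > 0 guarantees decoded arrival times strictly increase.
def encode(arrival_times):
    gaps = np.diff(arrival_times, prepend=0.0)   # positive inter-event times
    return np.log(gaps)                          # unconstrained vector in R^n

def decode(z):
    return np.cumsum(np.exp(z))                  # always a valid event sequence

times = np.array([0.4, 1.1, 1.5, 3.0])
z = encode(times)
recon = decode(z)                                # exact round trip

# Diffusion-style Gaussian noise in the Euclidean space still decodes
# to a well-formed (increasing) sequence of event times.
z_noisy = z + 0.1 * np.random.default_rng(1).standard_normal(z.shape)
perturbed = decode(z_noisy)
print(np.allclose(recon, times), np.all(np.diff(perturbed) > 0))
```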
arXiv Detail & Related papers (2023-11-02T06:52:44Z)
- Interacting Diffusion Processes for Event Sequence Forecasting [20.380620709345898]
We introduce a novel approach that incorporates a diffusion generative model.
The model facilitates sequence-to-sequence prediction, allowing multi-step predictions based on historical event sequences.
We demonstrate that our proposal outperforms state-of-the-art baselines for long-horizon forecasting of temporal point processes (TPPs).
arXiv Detail & Related papers (2023-10-26T22:17:25Z)
- Continuous-time convolutions model of event sequences [46.3471121117337]
Event sequences are non-uniform and sparse, making traditional models unsuitable.
We propose COTIC, a method based on an efficient convolution neural network designed to handle the non-uniform occurrence of events over time.
COTIC outperforms existing models in predicting the next event time and type, achieving an average rank of 1.5 compared to 3.714 for the nearest competitor.
arXiv Detail & Related papers (2023-02-13T10:34:51Z)
- WaveBound: Dynamic Error Bounds for Stable Time Series Forecasting [30.692056599222926]
Time series forecasting has become a critical task due to its broad applicability in real-world settings.
Recent deep-learning-based approaches have shown remarkable success in time series forecasting.
However, deep networks still suffer from unstable training and overfitting.
arXiv Detail & Related papers (2022-10-25T19:58:02Z)
- Multi-scale Attention Flow for Probabilistic Time Series Forecasting [68.20798558048678]
We propose a novel non-autoregressive deep learning model, called Multi-scale Attention Normalizing Flow (MANF).
Our model avoids the influence of cumulative error and does not increase the time complexity.
Our model achieves state-of-the-art performance on many popular multivariate datasets.
arXiv Detail & Related papers (2022-05-16T07:53:42Z)
- Continuous Latent Process Flows [47.267251969492484]
Partial observations of continuous time-series dynamics at arbitrary time stamps exist in many disciplines. Fitting this type of data using statistical models with continuous dynamics is not only promising at an intuitive level but also has practical benefits.
We tackle these challenges with continuous latent process flows (CLPF), a principled architecture decoding continuous latent processes into continuous observable processes using a time-dependent normalizing flow driven by a differential equation.
Our ablation studies demonstrate the effectiveness of our contributions in various inference tasks on irregular time grids.
arXiv Detail & Related papers (2021-06-29T17:16:04Z)
- Synergetic Learning of Heterogeneous Temporal Sequences for Multi-Horizon Probabilistic Forecasting [48.8617204809538]
We propose Variational Synergetic Multi-Horizon Network (VSMHN), a novel deep conditional generative model.
To learn complex correlations across heterogeneous sequences, a tailored encoder is devised that combines advances in deep point process models and variational recurrent neural networks.
Our model can be trained effectively using variational inference and generates predictions with Monte-Carlo simulation.
arXiv Detail & Related papers (2021-01-31T11:00:55Z)
- A Multi-Channel Neural Graphical Event Model with Negative Evidence [76.51278722190607]
Event datasets are sequences of events of various types occurring irregularly over the timeline.
We propose a non-parametric deep neural network approach in order to estimate the underlying intensity functions.
arXiv Detail & Related papers (2020-02-21T23:10:50Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.