Flow Matching with Gaussian Process Priors for Probabilistic Time Series Forecasting
- URL: http://arxiv.org/abs/2410.03024v2
- Date: Sun, 11 May 2025 22:30:03 GMT
- Title: Flow Matching with Gaussian Process Priors for Probabilistic Time Series Forecasting
- Authors: Marcel Kollovieh, Marten Lienen, David Lüdke, Leo Schwinn, Stephan Günnemann
- Abstract summary: We introduce TSFlow, a conditional flow matching (CFM) model for time series combining Gaussian processes, optimal transport paths, and data-dependent prior distributions. We show that both conditionally and unconditionally trained models achieve competitive results across multiple forecasting benchmarks.
- Score: 43.951394031702016
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recent advancements in generative modeling, particularly diffusion models, have opened new directions for time series modeling, achieving state-of-the-art performance in forecasting and synthesis. However, the reliance of diffusion-based models on a simple, fixed prior complicates the generative process since the data and prior distributions differ significantly. We introduce TSFlow, a conditional flow matching (CFM) model for time series combining Gaussian processes, optimal transport paths, and data-dependent prior distributions. By incorporating (conditional) Gaussian processes, TSFlow aligns the prior distribution more closely with the temporal structure of the data, enhancing both unconditional and conditional generation. Furthermore, we propose conditional prior sampling to enable probabilistic forecasting with an unconditionally trained model. In our experimental evaluation on eight real-world datasets, we demonstrate the generative capabilities of TSFlow, producing high-quality unconditional samples. Finally, we show that both conditionally and unconditionally trained models achieve competitive results across multiple forecasting benchmarks.
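The core mechanism the abstract describes, conditional flow matching with optimal transport paths from a Gaussian-process prior, can be sketched as follows. This is an illustrative outline, not the authors' code; the RBF kernel, its length scale, and the batch shapes are assumptions made for the example:

```python
import numpy as np

rng = np.random.default_rng(0)
T = 64  # series length

# Zero-mean GP prior over time with an RBF kernel (illustrative length scale),
# so prior samples already carry temporal correlation, unlike white noise.
t = np.arange(T, dtype=float)[:, None]
K = np.exp(-0.5 * (t - t.T) ** 2 / 8.0**2) + 1e-6 * np.eye(T)
L = np.linalg.cholesky(K)

def sample_gp_prior(n):
    """Draw n GP prior samples of length T."""
    return (L @ rng.standard_normal((T, n))).T  # shape (n, T)

def cfm_pair(x1):
    """One optimal-transport CFM training tuple: a point on the path and its target velocity."""
    x0 = sample_gp_prior(len(x1))       # data-shaped prior sample
    s = rng.uniform(size=(len(x1), 1))  # flow time in [0, 1]
    xt = (1 - s) * x0 + s * x1          # straight-line (OT) interpolation
    v_target = x1 - x0                  # constant velocity along the path
    return x0, s, xt, v_target

x1 = rng.standard_normal((4, T))        # stand-in batch of "data" series
x0, s, xt, v = cfm_pair(x1)
# A velocity network v_theta(xt, s) would be regressed onto v with an MSE loss.
```

At sampling time, one would integrate the learned velocity field from a fresh GP prior draw at s=0 to s=1; the GP structure is what aligns the prior with the data's temporal correlations.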
Related papers
- Bridging the Last Mile of Prediction: Enhancing Time Series Forecasting with Conditional Guided Flow Matching [9.465542901469815]
Flow matching offers faster generation, higher-quality outputs, and greater flexibility. Conditional Guided Flow Matching (CGFM) extends flow matching by incorporating the outputs of an auxiliary model. For time series forecasting tasks, CGFM integrates historical data as conditions and guidance, constructs two-sided conditional probability paths, and uses a general affine path to expand the space of probability paths.
arXiv Detail & Related papers (2025-07-09T18:03:31Z) - ADiff4TPP: Asynchronous Diffusion Models for Temporal Point Processes [30.928368603673285]
This work introduces a novel approach to modeling temporal point processes using diffusion models with an asynchronous noise schedule.
We derive an objective to effectively train these models for a general family of noise schedules based on conditional flow matching.
Our method achieves the joint distribution of the latent representations of events in a sequence and state-of-the-art results in predicting both the next inter-event time and event type on benchmark datasets.
arXiv Detail & Related papers (2025-04-29T04:17:39Z) - Probabilistic Forecasting via Autoregressive Flow Matching [1.5467259918426441]
FlowTime is a generative model for probabilistic forecasting of timeseries data.
We decompose the joint distribution of future observations into a sequence of conditional densities, each modeled via a shared flow.
We demonstrate the effectiveness of FlowTime on multiple dynamical systems and real-world forecasting tasks.
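The autoregressive decomposition described above can be written in standard notation (symbols illustrative: a history of length T and a forecast horizon H) as

```latex
p(x_{T+1:T+H} \mid x_{1:T}) = \prod_{h=1}^{H} p\left(x_{T+h} \mid x_{1:T+h-1}\right),
```

with every conditional density in the product modeled by the same shared flow.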
arXiv Detail & Related papers (2025-03-13T13:54:24Z) - Generative Modeling with Bayesian Sample Inference [50.07758840675341]
We derive a novel generative model from the simple act of Gaussian posterior inference.
Treating the generated sample as an unknown variable to infer lets us formulate the sampling process in the language of Bayesian probability.
Our model uses a sequence of prediction and posterior update steps to narrow down the unknown sample from a broad initial belief.
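The narrowing-down process described above can be illustrated with a scalar conjugate-Gaussian belief update. This is a minimal sketch of the prediction/posterior-update loop, not the paper's model; the noise level, step count, and the noiseless "prediction" stand-in are illustrative assumptions:

```python
# Broad initial belief over the unknown sample (scalar for illustration).
mu, var = 0.0, 10.0
x_true = 2.0   # the sample being narrowed down (illustrative value)
obs_var = 1.0  # noise of each simulated observation of the sample

for step in range(5):
    y = x_true  # noiseless stand-in; a real model would predict from (mu, var)
    # Conjugate Gaussian update: precisions add, means are precision-weighted.
    post_prec = 1.0 / var + 1.0 / obs_var
    mu = (mu / var + y / obs_var) / post_prec
    var = 1.0 / post_prec

# The belief contracts toward x_true as prediction/update steps accumulate.
```

Each iteration shrinks the posterior variance, which mirrors how a broad initial belief over the generated sample is progressively refined.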
arXiv Detail & Related papers (2025-02-11T14:27:10Z) - FM-TS: Flow Matching for Time Series Generation [71.31148785577085]
We introduce FM-TS, a rectified Flow Matching-based framework for Time Series generation.
FM-TS is more efficient in terms of training and inference.
We have achieved superior performance in solar forecasting and MuJoCo imputation tasks.
arXiv Detail & Related papers (2024-11-12T03:03:23Z) - On conditional diffusion models for PDE simulations [53.01911265639582]
We study score-based diffusion models for forecasting and assimilation of sparse observations.
We propose an autoregressive sampling approach that significantly improves performance in forecasting.
We also propose a new training strategy for conditional score-based models that achieves stable performance over a range of history lengths.
arXiv Detail & Related papers (2024-10-21T18:31:04Z) - Leveraging Priors via Diffusion Bridge for Time Series Generation [3.2066708654182743]
Time series generation is widely used in real-world applications such as simulation, data augmentation, and hypothesis testing.
Diffusion models have emerged as the de facto approach for time series generation.
TimeBridge is a framework that enables flexible synthesis by leveraging diffusion bridges to learn the transport between chosen prior and data distributions.
arXiv Detail & Related papers (2024-08-13T06:47:59Z) - Discrete Flow Matching [74.04153927689313]
We present a novel discrete flow paradigm designed specifically for generating discrete data.
Our approach is capable of generating high-quality discrete data in a non-autoregressive fashion.
arXiv Detail & Related papers (2024-07-22T12:33:27Z) - Guided Flows for Generative Modeling and Decision Making [55.42634941614435]
We show that Guided Flows significantly improves sample quality in conditional image generation and zero-shot text-to-speech synthesis.
Notably, we are the first to apply flow models for plan generation in the offline reinforcement learning setting, achieving a substantial computational speedup compared to diffusion models.
arXiv Detail & Related papers (2023-11-22T15:07:59Z) - Predict, Refine, Synthesize: Self-Guiding Diffusion Models for
Probabilistic Time Series Forecasting [10.491628898499684]
We propose TSDiff, an unconditionally-trained diffusion model for time series.
Our proposed self-guidance mechanism enables conditioning TSDiff for downstream tasks during inference, without requiring auxiliary networks or altering the training procedure.
We demonstrate the effectiveness of our method on three different time series tasks: forecasting, refinement, and synthetic data generation.
arXiv Detail & Related papers (2023-07-21T10:56:36Z) - Practical and Asymptotically Exact Conditional Sampling in Diffusion Models [35.686996120862055]
A conditional generation method should provide exact samples for a broad range of conditional distributions without requiring task-specific training.
We introduce the Twisted Diffusion Sampler, or TDS, a sequential Monte Carlo algorithm that targets the conditional distributions of diffusion models through simulating a set of weighted particles.
On benchmark test cases, TDS allows flexible conditioning criteria and often outperforms the state of the art.
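The weighted-particle mechanics underlying a sequential Monte Carlo sampler like the one described above can be sketched generically. This illustrates only the weight-and-resample step, not the TDS twisting functions; the particle count, target, and weights are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
n_particles = 8
particles = rng.standard_normal(n_particles)  # current particle states (illustrative)

# Unnormalized log importance weights, e.g. compatibility with the
# conditioning event (here an illustrative Gaussian target centered at 1.0).
log_w = -0.5 * (particles - 1.0) ** 2
w = np.exp(log_w - log_w.max())  # subtract max for numerical stability
w /= w.sum()                     # normalize to a probability vector

# Multinomial resampling concentrates particles in high-weight regions
# while keeping the particle count fixed.
idx = rng.choice(n_particles, size=n_particles, p=w)
particles = particles[idx]
```

In the full sampler this step is interleaved with the diffusion model's denoising transitions, so the particle population tracks the conditional distribution as noise is removed.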
arXiv Detail & Related papers (2023-06-30T16:29:44Z) - User-defined Event Sampling and Uncertainty Quantification in Diffusion
Models for Physical Dynamical Systems [49.75149094527068]
We show that diffusion models can be adapted to make predictions and provide uncertainty quantification for chaotic dynamical systems.
We develop a probabilistic approximation scheme for the conditional score function which converges to the true distribution as the noise level decreases.
We are able to sample conditionally on nonlinear user-defined events at inference time, and the samples match data statistics even when drawn from the tails of the distribution.
arXiv Detail & Related papers (2023-06-13T03:42:03Z) - Bi-Noising Diffusion: Towards Conditional Diffusion Models with
Generative Restoration Priors [64.24948495708337]
We introduce a new method that brings predicted samples to the training data manifold using a pretrained unconditional diffusion model.
We perform comprehensive experiments to demonstrate the effectiveness of our approach on super-resolution, colorization, turbulence removal, and image-deraining tasks.
arXiv Detail & Related papers (2022-12-14T17:26:35Z) - GP-ConvCNP: Better Generalization for Convolutional Conditional Neural
Processes on Time Series Data [4.141867179461668]
Convolutional Conditional Neural Processes (ConvCNP) have shown remarkable improvement in performance over prior art.
We find that they sometimes struggle to generalize when applied to time series data.
In particular, they are not robust to distribution shifts and fail to extrapolate observed patterns into the future.
arXiv Detail & Related papers (2021-06-09T10:26:39Z) - Autoregressive Score Matching [113.4502004812927]
We propose autoregressive conditional score models (AR-CSM), where we parameterize the joint distribution in terms of the derivatives of univariate log-conditionals (scores).
For AR-CSM models, the divergence between data and model distributions can be computed and optimized efficiently, requiring no expensive sampling or adversarial training.
We show with extensive experimental results that it can be applied to density estimation on synthetic data, image generation, image denoising, and training latent variable models with implicit encoders.
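The univariate conditional scores referenced above are, in the usual notation (dimension index d is illustrative),

```latex
s_d(x_{\le d}) = \frac{\partial}{\partial x_d} \log p\left(x_d \mid x_{<d}\right), \qquad d = 1, \dots, D,
```

so the joint density over all D dimensions is recovered from these one-dimensional conditional scores.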
arXiv Detail & Related papers (2020-10-24T07:01:24Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences of its use.