Flow Matching with Gaussian Process Priors for Probabilistic Time Series Forecasting
- URL: http://arxiv.org/abs/2410.03024v1
- Date: Thu, 3 Oct 2024 22:12:50 GMT
- Title: Flow Matching with Gaussian Process Priors for Probabilistic Time Series Forecasting
- Authors: Marcel Kollovieh, Marten Lienen, David Lüdke, Leo Schwinn, Stephan Günnemann
- Abstract summary: We introduce TSFlow, a conditional flow matching (CFM) model for time series.
By incorporating (conditional) Gaussian processes, TSFlow aligns the prior distribution more closely with the temporal structure of the data.
We show that both conditionally and unconditionally trained models achieve competitive results in forecasting benchmarks.
- Score: 43.951394031702016
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recent advancements in generative modeling, particularly diffusion models, have opened new directions for time series modeling, achieving state-of-the-art performance in forecasting and synthesis. However, the reliance of diffusion-based models on a simple, fixed prior complicates the generative process since the data and prior distributions differ significantly. We introduce TSFlow, a conditional flow matching (CFM) model for time series that simplifies the generative problem by combining Gaussian processes, optimal transport paths, and data-dependent prior distributions. By incorporating (conditional) Gaussian processes, TSFlow aligns the prior distribution more closely with the temporal structure of the data, enhancing both unconditional and conditional generation. Furthermore, we propose conditional prior sampling to enable probabilistic forecasting with an unconditionally trained model. In our experimental evaluation on eight real-world datasets, we demonstrate the generative capabilities of TSFlow, producing high-quality unconditional samples. Finally, we show that both conditionally and unconditionally trained models achieve competitive results in forecasting benchmarks, surpassing other methods on 6 out of 8 datasets.
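The abstract describes the core construction: draw the source sample from a Gaussian process prior (rather than white noise) and connect it to a data sample along an optimal-transport (linear) path whose velocity a network would regress. The sketch below is a minimal, hypothetical illustration of that training pair, not the paper's implementation; the RBF kernel, lengthscale, and the sine "data" series are assumptions for demonstration.

```python
import numpy as np

def rbf_kernel(ts, lengthscale=2.0, var=1.0, jitter=1e-6):
    # Squared-exponential covariance over the time grid; jitter keeps it PSD.
    d = ts[:, None] - ts[None, :]
    K = var * np.exp(-0.5 * (d / lengthscale) ** 2)
    return K + jitter * np.eye(len(ts))

def sample_gp_prior(ts, rng):
    # Draw x0 ~ N(0, K): a temporally correlated prior sample,
    # closer to time series structure than i.i.d. Gaussian noise.
    L = np.linalg.cholesky(rbf_kernel(ts))
    return L @ rng.standard_normal(len(ts))

def cfm_pair(x0, x1, t):
    # Optimal-transport (linear) probability path:
    #   x_t = (1 - t) * x0 + t * x1, with target velocity u = x1 - x0.
    xt = (1.0 - t) * x0 + t * x1
    u = x1 - x0
    return xt, u

rng = np.random.default_rng(0)
ts = np.arange(24, dtype=float)        # 24 hourly time steps (illustrative)
x1 = np.sin(2.0 * np.pi * ts / 24.0)   # stand-in for one data sample
x0 = sample_gp_prior(ts, rng)          # GP prior sample instead of pure noise
xt, u = cfm_pair(x0, x1, t=0.5)
```

In training, a velocity network v_theta(x_t, t) would be regressed onto u with a mean-squared-error loss; at inference, integrating the learned velocity field from a GP prior draw yields a sample.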
Related papers
- FM-TS: Flow Matching for Time Series Generation [71.31148785577085]
We introduce FM-TS, a rectified Flow Matching-based framework for Time Series generation.
FM-TS is more efficient in terms of training and inference.
We have achieved superior performance in solar forecasting and MuJoCo imputation tasks.
arXiv Detail & Related papers (2024-11-12T03:03:23Z)
- On conditional diffusion models for PDE simulations [53.01911265639582]
We study score-based diffusion models for forecasting and assimilation of sparse observations.
We propose an autoregressive sampling approach that significantly improves performance in forecasting.
We also propose a new training strategy for conditional score-based models that achieves stable performance over a range of history lengths.
arXiv Detail & Related papers (2024-10-21T18:31:04Z)
- Leveraging Priors via Diffusion Bridge for Time Series Generation [3.2066708654182743]
Time series generation is widely used in real-world applications such as simulation, data augmentation, and hypothesis testing.
Diffusion models have emerged as the de facto approach for time series generation.
TimeBridge is a framework that enables flexible synthesis by leveraging diffusion bridges to learn the transport between chosen prior and data distributions.
arXiv Detail & Related papers (2024-08-13T06:47:59Z)
- Discrete Flow Matching [74.04153927689313]
We present a novel discrete flow paradigm designed specifically for generating discrete data.
Our approach is capable of generating high-quality discrete data in a non-autoregressive fashion.
arXiv Detail & Related papers (2024-07-22T12:33:27Z)
- Guided Flows for Generative Modeling and Decision Making [55.42634941614435]
We show that Guided Flows significantly improves the sample quality in conditional image generation and zero-shot text-to-speech synthesis.
Notably, we are first to apply flow models for plan generation in the offline reinforcement learning setting, with a speedup in computation compared to diffusion models.
arXiv Detail & Related papers (2023-11-22T15:07:59Z)
- Predict, Refine, Synthesize: Self-Guiding Diffusion Models for Probabilistic Time Series Forecasting [10.491628898499684]
We propose TSDiff, an unconditionally-trained diffusion model for time series.
Our proposed self-guidance mechanism enables conditioning TSDiff for downstream tasks during inference, without requiring auxiliary networks or altering the training procedure.
We demonstrate the effectiveness of our method on three different time series tasks: forecasting, refinement, and synthetic data generation.
arXiv Detail & Related papers (2023-07-21T10:56:36Z)
- Practical and Asymptotically Exact Conditional Sampling in Diffusion Models [35.686996120862055]
A conditional generation method should provide exact samples for a broad range of conditional distributions without requiring task-specific training.
We introduce the Twisted Diffusion Sampler, or TDS, a sequential Monte Carlo algorithm that targets the conditional distributions of diffusion models through simulating a set of weighted particles.
On benchmark test cases, TDS allows flexible conditioning criteria and often outperforms the state of the art.
arXiv Detail & Related papers (2023-06-30T16:29:44Z)
- User-defined Event Sampling and Uncertainty Quantification in Diffusion Models for Physical Dynamical Systems [49.75149094527068]
We show that diffusion models can be adapted to make predictions and provide uncertainty quantification for chaotic dynamical systems.
We develop a probabilistic approximation scheme for the conditional score function which converges to the true distribution as the noise level decreases.
We are able to sample conditionally on nonlinear user-defined events at inference time, and the resulting samples match data statistics even when sampling from the tails of the distribution.
arXiv Detail & Related papers (2023-06-13T03:42:03Z)
- GP-ConvCNP: Better Generalization for Convolutional Conditional Neural Processes on Time Series Data [4.141867179461668]
Convolutional Conditional Neural Processes (ConvCNP) have shown remarkable improvement in performance over prior art.
We find that they sometimes struggle to generalize when applied to time series data.
In particular, they are not robust to distribution shifts and fail to extrapolate observed patterns into the future.
arXiv Detail & Related papers (2021-06-09T10:26:39Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences.