Predict, Refine, Synthesize: Self-Guiding Diffusion Models for
Probabilistic Time Series Forecasting
- URL: http://arxiv.org/abs/2307.11494v3
- Date: Wed, 22 Nov 2023 12:25:41 GMT
- Title: Predict, Refine, Synthesize: Self-Guiding Diffusion Models for
Probabilistic Time Series Forecasting
- Authors: Marcel Kollovieh, Abdul Fatir Ansari, Michael Bohlke-Schneider, Jasper
Zschiegner, Hao Wang, Yuyang Wang
- Abstract summary: We propose TSDiff, an unconditionally-trained diffusion model for time series.
Our proposed self-guidance mechanism enables conditioning TSDiff for downstream tasks during inference, without requiring auxiliary networks or altering the training procedure.
We demonstrate the effectiveness of our method on three different time series tasks: forecasting, refinement, and synthetic data generation.
- Score: 10.491628898499684
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Diffusion models have achieved state-of-the-art performance in generative
modeling tasks across various domains. Prior works on time series diffusion
models have primarily focused on developing conditional models tailored to
specific forecasting or imputation tasks. In this work, we explore the
potential of task-agnostic, unconditional diffusion models for several time
series applications. We propose TSDiff, an unconditionally-trained diffusion
model for time series. Our proposed self-guidance mechanism enables
conditioning TSDiff for downstream tasks during inference, without requiring
auxiliary networks or altering the training procedure. We demonstrate the
effectiveness of our method on three different time series tasks: forecasting,
refinement, and synthetic data generation. First, we show that TSDiff is
competitive with several task-specific conditional forecasting methods
(predict). Second, we leverage the learned implicit probability density of
TSDiff to iteratively refine the predictions of base forecasters with reduced
computational overhead over reverse diffusion (refine). Notably, the generative
performance of the model remains intact -- downstream forecasters trained on
synthetic samples from TSDiff outperform forecasters that are trained on
samples from other state-of-the-art generative time series models, occasionally
even outperforming models trained on real data (synthesize).
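As a rough illustration of the self-guidance idea, the sketch below guides an unconditionally trained denoiser toward an observed context during reverse diffusion. The names `denoiser` and `alphas_bar`, and the Gaussian observation likelihood, are assumptions for this sketch, not the authors' exact implementation.
```python
import torch

@torch.no_grad()
def self_guided_sample(denoiser, y_obs, obs_mask, alphas_bar, guidance_scale=1.0):
    """Hypothetical sketch: condition an unconditional diffusion model on
    observed context at inference time via a guidance gradient.

    denoiser(x, t) is assumed to predict the noise added at step t;
    alphas_bar holds the cumulative noise-schedule products.
    """
    x = torch.randn_like(y_obs)                      # start from pure noise
    for t in reversed(range(len(alphas_bar))):
        a_bar = alphas_bar[t]
        with torch.enable_grad():
            x_in = x.detach().requires_grad_(True)
            eps = denoiser(x_in, t)
            # Denoised estimate of the clean series from the noisy sample.
            x0_hat = (x_in - (1 - a_bar).sqrt() * eps) / a_bar.sqrt()
            # Gaussian log-likelihood of the observed context under x0_hat.
            log_lik = -((x0_hat - y_obs) ** 2 * obs_mask).sum()
            grad = torch.autograd.grad(log_lik, x_in)[0]
        alpha_t = a_bar / (alphas_bar[t - 1] if t > 0 else torch.tensor(1.0))
        # DDPM-style posterior mean, shifted by the guidance gradient.
        mean = (x - (1 - alpha_t) / (1 - a_bar).sqrt() * eps) / alpha_t.sqrt()
        x = mean + guidance_scale * grad
        if t > 0:                                    # add noise except at t=0
            x = x + (1 - alpha_t).sqrt() * torch.randn_like(x)
    return x
```
At each step, the gradient of the observation log-likelihood, evaluated through the denoised estimate, nudges the sample toward consistency with the context; this is what lets one unconditional model serve forecasting, refinement, and generation.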
Related papers
- On conditional diffusion models for PDE simulations [53.01911265639582]
We study score-based diffusion models for forecasting and assimilation of sparse observations.
We propose an autoregressive sampling approach that significantly improves performance in forecasting.
We also propose a new training strategy for conditional score-based models that achieves stable performance over a range of history lengths.
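The autoregressive sampling idea can be sketched as chaining conditional reverse-diffusion calls window by window; `sample_window` below is a hypothetical stand-in for one conditional sampling pass, not the paper's API.
```python
import torch

def autoregressive_rollout(sample_window, u0, n_windows):
    """Chain conditional reverse-diffusion calls window by window.

    sample_window(cond) is a hypothetical stand-in that draws one future
    window from a score-based model conditioned on the window `cond`.
    """
    windows, cond = [u0], u0
    for _ in range(n_windows):
        cond = sample_window(cond)   # one reverse-diffusion pass per window
        windows.append(cond)
    return torch.cat(windows, dim=0)
```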
arXiv Detail & Related papers (2024-10-21T18:31:04Z)
- MG-TSD: Multi-Granularity Time Series Diffusion Models with Guided Learning Process [26.661721555671626]
We introduce the novel Multi-Granularity Time Series Diffusion (MG-TSD) model, which achieves state-of-the-art predictive performance.
Our approach does not rely on additional external data, making it versatile and applicable across various domains.
arXiv Detail & Related papers (2024-03-09T01:15:03Z)
- Towards Theoretical Understandings of Self-Consuming Generative Models [56.84592466204185]
This paper tackles the emerging challenge of training generative models within a self-consuming loop.
We construct a theoretical framework to rigorously evaluate how this training procedure impacts the data distributions learned by future models.
We present results for kernel density estimation, delivering nuanced insights such as the impact of mixed data training on error propagation.
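A tiny, self-contained illustration of such a self-consuming loop with kernel density estimation, the estimator the paper analyzes; the mixing fraction and the drift diagnostic below are our own choices, not the paper's setup.
```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
real = rng.normal(0.0, 1.0, size=2000)        # ground-truth data

data, mix = real, 0.5                          # fraction of real data per generation
for gen in range(5):
    kde = gaussian_kde(data)                   # "train" the generative model
    synthetic = kde.resample(len(data), seed=gen)[0]
    n_real = int(mix * len(data))              # next generation trains on mixed data
    data = np.concatenate([rng.choice(real, n_real),
                           synthetic[: len(data) - n_real]])
    print(f"gen {gen}: std = {data.std():.3f}")  # watch the spread drift
```
Each resampling step injects kernel noise, so training purely on self-generated data lets error accumulate; mixing in real data damps the drift, which is the regime the paper's analysis addresses.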
arXiv Detail & Related papers (2024-02-19T02:08:09Z)
- Add and Thin: Diffusion for Temporal Point Processes [24.4686728569167]
ADD-THIN is a principled probabilistic denoising diffusion model for temporal point processes (TPPs) that operates on entire event sequences.
In experiments on synthetic and real-world datasets, it matches state-of-the-art TPP models in density estimation and strongly outperforms them in forecasting.
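One plausible reading of the add/thin construction is a forward noising step that thins observed events and superposes spurious ones; the sketch below is a hypothetical illustration, with `keep_prob` and `noise_rate` standing in for the model's noise schedule.
```python
import numpy as np

def add_thin_noise(events, keep_prob, noise_rate, t_max, rng):
    """Sketch of an 'add and thin' forward noising step for a TPP.

    Each observed event survives with probability keep_prob (thinning), and
    spurious events from a homogeneous Poisson process with intensity
    noise_rate on [0, t_max] are superposed (adding).
    """
    kept = events[rng.random(len(events)) < keep_prob]
    n_noise = rng.poisson(noise_rate * t_max)
    noise = rng.uniform(0.0, t_max, size=n_noise)
    return np.sort(np.concatenate([kept, noise]))

rng = np.random.default_rng(0)
clean = np.sort(rng.uniform(0, 10, size=20))   # toy event sequence on [0, 10]
noisy = add_thin_noise(clean, keep_prob=0.6, noise_rate=1.0, t_max=10.0, rng=rng)
```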
arXiv Detail & Related papers (2023-11-02T10:42:35Z)
- Non-autoregressive Conditional Diffusion Models for Time Series Prediction [3.9722979176564763]
TimeDiff is a non-autoregressive diffusion model that achieves high-quality time series prediction.
We show that TimeDiff consistently outperforms existing time series diffusion models.
arXiv Detail & Related papers (2023-06-08T08:53:59Z)
- Diff-Instruct: A Universal Approach for Transferring Knowledge From Pre-trained Diffusion Models [77.83923746319498]
We propose a framework called Diff-Instruct to instruct the training of arbitrary generative models.
We show that Diff-Instruct results in state-of-the-art single-step diffusion-based models.
Experiments on refining GAN models show that Diff-Instruct consistently improves pre-trained GAN generators.
arXiv Detail & Related papers (2023-05-29T04:22:57Z)
- DynaConF: Dynamic Forecasting of Non-Stationary Time Series [4.286546152336783]
We propose a new method to model non-stationary conditional distributions over time.
We show that our model can adapt to non-stationary time series better than state-of-the-art deep learning solutions.
arXiv Detail & Related papers (2022-09-17T21:40:02Z)
- Closed-form Continuous-Depth Models [99.40335716948101]
Continuous-depth neural models rely on advanced numerical differential equation solvers.
We present a new family of models, termed Closed-form Continuous-depth (CfC) networks, that are simple to describe and at least one order of magnitude faster.
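The closed-form update can be sketched as a time-dependent gate between two network heads, so no ODE solver is invoked; layer sizes and nonlinearities here are illustrative, roughly following the published CfC formulation rather than reproducing it exactly.
```python
import torch
import torch.nn as nn

class CfCCell(nn.Module):
    """Illustrative closed-form continuous-depth (CfC) update: the hidden
    state after elapsed time dt is a time-dependent gate between two network
    heads, computed in closed form with no ODE solver."""
    def __init__(self, in_dim, hidden_dim):
        super().__init__()
        self.f = nn.Linear(in_dim + hidden_dim, hidden_dim)  # time-constant head
        self.g = nn.Linear(in_dim + hidden_dim, hidden_dim)
        self.h = nn.Linear(in_dim + hidden_dim, hidden_dim)

    def forward(self, x, hidden, dt):
        z = torch.cat([x, hidden], dim=-1)
        gate = torch.sigmoid(-self.f(z) * dt)   # closed-form time dependence
        return gate * torch.tanh(self.g(z)) + (1.0 - gate) * torch.tanh(self.h(z))
```
Because the gate is an explicit function of the elapsed time, evaluating the state at any dt is a single forward pass, which is where the reported order-of-magnitude speedup over solver-based models comes from.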
arXiv Detail & Related papers (2021-06-25T22:08:51Z)
- Anomaly Detection of Time Series with Smoothness-Inducing Sequential Variational Auto-Encoder [59.69303945834122]
We present a Smoothness-Inducing Sequential Variational Auto-Encoder (SISVAE) model for robust estimation and anomaly detection of time series.
Our model parameterizes mean and variance for each time-stamp with flexible neural networks.
We show the effectiveness of our model on both synthetic datasets and public real-world benchmarks.
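A minimal sketch of the per-time-stamp Gaussian emission; the squared-difference smoothness penalty is a simplified stand-in for the paper's smoothness-inducing prior, not its exact objective.
```python
import torch
import torch.nn as nn

class GaussianDecoder(nn.Module):
    """Per-time-stamp Gaussian emission: one mean and variance per step."""
    def __init__(self, latent_dim, hidden_dim=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(latent_dim, hidden_dim), nn.ReLU())
        self.mean = nn.Linear(hidden_dim, 1)
        self.log_var = nn.Linear(hidden_dim, 1)

    def forward(self, z):                       # z: (batch, time, latent_dim)
        h = self.net(z)
        return self.mean(h), self.log_var(h)

def smoothness_penalty(mean):
    """Simplified stand-in for the smoothness-inducing prior: penalize
    abrupt jumps between consecutive emission means."""
    return ((mean[:, 1:] - mean[:, :-1]) ** 2).mean()
```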
arXiv Detail & Related papers (2021-02-02T06:15:15Z)
- Learning Interpretable Deep State Space Model for Probabilistic Time Series Forecasting [98.57851612518758]
Probabilistic time series forecasting involves estimating the distribution of a time series' future given its history.
We propose a deep state space model for probabilistic time series forecasting in which the non-linear emission and transition models are parameterized by neural networks.
We show in experiments that our model produces accurate and sharp probabilistic forecasts.
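A minimal sketch of such a parameterization, with a GRU transition over latent states and a Gaussian emission head; the architecture is illustrative, not the paper's exact model.
```python
import torch
import torch.nn as nn

class DeepStateSpace(nn.Module):
    """Sketch of a deep state space model: a neural transition over latent
    states and a neural emission producing a predictive Gaussian per step."""
    def __init__(self, state_dim, obs_dim, hidden=64):
        super().__init__()
        self.transition = nn.GRUCell(obs_dim, state_dim)   # non-linear transition
        self.emission = nn.Sequential(
            nn.Linear(state_dim, hidden), nn.ReLU(), nn.Linear(hidden, 2 * obs_dim)
        )

    def forward(self, y):                       # y: (batch, time, obs_dim)
        state = y.new_zeros(y.size(0), self.transition.hidden_size)
        means, log_vars = [], []
        for t in range(y.size(1)):
            state = self.transition(y[:, t], state)
            mu, log_var = self.emission(state).chunk(2, dim=-1)
            means.append(mu)
            log_vars.append(log_var)
        return torch.stack(means, 1), torch.stack(log_vars, 1)
```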
arXiv Detail & Related papers (2021-01-31T06:49:33Z)