Autoregressive Denoising Diffusion Models for Multivariate Probabilistic
Time Series Forecasting
- URL: http://arxiv.org/abs/2101.12072v1
- Date: Thu, 28 Jan 2021 15:46:10 GMT
- Title: Autoregressive Denoising Diffusion Models for Multivariate Probabilistic
Time Series Forecasting
- Authors: Kashif Rasul, Calvin Seward, Ingmar Schuster, Roland Vollgraf
- Abstract summary: We use diffusion probabilistic models, a class of latent variable models closely connected to score matching and energy-based methods.
Our model learns gradients by optimizing a variational bound on the data likelihood and at inference time converts white noise into a sample of the distribution of interest.
- Score: 4.1573460459258245
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In this work, we propose \texttt{TimeGrad}, an autoregressive model for
multivariate probabilistic time series forecasting which samples from the data
distribution at each time step by estimating its gradient. To this end, we use
diffusion probabilistic models, a class of latent variable models closely
connected to score matching and energy-based methods. Our model learns
gradients by optimizing a variational bound on the data likelihood and at
inference time converts white noise into a sample of the distribution of
interest through a Markov chain using Langevin sampling. We demonstrate
experimentally that the proposed autoregressive denoising diffusion model is
the new state-of-the-art multivariate probabilistic forecasting method on
real-world data sets with thousands of correlated dimensions. We hope that this
method is a useful tool for practitioners and lays the foundation for future
research in this area.
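The abstract describes converting white noise into a sample of the target distribution through a reverse Markov chain. The following is a minimal sketch of that mechanism in the style of DDPM ancestral sampling, not the paper's actual implementation: the noise schedule, step count, and the placeholder `predict_eps` network are illustrative assumptions, and TimeGrad's RNN conditioning on past time steps is omitted for brevity.

```python
import numpy as np

T = 50  # number of diffusion steps (hypothetical choice)
betas = np.linspace(1e-4, 0.1, T)  # illustrative linear noise schedule
alphas = 1.0 - betas
alpha_bars = np.cumprod(alphas)

def predict_eps(x, t):
    # Placeholder for the learned noise-prediction network eps_theta(x, t);
    # in TimeGrad this would also condition on the RNN hidden state.
    return np.zeros_like(x)

def sample(dim, rng):
    """Convert white noise into a sample via the reverse Markov chain."""
    x = rng.standard_normal(dim)  # start from N(0, I) white noise
    for t in reversed(range(T)):
        eps = predict_eps(x, t)
        # Posterior mean of the reverse transition, given predicted noise.
        coef = betas[t] / np.sqrt(1.0 - alpha_bars[t])
        mean = (x - coef * eps) / np.sqrt(alphas[t])
        if t > 0:
            # Add scaled Gaussian noise at every step except the last.
            x = mean + np.sqrt(betas[t]) * rng.standard_normal(dim)
        else:
            x = mean
    return x

rng = np.random.default_rng(0)
x0 = sample(4, rng)  # one sample of a 4-dimensional target vector
```

Each iteration denoises the current state slightly; with a trained `predict_eps`, the final `x0` approximates a draw from the data distribution at that time step.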
Related papers
- A Simple Early Exiting Framework for Accelerated Sampling in Diffusion Models [14.859580045688487]
A practical bottleneck of diffusion models is their sampling speed.
We propose a novel framework capable of adaptively allocating compute required for the score estimation.
We show that our method could significantly improve the sampling throughput of the diffusion models without compromising image quality.
arXiv Detail & Related papers (2024-08-12T05:33:45Z)
- Stochastic Diffusion: A Diffusion Probabilistic Model for Stochastic Time Series Forecasting [8.232475807691255]
We propose a novel Diffusion (StochDiff) model which learns data-driven prior knowledge at each time step.
The learnt prior knowledge helps the model to capture complex temporal dynamics and the inherent uncertainty of the data.
arXiv Detail & Related papers (2024-06-05T00:13:38Z)
- A Geometric Perspective on Diffusion Models [57.27857591493788]
We inspect the ODE-based sampling of a popular variance-exploding SDE.
We establish a theoretical relationship between the optimal ODE-based sampling and the classic mean-shift (mode-seeking) algorithm.
arXiv Detail & Related papers (2023-05-31T15:33:16Z)
- Generative Time Series Forecasting with Diffusion, Denoise, and Disentanglement [51.55157852647306]
Time series forecasting has been a widely explored task of great importance in many applications.
It is common for real-world time series to be recorded over a short period, which leaves a large gap between the capacity of deep models and the limited, noisy data available.
We propose to address the time series forecasting problem with generative modeling and propose a bidirectional variational auto-encoder equipped with diffusion, denoise, and disentanglement.
arXiv Detail & Related papers (2023-01-08T12:20:46Z)
- Bi-Noising Diffusion: Towards Conditional Diffusion Models with Generative Restoration Priors [64.24948495708337]
We introduce a new method that brings predicted samples to the training data manifold using a pretrained unconditional diffusion model.
We perform comprehensive experiments to demonstrate the effectiveness of our approach on super-resolution, colorization, turbulence removal, and image-deraining tasks.
arXiv Detail & Related papers (2022-12-14T17:26:35Z)
- Score-based Continuous-time Discrete Diffusion Models [102.65769839899315]
We extend diffusion models to discrete variables by introducing a Markov jump process where the reverse process denoises via a continuous-time Markov chain.
We show that an unbiased estimator can be obtained by simply matching the conditional marginal distributions.
We demonstrate the effectiveness of the proposed method on a set of synthetic and real-world music and image benchmarks.
arXiv Detail & Related papers (2022-11-30T05:33:29Z)
- Modeling Temporal Data as Continuous Functions with Stochastic Process Diffusion [2.2849153854336763]
Temporal data can be viewed as discretized measurements of an underlying function.
To build a generative model for such data we have to model the process that governs it.
We propose a solution by defining the denoising diffusion model in the function space.
arXiv Detail & Related papers (2022-11-04T17:02:01Z)
- Wasserstein multivariate auto-regressive models for modeling distributional time series [0.0]
We propose a new auto-regressive model for the statistical analysis of multivariate distributional time series.
Results on the existence, uniqueness and stationarity of the solution of such a model are provided.
To shed some light on the benefits of our approach for real data analysis, we also apply this methodology to a data set made of observations from age distribution in different countries.
arXiv Detail & Related papers (2022-07-12T10:18:36Z)
- Multi-scale Attention Flow for Probabilistic Time Series Forecasting [68.20798558048678]
We propose a novel non-autoregressive deep learning model, called Multi-scale Attention Normalizing Flow (MANF).
Our model avoids the influence of cumulative error and does not increase the time complexity.
Our model achieves state-of-the-art performance on many popular multivariate datasets.
arXiv Detail & Related papers (2022-05-16T07:53:42Z)
- Oops I Took A Gradient: Scalable Sampling for Discrete Distributions [53.3142984019796]
We show that this approach outperforms generic samplers in a number of difficult settings.
We also demonstrate the use of our improved sampler for training deep energy-based models on high dimensional discrete data.
arXiv Detail & Related papers (2021-02-08T20:08:50Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.