Non-autoregressive Conditional Diffusion Models for Time Series Prediction
- URL: http://arxiv.org/abs/2306.05043v1
- Date: Thu, 8 Jun 2023 08:53:59 GMT
- Title: Non-autoregressive Conditional Diffusion Models for Time Series Prediction
- Authors: Lifeng Shen, James Kwok
- Abstract summary: TimeDiff is a non-autoregressive diffusion model that achieves high-quality time series prediction.
We show that TimeDiff consistently outperforms existing time series diffusion models.
- Score: 3.9722979176564763
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Recently, denoising diffusion models have led to significant breakthroughs
in the generation of images, audio and text. However, it remains an open question
how to adapt their strong modeling ability to time series. In this paper, we
propose TimeDiff, a non-autoregressive diffusion model that achieves high-quality
time series prediction by introducing two novel conditioning mechanisms: future
mixup and autoregressive initialization. Similar to teacher forcing, future mixup
mixes parts of the ground-truth future values into the conditioning signal during
training, while autoregressive initialization initializes the model with basic
time series patterns such as short-term trends. Extensive experiments are performed
on nine real-world datasets. Results show that TimeDiff consistently outperforms
existing time series diffusion models, and also achieves the best overall
performance against a variety of strong existing baselines (including transformers
and FiLM).
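The future mixup mechanism is concrete enough to sketch. Below is a minimal PyTorch illustration, assuming the conditioning signal is built by element-wise blending of a history-based projection with the ground-truth future during training; tensor shapes, the mask distribution, and all names are assumptions of this sketch rather than the paper's exact implementation.

```python
import torch

def future_mixup(cond_from_past: torch.Tensor,
                 future_truth: torch.Tensor,
                 training: bool = True) -> torch.Tensor:
    # cond_from_past: observed history mapped onto the horizon, shape (B, H, D)
    # future_truth:   ground-truth future values, same shape
    if not training:
        # At inference no ground truth exists; condition on the past only.
        return cond_from_past
    # Element-wise mixing coefficients in [0, 1); the exact sampling
    # scheme is an assumption of this sketch.
    m = torch.rand_like(cond_from_past)
    return m * cond_from_past + (1.0 - m) * future_truth
```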
Related papers
- Energy-Based Diffusion Language Models for Text Generation [126.23425882687195]
Energy-based Diffusion Language Model (EDLM) is an energy-based model operating at the full sequence level for each diffusion step.
Our framework offers a 1.3× sampling speedup over existing diffusion models.
arXiv Detail & Related papers (2024-10-28T17:25:56Z)
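The EDLM summary gives little detail on how the sequence-level energy is used. One common pattern, sketched here purely as an assumption, is importance resampling: draw several candidates from the diffusion model's proposal and resample them according to their energies.

```python
import torch

def energy_reweighted_step(candidates: torch.Tensor, energy_fn) -> torch.Tensor:
    # candidates: K candidate denoised sequences, shape (K, L)
    # energy_fn:  scores a full sequence; lower energy = more plausible
    energies = torch.stack([energy_fn(c) for c in candidates])  # (K,)
    # Self-normalized importance weights from the sequence-level energies.
    weights = torch.softmax(-energies, dim=0)
    # Resample one candidate in proportion to its weight.
    idx = torch.multinomial(weights, num_samples=1).item()
    return candidates[idx]
```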
- Retrieval-Augmented Diffusion Models for Time Series Forecasting [19.251274915003265]
We propose a Retrieval-Augmented Time series Diffusion model (RATD).
RATD consists of two parts: an embedding-based retrieval process and a reference-guided diffusion model.
Our approach allows leveraging meaningful samples within the database to aid in sampling, thus maximizing the utilization of datasets.
arXiv Detail & Related papers (2024-10-24T13:14:39Z)
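RATD's embedding-based retrieval half can be sketched independently of the diffusion model. The sketch below, with assumed names, shapes, and cosine similarity as the metric, returns the k nearest reference series that would then guide sampling.

```python
import numpy as np

def retrieve_references(query_embedding: np.ndarray,
                        db_embeddings: np.ndarray,
                        db_series: np.ndarray,
                        k: int = 3) -> np.ndarray:
    # db_embeddings: (N, E) embeddings of historical series in the database
    # db_series:     (N, T) the series themselves, returned as references
    q = query_embedding / np.linalg.norm(query_embedding)
    db = db_embeddings / np.linalg.norm(db_embeddings, axis=1, keepdims=True)
    sims = db @ q                      # cosine similarity to every entry
    top = np.argsort(-sims)[:k]        # indices of the k nearest neighbors
    # These references would then condition (guide) the diffusion sampler.
    return db_series[top]
```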
- On conditional diffusion models for PDE simulations [53.01911265639582]
We study score-based diffusion models for forecasting and assimilation of sparse observations.
We propose an autoregressive sampling approach that significantly improves performance in forecasting.
We also propose a new training strategy for conditional score-based models that achieves stable performance over a range of history lengths.
arXiv Detail & Related papers (2024-10-21T18:31:04Z)
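The autoregressive sampling idea in the entry above amounts to rolling a conditional diffusion sampler forward window by window, feeding each generated window back into the context. A minimal sketch, assuming a given `sample_window` sampler:

```python
import numpy as np

def autoregressive_rollout(sample_window, history: np.ndarray,
                           n_windows: int) -> np.ndarray:
    # sample_window(context) -> (W, dim): one call to a conditional
    # diffusion sampler (assumed given). Each new window is conditioned
    # on the full trajectory produced so far.
    trajectory = history.copy()
    for _ in range(n_windows):
        window = sample_window(trajectory)
        trajectory = np.concatenate([trajectory, window], axis=0)
    return trajectory[len(history):]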
- FDF: Flexible Decoupled Framework for Time Series Forecasting with Conditional Denoising and Polynomial Modeling [5.770377200028654]
Time series forecasting is vital in numerous web applications, influencing critical decision-making across industries.
We argue that diffusion models suffer from a significant drawback: indiscriminate noise addition to the original time series followed by denoising.
We propose a novel flexible decoupled framework that learns high-quality time series representations for enhanced forecasting performance.
arXiv Detail & Related papers (2024-10-17T06:20:43Z)
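The polynomial-modeling half of FDF's decoupling can be illustrated directly: fit a low-order polynomial trend, leave the residual to the conditional denoising model, and extrapolate the trend at forecast time. The degree and all names below are assumptions of this sketch.

```python
import numpy as np

def decouple_trend(series: np.ndarray, degree: int = 2):
    # Fit a low-order polynomial trend; the residual is what a conditional
    # denoising model would learn in a decoupled framework.
    t = np.arange(len(series))
    coeffs = np.polyfit(t, series, deg=degree)
    trend = np.polyval(coeffs, t)
    return trend, series - trend, coeffs

def extrapolate_trend(coeffs: np.ndarray, start: int, horizon: int):
    # Forecast = extrapolated trend + model-generated residual.
    return np.polyval(coeffs, np.arange(start, start + horizon))
```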
- Stochastic Diffusion: A Diffusion Probabilistic Model for Stochastic Time Series Forecasting [8.232475807691255]
We propose a novel Stochastic Diffusion (StochDiff) model that learns data-driven prior knowledge at each time step.
The learnt prior knowledge helps the model to capture complex temporal dynamics and the inherent uncertainty of the data.
arXiv Detail & Related papers (2024-06-05T00:13:38Z)
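A data-driven prior learned at each time step could, for example, be a recurrent network emitting per-step Gaussian parameters for the diffusion latent. The sketch below is one such reading of the summary, not StochDiff's actual architecture.

```python
import torch
import torch.nn as nn

class StepwisePrior(nn.Module):
    # Emits a Gaussian prior (mean, scale) per time step; a diffusion model
    # could start denoising from, or be conditioned on, this data-driven
    # prior instead of a fixed N(0, I).
    def __init__(self, dim: int, hidden: int = 64):
        super().__init__()
        self.rnn = nn.GRU(dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 2 * dim)

    def forward(self, x: torch.Tensor):
        # x: (B, T, dim) observed sequence
        h, _ = self.rnn(x)
        mean, log_scale = self.head(h).chunk(2, dim=-1)
        return mean, log_scale.exp()
```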
- Timer: Generative Pre-trained Transformers Are Large Time Series Models [83.03091523806668]
This paper aims at the early development of large time series models (LTSM).
During pre-training, we curate large-scale datasets with up to 1 billion time points.
To meet diverse application needs, we convert forecasting, imputation, and anomaly detection of time series into a unified generative task.
arXiv Detail & Related papers (2024-02-04T06:55:55Z)
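Casting forecasting, imputation, and anomaly detection as one generative task presupposes a common token format. A minimal sketch, assuming fixed-length segments serve as tokens (the segment length and names are assumptions, not Timer's exact scheme):

```python
import numpy as np

def to_segment_tokens(series: np.ndarray, seg_len: int) -> np.ndarray:
    # Split a univariate series into consecutive fixed-length segments;
    # each segment plays the role of one "token" for a GPT-style model.
    n = (len(series) // seg_len) * seg_len
    return series[:n].reshape(-1, seg_len)

# Forecasting then becomes next-token generation; imputation and anomaly
# detection can be phrased the same way by masking segments and asking
# the model to regenerate them.
```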
- Lag-Llama: Towards Foundation Models for Probabilistic Time Series Forecasting [54.04430089029033]
We present Lag-Llama, a general-purpose foundation model for time series forecasting based on a decoder-only transformer architecture.
Lag-Llama is pretrained on a large corpus of diverse time series data from several domains, and demonstrates strong zero-shot generalization capabilities.
When fine-tuned on relatively small fractions of such previously unseen datasets, Lag-Llama achieves state-of-the-art performance.
arXiv Detail & Related papers (2023-10-12T12:29:32Z)
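As the name suggests, Lag-Llama feeds lagged values of the series to a decoder-only transformer as covariates. A minimal sketch of such lag-feature construction, with an assumed lag set:

```python
import numpy as np

def lag_features(series: np.ndarray, lags=(1, 7, 28)) -> np.ndarray:
    # Build lagged copies of the series as covariates; the specific lag
    # set (e.g. daily/weekly/monthly) is an assumption of this sketch.
    max_lag = max(lags)
    rows = [[series[t - lag] for lag in lags]
            for t in range(max_lag, len(series))]
    return np.asarray(rows)  # shape: (len(series) - max_lag, len(lags))
```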
- Predict, Refine, Synthesize: Self-Guiding Diffusion Models for Probabilistic Time Series Forecasting [10.491628898499684]
We propose TSDiff, an unconditionally-trained diffusion model for time series.
Our proposed self-guidance mechanism enables conditioning TSDiff for downstream tasks during inference, without requiring auxiliary networks or altering the training procedure.
We demonstrate the effectiveness of our method on three different time series tasks: forecasting, refinement, and synthetic data generation.
arXiv Detail & Related papers (2023-07-21T10:56:36Z)
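Conditioning an unconditionally trained diffusion model at inference is often done by steering each reverse step with the gradient of an observation-consistency loss, in the style of classifier guidance. The sketch below is an assumed reading of TSDiff's self-guidance, with all names hypothetical.

```python
import torch

def self_guided_step(x_t: torch.Tensor, denoise_step,
                     observed: torch.Tensor, mask: torch.Tensor,
                     scale: float = 1.0) -> torch.Tensor:
    # One reverse-diffusion step from an *unconditionally* trained model,
    # steered toward agreement with observed values (mask == 1).
    # denoise_step(x_t) -> x_{t-1} is assumed given and differentiable.
    x_t = x_t.detach().requires_grad_(True)
    x_prev = denoise_step(x_t)
    loss = ((x_prev - observed) * mask).pow(2).sum()
    grad = torch.autograd.grad(loss, x_t)[0]
    return (x_prev - scale * grad).detach()
```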
- Generative Time Series Forecasting with Diffusion, Denoise, and Disentanglement [51.55157852647306]
Time series forecasting has been a widely explored task of great importance in many applications.
It is common that real-world time series data are recorded in a short time period, which results in a big gap between the deep model and the limited and noisy time series.
We propose to address the time series forecasting problem with generative modeling and propose a bidirectional variational auto-encoder equipped with diffusion, denoise, and disentanglement.
arXiv Detail & Related papers (2023-01-08T12:20:46Z)
- DynaConF: Dynamic Forecasting of Non-Stationary Time Series [4.286546152336783]
We propose a new method to model non-stationary conditional distributions over time.
We show that our model can adapt to non-stationary time series better than state-of-the-art deep learning solutions.
arXiv Detail & Related papers (2022-09-17T21:40:02Z)
- Multi-scale Attention Flow for Probabilistic Time Series Forecasting [68.20798558048678]
We propose a novel non-autoregressive deep learning model called Multi-scale Attention Normalizing Flow (MANF).
Our model avoids the influence of cumulative error and does not increase the time complexity.
Our model achieves state-of-the-art performance on many popular multivariate datasets.
arXiv Detail & Related papers (2022-05-16T07:53:42Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this list (including all information) and is not responsible for any consequences of its use.