Towards Better Long-range Time Series Forecasting using Generative
Adversarial Networks
- URL: http://arxiv.org/abs/2110.08770v1
- Date: Sun, 17 Oct 2021 09:13:45 GMT
- Title: Towards Better Long-range Time Series Forecasting using Generative
Adversarial Networks
- Authors: Shiyu Liu, Mehul Motani
- Abstract summary: We use Conditional Wasserstein GAN (CWGAN) and augment it with an error penalty term, leading to a new generative model which aims to generate high-quality synthetic time series data.
By using such synthetic data, we develop a long-range forecasting approach, called Generative Forecasting (GenF).
- Score: 40.662116703422846
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Accurate long-range forecasting of time series data is an important problem
in many sectors, such as energy, healthcare, and finance. In recent years,
Generative Adversarial Networks (GAN) have provided a revolutionary approach to
many problems. However, the use of GAN to improve long-range time series
forecasting remains relatively unexplored. In this paper, we utilize a
Conditional Wasserstein GAN (CWGAN) and augment it with an error penalty term,
leading to a new generative model which aims to generate high-quality synthetic
time series data, called CWGAN-TS. By using such synthetic data, we develop a
long-range forecasting approach, called Generative Forecasting (GenF),
consisting of three components: (i) CWGAN-TS to generate synthetic data for the
next few time steps; (ii) a predictor which makes long-range predictions based
on generated and observed data; and (iii) an information theoretic clustering (ITC)
algorithm to better train the CWGAN-TS and the predictor. Our experimental
results on three public datasets demonstrate that GenF significantly
outperforms a diverse range of state-of-the-art benchmarks and classical
approaches. In most cases, we find a 6% - 12% improvement in predictive
performance (mean absolute error) and a 37% reduction in parameters compared to
the best performing benchmark. Lastly, we conduct an ablation study to
demonstrate the effectiveness of the CWGAN-TS and the ITC algorithm.
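To make the abstract's two mechanisms concrete, the sketch below illustrates (a) a CWGAN-style generator loss augmented with an error-penalty term and (b) GenF-style inference that first generates a few synthetic steps and then lets a predictor make the long-range forecast from observed plus synthetic data. It is a minimal illustration under stated assumptions (univariate series, MLP generator and critic, an assumed L2 error penalty with weight lambda_ep, no Lipschitz constraint on the critic, and no ITC-based training), not the authors' implementation.

```python
# Minimal GenF-style sketch (assumptions noted above), not the paper's code.
import torch
import torch.nn as nn

class Generator(nn.Module):
    """Maps a window of observed values (the condition) to the next value."""
    def __init__(self, window: int, hidden: int = 32):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(window, hidden), nn.ReLU(), nn.Linear(hidden, 1))

    def forward(self, cond: torch.Tensor) -> torch.Tensor:
        return self.net(cond)

class Critic(nn.Module):
    """Wasserstein critic scoring (condition, next value) pairs."""
    def __init__(self, window: int, hidden: int = 32):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(window + 1, hidden), nn.ReLU(), nn.Linear(hidden, 1))

    def forward(self, cond: torch.Tensor, x: torch.Tensor) -> torch.Tensor:
        return self.net(torch.cat([cond, x], dim=-1))

def generator_loss(critic, gen, cond, real_next, lambda_ep: float = 10.0):
    """CWGAN generator objective plus an (assumed L2) error-penalty term."""
    fake_next = gen(cond)
    adv = -critic(cond, fake_next).mean()        # adversarial (Wasserstein) term
    err = ((fake_next - real_next) ** 2).mean()  # error penalty on the generated step
    return adv + lambda_ep * err

def generative_forecast(gen, predictor, observed, n_synth: int):
    """Generate n_synth synthetic steps, then forecast from observed + synthetic data."""
    window = observed
    for _ in range(n_synth):
        next_val = gen(window)                                   # synthetic next step
        window = torch.cat([window[..., 1:], next_val], dim=-1)  # slide the window forward
    return predictor(window)

# Shape-level usage example:
# obs = torch.randn(8, 24); gen = Generator(24)
# predictor = nn.Linear(24, 12)   # stand-in for any sequence model
# forecast = generative_forecast(gen, predictor, obs, n_synth=4)  # -> (8, 12)
```

Any sequence model can stand in for the predictor here; the critic's training step and the ITC algorithm used to better train the CWGAN-TS and the predictor are not shown.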
Related papers
- MGCP: A Multi-Grained Correlation based Prediction Network for Multivariate Time Series [54.91026286579748]
We propose a Multi-Grained Correlations-based Prediction Network.
It simultaneously considers correlations at three levels to enhance prediction performance.
It employs adversarial training with an attention mechanism-based predictor and a conditional discriminator to optimize prediction results at the coarse-grained level.
arXiv Detail & Related papers (2024-05-30T03:32:44Z)
- Long-term drought prediction using deep neural networks based on geospatial weather data [75.38539438000072]
High-quality drought forecasting up to a year in advance is critical for agriculture planning and insurance.
We tackle drought prediction by introducing a systematic end-to-end approach.
A key finding is the exceptional performance of a Transformer model, EarthFormer, in making accurate short-term (up to six months) forecasts.
arXiv Detail & Related papers (2023-09-12T13:28:06Z)
- GAT-GAN: A Graph-Attention-based Time-Series Generative Adversarial Network [0.0]
We propose a Graph-Attention-based Generative Adversarial Network (GAT-GAN).
GAT-GAN generates long time-series data of high fidelity using an adversarially trained autoencoder architecture.
We introduce a Frechet Inception Distance (FID)-like metric for time-series data, called the Frechet Transformer Distance (FTD) score (lower is better), to evaluate the quality and variety of generated data.
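FID-style scores of this kind are generally computed as the Fréchet distance between Gaussians fitted to embeddings of real and generated samples; the minimal sketch below shows that computation, assuming an embedding function (for FTD, presumably a Transformer encoder) has already produced the two embedding matrices. The paper's exact FTD definition may differ.

```python
# Hedged sketch of an FID/FTD-style score; the embedding model is assumed.
import numpy as np
from scipy.linalg import sqrtm

def frechet_distance(real_emb: np.ndarray, fake_emb: np.ndarray) -> float:
    """||mu_r - mu_f||^2 + Tr(C_r + C_f - 2 (C_r C_f)^(1/2)); lower is better."""
    mu_r, mu_f = real_emb.mean(axis=0), fake_emb.mean(axis=0)
    cov_r = np.cov(real_emb, rowvar=False)
    cov_f = np.cov(fake_emb, rowvar=False)
    covmean = sqrtm(cov_r @ cov_f)
    if np.iscomplexobj(covmean):  # numerical noise can introduce tiny imaginary parts
        covmean = covmean.real
    return float(np.sum((mu_r - mu_f) ** 2) + np.trace(cov_r + cov_f - 2.0 * covmean))

# Usage (embeddings from some encoder `embed`, which is assumed here):
# score = frechet_distance(embed(real_series), embed(generated_series))
```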
arXiv Detail & Related papers (2023-06-03T04:23:49Z)
- Mlinear: Rethink the Linear Model for Time-series Forecasting [9.841293660201261]
Mlinear is a simple yet effective method based mainly on linear layers.
We introduce a new loss function that significantly outperforms the widely used mean squared error (MSE) on multiple datasets.
Our method significantly outperforms PatchTST, with a win ratio of 21:3 at an input sequence length of 336 and 29:10 at an input sequence length of 512.
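As a rough, assumed illustration of a forecaster "based mainly on linear layers", the sketch below maps a lookback window directly to the forecast horizon with a single linear layer applied along the time dimension; Mlinear's actual architecture and its proposed loss function are not described in the snippet above, so this is only a generic baseline of the same family.

```python
# Generic linear-layer forecaster (an assumed baseline, not Mlinear itself).
import torch
import torch.nn as nn

class LinearForecaster(nn.Module):
    """Maps a lookback window of length L to an H-step forecast, per channel."""
    def __init__(self, lookback: int, horizon: int):
        super().__init__()
        self.proj = nn.Linear(lookback, horizon)  # one linear map over the time axis

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, lookback) -> (batch, channels, horizon)
        return self.proj(x)

# Example with one of the input lengths quoted above (336); horizon 96 is an assumption:
model = LinearForecaster(lookback=336, horizon=96)
y = model(torch.randn(4, 7, 336))  # -> (4, 7, 96)
```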
arXiv Detail & Related papers (2023-05-08T15:54:18Z)
- Generative Time Series Forecasting with Diffusion, Denoise, and Disentanglement [51.55157852647306]
Time series forecasting has been a widely explored task of great importance in many applications.
Real-world time series data are often recorded over a short time period, resulting in a large gap between the capacity of deep models and the limited, noisy data available.
We propose to address the time series forecasting problem with generative modeling and propose a bidirectional variational auto-encoder equipped with diffusion, denoise, and disentanglement.
arXiv Detail & Related papers (2023-01-08T12:20:46Z)
- Towards Better Long-range Time Series Forecasting using Generative Forecasting [29.046659097553515]
We propose a new forecasting strategy called Generative Forecasting (GenF).
GenF generates synthetic data for the next few time steps and then makes long-range forecasts based on generated and observed data.
We find a 5% - 11% improvement in predictive performance (mean absolute error) while having a 15% - 50% reduction in parameters compared to the benchmarks.
arXiv Detail & Related papers (2022-12-09T13:35:39Z)
- Grouped self-attention mechanism for a memory-efficient Transformer [64.0125322353281]
Real-world tasks such as forecasting weather, electricity consumption, and stock markets involve predicting data that vary over time.
Time-series data are generally recorded over a long period of observation with long sequences owing to their periodic characteristics and long-range dependencies over time.
We propose two novel modules, Grouped Self-Attention (GSA) and Compressed Cross-Attention (CCA).
Our proposed model exhibits reduced computational complexity and performance comparable to or better than existing methods.
arXiv Detail & Related papers (2022-10-02T06:58:49Z)
- Towards Synthetic Multivariate Time Series Generation for Flare Forecasting [5.098461305284216]
One of the limiting factors in training data-driven, rare-event prediction algorithms is the scarcity of the events of interest.
In this study, we explore the usefulness of the conditional generative adversarial network (CGAN) as a means to perform data-informed oversampling.
arXiv Detail & Related papers (2021-05-16T22:23:23Z)
- Spatiotemporal Adaptive Neural Network for Long-term Forecasting of Financial Time Series [0.2793095554369281]
We investigate whether deep neural networks (DNNs) can be used to forecast multiple time series (TS) conjointly.
We make use of the dynamic factor graph (DFG) to build a multivariate autoregressive model.
With ACTM, it is possible to vary the autoregressive order of a TS model over time and model a larger set of probability distributions.
arXiv Detail & Related papers (2020-03-27T00:53:11Z)
- Transformer Hawkes Process [79.16290557505211]
We propose a Transformer Hawkes Process (THP) model, which leverages the self-attention mechanism to capture long-term dependencies.
THP outperforms existing models in terms of both likelihood and event prediction accuracy by a notable margin.
We provide a concrete example, where THP achieves improved prediction performance for learning multiple point processes when incorporating their relational information.
arXiv Detail & Related papers (2020-02-21T13:48:13Z)