Transformer-based conditional generative adversarial network for
multivariate time series generation
- URL: http://arxiv.org/abs/2210.02089v1
- Date: Wed, 5 Oct 2022 08:29:33 GMT
- Title: Transformer-based conditional generative adversarial network for
multivariate time series generation
- Authors: Abdellah Madane, Mohamed-djallel Dilmi, Florent Forest, Hanane Azzag,
Mustapha Lebbah, Jerome Lacaille
- Abstract summary: Conditional generation of time-dependent data is a task of broad interest.
Recent work proposed a Transformer-based Time series generative adversarial network (TTS-GAN).
We extend the TTS-GAN by conditioning its generated output on a particular encoded context.
We show that this transformer-based CGAN can generate realistic high-dimensional and long data sequences under different kinds of conditions.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Conditional generation of time-dependent data is a task of broad
interest, whether for data augmentation, scenario simulation, completing
missing data, or other purposes. Recent works proposed a Transformer-based Time
series generative adversarial network (TTS-GAN) to address the limitations of
recurrent neural networks. However, this model assumes a unimodal distribution
and tries to generate samples around the expectation of the real data
distribution. One of its limitations is that it may generate an arbitrary
multivariate time series and fail to produce samples when the overall
distribution comprises multiple sub-components. One could train a separate
model on each sub-component to overcome this limitation; instead, our work
extends TTS-GAN by conditioning its generated output on a particular encoded
context, allowing a single model to fit a mixture distribution
with multiple sub-components. Technically, it is a conditional generative
adversarial network that models realistic multivariate time series under
different types of conditions, such as categorical variables or multivariate
time series. We evaluate our model on the UniMiB dataset, which contains
smartphone acceleration data along the X, Y, and Z axes for a range of human
activities. We use qualitative evaluations and quantitative metrics such as
Principal Component Analysis (PCA), and we introduce a modified version of the
Fréchet inception distance (FID) to measure the performance of our model and
the statistical similarity between the generated and real data
distributions. We show that this transformer-based CGAN can generate realistic
high-dimensional and long data sequences under different kinds of conditions.
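To make the conditioning idea concrete, here is a minimal PyTorch sketch of a transformer generator that fuses a latent vector with an encoded categorical context before decoding a multivariate sequence. Every name and hyperparameter in it (class name, latent_dim, n_classes, seq_len, the embedding-based context encoder) is an illustrative assumption, not the authors' exact architecture; a multivariate-series condition would need its own sequence encoder in place of the embedding.

```python
import torch
import torch.nn as nn

class ConditionalTSGenerator(nn.Module):
    """Sketch of a transformer generator conditioned on an encoded context,
    in the spirit of the paper's conditional extension of TTS-GAN.
    All dimensions below are illustrative placeholders."""

    def __init__(self, latent_dim=100, n_classes=9, d_model=64,
                 seq_len=150, n_channels=3):
        super().__init__()
        self.seq_len, self.d_model = seq_len, d_model
        # Encode a categorical condition; a multivariate-series condition
        # could instead be encoded by a small transformer and pooled.
        self.cond_emb = nn.Embedding(n_classes, latent_dim)
        self.to_tokens = nn.Linear(2 * latent_dim, seq_len * d_model)
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=4,
                                           batch_first=True)
        self.backbone = nn.TransformerEncoder(layer, num_layers=3)
        self.head = nn.Linear(d_model, n_channels)  # per-step channel values

    def forward(self, z, labels):
        # Fuse the noise vector with the encoded condition, expand it into a
        # token sequence, run the transformer, project to output channels.
        h = torch.cat([z, self.cond_emb(labels)], dim=-1)
        tokens = self.to_tokens(h).view(-1, self.seq_len, self.d_model)
        return self.head(self.backbone(tokens))  # (batch, seq_len, channels)

gen = ConditionalTSGenerator()
z = torch.randn(8, 100)             # latent noise
labels = torch.randint(0, 9, (8,))  # e.g. activity classes
fake = gen(z, labels)               # -> torch.Size([8, 150, 3])
```

In a CGAN the discriminator would receive the same encoded context alongside the input sequence, so that both players learn the conditional rather than the marginal distribution.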
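The evaluation can likewise be sketched. The snippet below projects real and generated windows into a shared PCA space for qualitative comparison and computes an FID-style Fréchet distance between Gaussian fits of encoder features; substituting a pretrained time-series feature extractor for Inception-v3 is one plausible reading of the paper's "modified" FID, not its confirmed definition.

```python
import numpy as np
from scipy import linalg
from sklearn.decomposition import PCA

def pca_projection(real, fake, n_components=2):
    """Fit PCA on flattened real windows and project both sets into the
    same low-dimensional space for a qualitative scatter-plot comparison."""
    pca = PCA(n_components=n_components).fit(real.reshape(len(real), -1))
    return (pca.transform(real.reshape(len(real), -1)),
            pca.transform(fake.reshape(len(fake), -1)))

def frechet_distance(feats_real, feats_fake):
    """FID-style score: Fréchet distance between Gaussians fitted to two
    feature sets of shape (n_samples, feat_dim). For time series, the
    features would come from a pretrained series encoder, not Inception-v3."""
    mu1, mu2 = feats_real.mean(axis=0), feats_fake.mean(axis=0)
    s1 = np.cov(feats_real, rowvar=False)
    s2 = np.cov(feats_fake, rowvar=False)
    diff = mu1 - mu2
    # Matrix square root of the covariance product; keep the real part to
    # drop tiny imaginary components caused by numerical error.
    covmean = linalg.sqrtm(s1 @ s2, disp=False)[0].real
    return float(diff @ diff + np.trace(s1 + s2 - 2.0 * covmean))
```

Lower Fréchet distance indicates closer statistical similarity between the generated and real feature distributions, matching how the abstract uses the metric.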
Related papers
- Timer-XL: Long-Context Transformers for Unified Time Series Forecasting [67.83502953961505]
We present Timer-XL, a generative Transformer for unified time series forecasting.
Timer-XL achieves state-of-the-art performance across challenging forecasting benchmarks through a unified approach.
arXiv Detail & Related papers (2024-10-07T07:27:39Z)
- TimeDiT: General-purpose Diffusion Transformers for Time Series Foundation Model [11.281386703572842]
A family of models has been developed using a temporal auto-regressive generative Transformer architecture.
TimeDiT is a general foundation model for time series that employs a denoising diffusion paradigm instead of temporal auto-regressive generation.
Extensive experiments on a variety of tasks, such as forecasting, imputation, and anomaly detection, demonstrate the effectiveness of TimeDiT.
arXiv Detail & Related papers (2024-09-03T22:31:57Z)
- UniTST: Effectively Modeling Inter-Series and Intra-Series Dependencies for Multivariate Time Series Forecasting [98.12558945781693]
We propose UniTST, a transformer-based model with a unified attention mechanism over flattened patch tokens.
Although our proposed model employs a simple architecture, it offers compelling performance as shown in our experiments on several datasets for time series forecasting.
arXiv Detail & Related papers (2024-06-07T14:39:28Z)
- Unified Training of Universal Time Series Forecasting Transformers [104.56318980466742]
We present a Masked Encoder-based Universal Time Series Forecasting Transformer (Moirai).
Moirai is trained on our newly introduced Large-scale Open Time Series Archive (LOTSA) featuring over 27B observations across nine domains.
Moirai achieves competitive or superior performance as a zero-shot forecaster when compared to full-shot models.
arXiv Detail & Related papers (2024-02-04T20:00:45Z)
- Timer: Generative Pre-trained Transformers Are Large Time Series Models [83.03091523806668]
This paper aims at the early development of large time series models (LTSM).
During pre-training, we curate large-scale datasets with up to 1 billion time points.
To meet diverse application needs, we convert forecasting, imputation, and anomaly detection of time series into a unified generative task.
arXiv Detail & Related papers (2024-02-04T06:55:55Z)
- EdgeConvFormer: Dynamic Graph CNN and Transformer based Anomaly Detection in Multivariate Time Series [7.514010315664322]
We propose a novel anomaly detection method, named EdgeConvFormer, which integrates stacked Time2vec embedding, dynamic graph CNN, and Transformer to extract global and local spatio-temporal information.
Experiments demonstrate that EdgeConvFormer learns spatio-temporal structure from multivariate time series and achieves better anomaly detection performance than state-of-the-art approaches on many real-world datasets of different scales.
arXiv Detail & Related papers (2023-12-04T08:38:54Z)
- Time-series Transformer Generative Adversarial Networks [5.254093731341154]
We consider limitations posed specifically by time-series data and present a model that can generate synthetic time series.
A model that generates synthetic time-series data has two objectives: 1) to capture the stepwise conditional distribution of real sequences, and 2) to faithfully model the joint distribution of entire real sequences.
We present TsT-GAN, a framework that capitalises on the Transformer architecture to satisfy these desiderata, and compare its performance against five state-of-the-art models on five datasets.
arXiv Detail & Related papers (2022-05-23T10:04:21Z)
- Multi-scale Attention Flow for Probabilistic Time Series Forecasting [68.20798558048678]
We propose a novel non-autoregressive deep learning model, called Multi-scale Attention Normalizing Flow (MANF).
Our model avoids the influence of cumulative error and does not increase the time complexity.
Our model achieves state-of-the-art performance on many popular multivariate datasets.
arXiv Detail & Related papers (2022-05-16T07:53:42Z)
- TACTiS: Transformer-Attentional Copulas for Time Series [76.71406465526454]
Estimation of time-varying quantities is a fundamental component of decision making in fields such as healthcare and finance.
We propose a versatile method that estimates joint distributions using an attention-based decoder.
We show that our model produces state-of-the-art predictions on several real-world datasets.
arXiv Detail & Related papers (2022-02-07T21:37:29Z)
- Towards Synthetic Multivariate Time Series Generation for Flare Forecasting [5.098461305284216]
One of the limiting factors in training data-driven, rare-event prediction algorithms is the scarcity of the events of interest.
In this study, we explore the usefulness of the conditional generative adversarial network (CGAN) as a means to perform data-informed oversampling.
arXiv Detail & Related papers (2021-05-16T22:23:23Z)
- Global Models for Time Series Forecasting: A Simulation Study [2.580765958706854]
We simulate time series from simple data generating processes (DGP), such as Auto Regressive (AR) and Seasonal AR, to complex DGPs, such as Chaotic Logistic Map, Self-Exciting Threshold Auto-Regressive, and Mackey-Glass equations.
The lengths and the number of series in the dataset are varied in different scenarios.
We perform experiments on these datasets using global forecasting models including Recurrent Neural Networks (RNN), Feed-Forward Neural Networks, Pooled Regression (PR) models, and Light Gradient Boosting Models (LGBM).
arXiv Detail & Related papers (2020-12-23T04:45:52Z)