Time-series Transformer Generative Adversarial Networks
- URL: http://arxiv.org/abs/2205.11164v1
- Date: Mon, 23 May 2022 10:04:21 GMT
- Title: Time-series Transformer Generative Adversarial Networks
- Authors: Padmanaba Srinivasan, William J. Knottenbelt
- Abstract summary: We consider limitations posed specifically on time-series data and present a model that can generate synthetic time-series.
A model that generates synthetic time-series data has two objectives: 1) to capture the stepwise conditional distribution of real sequences, and 2) to faithfully model the joint distribution of entire real sequences.
We present TsT-GAN, a framework that capitalises on the Transformer architecture to satisfy the desiderata and compare its performance against five state-of-the-art models on five datasets.
- Score: 5.254093731341154
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Many real-world tasks are plagued by limitations on data: in some instances
very little data is available and in others, data is protected by privacy
enforcing regulations (e.g. GDPR). We consider limitations posed specifically
on time-series data and present a model that can generate synthetic time-series
which can be used in place of real data. A model that generates synthetic
time-series data has two objectives: 1) to capture the stepwise conditional
distribution of real sequences, and 2) to faithfully model the joint
distribution of entire real sequences. Autoregressive models trained via
maximum likelihood estimation can be used in a system where previous
predictions are fed back in and used to predict future ones; in such models,
errors can accrue over time. Furthermore, a plausible initial value is required,
making MLE-based models not truly generative. Many downstream tasks learn to
model conditional distributions of the time-series, hence, synthetic data drawn
from a generative model must satisfy 1) in addition to performing 2). We
present TsT-GAN, a framework that capitalises on the Transformer architecture
to satisfy the desiderata and compare its performance against five
state-of-the-art models on five datasets and show that TsT-GAN achieves higher
predictive performance on all datasets.
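The abstract gives no implementation details, so the sketch below is only a rough illustration of the kind of architecture involved: a Transformer-encoder generator mapping noise sequences to synthetic series, and a Transformer-encoder discriminator scoring whole sequences. All module names, layer sizes, and shapes are assumptions; the sketch covers only the joint-distribution adversarial part, not the paper's stepwise conditional component.
```python
# Hypothetical sketch of a Transformer-based time-series GAN.
# Shapes, layer sizes, and pooling are illustrative assumptions,
# not the TsT-GAN reference implementation.
import torch
import torch.nn as nn

class SeqGenerator(nn.Module):
    """Maps a noise sequence (B, T, z_dim) to a synthetic series (B, T, x_dim)."""
    def __init__(self, z_dim=16, x_dim=5, d_model=64, n_layers=2, n_heads=4):
        super().__init__()
        self.inp = nn.Linear(z_dim, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, dim_feedforward=128,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.out = nn.Linear(d_model, x_dim)

    def forward(self, z):
        return self.out(self.encoder(self.inp(z)))

class SeqDiscriminator(nn.Module):
    """Scores a whole sequence (B, T, x_dim) as real or synthetic."""
    def __init__(self, x_dim=5, d_model=64, n_layers=2, n_heads=4):
        super().__init__()
        self.inp = nn.Linear(x_dim, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, dim_feedforward=128,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.out = nn.Linear(d_model, 1)

    def forward(self, x):
        h = self.encoder(self.inp(x))        # (B, T, d_model)
        return self.out(h.mean(dim=1))       # pool over time -> (B, 1)

if __name__ == "__main__":
    G, D = SeqGenerator(), SeqDiscriminator()
    z = torch.randn(8, 24, 16)               # batch of noise sequences
    fake = G(z)                               # (8, 24, 5) synthetic series
    print(D(fake).shape)                      # torch.Size([8, 1])
```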
Related papers
- Tackling Data Heterogeneity in Federated Time Series Forecasting [61.021413959988216]
Time series forecasting plays a critical role in various real-world applications, including energy consumption prediction, disease transmission monitoring, and weather forecasting.
Most existing methods rely on a centralized training paradigm, where large amounts of data are collected from distributed devices to a central cloud server.
We propose a novel framework, Fed-TREND, to address data heterogeneity by generating informative synthetic data as auxiliary knowledge carriers.
arXiv Detail & Related papers (2024-11-24T04:56:45Z)
- Approximate Probabilistic Inference for Time-Series Data: A Robust Latent Gaussian Model With Temporal Awareness [0.40924476987095715]
We present a probabilistic generative model that can be trained to capture temporal information, and that is robust to data errors.
Our model is trained to minimize a loss based on the negative log-likelihood.
Experiments show that tDLGM is able to reconstruct and generate complex time series data, and that it is robust to noise and faulty data.
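The summary only states that training minimises a negative log-likelihood; below is a minimal sketch of a generic per-step Gaussian NLL term for a sequence model that predicts a mean and log-variance at each step. It is a stand-in, not the actual tDLGM objective.
```python
# Generic per-step Gaussian negative log-likelihood for sequences;
# shown as a stand-in, not the tDLGM loss itself.
import math
import torch

def gaussian_nll(x, mean, log_var):
    """x, mean, log_var: tensors of shape (B, T, D). Returns a scalar loss."""
    nll = 0.5 * (log_var + (x - mean) ** 2 / log_var.exp() + math.log(2 * math.pi))
    return nll.sum(dim=(1, 2)).mean()   # sum over time/features, average over batch

x = torch.randn(4, 10, 3)
mean, log_var = torch.zeros_like(x), torch.zeros_like(x)
print(gaussian_nll(x, mean, log_var))   # scalar tensor
```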
arXiv Detail & Related papers (2024-11-14T09:38:58Z)
- sTransformer: A Modular Approach for Extracting Inter-Sequential and Temporal Information for Time-Series Forecasting [6.434378359932152]
We review and categorize existing Transformer-based models into two main types: (1) modifications to the model structure and (2) modifications to the input data.
We propose sTransformer, which introduces the Sequence and Temporal Convolutional Network (STCN) to fully capture both sequential and temporal information.
We compare our model with linear models and existing forecasting models on long-term time-series forecasting, achieving new state-of-the-art results.
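The summary does not describe the STCN design; purely as a rough stand-in for convolutional temporal feature extraction, the snippet below shows a generic causal 1-D convolution block. It is not the sTransformer module.
```python
# A generic causal temporal-convolution block, shown only as a rough
# stand-in for convolutional temporal feature extraction; the actual
# STCN design in sTransformer is not described in this summary.
import torch
import torch.nn as nn

class CausalConvBlock(nn.Module):
    def __init__(self, channels=8, kernel_size=3, dilation=1):
        super().__init__()
        self.pad = (kernel_size - 1) * dilation           # left padding keeps causality
        self.conv = nn.Conv1d(channels, channels, kernel_size, dilation=dilation)
        self.act = nn.ReLU()

    def forward(self, x):                                  # x: (B, C, T)
        x = nn.functional.pad(x, (self.pad, 0))            # pad only on the left
        return self.act(self.conv(x))

x = torch.randn(2, 8, 50)
print(CausalConvBlock()(x).shape)                          # torch.Size([2, 8, 50])
```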
arXiv Detail & Related papers (2024-08-19T06:23:41Z)
- TimeAutoDiff: Combining Autoencoder and Diffusion model for time series tabular data synthesizing [13.385264002435145]
In this paper, we leverage the power of latent diffusion models to generate synthetic time series tabular data.
We tackle this problem by combining the ideas of the variational auto-encoder (VAE) and the denoising diffusion probabilistic model (DDPM).
Our model, named TimeAutoDiff, has several key advantages, including (1) generality: the ability to handle the broad spectrum of time series data, from single- to multi-sequence datasets.
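The summary states only that a VAE is combined with a DDPM; one plausible reading is latent diffusion, where DDPM-style noising is applied to the VAE's latent sequence. The fragment below sketches that forward noising step; the schedule, shapes, and names are placeholders, not the TimeAutoDiff implementation.
```python
# Generic sketch of DDPM-style forward noising applied to a VAE latent,
# as one plausible reading of "VAE + DDPM"; not the TimeAutoDiff code.
import torch

def ddpm_noise(z0, t, betas):
    """Sample z_t ~ q(z_t | z_0) for integer timesteps t of shape (B,)."""
    alphas_bar = torch.cumprod(1.0 - betas, dim=0)        # cumulative alpha-bar
    a = alphas_bar[t].view(-1, 1, 1)                      # broadcast over (B, T, D)
    eps = torch.randn_like(z0)
    return a.sqrt() * z0 + (1 - a).sqrt() * eps, eps      # noisy latent + target noise

betas = torch.linspace(1e-4, 0.02, 1000)                  # linear schedule (assumed)
z0 = torch.randn(4, 24, 8)                                # latent sequence from a VAE encoder
t = torch.randint(0, 1000, (4,))
z_t, eps = ddpm_noise(z0, t, betas)
# A denoiser would be trained to predict `eps` from (z_t, t); the VAE decoder
# would then map denoised latents back to time-series rows.
```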
arXiv Detail & Related papers (2024-06-23T06:32:27Z)
- Timer: Generative Pre-trained Transformers Are Large Time Series Models [83.03091523806668]
This paper aims at the early development of large time series models (LTSM).
During pre-training, we curate large-scale datasets with up to 1 billion time points.
To meet diverse application needs, we convert forecasting, imputation, and anomaly detection of time series into a unified generative task.
arXiv Detail & Related papers (2024-02-04T06:55:55Z)
- Learning Defect Prediction from Unrealistic Data [57.53586547895278]
Pretrained models of code have become popular choices for code understanding and generation tasks.
Such models tend to be large and require commensurate volumes of training data.
It has become popular to train models with far larger but less realistic datasets, such as functions with artificially injected bugs.
Models trained on such data tend to only perform well on similar data, while underperforming on real world programs.
arXiv Detail & Related papers (2023-11-02T01:51:43Z)
- Lag-Llama: Towards Foundation Models for Probabilistic Time Series Forecasting [54.04430089029033]
We present Lag-Llama, a general-purpose foundation model for time series forecasting based on a decoder-only transformer architecture.
Lag-Llama is pretrained on a large corpus of diverse time series data from several domains, and demonstrates strong zero-shot generalization capabilities.
When fine-tuned on relatively small fractions of such previously unseen datasets, Lag-Llama achieves state-of-the-art performance.
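As the name suggests, Lag-Llama feeds lagged values of the series as input features to its decoder-only Transformer; the snippet below shows one simple way to construct such lag features. The chosen lag set and shapes are illustrative only, not the model's actual configuration.
```python
# Illustrative construction of lagged-value features for a time series,
# the kind of input a lag-based forecaster consumes. The lag set here
# is an arbitrary example, not Lag-Llama's actual configuration.
import numpy as np

def lag_features(series, lags=(1, 2, 7)):
    """series: 1-D array of length T. Returns a (T - max(lags), len(lags)) matrix."""
    max_lag = max(lags)
    cols = [series[max_lag - lag : len(series) - lag] for lag in lags]
    return np.stack(cols, axis=1)

y = np.arange(20, dtype=float)
X = lag_features(y)            # row i holds y[t-1], y[t-2], y[t-7] for target t = i + 7
print(X.shape)                 # (13, 3)
```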
arXiv Detail & Related papers (2023-10-12T12:29:32Z)
- On the Stability of Iterative Retraining of Generative Models on their own Data [56.153542044045224]
We study the impact of training generative models on mixed datasets.
We first prove the stability of iterative training under the condition that the initial generative models approximate the data distribution well enough.
We empirically validate our theory on both synthetic and natural images by iteratively training normalizing flows and state-of-the-art diffusion models.
arXiv Detail & Related papers (2023-09-30T16:41:04Z)
- Deep Latent State Space Models for Time-Series Generation [68.45746489575032]
We propose LS4, a generative model for sequences with latent variables evolving according to a state space ODE.
Inspired by recent deep state space models (S4), we achieve speedups by leveraging a convolutional representation of LS4.
We show that LS4 significantly outperforms previous continuous-time generative models in terms of marginal distribution, classification, and prediction scores on real-world datasets.
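LS4's latent variables evolve under a state-space ODE; as a rough illustration of the underlying recurrence, the sketch below steps a discretised linear state-space model x_{k+1} = A x_k + B u_k, y_k = C x_k. The matrices and discretisation are generic placeholders, not the LS4 parameterisation or its fast convolutional form.
```python
# Generic discretised linear state-space recurrence, shown only to
# illustrate the kind of latent dynamics behind S4/LS4-style models;
# the matrices here are random placeholders, not the LS4 parameterisation.
import numpy as np

def ssm_scan(A, B, C, u):
    """u: (T, in_dim) inputs -> (T, out_dim) outputs via x_{k+1} = A x_k + B u_k."""
    x = np.zeros(A.shape[0])
    ys = []
    for k in range(u.shape[0]):
        x = A @ x + B @ u[k]
        ys.append(C @ x)
    return np.stack(ys)

rng = np.random.default_rng(0)
A = 0.9 * np.eye(4)                     # stable toy dynamics
B = rng.normal(size=(4, 1))
C = rng.normal(size=(2, 4))
y = ssm_scan(A, B, C, rng.normal(size=(50, 1)))
print(y.shape)                          # (50, 2)
```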
arXiv Detail & Related papers (2022-12-24T15:17:42Z)
- Transformer-based conditional generative adversarial network for multivariate time series generation [0.0]
Conditional generation of time-dependent data is a task of considerable interest.
Recent work proposed a Transformer-based time-series generative adversarial network (TTS-GAN).
We extend the TTS-GAN by conditioning its generated output on a particular encoded context.
We show that this transformer-based CGAN can generate realistic high-dimensional and long data sequences under different kinds of conditions.
arXiv Detail & Related papers (2022-10-05T08:29:33Z)
- TTS-CGAN: A Transformer Time-Series Conditional GAN for Biosignal Data Augmentation [5.607676459156789]
We present TTS-CGAN, a conditional GAN model that can be trained on existing multi-class datasets and generate class-specific synthetic time-series sequences.
Synthetic sequences generated by our model are indistinguishable from real ones, and can be used to complement or replace real signals of the same type.
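The summary describes class-conditional generation; the sketch below shows the generic pattern of conditioning a sequence generator by concatenating a label embedding with the noise at every timestep. The GRU backbone and layer sizes are assumptions made only to keep the sketch short; a Transformer generator would be closer to the paper's title, and this is not the TTS-CGAN architecture.
```python
# Generic class-conditional sequence generator: a label embedding is
# concatenated with noise at every timestep. The GRU backbone and layer
# sizes are assumptions, not the TTS-CGAN architecture.
import torch
import torch.nn as nn

class CondSeqGenerator(nn.Module):
    def __init__(self, n_classes=4, z_dim=16, emb_dim=8, hidden=32, x_dim=3):
        super().__init__()
        self.emb = nn.Embedding(n_classes, emb_dim)
        self.rnn = nn.GRU(z_dim + emb_dim, hidden, batch_first=True)
        self.out = nn.Linear(hidden, x_dim)

    def forward(self, z, labels):                      # z: (B, T, z_dim), labels: (B,)
        c = self.emb(labels).unsqueeze(1).expand(-1, z.size(1), -1)
        h, _ = self.rnn(torch.cat([z, c], dim=-1))
        return self.out(h)                             # (B, T, x_dim)

g = CondSeqGenerator()
fake = g(torch.randn(8, 100, 16), torch.randint(0, 4, (8,)))
print(fake.shape)                                      # torch.Size([8, 100, 3])
```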
arXiv Detail & Related papers (2022-06-28T01:01:34Z)