PSA-GAN: Progressive Self Attention GANs for Synthetic Time Series
- URL: http://arxiv.org/abs/2108.00981v1
- Date: Mon, 2 Aug 2021 15:30:15 GMT
- Title: PSA-GAN: Progressive Self Attention GANs for Synthetic Time Series
- Authors: Paul Jeha, Michael Bohlke-Schneider, Pedro Mercado, Rajbir Singh Nirwan, Shubham Kapoor, Valentin Flunkert, Jan Gasthaus, Tim Januschowski
- Abstract summary: We present PSA-GAN, a generative adversarial network (GAN) that generates long time series samples of high quality.
We show that PSA-GAN can be used to reduce the error in two downstream forecasting tasks over baselines that only use real data.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Realistic synthetic time series data of sufficient length enables practical
applications in time series modeling tasks, such as forecasting, but remains a
challenge. In this paper, we present PSA-GAN, a generative adversarial network
(GAN) that generates long time series samples of high quality using progressive
growing of GANs and self-attention. We show that PSA-GAN can be used to reduce
the error in two downstream forecasting tasks over baselines that only use real
data. We also introduce a Frechet Inception Distance-like score, Context-FID, that
assesses the quality of synthetic time series samples. In our downstream
tasks, we find that the lowest-scoring models correspond to the best-performing
ones. Therefore, Context-FID could be a useful tool to develop time series GAN
models.
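Context-FID follows the standard FID recipe, transplanted to time series: embed real and synthetic samples with a pretrained time series embedding network, fit a Gaussian to each set of embeddings, and compute the Frechet distance between the two Gaussians. A minimal sketch of that computation is below; the random-projection "encoder" in the usage example is purely a stand-in assumption, not the embedding network used in the paper.

```python
# Sketch of a Context-FID-style score. Assumes `emb_real` / `emb_fake` are
# (n_samples, dim) embeddings of real and synthetic series produced by some
# pretrained encoder (a stand-in here, not the paper's embedding network).
import numpy as np
from scipy.linalg import sqrtm

def frechet_distance(emb_real, emb_fake):
    """||mu_r - mu_f||^2 + Tr(S_r + S_f - 2 (S_r S_f)^{1/2})."""
    mu_r, mu_f = emb_real.mean(axis=0), emb_fake.mean(axis=0)
    cov_r = np.cov(emb_real, rowvar=False)
    cov_f = np.cov(emb_fake, rowvar=False)
    covmean = sqrtm(cov_r @ cov_f)
    if np.iscomplexobj(covmean):  # numerical noise can leave tiny imaginary parts
        covmean = covmean.real
    diff = mu_r - mu_f
    return float(diff @ diff + np.trace(cov_r + cov_f - 2.0 * covmean))

# Usage with a random projection standing in for the embedding network:
rng = np.random.default_rng(0)
proj = rng.normal(size=(100, 16))            # series of length 100 -> 16-dim embedding
emb_real = rng.normal(size=(256, 100)) @ proj
emb_fake = rng.normal(size=(256, 100)) @ proj
print(frechet_distance(emb_real, emb_fake))  # lower is better
```

The abstract's observation is that this score tracks downstream utility: the models with the lowest Context-FID were also the best-performing ones in the forecasting tasks.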
Related papers
- Generating Fine-Grained Causality in Climate Time Series Data for Forecasting and Anomaly Detection [67.40407388422514]
First, we design a conceptual fine-grained causal model named TBN Granger Causality.
Second, we propose TacSas, an end-to-end deep generative model that discovers TBN Granger Causality in a generative manner.
We test TacSas on climate benchmark ERA5 for climate forecasting and the extreme weather benchmark of NOAA for extreme weather alerts.
arXiv Detail & Related papers (2024-08-08T06:47:21Z)
- Timer: Generative Pre-trained Transformers Are Large Time Series Models [83.03091523806668]
This paper aims at the early development of large time series models (LTSMs).
During pre-training, we curate large-scale datasets with up to 1 billion time points.
To meet diverse application needs, we convert forecasting, imputation, and anomaly detection of time series into a unified generative task.
arXiv Detail & Related papers (2024-02-04T06:55:55Z)
- Generative Modeling of Regular and Irregular Time Series Data via Koopman VAEs [50.25683648762602]
We introduce Koopman VAE, a new generative framework that is based on a novel design for the model prior.
Inspired by Koopman theory, we represent the latent conditional prior dynamics using a linear map.
KoVAE outperforms state-of-the-art GAN and VAE methods across several challenging synthetic and real-world time series generation benchmarks.
arXiv Detail & Related papers (2023-10-04T07:14:43Z)
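The Koopman-inspired idea in the entry above is that, in latent space, the conditional prior evolves by a single linear map, so multi-step behavior reduces to repeated matrix application. A minimal sketch of that linear latent rollout, with an illustrative random map rather than KoVAE's learned parameterization:

```python
# Sketch of a Koopman-style linear latent prior: z_{t+1} = A z_t.
# `A` is random here for illustration; in KoVAE it is learned jointly with
# the VAE encoder/decoder (this is not the paper's implementation).
import numpy as np

rng = np.random.default_rng(0)
dim = 8
A = rng.normal(scale=0.3, size=(dim, dim))           # linear latent dynamics
A /= max(1.0, np.max(np.abs(np.linalg.eigvals(A))))  # keep rollouts stable

def rollout(z0, steps):
    """Unroll the linear prior; the t-step prediction is A^t z0."""
    traj = [z0]
    for _ in range(steps - 1):
        traj.append(A @ traj[-1])
    return np.stack(traj)                            # (steps, dim)

latents = rollout(rng.normal(size=dim), steps=50)
# A decoder network would map each latent state to an observation; for
# irregularly sampled series, continuous-time variants replace A^t with a
# matrix exponential.
```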
- A Distance Correlation-Based Approach to Characterize the Effectiveness of Recurrent Neural Networks for Time Series Forecasting [1.9950682531209158]
We provide an approach to link time series characteristics with RNN components via the versatile metric of distance correlation.
We empirically show that the RNN activation layers learn the lag structures of time series well.
We also show that the activation layers cannot adequately model moving average and heteroskedastic time series processes.
arXiv Detail & Related papers (2023-07-28T22:32:08Z)
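Distance correlation, the statistic the entry above builds on, is zero exactly when two finite-variance samples are independent, which is what makes it a versatile probe for linking series characteristics (e.g. lag structure) to RNN components. A self-contained sketch of the plain O(n^2) sample estimator:

```python
# Sample distance correlation (Szekely et al.); O(n^2) memory in sample size.
import numpy as np

def distance_correlation(x, y):
    x = np.asarray(x, dtype=float).reshape(len(x), -1)
    y = np.asarray(y, dtype=float).reshape(len(y), -1)

    def centered_dist(a):
        # Pairwise distances, then double-centering (rows, columns, grand mean).
        d = np.linalg.norm(a[:, None, :] - a[None, :, :], axis=-1)
        return d - d.mean(axis=0) - d.mean(axis=1)[:, None] + d.mean()

    A, B = centered_dist(x), centered_dist(y)
    dcov2_xy = (A * B).mean()
    denom = np.sqrt((A * A).mean() * (B * B).mean())
    return float(np.sqrt(max(dcov2_xy, 0.0) / denom)) if denom > 0 else 0.0

# Example: a series against its own lag, as in probing learned lag structure.
rng = np.random.default_rng(0)
s = np.cumsum(rng.normal(size=500))         # random walk: strong lag-1 dependence
print(distance_correlation(s[:-1], s[1:]))  # close to 1
```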
- GAT-GAN: A Graph-Attention-based Time-Series Generative Adversarial Network [0.0]
We propose a Graph-Attention-based Generative Adversarial Network (GAT-GAN).
GAT-GAN generates long time-series data of high fidelity using an adversarially trained autoencoder architecture.
We introduce a Frechet Inception Distance (FID)-like metric for time-series data, the Frechet Transformer Distance (FTD) score (lower is better), to evaluate the quality and variety of generated data.
arXiv Detail & Related papers (2023-06-03T04:23:49Z)
- Generative Time Series Forecasting with Diffusion, Denoise, and Disentanglement [51.55157852647306]
Time series forecasting has been a widely explored task of great importance in many applications.
Real-world time series are often recorded over short periods, which leaves a large gap between what deep models need and the limited, noisy data available.
We propose to address the time series forecasting problem with generative modeling and propose a bidirectional variational auto-encoder equipped with diffusion, denoise, and disentanglement.
arXiv Detail & Related papers (2023-01-08T12:20:46Z)
- Data-Driven Modeling of Noise Time Series with Convolutional Generative Adversarial Networks [0.456877715768796]
Given the recent interest in generative adversarial networks (GANs) for data-driven modeling, it is important to determine to what extent GANs can faithfully reproduce noise in target data sets.
We present an empirical investigation that aims to shed light on this issue for time series.
We find that GANs are capable of learning many noise types, although they predictably struggle when the GAN architecture is not well suited to some aspects of the noise.
arXiv Detail & Related papers (2022-07-03T20:07:00Z)
- TTS-GAN: A Transformer-based Time-Series Generative Adversarial Network [4.989480853499916]
Time-series data is one of the most common types of data used in medical machine learning applications.
We introduce TTS-GAN, a transformer-based GAN which can successfully generate realistic synthetic time-series data sequences.
We use visualizations and dimensionality reduction techniques to demonstrate the similarity of real and generated time-series data.
arXiv Detail & Related papers (2022-02-06T03:05:47Z)
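TTS-GAN, per the entry above, treats a series as a sequence of tokens and uses transformer encoder blocks in both the generator and the discriminator. A minimal sketch of a generator in that spirit, where the layer sizes and the noise-to-sequence mapping are assumptions rather than the paper's configuration:

```python
# Sketch of a transformer-based time series generator in the spirit of
# TTS-GAN; all dimensions and the noise handling are illustrative assumptions.
import torch
import torch.nn as nn

class TransformerGenerator(nn.Module):
    def __init__(self, seq_len=64, latent_dim=32, d_model=64, n_channels=1):
        super().__init__()
        self.seq_len, self.d_model = seq_len, d_model
        self.expand = nn.Linear(latent_dim, seq_len * d_model)
        self.pos = nn.Parameter(torch.zeros(1, seq_len, d_model))  # learned positions
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=4, batch_first=True)
        self.blocks = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(d_model, n_channels)   # per-step output channels

    def forward(self, z):                            # z: (batch, latent_dim)
        h = self.expand(z).view(-1, self.seq_len, self.d_model)
        h = self.blocks(h + self.pos)                # self-attention over time steps
        return self.head(h)                          # (batch, seq_len, n_channels)

g = TransformerGenerator()
fake = g(torch.randn(8, 32))                         # 8 synthetic series from noise
print(fake.shape)                                    # torch.Size([8, 64, 1])
```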
- Towards Generating Real-World Time Series Data [52.51620668470388]
We propose RTSGAN, a novel generative framework for time series generation.
RTSGAN learns an encoder-decoder module which provides a mapping between a time series instance and a fixed-dimension latent vector.
To generate time series with missing values, we further equip RTSGAN with an observation embedding layer and a decide-and-generate decoder.
arXiv Detail & Related papers (2021-11-16T11:31:37Z)
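The RTSGAN split described above is two-stage: an autoencoder compresses each series to a fixed-dimension vector, and adversarial training happens in that latent space, so sampling means "draw a latent, then decode". A minimal sketch of the structure; module sizes are assumptions, and the latent discriminator, observation embedding layer, and decide-and-generate decoder for missing values are omitted.

```python
# Sketch of the RTSGAN-style two-stage idea: autoencode series to a fixed-dim
# latent, generate latents with a GAN, decode back to series. Module sizes
# are illustrative assumptions; the latent discriminator is omitted.
import torch
import torch.nn as nn

class SeriesAutoencoder(nn.Module):
    def __init__(self, n_channels=1, hidden=64, latent=32):
        super().__init__()
        self.enc = nn.GRU(n_channels, hidden, batch_first=True)
        self.to_latent = nn.Linear(hidden, latent)
        self.from_latent = nn.Linear(latent, hidden)
        self.dec = nn.GRU(hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, n_channels)

    def encode(self, x):                       # x: (batch, time, channels)
        _, h = self.enc(x)                     # final hidden state summarizes x
        return self.to_latent(h[-1])           # fixed-dimension latent vector

    def decode(self, z, seq_len):
        h = self.from_latent(z)[:, None, :].repeat(1, seq_len, 1)
        y, _ = self.dec(h)
        return self.out(y)

# Generator mapping noise to fake latents; a discriminator over latents
# (omitted) would complete the adversarial training.
latent_generator = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 32))

ae = SeriesAutoencoder()
fake_series = ae.decode(latent_generator(torch.randn(8, 16)), seq_len=24)
print(fake_series.shape)                       # torch.Size([8, 24, 1])
```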
- Learning summary features of time series for likelihood free inference [93.08098361687722]
We present a data-driven strategy for automatically learning summary features from time series data.
Our results indicate that summary features learned from data can compete with and even outperform LFI methods based on hand-crafted values.
arXiv Detail & Related papers (2020-12-04T19:21:37Z)
- Conditional GAN for timeseries generation [0.0]
Time Series GAN (TSGAN) is proposed to model realistic time series data.
We evaluate TSGAN on 70 data sets from a benchmark time series database.
arXiv Detail & Related papers (2020-06-30T02:19:18Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.