TTS-GAN: A Transformer-based Time-Series Generative Adversarial Network
- URL: http://arxiv.org/abs/2202.02691v1
- Date: Sun, 6 Feb 2022 03:05:47 GMT
- Title: TTS-GAN: A Transformer-based Time-Series Generative Adversarial Network
- Authors: Xiaomin Li, Vangelis Metsis, Huangyingrui Wang, Anne Hee Hiong Ngu
- Abstract summary: Time-series data is one of the most common types of data used in medical machine learning applications.
We introduce TTS-GAN, a transformer-based GAN which can successfully generate realistic synthetic time-series data sequences.
We use visualizations and dimensionality reduction techniques to demonstrate the similarity of real and generated time-series data.
- Score: 4.989480853499916
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Signal measurements appearing in the form of time series are one of the most
common types of data used in medical machine learning applications. However,
such datasets are often small, making the training of deep neural network
architectures ineffective. For time-series, the suite of data augmentation
tricks we can use to expand the size of the dataset is limited by the need to
maintain the basic properties of the signal. Data generated by a Generative
Adversarial Network (GAN) can be utilized as another data augmentation tool.
RNN-based GANs suffer from the fact that they cannot effectively model long
sequences of data points with irregular temporal relations. To tackle these
problems, we introduce TTS-GAN, a transformer-based GAN which can successfully
generate realistic synthetic time-series data sequences of arbitrary length,
similar to the real ones. Both the generator and discriminator networks of the
GAN model are built using a pure transformer encoder architecture. We use
visualizations and dimensionality reduction techniques to demonstrate the
similarity of real and generated time-series data. We also compare the quality
of our generated data with the best existing alternative, which is an RNN-based
time-series GAN.
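The abstract states that both the generator and discriminator are pure transformer encoders operating on time-series sequences. As an illustrative sketch only (the layer sizes, embedding, and positional encoding below are assumptions, not the paper's actual architecture), a single-head transformer encoder layer over per-time-step tokens can be written in plain numpy:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x, Wq, Wk, Wv):
    # x: (seq_len, d_model); single-head scaled dot-product attention,
    # so every time step can attend to every other, regardless of distance
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    scores = q @ k.T / np.sqrt(k.shape[-1])
    return softmax(scores) @ v

def encoder_layer(x, params):
    # attention with a residual connection, then a position-wise
    # feed-forward network with a residual connection (norms omitted)
    Wq, Wk, Wv, W1, W2 = params
    h = x + self_attention(x, Wq, Wk, Wv)
    return h + np.maximum(h @ W1, 0.0) @ W2

rng = np.random.default_rng(0)
seq_len, channels, d_model = 30, 3, 16
series = rng.normal(size=(seq_len, channels))      # a raw multichannel time series
W_in = rng.normal(size=(channels, d_model)) * 0.1  # per-time-step token embedding
pos = np.sin(np.arange(seq_len)[:, None] / 10.0)   # crude positional encoding (illustrative)
tokens = series @ W_in + pos
params = tuple(rng.normal(size=s) * 0.1 for s in
               [(d_model, d_model)] * 3 + [(d_model, 4 * d_model), (4 * d_model, d_model)])
out = encoder_layer(tokens, params)
print(out.shape)  # (30, 16): one d_model-dim representation per time step
```

Because attention connects all time steps directly, this kind of encoder sidesteps the long-range dependency problem the abstract attributes to RNN-based GANs; in the actual model, stacks of such layers (with normalization and multiple heads) would form both networks.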
Related papers
- TransFusion: Generating Long, High Fidelity Time Series using Diffusion Models with Transformers [3.2623791881739033]
We propose TransFusion, a diffusion- and transformer-based generative model for producing high-quality, long-sequence time-series data.
We have stretched the sequence length to 384, and generated high-quality synthetic data.
arXiv Detail & Related papers (2023-07-24T10:14:51Z)
- GAT-GAN: A Graph-Attention-based Time-Series Generative Adversarial Network [0.0]
We propose a Graph-Attention-based Generative Adversarial Network (GAT-GAN).
GAT-GAN generates long time-series data of high fidelity using an adversarially trained autoencoder architecture.
We introduce a Frechet Inception Distance (FID)-like metric for time-series data, the Frechet Transformer Distance (FTD) score (lower is better), to evaluate the quality and variety of generated data.
arXiv Detail & Related papers (2023-06-03T04:23:49Z)
- TSGM: A Flexible Framework for Generative Modeling of Synthetic Time Series [61.436361263605114]
Time series data are often scarce or highly sensitive, which precludes the sharing of data between researchers and industrial organizations.
We introduce Time Series Generative Modeling (TSGM), an open-source framework for the generative modeling of synthetic time series.
arXiv Detail & Related papers (2023-05-19T10:11:21Z)
- LD-GAN: Low-Dimensional Generative Adversarial Network for Spectral Image Generation with Variance Regularization [72.4394510913927]
Deep learning methods are state-of-the-art for spectral image (SI) computational tasks.
GANs enable diverse augmentation by learning and sampling from the data distribution.
GAN-based SI generation is challenging because the high dimensionality of this kind of data hinders the convergence of GAN training, yielding suboptimal generation.
We propose a statistical regularization that controls the variance of the low-dimensional representation during autoencoder training and achieves high diversity in the samples generated by the GAN.
arXiv Detail & Related papers (2023-04-29T00:25:02Z)
- HyperTime: Implicit Neural Representation for Time Series [131.57172578210256]
Implicit neural representations (INRs) have recently emerged as a powerful tool that provides an accurate and resolution-independent encoding of data.
In this paper, we analyze the representation of time series using INRs, comparing different activation functions in terms of reconstruction accuracy and training convergence speed.
We propose a hypernetwork architecture that leverages INRs to learn a compressed latent representation of an entire time series dataset.
arXiv Detail & Related papers (2022-08-11T14:05:51Z)
- TTS-CGAN: A Transformer Time-Series Conditional GAN for Biosignal Data Augmentation [5.607676459156789]
We present TTS-CGAN, a conditional GAN model that can be trained on existing multi-class datasets and generate class-specific synthetic time-series sequences.
Synthetic sequences generated by our model are indistinguishable from real ones, and can be used to complement or replace real signals of the same type.
arXiv Detail & Related papers (2022-06-28T01:01:34Z)
- Towards Generating Real-World Time Series Data [52.51620668470388]
We propose a novel generative framework for time series data generation - RTSGAN.
RTSGAN learns an encoder-decoder module which provides a mapping between a time series instance and a fixed-dimension latent vector.
To generate time series with missing values, we further equip RTSGAN with an observation embedding layer and a decide-and-generate decoder.
arXiv Detail & Related papers (2021-11-16T11:31:37Z)
- Deep Transformer Networks for Time Series Classification: The NPP Safety Case [59.20947681019466]
An advanced temporal neural network, the Transformer, is used in a supervised learning fashion to model the time-dependent NPP simulation data.
The Transformer can learn the characteristics of the sequential data and yield promising performance with approximately 99% classification accuracy on the testing dataset.
arXiv Detail & Related papers (2021-04-09T14:26:25Z)
- Deep Cellular Recurrent Network for Efficient Analysis of Time-Series Data with Spatial Information [52.635997570873194]
This work proposes a novel deep cellular recurrent neural network (DCRNN) architecture to process complex multi-dimensional time series data with spatial information.
The proposed architecture achieves state-of-the-art performance while using substantially fewer trainable parameters than comparable methods in the literature.
arXiv Detail & Related papers (2021-01-12T20:08:18Z)
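Several entries above evaluate generated series with Fréchet-distance-style scores (e.g., the FTD metric in GAT-GAN). The papers' own feature extractors are not reproduced here, but the underlying quantity, the squared 2-Wasserstein distance between Gaussian fits of real and generated feature distributions, can be sketched with numpy alone (the random features below are stand-ins for encoder outputs):

```python
import numpy as np

def _sqrtm_psd(m):
    # matrix square root of a symmetric positive semi-definite matrix,
    # via eigendecomposition (eigenvalues clipped at 0 for numerical safety)
    vals, vecs = np.linalg.eigh(m)
    return (vecs * np.sqrt(np.clip(vals, 0.0, None))) @ vecs.T

def frechet_distance(feat_real, feat_fake):
    # feat_*: (n_samples, n_features) arrays of extracted features;
    # returns ||mu1 - mu2||^2 + Tr(S1 + S2 - 2 (S1^{1/2} S2 S1^{1/2})^{1/2})
    mu1, mu2 = feat_real.mean(0), feat_fake.mean(0)
    s1 = np.cov(feat_real, rowvar=False)
    s2 = np.cov(feat_fake, rowvar=False)
    r = _sqrtm_psd(s1)
    covmean = _sqrtm_psd(r @ s2 @ r)  # equals sqrtm(S1 S2) in trace
    return float(((mu1 - mu2) ** 2).sum() + np.trace(s1 + s2 - 2.0 * covmean))

rng = np.random.default_rng(1)
a = rng.normal(size=(500, 8))
print(frechet_distance(a, a))            # 0.0 up to floating-point error
print(frechet_distance(a, a + 5.0) > 0)  # shifted features give a large distance
```

Lower scores mean the generated feature distribution is closer to the real one; the choice of feature extractor (Inception for FID, a transformer for FTD) is what adapts the metric to a data modality.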
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.