Conditional GAN for timeseries generation
- URL: http://arxiv.org/abs/2006.16477v1
- Date: Tue, 30 Jun 2020 02:19:18 GMT
- Title: Conditional GAN for timeseries generation
- Authors: Kaleb E Smith, Anthony O Smith
- Abstract summary: Time Series GAN (TSGAN) is proposed to model realistic time series data.
We evaluate TSGAN on 70 data sets from a benchmark time series database.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: It is abundantly clear that time-dependent data is a vital source of
information in the world. The challenge has been for applications in machine
learning to gain access to a considerable amount of quality data needed for
algorithm development and analysis. Modeling synthetic data using a Generative
Adversarial Network (GAN) has been at the heart of providing a viable solution.
Our work focuses on one-dimensional time series and explores the few-shot
setting, in which an algorithm must perform well with limited data. This work
addresses that challenge by proposing a new architecture, Time Series GAN
(TSGAN), to model realistic time series data. We
evaluate TSGAN on 70 data sets from a benchmark time series database. Our
results demonstrate that TSGAN outperforms competing methods both
quantitatively, using the Frechet Inception Distance (FID) metric, and
qualitatively, when classification is used as the evaluation criterion.
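As background for the FID comparisons above: FID fits a Gaussian to feature embeddings of real and generated samples and measures the Frechet distance between the two Gaussians. The sketch below shows that computation in isolation; the random data and embedding dimensionality are placeholders for illustration, not the TSGAN evaluation pipeline.

```python
import numpy as np
from scipy import linalg

def frechet_distance(feats_real, feats_fake):
    """Frechet distance between Gaussians fitted to two feature sets.

    feats_real, feats_fake: arrays of shape (n_samples, n_features),
    e.g. embeddings of real and generated time series.
    """
    mu_r, mu_f = feats_real.mean(axis=0), feats_fake.mean(axis=0)
    cov_r = np.cov(feats_real, rowvar=False)
    cov_f = np.cov(feats_fake, rowvar=False)

    # ||mu_r - mu_f||^2 + Tr(cov_r + cov_f - 2 (cov_r cov_f)^(1/2))
    covmean, _ = linalg.sqrtm(cov_r @ cov_f, disp=False)
    if np.iscomplexobj(covmean):   # numerical noise can introduce tiny imaginary parts
        covmean = covmean.real
    diff = mu_r - mu_f
    return diff @ diff + np.trace(cov_r + cov_f - 2.0 * covmean)

# Example with hypothetical embeddings of 1-D series (any fixed encoder could
# supply these); lower values indicate generated data closer to the real data.
rng = np.random.default_rng(0)
real = rng.normal(size=(500, 64))
fake = rng.normal(loc=0.1, size=(500, 64))
print(frechet_distance(real, fake))
```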
Related papers
- TimeLDM: Latent Diffusion Model for Unconditional Time Series Generation [2.4454605633840143]
Time series generation is a crucial research topic in the area of decision-making systems.
Recent approaches focus on learning in the data space to model time series information.
We propose TimeLDM, a novel latent diffusion model for high-quality time series generation.
arXiv Detail & Related papers (2024-07-05T01:47:20Z)
- Generative Modeling of Regular and Irregular Time Series Data via Koopman VAEs [50.25683648762602]
We introduce Koopman VAE, a new generative framework that is based on a novel design for the model prior.
Inspired by Koopman theory, we represent the latent conditional prior dynamics using a linear map.
KoVAE outperforms state-of-the-art GAN and VAE methods across several challenging synthetic and real-world time series generation benchmarks.
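As a rough illustration of the linear-map prior mentioned above: a latent trajectory can be produced by repeatedly applying a single learned matrix to a latent state. The sketch below only schematizes that idea; the dimensions and random matrix are illustrative assumptions, not the KoVAE model.

```python
import numpy as np

def linear_latent_rollout(A, z0, steps):
    """Roll out latent dynamics z_{t+1} = A @ z_t for a fixed number of steps."""
    zs = [z0]
    for _ in range(steps - 1):
        zs.append(A @ zs[-1])
    return np.stack(zs)            # shape: (steps, latent_dim)

# Hypothetical 8-dimensional latent space; in a VAE the matrix A would be a
# learned parameter and z0 would be sampled from the prior.
rng = np.random.default_rng(0)
latent_dim = 8
A = rng.normal(scale=0.3, size=(latent_dim, latent_dim))
z0 = rng.normal(size=latent_dim)
trajectory = linear_latent_rollout(A, z0, steps=50)
print(trajectory.shape)            # (50, 8)
```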
arXiv Detail & Related papers (2023-10-04T07:14:43Z)
- CausalTime: Realistically Generated Time-series for Benchmarking of Causal Discovery [14.092834149864514]
This study introduces the CausalTime pipeline to generate time-series that highly resemble the real data.
The pipeline starts from real observations in a specific scenario and produces a matching benchmark dataset.
In the experiments, we validate the fidelity of the generated data through qualitative and quantitative experiments, followed by a benchmarking of existing TSCD algorithms.
arXiv Detail & Related papers (2023-10-03T02:29:19Z)
- MADS: Modulated Auto-Decoding SIREN for time series imputation [9.673093148930874]
We propose MADS, a novel auto-decoding framework for time series imputation, built upon implicit neural representations.
We evaluate our model on two real-world datasets, and show that it outperforms state-of-the-art methods for time series imputation.
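For readers unfamiliar with implicit neural representations: a series is represented by a small network that maps a timestamp to a value, often with sinusoidal (SIREN-style) layers. The sketch below is a generic, untrained illustration of that idea, not the MADS architecture; all weights and sizes are placeholders.

```python
import numpy as np

def siren_layer(t, W, b, omega=30.0):
    """One SIREN-style layer: a sinusoidal activation over a linear map of the input."""
    return np.sin(omega * (t @ W + b))

# Hypothetical two-layer implicit representation of a univariate series: the
# network maps a timestamp t to a value x(t), so the series is encoded in the
# weights rather than stored as samples (useful for imputation at missing t).
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 200).reshape(-1, 1)     # query timestamps
W1, b1 = rng.normal(size=(1, 32)), rng.normal(size=32)
W2, b2 = rng.normal(size=(32, 1)), rng.normal(size=1)
hidden = siren_layer(t, W1, b1)
x_hat = hidden @ W2 + b2                          # predicted series values
print(x_hat.shape)                                # (200, 1)
```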
arXiv Detail & Related papers (2023-07-03T09:08:47Z)
- GAT-GAN: A Graph-Attention-based Time-Series Generative Adversarial Network [0.0]
We propose a Graph-Attention-based Generative Adversarial Network (GAT-GAN).
GAT-GAN generates long time-series data of high fidelity using an adversarially trained autoencoder architecture.
We introduce a Frechet Inception Distance (FID)-like metric for time-series data, the Frechet Transformer Distance (FTD) score (lower is better), to evaluate the quality and variety of generated data.
arXiv Detail & Related papers (2023-06-03T04:23:49Z)
- TSGM: A Flexible Framework for Generative Modeling of Synthetic Time Series [61.436361263605114]
Time series data are often scarce or highly sensitive, which precludes the sharing of data between researchers and industrial organizations.
We introduce Time Series Generative Modeling (TSGM), an open-source framework for the generative modeling of synthetic time series.
arXiv Detail & Related papers (2023-05-19T10:11:21Z)
- Towards Generating Real-World Time Series Data [52.51620668470388]
We propose a novel generative framework for time series data generation - RTSGAN.
RTSGAN learns an encoder-decoder module which provides a mapping between a time series instance and a fixed-dimension latent vector.
To generate time series with missing values, we further equip RTSGAN with an observation embedding layer and a decide-and-generate decoder.
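The encoder-decoder idea summarized above, mapping a whole series to a fixed-dimension latent vector and decoding it back, can be sketched with a recurrent autoencoder. The GRU layers, sizes, and shapes below are assumptions for illustration, not the RTSGAN design.

```python
import torch
from torch import nn

class SeriesAutoencoder(nn.Module):
    """Map a (batch, length, features) series to a fixed-size latent vector and back."""
    def __init__(self, n_features=1, latent_dim=32, seq_len=100):
        super().__init__()
        self.seq_len = seq_len
        self.encoder = nn.GRU(n_features, latent_dim, batch_first=True)
        self.decoder = nn.GRU(latent_dim, latent_dim, batch_first=True)
        self.out = nn.Linear(latent_dim, n_features)

    def forward(self, x):
        _, h = self.encoder(x)                        # h: (1, batch, latent_dim)
        z = h[-1]                                     # fixed-dimension latent vector
        # Repeat the latent vector at every step and decode it back to a series.
        dec_in = z.unsqueeze(1).expand(-1, self.seq_len, -1)
        dec_out, _ = self.decoder(dec_in)
        return self.out(dec_out), z

# Usage with a hypothetical batch of univariate series of length 100.
model = SeriesAutoencoder()
x = torch.randn(16, 100, 1)
recon, z = model(x)
print(recon.shape, z.shape)   # torch.Size([16, 100, 1]) torch.Size([16, 32])
```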
arXiv Detail & Related papers (2021-11-16T11:31:37Z)
- Networked Time Series Prediction with Incomplete Data [59.45358694862176]
We propose NETS-ImpGAN, a novel deep learning framework that can be trained on incomplete data with missing values in both history and future.
We conduct extensive experiments on three real-world datasets under different missing patterns and missing rates.
arXiv Detail & Related papers (2021-10-05T18:20:42Z)
- PSA-GAN: Progressive Self Attention GANs for Synthetic Time Series [0.0]
We present PSA-GAN, a generative adversarial network (GAN) that generates long time series samples of high quality.
We show that PSA-GAN can be used to reduce the error in two downstream forecasting tasks over baselines that only use real data.
arXiv Detail & Related papers (2021-08-02T15:30:15Z)
- TadGAN: Time Series Anomaly Detection Using Generative Adversarial Networks [73.01104041298031]
TadGAN is an unsupervised anomaly detection approach built on Generative Adversarial Networks (GANs).
To capture the temporal correlations of time series, we use LSTM Recurrent Neural Networks as base models for Generators and Critics.
To demonstrate the performance and generalizability of our approach, we test several anomaly scoring techniques and report the best-suited one.
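A minimal sketch of the LSTM generator/critic pairing described in this entry follows; layer widths, sequence length, and the single-layer setup are illustrative assumptions rather than the TadGAN configuration.

```python
import torch
from torch import nn

class LSTMGenerator(nn.Module):
    """Map a sequence of noise vectors to a synthetic univariate time series."""
    def __init__(self, noise_dim=16, hidden_dim=64):
        super().__init__()
        self.lstm = nn.LSTM(noise_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, 1)

    def forward(self, z):                       # z: (batch, length, noise_dim)
        h, _ = self.lstm(z)
        return torch.tanh(self.out(h))          # (batch, length, 1)

class LSTMCritic(nn.Module):
    """Score a series; higher scores mean 'more real' in a Wasserstein-style critic."""
    def __init__(self, hidden_dim=64):
        super().__init__()
        self.lstm = nn.LSTM(1, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, 1)

    def forward(self, x):                       # x: (batch, length, 1)
        _, (h, _) = self.lstm(x)
        return self.out(h[-1])                  # (batch, 1)

# Usage with hypothetical shapes: batch of 8 series, 100 steps each.
gen, critic = LSTMGenerator(), LSTMCritic()
z = torch.randn(8, 100, 16)
fake = gen(z)
print(fake.shape, critic(fake).shape)   # torch.Size([8, 100, 1]) torch.Size([8, 1])
```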
arXiv Detail & Related papers (2020-09-16T15:52:04Z)
- DeGAN: Data-Enriching GAN for Retrieving Representative Samples from a Trained Classifier [58.979104709647295]
We bridge the gap between the abundance of available data and the lack of relevant data for the future learning tasks of a trained network.
We use the available data, which may be an imbalanced subset of the original training dataset or a related-domain dataset, to retrieve representative samples.
We demonstrate that data from a related domain can be leveraged to achieve state-of-the-art performance.
arXiv Detail & Related papers (2019-12-27T02:05:45Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented (including all generated summaries) and is not responsible for any consequences arising from its use.