Towards Generating Real-World Time Series Data
- URL: http://arxiv.org/abs/2111.08386v1
- Date: Tue, 16 Nov 2021 11:31:37 GMT
- Title: Towards Generating Real-World Time Series Data
- Authors: Hengzhi Pei, Kan Ren, Yuqing Yang, Chang Liu, Tao Qin, Dongsheng Li
- Abstract summary: We propose a novel generative framework for time series data generation - RTSGAN.
RTSGAN learns an encoder-decoder module which provides a mapping between a time series instance and a fixed-dimension latent vector.
To generate time series with missing values, we further equip RTSGAN with an observation embedding layer and a decide-and-generate decoder.
- Score: 52.51620668470388
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Time series data generation has drawn increasing attention in recent years.
Several generative adversarial network (GAN) based methods have been proposed
to tackle the problem, usually under the assumption that the targeted time series
data are well formatted and complete. However, real-world time series (RTS)
data are far from this ideal: long sequences with variable lengths and
informative missing values pose intractable challenges for designing
powerful generation algorithms. In this paper, we propose a novel generative
framework for RTS data - RTSGAN to tackle the aforementioned challenges. RTSGAN
first learns an encoder-decoder module which provides a mapping between a time
series instance and a fixed-dimension latent vector and then learns a
generation module to generate vectors in the same latent space. By combining
the generator and the decoder, RTSGAN is able to generate RTS which respect the
original feature distributions and the temporal dynamics. To generate time
series with missing values, we further equip RTSGAN with an observation
embedding layer and a decide-and-generate decoder to better utilize the
informative missing patterns. Experiments on four RTS datasets show that
the proposed framework outperforms the previous generation methods in terms of
synthetic data utility for downstream classification and prediction tasks.
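The two-stage design described in the abstract (an encoder-decoder that maps each series to a fixed-dimension latent vector, plus a generator trained in that same latent space) can be sketched roughly as follows. This is a minimal PyTorch illustration under assumed choices (a GRU backbone, illustrative layer sizes), not the authors' implementation; the missing-value machinery (observation embedding layer, decide-and-generate decoder) is omitted.

```python
import torch
import torch.nn as nn

class Seq2LatentAE(nn.Module):
    """Encoder-decoder mapping a time series to a fixed-dimension latent vector and back."""
    def __init__(self, n_features, latent_dim, hidden=64):
        super().__init__()
        self.encoder = nn.GRU(n_features, hidden, batch_first=True)
        self.to_latent = nn.Linear(hidden, latent_dim)
        self.from_latent = nn.Linear(latent_dim, hidden)
        self.decoder = nn.GRU(n_features, hidden, batch_first=True)
        self.out = nn.Linear(hidden, n_features)

    def encode(self, x):                          # x: (batch, time, features)
        _, h = self.encoder(x)                    # final hidden state summarizes the series
        return self.to_latent(h[-1])              # (batch, latent_dim)

    def decode(self, z, steps):
        h = self.from_latent(z).unsqueeze(0)      # initialize decoder state from the latent
        step = torch.zeros(z.size(0), 1, self.out.out_features)
        outputs = []
        for _ in range(steps):                    # autoregressive reconstruction
            o, h = self.decoder(step, h)
            step = self.out(o)
            outputs.append(step)
        return torch.cat(outputs, dim=1)          # (batch, steps, features)

class LatentGenerator(nn.Module):
    """Maps Gaussian noise to vectors in the same latent space as the autoencoder."""
    def __init__(self, noise_dim, latent_dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(noise_dim, 128), nn.ReLU(),
                                 nn.Linear(128, latent_dim))

    def forward(self, noise):
        return self.net(noise)

# Generation combines the two parts: noise -> latent vector -> decoded synthetic series.
# (Training of the autoencoder and of the latent-space GAN is omitted here.)
ae = Seq2LatentAE(n_features=5, latent_dim=32)
gen = LatentGenerator(noise_dim=16, latent_dim=32)
fake_series = ae.decode(gen(torch.randn(8, 16)), steps=24)    # (8, 24, 5)
```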
Related papers
- Unified Training of Universal Time Series Forecasting Transformers [104.56318980466742]
We present a Masked Encoder-based Universal Time Series Forecasting Transformer (Moirai).
Moirai is trained on our newly introduced Large-scale Open Time Series Archive (LOTSA) featuring over 27B observations across nine domains.
Moirai achieves competitive or superior performance as a zero-shot forecaster when compared to full-shot models.
arXiv Detail & Related papers (2024-02-04T20:00:45Z)
- Timer: Generative Pre-trained Transformers Are Large Time Series Models [83.03091523806668]
This paper aims at the early development of large time series models (LTSM).
During pre-training, we curate large-scale datasets with up to 1 billion time points.
To meet diverse application needs, we convert forecasting, imputation, and anomaly detection of time series into a unified generative task.
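The unification mentioned above can be illustrated with a toy next-step generator: forecasting generates future values, imputation generates values at missing positions, and anomaly detection scores how far each observation lies from what the model would have generated. The naive predictor and scoring rule below are placeholders for illustration only, not Timer's actual formulation.

```python
import numpy as np

def next_value(history: np.ndarray) -> float:
    """Stand-in for a pretrained generative model's next-step prediction
    (here just the mean of a recent window, purely illustrative)."""
    return float(history[-8:].mean())

def forecast(series, horizon):
    """Forecasting = generate future steps autoregressively."""
    hist = list(series)
    for _ in range(horizon):
        hist.append(next_value(np.asarray(hist)))
    return np.asarray(hist[len(series):])

def impute(series, missing_mask):
    """Imputation = generate values at the masked (missing) positions."""
    filled = series.copy()
    for t in np.where(missing_mask)[0]:
        filled[t] = next_value(filled[:t]) if t > 0 else 0.0
    return filled

def anomaly_scores(series):
    """Anomaly detection = deviation of each observation from the value
    the generative model would have produced at that step."""
    return np.array([abs(series[t] - next_value(series[:t]))
                     for t in range(1, len(series))])

x = np.sin(np.linspace(0, 6, 60))
miss = np.zeros_like(x, dtype=bool)
miss[20:25] = True                                 # pretend these points are missing
print(forecast(x, horizon=5), impute(x, miss)[20:25], anomaly_scores(x).max())
```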
arXiv Detail & Related papers (2024-02-04T06:55:55Z)
- Representation Learning of Multivariate Time Series using Attention and Adversarial Training [2.0577627277681887]
A Transformer-based autoencoder is proposed that is regularized using an adversarial training scheme to generate artificial time series signals.
Our results indicate that the generated signals exhibit higher similarity to an exemplary dataset than those produced by a convolutional network approach.
arXiv Detail & Related papers (2024-01-03T21:32:46Z)
- GAT-GAN: A Graph-Attention-based Time-Series Generative Adversarial Network [0.0]
We propose a Graph-Attention-based Generative Adversarial Network (GAT-GAN).
GAT-GAN generates long time-series data of high fidelity using an adversarially trained autoencoder architecture.
We introduce a Frechet Inception Distance (FID)-like metric for time-series data, the Frechet Transformer Distance (FTD) score (lower is better), to evaluate the quality and variety of generated data.
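A Frechet-distance-style score of this kind is typically computed from Gaussian fits (mean and covariance) of embeddings of real and generated series. The sketch below shows that computation with random stand-in features; an FTD-style score would obtain the embeddings from a trained transformer encoder instead.

```python
import numpy as np
from scipy import linalg

def frechet_distance(real_emb, fake_emb):
    """Frechet distance between Gaussian fits of two embedding sets.
    real_emb, fake_emb: arrays of shape (n_samples, emb_dim)."""
    mu_r, mu_f = real_emb.mean(axis=0), fake_emb.mean(axis=0)
    cov_r = np.cov(real_emb, rowvar=False)
    cov_f = np.cov(fake_emb, rowvar=False)
    covmean = linalg.sqrtm(cov_r @ cov_f)       # matrix square root of the covariance product
    if np.iscomplexobj(covmean):                # discard tiny imaginary parts from numerics
        covmean = covmean.real
    diff = mu_r - mu_f
    return float(diff @ diff + np.trace(cov_r + cov_f - 2.0 * covmean))

real = np.random.randn(256, 64)                 # stand-ins for encoder embeddings
fake = np.random.randn(256, 64) + 0.1
print(frechet_distance(real, fake))             # lower = closer distributions
```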
arXiv Detail & Related papers (2023-06-03T04:23:49Z)
- FormerTime: Hierarchical Multi-Scale Representations for Multivariate Time Series Classification [53.55504611255664]
FormerTime is a hierarchical representation model for improving the classification capacity for the multivariate time series classification task.
It exhibits three merits: (1) learning hierarchical multi-scale representations from time series data, (2) inheriting the strengths of both transformers and convolutional networks, and (3) tackling the efficiency challenges incurred by the self-attention mechanism.
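A rough sketch of this kind of design (convolutional downsampling to shorten the sequence before each transformer stage, which keeps self-attention cheap, with representations collected at every scale) is given below. The two-stage depth and layer sizes are assumptions for illustration, not FormerTime's actual architecture.

```python
import torch
import torch.nn as nn

class Stage(nn.Module):
    """One hierarchy level: convolutional downsampling (shorter sequence, cheaper
    self-attention) followed by a small transformer encoder."""
    def __init__(self, in_ch, out_ch, heads=4):
        super().__init__()
        self.down = nn.Conv1d(in_ch, out_ch, kernel_size=3, stride=2, padding=1)
        layer = nn.TransformerEncoderLayer(d_model=out_ch, nhead=heads, batch_first=True)
        self.attn = nn.TransformerEncoder(layer, num_layers=1)

    def forward(self, x):                          # x: (batch, channels, time)
        x = self.down(x)                           # halve the temporal length
        return self.attn(x.transpose(1, 2)).transpose(1, 2)

class MultiScaleEncoder(nn.Module):
    """Stacked stages yield representations at progressively coarser time scales."""
    def __init__(self, n_features, n_classes):
        super().__init__()
        self.stages = nn.ModuleList([Stage(n_features, 32), Stage(32, 64)])
        self.head = nn.Linear(32 + 64, n_classes)  # classify from all scales combined

    def forward(self, x):
        pooled = []
        for stage in self.stages:
            x = stage(x)
            pooled.append(x.mean(dim=2))           # summarize each scale by average pooling
        return self.head(torch.cat(pooled, dim=1))

logits = MultiScaleEncoder(n_features=3, n_classes=5)(torch.randn(8, 3, 128))
```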
arXiv Detail & Related papers (2023-02-20T07:46:14Z)
- HyperTime: Implicit Neural Representation for Time Series [131.57172578210256]
Implicit neural representations (INRs) have recently emerged as a powerful tool that provides an accurate and resolution-independent encoding of data.
In this paper, we analyze the representation of time series using INRs, comparing different activation functions in terms of reconstruction accuracy and training convergence speed.
We propose a hypernetwork architecture that leverages INRs to learn a compressed latent representation of an entire time series dataset.
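An implicit neural representation of a single series is simply a small network trained to map a normalized time coordinate to the series value, which is what makes the encoding resolution-independent. The SIREN-style sinusoidal MLP below is an assumed, common INR choice; the hypernetwork that produces compressed latent representations for a whole dataset is not shown.

```python
import torch
import torch.nn as nn

class SineLayer(nn.Module):
    """Linear layer followed by a sinusoidal activation (SIREN-style INR block)."""
    def __init__(self, in_dim, out_dim, omega=30.0):
        super().__init__()
        self.linear, self.omega = nn.Linear(in_dim, out_dim), omega

    def forward(self, x):
        return torch.sin(self.omega * self.linear(x))

class TimeSeriesINR(nn.Module):
    """Maps a time coordinate t in [0, 1] to the value of one series at t."""
    def __init__(self, hidden=64):
        super().__init__()
        self.net = nn.Sequential(SineLayer(1, hidden), SineLayer(hidden, hidden),
                                 nn.Linear(hidden, 1))

    def forward(self, t):
        return self.net(t)

# Fit the INR to one toy series by plain MSE regression on (t, value) pairs.
t = torch.linspace(0, 1, 100).unsqueeze(1)                  # (100, 1) time coordinates
series = torch.sin(6.28 * t) + 0.1 * torch.randn_like(t)    # (100, 1) observed values
inr = TimeSeriesINR()
opt = torch.optim.Adam(inr.parameters(), lr=1e-3)
for _ in range(500):
    opt.zero_grad()
    nn.functional.mse_loss(inr(t), series).backward()
    opt.step()

# Resolution independence: query the fitted representation at 10x the sampling rate.
dense = inr(torch.linspace(0, 1, 1000).unsqueeze(1))
```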
arXiv Detail & Related papers (2022-08-11T14:05:51Z)
- Deep Generative model with Hierarchical Latent Factors for Time Series Anomaly Detection [40.21502451136054]
This work presents DGHL, a new family of generative models for time series anomaly detection.
A top-down Convolution Network maps a novel hierarchical latent space to time series windows, exploiting temporal dynamics to encode information efficiently.
Our method outperformed current state-of-the-art models on four popular benchmark datasets.
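With a decoder-only generative model of windows, anomaly detection typically reduces to how poorly a window can be explained by any latent code. The sketch below uses a small transposed-convolution decoder and scores a test window by its residual after optimizing the latent; DGHL's hierarchical latent construction and training procedure are not reproduced here.

```python
import torch
import torch.nn as nn

class WindowDecoder(nn.Module):
    """Top-down convolutional decoder: latent vector -> time series window."""
    def __init__(self, latent_dim=16, channels=1, window=64):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 32 * (window // 4))
        self.net = nn.Sequential(
            nn.Unflatten(1, (32, window // 4)),
            nn.ConvTranspose1d(32, 16, kernel_size=4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose1d(16, channels, kernel_size=4, stride=2, padding=1),
        )

    def forward(self, z):
        return self.net(self.fc(z))               # (batch, channels, window)

def anomaly_score(decoder, window, steps=100):
    """Score a window by the reconstruction error left after optimizing a latent
    code for it (only z is updated; decoder parameters are not stepped)."""
    z = torch.zeros(1, 16, requires_grad=True)
    opt = torch.optim.Adam([z], lr=0.05)
    for _ in range(steps):
        opt.zero_grad()
        loss = nn.functional.mse_loss(decoder(z), window)
        loss.backward()
        opt.step()
    return loss.item()                            # high residual -> likely anomalous

dec = WindowDecoder()                             # would be trained on normal data in practice
print(anomaly_score(dec, torch.randn(1, 1, 64)))  # toy window, untrained decoder
```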
arXiv Detail & Related papers (2022-02-15T17:19:44Z)
- TTS-GAN: A Transformer-based Time-Series Generative Adversarial Network [4.989480853499916]
Time-series data is one of the most common types of data used in medical machine learning applications.
We introduce TTS-GAN, a transformer-based GAN which can successfully generate realistic synthetic time-series data sequences.
We use visualizations and dimensionality reduction techniques to demonstrate the similarity of real and generated time-series data.
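Such a visual check usually amounts to flattening (or embedding) both sets of sequences and projecting them into two dimensions; the minimal PCA version below is a generic illustration, not the paper's exact protocol.

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.decomposition import PCA

def plot_real_vs_synthetic(real, synthetic):
    """Project flattened real and synthetic sequences into 2-D with PCA.
    real, synthetic: arrays of shape (n_sequences, seq_len * n_features)."""
    pca = PCA(n_components=2).fit(real)
    r2, s2 = pca.transform(real), pca.transform(synthetic)
    plt.scatter(r2[:, 0], r2[:, 1], s=8, alpha=0.5, label="real")
    plt.scatter(s2[:, 0], s2[:, 1], s=8, alpha=0.5, label="synthetic")
    plt.legend()
    plt.title("PCA of real vs. synthetic sequences")
    plt.show()

# Toy call with random data standing in for real and generated batches.
plot_real_vs_synthetic(np.random.randn(200, 96), np.random.randn(200, 96))
```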
arXiv Detail & Related papers (2022-02-06T03:05:47Z)
- Deep Cellular Recurrent Network for Efficient Analysis of Time-Series Data with Spatial Information [52.635997570873194]
This work proposes a novel deep cellular recurrent neural network (DCRNN) architecture to process complex multi-dimensional time series data with spatial information.
The proposed architecture achieves state-of-the-art performance while using substantially fewer trainable parameters than comparable methods in the literature.
arXiv Detail & Related papers (2021-01-12T20:08:18Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it presents and is not responsible for any consequences of its use.