Data-Driven Modeling of Noise Time Series with Convolutional Generative
Adversarial Networks
- URL: http://arxiv.org/abs/2207.01110v3
- Date: Thu, 20 Jul 2023 16:44:47 GMT
- Title: Data-Driven Modeling of Noise Time Series with Convolutional Generative
Adversarial Networks
- Authors: Adam Wunderlich, Jack Sklar
- Abstract summary: Given the recent interest in generative adversarial networks (GANs) for data-driven modeling, it is important to determine to what extent GANs can faithfully reproduce noise in target data sets.
We present an empirical investigation that aims to shed light on this issue for time series.
We find that GANs are capable of learning many noise types, although they predictably struggle when the GAN architecture is not well suited to some aspects of the noise.
- Score: 0.456877715768796
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Random noise arising from physical processes is an inherent characteristic of
measurements and a limiting factor for most signal processing and data analysis
tasks. Given the recent interest in generative adversarial networks (GANs) for
data-driven modeling, it is important to determine to what extent GANs can
faithfully reproduce noise in target data sets. In this paper, we present an
empirical investigation that aims to shed light on this issue for time series.
Namely, we assess two general-purpose GANs for time series that are based on
the popular deep convolutional GAN (DCGAN) architecture: a direct time-series
model and an image-based model that uses a short-time Fourier transform (STFT)
data representation. The GAN models are trained and quantitatively evaluated
using distributions of simulated noise time series with known ground-truth
parameters. Target time series distributions include a broad range of noise
types commonly encountered in physical measurements, electronics, and
communication systems: band-limited thermal noise, power law noise, shot noise,
and impulsive noise. We find that GANs are capable of learning many noise
types, although they predictably struggle when the GAN architecture is not well
suited to some aspects of the noise, e.g., impulsive time series with extreme
outliers. Our findings provide insights into the capabilities and potential
limitations of current approaches to time-series GANs and highlight areas for
further research. In addition, our battery of tests provides a useful benchmark
to aid the development of deep generative models for time series.
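The noise classes and the STFT image representation described above can be illustrated with a short simulation. The following is a minimal sketch, not the authors' code: it assumes NumPy and SciPy, and all parameter values (series length, filter order, pulse shape, spectrogram settings) are illustrative placeholders.

```python
# Minimal sketch (not the authors' code): simulating the noise classes named in
# the abstract and forming an STFT image representation of the kind an
# image-based DCGAN could be trained on. Parameter choices are assumptions.
import numpy as np
from scipy import signal

rng = np.random.default_rng(0)
N, FS = 4096, 1.0  # samples per series and sampling rate (illustrative)

def band_limited_thermal(cutoff=0.1):
    """White Gaussian noise passed through a low-pass filter (band-limited)."""
    b, a = signal.butter(4, cutoff)               # 4th-order Butterworth, normalized cutoff
    return signal.lfilter(b, a, rng.standard_normal(N))

def power_law(alpha=1.0):
    """Noise with power spectral density S(f) ~ 1/f**alpha, shaped in the frequency domain."""
    f = np.fft.rfftfreq(N, d=1 / FS)
    spectrum = np.fft.rfft(rng.standard_normal(N))
    spectrum[1:] /= f[1:] ** (alpha / 2)          # amplitude scales as f**(-alpha/2)
    spectrum[0] = 0.0                              # drop the DC component
    return np.fft.irfft(spectrum, n=N)

def shot(rate=0.01, tau=20.0):
    """Poisson-timed impulses convolved with an exponential pulse shape."""
    impulses = rng.poisson(rate, N).astype(float)
    pulse = np.exp(-np.arange(10 * int(tau)) / tau)
    return np.convolve(impulses, pulse, mode="full")[:N]

def impulsive(df=1.5):
    """Heavy-tailed noise with extreme outliers; Student-t draws as a simple stand-in."""
    return rng.standard_t(df=df, size=N)

# Image representation for the image-based GAN: log-magnitude STFT "spectrogram".
x = power_law(alpha=2.0)                           # e.g., a red-noise target series
f, t, Z = signal.stft(x, fs=FS, nperseg=128)       # complex STFT
stft_image = np.log1p(np.abs(Z))                   # 2-D array treated as an image
print(stft_image.shape)
```

Quantitative evaluation against the known ground truth could then proceed, for example, by re-estimating the power-law exponent or the filter bandwidth from generated samples and comparing it to the target value, in the spirit of the tests described in the abstract.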
Related papers
- TSLANet: Rethinking Transformers for Time Series Representation Learning [19.795353886621715]
Time series data is characterized by its intrinsic long and short-range dependencies.
We introduce a novel Time Series Lightweight Network (TSLANet) as a universal convolutional model for diverse time series tasks.
Our experiments demonstrate that TSLANet outperforms state-of-the-art models in various tasks spanning classification, forecasting, and anomaly detection.
arXiv Detail & Related papers (2024-04-12T13:41:29Z)
- Generative Modeling of Regular and Irregular Time Series Data via Koopman VAEs [50.25683648762602]
We introduce Koopman VAE, a new generative framework that is based on a novel design for the model prior.
Inspired by Koopman theory, we represent the latent conditional prior dynamics using a linear map.
KoVAE outperforms state-of-the-art GAN and VAE methods across several challenging synthetic and real-world time series generation benchmarks.
arXiv Detail & Related papers (2023-10-04T07:14:43Z)
- Generative Time Series Forecasting with Diffusion, Denoise, and Disentanglement [51.55157852647306]
Time series forecasting has been a widely explored task of great importance in many applications.
Real-world time series data are often recorded over a short time period, which leaves a large gap between deep models and the limited, noisy data available.
We propose to address the time series forecasting problem with generative modeling and propose a bidirectional variational auto-encoder equipped with diffusion, denoise, and disentanglement.
arXiv Detail & Related papers (2023-01-08T12:20:46Z)
- TFAD: A Decomposition Time Series Anomaly Detection Architecture with Time-Frequency Analysis [12.867257563413972]
Time series anomaly detection is a challenging problem due to the complex temporal dependencies and the limited label data.
We propose a Time-Frequency analysis based time series Anomaly Detection model, or TFAD, to exploit both time and frequency domains for performance improvement.
arXiv Detail & Related papers (2022-10-18T09:08:57Z)
- HyperTime: Implicit Neural Representation for Time Series [131.57172578210256]
Implicit neural representations (INRs) have recently emerged as a powerful tool that provides an accurate and resolution-independent encoding of data.
In this paper, we analyze the representation of time series using INRs, comparing different activation functions in terms of reconstruction accuracy and training convergence speed.
We propose a hypernetwork architecture that leverages INRs to learn a compressed latent representation of an entire time series dataset.
arXiv Detail & Related papers (2022-08-11T14:05:51Z)
- TTS-GAN: A Transformer-based Time-Series Generative Adversarial Network [4.989480853499916]
Time-series data is one of the most common types of data used in medical machine learning applications.
We introduce TTS-GAN, a transformer-based GAN which can successfully generate realistic synthetic time-series data sequences.
We use visualizations and dimensionality reduction techniques to demonstrate the similarity of real and generated time-series data (a minimal sketch of this kind of comparison follows this list).
arXiv Detail & Related papers (2022-02-06T03:05:47Z)
- Towards Generating Real-World Time Series Data [52.51620668470388]
We propose a novel generative framework for time series data generation - RTSGAN.
RTSGAN learns an encoder-decoder module which provides a mapping between a time series instance and a fixed-dimension latent vector.
To generate time series with missing values, we further equip RTSGAN with an observation embedding layer and a decide-and-generate decoder.
arXiv Detail & Related papers (2021-11-16T11:31:37Z)
- PSA-GAN: Progressive Self Attention GANs for Synthetic Time Series [0.0]
We present PSA-GAN, a generative adversarial network (GAN) that generates long time series samples of high quality.
We show that PSA-GAN can be used to reduce the error in two downstream forecasting tasks over baselines that only use real data.
arXiv Detail & Related papers (2021-08-02T15:30:15Z)
- Deep Autoregressive Models with Spectral Attention [74.08846528440024]
We propose a forecasting architecture that combines deep autoregressive models with a Spectral Attention (SA) module.
By characterizing in the spectral domain the embedding of the time series as occurrences of a random process, our method can identify global trends and seasonality patterns.
Two spectral attention models, global and local to the time series, integrate this information within the forecast and perform spectral filtering to remove noise from the time series.
arXiv Detail & Related papers (2021-07-13T11:08:47Z)
- TadGAN: Time Series Anomaly Detection Using Generative Adversarial Networks [73.01104041298031]
TadGAN is an unsupervised anomaly detection approach built on Generative Adversarial Networks (GANs).
To capture the temporal correlations of time series, we use LSTM Recurrent Neural Networks as base models for Generators and Critics.
To demonstrate the performance and generalizability of our approach, we test several anomaly scoring techniques and report the best-suited one.
arXiv Detail & Related papers (2020-09-16T15:52:04Z)
- Conditional GAN for timeseries generation [0.0]
Time Series GAN (TSGAN) is proposed to model realistic time series data.
We evaluate TSGAN on 70 data sets from a benchmark time series database.
arXiv Detail & Related papers (2020-06-30T02:19:18Z)
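Several of the entries above (e.g., TTS-GAN) assess generative quality by projecting real and generated series with a dimensionality reduction method and inspecting their overlap. The sketch below is a generic illustration of that kind of check using PCA from scikit-learn; the arrays stand in for real data and GAN samples and are not taken from any of the listed papers.

```python
# Illustrative sketch of a real-vs-generated comparison via PCA: project both
# sets of series into a shared low-dimensional space and summarize the overlap.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
real = rng.standard_normal((500, 256))             # placeholder: 500 real series of length 256
generated = rng.standard_normal((500, 256)) * 1.1  # placeholder: samples from a trained GAN

pca = PCA(n_components=2).fit(real)                # fit the projection on real data only
real_2d = pca.transform(real)
gen_2d = pca.transform(generated)

# A simple numerical summary of overlap: distance between projected means,
# relative to the spread of the real data (scatter plots are more common).
gap = np.linalg.norm(real_2d.mean(axis=0) - gen_2d.mean(axis=0))
spread = real_2d.std(axis=0).mean()
print(f"mean gap = {gap:.3f}, real spread = {spread:.3f}")
```

Fitting the projection on the real data alone keeps the comparison one-sided: generated samples are judged in the coordinate system of the real distribution.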