TADA: Temporal Adversarial Data Augmentation for Time Series Data
- URL: http://arxiv.org/abs/2407.15174v1
- Date: Sun, 21 Jul 2024 14:21:00 GMT
- Title: TADA: Temporal Adversarial Data Augmentation for Time Series Data
- Authors: Byeong Tak Lee, Joon-myoung Kwon, Yong-Yeon Jo
- Abstract summary: Domain generalization involves training machine learning models to perform robustly on unseen samples from out-of-distribution datasets.
Adversarial Data Augmentation (ADA) is a commonly used approach that enhances model adaptability by incorporating synthetic samples.
We propose the Temporal Adversarial Data Augmentation for Time Series Data (TADA), which incorporates a time warping technique specifically targeting temporal shifts.
- Score: 1.686373523281992
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Domain generalization involves training machine learning models to perform robustly on unseen samples from out-of-distribution datasets. Adversarial Data Augmentation (ADA) is a commonly used approach that enhances model adaptability by incorporating synthetic samples, designed to simulate potential unseen samples. While ADA effectively addresses amplitude-related distribution shifts, it falls short in managing temporal shifts, which are essential for time series data. To address this limitation, we propose the Temporal Adversarial Data Augmentation for Time Series Data (TADA), which incorporates a time warping technique specifically targeting temporal shifts. Recognizing the challenge of non-differentiability in traditional time warping, we make it differentiable by leveraging phase shifts in the frequency domain. Our evaluations across diverse domains demonstrate that TADA significantly outperforms existing ADA variants, enhancing model performance across time series datasets with varied distributions.
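The key trick the abstract describes is replacing index-based time warping, which is not differentiable, with a phase rotation in the frequency domain. A minimal sketch of that idea follows, using the DFT shift theorem: multiplying the spectrum by exp(-2πi·f·τ) delays the signal by τ samples, and because the phase term is a smooth function of τ, the shift amount can be optimized by gradient descent. This standalone NumPy illustration is an assumption about the mechanism, not the authors' implementation; the function name `frequency_domain_shift` is hypothetical.

```python
import numpy as np

def frequency_domain_shift(x, tau):
    """Circularly delay a 1-D signal by tau samples via a frequency-domain
    phase rotation. Unlike integer indexing, the phase factor varies
    smoothly with tau, so the operation is differentiable w.r.t. tau --
    the property that makes adversarial time warping trainable."""
    n = len(x)
    freqs = np.fft.fftfreq(n)                     # frequency of each bin, cycles/sample
    spectrum = np.fft.fft(x)
    phase = np.exp(-2j * np.pi * freqs * tau)     # DFT shift theorem
    return np.real(np.fft.ifft(spectrum * phase))

# For an integer tau the result coincides with a circular roll of the signal;
# for fractional tau it interpolates smoothly between sample positions.
signal = np.sin(2 * np.pi * np.arange(64) / 16)
shifted = frequency_domain_shift(signal, 3)
```

In an adversarial-augmentation loop, τ would be a learnable parameter (e.g. a `torch.nn.Parameter`) updated to maximize the task loss, producing worst-case temporally shifted training samples.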
Related papers
- Robust Multivariate Time Series Forecasting against Intra- and Inter-Series Transitional Shift [40.734564394464556]
We present a unified Probabilistic Graphical Model (JointPGM) to jointly capture intra-/inter-series correlations and model the time-variant transitional distribution.
We validate the effectiveness and efficiency of JointPGM through extensive experiments on six highly non-stationary MTS datasets.
arXiv Detail & Related papers (2024-07-18T06:16:03Z) - tPARAFAC2: Tracking evolving patterns in (incomplete) temporal data [0.7285444492473742]
We introduce t(emporal)PARAFAC2 which utilizes temporal smoothness regularization on the evolving factors.
Our numerical experiments on both simulated and real datasets demonstrate the effectiveness of the temporal smoothness regularization.
arXiv Detail & Related papers (2024-07-01T15:10:55Z) - DeformTime: Capturing Variable Dependencies with Deformable Attention for Time Series Forecasting [0.34530027457862006]
We present DeformTime, a neural network architecture that attempts to capture correlated temporal patterns from the input space.
We conduct extensive experiments on 6 MTS data sets, using previously established benchmarks as well as challenging infectious disease modelling tasks.
Results demonstrate that DeformTime improves accuracy against previous competitive methods across the vast majority of MTS forecasting tasks.
arXiv Detail & Related papers (2024-06-11T16:45:48Z) - PeFAD: A Parameter-Efficient Federated Framework for Time Series Anomaly Detection [51.20479454379662]
In light of increasing privacy concerns, we propose a parameter-efficient Federated Anomaly Detection framework named PeFAD.
We conduct extensive evaluations on four real datasets, where PeFAD outperforms existing state-of-the-art baselines by up to 28.74%.
arXiv Detail & Related papers (2024-06-04T13:51:08Z) - Unified Training of Universal Time Series Forecasting Transformers [104.56318980466742]
We present Moirai, a masked encoder-based universal time series forecasting Transformer.
Moirai is trained on our newly introduced Large-scale Open Time Series Archive (LOTSA) featuring over 27B observations across nine domains.
Moirai achieves competitive or superior performance as a zero-shot forecaster when compared to full-shot models.
arXiv Detail & Related papers (2024-02-04T20:00:45Z) - Data Attribution for Diffusion Models: Timestep-induced Bias in Influence Estimation [58.20016784231991]
Diffusion models operate over a sequence of timesteps rather than the instantaneous input-output relationships of previous settings.
We present Diffusion-TracIn, which incorporates these temporal dynamics, and observe that samples' loss gradient norms are highly dependent on timestep.
We introduce Diffusion-ReTrac as a re-normalized adaptation that enables the retrieval of training samples more targeted to the test sample of interest.
arXiv Detail & Related papers (2024-01-17T07:58:18Z) - UniTime: A Language-Empowered Unified Model for Cross-Domain Time Series Forecasting [59.11817101030137]
This research advocates for a unified model paradigm that transcends domain boundaries.
Learning an effective cross-domain model presents the following challenges.
We propose UniTime for effective cross-domain time series learning.
arXiv Detail & Related papers (2023-10-15T06:30:22Z) - Towards Diverse and Coherent Augmentation for Time-Series Forecasting [22.213927377926804]
Time-series data augmentations mitigate the issue of insufficient training data for deep learning models.
We propose to combine Spectral and Time Augmentation for generating more diverse and coherent samples.
Experiments on five real-world time-series datasets demonstrate that STAug outperforms the base models without data augmentation.
arXiv Detail & Related papers (2023-03-24T19:40:34Z) - Generative Time Series Forecasting with Diffusion, Denoise, and Disentanglement [51.55157852647306]
Time series forecasting has been a widely explored task of great importance in many applications.
It is common that real-world time series data are recorded in a short time period, which results in a big gap between the deep model and the limited and noisy time series.
We address the time series forecasting problem with generative modeling, proposing a bidirectional variational auto-encoder equipped with diffusion, denoise, and disentanglement.
arXiv Detail & Related papers (2023-01-08T12:20:46Z) - TimeVAE: A Variational Auto-Encoder for Multivariate Time Series Generation [6.824692201913679]
We propose a novel architecture for synthetically generating time-series data using Variational Auto-Encoders (VAEs).
The proposed architecture has several distinct properties: interpretability, ability to encode domain knowledge, and reduced training times.
arXiv Detail & Related papers (2021-11-15T21:42:14Z) - Learning summary features of time series for likelihood free inference [93.08098361687722]
We present a data-driven strategy for automatically learning summary features from time series data.
Our results indicate that learning summary features from data can compete and even outperform LFI methods based on hand-crafted values.
arXiv Detail & Related papers (2020-12-04T19:21:37Z)
This list is automatically generated from the titles and abstracts of the papers in this site.