Time Series Data Augmentation for Deep Learning: A Survey
- URL: http://arxiv.org/abs/2002.12478v4
- Date: Thu, 31 Mar 2022 18:22:00 GMT
- Title: Time Series Data Augmentation for Deep Learning: A Survey
- Authors: Qingsong Wen, Liang Sun, Fan Yang, Xiaomin Song, Jingkun Gao, Xue Wang, Huan Xu
- Abstract summary: We systematically review different data augmentation methods for time series data.
We empirically compare different data augmentation methods for different tasks including time series classification, anomaly detection, and forecasting.
- Score: 35.2161833151567
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Deep learning has recently performed remarkably well on many time series analysis tasks. The superior performance of deep neural networks relies heavily on large amounts of training data to avoid overfitting. However, labeled data for many real-world time series applications may be limited, such as classification in medical time series and anomaly detection in AIOps. As an
effective way to enhance the size and quality of the training data, data
augmentation is crucial to the successful application of deep learning models
on time series data. In this paper, we systematically review different data
augmentation methods for time series. We propose a taxonomy for the reviewed
methods, and then provide a structured review for these methods by highlighting
their strengths and limitations. We also empirically compare different data
augmentation methods for different tasks including time series classification,
anomaly detection, and forecasting. Finally, we discuss and highlight five
future directions to provide useful research guidance.
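To give a concrete feel for the basic time-domain transforms that surveys of this kind cover, here is a minimal sketch in Python/NumPy. The function names (`jitter`, `scale`, `window_slice`) and parameter defaults are illustrative choices, not the paper's own implementation:

```python
import numpy as np

def jitter(x, sigma=0.03):
    """Add independent Gaussian noise to each time step."""
    return x + np.random.normal(0.0, sigma, size=x.shape)

def scale(x, sigma=0.1):
    """Multiply the whole series by one random scalar."""
    return x * np.random.normal(1.0, sigma)

def window_slice(x, ratio=0.9):
    """Crop a random contiguous window and stretch it back to the
    original length via linear interpolation (a simple warp)."""
    n = len(x)
    win = max(2, int(n * ratio))
    start = np.random.randint(0, n - win + 1)
    sliced = x[start:start + win]
    return np.interp(np.linspace(0, win - 1, n), np.arange(win), sliced)

# Chain several augmentations on a toy series.
series = np.sin(np.linspace(0, 4 * np.pi, 100))
augmented = window_slice(scale(jitter(series)))
```

Each transform preserves the series length, so augmented samples can be fed to the same network as the originals.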
Related papers
- VSFormer: Value and Shape-Aware Transformer with Prior-Enhanced Self-Attention for Multivariate Time Series Classification [47.92529531621406]
We propose a novel method, VSFormer, that incorporates both discriminative patterns (shape) and numerical information (value)
In addition, we extract class-specific prior information derived from supervised information to enrich the positional encoding.
Extensive experiments on all 30 UEA archived datasets demonstrate the superior performance of our method compared to SOTA models.
arXiv Detail & Related papers (2024-12-21T07:31:22Z)
- Learning from Neighbors: Category Extrapolation for Long-Tail Learning [62.30734737735273]
We offer a novel perspective on long-tail learning, inspired by an observation: datasets with finer granularity tend to be less affected by data imbalance.
We introduce open-set auxiliary classes that are visually similar to existing ones, aiming to enhance representation learning for both head and tail classes.
To prevent the overwhelming presence of auxiliary classes from disrupting training, we introduce a neighbor-silencing loss.
arXiv Detail & Related papers (2024-10-21T13:06:21Z)
- Deep Time Series Models: A Comprehensive Survey and Benchmark [74.28364194333447]
Time series data is of great significance in real-world scenarios.
Recent years have witnessed remarkable breakthroughs in the time series community.
We release Time Series Library (TSLib) as a fair benchmark of deep time series models for diverse analysis tasks.
arXiv Detail & Related papers (2024-07-18T08:31:55Z)
- Pushing the Limits of Pre-training for Time Series Forecasting in the CloudOps Domain [54.67888148566323]
We introduce three large-scale time series forecasting datasets from the cloud operations domain.
We show it is a strong zero-shot baseline and benefits from further scaling, both in model and dataset size.
Accompanying these datasets and results is a suite of comprehensive benchmark results comparing classical and deep learning baselines to our pre-trained method.
arXiv Detail & Related papers (2023-10-08T08:09:51Z)
- Time Series Contrastive Learning with Information-Aware Augmentations [57.45139904366001]
A key component of contrastive learning is to select appropriate augmentations imposing some priors to construct feasible positive samples.
How to find the desired augmentations of time series data that are meaningful for given contrastive learning tasks and datasets remains an open question.
We propose a new contrastive learning approach with information-aware augmentations, InfoTS, that adaptively selects optimal augmentations for time series representation learning.
arXiv Detail & Related papers (2023-03-21T15:02:50Z)
- Deep networks for system identification: a Survey [56.34005280792013]
System identification learns mathematical descriptions of dynamic systems from input-output data.
The main aim of the identified model is to predict new data from previous observations.
We discuss architectures commonly adopted in the literature, like feedforward, convolutional, and recurrent networks.
arXiv Detail & Related papers (2023-01-30T12:38:31Z)
- Empirical Evaluation of Data Augmentations for Biobehavioral Time Series Data with Deep Learning [16.84326709739788]
Data augmentation (DA) is a critical step for the success of deep learning models on biobehavioral time series data.
We first systematically review eight basic DA methods for biobehavioral time series data, and evaluate the effects on seven datasets with three backbones.
Next, we explore adapting more recent DA techniques to biobehavioral time series data by designing a new policy architecture.
arXiv Detail & Related papers (2022-10-13T03:40:12Z)
- Data Augmentation techniques in time series domain: A survey and taxonomy [0.20971479389679332]
Deep neural networks used to process time series heavily depend on the size and consistency of the datasets used in training.
This work systematically reviews the current state-of-the-art in the area to provide an overview of all available algorithms.
The ultimate aim of this study is to provide a summary of the evolution and performance of areas that produce better results to guide future researchers in this field.
arXiv Detail & Related papers (2022-06-25T17:09:00Z)
- Time Series Data Imputation: A Survey on Deep Learning Approaches [4.4458738910060775]
Time series data imputation is a well-studied problem with different categories of methods.
Time series imputation methods based on deep learning have made progress with the use of models like RNNs.
We review and discuss their model architectures, their pros and cons, and their effects to show the development of time series imputation methods.
arXiv Detail & Related papers (2020-11-23T11:57:27Z)
- Deep learning for time series classification [2.0305676256390934]
Time series analysis allows us to visualize and understand the evolution of a process over time.
Time series classification consists of constructing algorithms dedicated to automatically labeling time series data.
Deep learning has emerged as one of the most effective methods for tackling the supervised classification task.
arXiv Detail & Related papers (2020-10-01T17:38:40Z)
- An Empirical Survey of Data Augmentation for Time Series Classification with Neural Networks [17.20906062729132]
We survey data augmentation techniques for time series and their application to time series classification with neural networks.
We propose a taxonomy and outline the four families in time series data augmentation, including transformation-based methods, pattern mixing, generative models, and decomposition methods.
We empirically evaluate 12 time series data augmentation methods on 128 time series classification datasets with six different types of neural networks.
arXiv Detail & Related papers (2020-07-31T10:33:54Z)
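Of the four families named in the last survey above, the pattern-mixing family can be sketched as a mixup-style convex combination of two same-class series. This is an illustrative example under my own naming (`mixup_series`), not the implementation evaluated in any of the listed papers:

```python
import numpy as np

def mixup_series(x1, x2, alpha=0.2):
    """Pattern mixing: blend two same-class series with a random
    mixing weight lam drawn from Beta(alpha, alpha)."""
    lam = np.random.beta(alpha, alpha)
    return lam * x1 + (1.0 - lam) * x2

# Blend two toy series of the same length.
a = np.sin(np.linspace(0, 2 * np.pi, 50))
b = np.cos(np.linspace(0, 2 * np.pi, 50))
mixed = mixup_series(a, b)
```

Because the combination is convex, every point of the mixed series lies between the corresponding points of the two inputs, which keeps the augmented sample on a plausible scale.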
This list is automatically generated from the titles and abstracts of the papers on this site.