Don't overfit the history -- Recursive time series data augmentation
- URL: http://arxiv.org/abs/2207.02891v1
- Date: Wed, 6 Jul 2022 18:09:50 GMT
- Title: Don't overfit the history -- Recursive time series data augmentation
- Authors: Amine Mohamed Aboussalah, Min-Jae Kwon, Raj G Patel, Cheng Chi,
Chi-Guhn Lee
- Abstract summary: We introduce a general framework for time series augmentation, which we call the Recursive Interpolation Method (RIM).
We perform a theoretical analysis to characterize RIM and to guarantee its test performance.
We apply RIM to diverse real-world time series and achieve strong performance over non-augmented data on regression, classification, and reinforcement learning tasks.
- Score: 17.31522835086563
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Time series observations can be seen as realizations of an underlying
dynamical system governed by rules that we typically do not know. For time
series learning tasks, we must recognize that we fit our model on the available
data, which is a single realized history. Training on a single realization
often induces severe overfitting and poor generalization. To address this
issue, we introduce a general recursive framework for time series augmentation,
which we call the Recursive Interpolation Method (RIM). New samples are
generated by a recursive interpolation function of all previous values, so that
the augmented samples preserve the inherent dynamics of the original time
series. We perform a theoretical analysis to characterize RIM and to guarantee
its test performance. We apply RIM to diverse real-world time series and
achieve strong performance over non-augmented data on regression,
classification, and reinforcement learning tasks.
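To make the recursion concrete, below is a minimal sketch of one natural
reading of the abstract: a fixed interpolation weight sigma blends each new
observation with the running augmented value, so every augmented point depends,
with geometrically decaying weights, on all previous observations. The specific
update rule and the parameter name sigma are illustrative assumptions, not
necessarily the paper's exact formulation.

```python
import numpy as np

def recursive_interpolation(x, sigma):
    """Hypothetical sketch of a recursive-interpolation augmentation.

    Assumed update (an illustrative reading, not the paper's verified rule):
        x_aug[0] = x[0]
        x_aug[t] = sigma * x[t] + (1 - sigma) * x_aug[t - 1]

    Because the recursion folds in the running value, each augmented point is
    an exponentially weighted blend of all previous observations, and
    sigma = 1 recovers the original series exactly.
    """
    x = np.asarray(x, dtype=float)
    x_aug = np.empty_like(x)
    x_aug[0] = x[0]
    for t in range(1, len(x)):
        x_aug[t] = sigma * x[t] + (1.0 - sigma) * x_aug[t - 1]
    return x_aug

# Generate several augmented copies of a single realized history by sweeping
# sigma; sigma = 1.0 reproduces the original path.
rng = np.random.default_rng(0)
history = np.cumsum(rng.normal(size=200))  # one realized sample path
augmented = [recursive_interpolation(history, s) for s in (0.7, 0.8, 0.9, 1.0)]
```

Sweeping sigma toward 1 yields samples that stay increasingly close to the
realized history, which matches the stated goal of enlarging the training set
while preserving the original dynamics.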
Related papers
- Unified Training of Universal Time Series Forecasting Transformers [104.56318980466742]
We present a Masked Encoder-based Universal Time Series Forecasting Transformer (Moirai).
Moirai is trained on our newly introduced Large-scale Open Time Series Archive (LOTSA) featuring over 27B observations across nine domains.
Moirai achieves competitive or superior performance as a zero-shot forecaster when compared to full-shot models.
arXiv Detail & Related papers (2024-02-04T20:00:45Z) - TimeSiam: A Pre-Training Framework for Siamese Time-Series Modeling [67.02157180089573]
Time series pre-training has recently garnered wide attention for its potential to reduce labeling expenses and benefit various downstream tasks.
This paper proposes TimeSiam as a simple but effective self-supervised pre-training framework for time series based on Siamese networks.
arXiv Detail & Related papers (2024-02-04T13:10:51Z) - A Bag of Receptive Fields for Time Series Extrinsic Predictions [8.172425535905038]
High-dimensional time series data poses challenges due to its dynamic nature, varying lengths, and presence of missing values.
We propose BORF, a Bag-Of-Receptive-Fields model, which incorporates notions from time series convolution and 1D-SAX.
We evaluate BORF on Time Series Classification and Time Series Extrinsic Regression tasks using the full UEA and UCR repositories.
arXiv Detail & Related papers (2023-11-29T19:13:10Z) - Time-LLM: Time Series Forecasting by Reprogramming Large Language Models [110.20279343734548]
Time series forecasting holds significant importance in many real-world dynamic systems.
We present Time-LLM, a reprogramming framework to repurpose large language models for time series forecasting.
Time-LLM is a powerful time series learner that outperforms state-of-the-art, specialized forecasting models.
arXiv Detail & Related papers (2023-10-03T01:31:25Z) - MADS: Modulated Auto-Decoding SIREN for time series imputation [9.673093148930874]
We propose MADS, a novel auto-decoding framework for time series imputation, built upon implicit neural representations.
We evaluate our model on two real-world datasets, and show that it outperforms state-of-the-art methods for time series imputation.
arXiv Detail & Related papers (2023-07-03T09:08:47Z) - Unsupervised Feature Based Algorithms for Time Series Extrinsic Regression [0.9659642285903419]
Time Series Extrinsic Regression (TSER) involves using a set of training time series to form a predictive model of a continuous response variable.
The DrCIF and FreshPRINCE models are the only ones that significantly outperform the standard rotation forest regressor.
arXiv Detail & Related papers (2023-05-02T13:58:20Z) - STING: Self-attention based Time-series Imputation Networks using GAN [4.052758394413726]
We propose STING (Self-attention based Time-series Imputation Networks using GAN).
We take advantage of generative adversarial networks and bidirectional recurrent neural networks to learn latent representations of the time series.
Experimental results on three real-world datasets demonstrate that STING outperforms the existing state-of-the-art methods in terms of imputation accuracy.
arXiv Detail & Related papers (2022-09-22T06:06:56Z) - HyperTime: Implicit Neural Representation for Time Series [131.57172578210256]
Implicit neural representations (INRs) have recently emerged as a powerful tool that provides an accurate and resolution-independent encoding of data.
In this paper, we analyze the representation of time series using INRs, comparing different activation functions in terms of reconstruction accuracy and training convergence speed.
We propose a hypernetwork architecture that leverages INRs to learn a compressed latent representation of an entire time series dataset (a minimal SIREN-style sketch of the INR idea follows this list).
arXiv Detail & Related papers (2022-08-11T14:05:51Z) - Cluster-and-Conquer: A Framework For Time-Series Forecasting [94.63501563413725]
We propose a three-stage framework for forecasting high-dimensional time-series data.
Our framework is highly general, allowing for any time-series forecasting and clustering method to be used in each step.
When instantiated with simple linear autoregressive models, we are able to achieve state-of-the-art results on several benchmark datasets.
arXiv Detail & Related papers (2021-10-26T20:41:19Z) - Abstractive Query Focused Summarization with Query-Free Resources [60.468323530248945]
In this work, we consider the problem of leveraging only generic summarization resources to build an abstractive QFS system.
We propose Marge, a Masked ROUGE Regression framework composed of a novel unified representation for summaries and queries.
Despite learning from minimal supervision, our system achieves state-of-the-art results in the distantly supervised setting.
arXiv Detail & Related papers (2020-12-29T14:39:35Z) - Time Series Extrinsic Regression [6.5513221781395465]
Time Series Extrinsic Regression (TSER) is a regression task whose aim is to learn the relationship between a time series and a continuous scalar variable.
We benchmark existing solutions and adaptations of TSC algorithms on a novel archive of 19 TSER datasets.
Our results show that the state-of-the-art TSC algorithm Rocket, when adapted for regression, achieves the highest overall accuracy.
arXiv Detail & Related papers (2020-06-23T00:15:10Z)
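To make the INR idea in the HyperTime entry concrete, here is a minimal
SIREN-style implicit representation of a single time series: a small MLP with
sine activations maps a normalized timestamp t in [0, 1] to a value. This is a
generic sketch of sinusoidal INRs, not HyperTime's actual architecture or its
hypernetwork; the omega0 = 30 frequency scaling follows the common SIREN
convention.

```python
import torch
import torch.nn as nn

class SineLayer(nn.Module):
    """SIREN-style sinusoidal layer: y = sin(omega0 * (W x + b))."""
    def __init__(self, in_features, out_features, omega0=30.0):
        super().__init__()
        self.omega0 = omega0
        self.linear = nn.Linear(in_features, out_features)

    def forward(self, x):
        return torch.sin(self.omega0 * self.linear(x))

class TimeSeriesINR(nn.Module):
    """Maps a timestamp t in [0, 1] to a series value; fit one per series."""
    def __init__(self, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            SineLayer(1, hidden),
            SineLayer(hidden, hidden),
            nn.Linear(hidden, 1),
        )

    def forward(self, t):
        return self.net(t)

# Fit the INR to one toy series by regressing values on normalized timestamps.
t = torch.linspace(0, 1, 256).unsqueeze(-1)        # (256, 1) timestamps
y = torch.sin(12 * t) + 0.1 * torch.randn_like(t)  # toy target series
model = TimeSeriesINR()
opt = torch.optim.Adam(model.parameters(), lr=1e-4)
for _ in range(500):
    opt.zero_grad()
    loss = ((model(t) - y) ** 2).mean()
    loss.backward()
    opt.step()
```

Because the input is a continuous timestamp rather than an integer index, the
fitted representation can be queried at any resolution, which is the
resolution-independence property the HyperTime entry refers to.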