SimTS: Rethinking Contrastive Representation Learning for Time Series
Forecasting
- URL: http://arxiv.org/abs/2303.18205v1
- Date: Fri, 31 Mar 2023 16:59:40 GMT
- Title: SimTS: Rethinking Contrastive Representation Learning for Time Series
Forecasting
- Authors: Xiaochen Zheng and Xingyu Chen and Manuel Schürch and Amina Mollaysa
and Ahmed Allam and Michael Krauthammer
- Abstract summary: We propose SimTS, a representation learning approach for improving time series forecasting.
SimTS does not rely on negative pairs or specific assumptions about the characteristics of the particular time series.
Our experiments show that SimTS achieves competitive performance compared to existing contrastive learning methods.
- Score: 10.987229904763609
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Contrastive learning methods have shown an impressive ability to learn
meaningful representations for image or time series classification. However,
these methods are less effective for time series forecasting, as optimization
of instance discrimination is not directly applicable to predicting the future
state from the history context. Moreover, the construction of positive and
negative pairs in current technologies strongly relies on specific time series
characteristics, restricting their generalization across diverse types of time
series data. To address these limitations, we propose SimTS, a simple
representation learning approach for improving time series forecasting by
learning to predict the future from the past in the latent space. SimTS does
not rely on negative pairs or specific assumptions about the characteristics of
the particular time series. Our extensive experiments on several benchmark time
series forecasting datasets show that SimTS achieves competitive performance
compared to existing contrastive learning methods. Furthermore, we show the
shortcomings of the current contrastive learning framework used for time series
forecasting through a detailed ablation study. Overall, our work suggests that
SimTS is a promising alternative to other contrastive learning approaches for
time series forecasting.
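As a rough illustration of the core idea, the following is a minimal, hypothetical sketch (not the authors' implementation) of predicting the future from the past in latent space without negative pairs: a shared encoder maps the past and future windows to latents, a predictor head maps the past latent to a predicted future latent, and the loss is a negative cosine similarity against the encoded future. The linear encoder, predictor `P`, and dimensions are illustrative placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(x, W):
    # Toy linear encoder: maps a window of length T to a d-dim latent vector.
    return np.tanh(W @ x)

def predictive_loss(z_pred, z_future):
    # Negative cosine similarity between the predicted future latent and
    # the encoded future window; no negative pairs are involved.
    a = z_pred / np.linalg.norm(z_pred)
    b = z_future / np.linalg.norm(z_future)
    return -float(a @ b)

T, d = 16, 8
W = rng.normal(size=(d, T))   # shared encoder weights (hypothetical)
P = rng.normal(size=(d, d))   # predictor head (hypothetical)

series = rng.normal(size=2 * T)
past, future = series[:T], series[T:]

z_future = encode(future, W)       # target latent (would be stop-gradient in training)
z_pred = P @ encode(past, W)       # predict the future latent from the past
loss = predictive_loss(z_pred, z_future)
```

Training would minimize this loss over many (past, future) window pairs, pulling the predicted and actual future latents together without any instance-discrimination objective.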
Related papers
- Frequency-Masked Embedding Inference: A Non-Contrastive Approach for Time Series Representation Learning [0.38366697175402226]
This paper introduces Frequency-masked Embedding Inference (FEI), a novel non-contrastive method that completely eliminates the need for positive and negative samples.
FEI significantly outperforms existing contrastive-based methods in terms of generalization.
This study provides new insights into self-supervised representation learning for time series.
arXiv Detail & Related papers (2024-12-30T08:12:17Z)
- TimeSiam: A Pre-Training Framework for Siamese Time-Series Modeling [67.02157180089573]
Time series pre-training has recently garnered wide attention for its potential to reduce labeling expenses and benefit various downstream tasks.
This paper proposes TimeSiam, a simple but effective self-supervised pre-training framework for time series based on Siamese networks.
arXiv Detail & Related papers (2024-02-04T13:10:51Z)
- Distillation Enhanced Time Series Forecasting Network with Momentum Contrastive Learning [7.4106801792345705]
We propose DE-TSMCL, an innovative distillation enhanced framework for long sequence time series forecasting.
Specifically, we design a learnable data augmentation mechanism which adaptively learns whether to mask a timestamp.
Then, we propose a contrastive learning task with momentum update to explore inter-sample and intra-temporal correlations of time series.
By combining losses from multiple tasks, we learn effective representations for the downstream forecasting task.
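As a simplified illustration of the masking augmentation described above, the sketch below zeroes out random timestamps with a fixed probability; in DE-TSMCL the masking decision is learned rather than fixed, so this is a non-learnable stand-in, not the paper's mechanism.

```python
import numpy as np

rng = np.random.default_rng(1)

def mask_timestamps(x, mask_prob):
    # Randomly zero out individual timestamps of a series.
    # Returns the augmented series and the boolean keep-mask
    # (True = timestamp kept, False = timestamp masked).
    keep = rng.random(x.shape) >= mask_prob
    return x * keep, keep

x = rng.normal(size=32)
x_aug, keep = mask_timestamps(x, mask_prob=0.2)
```

A contrastive objective would then treat differently masked views of the same series as a positive pair.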
arXiv Detail & Related papers (2024-01-31T12:52:10Z)
- Contrastive Difference Predictive Coding [79.74052624853303]
We introduce a temporal difference version of contrastive predictive coding that stitches together pieces of different time series data to decrease the amount of data required to learn predictions of future events.
We apply this representation learning method to derive an off-policy algorithm for goal-conditioned RL.
arXiv Detail & Related papers (2023-10-31T03:16:32Z)
- OpenSTL: A Comprehensive Benchmark of Spatio-Temporal Predictive Learning [67.07363529640784]
We propose OpenSTL to categorize prevalent approaches into recurrent-based and recurrent-free models.
We conduct standard evaluations on datasets across various domains, including synthetic moving object trajectory, human motion, driving scenes, traffic flow and forecasting weather.
We find that recurrent-free models achieve a better balance between efficiency and performance than recurrent models.
arXiv Detail & Related papers (2023-06-20T03:02:14Z)
- Self-Supervised Learning for Time Series Analysis: Taxonomy, Progress, and Prospects [84.6945070729684]
Self-supervised learning (SSL) has recently achieved impressive performance on various time series tasks.
This article reviews current state-of-the-art SSL methods for time series data.
arXiv Detail & Related papers (2023-06-16T18:23:10Z)
- Time Series Contrastive Learning with Information-Aware Augmentations [57.45139904366001]
A key component of contrastive learning is to select appropriate augmentations imposing some priors to construct feasible positive samples.
How to find the desired augmentations of time series data that are meaningful for given contrastive learning tasks and datasets remains an open question.
We propose a new contrastive learning approach with information-aware augmentations, InfoTS, that adaptively selects optimal augmentations for time series representation learning.
arXiv Detail & Related papers (2023-03-21T15:02:50Z)
- VQ-AR: Vector Quantized Autoregressive Probabilistic Time Series Forecasting [10.605719154114354]
Time series models aim for accurate predictions of the future given the past, where the forecasts are used for important downstream tasks like business decision making.
In this paper, we introduce a novel autoregressive architecture, VQ-AR, which instead learns a discrete set of representations that are used to predict the future.
arXiv Detail & Related papers (2022-05-31T15:43:46Z)
- Using Time-Series Privileged Information for Provably Efficient Learning of Prediction Models [6.7015527471908625]
We study prediction of future outcomes with supervised models that use privileged information during learning.
Privileged information comprises samples of the time series observed between the baseline time of prediction and the future outcome.
We show that our approach is generally preferable to classical learning, particularly when data is scarce.
arXiv Detail & Related papers (2021-10-28T10:07:29Z)
- Series Saliency: Temporal Interpretation for Multivariate Time Series Forecasting [30.054015098590874]
We present the series saliency framework for temporal interpretation for time series forecasting.
By extracting "series images" from sliding windows of the time series, we apply saliency map segmentation.
Our framework generates temporal interpretations for the time series forecasting task while producing accurate time series forecasts.
arXiv Detail & Related papers (2020-12-16T23:48:00Z)
- Predicting Temporal Sets with Deep Neural Networks [50.53727580527024]
We propose an integrated solution based on the deep neural networks for temporal sets prediction.
A unique perspective is to learn element relationships by constructing a set-level co-occurrence graph.
We design an attention-based module to adaptively learn the temporal dependency of elements and sets.
arXiv Detail & Related papers (2020-06-20T03:29:02Z)
This list is automatically generated from the titles and abstracts of the papers in this site.