SimTS: Rethinking Contrastive Representation Learning for Time Series
Forecasting
- URL: http://arxiv.org/abs/2303.18205v1
- Date: Fri, 31 Mar 2023 16:59:40 GMT
- Title: SimTS: Rethinking Contrastive Representation Learning for Time Series
Forecasting
- Authors: Xiaochen Zheng and Xingyu Chen and Manuel Schürch and Amina Mollaysa
and Ahmed Allam and Michael Krauthammer
- Abstract summary: We propose SimTS, a representation learning approach for improving time series forecasting.
SimTS does not rely on negative pairs or specific assumptions about the characteristics of the particular time series.
Our experiments show that SimTS achieves competitive performance compared to existing contrastive learning methods.
- Score: 10.987229904763609
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Contrastive learning methods have shown an impressive ability to learn
meaningful representations for image or time series classification. However,
these methods are less effective for time series forecasting, as optimization
of instance discrimination is not directly applicable to predicting the future
state from the history context. Moreover, the construction of positive and
negative pairs in current methods relies heavily on specific time series
characteristics, restricting their generalization across diverse types of time
series data. To address these limitations, we propose SimTS, a simple
representation learning approach for improving time series forecasting by
learning to predict the future from the past in the latent space. SimTS does
not rely on negative pairs or specific assumptions about the characteristics of
the particular time series. Our extensive experiments on several benchmark time
series forecasting datasets show that SimTS achieves competitive performance
compared to existing contrastive learning methods. Furthermore, we show the
shortcomings of the current contrastive learning framework used for time series
forecasting through a detailed ablation study. Overall, our work suggests that
SimTS is a promising alternative to other contrastive learning approaches for
time series forecasting.
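The core idea of the abstract, predicting the future representation from the history representation in latent space, with no negative pairs, can be sketched as a toy forward pass. This is a minimal illustration, not the paper's implementation: the linear-plus-tanh `encoder`, the weight shapes, and the window sizes are all hypothetical stand-ins (SimTS uses a learned convolutional encoder), but the siamese structure, the predictor, and the negative cosine similarity objective follow the description above.

```python
import numpy as np

rng = np.random.default_rng(0)

def encoder(x, W):
    # Toy shared (siamese) encoder: linear map + tanh nonlinearity.
    # Stands in for the convolutional encoder a real implementation would use.
    return np.tanh(x @ W)

def neg_cosine(p, z):
    # Negative cosine similarity between the predicted and the encoded future;
    # minimising it pulls the prediction toward the true future representation.
    p = p / np.linalg.norm(p)
    z = z / np.linalg.norm(z)
    return -float(p @ z)

# A univariate series split into a history window and a future window.
series = rng.normal(size=16)
history, future = series[:8], series[8:]

d = 4
W = rng.normal(scale=0.5, size=(8, d))   # shared encoder weights
P = rng.normal(scale=0.5, size=(d, d))   # predictor weights

z_history = encoder(history, W)
z_future = encoder(future, W)            # stop-gradient branch during training
prediction = z_history @ P               # predict the future latent from the past
loss = neg_cosine(prediction, z_future)  # training objective; always in [-1, 1]
```

Because the objective only pulls a positive pair together (with a stop-gradient on the future branch), no negative samples, and hence no assumptions about which series make "good" negatives, are needed.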
Related papers
- Probing the Robustness of Time-series Forecasting Models with
CounterfacTS [1.823020744088554]
We present and publicly release CounterfacTS, a tool to probe the robustness of deep learning models in time-series forecasting tasks.
CounterfacTS has a user-friendly interface that allows the user to visualize, compare and quantify time series data and their forecasts.
arXiv Detail & Related papers (2024-03-06T07:34:47Z)
- TimeSiam: A Pre-Training Framework for Siamese Time-Series Modeling [67.02157180089573]
Time series pre-training has recently garnered wide attention for its potential to reduce labeling expenses and benefit various downstream tasks.
This paper proposes TimeSiam as a simple but effective self-supervised pre-training framework for Time series based on Siamese networks.
arXiv Detail & Related papers (2024-02-04T13:10:51Z)
- Distillation Enhanced Time Series Forecasting Network with Momentum Contrastive Learning [7.4106801792345705]
We propose DE-TSMCL, an innovative distillation enhanced framework for long sequence time series forecasting.
Specifically, we design a learnable data augmentation mechanism which adaptively learns whether to mask a timestamp.
Then, we propose a contrastive learning task with momentum update to explore inter-sample and intra-temporal correlations of time series.
By developing model loss from multiple tasks, we can learn effective representations for downstream forecasting task.
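The "momentum update" mentioned above is a standard trick (popularised by MoCo-style contrastive learning): a second encoder's weights track an exponential moving average of the trained encoder's weights rather than receiving gradients. A generic sketch, with illustrative names not taken from the DE-TSMCL paper:

```python
import numpy as np

def momentum_update(online_w, target_w, m=0.99):
    # EMA update: the target (momentum) encoder drifts slowly toward the
    # online encoder instead of sharing its gradients.
    return m * target_w + (1.0 - m) * online_w

online = np.ones((3, 3))    # weights of the gradient-trained encoder
target = np.zeros((3, 3))   # weights of the momentum encoder

for _ in range(10):         # ten update steps with the online weights frozen
    target = momentum_update(online, target)

# After 10 steps the target has moved a fraction 1 - 0.99**10 (about 0.096)
# of the way toward the online weights.
```

The slowly moving target encoder provides stable keys for the contrastive task, which is what makes inter-sample comparisons across training steps consistent.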
arXiv Detail & Related papers (2024-01-31T12:52:10Z)
- Contrastive Difference Predictive Coding [79.74052624853303]
We introduce a temporal difference version of contrastive predictive coding that stitches together pieces of different time series data to decrease the amount of data required to learn predictions of future events.
We apply this representation learning method to derive an off-policy algorithm for goal-conditioned RL.
arXiv Detail & Related papers (2023-10-31T03:16:32Z)
- OpenSTL: A Comprehensive Benchmark of Spatio-Temporal Predictive Learning [67.07363529640784]
We propose OpenSTL to categorize prevalent approaches into recurrent-based and recurrent-free models.
We conduct standard evaluations on datasets across various domains, including synthetic moving object trajectory, human motion, driving scenes, traffic flow and forecasting weather.
We find that recurrent-free models achieve a better balance between efficiency and performance than recurrent models.
arXiv Detail & Related papers (2023-06-20T03:02:14Z)
- Self-Supervised Learning for Time Series Analysis: Taxonomy, Progress, and Prospects [84.6945070729684]
Self-supervised learning (SSL) has recently achieved impressive performance on various time series tasks.
This article reviews current state-of-the-art SSL methods for time series data.
arXiv Detail & Related papers (2023-06-16T18:23:10Z)
- Time Series Contrastive Learning with Information-Aware Augmentations [57.45139904366001]
A key component of contrastive learning is to select appropriate augmentations imposing some priors to construct feasible positive samples.
How to find the desired augmentations of time series data that are meaningful for given contrastive learning tasks and datasets remains an open question.
We propose a new contrastive learning approach with information-aware augmentations, InfoTS, that adaptively selects optimal augmentations for time series representation learning.
arXiv Detail & Related papers (2023-03-21T15:02:50Z)
- VQ-AR: Vector Quantized Autoregressive Probabilistic Time Series Forecasting [10.605719154114354]
Time series models aim for accurate predictions of the future given the past, where the forecasts are used for important downstream tasks like business decision making.
In this paper, we introduce a novel autoregressive architecture, VQ-AR, which instead learns a discrete set of representations that are used to predict the future.
arXiv Detail & Related papers (2022-05-31T15:43:46Z)
- Cluster-and-Conquer: A Framework For Time-Series Forecasting [94.63501563413725]
We propose a three-stage framework for forecasting high-dimensional time-series data.
Our framework is highly general, allowing for any time-series forecasting and clustering method to be used in each step.
When instantiated with simple linear autoregressive models, we are able to achieve state-of-the-art results on several benchmark datasets.
arXiv Detail & Related papers (2021-10-26T20:41:19Z)
- Series Saliency: Temporal Interpretation for Multivariate Time Series Forecasting [30.054015098590874]
We present the series saliency framework for temporal interpretation for time series forecasting.
By extracting "series images" from sliding windows of the time series, we apply saliency map segmentation.
Our framework generates temporal interpretations for the time series forecasting task while producing accurate forecasts.
arXiv Detail & Related papers (2020-12-16T23:48:00Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.