T-Rep: Representation Learning for Time Series using Time-Embeddings
- URL: http://arxiv.org/abs/2310.04486v3
- Date: Thu, 9 May 2024 10:11:23 GMT
- Title: T-Rep: Representation Learning for Time Series using Time-Embeddings
- Authors: Archibald Fraikin, Adrien Bennetot, Stéphanie Allassonnière
- Abstract summary: We propose T-Rep, a self-supervised method to learn time series representations at a timestep granularity.
T-Rep learns vector embeddings of time alongside its feature extractor, to extract temporal features.
We evaluate T-Rep on downstream classification, forecasting, and anomaly detection tasks.
- Score: 5.885238773559017
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Multivariate time series present challenges to standard machine learning techniques, as they are often unlabeled, high dimensional, noisy, and contain missing data. To address this, we propose T-Rep, a self-supervised method to learn time series representations at a timestep granularity. T-Rep learns vector embeddings of time alongside its feature extractor, to extract temporal features such as trend, periodicity, or distribution shifts from the signal. These time-embeddings are leveraged in pretext tasks, to incorporate smooth and fine-grained temporal dependencies in the representations, as well as reinforce robustness to missing data. We evaluate T-Rep on downstream classification, forecasting, and anomaly detection tasks. It is compared to existing self-supervised algorithms for time series, which it outperforms in all three tasks. We test T-Rep in missing data regimes, where it proves more resilient than its counterparts. Finally, we provide latent space visualisation experiments, highlighting the interpretability of the learned representations.
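As a rough illustration of the core idea, the sketch below pairs a learned time-embedding module with a small causal encoder and uses both in a toy pretext task: predicting the time-embedding gap between two timesteps from their representations, which pushes smooth temporal structure into the representations. All module names, sizes, and the specific pretext loss are assumptions for illustration, not the authors' implementation.

```python
# Illustrative sketch of the T-Rep idea; names, sizes, and the pretext
# loss are assumptions, not the paper's code.
import torch
import torch.nn as nn

class TimeEmbedding(nn.Module):
    """Maps a scalar timestamp t to a learned vector tau(t)."""
    def __init__(self, dim: int = 16):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, dim))

    def forward(self, t: torch.Tensor) -> torch.Tensor:
        return self.net(t.unsqueeze(-1))  # (batch, length) -> (batch, length, dim)

class Encoder(nn.Module):
    """Causal feature extractor that sees the time-embedding alongside the signal."""
    def __init__(self, in_dim: int, time_dim: int = 16, out_dim: int = 64):
        super().__init__()
        self.time_emb = TimeEmbedding(time_dim)
        self.conv = nn.Conv1d(in_dim + time_dim, out_dim, kernel_size=3, padding=2)

    def forward(self, x: torch.Tensor, t: torch.Tensor) -> torch.Tensor:
        h = torch.cat([x, self.time_emb(t)], dim=-1)        # append time features
        z = self.conv(h.transpose(1, 2))[..., : x.size(1)]  # trim to keep causality
        return z.transpose(1, 2)                            # (batch, length, out_dim)

# Toy pretext task: predict the time-embedding gap between two timesteps
# from their representations.
enc = Encoder(in_dim=3)
x, t = torch.randn(8, 100, 3), torch.arange(100.0).repeat(8, 1) / 100
z = enc(x, t)
i, j = 10, 40
head = nn.Linear(2 * 64, 16)  # hypothetical prediction head
gap_pred = head(torch.cat([z[:, i], z[:, j]], dim=-1))
gap_true = (enc.time_emb(t[:, j:j + 1]) - enc.time_emb(t[:, i:i + 1])).squeeze(1)
loss = nn.functional.mse_loss(gap_pred, gap_true.detach())  # stop-grad: illustrative choice
```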
Related papers
- Self-Supervised Learning of Disentangled Representations for Multivariate Time-Series [10.99576829280084]
TimeDRL is a framework for multivariate time-series representation learning with dual-level disentangled embeddings.
TimeDRL features: (i) timestamp-level and instance-level embeddings using a [CLS] token strategy; (ii) timestamp-predictive and instance-contrastive tasks for representation learning; and (iii) avoidance of augmentation methods to eliminate inductive biases.
Experiments on forecasting and classification datasets show TimeDRL outperforms existing methods, with further validation in semi-supervised settings with limited labeled data.
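A minimal sketch of the dual-level embedding idea, assuming the [CLS] token strategy works as it does in language models: a learnable token is prepended so that one Transformer pass yields both an instance-level vector (the [CLS] output) and timestamp-level vectors (the rest). Dimensions and layer counts are illustrative.

```python
# Sketch of dual-level embeddings via a prepended [CLS] token; sizes assumed.
import torch
import torch.nn as nn

class DualLevelEncoder(nn.Module):
    def __init__(self, in_dim: int, d_model: int = 64, n_layers: int = 2):
        super().__init__()
        self.proj = nn.Linear(in_dim, d_model)
        self.cls = nn.Parameter(torch.zeros(1, 1, d_model))  # learnable [CLS] token
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)

    def forward(self, x: torch.Tensor):
        # x: (batch, length, channels)
        tokens = self.proj(x)
        cls = self.cls.expand(x.size(0), -1, -1)
        out = self.encoder(torch.cat([cls, tokens], dim=1))
        return out[:, 0], out[:, 1:]  # instance-level, timestamp-level

enc = DualLevelEncoder(in_dim=3)
inst_emb, ts_emb = enc(torch.randn(8, 100, 3))  # (8, 64) and (8, 100, 64)
```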
arXiv Detail & Related papers (2024-10-16T14:24:44Z)
- Time Series Representation Learning with Supervised Contrastive Temporal Transformer [8.223940676615857]
We develop a simple yet novel fusion model called the Supervised COntrastive Temporal Transformer (SCOTT).
We first investigate suitable augmentation methods for various types of time series data to assist with learning change-invariant representations.
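SCOTT's exact architecture is not given in the summary, but the supervised contrastive objective it builds on is standard (Khosla et al.'s SupCon): representations sharing a class label are pulled together, all others pushed apart. A self-contained sketch of that loss on time-series representations:

```python
# SupCon-style loss on (batch, dim) representations; not SCOTT's exact code.
import torch
import torch.nn.functional as F

def supervised_contrastive_loss(z: torch.Tensor, labels: torch.Tensor,
                                temperature: float = 0.1) -> torch.Tensor:
    z = F.normalize(z, dim=-1)
    sim = z @ z.t() / temperature                         # pairwise similarities
    mask_pos = (labels[:, None] == labels[None, :]).float()
    mask_pos.fill_diagonal_(0)                            # exclude self-pairs
    logits = sim - sim.max(dim=1, keepdim=True).values.detach()  # numerical stability
    exp = torch.exp(logits) * (1 - torch.eye(len(z), device=z.device))
    log_prob = logits - torch.log(exp.sum(dim=1, keepdim=True))
    pos_count = mask_pos.sum(dim=1).clamp(min=1)
    return -(mask_pos * log_prob).sum(dim=1).div(pos_count).mean()

loss = supervised_contrastive_loss(torch.randn(16, 64), torch.randint(0, 3, (16,)))
```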
arXiv Detail & Related papers (2024-03-16T03:37:19Z)
- Distillation Enhanced Time Series Forecasting Network with Momentum Contrastive Learning [7.4106801792345705]
We propose DE-TSMCL, a distillation-enhanced framework for long-sequence time series forecasting.
Specifically, we design a learnable data augmentation mechanism which adaptively learns whether to mask a timestamp.
Then, we propose a contrastive learning task with momentum update to explore inter-sample and intra-temporal correlations of time series.
By combining the losses from these tasks, the model learns effective representations for the downstream forecasting task.
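A hedged sketch of the two ingredients named above, a learnable per-timestamp masking augmentation and a momentum-updated target encoder; the sigmoid relaxation used for the mask and all sizes are assumptions, not the paper's code.

```python
# Illustrative reconstruction: learnable masking + momentum (EMA) encoder.
import torch
import torch.nn as nn

class LearnableMask(nn.Module):
    """Learns a masking probability per timestamp via a sigmoid relaxation."""
    def __init__(self, length: int):
        super().__init__()
        self.logits = nn.Parameter(torch.zeros(length))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, length, channels); sample a soft keep/mask gate per step
        u = torch.rand(x.size(0), self.logits.numel(), device=x.device)
        u = u.clamp(1e-6, 1 - 1e-6)
        gate = torch.sigmoid(self.logits - torch.log(u / (1 - u)))  # logistic noise
        return x * gate.unsqueeze(-1)

@torch.no_grad()
def momentum_update(online: nn.Module, target: nn.Module, m: float = 0.99):
    """EMA of online weights into the target (momentum) encoder."""
    for p_o, p_t in zip(online.parameters(), target.parameters()):
        p_t.mul_(m).add_((1 - m) * p_o)

online, target = nn.Linear(3, 64), nn.Linear(3, 64)
target.load_state_dict(online.state_dict())
mask = LearnableMask(length=100)
x = torch.randn(8, 100, 3)
z_online, z_target = online(mask(x)), target(x)  # two views for a contrastive loss
momentum_update(online, target)                  # called after each optimiser step
```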
arXiv Detail & Related papers (2024-01-31T12:52:10Z)
- Graph Spatiotemporal Process for Multivariate Time Series Anomaly Detection with Missing Values [67.76168547245237]
We introduce a novel framework called GST-Pro, which utilizes a graph spatiotemporal process and an anomaly scorer to detect anomalies.
Our experimental results show that the GST-Pro method can effectively detect anomalies in time series data and outperforms state-of-the-art methods.
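The summary gives few specifics, so the following is only a generic sketch of an anomaly scorer that tolerates missing values: each timestep is scored by the deviation between the observation (where present) and a model's prediction, averaged over observed channels. The scoring rule is an assumption, not GST-Pro's method.

```python
# Generic forecast-residual anomaly scorer with a missing-value mask.
import torch

def anomaly_scores(pred: torch.Tensor, obs: torch.Tensor,
                   observed_mask: torch.Tensor) -> torch.Tensor:
    # pred, obs: (length, channels); observed_mask is 1 where a value exists
    resid = (obs - pred).abs() * observed_mask
    return resid.sum(dim=-1) / observed_mask.sum(dim=-1).clamp(min=1)

T, C = 200, 5
pred, obs = torch.randn(T, C), torch.randn(T, C)
mask = (torch.rand(T, C) > 0.2).float()            # ~20% of values missing
scores = anomaly_scores(pred, obs, mask)           # higher = more anomalous
flags = scores > scores.mean() + 3 * scores.std()  # simple threshold rule
```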
arXiv Detail & Related papers (2024-01-11T10:10:16Z)
- TimeDRL: Disentangled Representation Learning for Multivariate Time-Series [10.99576829280084]
TimeDRL is a generic time-series representation learning framework with disentangled dual-level embeddings.
TimeDRL consistently surpasses existing representation learning approaches, achieving an average improvement of 58.02% in forecasting MSE and 1.48% in classification accuracy.
arXiv Detail & Related papers (2023-12-07T08:56:44Z)
- FormerTime: Hierarchical Multi-Scale Representations for Multivariate Time Series Classification [53.55504611255664]
FormerTime is a hierarchical representation model for improving classification capacity on multivariate time series.
It exhibits three merits: (1) learning hierarchical multi-scale representations from time series data, (2) inheriting the strengths of both transformers and convolutional networks, and (3) tackling the efficiency challenges incurred by the self-attention mechanism.
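As a rough sketch of the hierarchical multi-scale design, each stage below pairs a strided convolution (local patterns, temporal downsampling) with a Transformer layer (global context); the exact stage composition is an assumption based on the summary.

```python
# Multi-scale conv + Transformer stages; stage design assumed, not FormerTime's code.
import torch
import torch.nn as nn

class Stage(nn.Module):
    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.down = nn.Conv1d(in_dim, out_dim, kernel_size=4, stride=2, padding=1)
        self.attn = nn.TransformerEncoderLayer(out_dim, nhead=4, batch_first=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, length, dim) -> (batch, length // 2, out_dim)
        x = self.down(x.transpose(1, 2)).transpose(1, 2)
        return self.attn(x)

model = nn.Sequential(Stage(3, 32), Stage(32, 64), Stage(64, 128))
feats = model(torch.randn(8, 128, 3))  # (8, 16, 128): coarse-scale features
pooled = feats.mean(dim=1)             # series-level vector for classification
```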
arXiv Detail & Related papers (2023-02-20T07:46:14Z)
- Spatio-temporal predictive tasks for abnormal event detection in videos [60.02503434201552]
We propose new constrained pretext tasks to learn object level normality patterns.
Our approach consists of learning a mapping between down-scaled visual queries and their corresponding normal appearance and motion characteristics.
Experiments on several benchmark datasets demonstrate the effectiveness of our approach to localize and track anomalies.
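A hedged sketch of the mapping described above: reconstruct a full-resolution patch from its down-scaled query, so that at test time a large reconstruction error flags abnormality. The architecture and patch sizes are illustrative; a motion branch (e.g. optical-flow targets) would be analogous.

```python
# Illustrative only: map a down-scaled query back to its normal appearance.
import torch
import torch.nn as nn
import torch.nn.functional as F

class UpsampleDecoder(nn.Module):
    def __init__(self, ch: int = 3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(ch, 32, 3, padding=1), nn.ReLU(),
            nn.Upsample(scale_factor=4, mode="bilinear", align_corners=False),
            nn.Conv2d(32, ch, 3, padding=1),
        )

    def forward(self, q: torch.Tensor) -> torch.Tensor:
        return self.net(q)                    # (B, ch, 16, 16) -> (B, ch, 64, 64)

patches = torch.rand(8, 3, 64, 64)            # object-level patches from frames
queries = F.avg_pool2d(patches, 4)            # down-scaled visual queries
decoder = UpsampleDecoder()
loss = F.mse_loss(decoder(queries), patches)  # learn the normal-appearance mapping
```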
arXiv Detail & Related papers (2022-10-27T19:45:12Z)
- HyperTime: Implicit Neural Representation for Time Series [131.57172578210256]
Implicit neural representations (INRs) have recently emerged as a powerful tool that provides an accurate and resolution-independent encoding of data.
In this paper, we analyze the representation of time series using INRs, comparing different activation functions in terms of reconstruction accuracy and training convergence speed.
We propose a hypernetwork architecture that leverages INRs to learn a compressed latent representation of an entire time series dataset.
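A minimal INR in this spirit: a small network with sine activations (SIREN-style, here without SIREN's specialised initialisation) is fit to map timestamp to value, after which the series can be queried at any resolution. The hypernetwork that HyperTime adds on top is omitted.

```python
# Fit an implicit neural representation t -> y(t) to a toy series.
import torch
import torch.nn as nn

class Sine(nn.Module):
    def __init__(self, omega0: float = 30.0):
        super().__init__()
        self.omega0 = omega0

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.sin(self.omega0 * x)

inr = nn.Sequential(nn.Linear(1, 64), Sine(), nn.Linear(64, 64), Sine(), nn.Linear(64, 1))
t = torch.linspace(0, 1, 200).unsqueeze(-1)
y = torch.sin(12 * t) + 0.1 * torch.randn_like(t)      # toy series to encode
opt = torch.optim.Adam(inr.parameters(), lr=1e-4)
for _ in range(2000):                                   # overfit the INR to this series
    opt.zero_grad()
    nn.functional.mse_loss(inr(t), y).backward()
    opt.step()
dense = inr(torch.linspace(0, 1, 2000).unsqueeze(-1))   # query at any resolution
```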
arXiv Detail & Related papers (2022-08-11T14:05:51Z)
- Towards Generating Real-World Time Series Data [52.51620668470388]
We propose RTSGAN, a novel generative framework for real-world time series data.
RTSGAN learns an encoder-decoder module which provides a mapping between a time series instance and a fixed-dimension latent vector.
To generate time series with missing values, we further equip RTSGAN with an observation embedding layer and a decide-and-generate decoder.
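A sketch of that pipeline under stated assumptions: a GRU autoencoder provides the series-to-latent mapping, and a generator trained adversarially in the latent space (the discriminator is omitted here) produces vectors that decode into synthetic series.

```python
# Autoencoder + latent-space generator; sizes and rollout scheme are assumptions.
import torch
import torch.nn as nn

class SeriesAE(nn.Module):
    def __init__(self, ch: int = 3, latent: int = 32, hidden: int = 64):
        super().__init__()
        self.enc = nn.GRU(ch, hidden, batch_first=True)
        self.to_z = nn.Linear(hidden, latent)
        self.from_z = nn.Linear(latent, hidden)
        self.dec = nn.GRU(ch, hidden, batch_first=True)
        self.out = nn.Linear(hidden, ch)

    def encode(self, x: torch.Tensor) -> torch.Tensor:
        _, h = self.enc(x)
        return self.to_z(h[-1])                       # fixed-dimension latent vector

    def decode(self, z: torch.Tensor, length: int) -> torch.Tensor:
        h = self.from_z(z).unsqueeze(0).contiguous()  # latent -> initial hidden state
        step = torch.zeros(z.size(0), 1, self.out.out_features)
        ys = []
        for _ in range(length):                       # autoregressive rollout
            o, h = self.dec(step, h)
            step = self.out(o)
            ys.append(step)
        return torch.cat(ys, dim=1)

ae = SeriesAE()
gen = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 32))  # noise -> latent
fake = ae.decode(gen(torch.randn(8, 16)), length=50)  # synthetic series (8, 50, 3)
```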
arXiv Detail & Related papers (2021-11-16T11:31:37Z)
- Self-supervised Transformer for Multivariate Clinical Time-Series with Missing Values [7.9405251142099464]
We present the STraTS (Self-supervised Transformer for Time-Series) model.
It treats time-series as a set of observation triplets instead of using the traditional dense matrix representation.
It shows better prediction performance than state-of-the-art methods for mortality prediction, especially when labeled data is limited.
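The triplet representation is the core idea and is easy to sketch: each observation becomes a (time, variable, value) triplet whose three parts are embedded and summed, so irregular sampling and missing values never have to be imputed into a dense matrix. The embedding choices below are assumptions.

```python
# Embed (time, variable, value) observation triplets as Transformer tokens.
import torch
import torch.nn as nn

class TripletEmbedding(nn.Module):
    def __init__(self, n_vars: int, d_model: int = 64):
        super().__init__()
        self.var_emb = nn.Embedding(n_vars, d_model)
        self.time_mlp = nn.Sequential(nn.Linear(1, d_model), nn.Tanh())
        self.val_mlp = nn.Sequential(nn.Linear(1, d_model), nn.Tanh())

    def forward(self, times, variables, values):
        # each argument: (batch, n_obs); output: (batch, n_obs, d_model)
        return (self.time_mlp(times.unsqueeze(-1))
                + self.var_emb(variables)
                + self.val_mlp(values.unsqueeze(-1)))

emb = TripletEmbedding(n_vars=20)
times = torch.rand(4, 37)                  # irregular observation times
variables = torch.randint(0, 20, (4, 37))  # which clinical variable was measured
values = torch.randn(4, 37)                # the observed value
tokens = emb(times, variables, values)     # feed to any Transformer encoder
```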
arXiv Detail & Related papers (2021-07-29T19:39:39Z)
- Interpretable Time-series Representation Learning With Multi-Level Disentanglement [56.38489708031278]
Disentangle Time Series (DTS) is a novel disentanglement enhancement framework for sequential data.
DTS generates hierarchical semantic concepts as the interpretable and disentangled representation of time-series.
DTS achieves superior performance in downstream applications, with high interpretability of semantic concepts.
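The DTS summary is high-level, so the sketch below shows only the generic kind of objective such disentanglement frameworks build on: a sequence VAE with an up-weighted KL term (beta-VAE style) that pressures latent dimensions toward independence. This is not the paper's exact loss.

```python
# Generic beta-VAE-style disentanglement objective for sequences; not DTS's loss.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SeqVAE(nn.Module):
    def __init__(self, ch: int = 3, length: int = 50, latent: int = 8, hidden: int = 64):
        super().__init__()
        self.enc = nn.GRU(ch, hidden, batch_first=True)
        self.mu = nn.Linear(hidden, latent)
        self.logvar = nn.Linear(hidden, latent)
        self.dec = nn.Linear(latent, length * ch)
        self.length, self.ch = length, ch

    def forward(self, x: torch.Tensor):
        _, h = self.enc(x)
        mu, logvar = self.mu(h[-1]), self.logvar(h[-1])
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)  # reparameterise
        recon = self.dec(z).view(-1, self.length, self.ch)
        return recon, mu, logvar

def beta_vae_loss(x, recon, mu, logvar, beta: float = 4.0) -> torch.Tensor:
    rec = F.mse_loss(recon, x, reduction="sum") / x.size(0)
    kl = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum(dim=1).mean()
    return rec + beta * kl  # beta > 1 pressures dimensions toward independence

model = SeqVAE()
x = torch.randn(8, 50, 3)
loss = beta_vae_loss(x, *model(x))
```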
arXiv Detail & Related papers (2021-05-17T22:02:24Z)