Unsupervised Representation Learning for Time Series with Temporal
Neighborhood Coding
- URL: http://arxiv.org/abs/2106.00750v1
- Date: Tue, 1 Jun 2021 19:53:24 GMT
- Title: Unsupervised Representation Learning for Time Series with Temporal
Neighborhood Coding
- Authors: Sana Tonekaboni, Danny Eytan, Anna Goldenberg
- Abstract summary: We propose a self-supervised framework for learning generalizable representations for non-stationary time series.
Our motivation stems from the medical field, where the ability to model the dynamic nature of time series data is especially valuable.
- Score: 8.45908939323268
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Time series are often complex and rich in information but sparsely labeled
and therefore challenging to model. In this paper, we propose a self-supervised
framework for learning generalizable representations for non-stationary time
series. Our approach, called Temporal Neighborhood Coding (TNC), takes
advantage of the local smoothness of a signal's generative process to define
neighborhoods in time with stationary properties. Using a debiased contrastive
objective, our framework learns time series representations by ensuring that in
the encoding space, the distribution of signals from within a neighborhood is
distinguishable from the distribution of non-neighboring signals. Our
motivation stems from the medical field, where the ability to model the dynamic
nature of time series data is especially valuable for identifying, tracking,
and predicting the underlying patients' latent states in settings where
labeling data is practically impossible. We compare our method to recently
developed unsupervised representation learning approaches and demonstrate
superior performance on clustering and classification tasks for multiple
datasets.
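As a concrete illustration, below is a minimal PyTorch sketch of a debiased contrastive objective in the spirit of TNC. This is not the authors' reference implementation: the discriminator architecture, the batch layout, and the default weight `w` are simplifying assumptions. In the paper, neighborhoods are defined statistically from the signal's local stationarity, and `w` reflects the probability that a non-neighboring window nevertheless shares the reference window's latent state.

```python
import torch
import torch.nn as nn

class Discriminator(nn.Module):
    """Scores whether two encodings come from the same temporal neighborhood."""
    def __init__(self, dim):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2 * dim, dim), nn.ReLU(), nn.Linear(dim, 1))

    def forward(self, z_ref, z_other):
        return self.net(torch.cat([z_ref, z_other], dim=-1)).squeeze(-1)

def tnc_loss(disc, z_ref, z_pos, z_neg, w=0.05):
    """Debiased contrastive objective (sketch).

    z_ref / z_pos: encodings of windows from the same temporal neighborhood.
    z_neg: encodings of non-neighboring windows, treated as a w-weighted
    mixture of positives and negatives, since a distant window may still
    share the reference window's underlying latent state.
    """
    bce = nn.BCEWithLogitsLoss()
    ones = torch.ones(z_ref.size(0))
    zeros = torch.zeros(z_ref.size(0))
    pos_logits = disc(z_ref, z_pos)
    neg_logits = disc(z_ref, z_neg)
    return (bce(pos_logits, ones)
            + w * bce(neg_logits, ones)
            + (1 - w) * bce(neg_logits, zeros))
```

An encoder (e.g., an RNN over fixed-length windows) would produce the `z` vectors; the loss then makes the distribution of neighboring windows distinguishable from that of non-neighbors in the encoding space.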
Related papers
- Dynamic Contrastive Learning for Time Series Representation [6.086030037869592]
We propose DynaCL, an unsupervised contrastive representation learning framework for time series.
We demonstrate that DynaCL embeds instances from time series into semantically meaningful clusters.
Our findings also reveal that high scores on unsupervised clustering metrics do not guarantee that the representations are useful in downstream tasks.
arXiv Detail & Related papers (2024-10-20T15:20:24Z)
- Motion Code: Robust Time Series Classification and Forecasting via Sparse Variational Multi-Stochastic Processes Learning [3.2857981869020327]
We propose a novel framework that views each time series as a realization of a continuous-time process.
This mathematical approach captures dependencies across timestamps and detects hidden, time-varying signals within the noise.
Experiments on noisy datasets, including real-world Parkinson's disease sensor tracking, demonstrate Motion Code's strong performance against established benchmarks.
arXiv Detail & Related papers (2024-02-21T19:10:08Z)
- TimeSiam: A Pre-Training Framework for Siamese Time-Series Modeling [67.02157180089573]
Time series pre-training has recently garnered wide attention for its potential to reduce labeling expenses and benefit various downstream tasks.
This paper proposes TimeSiam as a simple but effective self-supervised pre-training framework for time series based on Siamese networks.
arXiv Detail & Related papers (2024-02-04T13:10:51Z)
- Graph Spatiotemporal Process for Multivariate Time Series Anomaly Detection with Missing Values [67.76168547245237]
We introduce a novel framework called GST-Pro, which utilizes a graph spatiotemporal process and an anomaly scorer to detect anomalies.
Our experimental results show that the GST-Pro method can effectively detect anomalies in time series data and outperforms state-of-the-art methods.
arXiv Detail & Related papers (2024-01-11T10:10:16Z)
- Learning Time-aware Graph Structures for Spatially Correlated Time Series Forecasting [30.93275270960829]
We propose Time-aware Graph Structure Learning (TagSL), which extracts time-aware correlations among time series.
We also present a Graph Convolution-based Gated Recurrent Unit (GCGRU), that jointly captures spatial and temporal dependencies.
Finally, we introduce a unified framework named Time-aware Graph Convolutional Recurrent Network (TGCRN), combining TagSL and GCGRU in an encoder-decoder architecture for multi-step spatiotemporal forecasting; a GCGRU-style cell is sketched below.
arXiv Detail & Related papers (2023-12-27T04:23:43Z)
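The following is a hypothetical sketch of a graph-convolutional GRU cell in the spirit of GCGRU, not the paper's implementation: the gating layout and the single normalized adjacency `adj` (which TagSL would learn per time step) are simplifying assumptions.

```python
import torch
import torch.nn as nn

class GCGRUCell(nn.Module):
    """GRU cell whose transforms mix node features over a graph (sketch)."""
    def __init__(self, in_dim, hid_dim):
        super().__init__()
        self.gates = nn.Linear(in_dim + hid_dim, 2 * hid_dim)  # reset/update
        self.cand = nn.Linear(in_dim + hid_dim, hid_dim)       # candidate state

    def forward(self, x, h, adj):
        # Graph convolution step: average each node's features with its
        # neighbors' before the usual GRU gating.
        # x: (num_nodes, in_dim), h: (num_nodes, hid_dim),
        # adj: (num_nodes, num_nodes), row-normalized.
        xh = adj @ torch.cat([x, h], dim=-1)
        r, u = torch.sigmoid(self.gates(xh)).chunk(2, dim=-1)
        c = torch.tanh(self.cand(adj @ torch.cat([x, r * h], dim=-1)))
        return u * h + (1 - u) * c  # new hidden state per node
```

Stacking such cells in an encoder-decoder yields a TGCRN-like model that captures spatial dependencies (via `adj`) and temporal ones (via the recurrence) jointly.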
- Robust Detection of Lead-Lag Relationships in Lagged Multi-Factor Models [61.10851158749843]
Key insights can be obtained by discovering lead-lag relationships inherent in the data.
We develop a clustering-driven methodology for robust detection of lead-lag relationships in lagged multi-factor models; a toy lag estimator is sketched below.
arXiv Detail & Related papers (2023-05-11T10:30:35Z)
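As a toy illustration of lead-lag detection (not the paper's clustering methodology), one can estimate the lag at which one series best predicts another from the peak of their cross-correlation; the `max_lag` window below is an arbitrary choice.

```python
import numpy as np

def lead_lag(x, y, max_lag=20):
    """Return the lag (in steps) at which x best leads y (sketch).

    A positive result means x[t] correlates most with y[t + lag],
    i.e., x leads y; a negative result means x lags y.
    """
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    lags = list(range(-max_lag, max_lag + 1))
    corrs = [np.corrcoef(x[max(0, -l):len(x) - max(0, l)],
                         y[max(0, l):len(y) - max(0, -l)])[0, 1]
             for l in lags]
    return lags[int(np.argmax(corrs))]
```

A clustering-driven method would apply such pairwise estimates across many series and group together series that share consistent lead-lag structure.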
- Contrastive Learning for Time Series on Dynamic Graphs [17.46524362769774]
We propose a framework called GraphTNC for unsupervised learning of joint representations of the graph and the time series.
We show that it is beneficial for classification tasks on real-world datasets.
arXiv Detail & Related papers (2022-09-21T21:14:28Z)
- HyperTime: Implicit Neural Representation for Time Series [131.57172578210256]
Implicit neural representations (INRs) have recently emerged as a powerful tool that provides an accurate and resolution-independent encoding of data.
In this paper, we analyze the representation of time series using INRs, comparing different activation functions in terms of reconstruction accuracy and training convergence speed; a minimal sine-activation INR is sketched below.
We propose a hypernetwork architecture that leverages INRs to learn a compressed latent representation of an entire time series dataset.
arXiv Detail & Related papers (2022-08-11T14:05:51Z)
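Below is a minimal sketch of a sine-activation INR fitted to a single series. The SIREN-style layers, frequency scale `w0`, and training loop are illustrative assumptions, not HyperTime's hypernetwork architecture.

```python
import torch
import torch.nn as nn

class SineINR(nn.Module):
    """Maps a timestamp t in [0, 1] to a signal value (sketch).

    Sine activations let the network fit high-frequency structure that
    ReLU-based INRs tend to miss; w0 scales the input frequencies.
    """
    def __init__(self, hidden=64, w0=30.0):
        super().__init__()
        self.l1 = nn.Linear(1, hidden)
        self.l2 = nn.Linear(hidden, hidden)
        self.out = nn.Linear(hidden, 1)
        self.w0 = w0

    def forward(self, t):
        h = torch.sin(self.w0 * self.l1(t))
        h = torch.sin(self.w0 * self.l2(h))
        return self.out(h)

# Fit one series by regressing values at observed timestamps.
t = torch.linspace(0, 1, 200).unsqueeze(-1)
values = torch.sin(20 * t) + 0.1 * torch.randn_like(t)  # toy signal
model = SineINR()
opt = torch.optim.Adam(model.parameters(), lr=1e-4)
for _ in range(2000):
    opt.zero_grad()
    loss = ((model(t) - values) ** 2).mean()
    loss.backward()
    opt.step()
```

Because the representation is a function of continuous time, it can be queried at any resolution; a hypernetwork can then compress the weights of many such INRs into one latent space.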
- Learning to Reconstruct Missing Data from Spatiotemporal Graphs with Sparse Observations [11.486068333583216]
This paper tackles the problem of learning effective models to reconstruct missing data points.
We propose a class of attention-based architectures that, given a set of highly sparse observations, learn a representation for points in time and space.
Compared to the state of the art, our model handles sparse data without propagating prediction errors or requiring a bidirectional model to encode forward and backward time dependencies; a cross-attention sketch of this idea follows.
arXiv Detail & Related papers (2022-05-26T16:40:48Z)
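The sketch below shows the general idea of attending over sparse observations; all names and the layer layout are hypothetical, and the paper's architecture is more elaborate. A query for a missing (time, location) slot attends over embeddings of the observed points, so no autoregressive error propagation or bidirectional recurrence is needed.

```python
import torch
import torch.nn as nn

class SparseImputer(nn.Module):
    """Cross-attention over observed points to fill missing ones (sketch)."""
    def __init__(self, coord_dim=2, dim=32):
        super().__init__()
        self.embed_obs = nn.Linear(coord_dim + 1, dim)   # (time, space, value)
        self.embed_query = nn.Linear(coord_dim, dim)     # (time, space) only
        self.attn = nn.MultiheadAttention(dim, num_heads=4, batch_first=True)
        self.readout = nn.Linear(dim, 1)

    def forward(self, obs_coords, obs_values, query_coords):
        # obs_coords: (B, N, coord_dim), obs_values: (B, N, 1),
        # query_coords: (B, M, coord_dim) -> predicted values: (B, M, 1)
        keys = self.embed_obs(torch.cat([obs_coords, obs_values], dim=-1))
        queries = self.embed_query(query_coords)
        ctx, _ = self.attn(queries, keys, keys)
        return self.readout(ctx)
```

Since every missing point is predicted directly from the observed set, errors at one reconstructed point never feed into the prediction of another.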
- Novel Features for Time Series Analysis: A Complex Networks Approach [62.997667081978825]
Time series data are ubiquitous in domains such as climate, economics, and health care.
A recent conceptual approach relies on mapping time series to complex networks.
Network analysis can then be used to characterize different types of time series; one such mapping, the natural visibility graph, is sketched below.
arXiv Detail & Related papers (2021-10-11T13:46:28Z)
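The natural visibility graph is one standard time-series-to-network mapping (a generic construction, not necessarily the one used in the paper): each sample becomes a node, and two samples are linked if the straight line between them passes above every intermediate sample.

```python
def natural_visibility_graph(x):
    """Map a time series to a set of graph edges (sketch).

    Nodes i < j are connected when every intermediate sample lies strictly
    below the straight line joining (i, x[i]) and (j, x[j]). Statistics of
    the resulting network (degree distribution, clustering coefficient,
    path lengths) then serve as features of the original series.
    """
    n = len(x)
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            line = lambda k: x[i] + (x[j] - x[i]) * (k - i) / (j - i)
            if all(x[k] < line(k) for k in range(i + 1, j)):
                edges.add((i, j))
    return edges
```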
- Contrastive learning of strong-mixing continuous-time stochastic processes [53.82893653745542]
Contrastive learning is a family of self-supervised methods where a model is trained to solve a classification task constructed from unlabeled data.
We show that a properly constructed contrastive learning task can be used to estimate the transition kernel for small-to-mid-range intervals in the diffusion case; a density-ratio sketch of this idea follows.
arXiv Detail & Related papers (2021-03-03T23:06:47Z)
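The connection can be illustrated with the standard density-ratio trick (a generic sketch with assumed scalar states, not the paper's construction): a classifier trained to distinguish true consecutive pairs (X_t, X_{t+dt}) from shuffled pairs learns logits proportional to the log ratio between the transition kernel and the marginal, from which the kernel can be recovered.

```python
import torch
import torch.nn as nn

# Classifier on pairs (x_t, x_{t+dt}); at optimum its logit approximates
# log p(x_{t+dt} | x_t) - log p(x_{t+dt}), the transition/marginal log-ratio.
pair_classifier = nn.Sequential(
    nn.Linear(2, 64), nn.ReLU(), nn.Linear(64, 1))

def contrastive_step(x_t, x_next, opt):
    """One training step: true pairs vs. pairs with x_next shuffled (sketch).

    x_t, x_next: (batch, 1) samples of the process dt apart in time.
    """
    bce = nn.BCEWithLogitsLoss()
    fake_next = x_next[torch.randperm(x_next.size(0))]  # break the dependence
    pos = pair_classifier(torch.cat([x_t, x_next], dim=-1)).squeeze(-1)
    neg = pair_classifier(torch.cat([x_t, fake_next], dim=-1)).squeeze(-1)
    loss = bce(pos, torch.ones_like(pos)) + bce(neg, torch.zeros_like(neg))
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()
```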