SimMTM: A Simple Pre-Training Framework for Masked Time-Series Modeling
- URL: http://arxiv.org/abs/2302.00861v4
- Date: Mon, 23 Oct 2023 13:02:38 GMT
- Title: SimMTM: A Simple Pre-Training Framework for Masked Time-Series Modeling
- Authors: Jiaxiang Dong, Haixu Wu, Haoran Zhang, Li Zhang, Jianmin Wang,
Mingsheng Long
- Abstract summary: SimMTM is a simple pre-training framework for Masked Time-series Modeling.
SimMTM recovers masked time points by the weighted aggregation of multiple neighbors outside the manifold.
SimMTM achieves state-of-the-art fine-tuning performance compared to the most advanced time series pre-training methods.
- Score: 82.69579113377192
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Time series analysis is widely used in extensive areas. Recently, to reduce
labeling expenses and benefit various tasks, self-supervised pre-training has
attracted immense interest. One mainstream paradigm is masked modeling, which
successfully pre-trains deep models by learning to reconstruct the masked
content based on the unmasked part. However, since the semantic information of
time series is mainly contained in temporal variations, the standard way of
randomly masking a portion of time points will seriously ruin vital temporal
variations of time series, making the reconstruction task too difficult to
guide representation learning. We thus present SimMTM, a Simple pre-training
framework for Masked Time-series Modeling. By relating masked modeling to
manifold learning, SimMTM proposes to recover masked time points by the
weighted aggregation of multiple neighbors outside the manifold, which eases
the reconstruction task by assembling ruined but complementary temporal
variations from multiple masked series. SimMTM further learns to uncover the
local structure of the manifold, which is helpful for masked modeling.
Experimentally, SimMTM achieves state-of-the-art fine-tuning performance
compared to the most advanced time series pre-training methods in two canonical
time series analysis tasks: forecasting and classification, covering both in-
and cross-domain settings.
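The reconstruction idea above can be summarized in a short sketch. The following PyTorch-style snippet is a minimal, illustrative rendering only: the `encoder` and `decoder` callables, the random point-masking scheme, and the hyperparameters (`num_views`, `mask_ratio`, `tau`) are assumptions for exposition rather than the authors' implementation, and the paper's additional constraint loss that shapes the series-wise similarities is omitted.

```python
import torch
import torch.nn.functional as F

def simmtm_style_step(x, encoder, decoder, num_views=3, mask_ratio=0.5, tau=0.1):
    """Illustrative pre-training step: reconstruct each series from a
    similarity-weighted aggregation of several randomly masked copies of it.
    x: [B, L, C] raw time series; encoder/decoder are assumed callables."""
    B, L, C = x.shape

    # 1) Build several masked views per series (random point masking).
    masks = (torch.rand(B, num_views, L, 1, device=x.device) > mask_ratio).float()
    views = x.unsqueeze(1) * masks                                   # [B, M, L, C]

    # 2) Series-level embeddings of the original and of every masked view.
    z_orig = encoder(x)                                              # [B, D]
    z_views = encoder(views.flatten(0, 1)).view(B, num_views, -1)    # [B, M, D]

    # 3) Series-wise similarities give the aggregation weights over the views.
    sim = F.cosine_similarity(z_orig.unsqueeze(1), z_views, dim=-1)  # [B, M]
    w = F.softmax(sim / tau, dim=1)                                  # [B, M]

    # 4) Decode each view point-wise, then aggregate with those weights.
    y = decoder(views.flatten(0, 1)).view(B, num_views, L, C)        # [B, M, L, C]
    x_hat = (w[..., None, None] * y).sum(dim=1)                      # [B, L, C]

    # 5) Reconstruction loss against the unmasked original series.
    return F.mse_loss(x_hat, x)
```

Aggregating several ruined but complementary views is what keeps the task tractable compared with recovering a series from a single heavily masked copy.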
Related papers
- Multi-Patch Prediction: Adapting LLMs for Time Series Representation Learning [22.28251586213348]
aLLM4TS is an innovative framework that adapts Large Language Models (LLMs) for time-series representation learning.
A distinctive element of our framework is the patch-wise decoding layer, which departs from previous methods reliant on sequence-level decoding.
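As a purely illustrative contrast between the two decoding styles mentioned above (module names and shapes are assumptions, not the aLLM4TS code), a patch-wise head decodes every patch token back to its own patch of time points, while a sequence-level head flattens all tokens and regresses the whole horizon at once:

```python
import torch
import torch.nn as nn

class PatchwiseHead(nn.Module):
    """Decode every patch token independently back to its own patch."""
    def __init__(self, d_model: int, patch_len: int):
        super().__init__()
        self.proj = nn.Linear(d_model, patch_len)

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        # tokens: [B, num_patches, d_model] -> series: [B, num_patches * patch_len]
        return self.proj(tokens).flatten(1)

class SequenceLevelHead(nn.Module):
    """Flatten all patch tokens and decode the whole sequence in one shot."""
    def __init__(self, d_model: int, num_patches: int, out_len: int):
        super().__init__()
        self.proj = nn.Linear(d_model * num_patches, out_len)

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        # tokens: [B, num_patches, d_model] -> series: [B, out_len]
        return self.proj(tokens.flatten(1))
```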
arXiv Detail & Related papers (2024-02-07T13:51:26Z)
- TimeSiam: A Pre-Training Framework for Siamese Time-Series Modeling [67.02157180089573]
Time series pre-training has recently garnered wide attention for its potential to reduce labeling expenses and benefit various downstream tasks.
This paper proposes TimeSiam as a simple but effective self-supervised pre-training framework for time series based on Siamese networks.
arXiv Detail & Related papers (2024-02-04T13:10:51Z)
- Time-LLM: Time Series Forecasting by Reprogramming Large Language Models [110.20279343734548]
Time series forecasting holds significant importance in many real-world dynamic systems.
We present Time-LLM, a reprogramming framework to repurpose large language models for time series forecasting.
Time-LLM is a powerful time series learner that outperforms state-of-the-art, specialized forecasting models.
arXiv Detail & Related papers (2023-10-03T01:31:25Z)
- TimeMAE: Self-Supervised Representations of Time Series with Decoupled Masked Autoencoders [55.00904795497786]
We propose TimeMAE, a novel self-supervised paradigm for learning transferrable time series representations based on transformer networks.
The TimeMAE learns enriched contextual representations of time series with a bidirectional encoding scheme.
To solve the discrepancy issue incurred by newly injected masked embeddings, we design a decoupled autoencoder architecture.
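A generic sketch of such a decoupled design is given below, under the assumption that mask queries are handled by a small cross-attention module outside the main encoder; it illustrates the decoupling idea only and is not TimeMAE's exact architecture or its pretext tasks (positional information for the mask queries is also omitted for brevity).

```python
import torch
import torch.nn as nn

class DecoupledMaskedAutoencoder(nn.Module):
    """Generic decoupled design: the encoder never sees mask embeddings;
    a separate cross-attention module predicts the masked positions."""
    def __init__(self, d_model: int, nhead: int = 4):
        super().__init__()
        self.encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, nhead, batch_first=True),
            num_layers=2)
        self.mask_query = nn.Parameter(torch.zeros(1, 1, d_model))
        self.cross_attn = nn.MultiheadAttention(d_model, nhead, batch_first=True)
        self.pred = nn.Linear(d_model, d_model)

    def forward(self, visible_emb: torch.Tensor, num_masked: int) -> torch.Tensor:
        # visible_emb: [B, N_vis, d_model] embeddings of the unmasked positions.
        context = self.encoder(visible_emb)                    # encode visible tokens only
        queries = self.mask_query.expand(visible_emb.size(0), num_masked, -1)
        dec, _ = self.cross_attn(queries, context, context)    # fill the masked slots
        return self.pred(dec)                                  # [B, num_masked, d_model]
```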
arXiv Detail & Related papers (2023-03-01T08:33:16Z)
- Ti-MAE: Self-Supervised Masked Time Series Autoencoders [16.98069693152999]
We propose a novel framework named Ti-MAE, in which the input time series are assumed to follow an integrated distribution.
Ti-MAE randomly masks out embedded time series data and learns an autoencoder to reconstruct them at the point-level.
Experiments on several public real-world datasets demonstrate that our framework of masked autoencoding could learn strong representations directly from the raw data.
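For contrast with the multi-view aggregation sketched for SimMTM above, a point-level masked-autoencoding objective of the kind described here can be written in a few lines (again an illustrative stand-in with an assumed `autoencoder` callable, not the Ti-MAE code):

```python
import torch
import torch.nn.functional as F

def point_level_mae_step(x, autoencoder, mask_ratio=0.75):
    """Mask random time points of a single copy and reconstruct them directly.
    x: [B, L, C]; autoencoder maps a masked series back to [B, L, C]."""
    mask = torch.rand(x.shape[0], x.shape[1], 1, device=x.device) < mask_ratio
    x_hat = autoencoder(x * (~mask).float())      # encode/decode the masked copy
    # Loss only on the masked positions (point-level reconstruction).
    sel = mask.expand_as(x)
    return F.mse_loss(x_hat[sel], x[sel])
```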
arXiv Detail & Related papers (2023-01-21T03:20:23Z)
- Masked Frequency Modeling for Self-Supervised Visual Pre-Training [102.89756957704138]
We present Masked Frequency Modeling (MFM), a unified frequency-domain-based approach for self-supervised pre-training of visual models.
MFM first masks out a portion of frequency components of the input image and then predicts the missing frequencies on the frequency spectrum.
For the first time, MFM demonstrates that, for both ViT and CNN, a simple non-Siamese framework can learn meaningful representations even using none of the following: (i) extra data, (ii) extra model, (iii) mask token.
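A hedged sketch of the frequency-masking step is shown below; the radial low-pass mask, the `model` assumed to output a real/imaginary spectrum of shape [B, C, H, W, 2], and the l1 loss restricted to the removed frequencies are illustrative choices, not the exact MFM recipe.

```python
import torch
import torch.nn.functional as F

def mfm_style_step(images, model, radius=0.15):
    """Illustrative masked frequency modeling step: low-pass the spectrum,
    feed the corrupted image to the model, and predict the removed frequencies.
    images: [B, C, H, W]; model is assumed to output [B, C, H, W, 2]."""
    B, C, H, W = images.shape

    # Full (shifted) 2-D spectrum of the clean input: the prediction target.
    freq = torch.fft.fftshift(torch.fft.fft2(images), dim=(-2, -1))

    # Radial low-pass mask: keep only frequencies near the spectrum centre.
    yy, xx = torch.meshgrid(
        torch.linspace(-1, 1, H), torch.linspace(-1, 1, W), indexing="ij")
    keep = ((yy ** 2 + xx ** 2).sqrt() <= radius).to(images.device)

    # Corrupted input image: inverse transform of the masked spectrum.
    masked_freq = freq * keep.to(freq.dtype)
    corrupted = torch.fft.ifft2(torch.fft.ifftshift(masked_freq, dim=(-2, -1))).real

    # Predict the full spectrum and penalize errors on the masked-out part only.
    pred = model(corrupted)                       # [B, C, H, W, 2] real/imag parts
    target = torch.view_as_real(freq)             # [B, C, H, W, 2]
    return F.l1_loss(pred[..., ~keep, :], target[..., ~keep, :])
```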
arXiv Detail & Related papers (2022-06-15T17:58:30Z)
- Explaining Time Series Predictions with Dynamic Masks [91.3755431537592]
We propose dynamic masks (Dynamask) to explain predictions of a machine learning model.
With synthetic and real-world data, we demonstrate that the dynamic underpinning of Dynamask, together with its parsimony, offer a neat improvement in the identification of feature importance over time.
The modularity of Dynamask makes it ideal as a plug-in to increase the transparency of a wide range of machine learning models in areas such as medicine and finance.
arXiv Detail & Related papers (2021-06-09T18:01:09Z)
- Recurrent convolutional neural network for the surrogate modeling of subsurface flow simulation [0.0]
We propose to combine SegNet with ConvLSTM layers for the surrogate modeling of numerical flow simulation.
Results show that the proposed method markedly improves the performance of the SegNet-based surrogate model when the output of the simulation is time series data.
arXiv Detail & Related papers (2020-10-08T09:34:48Z)