MultiResFormer: Transformer with Adaptive Multi-Resolution Modeling for
General Time Series Forecasting
- URL: http://arxiv.org/abs/2311.18780v2
- Date: Thu, 8 Feb 2024 15:40:35 GMT
- Title: MultiResFormer: Transformer with Adaptive Multi-Resolution Modeling for
General Time Series Forecasting
- Authors: Linfeng Du, Ji Xin, Alex Labach, Saba Zuberi, Maksims Volkovs, Rahul
G. Krishnan
- Abstract summary: Transformer-based models have greatly pushed the boundaries of time series forecasting recently.
Existing methods typically encode time series data into $\textit{patches}$ using one or a fixed set of patch lengths.
We propose MultiResFormer, which dynamically models temporal variations by adaptively choosing optimal patch lengths.
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Transformer-based models have greatly pushed the boundaries of time series
forecasting recently. Existing methods typically encode time series data into
$\textit{patches}$ using one or a fixed set of patch lengths. This, however,
could result in a lack of ability to capture the variety of intricate temporal
dependencies present in real-world multi-periodic time series. In this paper,
we propose MultiResFormer, which dynamically models temporal variations by
adaptively choosing optimal patch lengths. Concretely, at the beginning of each
layer, time series data is encoded into several parallel branches, each using a
detected periodicity, before going through the transformer encoder block. We
conduct extensive evaluations on long- and short-term forecasting datasets
comparing MultiResFormer with state-of-the-art baselines. MultiResFormer
outperforms patch-based Transformer baselines on long-term forecasting tasks
and also consistently outperforms CNN baselines by a large margin, while using
far fewer parameters than these baselines.
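As a rough illustration of the detect-then-patch step described above, here is a minimal sketch assuming an FFT-based period detector in the style of TimesNet; the function names, top-k heuristic, and padding strategy are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def detect_periods(x: np.ndarray, k: int = 3) -> list:
    """Estimate the k dominant periods of a 1-D series from the FFT
    amplitude spectrum (a TimesNet-style heuristic)."""
    amps = np.abs(np.fft.rfft(x - x.mean()))
    amps[0] = 0.0                        # ignore the DC component
    top = np.argsort(amps)[::-1][:k]     # indices of the k largest amplitudes
    return [max(2, len(x) // f) for f in top if f > 0]

def patch_by_period(x: np.ndarray, period: int) -> np.ndarray:
    """Slice the series into non-overlapping patches of length `period`,
    padding the tail so the length divides evenly."""
    pad = (-len(x)) % period
    xp = np.pad(x, (0, pad), mode="edge")
    return xp.reshape(-1, period)        # (num_patches, period)

# Toy series mixing a daily (24) and a weekly (168) period.
t = np.arange(24 * 7 * 4, dtype=float)
x = np.sin(2 * np.pi * t / 24) + 0.5 * np.sin(2 * np.pi * t / 168)
for p in detect_periods(x, k=2):
    print(p, patch_by_period(x, p).shape)  # one parallel branch per period
```

Per the abstract, each per-period patch tensor would feed one parallel branch before the shared Transformer encoder block.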
Related papers
- Timer-XL: Long-Context Transformers for Unified Time Series Forecasting [67.83502953961505]
We present Timer-XL, a generative Transformer for unified time series forecasting.
Timer-XL achieves state-of-the-art performance across challenging forecasting benchmarks through a unified approach.
arXiv Detail & Related papers (2024-10-07T07:27:39Z)
- PRformer: Pyramidal Recurrent Transformer for Multivariate Time Series Forecasting [82.03373838627606]
The self-attention mechanism in the Transformer architecture requires positional embeddings to encode temporal order in time series prediction.
We argue that this reliance on positional embeddings restricts the Transformer's ability to effectively represent temporal sequences.
We present a model integrating pyramidal recurrent embeddings (PRE) with a standard Transformer encoder, demonstrating state-of-the-art performance on various real-world datasets.
arXiv Detail & Related papers (2024-08-20T01:56:07Z)
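A loose sketch of the PRE idea as described in the summary above: let a recurrence, rather than additive positional embeddings, inject temporal order into patch tokens at several pyramid scales. The module name, scales, and sizes are illustrative assumptions, not the authors' code.

```python
import torch
import torch.nn as nn

class RecurrentPatchEmbedding(nn.Module):
    """Order-aware patch embedding: a GRU summarizes patches at several
    scales, standing in for additive positional embeddings (a loose
    sketch of the PRE idea; scales and sizes are illustrative)."""
    def __init__(self, d_model: int = 64, scales=(4, 8, 16)):
        super().__init__()
        self.scales = scales
        self.grus = nn.ModuleList(
            nn.GRU(input_size=s, hidden_size=d_model, batch_first=True)
            for s in scales
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len) univariate series
        outs = []
        for s, gru in zip(self.scales, self.grus):
            pad = (-x.size(1)) % s
            xp = nn.functional.pad(x, (0, pad))     # pad the tail
            patches = xp.unfold(1, s, s)            # (B, n_patches, s)
            h, _ = gru(patches)                     # recurrence encodes order
            outs.append(h[:, -1, :])                # last hidden state per scale
        return torch.stack(outs, dim=1)             # (B, n_scales, d_model)

emb = RecurrentPatchEmbedding()
tokens = emb(torch.randn(2, 96))                    # -> (2, 3, 64)
encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=64, nhead=4, batch_first=True),
    num_layers=2,
)
print(encoder(tokens).shape)                        # torch.Size([2, 3, 64])
```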
- DRFormer: Multi-Scale Transformer Utilizing Diverse Receptive Fields for Long Time-Series Forecasting [3.420673126033772]
We propose a dynamic tokenizer with a dynamic sparse learning algorithm to capture diverse receptive fields and sparse patterns of time series data.
Our proposed model, named DRFormer, is evaluated on various real-world datasets, and experimental results demonstrate its superiority compared to existing methods.
arXiv Detail & Related papers (2024-08-05T07:26:47Z)
- UniTST: Effectively Modeling Inter-Series and Intra-Series Dependencies for Multivariate Time Series Forecasting [98.12558945781693]
We propose a transformer-based model UniTST containing a unified attention mechanism on the flattened patch tokens.
Although our proposed model employs a simple architecture, it offers compelling performance as shown in our experiments on several datasets for time series forecasting.
arXiv Detail & Related papers (2024-06-07T14:39:28Z)
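The unified-attention idea above reduces to flattening the (variate, patch) token grid into one sequence so a single attention pass mixes inter-series and intra-series dependencies. A minimal sketch with illustrative shapes (not the authors' implementation):

```python
import torch
import torch.nn as nn

# Flattened-token attention: all patch tokens from all variates form one
# sequence, so attention can mix inter- and intra-series dependencies in
# a single pass. Shapes are illustrative.
B, V, N, D = 2, 7, 12, 64        # batch, variates, patches per variate, dim
tokens = torch.randn(B, V, N, D)

flat = tokens.reshape(B, V * N, D)             # (B, V*N, D) one joint sequence
attn = nn.MultiheadAttention(embed_dim=D, num_heads=4, batch_first=True)
mixed, _ = attn(flat, flat, flat)              # every token attends to all others
out = mixed.reshape(B, V, N, D)                # restore (variate, patch) layout
print(out.shape)                               # torch.Size([2, 7, 12, 64])
```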
- Unified Training of Universal Time Series Forecasting Transformers [104.56318980466742]
We present a Masked Encoder-based Universal Time Series Forecasting Transformer (Moirai).
Moirai is trained on our newly introduced Large-scale Open Time Series Archive (LOTSA) featuring over 27B observations across nine domains.
Moirai achieves competitive or superior performance as a zero-shot forecaster when compared to full-shot models.
arXiv Detail & Related papers (2024-02-04T20:00:45Z)
- EdgeConvFormer: Dynamic Graph CNN and Transformer based Anomaly Detection in Multivariate Time Series [7.514010315664322]
We propose a novel anomaly detection method, named EdgeConvFormer, which integrates stacked Time2vec embedding, dynamic graph CNN, and Transformer to extract global and local spatial-time information.
Experiments demonstrate that EdgeConvFormer can learn the spatial-temporal modeling from multivariate time series data and achieve better anomaly detection performance than the state-of-the-art approaches on many real-world datasets of different scales.
arXiv Detail & Related papers (2023-12-04T08:38:54Z)
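Of the components named above, Time2vec has a compact published definition (Kazemi et al., 2019): one linear term plus periodic terms. A minimal sketch of such an embedding layer follows; the stacking, graph CNN, and Transformer parts of EdgeConvFormer are omitted.

```python
import torch
import torch.nn as nn

class Time2Vec(nn.Module):
    """Time2vec embedding: t2v(t)[0] = w0*t + b0 (linear trend) and
    t2v(t)[i] = sin(wi*t + bi) for i >= 1 (periodic components)."""
    def __init__(self, k: int = 8):
        super().__init__()
        self.w = nn.Parameter(torch.randn(k))
        self.b = nn.Parameter(torch.randn(k))

    def forward(self, t: torch.Tensor) -> torch.Tensor:
        # t: (..., 1) timestamps; output: (..., k)
        v = self.w * t + self.b
        return torch.cat([v[..., :1], torch.sin(v[..., 1:])], dim=-1)

t = torch.arange(96, dtype=torch.float32).unsqueeze(-1)  # (96, 1)
print(Time2Vec(k=8)(t).shape)                            # torch.Size([96, 8])
```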
- Multi-resolution Time-Series Transformer for Long-term Forecasting [24.47302799009906]
We propose a novel framework, Multi-resolution Time-Series Transformer (MTST), for simultaneous modeling of diverse temporal patterns at different resolutions.
In contrast to many existing time-series transformers, we employ relative positional encoding, which is better suited for extracting periodic components at different scales.
arXiv Detail & Related papers (2023-11-07T17:18:52Z)
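Relative positional encoding replaces absolute positions with per-offset terms, which is why it pairs naturally with periodic structure. Below is a minimal single-head sketch using a learned relative bias table; this is one common variant, not necessarily the exact form used in MTST.

```python
import torch

def attention_with_relative_bias(q, k, v, bias_table):
    """Scaled dot-product attention plus a learned relative-position bias.
    bias_table: (2*N-1,) one scalar per relative offset in [-(N-1), N-1]."""
    N, d = q.shape[-2], q.shape[-1]
    offsets = torch.arange(N)[:, None] - torch.arange(N)[None, :]  # (N, N)
    bias = bias_table[offsets + (N - 1)]               # look up each offset
    scores = q @ k.transpose(-2, -1) / d**0.5 + bias   # add bias before softmax
    return torch.softmax(scores, dim=-1) @ v

N, d = 12, 64
q = k = v = torch.randn(N, d)
bias_table = torch.zeros(2 * N - 1, requires_grad=True)  # learned in practice
print(attention_with_relative_bias(q, k, v, bias_table).shape)  # (12, 64)
```

Because the bias depends only on the offset between tokens, the same pattern can be reused at every resolution, which is the property the summary appeals to.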
- Compatible Transformer for Irregularly Sampled Multivariate Time Series [75.79309862085303]
We propose a transformer-based encoder to achieve comprehensive temporal-interaction feature learning for each individual sample.
We conduct extensive experiments on 3 real-world datasets and validate that the proposed CoFormer significantly and consistently outperforms existing methods.
arXiv Detail & Related papers (2023-10-17T06:29:09Z)
- Stecformer: Spatio-temporal Encoding Cascaded Transformer for Multivariate Long-term Time Series Forecasting [11.021398675773055]
We propose a complete solution addressing both feature extraction and target prediction.
For extraction, we design an efficient spatio-temporal encoding extractor including a semi-adaptive graph to acquire sufficient spatio-temporal information.
For prediction, we propose a Cascaded Decoding Predictor (CDP) to strengthen the correlation between different intervals.
arXiv Detail & Related papers (2023-05-25T13:00:46Z)
- FormerTime: Hierarchical Multi-Scale Representations for Multivariate Time Series Classification [53.55504611255664]
FormerTime is a hierarchical representation model for improving classification capacity on the multivariate time series classification task.
It exhibits three aspects of merit: (1) learning hierarchical multi-scale representations from time series data, (2) inheriting the strengths of both transformers and convolutional networks, and (3) tackling the efficiency challenges incurred by the self-attention mechanism.
arXiv Detail & Related papers (2023-02-20T07:46:14Z)
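Merits (1) and (2) above are commonly realized by interleaving strided convolutions with Transformer layers. A minimal sketch of one such hierarchical stage (my reading of the summary, not the authors' code):

```python
import torch
import torch.nn as nn

class HierarchicalStage(nn.Module):
    """One stage of a multi-scale hierarchy: a strided Conv1d halves the
    sequence (convolutional inductive bias, cheaper self-attention), then
    a Transformer layer models the coarser scale. Sizes are illustrative."""
    def __init__(self, d_model: int = 64):
        super().__init__()
        self.down = nn.Conv1d(d_model, d_model, kernel_size=3, stride=2, padding=1)
        self.encoder = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (B, L, d_model) -> (B, ceil(L/2), d_model)
        x = self.down(x.transpose(1, 2)).transpose(1, 2)
        return self.encoder(x)

x = torch.randn(2, 128, 64)
for stage in [HierarchicalStage(), HierarchicalStage()]:
    x = stage(x)          # multi-scale: 128 -> 64 -> 32 tokens
print(x.shape)            # torch.Size([2, 32, 64])
```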
- W-Transformers: A Wavelet-based Transformer Framework for Univariate Time Series Forecasting [7.075125892721573]
We build a transformer model for non-stationary time series using a wavelet-based transformer encoder architecture.
We evaluate our framework on several publicly available benchmark time series datasets from various domains.
arXiv Detail & Related papers (2022-09-08T17:39:38Z)
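The wavelet preprocessing step is easy to sketch with PyWavelets. Note the paper builds on MODWT, while a plain multilevel DWT is shown here for brevity, and the per-band forecasting model is left as a placeholder.

```python
import numpy as np
import pywt  # PyWavelets

# Decompose a non-stationary series into multilevel wavelet coefficients,
# model each sub-series separately (e.g. with a transformer), then invert.
rng = np.random.default_rng(0)
t = np.arange(512)
x = np.sin(2 * np.pi * t / 64) + 0.01 * t + 0.1 * rng.standard_normal(512)

coeffs = pywt.wavedec(x, wavelet="db4", level=3)   # [cA3, cD3, cD2, cD1]
print([c.shape for c in coeffs])

# Placeholder for per-band forecasting: a real pipeline would fit one model
# per coefficient series, forecast each, then reconstruct the forecast.
x_hat = pywt.waverec(coeffs, wavelet="db4")
print(np.allclose(x, x_hat[: len(x)]))             # True: lossless round trip
```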