A Joint Time-frequency Domain Transformer for Multivariate Time Series
Forecasting
- URL: http://arxiv.org/abs/2305.14649v2
- Date: Sat, 28 Oct 2023 05:16:59 GMT
- Title: A Joint Time-frequency Domain Transformer for Multivariate Time Series
Forecasting
- Authors: Yushu Chen, Shengzhuo Liu, Jinzhe Yang, Hao Jing, Wenlai Zhao, and
Guangwen Yang
- Abstract summary: This paper introduces the Joint Time-Frequency Domain Transformer (JTFT).
JTFT combines time and frequency domain representations to make predictions.
Experimental results on six real-world datasets demonstrate that JTFT outperforms state-of-the-art baselines in predictive performance.
- Score: 7.501660339993144
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In order to enhance the performance of Transformer models for long-term
multivariate forecasting while minimizing computational demands, this paper
introduces the Joint Time-Frequency Domain Transformer (JTFT). JTFT combines
time and frequency domain representations to make predictions. The frequency
domain representation efficiently extracts multi-scale dependencies while
maintaining sparsity by utilizing a small number of learnable frequencies.
Simultaneously, the time domain (TD) representation is derived from a fixed
number of the most recent data points, strengthening the modeling of local
relationships and mitigating the effects of non-stationarity. Importantly, the
length of the representation remains independent of the input sequence length,
enabling JTFT to achieve linear computational complexity. Furthermore, a
low-rank attention layer is proposed to efficiently capture cross-dimensional
dependencies, thus preventing performance degradation resulting from the
entanglement of temporal and channel-wise modeling. Experimental results on six
real-world datasets demonstrate that JTFT outperforms state-of-the-art
baselines in predictive performance.
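The abstract is concrete enough to sketch its two main mechanisms. Below is a minimal illustration written from the abstract alone, not the authors' implementation: a fixed-length joint representation built from a small number of learnable frequencies plus the most recent time points, and a low-rank attention routed across channels. All names, shapes, and hyperparameters (n_freq, n_recent, rank) are assumptions, and the direct evaluation of the transform at the learned frequencies is one plausible reading of "learnable frequencies".

```python
import torch
import torch.nn as nn


class LearnableFrequencyRepresentation(nn.Module):
    """Fixed-length joint time-frequency representation of a series of any length."""

    def __init__(self, n_freq: int = 16, n_recent: int = 32):
        super().__init__()
        # A small set of learnable frequencies, in cycles per step.
        self.freqs = nn.Parameter(torch.rand(n_freq) * 0.5)
        self.n_recent = n_recent

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, length, channels)
        t = torch.arange(x.shape[1], dtype=x.dtype, device=x.device)
        # Evaluate the transform only at the learned frequencies: O(n_freq * L),
        # i.e. linear in the input length, unlike a full FFT.
        basis = torch.exp(-2j * torch.pi * self.freqs[:, None] * t[None, :])
        spec = torch.einsum("fl,blc->bfc", basis, x.to(basis.dtype))
        freq_part = torch.view_as_real(spec).flatten(1)   # sparse FD summary
        time_part = x[:, -self.n_recent:, :].flatten(1)   # latest points (TD part)
        # Output length depends only on n_freq and n_recent, not on L.
        return torch.cat([freq_part, time_part], dim=1)


class LowRankChannelAttention(nn.Module):
    """Cross-channel attention routed through r summary tokens: O(C*r), not O(C^2)."""

    def __init__(self, d_model: int, rank: int = 8, n_heads: int = 4):
        super().__init__()
        self.summary = nn.Parameter(torch.randn(rank, d_model) * 0.02)
        self.gather = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.scatter = nn.MultiheadAttention(d_model, n_heads, batch_first=True)

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        # tokens: (batch, channels, d_model), one token per variable
        s = self.summary.unsqueeze(0).expand(tokens.shape[0], -1, -1)
        s, _ = self.gather(s, tokens, tokens)   # channels -> rank summaries
        out, _ = self.scatter(tokens, s, s)     # summaries -> channels
        return out + tokens                     # residual keeps per-channel detail
```

With the defaults above, an input of shape (8, 512, 7) maps to a vector of length 16*7*2 + 32*7 = 448, and the same holds for a 5,000-step history; only n_freq and n_recent set the size, which is the property behind the linear-complexity claim.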
Related papers
- MFF-FTNet: Multi-scale Feature Fusion across Frequency and Temporal Domains for Time Series Forecasting [18.815152183468673]
Time series forecasting is crucial in many fields, yet current deep learning models struggle with noise, data sparsity, and capturing complex patterns.
This paper presents MFF-FTNet, a novel framework addressing these challenges by combining contrastive learning with multi-scale feature extraction.
Extensive experiments on five real-world datasets demonstrate that MFF-FTNet significantly outperforms state-of-the-art models.
arXiv Detail & Related papers (2024-11-26T12:41:42Z)
- Timer-XL: Long-Context Transformers for Unified Time Series Forecasting [67.83502953961505]
We present Timer-XL, a generative Transformer for unified time series forecasting.
Timer-XL achieves state-of-the-art performance across challenging forecasting benchmarks through a unified approach.
arXiv Detail & Related papers (2024-10-07T07:27:39Z)
- MixLinear: Extreme Low Resource Multivariate Time Series Forecasting with 0.1K Parameters [6.733646592789575]
Long-term Time Series Forecasting (LTSF) involves predicting long-term values by analyzing a large amount of historical time-series data to identify patterns and trends.
Transformer-based models offer high forecasting accuracy, but they are often too compute-intensive to be deployed on devices with hardware constraints.
We propose MixLinear, an ultra-lightweight time series forecasting model specifically designed for resource-constrained devices.
arXiv Detail & Related papers (2024-10-02T23:04:57Z)
- MMFNet: Multi-Scale Frequency Masking Neural Network for Multivariate Time Series Forecasting [6.733646592789575]
Long-term Time Series Forecasting (LTSF) is critical for numerous real-world applications, such as electricity consumption planning, financial forecasting, and disease propagation analysis.
We introduce MMFNet, a novel model designed to enhance long-term multivariate forecasting by leveraging a multi-scale masked frequency decomposition approach.
MMFNet captures fine, intermediate, and coarse-grained temporal patterns by converting time series into frequency segments at varying scales while employing a learnable mask to filter out irrelevant components adaptively.
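A minimal sketch of the masking idea just summarized, assuming a sigmoid gate per frequency bin of a segment; the segment lengths, the soft gating, and the class name are illustrative guesses, not MMFNet's actual design (see the arXiv link that follows).

```python
import torch
import torch.nn as nn


class MaskedFrequencySegment(nn.Module):
    """One scale: segment the series, FFT each segment, gate frequency bins."""

    def __init__(self, seg_len: int):
        super().__init__()
        self.seg_len = seg_len
        # One learnable gate per frequency bin of a segment.
        self.mask_logits = nn.Parameter(torch.zeros(seg_len // 2 + 1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, length, channels), length divisible by seg_len
        b, l, c = x.shape
        segs = x.reshape(b, l // self.seg_len, self.seg_len, c)
        spec = torch.fft.rfft(segs, dim=2)
        gate = torch.sigmoid(self.mask_logits).view(1, 1, -1, 1)
        filtered = spec * gate   # soft mask suppresses irrelevant components
        return torch.fft.irfft(filtered, n=self.seg_len, dim=2).reshape(b, l, c)


# Fine, intermediate, and coarse scales as three segment lengths (assumed values).
multi_scale = nn.ModuleList(MaskedFrequencySegment(s) for s in (24, 96, 384))
```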
arXiv Detail & Related papers (2024-10-02T22:38:20Z)
- FAITH: Frequency-domain Attention In Two Horizons for Time Series Forecasting [13.253624747448935]
Time Series Forecasting plays a crucial role in various fields such as industrial equipment maintenance, meteorology, energy consumption, traffic flow and financial investment.
Current deep learning-based predictive models often exhibit a significant deviation between their forecasting outcomes and the ground truth.
We propose a novel model, Frequency-domain Attention In Two Horizons (FAITH), which decomposes time series into trend and seasonal components.
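The summary only pins down the first step, the trend/seasonal split, so the sketch below uses the common moving-average decomposition as a stand-in; FAITH's actual decomposition and its two-horizon frequency attention are not specified here.

```python
import torch
import torch.nn.functional as F


def decompose(x: torch.Tensor, kernel: int = 25):
    # x: (batch, length, channels); trend = moving average, seasonal = residual
    pad = (kernel - 1) // 2
    xt = x.transpose(1, 2)                                  # (B, C, L) for pooling
    xt = F.pad(xt, (pad, kernel - 1 - pad), mode="replicate")
    trend = F.avg_pool1d(xt, kernel, stride=1).transpose(1, 2)
    return trend, x - trend                                 # (trend, seasonal)
```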
arXiv Detail & Related papers (2024-05-22T02:37:02Z)
- Rethinking Urban Mobility Prediction: A Super-Multivariate Time Series Forecasting Approach [71.67506068703314]
Long-term urban mobility predictions play a crucial role in the effective management of urban facilities and services.
Traditionally, urban mobility data has been structured as videos, treating longitude and latitude as fundamental pixels.
In our research, we introduce a fresh perspective on urban mobility prediction.
Instead of oversimplifying urban mobility data as traditional video data, we regard it as a complex time series.
arXiv Detail & Related papers (2023-12-04T07:39:05Z)
- Multi-scale Transformer Pyramid Networks for Multivariate Time Series Forecasting [8.739572744117634]
We introduce a dimension invariant embedding technique that captures short-term temporal dependencies.
We present a novel Multi-scale Transformer Pyramid Network (MTPNet) specifically designed to capture temporal dependencies at multiple unconstrained scales.
arXiv Detail & Related papers (2023-08-23T06:40:05Z)
- FormerTime: Hierarchical Multi-Scale Representations for Multivariate Time Series Classification [53.55504611255664]
FormerTime is a hierarchical representation model for improving classification capacity on multivariate time series.
It exhibits three merits: (1) learning hierarchical multi-scale representations from time series data, (2) inheriting the strengths of both transformers and convolutional networks, and (3) tackling the efficiency challenges incurred by the self-attention mechanism.
arXiv Detail & Related papers (2023-02-20T07:46:14Z)
- Towards Long-Term Time-Series Forecasting: Feature, Pattern, and Distribution [57.71199089609161]
Long-term time-series forecasting (LTTF) has become a pressing demand in many applications, such as wind power supply planning.
Transformer models have been adopted to deliver high prediction capacity thanks to their self-attention mechanism, albeit at a high computational cost.
We propose an efficient Transformer-based model, named Conformer, which differentiates itself from existing methods for LTTF in three aspects.
arXiv Detail & Related papers (2023-01-05T13:59:29Z)
- Transform Once: Efficient Operator Learning in Frequency Domain [69.74509540521397]
We study deep neural networks designed to harness the structure in frequency domain for efficient learning of long-range correlations in space or time.
This work introduces a blueprint for frequency domain learning through a single transform: transform once (T1).
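A minimal sketch of what "a single transform" plausibly means: one forward FFT, all learned layers operating in frequency space, and one inverse FFT at the end, instead of transforming in and out at every layer. The pointwise complex weighting, the mode count, and the omission of nonlinearities are simplifications, not T1's actual architecture.

```python
import torch
import torch.nn as nn


class TransformOnce(nn.Module):
    """Stacked frequency-domain layers between a single rfft/irfft pair."""

    def __init__(self, n_modes: int = 64, depth: int = 4):
        super().__init__()
        # Per-layer complex pointwise weights on the lowest n_modes bins.
        self.weights = nn.Parameter(torch.randn(depth, n_modes, 2) * 0.02)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, length) -- one forward transform for the whole network
        spec = torch.fft.rfft(x, dim=-1)
        n = min(self.weights.shape[1], spec.shape[-1])
        for w in torch.view_as_complex(self.weights):
            # Learned mixing of the retained modes (nonlinearities omitted).
            spec = torch.cat([spec[..., :n] * w[:n], spec[..., n:]], dim=-1)
        return torch.fft.irfft(spec, n=x.shape[-1], dim=-1)  # single inverse
```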
arXiv Detail & Related papers (2022-11-26T01:56:05Z)
- Multi-scale Attention Flow for Probabilistic Time Series Forecasting [68.20798558048678]
We propose a novel non-autoregressive deep learning model, called Multi-scale Attention Normalizing Flow (MANF).
Our model avoids the influence of cumulative error and does not increase the time complexity.
Our model achieves state-of-the-art performance on many popular multivariate datasets.
arXiv Detail & Related papers (2022-05-16T07:53:42Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.