TimeBridge: Non-Stationarity Matters for Long-term Time Series Forecasting
- URL: http://arxiv.org/abs/2410.04442v2
- Date: Sat, 12 Oct 2024 15:47:38 GMT
- Title: TimeBridge: Non-Stationarity Matters for Long-term Time Series Forecasting
- Authors: Peiyuan Liu, Beiliang Wu, Yifan Hu, Naiqi Li, Tao Dai, Jigang Bao, Shu-tao Xia
- Abstract summary: TimeBridge is a novel framework designed to bridge the gap between non-stationarity and dependency modeling.
TimeBridge consistently achieves state-of-the-art performance in both short-term and long-term forecasting.
- Score: 49.6208017412376
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Non-stationarity poses significant challenges for multivariate time series forecasting due to the inherent short-term fluctuations and long-term trends that can lead to spurious regressions or obscure essential long-term relationships. Most existing methods either eliminate or retain non-stationarity without adequately addressing its distinct impacts on short-term and long-term modeling. Eliminating non-stationarity is essential for avoiding spurious regressions and capturing local dependencies in short-term modeling, while preserving it is crucial for revealing long-term cointegration across variates. In this paper, we propose TimeBridge, a novel framework designed to bridge the gap between non-stationarity and dependency modeling in long-term time series forecasting. By segmenting input series into smaller patches, TimeBridge applies Integrated Attention to mitigate short-term non-stationarity and capture stable dependencies within each variate, while Cointegrated Attention preserves non-stationarity to model long-term cointegration across variates. Extensive experiments show that TimeBridge consistently achieves state-of-the-art performance in both short-term and long-term forecasting. Additionally, TimeBridge demonstrates exceptional performance in financial forecasting on the CSI 500 and S&P 500 indices, further validating its robustness and effectiveness. Code is available at \url{https://github.com/Hank0626/TimeBridge}.
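As a rough illustration of the patching and dual-attention design described in the abstract, here is a minimal PyTorch sketch. It reflects our reading of the abstract only: the module names, the per-patch normalization scheme, and all hyperparameters are assumptions, not the authors' implementation (see the linked repository for the official code).

```python
import torch
import torch.nn as nn

class TimeBridgeSketch(nn.Module):
    """Hedged sketch of TimeBridge's two attention paths (assumed, not official).

    Integrated Attention: temporal attention within each variate on
    normalized (stationarized) patches, to capture stable short-term
    dependencies while avoiding spurious regressions.
    Cointegrated Attention: attention across variates on raw patch
    embeddings, preserving non-stationarity so long-term cointegration
    between variates remains visible.
    """

    def __init__(self, n_vars, patch_len, n_patches, d_model, pred_len, n_heads=4):
        super().__init__()
        self.patch_len = patch_len
        self.embed = nn.Linear(patch_len, d_model)
        self.integrated_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.cointegrated_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.head = nn.Linear(n_patches * d_model, pred_len)

    def forward(self, x):  # x: (batch, n_vars, seq_len)
        b, v, _ = x.shape
        # Segment each variate into non-overlapping patches.
        patches = x.unfold(-1, self.patch_len, self.patch_len)  # (b, v, n_patches, patch_len)
        # Per-patch normalization removes short-term non-stationarity.
        mu = patches.mean(-1, keepdim=True)
        sd = patches.std(-1, keepdim=True) + 1e-5
        stat = self.embed((patches - mu) / sd)  # stationarized tokens
        raw = self.embed(patches)               # non-stationary tokens
        d = stat.size(-1)

        # Integrated Attention over patches, independently per variate.
        s = stat.reshape(b * v, -1, d)
        s, _ = self.integrated_attn(s, s, s)
        s = s.reshape(b, v, -1, d)

        # Cointegrated Attention over variates, independently per patch index.
        z = (s + raw).transpose(1, 2).reshape(-1, v, d)
        z, _ = self.cointegrated_attn(z, z, z)
        z = z.reshape(b, -1, v, d).transpose(1, 2)  # back to (b, v, n_patches, d)

        return self.head(z.reshape(b, v, -1))       # (b, v, pred_len)
```

For example, `TimeBridgeSketch(n_vars=7, patch_len=24, n_patches=4, d_model=64, pred_len=96)` maps an input of shape `(8, 7, 96)` to a forecast of shape `(8, 7, 96)`; the point is only to show the two attention paths operating on stationarized versus raw representations.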
Related papers
- Introducing Spectral Attention for Long-Range Dependency in Time Series Forecasting [36.577411683455786]
Recent linear and Transformer-based forecasters have shown strong performance in time series forecasting.
However, they are inherently limited in their ability to capture long-range dependencies in time series data.
We introduce a fast and effective Spectral Attention mechanism, which preserves temporal correlations among samples.
arXiv Detail & Related papers (2024-10-28T06:17:20Z)
- Timer-XL: Long-Context Transformers for Unified Time Series Forecasting [67.83502953961505]
We present Timer-XL, a generative Transformer for unified time series forecasting.
Timer-XL achieves state-of-the-art performance across challenging forecasting benchmarks through a unified approach.
arXiv Detail & Related papers (2024-10-07T07:27:39Z)
- Multiscale Representation Enhanced Temporal Flow Fusion Model for Long-Term Workload Forecasting [19.426131129034115]
This paper proposes a novel framework leveraging self-supervised multiscale representation learning to capture both long-term and near-term workload patterns.
The long-term history is encoded through multiscale representations while the near-term observations are modeled via temporal flow fusion.
arXiv Detail & Related papers (2024-07-29T04:42:18Z)
- Learning Graph Structures and Uncertainty for Accurate and Calibrated Time-series Forecasting [65.40983982856056]
We introduce STOIC, which leverages correlations between time series to learn their underlying structure and to provide well-calibrated, accurate forecasts.
Across a wide range of benchmark datasets, STOIC provides 16% more accurate and better-calibrated forecasts.
arXiv Detail & Related papers (2024-07-02T20:14:32Z)
- Self-Supervised Contrastive Learning for Long-term Forecasting [41.11757636744812]
Long-term forecasting presents unique challenges due to the time and memory complexity of processing long sequences.
Existing methods, which rely on sliding windows, struggle to effectively capture long-term variations.
We introduce a novel approach that overcomes this limitation by employing contrastive learning and an enhanced decomposition architecture.
arXiv Detail & Related papers (2024-02-03T04:32:34Z)
- Gated Recurrent Neural Networks with Weighted Time-Delay Feedback [59.125047512495456]
We introduce a novel gated recurrent unit (GRU) with a weighted time-delay feedback mechanism.
We show that $\tau$-GRU can converge faster and generalize better than state-of-the-art recurrent units and gated recurrent architectures.
arXiv Detail & Related papers (2022-12-01T02:26:34Z)
- Grouped self-attention mechanism for a memory-efficient Transformer [64.0125322353281]
Real-world tasks such as forecasting weather, electricity consumption, and stock prices involve predicting data that vary over time.
Time-series data are typically recorded as long sequences over extended observation periods, owing to their periodic characteristics and long-range dependencies.
We propose two novel modules, Grouped Self-Attention (GSA) and Compressed Cross-Attention (CCA).
The proposed model reduces computational complexity while achieving performance comparable to or better than existing methods.
arXiv Detail & Related papers (2022-10-02T06:58:49Z)
- Autoformer: Decomposition Transformers with Auto-Correlation for Long-Term Series Forecasting [68.86835407617778]
Autoformer is a novel decomposition architecture with an Auto-Correlation mechanism.
In long-term forecasting, Autoformer yields state-of-the-art accuracy, with relative improvements across six benchmarks.
arXiv Detail & Related papers (2021-06-24T13:43:43Z)
- Parallel Extraction of Long-term Trends and Short-term Fluctuation Framework for Multivariate Time Series Forecasting [14.399919351944677]
Time series exhibit two characteristics: a long-term trend and short-term fluctuations.
Existing prediction methods often fail to distinguish between them, which reduces the accuracy of the prediction model.
Three prediction sub-networks are constructed to predict the long-term trend, the short-term fluctuations, and the final value, respectively (a minimal sketch of this idea follows the list below).
arXiv Detail & Related papers (2020-08-18T03:55:29Z)
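Because the trend/fluctuation decomposition above is easiest to see in code, here is a hedged sketch of the parallel-extraction scheme. The moving-average decomposition, the linear sub-networks, and all sizes are our assumptions for illustration, not the paper's actual architecture.

```python
import torch
import torch.nn as nn

class TrendFluctuationSketch(nn.Module):
    """Hypothetical sketch: decompose a series into a long-term trend and a
    short-term fluctuation, predict each with its own sub-network, and fuse
    the two predictions into the final forecast (three sub-networks in total).
    Layer choices are assumptions, not the paper's design."""

    def __init__(self, seq_len, pred_len, kernel=25):
        super().__init__()
        # Moving average as a simple long-term trend extractor.
        self.avg = nn.AvgPool1d(kernel, stride=1, padding=kernel // 2,
                                count_include_pad=False)
        self.trend_net = nn.Linear(seq_len, pred_len)      # sub-network 1: trend
        self.fluct_net = nn.Linear(seq_len, pred_len)      # sub-network 2: fluctuation
        self.fuse_net = nn.Linear(2 * pred_len, pred_len)  # sub-network 3: final value

    def forward(self, x):    # x: (batch, n_vars, seq_len)
        trend = self.avg(x)  # long-term trend
        fluct = x - trend    # short-term fluctuation (residual)
        t_hat = self.trend_net(trend)
        f_hat = self.fluct_net(fluct)
        return self.fuse_net(torch.cat([t_hat, f_hat], dim=-1))
```

With an odd kernel, `self.avg` preserves the sequence length, so `TrendFluctuationSketch(seq_len=96, pred_len=24)` maps an input of shape `(8, 7, 96)` to a forecast of shape `(8, 7, 24)`.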
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.