CycleNet: Enhancing Time Series Forecasting through Modeling Periodic Patterns
- URL: http://arxiv.org/abs/2409.18479v2
- Date: Tue, 15 Oct 2024 06:24:29 GMT
- Title: CycleNet: Enhancing Time Series Forecasting through Modeling Periodic Patterns
- Authors: Shengsheng Lin, Weiwei Lin, Xinyi Hu, Wentai Wu, Ruichao Mo, Haocheng Zhong
- Abstract summary: We introduce the Residual Cycle Forecasting (RCF) technique, which utilizes learnable recurrent cycles to model the inherent periodic patterns within sequences.
CycleNet achieves state-of-the-art prediction accuracy in multiple domains including electricity, weather, and energy, while offering significant efficiency advantages.
As a novel plug-and-play technique, the RCF can also significantly improve the prediction accuracy of existing models, including PatchTST and iTransformer.
- Score: 6.296866390553982
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The stable periodic patterns present in time series data serve as the foundation for conducting long-horizon forecasts. In this paper, we pioneer the exploration of explicitly modeling this periodicity to enhance the performance of models in long-term time series forecasting (LTSF) tasks. Specifically, we introduce the Residual Cycle Forecasting (RCF) technique, which utilizes learnable recurrent cycles to model the inherent periodic patterns within sequences, and then performs predictions on the residual components of the modeled cycles. Combining RCF with a Linear layer or a shallow MLP forms the simple yet powerful method proposed in this paper, called CycleNet. CycleNet achieves state-of-the-art prediction accuracy in multiple domains including electricity, weather, and energy, while offering significant efficiency advantages, reducing the number of required parameters by over 90%. Furthermore, as a novel plug-and-play technique, RCF can also significantly improve the prediction accuracy of existing models, including PatchTST and iTransformer. The source code is available at: https://github.com/ACAT-SCUT/CycleNet.
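The RCF idea is compact enough to sketch directly. Below is a minimal, illustrative PyTorch version of learnable recurrent cycles combined with a channel-shared Linear backbone: the learned cycle, aligned to each sample's phase, is subtracted from the input, the Linear layer forecasts the residual, and the cycle is added back over the horizon. Class and argument names (RecurrentCycle, CycleNetSketch, cycle_len) are ours, and details such as normalization are omitted; see the linked repository for the authors' implementation.

```python
import torch
import torch.nn as nn

class RecurrentCycle(nn.Module):
    """Learnable cycle of shape (cycle_len, channels), indexed by phase."""
    def __init__(self, cycle_len: int, channels: int):
        super().__init__()
        self.cycle_len = cycle_len
        self.data = nn.Parameter(torch.zeros(cycle_len, channels))

    def forward(self, phase: torch.Tensor, length: int) -> torch.Tensor:
        # phase: (batch,) position of each sample's first step within the cycle
        idx = (phase.unsqueeze(1) + torch.arange(length, device=phase.device)) % self.cycle_len
        return self.data[idx]  # (batch, length, channels)

class CycleNetSketch(nn.Module):
    """RCF + Linear: remove the learned cycle, forecast the residual, add the cycle back."""
    def __init__(self, seq_len: int, pred_len: int, channels: int, cycle_len: int):
        super().__init__()
        self.seq_len, self.pred_len = seq_len, pred_len
        self.cycle = RecurrentCycle(cycle_len, channels)
        self.backbone = nn.Linear(seq_len, pred_len)  # shared across channels

    def forward(self, x: torch.Tensor, phase: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, channels); phase: cycle index of each window's first step
        residual = x - self.cycle(phase, self.seq_len)
        y_res = self.backbone(residual.transpose(1, 2)).transpose(1, 2)
        future_phase = (phase + self.seq_len) % self.cycle.cycle_len
        return y_res + self.cycle(future_phase, self.pred_len)

# Usage: hourly data with a weekly cycle (168 steps), 96-step lookback and horizon.
model = CycleNetSketch(seq_len=96, pred_len=96, channels=7, cycle_len=168)
y = model(torch.randn(32, 96, 7), torch.randint(0, 168, (32,)))
print(y.shape)  # torch.Size([32, 96, 7])
```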
Related papers
- Sundial: A Family of Highly Capable Time Series Foundation Models [64.6322079384575]
We introduce Sundial, a family of native, flexible, and scalable time series foundation models.
Our model is pre-trained without specifying any prior distribution and can generate multiple probable predictions.
By mitigating mode collapse through TimeFlow Loss, we pre-train a family of Sundial models on TimeBench, which exhibit unprecedented model capacity and generalization performance.
arXiv Detail & Related papers (2025-02-02T14:52:50Z)
- FlowScope: Enhancing Decision Making by Time Series Forecasting based on Prediction Optimization using HybridFlow Forecast Framework [0.0]
Time series forecasting is crucial in several sectors, such as meteorology, retail, healthcare, and finance.
We propose FlowScope, which offers a versatile and robust platform for predicting time series data.
This empowers enterprises to make informed decisions and optimize long-term strategies for maximum performance.
arXiv Detail & Related papers (2024-11-16T06:25:30Z)
- SparseTSF: Modeling Long-term Time Series Forecasting with 1k Parameters [16.966008476215258]
This paper introduces SparseTSF, a novel, extremely lightweight model for Long-term Time Series Forecasting (LTSF).
At the heart of SparseTSF lies the Cross-Period Sparse Forecasting technique, which simplifies the forecasting task by decoupling the periodicity and trend in time series data.
SparseTSF showcases remarkable generalization capabilities, making it well-suited for scenarios with limited computational resources, small samples, or low-quality data.
arXiv Detail & Related papers (2024-05-02T02:15:23Z)
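SparseTSF's cross-period idea can be pictured with a short, hypothetical sketch: the lookback window is regrouped into period-aligned sub-series, and one tiny Linear layer, shared across all phases and channels, maps past periods to future periods. Names such as SparseTSFSketch and period are ours, and the official model includes additional preprocessing steps not shown here.

```python
import torch
import torch.nn as nn

class SparseTSFSketch(nn.Module):
    """Cross-period sparse forecasting: one tiny Linear layer shared across all
    period-aligned sub-series and all channels (illustrative only)."""
    def __init__(self, seq_len: int, pred_len: int, period: int):
        super().__init__()
        assert seq_len % period == 0 and pred_len % period == 0
        self.period = period
        self.linear = nn.Linear(seq_len // period, pred_len // period)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, channels)
        b, L, c = x.shape
        sub = x.reshape(b, L // self.period, self.period, c)  # (b, n_periods, period, c)
        sub = sub.permute(0, 2, 3, 1)                         # group by phase: (b, period, c, n_periods)
        out = self.linear(sub)                                # predict future periods per phase
        return out.permute(0, 3, 1, 2).reshape(b, -1, c)      # (b, pred_len, c)

model = SparseTSFSketch(seq_len=96, pred_len=48, period=24)
print(model(torch.randn(8, 96, 7)).shape)  # torch.Size([8, 48, 7])
```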
- Chronos: Learning the Language of Time Series [79.38691251254173]
Chronos is a framework for pretrained probabilistic time series models.
We show that Chronos models can leverage time series data from diverse domains to improve zero-shot accuracy on unseen forecasting tasks.
arXiv Detail & Related papers (2024-03-12T16:53:54Z)
- Unified Training of Universal Time Series Forecasting Transformers [104.56318980466742]
We present a Masked Encoder-based Universal Time Series Forecasting Transformer (Moirai).
Moirai is trained on our newly introduced Large-scale Open Time Series Archive (LOTSA) featuring over 27B observations across nine domains.
Moirai achieves competitive or superior performance as a zero-shot forecaster when compared to full-shot models.
arXiv Detail & Related papers (2024-02-04T20:00:45Z)
- CARD: Channel Aligned Robust Blend Transformer for Time Series Forecasting [50.23240107430597]
We design a special Transformer, i.e., Channel Aligned Robust Blend Transformer (CARD for short), that addresses key shortcomings of CI type Transformer in time series forecasting.
First, CARD introduces a channel-aligned attention structure that allows it to capture both temporal correlations among signals and dynamical dependence among multiple variables over time.
Second, in order to efficiently utilize the multi-scale knowledge, we design a token blend module to generate tokens with different resolutions.
Third, we introduce a robust loss function for time series forecasting to alleviate the potential overfitting issue.
arXiv Detail & Related papers (2023-05-20T05:16:31Z)
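The third point above, a robust loss for forecasting, can be illustrated by down-weighting errors at distant horizons, whose targets are noisier and more prone to overfitting. The 1/sqrt(horizon) weighting below is an assumption chosen for illustration, not necessarily the exact loss used in CARD.

```python
import torch

def horizon_weighted_mae(pred: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
    """MAE whose per-step weight decays as 1/sqrt(horizon), so noisy far-future
    targets contribute less to the gradient. pred, target: (batch, pred_len, channels)."""
    pred_len = pred.shape[1]
    h = torch.arange(1, pred_len + 1, device=pred.device, dtype=pred.dtype)
    w = h.rsqrt()
    w = w / w.sum() * pred_len                    # keep the overall scale of plain MAE
    err = (pred - target).abs().mean(dim=(0, 2))  # mean absolute error per horizon step
    return (w * err).mean()

loss = horizon_weighted_mae(torch.randn(8, 96, 7), torch.randn(8, 96, 7))
```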
- OFTER: An Online Pipeline for Time Series Forecasting [3.9962751777898955]
OFTER is a time series forecasting pipeline tailored for mid-sized multivariate time series.
It is specifically designed for online tasks, has an interpretable output, and is able to outperform several state-of-the-art baselines.
The computational efficiency of the algorithm, its online nature, and its ability to operate in low signal-to-noise regimes render OFTER an ideal approach for financial time series problems.
arXiv Detail & Related papers (2023-04-08T00:18:03Z)
- Towards Long-Term Time-Series Forecasting: Feature, Pattern, and Distribution [57.71199089609161]
Long-term time-series forecasting (LTTF) has become a pressing demand in many applications, such as wind power supply planning.
Transformer models have been adopted to deliver high prediction capacity, but the self-attention mechanism is computationally expensive.
We propose an efficient Transformer-based model, named Conformer, which differentiates itself from existing methods for LTTF in three aspects.
arXiv Detail & Related papers (2023-01-05T13:59:29Z)
- Contextually Enhanced ES-dRNN with Dynamic Attention for Short-Term Load Forecasting [1.1602089225841632]
The proposed model is composed of two simultaneously trained tracks: the context track and the main track.
The RNN architecture consists of multiple recurrent layers stacked with hierarchical dilations and equipped with recently proposed attentive recurrent cells.
The model produces both point forecasts and predictive intervals.
arXiv Detail & Related papers (2022-12-18T07:42:48Z)
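Hierarchical dilation in a recurrent stack means each layer's cell reads its hidden state from several steps back rather than from the previous step, so deeper layers cover longer seasonal lags. A minimal sketch follows, assuming GRU cells and a hypothetical DilatedGRULayer name; the paper's attentive recurrent cells and dual-track design are not reproduced here.

```python
import torch
import torch.nn as nn

class DilatedGRULayer(nn.Module):
    """Recurrent layer whose cell reads its hidden state from `dilation` steps back,
    so stacked layers with growing dilations cover longer seasonal lags."""
    def __init__(self, input_size: int, hidden_size: int, dilation: int):
        super().__init__()
        self.cell = nn.GRUCell(input_size, hidden_size)
        self.dilation = dilation
        self.hidden_size = hidden_size

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, input_size)
        b, T, _ = x.shape
        history = [torch.zeros(b, self.hidden_size, device=x.device)] * self.dilation
        outputs = []
        for t in range(T):
            h = self.cell(x[:, t], history[-self.dilation])  # state from `dilation` steps ago
            history.append(h)
            outputs.append(h)
        return torch.stack(outputs, dim=1)  # (batch, time, hidden_size)

# Stacking layers with dilations such as (1, 3, 6, 12) yields a hierarchical receptive field.
layer = DilatedGRULayer(input_size=8, hidden_size=16, dilation=3)
print(layer(torch.randn(4, 24, 8)).shape)  # torch.Size([4, 24, 16])
```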
- Radflow: A Recurrent, Aggregated, and Decomposable Model for Networks of Time Series [77.47313102926017]
Radflow is a novel model for networks of time series that influence each other.
It embodies three key ideas: a recurrent neural network to obtain node embeddings that depend on time, the aggregation of the flow of influence from neighboring nodes with multi-head attention, and the multi-layer decomposition of time series.
We show that Radflow can learn different trends and seasonal patterns, that it is robust to missing nodes and edges, and that correlated temporal patterns among network neighbors reflect influence strength.
arXiv Detail & Related papers (2021-02-15T00:57:28Z)
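The second of Radflow's three ideas, aggregating the flow of influence from network neighbors with multi-head attention on top of a recurrent node embedding, can be sketched as follows. The wiring and names (NeighborAggregation) are assumptions for illustration and omit the model's multi-layer time series decomposition.

```python
import torch
import torch.nn as nn

class NeighborAggregation(nn.Module):
    """Time-dependent node embedding from an RNN, fused with the attention-weighted
    influence of neighboring nodes (illustrative wiring only)."""
    def __init__(self, dim: int, heads: int = 4):
        super().__init__()
        self.rnn = nn.GRU(dim, dim, batch_first=True)
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, node_series: torch.Tensor, neighbor_emb: torch.Tensor) -> torch.Tensor:
        # node_series: (batch, time, dim); neighbor_emb: (batch, n_neighbors, dim)
        h, _ = self.rnn(node_series)
        query = h[:, -1:, :]                                   # latest embedding of the node itself
        agg, _ = self.attn(query, neighbor_emb, neighbor_emb)  # flow of influence from neighbors
        return (query + agg).squeeze(1)                        # (batch, dim)

agg = NeighborAggregation(dim=32)
print(agg(torch.randn(8, 24, 32), torch.randn(8, 5, 32)).shape)  # torch.Size([8, 32])
```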
- Pattern-based Long Short-term Memory for Mid-term Electrical Load Forecasting [0.0]
This work presents a network for forecasting a monthly electricity demand time series with a one-year horizon.
The novelty of this work is the use of pattern representation of the seasonal time series as an alternative to decomposition.
A simulation study performed on the monthly electricity demand time series for 35 European countries confirmed the high performance of the proposed model.
arXiv Detail & Related papers (2020-04-22T08:39:32Z)
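The pattern representation can be illustrated with a small sketch: each seasonal cycle (here, the 12 monthly values of one year) is normalized so that level and local trend are removed and cycles from different years become directly comparable, which is what lets the model skip explicit decomposition. The z-score style normalization below is an assumption; the paper defines its own input and output pattern functions.

```python
import numpy as np

def to_patterns(series: np.ndarray, season_len: int = 12) -> np.ndarray:
    """Encode each seasonal cycle (e.g., the 12 monthly values of one year) as a
    normalized pattern, removing level and local trend so that cycles from
    different years become comparable without explicit decomposition."""
    n_cycles = len(series) // season_len
    cycles = series[: n_cycles * season_len].reshape(n_cycles, season_len)
    mean = cycles.mean(axis=1, keepdims=True)
    scale = cycles.std(axis=1, keepdims=True) + 1e-8
    return (cycles - mean) / scale

# Usage: 10 years of monthly demand -> 10 unified yearly patterns of length 12.
demand = np.linspace(1000.0, 1500.0, 120) + 100.0 * np.sin(np.arange(120) * 2 * np.pi / 12)
print(to_patterns(demand).shape)  # (10, 12)
```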
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.