DEPTS: Deep Expansion Learning for Periodic Time Series Forecasting
- URL: http://arxiv.org/abs/2203.07681v1
- Date: Tue, 15 Mar 2022 06:51:58 GMT
- Title: DEPTS: Deep Expansion Learning for Periodic Time Series Forecasting
- Authors: Wei Fan, Shun Zheng, Xiaohan Yi, Wei Cao, Yanjie Fu, Jiang Bian,
Tie-Yan Liu
- Abstract summary: We introduce a deep expansion learning framework, DEPTS, for PTS forecasting.
DEPTS starts with a decoupled formulation by introducing the periodic state as a hidden variable.
Our two customized modules also offer a degree of interpretability, such as attributing the forecasts to either local momenta or global periodicity.
- Score: 83.60876685008225
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Periodic time series (PTS) forecasting plays a crucial role in a
variety of industries, supporting critical tasks such as early warning,
pre-planning, and resource scheduling. However, the complicated dependencies
of the PTS
signal on its inherent periodicity as well as the sophisticated composition of
various periods hinder the performance of PTS forecasting. In this paper, we
introduce a deep expansion learning framework, DEPTS, for PTS forecasting.
DEPTS starts with a decoupled formulation by introducing the periodic state as
a hidden variable, which motivates us to design two dedicated modules to tackle
the aforementioned two challenges. First, we develop an expansion module on top
of residual learning to perform a layer-by-layer expansion of those complicated
dependencies. Second, we introduce a periodicity module with a parameterized
periodic function that holds sufficient capacity to capture diversified
periods. Moreover, our two customized modules offer a degree of
interpretability, such as attributing the forecasts to either local momenta or
global periodicity and characterizing core periodic properties, e.g.,
amplitudes and frequencies. Extensive experiments on both synthetic data and
real-world data demonstrate the effectiveness of DEPTS in handling PTS. In most
cases, DEPTS achieves significant improvements over the best baseline, with
error reductions reaching up to 20% in a few cases.
Finally, all codes are publicly available.
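To make this decoupled formulation concrete, here is a minimal PyTorch sketch
of the two modules the abstract describes: a parameterized periodic function
(periodicity module) and a layer-by-layer residual expansion (expansion
module). All class names, shapes, and the cosine parameterization are
illustrative assumptions, not the authors' released code, and the simple
additive combination at the end glosses over how the two parts actually
interact in DEPTS.

import torch
import torch.nn as nn

class PeriodicityModule(nn.Module):
    """Periodic state z(t) = sum_k A_k * cos(2*pi*F_k*t + P_k) with learnable
    amplitudes, frequencies, and phases (assumed parameterization)."""
    def __init__(self, n_waves: int = 8):
        super().__init__()
        self.amp = nn.Parameter(torch.rand(n_waves))    # amplitudes A_k
        self.freq = nn.Parameter(torch.rand(n_waves))   # frequencies F_k
        self.phase = nn.Parameter(torch.rand(n_waves))  # phases P_k

    def forward(self, t: torch.Tensor) -> torch.Tensor:  # t: (batch, horizon)
        angles = 2 * torch.pi * self.freq * t.unsqueeze(-1) + self.phase
        return (self.amp * torch.cos(angles)).sum(-1)     # (batch, horizon)

class ExpansionBlock(nn.Module):
    """One residual block: emits a backcast (the part of the input it can
    explain) and a forecast contribution."""
    def __init__(self, lookback: int, horizon: int, hidden: int = 64):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(lookback, hidden), nn.ReLU())
        self.backcast = nn.Linear(hidden, lookback)
        self.forecast = nn.Linear(hidden, horizon)

    def forward(self, residual):
        h = self.body(residual)
        return self.backcast(h), self.forecast(h)

def expand(blocks, history):
    """Layer-by-layer expansion: each block peels off what it can explain;
    later blocks only see the remaining residual."""
    residual, forecast = history, 0.0
    for block in blocks:
        back, fore = block(residual)
        residual = residual - back
        forecast = forecast + fore
    return forecast

blocks = [ExpansionBlock(lookback=96, horizon=24) for _ in range(3)]
t_future = torch.arange(24).float().repeat(8, 1)           # (batch, horizon)
prediction = expand(blocks, torch.randn(8, 96)) + PeriodicityModule()(t_future)

Because the periodic and residual contributions are computed separately, a
forecast can be attributed to global periodicity or local momenta by
inspecting the two terms, which matches the interpretability claim above.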
Related papers
- FlexTSF: A Universal Forecasting Model for Time Series with Variable Regularities [17.164913785452367]
We propose FlexTSF, a universal time series forecasting model with better generalization that supports both regular and irregular time series.
Experiments on 12 datasets show that FlexTSF outperforms state-of-the-art forecasting models designed specifically for regular and for irregular time series.
arXiv Detail & Related papers (2024-10-30T16:14:09Z) - Timer-XL: Long-Context Transformers for Unified Time Series Forecasting [67.83502953961505]
We present Timer-XL, a generative Transformer for unified time series forecasting.
Timer-XL achieves state-of-the-art performance across challenging forecasting benchmarks through a unified approach.
arXiv Detail & Related papers (2024-10-07T07:27:39Z) - Language Model Empowered Spatio-Temporal Forecasting via Physics-Aware Reprogramming [13.744891561921197]
We aim to harness the reasoning and generalization abilities of Pre-trained Language Models (PLMs) for intricate spatio-temporal forecasting.
We propose RePST, a physics-aware PLM reprogramming framework tailored for spatio-temporal forecasting.
We show that the proposed RePST outperforms twelve state-of-the-art baseline methods, particularly in data-scarce scenarios.
arXiv Detail & Related papers (2024-08-24T07:59:36Z) - Temporal Feature Matters: A Framework for Diffusion Model Quantization [105.3033493564844]
Diffusion models rely on the time-step for multi-round denoising.
We introduce a novel quantization framework that includes three strategies.
This framework preserves most of the temporal information and ensures high-quality end-to-end generation.
arXiv Detail & Related papers (2024-07-28T17:46:15Z) - FAITH: Frequency-domain Attention In Two Horizons for Time Series Forecasting [13.253624747448935]
Time Series Forecasting plays a crucial role in various fields such as industrial equipment maintenance, meteorology, energy consumption, traffic flow and financial investment.
Current deep learning-based predictive models often exhibit significant deviations between their forecasts and the ground truth.
We propose a novel model, Frequency-domain Attention In Two Horizons (FAITH), which decomposes time series into trend and seasonal components.
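As a rough illustration of that trend/seasonal split, the sketch below uses a
moving-average trend (an assumed, common choice) and inspects the seasonal
remainder in the frequency domain; FAITH's actual decomposition and its
two-horizon frequency-domain attention are not reproduced here.

import numpy as np

def decompose(x: np.ndarray, window: int = 24):
    """Split a 1-D series into a moving-average trend and a seasonal residual."""
    pad = window // 2
    padded = np.pad(x, (pad, window - 1 - pad), mode="edge")
    trend = np.convolve(padded, np.ones(window) / window, mode="valid")
    return trend, x - trend

# Toy series: a sinusoid (seasonal) riding on a ramp (trend).
x = np.sin(np.linspace(0, 20 * np.pi, 480)) + np.linspace(0, 3, 480)
trend, seasonal = decompose(x)
spectrum = np.fft.rfft(seasonal)              # frequency-domain view
dominant = np.argsort(np.abs(spectrum))[-3:]  # strongest seasonal frequencies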
arXiv Detail & Related papers (2024-05-22T02:37:02Z) - FCDNet: Frequency-Guided Complementary Dependency Modeling for
Multivariate Time-Series Forecasting [9.083469629116784]
We propose FCDNet, a concise yet effective framework for time-series forecasting.
It helps extract long- and short-term dependency information adaptively from multi-level frequency patterns.
Experiments show that FCDNet significantly outperforms strong baselines.
arXiv Detail & Related papers (2023-12-27T07:29:52Z) - Grouped self-attention mechanism for a memory-efficient Transformer [64.0125322353281]
Real-world tasks such as forecasting weather, electricity consumption, and stock markets involve predicting data that vary over time.
Time-series data are generally recorded over a long period of observation with long sequences owing to their periodic characteristics and long-range dependencies over time.
We propose two novel modules, Grouped Self-Attention (GSA) and Compressed Cross-Attention (CCA).
Our proposed model exhibits reduced computational complexity with performance comparable to or better than existing methods.
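A minimal sketch of the memory-saving idea behind grouped self-attention,
assuming attention is simply restricted to fixed-size groups of positions; the
paper's exact grouping scheme and its CCA module are not reproduced.

import torch
import torch.nn.functional as F

def grouped_self_attention(x: torch.Tensor, group: int) -> torch.Tensor:
    """x: (batch, length, dim), with length divisible by `group`. Scores are
    computed inside each group, so memory grows as O(L * group), not O(L^2)."""
    b, l, d = x.shape
    g = x.reshape(b, l // group, group, d)        # split sequence into groups
    scores = g @ g.transpose(-1, -2) / d ** 0.5   # (b, n_groups, group, group)
    out = F.softmax(scores, dim=-1) @ g           # attend within each group
    return out.reshape(b, l, d)

x = torch.randn(2, 512, 64)
y = grouped_self_attention(x, group=32)  # 32x32 score blocks instead of 512x512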
arXiv Detail & Related papers (2022-10-02T06:58:49Z) - Unsupervised Time-Series Representation Learning with Iterative Bilinear
Temporal-Spectral Fusion [6.154427471704388]
We propose a unified framework, namely Bilinear Temporal-Spectral Fusion (BTSF).
Specifically, we utilize the instance-level augmentation with a simple dropout on the entire time series for maximally capturing long-term dependencies.
We devise a novel iterative bilinear temporal-spectral fusion to explicitly encode the affinities of abundant time-frequency pairs.
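The two ingredients named here can be sketched as follows, with stand-in
random linear encoders as an assumption; BTSF's iterative fusion procedure and
its trained encoders are not reproduced.

import torch
import torch.nn.functional as F

x = torch.randn(4, 256)     # (batch, length) raw series
view = F.dropout(x, p=0.1)  # instance-level augmentation: dropout on the whole series

t_feat = torch.tanh(x @ torch.randn(256, 32))                        # temporal embedding
s_feat = torch.tanh(torch.fft.rfft(x).abs() @ torch.randn(129, 32))  # spectral embedding

# Bilinear fusion: an affinity for every (time-feature, frequency-feature) pair.
affinity = torch.einsum("bi,bj->bij", t_feat, s_feat)  # (batch, 32, 32)
fused = affinity.flatten(1)                            # joint representation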
arXiv Detail & Related papers (2022-02-08T14:04:08Z) - Interpretable Time-series Representation Learning With Multi-Level
Disentanglement [56.38489708031278]
Disentangle Time Series (DTS) is a novel disentanglement enhancement framework for sequential data.
DTS generates hierarchical semantic concepts as the interpretable and disentangled representation of time-series.
DTS achieves superior performance in downstream applications, with high interpretability of semantic concepts.
arXiv Detail & Related papers (2021-05-17T22:02:24Z) - Transformer Hawkes Process [79.16290557505211]
We propose a Transformer Hawkes Process (THP) model, which leverages the self-attention mechanism to capture long-term dependencies.
THP outperforms existing models in terms of both likelihood and event prediction accuracy by a notable margin.
We provide a concrete example, where THP achieves improved prediction performance for learning multiple point processes when incorporating their relational information.
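A simplified sketch of the kind of intensity parameterization such a model
uses: a Transformer encodes the event history, and the conditional intensity
after the last event is a softplus of an affine function of the final hidden
state and the elapsed time. Layer sizes and the affine time term are
illustrative assumptions, not the exact THP equations.

import torch
import torch.nn as nn
import torch.nn.functional as F

d = 32
encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=d, nhead=4, batch_first=True),
    num_layers=2)
w = nn.Linear(d, 1)                      # maps hidden state to a base rate
alpha = nn.Parameter(torch.tensor(0.1))  # elapsed-time coefficient

events = torch.randn(1, 10, d)  # embedded event history (batch, events, dim)
h = encoder(events)             # self-attention over the history

def intensity(t_delta: torch.Tensor) -> torch.Tensor:
    """Conditional intensity t_delta time units after the last event;
    softplus keeps it positive."""
    return F.softplus(w(h[:, -1]) + alpha * t_delta)

lam = intensity(torch.tensor(0.5))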
arXiv Detail & Related papers (2020-02-21T13:48:13Z)