Unlocking the Potential of Deep Learning in Peak-Hour Series Forecasting
- URL: http://arxiv.org/abs/2307.01597v2
- Date: Sat, 30 Sep 2023 05:53:03 GMT
- Title: Unlocking the Potential of Deep Learning in Peak-Hour Series Forecasting
- Authors: Zhenwei Zhang, Xin Wang, Jingyuan Xie, Heling Zhang, Yuantao Gu
- Abstract summary: This paper presents Seq2Peak, a novel framework designed specifically for Peak-Hour Series Forecasting (PHSF) tasks.
It offers two key components: the CyclicNorm pipeline, which mitigates the non-stationarity issue, and a simple yet effective trainable-parameter-free peak-hour decoder.
Experiments on publicly available time series datasets demonstrate the effectiveness of the proposed framework.
- Score: 19.396667925659507
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Unlocking the potential of deep learning in Peak-Hour Series Forecasting
(PHSF) remains a critical yet underexplored task in various domains. While
state-of-the-art deep learning models excel in regular Time Series Forecasting
(TSF), they struggle to achieve comparable results in PHSF. This can be
attributed to the challenges posed by the high degree of non-stationarity in
peak-hour series, which makes direct forecasting more difficult than standard
TSF. Additionally, manually extracting the maximum value from regular
forecasting results leads to suboptimal performance due to models minimizing
the mean deficit. To address these issues, this paper presents Seq2Peak, a
novel framework designed specifically for PHSF tasks, bridging the performance
gap observed in TSF models. Seq2Peak offers two key components: the CyclicNorm
pipeline to mitigate the non-stationarity issue and a simple yet effective
trainable-parameter-free peak-hour decoder with a hybrid loss function that
utilizes both the original series and peak-hour series as supervised signals.
Extensive experimentation on publicly available time series datasets
demonstrates the effectiveness of the proposed framework, yielding a remarkable
average relative improvement of 37.7% across four real-world datasets for both
transformer- and non-transformer-based TSF models.
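The peak-hour decoder and hybrid loss described in the abstract can be sketched as follows. This is a minimal NumPy sketch under assumptions: the function names, the max-over-cycle decoder, and the `alpha` weighting are illustrative stand-ins, not the paper's exact formulation.

```python
import numpy as np

def peak_hour_decode(forecast, cycle=24):
    """Parameter-free peak decoder: take the per-cycle maximum of the full
    hourly forecast. `forecast` has shape (n_days * cycle,); returns one
    peak value per day."""
    return forecast.reshape(-1, cycle).max(axis=1)

def hybrid_loss(pred_full, true_full, cycle=24, alpha=0.5):
    """Hybrid supervision: a weighted sum of MSE on the original series and
    MSE on the derived peak-hour series, so the model is not trained to
    minimize the mean error alone."""
    mse_full = np.mean((pred_full - true_full) ** 2)
    pred_peak = peak_hour_decode(pred_full, cycle)
    true_peak = peak_hour_decode(true_full, cycle)
    mse_peak = np.mean((pred_peak - true_peak) ** 2)
    return alpha * mse_full + (1 - alpha) * mse_peak
```

Because the decoder is just a max over each cycle, it adds no trainable parameters, which matches the abstract's description; the specific blend of the two loss terms is a guess.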
Related papers
- A Review of the Long Horizon Forecasting Problem in Time Series Analysis [0.0]
The long horizon forecasting (LHF) problem has appeared in the time series literature for over 35 years.
Deep learning has incorporated variants of trend, seasonality, Fourier and wavelet transforms, misspecification bias reduction, and bandpass filters.
We highlight time series decomposition techniques, input data preprocessing, and dataset windowing schemes that improve performance.
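The dataset windowing schemes this review highlights can be sketched as a sliding-window pair builder. A minimal illustration; the function name and parameters are assumptions, and real pipelines add normalization and train/validation splits on top.

```python
import numpy as np

def make_windows(series, input_len, horizon, stride=1):
    """Slide a window over the series to build (input, target) pairs:
    each input of length `input_len` is paired with the next `horizon`
    values as the forecasting target."""
    X, Y = [], []
    last_start = len(series) - input_len - horizon
    for start in range(0, last_start + 1, stride):
        X.append(series[start : start + input_len])
        Y.append(series[start + input_len : start + input_len + horizon])
    return np.array(X), np.array(Y)
```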
arXiv Detail & Related papers (2025-06-15T10:49:50Z) - TransDF: Time-Series Forecasting Needs Transformed Label Alignment [53.33409515800757]
We propose Transform-enhanced Direct Forecast (TransDF), which transforms the label sequence into decorrelated components with discriminated significance.
Models are trained to align the most significant components, thereby effectively mitigating label autocorrelation and reducing task amount.
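One way to obtain decorrelated label components with graded significance is a PCA-style eigendecomposition of the label covariance. This is a hedged sketch: PCA is my stand-in for whatever transform TransDF actually uses, and the component weighting is hypothetical.

```python
import numpy as np

def fit_label_transform(Y):
    """Eigendecompose the covariance of a batch of label sequences Y
    (shape: batch x horizon). Returns an orthonormal basis with the
    most significant (highest-variance) components first."""
    Yc = Y - Y.mean(axis=0)
    cov = Yc.T @ Yc / (len(Y) - 1)
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1]      # sort descending by variance
    return eigvecs[:, order], eigvals[order]

def component_loss(pred, true, basis, weights):
    """MSE of the residual projected onto the decorrelated basis, with
    per-component weights so significant components dominate training."""
    err = (pred - true) @ basis
    return np.mean(weights * err ** 2)
```

Training against decorrelated components avoids penalizing the same autocorrelated error pattern once per time step, which is the intuition the summary points at.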
arXiv Detail & Related papers (2025-05-23T13:00:35Z) - MFRS: A Multi-Frequency Reference Series Approach to Scalable and Accurate Time-Series Forecasting [51.94256702463408]
Time series predictability is derived from periodic characteristics at different frequencies.
We propose a novel time series forecasting method based on multi-frequency reference series correlation analysis.
Experiments on major open and synthetic datasets show state-of-the-art performance.
arXiv Detail & Related papers (2025-03-11T11:40:14Z) - PFformer: A Position-Free Transformer Variant for Extreme-Adaptive Multivariate Time Series Forecasting [9.511600544581425]
PFformer is a position-free Transformer-based model designed for single-target MTS forecasting.
PFformer integrates two novel embedding strategies: Enhanced Feature-based Embedding (EFE) and Auto-Encoder-based Embedding (AEE).
arXiv Detail & Related papers (2025-02-27T22:21:27Z) - Battling the Non-stationarity in Time Series Forecasting via Test-time Adaptation [39.7344214193566]
We introduce a pioneering test-time adaptation framework tailored for time series forecasting (TSF).
TAFAS, the proposed approach to TSF-TTA, flexibly adapts source forecasters to continuously shifting test distributions while preserving the core semantic information learned during pre-training.
The novel utilization of partially-observed ground truth and gated calibration module enables proactive, robust, and model-agnostic adaptation of source forecasters.
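The idea of calibrating a source forecaster with partially-observed ground truth can be illustrated with a toy gate. Everything here is an assumption: the sigmoid gate, the mean-bias correction, and the function names are illustrative, not TAFAS's actual module.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gated_calibration(forecast, partial_truth, observed, gate_logit=0.0):
    """Correct a source forecast using only the part of the horizon that has
    already been observed at test time. A (here fixed) gate decides how much
    of the estimated bias to apply, keeping the adaptation model-agnostic."""
    bias = np.mean(partial_truth[observed] - forecast[observed])
    g = sigmoid(gate_logit)
    return forecast + g * bias
```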
arXiv Detail & Related papers (2025-01-09T04:59:15Z) - Is Precise Recovery Necessary? A Task-Oriented Imputation Approach for Time Series Forecasting on Variable Subset [27.180618587832463]
We propose Task-Oriented Imputation for Variable Subset Forecasting (TOI-VSF) for time series forecasting.
TOI-VSF incorporates a self-supervised imputation module, agnostic to the forecasting model, designed to fill in missing variables.
Extensive experiments across four datasets demonstrate the superiority of TOI-VSF, outperforming baseline methods by 15% on average.
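A self-supervised imputation objective of the kind this summary describes can be sketched by masking observed entries and training an imputer to recover them. The masking scheme and the toy mean imputer below are assumptions standing in for TOI-VSF's actual module.

```python
import numpy as np

rng = np.random.default_rng(0)

def self_supervised_batch(X, mask_rate=0.2):
    """Hide a random fraction of the observed entries. The imputer is then
    trained to recover them, so no ground truth for the genuinely missing
    variables is ever required."""
    mask = rng.random(X.shape) < mask_rate
    return np.where(mask, 0.0, X), mask

def mean_imputer(X_masked, mask):
    """Toy imputer: fill each masked entry with its variable's observed mean.
    A learned model would replace this while keeping the same interface."""
    visible = np.where(~mask, X_masked, np.nan)
    col_mean = np.nanmean(visible, axis=0)
    return np.where(mask, col_mean, X_masked)
```

Because the imputer only sees `(X_masked, mask)`, it stays agnostic to the downstream forecasting model, matching the summary's claim.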
arXiv Detail & Related papers (2024-11-15T04:00:54Z) - FM-TS: Flow Matching for Time Series Generation [71.31148785577085]
We introduce FM-TS, a rectified Flow Matching-based framework for Time Series generation.
FM-TS is more efficient in terms of training and inference.
We have achieved superior performance in solar forecasting and MuJoCo imputation tasks.
arXiv Detail & Related papers (2024-11-12T03:03:23Z) - Moirai-MoE: Empowering Time Series Foundation Models with Sparse Mixture of Experts [103.725112190618]
This paper introduces Moirai-MoE, using a single input/output projection layer while delegating the modeling of diverse time series patterns to the sparse mixture of experts.
Extensive experiments on 39 datasets demonstrate the superiority of Moirai-MoE over existing foundation models in both in-distribution and zero-shot scenarios.
arXiv Detail & Related papers (2024-10-14T13:01:11Z) - FAITH: Frequency-domain Attention In Two Horizons for Time Series Forecasting [13.253624747448935]
Time Series Forecasting plays a crucial role in various fields such as industrial equipment maintenance, meteorology, energy consumption, traffic flow and financial investment.
Current deep learning-based predictive models often exhibit a significant deviation between their forecasting outcomes and the ground truth.
We propose a novel model Frequency-domain Attention In Two Horizons, which decomposes time series into trend and seasonal components.
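The trend/seasonal decomposition mentioned here is commonly done with a classical additive scheme, sketched below. This is the textbook moving-average decomposition, not FAITH's actual frequency-domain method; the period and window are assumptions.

```python
import numpy as np

def decompose(series, period=24):
    """Classical additive decomposition: a moving-average trend, a
    periodic-mean seasonal component, and the remainder as residual."""
    kernel = np.ones(period) / period
    trend = np.convolve(series, kernel, mode="same")
    detrended = series - trend
    # Seasonal profile: average the detrended values at each phase of the cycle.
    seasonal = np.array([detrended[i::period].mean() for i in range(period)])
    seasonal = np.tile(seasonal, len(series) // period + 1)[: len(series)]
    residual = series - trend - seasonal
    return trend, seasonal, residual
```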
arXiv Detail & Related papers (2024-05-22T02:37:02Z) - Unified Training of Universal Time Series Forecasting Transformers [104.56318980466742]
We present the Masked Encoder-based Universal Time Series Forecasting Transformer (Moirai).
Moirai is trained on our newly introduced Large-scale Open Time Series Archive (LOTSA) featuring over 27B observations across nine domains.
Moirai achieves competitive or superior performance as a zero-shot forecaster when compared to full-shot models.
arXiv Detail & Related papers (2024-02-04T20:00:45Z) - CARD: Channel Aligned Robust Blend Transformer for Time Series Forecasting [50.23240107430597]
We design a special Transformer, i.e., Channel Aligned Robust Blend Transformer (CARD for short), that addresses key shortcomings of CI type Transformer in time series forecasting.
First, CARD introduces a channel-aligned attention structure that allows it to capture both temporal correlations among signals and dynamical dependence among multiple variables over time.
Second, in order to efficiently utilize the multi-scale knowledge, we design a token blend module to generate tokens with different resolutions.
Third, we introduce a robust loss function for time series forecasting to alleviate the potential overfitting issue.
arXiv Detail & Related papers (2023-05-20T05:16:31Z) - A Novel Method Combines Moving Fronts, Data Decomposition and Deep Learning to Forecast Intricate Time Series [0.0]
Indian Summer Monsoon Rainfall (ISMR) is a very complex time series.
A conventional one-time decomposition technique suffers from information leakage from the future.
The Moving Front (MF) method is proposed to prevent this data leakage.
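The leakage problem and its fix can be illustrated with a causal trend estimate. A trailing average is my stand-in for a decomposition applied only to data up to the forecast origin; the actual Moving Front procedure is more elaborate.

```python
import numpy as np

def causal_trend(series, window=12):
    """Trend estimated from past values only (a trailing average), so the
    decomposition at time t never sees values after t -- the future-information
    leak that a one-time, whole-series decomposition introduces."""
    trend = np.empty(len(series), dtype=float)
    for t in range(len(series)):
        lo = max(0, t - window + 1)
        trend[t] = series[lo : t + 1].mean()
    return trend
```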
arXiv Detail & Related papers (2023-03-11T12:07:26Z) - Generative Time Series Forecasting with Diffusion, Denoise, and Disentanglement [51.55157852647306]
Time series forecasting has been a widely explored task of great importance in many applications.
Real-world time series are often recorded over a short period, leaving a large gap between the capacity of deep models and the limited, noisy data available.
We propose to address the time series forecasting problem with generative modeling and propose a bidirectional variational auto-encoder equipped with diffusion, denoise, and disentanglement.
arXiv Detail & Related papers (2023-01-08T12:20:46Z) - Towards Long-Term Time-Series Forecasting: Feature, Pattern, and Distribution [57.71199089609161]
Long-term time-series forecasting (LTTF) has become a pressing demand in many applications, such as wind power supply planning.
Transformer models have been adopted for their high prediction capacity, delivered by the self-attention mechanism at a high computational cost.
We propose an efficient Transformer-based model, named Conformer, which differentiates itself from existing methods for LTTF in three aspects.
arXiv Detail & Related papers (2023-01-05T13:59:29Z) - CLMFormer: Mitigating Data Redundancy to Revitalize Transformer-based Long-Term Time Series Forecasting System [46.39662315849883]
Long-term time-series forecasting (LTSF) plays a crucial role in various practical applications.
Existing Transformer-based models, such as Fedformer and Informer, often achieve their best performances on validation sets after just a few epochs.
We propose a novel approach to address this issue by employing curriculum learning and introducing a memory-driven decoder.
arXiv Detail & Related papers (2022-07-16T04:05:15Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.