FredNormer: Frequency Domain Normalization for Non-stationary Time Series Forecasting
- URL: http://arxiv.org/abs/2410.01860v4
- Date: Wed, 16 Oct 2024 04:19:09 GMT
- Title: FredNormer: Frequency Domain Normalization for Non-stationary Time Series Forecasting
- Authors: Xihao Piao, Zheng Chen, Yushun Dong, Yasuko Matsubara, Yasushi Sakurai
- Abstract summary: We propose FredNormer, which observes datasets from a frequency perspective and adaptively up-weights the key frequency components.
FredNormer is a plug-and-play module, which does not compromise the efficiency compared to existing normalization methods.
- Score: 18.54376910126127
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recent normalization-based methods have shown great success in tackling the distribution shift issue, facilitating non-stationary time series forecasting. Since these methods operate in the time domain, they may fail to fully capture the dynamic patterns that are more apparent in the frequency domain, leading to suboptimal results. This paper first theoretically analyzes how normalization methods affect frequency components. We prove that the current normalization methods that operate in the time domain uniformly scale non-zero frequencies, and thus, they struggle to determine components that contribute to more robust forecasting. Therefore, we propose FredNormer, which observes datasets from a frequency perspective and adaptively up-weights the key frequency components. To this end, FredNormer consists of two components: a statistical metric that normalizes the input samples based on their frequency stability and a learnable weighting layer that adjusts stability and introduces sample-specific variations. Notably, FredNormer is a plug-and-play module, which does not compromise the efficiency compared to existing normalization methods. Extensive experiments show that FredNormer improves the averaged MSE of backbone forecasting models by 33.3% and 55.3% on the ETTm2 dataset. Compared to the baseline normalization methods, FredNormer achieves 18 top-1 results and 6 top-2 results out of 28 settings.
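The two components described in the abstract — a frequency-stability statistic and a learnable per-frequency weighting — can be sketched as follows. This is a minimal NumPy sketch: the mean-over-std stability metric, and the plain arrays standing in for the learnable weight and bias, are assumptions, since the abstract does not give the paper's exact formulas.

```python
import numpy as np

def frequency_stability(batch):
    """Per-frequency stability across training samples: mean amplitude
    over amplitude std (a hypothetical stand-in for the paper's metric)."""
    amps = np.abs(np.fft.rfft(batch, axis=-1))    # (n_samples, n_freqs)
    return amps.mean(axis=0) / (amps.std(axis=0) + 1e-8)

def frednormer_like(x, stability, w, b):
    """Re-weight one input window's spectrum by its stability score,
    modulated by a learnable per-frequency weight w and bias b."""
    spec = np.fft.rfft(x)
    spec = spec * (stability * w + b)              # up-weight stable bins
    return np.fft.irfft(spec, n=len(x))

rng = np.random.default_rng(0)
t = np.arange(96)
# 32 noisy windows sharing a stable daily (period-24) component
batch = np.sin(2 * np.pi * t / 24) + 0.1 * rng.standard_normal((32, 96))
stab = frequency_stability(batch)
out = frednormer_like(batch[0], stab, np.ones_like(stab), np.zeros_like(stab))
```

With identity weights, the shared period-24 component (the highest-stability bin) is amplified relative to the noise bins, which is the up-weighting behavior the abstract describes.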
Related papers
- Moirai-MoE: Empowering Time Series Foundation Models with Sparse Mixture of Experts
This paper introduces Moirai-MoE, which uses a single input/output projection layer while delegating the modeling of diverse time series patterns to a sparse mixture of experts.
Extensive experiments on 39 datasets demonstrate the superiority of Moirai-MoE over existing foundation models in both in-distribution and zero-shot scenarios.
arXiv Detail & Related papers (2024-10-14T13:01:11Z)
- Tuning Frequency Bias of State Space Models
State space models (SSMs) leverage linear, time-invariant (LTI) systems to learn sequences with long-range dependencies.
We find that SSMs exhibit an implicit bias toward capturing low-frequency components more effectively than high-frequency ones.
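The low-frequency bias of an LTI system can be read directly off its transfer function. A minimal sketch for a scalar state-space recurrence (the parameter values here are illustrative, not from the paper):

```python
import numpy as np

def ssm_gain(a, b, c, omega):
    """Frequency-response magnitude of the scalar LTI recurrence
    h_t = a*h_{t-1} + b*x_t, y_t = c*h_t:  |H(w)| = |c*b| / |1 - a*e^{-iw}|."""
    return abs(c * b / (1 - a * np.exp(-1j * omega)))

# With a stable pole (|a| < 1) near +1, the gain peaks at w = 0 and
# falls toward the Nyquist frequency w = pi: a low-pass bias.
low = ssm_gain(a=0.9, b=1.0, c=1.0, omega=0.1)
high = ssm_gain(a=0.9, b=1.0, c=1.0, omega=np.pi)
```

At the Nyquist frequency the gain is 1/(1 + a), far below the near-DC gain, illustrating why such models capture low-frequency components more effectively.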
arXiv Detail & Related papers (2024-10-02T21:04:22Z)
- Frequency Adaptive Normalization For Non-stationary Time Series Forecasting
Time series forecasting needs to address non-stationary data with evolving trend and seasonal patterns.
To address the non-stationarity, instance normalization has been recently proposed to alleviate impacts from the trend with certain statistical measures.
This paper proposes a new instance normalization solution, called frequency adaptive normalization (FAN), which extends instance normalization in handling both dynamic trend and seasonal patterns.
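A minimal sketch of this idea, under the simplifying assumption that the adaptive step selects the k strongest Fourier components per instance as the non-stationary part (the paper's full method also models how those components evolve over the horizon):

```python
import numpy as np

def fan_split(x, k=3):
    """Remove the k strongest Fourier components of one window (a proxy
    for the non-stationary trend/seasonal part) and return
    (principal, residual); the residual goes to the forecasting model."""
    spec = np.fft.rfft(x)
    main = np.zeros_like(spec)
    top = np.argsort(np.abs(spec))[-k:]        # indices of the k largest bins
    main[top] = spec[top]
    principal = np.fft.irfft(main, n=len(x))   # dominant non-stationary part
    return principal, x - principal

t = np.arange(128)
x = 0.05 * t + np.sin(2 * np.pi * t / 16)      # trend + seasonal pattern
principal, residual = fan_split(x, k=3)
```

The split is exact (principal + residual reconstructs the input), and the residual has lower variance than the raw window, which is what makes it easier to forecast.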
arXiv Detail & Related papers (2024-09-30T15:07:16Z)
- Not All Frequencies Are Created Equal: Towards a Dynamic Fusion of Frequencies in Time-Series Forecasting
Time series forecasting methods should be flexible when applied to different scenarios.
We propose Frequency Dynamic Fusion (FreDF), which individually predicts each Fourier component, and dynamically fuses the output of different frequencies.
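The fusion step can be sketched as a softmax-weighted combination of per-frequency forecasts. The gating logits here are a hypothetical input (in the paper they would come from a learned, data-dependent network):

```python
import numpy as np

def dynamic_fusion(component_preds, logits):
    """Fuse per-frequency forecasts with data-dependent softmax weights.
    component_preds: (n_freqs, horizon), one forecast per Fourier component;
    logits: (n_freqs,) scores from a (hypothetical) gating network."""
    w = np.exp(logits - logits.max())
    w /= w.sum()                               # softmax over frequencies
    return w @ component_preds                 # weighted sum, shape (horizon,)

horizon = np.linspace(0, 2 * np.pi, 24)
preds = np.stack([np.sin(f * horizon) for f in (1.0, 2.0, 3.0)])
fused = dynamic_fusion(preds, logits=np.array([2.0, 0.0, 0.0]))
```

Uniform logits reduce the fusion to a plain average; skewed logits let the model emphasize whichever frequency is most predictive for the current input.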
arXiv Detail & Related papers (2024-07-17T08:54:41Z)
- Fredformer: Frequency Debiased Transformer for Time Series Forecasting
The Transformer model has shown leading performance in time series forecasting.
It tends to learn low-frequency features in the data and overlook high-frequency features, showing a frequency bias.
We propose Fredformer, a framework designed to mitigate frequency bias by learning features equally across different frequency bands.
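One simple way to counteract such a bias is to rescale each frequency sub-band to comparable amplitude before feature learning. The sketch below is an illustrative simplification of the debiasing goal, not the paper's exact mechanism:

```python
import numpy as np

def band_equalize(x, n_bands=4):
    """Split the spectrum into equal sub-bands and rescale each to unit
    mean amplitude, so that dominant low-frequency bins cannot drown out
    the high-frequency ones during feature learning."""
    spec = np.fft.rfft(x)
    bands = np.array_split(spec, n_bands)
    return np.concatenate([b / (np.abs(b).mean() + 1e-8) for b in bands])

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 256, endpoint=False)
# Strong low-frequency component, weak high-frequency one, broadband noise
x = (5.0 * np.sin(2 * np.pi * 3 * t)
     + 0.2 * np.sin(2 * np.pi * 100 * t)
     + 0.05 * rng.standard_normal(256))
eq = band_equalize(x, n_bands=4)
```

After equalization every sub-band has roughly unit mean amplitude, so a downstream model sees low and high frequencies on an equal footing.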
arXiv Detail & Related papers (2024-06-13T11:29:21Z)
- Boosting Diffusion Models with Moving Average Sampling in Frequency Domain
Diffusion models rely on the current sample to denoise the next one, possibly resulting in denoising instability.
In this paper, we reinterpret the iterative denoising process as model optimization and leverage a moving average mechanism to ensemble all the prior samples.
We name the complete approach "Moving Average Sampling in Frequency domain (MASF)".
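The moving-average idea can be sketched on a generic 1-D signal for illustration (the paper applies it to diffusion denoising of images; the decay factor beta here is a hypothetical choice):

```python
import numpy as np

def masf_update(avg_spec, new_sample, beta=0.9):
    """Ensemble the latest denoised sample into a running average kept in
    the frequency domain; returns (updated avg spectrum, its time signal)."""
    spec = np.fft.rfft(new_sample)
    avg_spec = beta * avg_spec + (1.0 - beta) * spec
    return avg_spec, np.fft.irfft(avg_spec, n=len(new_sample))

rng = np.random.default_rng(0)
clean = np.sin(np.linspace(0, 4 * np.pi, 64))
avg = np.zeros(33, dtype=complex)              # rfft length for n = 64
for _ in range(200):                           # stream of noisy samples
    avg, signal = masf_update(avg, clean + 0.3 * rng.standard_normal(64))
```

Because the exponential average damps the independent noise in each sample, the ensembled signal ends up much closer to the clean target than any single noisy sample, which is the stabilizing effect the abstract describes.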
arXiv Detail & Related papers (2024-03-26T16:57:55Z)
- FreDF: Learning to Forecast in Frequency Domain
Time series modeling is uniquely challenged by the presence of autocorrelation in both historical and label sequences.
We introduce the Frequency-enhanced Direct Forecast (FreDF) which bypasses the complexity of label autocorrelation by learning to forecast in the frequency domain.
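The core idea — scoring forecasts against labels in the frequency domain rather than only in the time domain — can be sketched as a blended training loss. The mixing weight alpha and the exact term shapes are assumptions; the paper's loss may differ in detail:

```python
import numpy as np

def fredf_like_loss(pred, target, alpha=0.5):
    """Blend time-domain MSE with an L1 distance between the Fourier
    coefficients of the prediction and the label sequence."""
    time_term = np.mean((pred - target) ** 2)
    freq_term = np.mean(np.abs(np.fft.rfft(pred) - np.fft.rfft(target)))
    return alpha * time_term + (1.0 - alpha) * freq_term

y = np.sin(np.linspace(0, 4 * np.pi, 64))
perfect = fredf_like_loss(y, y)
shifted = fredf_like_loss(np.roll(y, 3), y)
```

A perfect forecast scores zero; a time-shifted forecast is penalized through both terms, including its phase error in the frequency term.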
arXiv Detail & Related papers (2024-02-04T08:23:41Z)
- Supervised low-rank semi-nonnegative matrix factorization with frequency regularization for forecasting spatio-temporal data
We propose a methodology for forecasting spatio-temporal data using supervised low-rank semi-nonnegative matrix factorization with frequency regularization.
We find that the results with the proposed methodology are comparable to previous research in the field of geophysical sciences but offer clearer interpretability.
arXiv Detail & Related papers (2023-11-15T01:23:13Z)
- Transform Once: Efficient Operator Learning in Frequency Domain
We study deep neural networks designed to harness the structure in frequency domain for efficient learning of long-range correlations in space or time.
This work introduces a blueprint for frequency domain learning through a single transform: transform once (T1).
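The blueprint can be sketched as: transform the input once, apply every layer as an operation on the spectrum, and inverse-transform once at the end — instead of paying a forward/inverse transform pair per layer. The pointwise-multiply layer below is an illustrative stand-in for T1's learned frequency-domain layers:

```python
import numpy as np

def t1_forward(x, layer_weights):
    """Transform once, apply all layers in the frequency domain, then
    inverse-transform once (one FFT pair total, not one per layer)."""
    spec = np.fft.rfft(x)
    for w in layer_weights:                    # each layer: pointwise multiply
        spec = spec * w
    return np.fft.irfft(spec, n=len(x))

x = np.cos(np.linspace(0, 6 * np.pi, 64))
identity_layers = [np.ones(33)] * 3            # rfft length for n = 64
y = t1_forward(x, identity_layers)
```

With identity weights the round trip reproduces the input, confirming that the single transform pair is lossless; the efficiency gain comes from amortizing that one pair across all layers.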
arXiv Detail & Related papers (2022-11-26T01:56:05Z)
- FAMLP: A Frequency-Aware MLP-Like Architecture For Domain Generalization
We propose a novel frequency-aware architecture, in which the domain-specific features are filtered out in the transformed frequency domain.
Experiments on three benchmarks demonstrate significant performance gains, outperforming the state-of-the-art methods by margins of 3%, 4%, and 9%, respectively.
arXiv Detail & Related papers (2022-03-24T07:26:29Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences of its use.