TIFO: Time-Invariant Frequency Operator for Stationarity-Aware Representation Learning in Time Series
- URL: http://arxiv.org/abs/2602.17122v1
- Date: Thu, 19 Feb 2026 06:46:54 GMT
- Title: TIFO: Time-Invariant Frequency Operator for Stationarity-Aware Representation Learning in Time Series
- Authors: Xihao Piao, Zheng Chen, Lingwei Zhu, Yushun Dong, Yasuko Matsubara, Yasushi Sakurai
- Abstract summary: Nonstationary time series forecasting suffers from the distribution shift issue due to the different distributions that produce the training and test data. Existing methods attempt to alleviate the dependence by, e.g., removing low-order moments from each individual sample. We propose a Time-Invariant Frequency Operator (TIFO), which learns stationarity-aware weights over the frequency spectrum.
- Score: 30.009887153375345
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Nonstationary time series forecasting suffers from the distribution shift issue due to the different distributions that produce the training and test data. Existing methods attempt to alleviate the dependence by, e.g., removing low-order moments from each individual sample. These solutions fail to capture the underlying time-evolving structure across samples and do not model the complex time structure. In this paper, we aim to address the distribution shift in the frequency space by considering all possible time structures. To this end, we propose a Time-Invariant Frequency Operator (TIFO), which learns stationarity-aware weights over the frequency spectrum across the entire dataset. The weight representation highlights stationary frequency components while suppressing non-stationary ones, thereby mitigating the distribution shift issue in time series. To justify our method, we show that the Fourier transform of time series data implicitly induces eigen-decomposition in the frequency space. TIFO is a plug-and-play approach that can be seamlessly integrated into various forecasting models. Experiments demonstrate our method achieves 18 top-1 and 6 top-2 results out of 28 forecasting settings. Notably, it yields 33.3% and 55.3% improvements in average MSE on the ETTm2 dataset. In addition, TIFO reduces computational costs by 60%-70% compared to baseline methods, demonstrating strong scalability across diverse forecasting models.
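The paper's code is not reproduced here, but the core operation the abstract describes — a learned weight per frequency bin that keeps stationary components and suppresses non-stationary ones — can be sketched in plain NumPy. The function name `frequency_reweight` and the fixed weight vector below are illustrative assumptions; in TIFO the weights are learned across the entire dataset rather than hand-set.

```python
import numpy as np

def frequency_reweight(x, w):
    """Reweight a series' frequency spectrum and return to the time domain.

    x : (n,) real-valued time series window
    w : (n//2 + 1,) nonnegative weights over the rFFT bins
        (in TIFO these would be learned; here they are simply given).
    """
    spec = np.fft.rfft(x)                 # complex spectrum, n//2 + 1 bins
    return np.fft.irfft(w * spec, n=len(x))

# Toy usage: keep only the lowest 8 frequency bins, so the slow sine
# (bin 4 for period 64 over n=256) survives while most noise is suppressed.
rng = np.random.default_rng(0)
t = np.arange(256)
x = np.sin(2 * np.pi * t / 64) + 0.5 * rng.standard_normal(256)
w = np.zeros(129)
w[:8] = 1.0
x_smooth = frequency_reweight(x, w)
```

With all-ones weights the operator is the identity (rFFT followed by inverse rFFT), which is a convenient sanity check for any plug-and-play frequency module.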
Related papers
- LEFT: Learnable Fusion of Tri-view Tokens for Unsupervised Time Series Anomaly Detection [53.191369031661885]
Unsupervised time series anomaly detection aims to build a model for identifying abnormal timestamps without assuming the availability of annotations. We present Learnable Fusion of Tri-view Tokens (LEFT), a unified unsupervised TSAD framework that models anomalies as inconsistencies across complementary representations. Experiments on real-world benchmarks show that LEFT yields the best detection accuracy against SOTA baselines, while achieving a 5x reduction in FLOPs and an 8x speed-up for training.
arXiv Detail & Related papers (2026-02-09T13:33:49Z) - Bridging Time and Frequency: A Joint Modeling Framework for Irregular Multivariate Time Series Forecasting [7.6757168009144126]
We propose TFMixer, a joint time-frequency modeling framework for IMTS forecasting. Specifically, TFMixer incorporates a Global Frequency Module that employs a learnable Non-Uniform Discrete Fourier Transform (NUDFT) to directly extract spectral representations from irregular timestamps. In parallel, the Local Time Module introduces a query-based patch mixing mechanism to adaptively aggregate informative temporal patches and alleviate information density imbalance.
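TFMixer's NUDFT uses learnable frequencies; as a fixed-frequency illustration (the function name and parameters are assumptions, not taken from the paper), a non-uniform DFT over irregular timestamps is just a matrix of complex exponentials applied to the samples:

```python
import numpy as np

def nudft(t, y, freqs):
    """Non-uniform DFT: evaluate the spectrum of samples y taken at
    arbitrary (irregular) times t, at the given frequencies.

    t, y  : (n,) sample times and values
    freqs : (k,) frequencies in cycles per unit time
            (learnable in TFMixer; fixed here).
    Returns a (k,) complex spectrum.
    """
    # (k, n) matrix of complex exponentials e^{-2*pi*i*f*t}
    basis = np.exp(-2j * np.pi * np.outer(freqs, t))
    return basis @ y

# Usage: a 5 Hz sinusoid sampled at 200 irregular times in [0, 1).
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0.0, 1.0, 200))
y = np.sin(2 * np.pi * 5.0 * t)
freqs = np.arange(1.0, 20.0)
spec = nudft(t, y, freqs)
```

Despite the irregular sampling, the spectrum magnitude peaks at the true 5 Hz frequency, which is the property a frequency module for IMTS needs.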
arXiv Detail & Related papers (2026-01-31T07:49:44Z) - A Unified Frequency Domain Decomposition Framework for Interpretable and Robust Time Series Forecasting [81.73338008264115]
Current approaches for time series forecasting, whether in the time or frequency domain, predominantly use deep learning models based on linear layers or transformers. We propose FIRE, a unified frequency domain decomposition framework that provides a mathematical abstraction for diverse types of time series. FIRE consistently outperforms state-of-the-art models on long-term forecasting benchmarks.
arXiv Detail & Related papers (2025-10-11T09:59:25Z) - Fourier Basis Mapping: A Time-Frequency Learning Framework for Time Series Forecasting [25.304812011127257]
We introduce a novel method for integrating time-frequency features through Fourier basis expansion and mapping in the time-frequency space. Our approach extracts explicit frequency features while preserving temporal characteristics. The results are validated on diverse real-world datasets for both long-term and short-term forecasting tasks.
arXiv Detail & Related papers (2025-07-13T01:45:27Z) - LSCD: Lomb-Scargle Conditioned Diffusion for Time series Imputation [55.800319453296886]
Time series with missing or irregularly sampled data are a persistent challenge in machine learning. We introduce a differentiable Lomb-Scargle layer that enables a reliable computation of the power spectrum of irregularly sampled data.
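The Lomb-Scargle periodogram that the LSCD layer differentiates through can be written directly. This is a plain NumPy sketch of the classical computation (function name and usage values are illustrative assumptions), not the paper's differentiable implementation:

```python
import numpy as np

def lomb_scargle(t, y, omegas):
    """Classical Lomb-Scargle periodogram for irregularly sampled data.

    t, y   : (n,) sample times and values (mean is removed internally)
    omegas : (k,) angular frequencies > 0 to evaluate
    """
    y = y - y.mean()
    power = np.empty(len(omegas))
    for i, w in enumerate(omegas):
        # The offset tau makes the periodogram invariant to time shifts.
        tau = np.arctan2(np.sum(np.sin(2 * w * t)),
                         np.sum(np.cos(2 * w * t))) / (2 * w)
        c = np.cos(w * (t - tau))
        s = np.sin(w * (t - tau))
        power[i] = 0.5 * ((c @ y) ** 2 / (c @ c) + (s @ y) ** 2 / (s @ s))
    return power

# Usage: noisy sinusoid at angular frequency 3.0, irregular sampling.
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0.0, 20.0, 300))
y = np.sin(3.0 * t) + 0.3 * rng.standard_normal(300)
omegas = np.linspace(0.5, 6.0, 56)
power = lomb_scargle(t, y, omegas)
```

The periodogram peak lands at the true angular frequency even though the sample spacing is non-uniform, which is why Lomb-Scargle is the standard spectral estimator for irregular data.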
arXiv Detail & Related papers (2025-06-20T14:48:42Z) - TimeCF: A TimeMixer-Based Model with adaptive Convolution and Sharpness-Aware Minimization Frequency Domain Loss for long-term time series forecasting [5.032613143415414]
We propose TimeCF, a deep learning model for long-term time series forecasting based on TimeMixer. TimeCF decomposes the original time series into sequences of different scales. The different scales are then aggregated through a feed-forward network.
arXiv Detail & Related papers (2025-05-23T06:39:20Z) - FreqMoE: Enhancing Time Series Forecasting through Frequency Decomposition Mixture of Experts [14.01018670507771]
We propose the Frequency Decomposition Mixture-of-Experts (FreqMoE) model, which decomposes time series data into frequency bands. A gating mechanism adjusts the importance of each expert's output based on frequency characteristics. Experiments demonstrate that FreqMoE outperforms state-of-the-art models.
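A minimal sketch, with assumed names, of the band-decomposition-plus-gating idea: partition the rFFT bins into bands, reconstruct one time-domain component per band, and mix them with softmax gate weights. In FreqMoE the gate and experts are learned networks; here the gate logits are simply given.

```python
import numpy as np

def freq_band_split(x, n_bands):
    """Split a series into n_bands components by partitioning rFFT bins."""
    spec = np.fft.rfft(x)
    edges = np.linspace(0, len(spec), n_bands + 1).astype(int)
    bands = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        masked = np.zeros_like(spec)
        masked[lo:hi] = spec[lo:hi]       # keep only this band's bins
        bands.append(np.fft.irfft(masked, n=len(x)))
    return np.stack(bands)                # (n_bands, n)

def gated_mix(bands, logits):
    """Softmax-gate the band components (logits would come from a
    learned gating network; here they are fixed inputs)."""
    g = np.exp(logits - logits.max())
    g = g / g.sum()
    return g @ bands

# Usage on a random series.
rng = np.random.default_rng(0)
x = rng.standard_normal(256)
bands = freq_band_split(x, 4)
```

Because the bands partition the spectrum, summing all components recovers the original series exactly; uniform gate logits therefore yield the series scaled by 1/n_bands.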
arXiv Detail & Related papers (2025-01-25T08:25:52Z) - FlowTS: Time Series Generation via Rectified Flow [67.41208519939626]
FlowTS is an ODE-based model that leverages rectified flow with straight-line transport in probability space. In the unconditional setting, FlowTS achieves state-of-the-art performance, with context FID scores of 0.019 and 0.011 on the Stock and ETTh datasets. In the conditional setting, it achieves superior performance in solar forecasting.
arXiv Detail & Related papers (2024-11-12T03:03:23Z) - Moirai-MoE: Empowering Time Series Foundation Models with Sparse Mixture of Experts [103.725112190618]
This paper introduces Moirai-MoE, using a single input/output projection layer while delegating the modeling of diverse time series patterns to the sparse mixture of experts.
Extensive experiments on 39 datasets demonstrate the superiority of Moirai-MoE over existing foundation models in both in-distribution and zero-shot scenarios.
arXiv Detail & Related papers (2024-10-14T13:01:11Z) - Deep Frequency Derivative Learning for Non-stationary Time Series Forecasting [12.989064148254936]
We present a deep frequency derivative learning framework, DERITS, for non-stationary time series forecasting.
Specifically, DERITS is built upon a novel reversible transformation, namely Frequency Derivative Transformation (FDT)
arXiv Detail & Related papers (2024-06-29T17:56:59Z) - Generative Time Series Forecasting with Diffusion, Denoise, and Disentanglement [51.55157852647306]
Time series forecasting has been a widely explored task of great importance in many applications.
It is common that real-world time series data are recorded in a short time period, which results in a big gap between the deep model and the limited and noisy time series.
We propose to address the time series forecasting problem with generative modeling and propose a bidirectional variational auto-encoder equipped with diffusion, denoise, and disentanglement.
arXiv Detail & Related papers (2023-01-08T12:20:46Z) - Transform Once: Efficient Operator Learning in Frequency Domain [69.74509540521397]
We study deep neural networks designed to harness the structure in frequency domain for efficient learning of long-range correlations in space or time.
This work introduces a blueprint for frequency domain learning through a single transform: transform once (T1)
arXiv Detail & Related papers (2022-11-26T01:56:05Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the accuracy of the listed information and is not responsible for any consequences of its use.