IN-Flow: Instance Normalization Flow for Non-stationary Time Series Forecasting
- URL: http://arxiv.org/abs/2401.16777v2
- Date: Thu, 06 Feb 2025 21:04:26 GMT
- Title: IN-Flow: Instance Normalization Flow for Non-stationary Time Series Forecasting
- Authors: Wei Fan, Shun Zheng, Pengyang Wang, Rui Xie, Kun Yi, Qi Zhang, Jiang Bian, Yanjie Fu
- Abstract summary: We propose a decoupled formulation for time series forecasting with no reliance on fixed statistics.
We also propose instance normalization flow (IN-Flow), a novel invertible network for time series transformation.
- Score: 38.4809915448213
- Abstract: Due to the non-stationarity of time series, the distribution shift problem largely hinders the performance of time series forecasting. Existing solutions either rely on certain statistics to specify the shift or develop specific mechanisms for particular network architectures. However, the former fails for unknown shifts beyond simple statistics, while the latter has limited compatibility with different forecasting models. To overcome these problems, we first propose a decoupled formulation for time series forecasting, with no reliance on fixed statistics and no restriction on forecasting architectures. This formulation regards the shift-removal procedure as a special transformation between a raw distribution and a desired target distribution, and separates it from forecasting. Such a formulation is further formalized as a bi-level optimization problem, enabling the joint learning of the transformation (outer loop) and the forecasting (inner loop). Moreover, the transformation's special requirements of expressiveness and bi-directionality motivate us to propose instance normalization flow (IN-Flow), a novel invertible network for time series transformation. Different from classic "normalizing flow" models, IN-Flow does not aim to normalize the input to a prior distribution (e.g., a Gaussian) for generation; instead, it transforms the time series distribution by stacking normalization layers and flow-based invertible networks, and is thus named a "normalization" flow. Finally, we conduct extensive experiments on both synthetic and real-world data, which demonstrate the superiority of our method.
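To make the decoupled formulation above concrete, here is a minimal PyTorch sketch of the idea: an invertible transform g maps the raw series into a shift-removed space, an arbitrary forecasting backbone operates in that space, and the exact inverse of g maps predictions back. All class and variable names are illustrative (this is not the authors' code), the sketch assumes equal lookback and horizon lengths, and the paper's bi-level training of g (outer loop) and the forecaster (inner loop) is omitted.

```python
import torch
import torch.nn as nn

class AffineCoupling(nn.Module):
    """One invertible flow block: affine-transform one half of the vector,
    with scale/shift predicted from the other half (exactly invertible)."""
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.d = dim // 2
        self.net = nn.Sequential(
            nn.Linear(self.d, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * (dim - self.d)),
        )

    def forward(self, x):
        x1, x2 = x[..., :self.d], x[..., self.d:]
        log_s, t = self.net(x1).chunk(2, dim=-1)
        return torch.cat([x1, x2 * torch.exp(torch.tanh(log_s)) + t], dim=-1)

    def inverse(self, y):
        y1, y2 = y[..., :self.d], y[..., self.d:]
        log_s, t = self.net(y1).chunk(2, dim=-1)
        return torch.cat([y1, (y2 - t) * torch.exp(-torch.tanh(log_s))], dim=-1)

class InvertibleTransform(nn.Module):
    """Stack of invertible blocks standing in for IN-Flow's transformation g."""
    def __init__(self, dim, n_blocks=2):
        super().__init__()
        self.blocks = nn.ModuleList([AffineCoupling(dim) for _ in range(n_blocks)])

    def forward(self, x):
        for b in self.blocks:
            x = b(x)
        return x

    def inverse(self, y):
        for b in reversed(self.blocks):
            y = b.inverse(y)
        return y

# Decoupled pipeline: transform -> forecast in the transformed space -> map back.
dim = 96                            # assumes lookback length == horizon length
g = InvertibleTransform(dim)
f = nn.Linear(dim, dim)             # any forecasting backbone fits here
x = torch.randn(8, dim)             # batch of raw (shifted) series
y_hat = g.inverse(f(g(x)))
```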
Related papers
- Flow-based Conformal Prediction for Multi-dimensional Time Series [9.900139803164372]
We propose a novel conformal prediction method that addresses two key challenges by integrating a Transformer and a normalizing flow.
The Transformer encodes the historical context of the time series, and the normalizing flow learns the transformation from the base distribution to the distribution of non-conformity scores, conditioned on the encoded historical context.
Through comprehensive experiments on simulated and real-world time series datasets, we demonstrate that our method achieves smaller prediction regions than the baselines while satisfying the desired coverage.
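As a rough illustration of the flow component described above, here is a minimal single-layer conditional affine flow with a standard normal base, written in PyTorch. The Transformer encoder is abstracted away as a context vector h, and names such as ConditionalAffineFlow and ctx_dim are hypothetical rather than taken from the paper.

```python
import math
import torch
import torch.nn as nn

class ConditionalAffineFlow(nn.Module):
    """Single affine flow: score = mu(h) + exp(log_sigma(h)) * z, with z ~ N(0, 1)."""
    def __init__(self, ctx_dim):
        super().__init__()
        self.head = nn.Linear(ctx_dim, 2)  # maps encoded history h to (mu, log_sigma)

    def log_prob(self, score, h):
        mu, log_sigma = self.head(h).chunk(2, dim=-1)
        z = (score - mu) * torch.exp(-log_sigma)               # invert the flow
        log_base = -0.5 * z ** 2 - 0.5 * math.log(2 * math.pi)
        return (log_base - log_sigma).sum(-1)                  # change of variables

    def quantile(self, h, q):
        """Push a base-distribution quantile through the flow to size a region."""
        mu, log_sigma = self.head(h).chunk(2, dim=-1)
        z_q = math.sqrt(2.0) * torch.erfinv(torch.tensor(2.0 * q - 1.0))
        return mu + torch.exp(log_sigma) * z_q
```

Training would maximize log_prob on calibration non-conformity scores; at test time, the (1 - alpha) quantile of the learned conditional distribution sizes the prediction region.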
arXiv Detail & Related papers (2025-02-08T22:04:05Z)
- Timer-XL: Long-Context Transformers for Unified Time Series Forecasting [67.83502953961505]
We present Timer-XL, a generative Transformer for unified time series forecasting.
Timer-XL achieves state-of-the-art performance across challenging forecasting benchmarks through a unified approach.
arXiv Detail & Related papers (2024-10-07T07:27:39Z)
- Frequency Adaptive Normalization For Non-stationary Time Series Forecasting [7.881136718623066]
Time series forecasting needs to address non-stationary data with evolving trend and seasonal patterns.
To address the non-stationarity, instance normalization has recently been proposed to alleviate the impact of the trend using certain statistical measures.
This paper proposes a new instance normalization solution, called frequency adaptive normalization (FAN), which extends instance normalization in handling both dynamic trend and seasonal patterns.
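The frequency split that this summary describes can be sketched with a Fourier transform: remove the k largest-magnitude frequency components at the instance level, forecast the residual, and later add back a prediction of how those components evolve. The helper below is only an illustration of that split, not FAN's actual implementation; the step that predicts the evolution of the removed components is left abstract.

```python
import torch

def split_principal_frequencies(x, k):
    """Split series of shape (..., length) into the k largest-magnitude
    Fourier components (a trend/seasonality proxy) and the residual."""
    spec = torch.fft.rfft(x, dim=-1)
    topk = spec.abs().topk(k, dim=-1).indices
    filtered = torch.zeros_like(spec)
    filtered.scatter_(-1, topk, spec.gather(-1, topk))  # keep only top-k bins
    principal = torch.fft.irfft(filtered, n=x.shape[-1], dim=-1)
    return principal, x - principal

x = torch.randn(8, 96)                      # a batch of lookback windows
principal, residual = split_principal_frequencies(x, k=3)
# Forecast `residual` with any backbone; FAN additionally learns to predict
# how `principal` evolves over the horizon before recombining.
```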
arXiv Detail & Related papers (2024-09-30T15:07:16Z)
- Evolving Multi-Scale Normalization for Time Series Forecasting under Distribution Shifts [20.02869280775877]
We propose a novel model-agnostic Evolving Multi-Scale Normalization (EvoMSN) framework to tackle the distribution shift problem.
We evaluate the effectiveness of EvoMSN in improving the performance of five mainstream forecasting methods on benchmark datasets.
arXiv Detail & Related papers (2024-09-29T14:26:22Z)
- Marginalization Consistent Mixture of Separable Flows for Probabilistic Irregular Time Series Forecasting [4.714246221974192]
We develop a novel probabilistic irregular time series forecasting model, Marginalization Consistent Mixtures of Separable Flows (moses).
moses outperforms other state-of-the-art marginalization-consistent models and performs on par with ProFITi, but, unlike ProFITi, it guarantees marginalization consistency.
arXiv Detail & Related papers (2024-06-11T13:28:43Z)
- Diffusion models for probabilistic programming [56.47577824219207]
Diffusion Model Variational Inference (DMVI) is a novel method for automated approximate inference in probabilistic programming languages (PPLs).
DMVI is easy to implement, allows hassle-free inference in PPLs without the drawbacks of, e.g., variational inference using normalizing flows, and does not impose any constraints on the underlying neural network model.
arXiv Detail & Related papers (2023-11-01T12:17:05Z)
- End-to-End Modeling Hierarchical Time Series Using Autoregressive Transformer and Conditional Normalizing Flow based Reconciliation [13.447952588934337]
We propose a novel end-to-end hierarchical time series forecasting model based on an autoregressive Transformer with conditional normalizing flow based reconciliation.
Unlike other state-of-the-art methods, we achieve forecasting and reconciliation simultaneously, without requiring any explicit post-processing step.
arXiv Detail & Related papers (2022-12-28T05:43:57Z)
- Efficient CDF Approximations for Normalizing Flows [64.60846767084877]
We build upon the diffeomorphic properties of normalizing flows to estimate the cumulative distribution function (CDF) over a closed region.
Our experiments on popular flow architectures and UCI datasets show a marked improvement in sample efficiency as compared to traditional estimators.
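For context, the "traditional estimator" that this paper improves upon is plain Monte Carlo: sample from the flow and count the fraction of samples inside the region. The sketch below shows only that baseline (the sampler and region test are placeholders); the paper's diffeomorphism-based estimator itself is more involved.

```python
import torch

def mc_region_probability(sample_fn, in_region, n_samples=100_000):
    """Naive Monte Carlo estimate of P(X in R) under a flow's model distribution."""
    x = sample_fn(n_samples)                 # assumed: the flow exposes a sampler
    return in_region(x).float().mean().item()

# Example with stand-ins: a Gaussian "flow" and the 2-D unit box as the region.
est = mc_region_probability(
    lambda n: torch.randn(n, 2),
    lambda x: (x.abs() <= 1.0).all(dim=-1),
)
```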
arXiv Detail & Related papers (2022-02-23T06:11:49Z)
- Causally-motivated Shortcut Removal Using Auxiliary Labels [63.686580185674195]
A key challenge in learning such risk-invariant predictors is shortcut learning.
We propose a flexible, causally-motivated approach to address this challenge.
We show both theoretically and empirically that this causally-motivated regularization scheme yields robust predictors.
arXiv Detail & Related papers (2021-05-13T16:58:45Z)
- Generalized Entropy Regularization or: There's Nothing Special about Label Smoothing [83.78668073898001]
We introduce a family of entropy regularizers, which includes label smoothing as a special case.
We find that variance in model performance can be explained largely by the resulting entropy of the model.
We advise the use of other entropy regularization methods in its place.
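Two well-known members of this family can be written down directly: label smoothing corresponds (up to constants) to adding a KL(u || p) penalty toward the uniform distribution u, while the confidence penalty adds KL(p || u). The sketch below shows both as regularized losses; the weight beta and the function name are illustrative, not the paper's code.

```python
import math
import torch
import torch.nn.functional as F

def regularized_nll(logits, target, beta=0.1, mode="label_smoothing"):
    """Cross-entropy plus an entropy regularizer toward the uniform distribution."""
    log_p = F.log_softmax(logits, dim=-1)
    nll = F.nll_loss(log_p, target)
    k = logits.size(-1)
    if mode == "label_smoothing":      # D_KL(u || p) = -log k - mean_i log p_i
        reg = -log_p.mean(dim=-1).mean() - math.log(k)
    else:                              # confidence penalty: D_KL(p || u)
        reg = (log_p.exp() * log_p).sum(dim=-1).mean() + math.log(k)
    return nll + beta * reg
```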
arXiv Detail & Related papers (2020-05-02T12:46:28Z)