FADTI: Fourier and Attention Driven Diffusion for Multivariate Time Series Imputation
- URL: http://arxiv.org/abs/2512.15116v1
- Date: Wed, 17 Dec 2025 06:16:31 GMT
- Title: FADTI: Fourier and Attention Driven Diffusion for Multivariate Time Series Imputation
- Authors: Runze Li, Hanchen Wang, Wenjie Zhang, Binghao Li, Yu Zhang, Xuemin Lin, Ying Zhang
- Abstract summary: We propose FADTI, a diffusion-based framework that injects frequency-informed feature modulation via a learnable Fourier Bias Projection (FBP) module. Experiments on multiple benchmarks, including a newly introduced biological time series dataset, show that FADTI consistently outperforms state-of-the-art methods.
- Score: 26.523421387059035
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Multivariate time series imputation is fundamental in applications such as healthcare, traffic forecasting, and biological modeling, where sensor failures and irregular sampling lead to pervasive missing values. However, existing Transformer- and diffusion-based models lack explicit inductive biases and frequency awareness, limiting their generalization under structured missing patterns and distribution shifts. We propose FADTI, a diffusion-based framework that injects frequency-informed feature modulation via a learnable Fourier Bias Projection (FBP) module and combines it with temporal modeling through self-attention and gated convolution. FBP supports multiple spectral bases, enabling adaptive encoding of both stationary and non-stationary patterns. This design injects frequency-domain inductive bias into the generative imputation process. Experiments on multiple benchmarks, including a newly introduced biological time series dataset, show that FADTI consistently outperforms state-of-the-art methods, particularly under high missing rates. Code is available at https://anonymous.4open.science/r/TimeSeriesImputation-52BF
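The abstract describes the FBP module only at a high level. As a minimal illustrative sketch (not the paper's implementation; the function names, weight layout, and use of raw DFT magnitudes are all assumptions), a frequency-informed bias could be produced by projecting a channel's magnitude spectrum through a learned weight matrix:

```python
import cmath

def dft_magnitudes(x):
    """Magnitude spectrum of a 1-D sequence via a naive DFT (O(n^2))."""
    n = len(x)
    return [abs(sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n)))
            for k in range(n // 2 + 1)]

def fourier_bias(x, weights):
    """Project the magnitude spectrum through a weight matrix to get a
    per-time-step bias; `weights` has one row per output step, with
    n//2 + 1 entries per row (one per non-redundant frequency bin)."""
    mags = dft_magnitudes(x)
    return [sum(w * m for w, m in zip(row, mags)) for row in weights]

# Toy usage: a period-4 oscillation of length 8; "learned" weights that
# simply pick out frequency bin k=2, where all the signal energy lives.
series = [0.0, 1.0, 0.0, -1.0, 0.0, 1.0, 0.0, -1.0]
n_bins = len(series) // 2 + 1
weights = [[1.0 if k == 2 else 0.0 for k in range(n_bins)] for _ in range(len(series))]
bias = fourier_bias(series, weights)
```

In a real model the weights would be trained end-to-end and the bias added to the backbone's hidden features; here they are fixed to keep the example self-contained.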
Related papers
- WaveletDiff: Multilevel Wavelet Diffusion For Time Series Generation [15.28917997999402]
WaveletDiff is a novel framework that trains diffusion models directly on wavelet coefficients to exploit the inherent multi-resolution structure of time series data. It consistently outperforms state-of-the-art time-domain and frequency-domain generative methods on both short and long time series across five diverse performance metrics.
arXiv Detail & Related papers (2025-10-13T18:47:33Z) - A Unified Frequency Domain Decomposition Framework for Interpretable and Robust Time Series Forecasting [81.73338008264115]
Current approaches for time series forecasting, whether in the time or frequency domain, predominantly use deep learning models based on linear layers or transformers. We propose FIRE, a unified frequency domain decomposition framework that provides a mathematical abstraction for diverse types of time series. FIRE consistently outperforms state-of-the-art models on long-term forecasting benchmarks.
arXiv Detail & Related papers (2025-10-11T09:59:25Z) - LSCD: Lomb-Scargle Conditioned Diffusion for Time series Imputation [55.800319453296886]
Time series with missing or irregularly sampled data are a persistent challenge in machine learning. We introduce a differentiable Lomb-Scargle layer that enables reliable computation of the power spectrum of irregularly sampled data.
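LSCD builds on the Lomb-Scargle periodogram. Independent of the paper's differentiable layer, the classical estimator, which works directly on irregular sample times, can be sketched as:

```python
import math
import random

def lomb_scargle(t, y, freqs):
    """Classical Lomb-Scargle periodogram for irregularly sampled data.

    t, y: sample times and values; freqs: frequencies in cycles per unit time.
    Returns one power value per frequency."""
    ybar = sum(y) / len(y)
    yc = [v - ybar for v in y]          # mean-centered values
    powers = []
    for f in freqs:
        w = 2.0 * math.pi * f
        # Time offset tau makes the cos and sin terms orthogonal.
        tau = math.atan2(sum(math.sin(2 * w * ti) for ti in t),
                         sum(math.cos(2 * w * ti) for ti in t)) / (2 * w)
        c = [math.cos(w * (ti - tau)) for ti in t]
        s = [math.sin(w * (ti - tau)) for ti in t]
        pc = sum(v * ci for v, ci in zip(yc, c)) ** 2 / sum(ci * ci for ci in c)
        ps = sum(v * si for v, si in zip(yc, s)) ** 2 / sum(si * si for si in s)
        powers.append(0.5 * (pc + ps))
    return powers

# Irregularly sampled 1.5 Hz sinusoid; the periodogram should peak at 1.5.
random.seed(0)
t = sorted(random.uniform(0, 10) for _ in range(200))
y = [math.sin(2 * math.pi * 1.5 * ti) for ti in t]
freqs = [0.1 * k for k in range(1, 50)]   # 0.1 .. 4.9 Hz
power = lomb_scargle(t, y, freqs)
peak_freq = freqs[power.index(max(power))]
```

Unlike an FFT, no resampling onto a regular grid is needed, which is exactly why the periodogram suits missing-data settings.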
arXiv Detail & Related papers (2025-06-20T14:48:42Z) - Wavelet-based Disentangled Adaptive Normalization for Non-stationary Times Series Forecasting [23.34966767653385]
We propose Wavelet-based Disentangled Adaptive Normalization (WDAN) to address non-stationarity in time series forecasting. WDAN uses discrete wavelet transforms to break down the input into low-frequency trends and high-frequency fluctuations. Experiments on multiple benchmarks demonstrate that WDAN consistently improves forecasting accuracy across various backbone models.
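The trend/fluctuation split above relies on the discrete wavelet transform. A single-level Haar DWT is the simplest instance (illustrative only; WDAN's actual wavelet choice and normalization scheme are not specified in this summary):

```python
import math

def haar_dwt(x):
    """One-level Haar DWT: pairwise averages (low-frequency trend)
    and pairwise differences (high-frequency fluctuation)."""
    assert len(x) % 2 == 0, "length must be even for a single Haar level"
    r2 = math.sqrt(2.0)
    approx = [(x[i] + x[i + 1]) / r2 for i in range(0, len(x), 2)]
    detail = [(x[i] - x[i + 1]) / r2 for i in range(0, len(x), 2)]
    return approx, detail

def haar_idwt(approx, detail):
    """Inverse one-level Haar DWT; perfectly reconstructs the input."""
    r2 = math.sqrt(2.0)
    out = []
    for a, d in zip(approx, detail):
        out.extend([(a + d) / r2, (a - d) / r2])
    return out

x = [3.0, 1.0, 0.0, 4.0, 8.0, 6.0, 2.0, 2.0]
approx, detail = haar_dwt(x)
recon = haar_idwt(approx, detail)
```

The transform is perfectly invertible, so a model can normalize the two bands separately and still reconstruct the original scale, which is the property adaptive-normalization schemes exploit.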
arXiv Detail & Related papers (2025-06-06T08:25:29Z) - Multi-Order Wavelet Derivative Transform for Deep Time Series Forecasting [40.57415617022347]
We introduce the multi-order Wavelet Derivative Transform (WDT), enabling the extraction of time-aware patterns spanning both the overall trend and subtle fluctuations. Practically, we embed the WDT into a multi-branch framework named WaveTS, which decomposes the input series into multi-scale time-frequency coefficients, refines them via linear layers, and reconstructs them into the time domain via the inverse WDT. Experiments on ten benchmark datasets demonstrate that WaveTS achieves state-of-the-art forecasting accuracy while retaining high computational efficiency.
arXiv Detail & Related papers (2025-05-17T01:24:09Z) - MFRS: A Multi-Frequency Reference Series Approach to Scalable and Accurate Time-Series Forecasting [51.94256702463408]
Time series predictability is derived from periodic characteristics at different frequencies. We propose a novel time series forecasting method based on multi-frequency reference series correlation analysis. Experiments on major open and synthetic datasets show state-of-the-art performance.
arXiv Detail & Related papers (2025-03-11T11:40:14Z) - TimeDiT: General-purpose Diffusion Transformers for Time Series Foundation Model [11.281386703572842]
TimeDiT is a diffusion transformer model that combines temporal dependency learning with probabilistic sampling. TimeDiT employs a unified masking mechanism to harmonize the training and inference process across diverse tasks. Our systematic evaluation demonstrates TimeDiT's effectiveness in fundamental tasks, i.e., forecasting and imputation, through zero-shot and fine-tuning evaluation.
arXiv Detail & Related papers (2024-09-03T22:31:57Z) - Multi-Source and Test-Time Domain Adaptation on Multivariate Signals using Spatio-Temporal Monge Alignment [59.75420353684495]
Machine learning applications on signals such as computer vision or biomedical data often face challenges due to the variability that exists across hardware devices or session recordings.
In this work, we propose Spatio-Temporal Monge Alignment (STMA) to mitigate these variabilities.
We show that STMA leads to significant and consistent performance gains between datasets acquired with very different settings.
arXiv Detail & Related papers (2024-07-19T13:33:38Z) - Deep Frequency Derivative Learning for Non-stationary Time Series Forecasting [12.989064148254936]
We present a deep frequency derivative learning framework, DERITS, for non-stationary time series forecasting.
Specifically, DERITS is built upon a novel reversible transformation, namely the Frequency Derivative Transformation (FDT).
arXiv Detail & Related papers (2024-06-29T17:56:59Z) - Diffusion-TS: Interpretable Diffusion for General Time Series Generation [6.639630994040322]
Diffusion-TS is a novel diffusion-based framework that generates time series samples of high quality.
We train the model to directly reconstruct the sample instead of the noise in each diffusion step, combining a Fourier-based loss term.
Results show that Diffusion-TS achieves state-of-the-art results on various realistic analyses of time series.
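Diffusion-TS combines direct sample reconstruction with a Fourier-based loss term. A minimal sketch of that idea (naive DFT, fixed weighting `lam`; the exact loss in the paper may differ) might look like:

```python
import cmath

def dft(x):
    """Full naive DFT of a real-valued sequence (O(n^2), for illustration)."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
            for k in range(n)]

def fourier_loss(pred, target):
    """Mean squared error between the DFTs of prediction and target."""
    fp, ft = dft(pred), dft(target)
    return sum(abs(a - b) ** 2 for a, b in zip(fp, ft)) / len(pred)

def combined_loss(pred, target, lam=0.1):
    """Time-domain reconstruction MSE plus a weighted frequency-domain term."""
    mse = sum((a - b) ** 2 for a, b in zip(pred, target)) / len(pred)
    return mse + lam * fourier_loss(pred, target)

target = [0.0, 1.0, 0.0, -1.0]
loss = combined_loss([0.1, 0.9, 0.0, -1.0], target)
```

The frequency term penalizes reconstructions whose spectrum drifts from the target's, even when the pointwise time-domain error is small.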
arXiv Detail & Related papers (2024-03-04T05:39:23Z) - Diagnostic Spatio-temporal Transformer with Faithful Encoding [54.02712048973161]
This paper addresses the task of anomaly diagnosis when the underlying data generation process has a complex spatio-temporal (ST) dependency.
We formalize the problem as supervised dependency discovery, where the ST dependency is learned as a side product of time-series classification.
We show that the temporal positional encoding used in existing ST transformer works has a serious limitation in capturing higher frequencies (short time scales).
We also propose a new ST dependency discovery framework, which can provide readily consumable diagnostic information in both spatial and temporal directions.
arXiv Detail & Related papers (2023-05-26T05:31:23Z) - Transform Once: Efficient Operator Learning in Frequency Domain [69.74509540521397]
We study deep neural networks designed to harness the structure in frequency domain for efficient learning of long-range correlations in space or time.
This work introduces a blueprint for frequency domain learning through a single transform: transform once (T1).
arXiv Detail & Related papers (2022-11-26T01:56:05Z)
This list is automatically generated from the titles and abstracts of the papers on this site. The site does not guarantee the quality of this information and is not responsible for any consequences of its use.