TFWaveFormer: Temporal-Frequency Collaborative Multi-level Wavelet Transformer for Dynamic Link Prediction
- URL: http://arxiv.org/abs/2603.03963v1
- Date: Wed, 04 Mar 2026 11:47:57 GMT
- Title: TFWaveFormer: Temporal-Frequency Collaborative Multi-level Wavelet Transformer for Dynamic Link Prediction
- Authors: Hantong Feng, Yonggang Wu, Duxin Chen, Wenwu Yu
- Abstract summary: We propose TFWaveFormer, a novel Transformer architecture that integrates temporal-frequency analysis with wavelet decomposition to enhance dynamic link prediction.
Our framework comprises three key components: (i) a temporal-frequency coordination mechanism that jointly models temporal and spectral representations, (ii) a learnable multi-resolution wavelet decomposition module that adaptively extracts multi-scale temporal patterns through parallel convolutions, and (iii) a hybrid Transformer module that effectively fuses local wavelet features with global temporal dependencies.
- Score: 13.707047958676482
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Dynamic link prediction plays a crucial role in diverse applications including social network analysis, communication forecasting, and financial modeling. While recent Transformer-based approaches have demonstrated promising results in temporal graph learning, their performance remains limited when capturing complex multi-scale temporal dynamics. In this paper, we propose TFWaveFormer, a novel Transformer architecture that integrates temporal-frequency analysis with multi-resolution wavelet decomposition to enhance dynamic link prediction. Our framework comprises three key components: (i) a temporal-frequency coordination mechanism that jointly models temporal and spectral representations, (ii) a learnable multi-resolution wavelet decomposition module that adaptively extracts multi-scale temporal patterns through parallel convolutions, replacing traditional iterative wavelet transforms, and (iii) a hybrid Transformer module that effectively fuses local wavelet features with global temporal dependencies. Extensive experiments on benchmark datasets demonstrate that TFWaveFormer achieves state-of-the-art performance, outperforming existing Transformer-based and hybrid models by significant margins across multiple metrics. The superior performance of TFWaveFormer validates the effectiveness of combining temporal-frequency analysis with wavelet decomposition in capturing complex temporal dynamics for dynamic link prediction tasks.
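To make component (ii) more concrete, the sketch below shows one plausible way to realize a learnable multi-resolution decomposition with parallel convolutions: one depthwise Conv1d branch per scale, with the dilation doubling at each level, in place of an iterated discrete wavelet transform. The class name and all hyperparameters here are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class ParallelWaveletDecomposition(nn.Module):
    """Illustrative sketch (hypothetical, not TFWaveFormer's code): one
    depthwise Conv1d branch per scale, dilation doubling per level, so all
    scales are computed in parallel rather than by iterating a DWT."""
    def __init__(self, channels: int, levels: int = 3, kernel_size: int = 5):
        super().__init__()
        self.branches = nn.ModuleList([
            nn.Conv1d(channels, channels, kernel_size,
                      dilation=2 ** level, padding="same", groups=channels)
            for level in range(levels)
        ])

    def forward(self, x: torch.Tensor):
        # x: (batch, channels, time) -> one same-length feature map per scale
        return [branch(x) for branch in self.branches]

x = torch.randn(2, 16, 128)                    # toy batch of event sequences
scales = ParallelWaveletDecomposition(16)(x)
print([tuple(s.shape) for s in scales])        # three (2, 16, 128) maps
```

A hybrid fusion module in the spirit of component (iii) could then attend jointly over these per-scale maps and the raw sequence.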
Related papers
- WaveFormer: Wavelet Embedding Transformer for Biomedical Signals [1.2922946578413579]
We propose a transformer architecture that integrates wavelet decomposition at two critical stages: embedding construction and positional encoding.
We evaluate WaveFormer on eight diverse datasets spanning human activity recognition and brain signal analysis, with sequence lengths ranging from 50 to 3000 timesteps and channel counts from 1 to 144.
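The summary does not spell out the embedding construction, but the general flavour of a wavelet-based embedding can be sketched with PyWavelets; the function name, the `db4` mother wavelet, and the decomposition level are all assumptions for illustration.

```python
import numpy as np
import pywt  # PyWavelets

def wavelet_embedding(signal: np.ndarray, wavelet: str = "db4",
                      level: int = 3) -> np.ndarray:
    """Concatenate multi-level DWT coefficients into one feature vector."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)  # [cA3, cD3, cD2, cD1]
    return np.concatenate(coeffs)

emb = wavelet_embedding(np.sin(np.linspace(0, 8 * np.pi, 256)))
print(emb.shape)  # one flat vector of approximation + detail coefficients
```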
arXiv Detail & Related papers (2026-02-12T17:20:43Z)
- DiTS: Multimodal Diffusion Transformers Are Time Series Forecasters [50.43534351968113]
Existing generative time series models do not address the multi-dimensional properties of time series data well.
Inspired by Multimodal Diffusion Transformers that integrate textual guidance into video generation, we propose Diffusion Transformers for Time Series (DiTS).
arXiv Detail & Related papers (2026-02-06T10:48:13Z)
- AWGformer: Adaptive Wavelet-Guided Transformer for Multi-Resolution Time Series Forecasting [3.453296006042559]
Time series forecasting requires capturing patterns across multiple temporal scales.
This paper introduces AWGformer, a novel architecture that integrates adaptive wavelet decomposition with cross-scale attention mechanisms.
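As a rough illustration of cross-scale attention (a generic sketch, not AWGformer's verified design), coarse-scale tokens can attend to fine-scale tokens with a standard multi-head attention layer:

```python
import torch
import torch.nn as nn

class CrossScaleAttention(nn.Module):
    """Generic sketch: queries come from the coarse scale, keys/values
    from the fine scale, fusing two resolutions of the same series."""
    def __init__(self, dim: int, heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, coarse: torch.Tensor, fine: torch.Tensor) -> torch.Tensor:
        out, _ = self.attn(query=coarse, key=fine, value=fine)
        return coarse + out  # residual keeps the coarse stream intact

coarse = torch.randn(2, 32, 64)  # (batch, coarse tokens, dim)
fine = torch.randn(2, 128, 64)   # (batch, fine tokens, dim)
print(CrossScaleAttention(64)(coarse, fine).shape)  # (2, 32, 64)
```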
arXiv Detail & Related papers (2026-01-28T09:14:22Z)
- A Unified Frequency Domain Decomposition Framework for Interpretable and Robust Time Series Forecasting [81.73338008264115]
Current approaches for time series forecasting, whether in the time or frequency domain, predominantly use deep learning models based on linear layers or transformers.
We propose FIRE, a unified frequency domain decomposition framework that provides a mathematical abstraction for diverse types of time series.
FIRE consistently outperforms state-of-the-art models on long-term forecasting benchmarks.
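A minimal sketch of the underlying idea of frequency-domain decomposition (not FIRE's actual mathematics): partition the rFFT bins of a series into contiguous bands and invert each band, so the components sum back to the input.

```python
import torch

def frequency_band_split(x: torch.Tensor, n_bands: int = 4):
    """Split a series into n_bands additive frequency components."""
    spec = torch.fft.rfft(x, dim=-1)
    bins = spec.shape[-1]
    edges = [int(i * bins / n_bands) for i in range(n_bands + 1)]
    components = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        masked = torch.zeros_like(spec)
        masked[..., lo:hi] = spec[..., lo:hi]   # keep only this band's bins
        components.append(torch.fft.irfft(masked, n=x.shape[-1], dim=-1))
    return components

x = torch.randn(8, 256)
parts = frequency_band_split(x)
print(torch.allclose(sum(parts), x, atol=1e-5))  # bands sum back to the input
```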
arXiv Detail & Related papers (2025-10-11T09:59:25Z)
- Wavelet-Enhanced Neural ODE and Graph Attention for Interpretable Energy Forecasting [0.0]
This paper introduces a neural framework that integrates continuous-time Neural Ordinary Differential Equations (Neural ODEs) and graph attention.
It adeptly captures and models diverse, multi-scale temporal dynamics.
The model enhances interpretability through SHAP analysis, making it suitable for sustainable energy applications.
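For readers unfamiliar with Neural ODEs, the core mechanism is a learned vector field integrated over continuous time. The toy cell below uses a fixed-step Euler solver (real implementations typically use adaptive solvers, e.g. from torchdiffeq); every name in it is illustrative, not this paper's code.

```python
import torch
import torch.nn as nn

class NeuralODECell(nn.Module):
    """Toy Neural ODE: a small MLP gives dh/dt, integrated by Euler steps."""
    def __init__(self, dim: int):
        super().__init__()
        self.f = nn.Sequential(nn.Linear(dim, dim), nn.Tanh(),
                               nn.Linear(dim, dim))

    def forward(self, h: torch.Tensor, t0: float, t1: float, steps: int = 10):
        dt = (t1 - t0) / steps
        for _ in range(steps):
            h = h + dt * self.f(h)  # Euler step: h <- h + dt * dh/dt
        return h

h = torch.randn(4, 32)
print(NeuralODECell(32)(h, 0.0, 1.0).shape)  # (4, 32)
```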
arXiv Detail & Related papers (2025-07-14T10:23:18Z)
- MFRS: A Multi-Frequency Reference Series Approach to Scalable and Accurate Time-Series Forecasting [51.94256702463408]
Time series predictability is derived from periodic characteristics at different frequencies.
We propose a novel time series forecasting method based on multi-frequency reference series correlation analysis.
Experiments on major open and synthetic datasets show state-of-the-art performance.
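One hypothetical reading of "reference series" (an assumption for illustration, not MFRS's actual construction): pick the k strongest FFT frequencies of a series and emit one pure sinusoid per frequency to correlate against.

```python
import numpy as np

def reference_series(x: np.ndarray, k: int = 3) -> np.ndarray:
    """Emit one sinusoid per dominant frequency of x (illustrative only)."""
    spec = np.fft.rfft(x)
    top = np.argsort(np.abs(spec[1:]))[-k:] + 1   # k strongest bins, skip DC
    t = np.arange(len(x))
    return np.stack([np.cos(2 * np.pi * f * t / len(x)) for f in top])

x = np.sin(2 * np.pi * 5 * np.arange(512) / 512) + 0.1 * np.random.randn(512)
print(reference_series(x).shape)  # (3, 512)
```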
arXiv Detail & Related papers (2025-03-11T11:40:14Z)
- Multi-scale Generative Modeling for Fast Sampling [38.570968785490514]
In the wavelet domain, we encounter unique challenges, especially the sparse representation of high-frequency coefficients.
We propose a multi-scale generative modeling in the wavelet domain that employs distinct strategies for handling low and high-frequency bands.
As supported by the theoretical analysis and experimental results, our model significantly improves performance while reducing the number of trainable parameters, sampling steps, and sampling time.
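The low/high band split at the heart of this approach can be illustrated with a single Haar analysis step; handling the dense low band with a full generative model and the sparse high band with a lighter strategy is the paper's stated idea, while the code itself is a generic sketch.

```python
import torch

def haar_split(x: torch.Tensor):
    """One Haar analysis step: local means (low band) and local
    differences (high band, mostly near zero for smooth signals)."""
    even, odd = x[..., ::2], x[..., 1::2]
    low = (even + odd) / 2 ** 0.5    # approximation coefficients
    high = (even - odd) / 2 ** 0.5   # sparse detail coefficients
    return low, high

x = torch.randn(2, 1, 64)
low, high = haar_split(x)
print(low.shape, high.shape)  # both (2, 1, 32)
```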
arXiv Detail & Related papers (2024-11-14T11:01:45Z)
- PRformer: Pyramidal Recurrent Transformer for Multivariate Time Series Forecasting [82.03373838627606]
The self-attention mechanism in the Transformer architecture requires positional embeddings to encode temporal order in time series prediction.
We argue that this reliance on positional embeddings restricts the Transformer's ability to effectively represent temporal sequences.
We present a model integrating pyramidal recurrent embeddings (PRE) with a standard Transformer encoder, demonstrating state-of-the-art performance on various real-world datasets.
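A sketch of the general idea (the details certainly differ from PRformer's): pool the series at several strides, run a small RNN per scale, and concatenate the final hidden states, so temporal order is captured by recurrence rather than positional embeddings.

```python
import torch
import torch.nn as nn

class PyramidalRecurrentEmbedding(nn.Module):
    """Illustrative multi-scale recurrent embedding: one GRU per pooling
    stride; recurrence, not positional encoding, carries temporal order."""
    def __init__(self, dim: int, scales=(1, 2, 4)):
        super().__init__()
        self.scales = scales
        self.grus = nn.ModuleList([nn.GRU(1, dim, batch_first=True)
                                   for _ in scales])

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time) -> (batch, dim * len(scales))
        outs = []
        for stride, gru in zip(self.scales, self.grus):
            pooled = x.unfold(1, stride, stride).mean(-1, keepdim=True)
            _, h = gru(pooled)       # h: (1, batch, dim)
            outs.append(h[-1])
        return torch.cat(outs, dim=-1)

x = torch.randn(2, 96)
print(PyramidalRecurrentEmbedding(32)(x).shape)  # (2, 96)
```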
arXiv Detail & Related papers (2024-08-20T01:56:07Z)
- AdaWaveNet: Adaptive Wavelet Network for Time Series Analysis [12.994308764734761]
AdaWaveNet is a novel approach that employs Adaptive Wavelet Transformation for multi-scale analysis of non-stationary time series data.
We conduct experiments on 10 datasets across 3 different tasks, including forecasting, imputation, and a newly established super-resolution task.
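Adaptive wavelet transforms are often built from the lifting scheme; the sketch below replaces the fixed predict/update filters of a classical lifting step with small learned convolutions. This is a generic construction under that assumption, not AdaWaveNet's verified architecture.

```python
import torch
import torch.nn as nn

class LearnableLiftingStep(nn.Module):
    """Lifting-style wavelet step with learned predict/update filters."""
    def __init__(self, channels: int):
        super().__init__()
        self.predict = nn.Conv1d(channels, channels, 3, padding=1)
        self.update = nn.Conv1d(channels, channels, 3, padding=1)

    def forward(self, x: torch.Tensor):
        even, odd = x[..., ::2], x[..., 1::2]
        detail = odd - self.predict(even)    # learned prediction of odd samples
        approx = even + self.update(detail)  # learned smoothing of even samples
        return approx, detail

x = torch.randn(2, 8, 64)
approx, detail = LearnableLiftingStep(8)(x)
print(approx.shape, detail.shape)  # (2, 8, 32) each
```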
arXiv Detail & Related papers (2024-05-17T23:52:33Z)
- Function Approximation for Reinforcement Learning Controller for Energy from Spread Waves [69.9104427437916]
Multi-generator Wave Energy Converters (WECs) must handle multiple simultaneous waves coming from different directions, known as spread waves.
These complex devices need controllers that balance multiple objectives: energy capture efficiency, reduction of structural stress to limit maintenance, and proactive protection against high waves.
In this paper, we explore different function approximations for the policy and critic networks in modeling the sequential nature of the system dynamics.
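As a baseline point of reference for that function-approximation question, here is a generic actor-critic pair of networks; this is purely illustrative, while the paper's study compares richer architectures suited to the sequential system dynamics.

```python
import torch
import torch.nn as nn

class ActorCritic(nn.Module):
    """Generic MLP policy (actor) and value (critic) approximators."""
    def __init__(self, obs_dim: int, act_dim: int, hidden: int = 64):
        super().__init__()
        self.actor = nn.Sequential(nn.Linear(obs_dim, hidden), nn.ReLU(),
                                   nn.Linear(hidden, act_dim), nn.Tanh())
        self.critic = nn.Sequential(nn.Linear(obs_dim, hidden), nn.ReLU(),
                                    nn.Linear(hidden, 1))

    def forward(self, obs: torch.Tensor):
        return self.actor(obs), self.critic(obs)

obs = torch.randn(5, 12)
action, value = ActorCritic(12, 3)(obs)
print(action.shape, value.shape)  # (5, 3) (5, 1)
```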
arXiv Detail & Related papers (2024-04-17T02:04:10Z)
- Towards Long-Term Time-Series Forecasting: Feature, Pattern, and Distribution [57.71199089609161]
Long-term time-series forecasting (LTTF) has become a pressing demand in many applications, such as wind power supply planning.
Transformer models have been adopted to deliver high prediction capacity, thanks to the self-attention mechanism, though at a high computational cost.
We propose an efficient Transformer-based model, named Conformer, which differentiates itself from existing methods for LTTF in three aspects.
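The summary does not name Conformer's three aspects, so the snippet below illustrates only the generic efficiency idea behind many long-sequence Transformers (an assumption, not necessarily Conformer's mechanism): restrict each position's attention to a fixed local window via a boolean mask.

```python
import torch

def sliding_window_mask(seq_len: int, window: int) -> torch.Tensor:
    """Boolean mask where True marks an allowed query-key pair."""
    idx = torch.arange(seq_len)
    return (idx[:, None] - idx[None, :]).abs() <= window

print(sliding_window_mask(6, 1).int())  # each row attends to its neighbours
```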
arXiv Detail & Related papers (2023-01-05T13:59:29Z)