Universal Redundancies in Time Series Foundation Models
- URL: http://arxiv.org/abs/2602.01605v1
- Date: Mon, 02 Feb 2026 03:53:46 GMT
- Title: Universal Redundancies in Time Series Foundation Models
- Authors: Anthony Bao, Venkata Hasith Vattikuti, Jeffrey Lai, William Gilpin
- Abstract summary: Time Series Foundation Models (TSFMs) leverage extensive pretraining to accurately predict unseen time series during inference. We introduce a set of tools for mechanistic interpretability of TSFMs, including ablations of specific components and direct logit attribution on the residual stream.
- Score: 3.8551402560229806
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Time Series Foundation Models (TSFMs) leverage extensive pretraining to accurately predict unseen time series during inference, without the need for task-specific fine-tuning. Through large-scale evaluations on standard benchmarks, we find that leading transformer-based TSFMs exhibit redundant components in their intermediate layers. We introduce a set of tools for mechanistic interpretability of TSFMs, including ablations of specific components and direct logit attribution on the residual stream. Our findings are consistent across several leading TSFMs with diverse architectures, and across a diverse set of real-world and synthetic time-series datasets. We discover that all models in our study are robust to ablations of entire layers. Furthermore, we develop a theoretical framework framing transformers as kernel regressors, motivating a purely intrinsic strategy for ablating heads based on the stable rank of the per-head projection matrices. Using this approach, we uncover the specific heads responsible for degenerate phenomena widely observed in TSFMs, such as parroting of motifs from the context and seasonality bias. Our study sheds light on the universal properties of this emerging class of architectures for continuous-time sequence modeling.
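The stable rank used for head ablation is a standard quantity, srank(W) = ||W||_F^2 / ||W||_2^2, which measures how many directions a projection matrix meaningfully uses. Below is a minimal sketch of the intrinsic scoring step, assuming the per-head projection matrices have already been extracted as tensors; the threshold rule is illustrative, not the paper's exact criterion:

```python
import torch

def stable_rank(W: torch.Tensor) -> float:
    """Stable rank ||W||_F^2 / ||W||_2^2: a smooth surrogate for matrix rank."""
    fro_sq = torch.linalg.matrix_norm(W, ord="fro") ** 2
    spec_sq = torch.linalg.matrix_norm(W, ord=2) ** 2
    return (fro_sq / spec_sq).item()

def low_rank_heads(per_head_W: list[torch.Tensor], threshold: float = 2.0) -> list[int]:
    """Hypothetical selection rule: flag heads whose projection matrix has
    stable rank below a threshold as candidates for ablation."""
    return [i for i, W in enumerate(per_head_W) if stable_rank(W) < threshold]
```

Because the score depends only on the weights, heads can be ranked without running the model on any data, which is what makes the strategy purely intrinsic.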
Related papers
- MEMTS: Internalizing Domain Knowledge via Parameterized Memory for Retrieval-Free Domain Adaptation of Time Series Foundation Models [51.506429027626005]
Memory for Time Series (MEMTS) is a lightweight, plug-and-play method for retrieval-free domain adaptation in time series forecasting. The key component of MEMTS is a Knowledge Persistence Module (KPM), which internalizes domain-specific temporal dynamics. This paradigm shift enables MEMTS to achieve accurate domain adaptation with constant-time inference and near-zero latency.
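The blurb does not detail the KPM's internals; the following is a generic sketch of what a parameterized, retrieval-free memory read could look like, with every name hypothetical:

```python
import torch
import torch.nn as nn

class ParameterizedMemory(nn.Module):
    """Hypothetical sketch: domain knowledge lives in learned slots that are
    read by attention at inference, so no external retrieval index is needed."""
    def __init__(self, n_slots: int, d_model: int):
        super().__init__()
        self.slots = nn.Parameter(torch.randn(n_slots, d_model) * 0.02)

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        # h: (batch, seq, d_model); dot-product read over the memory slots.
        attn = torch.softmax(h @ self.slots.T / h.shape[-1] ** 0.5, dim=-1)
        return h + attn @ self.slots  # residual read: constant-time lookup
```

Reading a fixed set of learned slots is what would make inference constant-time relative to the size of the adaptation corpus.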
arXiv Detail & Related papers (2026-02-14T14:00:06Z)
- It's TIME: Towards the Next Generation of Time Series Forecasting Benchmarks [87.7937890373758]
Time series foundation models (TSFMs) are revolutionizing the forecasting landscape, shifting from dataset-specific modeling to generalizable task evaluation. We introduce TIME, a next-generation task-centric benchmark comprising 50 fresh datasets and 98 forecasting tasks. We propose a novel pattern-level evaluation perspective that moves beyond traditional dataset-level evaluations based on static meta labels.
arXiv Detail & Related papers (2026-02-12T16:31:01Z)
- Diversified Scaling Inference in Time Series Foundation Models [17.268760626931517]
This work systematically investigates two questions: how do TSFMs behave under standard sampling-based inference scaling, and can controlled sampling diversity enhance performance? We first examine the properties of TSFMs under standard sampling and find that they often fail to adhere to scaling laws due to insufficient exploration of the solution space. We then delve into diversified inference scaling via tailored time series perturbations that expand the support of the generative distribution.
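As a rough illustration of perturbation-based diversity (the paper's perturbations are "tailored"; the Gaussian jitter and median aggregation below are assumptions):

```python
import numpy as np

def diversified_forecast(model, context: np.ndarray, n_samples: int = 16,
                         noise_scale: float = 0.05, seed: int = 0) -> np.ndarray:
    """Hypothetical sketch: perturb the context before each sampled rollout to
    widen the generative distribution's support, then aggregate robustly."""
    rng = np.random.default_rng(seed)
    forecasts = []
    for _ in range(n_samples):
        jitter = rng.normal(0.0, noise_scale * context.std(), size=context.shape)
        forecasts.append(model.sample(context + jitter))  # model.sample is assumed
    return np.median(np.stack(forecasts), axis=0)
```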
arXiv Detail & Related papers (2026-01-24T08:53:42Z)
- Time Series Foundation Models for Process Model Forecasting [8.339024524110828]
Process Model Forecasting (PMF) aims to predict how the control-flow structure of a process evolves over time. Machine learning and deep learning models provide only modest gains over statistical baselines. We investigate Time Series Foundation Models (TSFMs) as an alternative for PMF.
arXiv Detail & Related papers (2025-12-08T15:08:50Z)
- Synapse: Adaptive Arbitration of Complementary Expertise in Time Series Foundational Models [50.877082340479085]
We study how different Time Series Foundational Models (TSFMs) exhibit specialized performance profiles across various forecasting settings. We propose Synapse, a novel arbitration framework for TSFMs. Results demonstrate that Synapse consistently outperforms other popular ensembling techniques as well as individual TSFMs.
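Synapse's arbitration mechanism is not described in the blurb; the following is a generic sketch of forecast arbitration that weights each model by its recent error (the names and the weighting rule are assumptions):

```python
import numpy as np

def arbitrate(forecasts: dict[str, np.ndarray],
              recent_errors: dict[str, float]) -> np.ndarray:
    """Hypothetical sketch: blend per-model forecasts with weights inversely
    proportional to each model's error on a recent validation window."""
    inv = {name: 1.0 / max(err, 1e-8) for name, err in recent_errors.items()}
    total = sum(inv.values())
    return sum((inv[name] / total) * f for name, f in forecasts.items())
```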
arXiv Detail & Related papers (2025-11-07T18:01:51Z)
- Estimating Time Series Foundation Model Transferability via In-Context Learning [74.65355820906355]
Time series foundation models (TSFMs) offer strong zero-shot forecasting via large-scale pre-training. Fine-tuning remains critical for boosting performance in domains with limited public data. We introduce TimeTic, a transferability estimation framework that recasts model selection as an in-context-learning problem.
arXiv Detail & Related papers (2025-09-28T07:07:13Z)
- Multivariate Long-term Time Series Forecasting with Fourier Neural Filter [42.60778405812048]
We introduce FNF as the backbone and DBD as the architecture, providing excellent learning capabilities and optimal learning pathways for spatial-temporal modeling. We show that FNF unifies local time-domain and global frequency-domain information processing within a single backbone that extends naturally to spatial modeling.
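The pairing of local time-domain and global frequency-domain processing is reminiscent of spectral filtering layers; below is a minimal sketch of the frequency-domain half, assuming an FNO-style learned per-frequency filter (FNF's actual layer and the DBD architecture are not specified in the blurb):

```python
import torch
import torch.nn as nn

class FourierFilter(nn.Module):
    """Sketch of a global frequency-domain filter: FFT along time, learned
    complex per-frequency weights, inverse FFT back to the time domain."""
    def __init__(self, seq_len: int, d_model: int):
        super().__init__()
        n_freq = seq_len // 2 + 1
        self.weight = nn.Parameter(
            torch.randn(n_freq, d_model, dtype=torch.cfloat) * 0.02)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model); the FFT gives a global receptive field.
        X = torch.fft.rfft(x, dim=1)
        return torch.fft.irfft(X * self.weight, n=x.shape[1], dim=1)
```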
arXiv Detail & Related papers (2025-06-10T18:40:20Z)
- TS-RAG: Retrieval-Augmented Generation based Time Series Foundation Models are Stronger Zero-Shot Forecaster [14.512119661418522]
We present TS-RAG, a retrieval-augmented generation framework for time series forecasting. Specifically, TS-RAG leverages pre-trained time series encoders to retrieve semantically relevant segments from a dedicated knowledge base. We show that TS-RAG achieves state-of-the-art zero-shot forecasting performance, outperforming existing TSFMs by up to 6.84% across diverse domains.
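The retrieval step described here follows the usual embed-and-rank pattern; a minimal sketch, assuming segment embeddings from a pre-trained encoder are already computed:

```python
import numpy as np

def retrieve_segments(query_emb: np.ndarray, kb_embs: np.ndarray,
                      k: int = 5) -> np.ndarray:
    """Hypothetical sketch: rank knowledge-base segments by cosine similarity
    to the query embedding and return the indices of the top-k matches."""
    q = query_emb / np.linalg.norm(query_emb)
    kb = kb_embs / np.linalg.norm(kb_embs, axis=1, keepdims=True)
    return np.argsort(kb @ q)[::-1][:k]
```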
arXiv Detail & Related papers (2025-03-06T16:48:48Z)
- General Time-series Model for Universal Knowledge Representation of Multivariate Time-Series data [61.163542597764796]
We show that time series with different time granularities (or corresponding frequency resolutions) exhibit distinct joint distributions in the frequency domain. A novel Fourier knowledge attention mechanism is proposed to enable learning time-aware representations from both the temporal and frequency domains. An autoregressive blank-infilling pre-training framework is incorporated into time series analysis for the first time, leading to a generative, task-agnostic pre-training strategy.
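The Fourier knowledge attention mechanism is not specified further; one plausible reading is that temporal tokens are augmented with frequency-domain features before attention, sketched below under that assumption:

```python
import torch
import torch.nn.functional as F

def time_frequency_features(x: torch.Tensor) -> torch.Tensor:
    """Hypothetical sketch: concatenate each token's temporal features with an
    upsampled amplitude spectrum so attention sees both domains at once."""
    # x: (batch, seq_len, d); amplitude spectrum taken over the time axis.
    amp = torch.fft.rfft(x, dim=1).abs()                     # (batch, n_freq, d)
    amp_up = F.interpolate(amp.transpose(1, 2),
                           size=x.shape[1]).transpose(1, 2)  # (batch, seq_len, d)
    return torch.cat([x, amp_up], dim=-1)                    # (batch, seq_len, 2d)
```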
arXiv Detail & Related papers (2025-02-05T15:20:04Z)
- TSFM-Bench: A Comprehensive and Unified Benchmark of Foundation Models for Time Series Forecasting [35.505530132151]
Time Series Forecasting (TSF) is a key capability in numerous fields, such as financial investment, weather services, and energy management. Many TSF methods require domain-specific data collection and model training and do not generalize well when applied to other domains. Time Series Foundation Models (TSFMs) that are pre-trained on massive heterogeneous time series data aim to overcome these limitations.
arXiv Detail & Related papers (2024-10-15T17:23:49Z)
- Foundation Models for Time Series Analysis: A Tutorial and Survey [70.43311272903334]
Foundation Models (FMs) have fundamentally reshaped the paradigm of model design for time series analysis.
This survey aims to furnish a comprehensive and up-to-date overview of FMs for time series analysis.
arXiv Detail & Related papers (2024-03-21T10:08:37Z)
- Attractor Memory for Long-Term Time Series Forecasting: A Chaos Perspective [63.60312929416228]
Attraos incorporates chaos theory into long-term time series forecasting (LTSF).
We show that Attraos outperforms various LTSF methods on mainstream datasets and chaotic datasets with only one-twelfth of the parameters of PatchTST.
arXiv Detail & Related papers (2024-02-18T05:35:01Z)