Diversified Scaling Inference in Time Series Foundation Models
- URL: http://arxiv.org/abs/2601.17376v1
- Date: Sat, 24 Jan 2026 08:53:42 GMT
- Title: Diversified Scaling Inference in Time Series Foundation Models
- Authors: Ruijin Hua, Zichuan Liu, Kun Zhang, Yiyuan Yang,
- Abstract summary: This work systematically investigates two questions: how do TSFMs behave under standard sampling-based inference scaling, and can controlled sampling diversity enhance performance? We first show that TSFMs under standard sampling often fail to adhere to scaling laws due to insufficient exploration of the solution space. We then delve into diversified inference scaling via tailored time series perturbations that expand the generative distribution's support.
- Score: 17.268760626931517
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The advancement of Time Series Foundation Models (TSFMs) has been driven primarily by large-scale pre-training, but the potential of inference-time compute remains largely untapped. This work systematically investigates two questions: how do TSFMs behave under standard sampling-based inference scaling, and can controlled sampling diversity enhance performance? We first show that TSFMs under standard sampling often fail to adhere to scaling laws due to insufficient exploration of the solution space. Building on this, we then delve into diversified inference scaling via tailored time series perturbations that expand the generative distribution's support. We theoretically analyze the diversity-fidelity trade-off and derive a critical sample threshold beyond which diversified sampling outperforms standard sampling. Extensive experiments across various TSFMs and datasets show that proper diversified inference scaling yields substantial performance gains without parameter updates, establishing inference design as a critical, compute-efficient dimension of TSFM optimization. As an application, we propose RobustMSE, a rigorous metric that quantifies the performance headroom of a TSFM under a fixed inference budget. Overall, our findings clarify how these factors interact, enabling reliable performance gains through diverse, large-scale parallel inference without re-training TSFMs.
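As a rough illustration of the core idea, the sketch below perturbs the input context to widen the sampling distribution, draws forecasts in parallel, and aggregates them. The `model.sample` interface, the Gaussian perturbation scheme, and the median aggregation are all assumptions for illustration, not the paper's exact method.

```python
import numpy as np

def diversified_forecast(model, context, n_samples=32, noise_scale=0.05):
    """Aggregate forecasts sampled from perturbed copies of the context.

    `model.sample(context)` is a hypothetical TSFM interface returning one
    forecast per call; the perturbation and aggregation are illustrative.
    """
    scale = noise_scale * context.std()
    forecasts = []
    for _ in range(n_samples):
        # Tailored perturbations would be more structured than i.i.d. noise;
        # Gaussian jitter stands in for them here.
        perturbed = context + np.random.normal(0.0, scale, size=context.shape)
        forecasts.append(model.sample(perturbed))
    # Median aggregation trades a little per-sample fidelity for robustness
    # to the extra diversity injected by the perturbations.
    return np.median(np.stack(forecasts), axis=0)
```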
Related papers
- Universal Redundancies in Time Series Foundation Models [3.8551402560229806]
Time Series Foundation Models (TSFMs) leverage extensive pretraining to accurately predict unseen time series during inference. We introduce a set of tools for mechanistic interpretability of TSFMs, including ablations of specific components and direct logit attribution on the residual stream.
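A minimal sketch of what direct logit attribution could look like for a model whose blocks write additively into a residual stream; the names `component_outputs` and `W_out` are assumptions, not the paper's API.

```python
import numpy as np

def direct_logit_attribution(component_outputs, W_out):
    """Project each component's additive residual-stream contribution
    through the output head to see how much it moves the prediction."""
    return [c @ W_out for c in component_outputs]

def zero_ablate(component_outputs, idx):
    """Reconstruct the residual state with one component zero-ablated."""
    return sum(c for i, c in enumerate(component_outputs) if i != idx)
```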
arXiv Detail & Related papers (2026-02-02T03:53:46Z)
- UniDiff: A Unified Diffusion Framework for Multimodal Time Series Forecasting [90.47915032778366]
We propose UniDiff, a unified diffusion framework for multimodal time series forecasting. At its core lies a unified and parallel fusion module, where a single cross-attention mechanism integrates structural information from timestamps and semantic context from texts. Experiments on real-world benchmark datasets across eight domains demonstrate that the proposed UniDiff model achieves state-of-the-art performance.
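A hedged PyTorch sketch of a single cross-attention fusion step in this spirit; the module structure and names are assumptions, not UniDiff's published architecture.

```python
import torch
import torch.nn as nn

class ParallelFusion(nn.Module):
    """Illustrative fusion module: one cross-attention block attends from
    series tokens to concatenated timestamp and text embeddings."""

    def __init__(self, d_model, n_heads=4):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)

    def forward(self, series_tokens, timestamp_emb, text_emb):
        context = torch.cat([timestamp_emb, text_emb], dim=1)  # (B, T+S, d)
        fused, _ = self.attn(series_tokens, context, context)
        return series_tokens + fused  # residual connection
```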
arXiv Detail & Related papers (2025-12-08T05:36:14Z)
- Synapse: Adaptive Arbitration of Complementary Expertise in Time Series Foundational Models [50.877082340479085]
We study how different Time Series Foundational Models (TSFMs) exhibit specialized performance profiles across various forecasting settings. We propose Synapse, a novel arbitration framework for TSFMs. Results demonstrate that Synapse consistently outperforms other popular ensembling techniques as well as individual TSFMs.
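One simple way to realize arbitration (an assumption, not Synapse's actual mechanism) is to softmax-weight each TSFM's forecast by its recent validation error on similar settings:

```python
import numpy as np

def arbitrate(models, context, val_errors, temperature=1.0):
    """Softmax-weight model forecasts by negative validation error.

    `models` is a hypothetical list of objects with a .predict(context)
    method; `val_errors` holds one recent error per model.
    """
    errs = np.asarray(val_errors, dtype=float)
    weights = np.exp(-errs / temperature)
    weights /= weights.sum()
    preds = np.stack([m.predict(context) for m in models])  # (M, horizon)
    return np.tensordot(weights, preds, axes=1)             # weighted mix
```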
arXiv Detail & Related papers (2025-11-07T18:01:51Z)
- A Unified Frequency Domain Decomposition Framework for Interpretable and Robust Time Series Forecasting [81.73338008264115]
Current approaches for time series forecasting, whether in the time or frequency domain, predominantly use deep learning models based on linear layers or transformers. We propose FIRE, a unified frequency domain decomposition framework that provides a mathematical abstraction for diverse types of time series. FIRE consistently outperforms state-of-the-art models on long-term forecasting benchmarks.
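A toy version of frequency-domain decomposition (illustrative only, not FIRE's formulation) that splits a series into low- and high-frequency parts with an FFT mask:

```python
import numpy as np

def frequency_decompose(x, cutoff=0.1):
    """Return (trend, residual) by keeping only FFT bins below `cutoff`
    (given as a fraction of the Nyquist frequency)."""
    freqs = np.fft.rfftfreq(len(x))
    spec = np.fft.rfft(x)
    low = spec * (freqs <= cutoff * freqs.max())  # zero high-frequency bins
    trend = np.fft.irfft(low, n=len(x))
    return trend, x - trend
```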
arXiv Detail & Related papers (2025-10-11T09:59:25Z)
- Multi-Scale Finetuning for Encoder-based Time Series Foundation Models [67.95907033226585]
Time series foundation models (TSFMs) demonstrate impressive zero-shot performance for time series forecasting. While naive finetuning can yield performance gains, we argue that it falls short of fully leveraging TSFMs' capabilities. We propose multi-scale finetuning (MSFT), a simple yet general framework that explicitly integrates multi-scale modeling into the finetuning process.
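A sketch of how a multi-scale finetuning loss could be assembled from downsampled views; `model(xs, horizon=...)` and the pooling scheme are hypothetical, not MSFT's exact design.

```python
import torch
import torch.nn.functional as F

def multiscale_loss(model, x, y, scales=(1, 2, 4)):
    """Average forecasting loss over several temporal resolutions.

    x, y: (batch, length) context and target; `model` is a hypothetical
    forecaster taking a context and a horizon length.
    """
    total = 0.0
    for s in scales:
        # Average-pool both context and target down to scale s.
        xs = F.avg_pool1d(x.unsqueeze(1), kernel_size=s).squeeze(1)
        ys = F.avg_pool1d(y.unsqueeze(1), kernel_size=s).squeeze(1)
        total = total + F.mse_loss(model(xs, horizon=ys.shape[-1]), ys)
    return total / len(scales)
```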
arXiv Detail & Related papers (2025-06-17T01:06:01Z)
- Less is More: Unlocking Specialization of Time Series Foundation Models via Structured Pruning [27.23328609888911]
Time Series Foundation Models pre-train vast numbers of parameters and achieve remarkable zero-shot forecasting performance. Surprisingly, even after fine-tuning, TSFMs cannot consistently outperform smaller, specialized models trained on full-shot downstream data. We propose a structured pruning method that regularizes the subsequent fine-tuning process by focusing it on a more relevant and compact parameter space.
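A minimal structured-pruning sketch under an assumed L1 channel-ranking criterion; the criterion and granularity here are illustrative, not the paper's method.

```python
import torch

def prune_ffn_channels(linear: torch.nn.Linear, keep_ratio=0.5):
    """Zero the lowest-norm output channels of a linear layer in place,
    shrinking the effective parameter space before fine-tuning."""
    with torch.no_grad():
        norms = linear.weight.abs().sum(dim=1)   # per-output-channel L1 norm
        k = int(len(norms) * keep_ratio)
        keep = torch.topk(norms, k).indices
        mask = torch.zeros_like(norms, dtype=torch.bool)
        mask[keep] = True
        linear.weight[~mask] = 0.0
        if linear.bias is not None:
            linear.bias[~mask] = 0.0
```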
arXiv Detail & Related papers (2025-05-29T07:33:49Z)
- AdaPTS: Adapting Univariate Foundation Models to Probabilistic Multivariate Time Series Forecasting [10.899510048905926]
We present adapters for managing intricate dependencies among features and quantifying uncertainty in predictions. Experiments conducted on both synthetic and real-world datasets confirm the efficacy of the adapters. Our framework, AdaPTS, positions adapters as a modular, scalable, and effective solution.
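A rough sketch of a feature-space adapter wrapped around a frozen univariate foundation model `fm`; the architecture and the Gaussian-head uncertainty parameterization are assumptions, not AdaPTS's published design.

```python
import torch
import torch.nn as nn

class FeatureAdapter(nn.Module):
    """Mix multivariate features into latent channels, run a frozen
    univariate FM per channel, and map back with mean/log-variance heads."""

    def __init__(self, n_features, d_latent):
        super().__init__()
        self.down = nn.Linear(n_features, d_latent)
        self.up_mu = nn.Linear(d_latent, n_features)
        self.up_logvar = nn.Linear(d_latent, n_features)

    def forward(self, fm, x):                          # x: (B, T, n_features)
        z = self.down(x)                               # (B, T, d_latent)
        # fm is a hypothetical frozen univariate forecaster: (B, T) -> (B, H)
        z_hat = torch.stack(
            [fm(z[..., i]) for i in range(z.shape[-1])], dim=-1)
        return self.up_mu(z_hat), self.up_logvar(z_hat)
```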
arXiv Detail & Related papers (2025-02-14T15:46:19Z)
- FlowTS: Time Series Generation via Rectified Flow [67.41208519939626]
FlowTS is an ODE-based model that leverages rectified flow with straight-line transport in probability space. In the unconditional setting, FlowTS achieves state-of-the-art performance, with context FID scores of 0.019 and 0.011 on the Stock and ETTh datasets. In the conditional setting, it achieves superior performance on solar forecasting.
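A toy rectified-flow sampler with Euler steps along (near-)straight noise-to-data paths; `velocity_net` is a hypothetical trained network, not FlowTS's released model.

```python
import torch

def flow_sample(velocity_net, shape, n_steps=10):
    """Integrate dx/dt = v(x, t) from t=0 (noise) to t=1 (data)."""
    x = torch.randn(shape)                 # start from Gaussian noise
    dt = 1.0 / n_steps
    for i in range(n_steps):
        t = torch.full(shape[:1], i * dt)  # current time in [0, 1)
        x = x + dt * velocity_net(x, t)    # Euler step along the flow
    return x
```

With straight-line transport, a handful of Euler steps suffices, which is the efficiency appeal of rectified flow over many-step diffusion samplers.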
arXiv Detail & Related papers (2024-11-12T03:03:23Z)
- TSFM-Bench: A Comprehensive and Unified Benchmark of Foundation Models for Time Series Forecasting [35.505530132151]
Time Series Forecasting (TSF) is a key capability in numerous fields, such as financial investment, weather services, and energy management. Many TSF methods require domain-specific data collection and model training and do not generalize well when applied in other domains. Time Series Foundation Models (TSFMs) that are pre-trained on massive heterogeneous time series data aim to overcome these limitations.
arXiv Detail & Related papers (2024-10-15T17:23:49Z)
- MITA: Bridging the Gap between Model and Data for Test-time Adaptation [68.62509948690698]
Test-Time Adaptation (TTA) has emerged as a promising paradigm for enhancing the generalizability of models.
We propose Meet-In-The-Middle based MITA, which introduces energy-based optimization to encourage mutual adaptation of the model and data from opposing directions.
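A rough, hedged reading of "meet-in-the-middle" adaptation: take gradient steps on both the test input and the model parameters against a shared energy, so the two adapt toward each other. The `energy_fn` objective and step sizes are assumptions, not MITA's exact formulation.

```python
import torch

def mutual_adapt(model, x, energy_fn, steps=5, lr_x=0.01, lr_theta=1e-4):
    """Jointly lower a scalar energy w.r.t. the input and the parameters."""
    x = x.detach().clone().requires_grad_(True)
    opt = torch.optim.SGD([
        {"params": model.parameters(), "lr": lr_theta},  # adapt the model
        {"params": [x], "lr": lr_x},                     # adapt the data
    ])
    for _ in range(steps):
        opt.zero_grad()
        e = energy_fn(model, x)  # low energy = model and data agree
        e.backward()
        opt.step()
    return model, x.detach()
```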
arXiv Detail & Related papers (2024-10-12T07:02:33Z)
- Transformer Hawkes Process [79.16290557505211]
We propose a Transformer Hawkes Process (THP) model, which leverages the self-attention mechanism to capture long-term dependencies.
THP outperforms existing models in terms of both likelihood and event prediction accuracy by a notable margin.
We provide a concrete example, where THP achieves improved prediction performance for learning multiple point processes when incorporating their relational information.
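A simplified THP-style intensity head (an illustrative reduction of the model, not its full parameterization): a softplus of a projection of the last event's hidden state, plus a learned drift in the time elapsed since that event.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class IntensityHead(nn.Module):
    """Map a self-attention hidden state to a conditional intensity."""

    def __init__(self, d_model):
        super().__init__()
        self.proj = nn.Linear(d_model, 1)
        self.alpha = nn.Parameter(torch.tensor(0.1))  # time-drift coefficient

    def forward(self, h_last, dt):
        # h_last: (B, d_model) hidden state at the most recent event;
        # dt: (B,) time elapsed since that event. Softplus keeps lambda > 0.
        return F.softplus(self.proj(h_last).squeeze(-1) + self.alpha * dt)
```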
arXiv Detail & Related papers (2020-02-21T13:48:13Z)