Synapse: Adaptive Arbitration of Complementary Expertise in Time Series Foundational Models
- URL: http://arxiv.org/abs/2511.05460v1
- Date: Fri, 07 Nov 2025 18:01:51 GMT
- Title: Synapse: Adaptive Arbitration of Complementary Expertise in Time Series Foundational Models
- Authors: Sarkar Snigdha Sarathi Das, Palash Goyal, Mihir Parmar, Yiwen Song, Long T. Le, Lesly Miculicich, Jinsung Yoon, Rui Zhang, Hamid Palangi, Tomas Pfister
- Abstract summary: We study how different Time Series Foundational Models (TSFMs) exhibit specialized performance profiles across various forecasting settings. We propose Synapse, a novel arbitration framework for TSFMs. Results demonstrate that Synapse consistently outperforms other popular ensembling techniques as well as individual TSFMs.
- Score: 50.877082340479085
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Pre-trained Time Series Foundational Models (TSFMs) represent a significant advance, capable of forecasting diverse time series with complex characteristics, including varied seasonalities, trends, and long-range dependencies. Despite their primary goal of universal time series forecasting, their efficacy is far from uniform; divergent training protocols and data sources cause individual TSFMs to exhibit highly variable performance across different forecasting tasks, domains, and horizons. Leveraging this complementary expertise by arbitrating existing TSFM outputs presents a compelling strategy, yet this remains a largely unexplored area of research. In this paper, we conduct a thorough examination of how different TSFMs exhibit specialized performance profiles across various forecasting settings, and how we can effectively leverage this behavior in arbitration between different time series models. We specifically analyze how factors such as model selection and forecast horizon distribution can influence the efficacy of arbitration strategies. Based on this analysis, we propose Synapse, a novel arbitration framework for TSFMs. Synapse is designed to dynamically leverage a pool of TSFMs, assign and adjust predictive weights based on their relative, context-dependent performance, and construct a robust forecast distribution by adaptively sampling from the output quantiles of constituent models. Experimental results demonstrate that Synapse consistently outperforms other popular ensembling techniques as well as individual TSFMs, demonstrating Synapse's efficacy in time series forecasting.
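The abstract describes Synapse as assigning context-dependent weights to a pool of TSFMs and adaptively sampling from their output quantiles. As a rough illustration of that idea (not the paper's actual algorithm), the sketch below weights models by a softmax over their recent errors and samples each fused quantile from the constituent models in proportion to those weights; the function names and the inverse-error weighting rule are assumptions for illustration only.

```python
# Hypothetical sketch of quantile-level arbitration in the spirit of Synapse.
# The real weighting and sampling rules are not specified in this listing.
import numpy as np

def arbitrate(quantile_preds, recent_errors, temperature=1.0, rng=None):
    """Fuse per-model quantile forecasts into one forecast distribution.

    quantile_preds: (n_models, n_quantiles) predicted quantiles for one
                    horizon step, one row per TSFM.
    recent_errors:  (n_models,) context-dependent errors (e.g. rolling MAE);
                    lower error -> higher arbitration weight.
    """
    if rng is None:
        rng = np.random.default_rng(0)
    # Softmax over negative errors: better-performing models get more weight.
    scores = -np.asarray(recent_errors, dtype=float) / temperature
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    # Adaptively sample each quantile slot from a constituent model.
    n_models, n_quantiles = quantile_preds.shape
    picks = rng.choice(n_models, size=n_quantiles, p=weights)
    fused = quantile_preds[picks, np.arange(n_quantiles)]
    # Re-sort so the fused quantiles remain monotone.
    return np.sort(fused), weights
```

Sorting at the end is a simple way to keep the fused quantile vector valid (non-decreasing) after mixing quantiles from different models.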
Related papers
- Universal Redundancies in Time Series Foundation Models [3.8551402560229806]
Time Series Foundation Models (TSFMs) leverage extensive pretraining to accurately predict unseen time series during inference. We introduce a set of tools for mechanistic interpretability of TSFMs, including ablations of specific components and direct logit attribution on the residual stream.
arXiv Detail & Related papers (2026-02-02T03:53:46Z)
- Diversified Scaling Inference in Time Series Foundation Models [17.268760626931517]
This work systematically investigates two questions: how do TSFMs behave under standard sampling-based inference scaling, and can controlled sampling diversity enhance performance? We first examine the properties of TSFMs under standard sampling and find that it often fails to adhere to scaling laws due to insufficient exploration of the solution space. We then delve into diversified inference scaling via tailored time series perturbations to expand the generative distribution's support.
arXiv Detail & Related papers (2026-01-24T08:53:42Z)
- ForecastGAN: A Decomposition-Based Adversarial Framework for Multi-Horizon Time Series Forecasting [0.5213778368155993]
Time series forecasting is essential across domains from finance to supply chain management. This paper introduces ForecastGAN, a novel decomposition-based adversarial framework for multi-horizon predictions. ForecastGAN consistently outperforms state-of-the-art transformer models for short-term forecasting while remaining competitive for long-term horizons.
arXiv Detail & Related papers (2025-11-06T15:19:23Z)
- A Unified Frequency Domain Decomposition Framework for Interpretable and Robust Time Series Forecasting [81.73338008264115]
Current approaches for time series forecasting, whether in the time or frequency domain, predominantly use deep learning models based on linear layers or transformers. We propose FIRE, a unified frequency domain decomposition framework that provides a mathematical abstraction for diverse types of time series. FIRE consistently outperforms state-of-the-art models on long-term forecasting benchmarks.
arXiv Detail & Related papers (2025-10-11T09:59:25Z)
- Breaking Silos: Adaptive Model Fusion Unlocks Better Time Series Forecasting [64.45587649141842]
Time-series forecasting plays a critical role in many real-world applications. We find that (i) no single model consistently outperforms others across different test samples, and instead (ii) each model excels in specific cases. We introduce TimeFuse, a framework for collective time-series forecasting with sample-level adaptive fusion of heterogeneous models.
arXiv Detail & Related papers (2025-05-24T00:45:07Z)
- A Multi-scale Representation Learning Framework for Long-Term Time Series Forecasting [6.344911113059126]
Long-term time series forecasting (LTSF) offers broad utility in practical settings like energy consumption and weather prediction. This work confronts key issues in LTSF, including the suboptimal use of multi-granularity information. Our method adeptly disentangles complex temporal dynamics using clear, concurrent predictions across various scales.
arXiv Detail & Related papers (2025-05-13T03:26:44Z)
- MITA: Bridging the Gap between Model and Data for Test-time Adaptation [68.62509948690698]
Test-Time Adaptation (TTA) has emerged as a promising paradigm for enhancing the generalizability of models.
We propose Meet-In-The-Middle based MITA, which introduces energy-based optimization to encourage mutual adaptation of the model and data from opposing directions.
arXiv Detail & Related papers (2024-10-12T07:02:33Z)
- DAM: Towards A Foundation Model for Time Series Forecasting [0.8231118867997028]
We propose a neural model that takes randomly sampled histories and outputs an adjustable basis composition as a continuous function of time.
It involves three key components: (1) a flexible approach for using randomly sampled histories from a long-tail distribution; (2) a transformer backbone that is trained on these actively sampled histories to produce, as representational output, (3) the basis coefficients of a continuous function of time.
arXiv Detail & Related papers (2024-07-25T08:48:07Z)
- Robust Multivariate Time Series Forecasting against Intra- and Inter-Series Transitional Shift [40.734564394464556]
We present a unified Probabilistic Graphical Model to jointly capture intra-/inter-series correlations and model the time-variant transitional distribution.
We validate the effectiveness and efficiency of JointPGM through extensive experiments on six highly non-stationary MTS datasets.
arXiv Detail & Related papers (2024-07-18T06:16:03Z)
- MGCP: A Multi-Grained Correlation based Prediction Network for Multivariate Time Series [54.91026286579748]
We propose a Multi-Grained Correlations-based Prediction Network.
It simultaneously considers correlations at three levels to enhance prediction performance.
It employs adversarial training with an attention mechanism-based predictor and conditional discriminator to optimize prediction results at coarse-grained level.
arXiv Detail & Related papers (2024-05-30T03:32:44Z)
- Model-Attentive Ensemble Learning for Sequence Modeling [86.4785354333566]
We present Model-Attentive Ensemble learning for Sequence modeling (MAES).
MAES is a mixture of time-series experts which leverages an attention-based gating mechanism to specialize the experts on different sequence dynamics and adaptively weight their predictions.
We demonstrate that MAES significantly outperforms popular sequence models on datasets subject to temporal shift.
arXiv Detail & Related papers (2021-02-23T05:23:35Z)
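The MAES summary above describes an attention-based gate that adaptively weights expert predictions per sample. A minimal sketch of that general pattern, assuming a hypothetical dot-product gate between a context embedding and per-expert keys (the actual MAES architecture is not detailed in this listing):

```python
# Illustrative attention-style gating over time-series experts.
# The gate, key, and embedding shapes here are assumptions for illustration.
import numpy as np

def softmax(x):
    # Numerically stable softmax.
    z = x - x.max()
    e = np.exp(z)
    return e / e.sum()

def gated_forecast(context, expert_keys, expert_preds):
    """context:      (d,) embedding of the recent sequence window.
    expert_keys:  (n_experts, d) learned keys, one per expert.
    expert_preds: (n_experts,) each expert's point forecast.
    """
    # Scaled dot-product attention scores between context and expert keys.
    scores = expert_keys @ context / np.sqrt(len(context))
    gate = softmax(scores)  # adaptive per-sample mixture weights
    # Gated combination: a convex mixture of the expert forecasts.
    return float(gate @ expert_preds), gate
```

Because the gate is a softmax, the combined forecast always lies within the range spanned by the individual expert predictions.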
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.