OneCast: Structured Decomposition and Modular Generation for Cross-Domain Time Series Forecasting
- URL: http://arxiv.org/abs/2510.24028v2
- Date: Mon, 03 Nov 2025 01:49:39 GMT
- Title: OneCast: Structured Decomposition and Modular Generation for Cross-Domain Time Series Forecasting
- Authors: Tingyue Pan, Mingyue Cheng, Shilong Zhang, Zhiding Liu, Xiaoyu Tao, Yucong Luo, Jintao Zhang, Qi Liu
- Abstract summary: Cross-domain time series forecasting is a valuable task in various web applications. OneCast is a structured and modular forecasting framework that decomposes time series into seasonal and trend components. Experiments across eight domains demonstrate that OneCast mostly outperforms state-of-the-art baselines.
- Score: 21.91411439072952
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Cross-domain time series forecasting is a valuable task in various web applications. Despite its rapid advancement, achieving effective generalization across heterogeneous time series data remains a significant challenge. Existing methods have made progress by extending single-domain models, yet often fall short when facing domain-specific trend shifts and inconsistent periodic patterns. We argue that a key limitation lies in treating temporal series as undifferentiated sequences, without explicitly decoupling their inherent structural components. To address this, we propose OneCast, a structured and modular forecasting framework that decomposes time series into seasonal and trend components, each modeled through tailored generative pathways. Specifically, the seasonal component is captured by a lightweight projection module that reconstructs periodic patterns via interpretable basis functions. In parallel, the trend component is encoded into discrete tokens at the segment level via a semantic-aware tokenizer, and subsequently inferred through a masked discrete diffusion mechanism. The outputs from both branches are combined to produce a final forecast that captures seasonal patterns while tracking domain-specific trends. Extensive experiments across eight domains demonstrate that OneCast mostly outperforms state-of-the-art baselines.
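The seasonal branch described in the abstract, reconstructing periodic patterns via interpretable basis functions, can be illustrated with a simple least-squares fit onto a Fourier basis. This is a minimal sketch, not the authors' implementation: the choice of Fourier basis, the period, and the number of harmonics are all assumptions made for illustration.

```python
import numpy as np

def fourier_basis(t, period, n_harmonics):
    """Design matrix of interpretable basis functions (constant + sin/cos
    harmonics) evaluated at times t."""
    cols = [np.ones_like(t, dtype=float)]
    for k in range(1, n_harmonics + 1):
        cols.append(np.sin(2 * np.pi * k * t / period))
        cols.append(np.cos(2 * np.pi * k * t / period))
    return np.stack(cols, axis=1)

def fit_seasonal(y, period, n_harmonics=3):
    """Project the observed series onto the basis via least squares."""
    t = np.arange(len(y), dtype=float)
    B = fourier_basis(t, period, n_harmonics)
    coef, *_ = np.linalg.lstsq(B, y, rcond=None)
    return coef

def seasonal_forecast(coef, start, horizon, period, n_harmonics=3):
    """Extrapolate the fitted seasonal pattern over the forecast horizon."""
    t = np.arange(start, start + horizon, dtype=float)
    return fourier_basis(t, period, n_harmonics) @ coef

# Usage: a pure sinusoid with period 24 is recovered and extrapolated.
t = np.arange(96, dtype=float)
y = 2.0 * np.sin(2 * np.pi * t / 24)
coef = fit_seasonal(y, period=24)
pred = seasonal_forecast(coef, start=96, horizon=24, period=24)
true = 2.0 * np.sin(2 * np.pi * np.arange(96, 120) / 24)
print(np.allclose(pred, true, atol=1e-6))  # True
```

In OneCast the residual after removing such a seasonal reconstruction would be handled by the separate trend branch (tokenization plus masked discrete diffusion), which is not sketched here.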
Related papers
- MEMTS: Internalizing Domain Knowledge via Parameterized Memory for Retrieval-Free Domain Adaptation of Time Series Foundation Models [51.506429027626005]
Memory for Time Series (MEMTS) is a lightweight and plug-and-play method for retrieval-free domain adaptation in time series forecasting. A key component of MEMTS is a Knowledge Persistence Module (KPM), which internalizes domain-specific temporal dynamics. This paradigm shift enables MEMTS to achieve accurate domain adaptation with constant-time inference and near-zero latency.
arXiv Detail & Related papers (2026-02-14T14:00:06Z) - TimeMar: Multi-Scale Autoregressive Modeling for Unconditional Time Series Generation [11.455232661227313]
We propose a structure-disentangled multiscale generation framework for time series. Our approach encodes sequences into discrete tokens at multiple temporal resolutions. We show that our approach produces higher-quality time series than existing methods.
arXiv Detail & Related papers (2026-01-16T11:00:05Z) - ForecastGAN: A Decomposition-Based Adversarial Framework for Multi-Horizon Time Series Forecasting [0.5213778368155993]
Time series forecasting is essential across domains from finance to supply chain management. This paper introduces ForecastGAN, a novel decomposition-based adversarial framework for multi-horizon predictions. ForecastGAN consistently outperforms state-of-the-art transformer models for short-term forecasting while remaining competitive for long-term horizons.
arXiv Detail & Related papers (2025-11-06T15:19:23Z) - Effective Series Decomposition and Components Learning for Time Series Generation [2.4675755217083317]
We introduce Seasonal-Trend Diffusion (STDiffusion), a novel framework for time series generation. STDiffusion integrates probabilistic models with advanced learnable series decomposition techniques. Our studies demonstrate that STDiffusion achieves reliable results and highlight its robustness and versatility.
arXiv Detail & Related papers (2025-11-02T00:15:12Z) - A Unified Frequency Domain Decomposition Framework for Interpretable and Robust Time Series Forecasting [81.73338008264115]
Current approaches for time series forecasting, whether in the time or frequency domain, predominantly use deep learning models based on linear layers or transformers. We propose FIRE, a unified frequency domain decomposition framework that provides a mathematical abstraction for diverse types of time series. FIRE consistently outperforms state-of-the-art models on long-term forecasting benchmarks.
arXiv Detail & Related papers (2025-10-11T09:59:25Z) - MFRS: A Multi-Frequency Reference Series Approach to Scalable and Accurate Time-Series Forecasting [51.94256702463408]
Time series predictability is derived from periodic characteristics at different frequencies. We propose a novel time series forecasting method based on multi-frequency reference series correlation analysis. Experiments on major open and synthetic datasets show state-of-the-art performance.
arXiv Detail & Related papers (2025-03-11T11:40:14Z) - Unify and Anchor: A Context-Aware Transformer for Cross-Domain Time Series Forecasting [26.59526791215]
We identify two key challenges in cross-domain time series forecasting: the complexity of temporal patterns and semantic misalignment. We propose the "Unify and Anchor" transfer paradigm, which disentangles frequency components for a unified perspective. We introduce ContexTST, a Transformer-based model that employs a time series coordinator for structured representation.
arXiv Detail & Related papers (2025-03-03T04:11:14Z) - Learning Latent Spaces for Domain Generalization in Time Series Forecasting [60.29403194508811]
Time series forecasting is vital in many real-world applications, yet developing models that generalize well to unseen relevant domains remains underexplored. We propose a framework for domain generalization in time series forecasting by mining the latent factors that govern temporal dependencies across domains. Our approach uses a decomposition-based architecture with a new Conditional β-Variational Autoencoder (VAE), wherein time series data is first decomposed into trend-cyclical and seasonal components.
arXiv Detail & Related papers (2024-12-15T12:41:53Z) - A Decomposition Modeling Framework for Seasonal Time-Series Forecasting [0.0]
Seasonal time series exhibit intricate long-term dependencies. This paper introduces the Multi-scale Seasonal Decomposition Model (MSSD) for seasonal time-series forecasting.
arXiv Detail & Related papers (2024-12-12T01:37:25Z) - Moirai-MoE: Empowering Time Series Foundation Models with Sparse Mixture of Experts [103.725112190618]
This paper introduces Moirai-MoE, using a single input/output projection layer while delegating the modeling of diverse time series patterns to the sparse mixture of experts.
Extensive experiments on 39 datasets demonstrate the superiority of Moirai-MoE over existing foundation models in both in-distribution and zero-shot scenarios.
arXiv Detail & Related papers (2024-10-14T13:01:11Z) - Learning Pattern-Specific Experts for Time Series Forecasting Under Patch-level Distribution Shift [51.01356105618118]
Time series often exhibit complex non-uniform distributions with varying patterns across segments, such as season, operating condition, or semantic meaning. Existing approaches, which typically train a single model to capture all these diverse patterns, often struggle with the pattern drifts between patches. We propose TFPS, a novel architecture that leverages pattern-specific experts for more accurate and adaptable time series forecasting.
arXiv Detail & Related papers (2024-10-13T13:35:29Z) - Timer-XL: Long-Context Transformers for Unified Time Series Forecasting [67.83502953961505]
We present Timer-XL, a causal Transformer for unified time series forecasting. Based on large-scale pre-training, Timer-XL achieves state-of-the-art zero-shot performance.
arXiv Detail & Related papers (2024-10-07T07:27:39Z) - Temporal Saliency Detection Towards Explainable Transformer-based Timeseries Forecasting [3.046315755726937]
This paper introduces Temporal Saliency Detection (TSD), an effective approach that builds upon the attention mechanism and applies it to multi-horizon time series prediction.
The TSD approach facilitates the multiresolution analysis of saliency patterns by condensing multi-heads, thereby progressively enhancing the forecasting of complex time series data.
arXiv Detail & Related papers (2022-12-15T12:47:59Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.