PRISM: A hierarchical multiscale approach for time series forecasting
- URL: http://arxiv.org/abs/2512.24898v1
- Date: Wed, 31 Dec 2025 14:51:12 GMT
- Title: PRISM: A hierarchical multiscale approach for time series forecasting
- Authors: Zihao Chen, Alexandre Andre, Wenrui Ma, Ian Knight, Sergey Shuvaev, Eva Dyer
- Abstract summary: Real-world time series contain global trends, local fine-grained structure, and features on multiple scales in between. We present a new forecasting method, PRISM, that addresses this challenge through a learnable tree-based partitioning of the signal. Experiments across benchmark datasets show that our method outperforms state-of-the-art forecasting methods.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Forecasting is critical in areas such as finance, biology, and healthcare. Despite the progress in the field, making accurate forecasts remains challenging because real-world time series contain global trends, local fine-grained structure, and features on multiple scales in between. Here, we present a new forecasting method, PRISM (Partitioned Representation for Iterative Sequence Modeling), that addresses this challenge through a learnable tree-based partitioning of the signal. At the root of the tree, a global representation captures coarse trends in the signal, while recursive splits reveal increasingly localized views of the signal. At each level of the tree, data are projected onto a time-frequency basis (e.g., wavelets or exponential moving averages) to extract scale-specific features, which are then aggregated across the hierarchy. This design allows the model to jointly capture global structure and local dynamics of the signal, enabling accurate forecasting. Experiments across benchmark datasets show that our method outperforms state-of-the-art forecasting methods. Overall, these results demonstrate that our hierarchical approach provides a lightweight and flexible framework for forecasting multivariate time series. The code is available at https://github.com/nerdslab/prism.
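The tree-based partitioning the abstract describes can be illustrated with a minimal sketch (a hypothetical toy, not the authors' implementation: the segment mean stands in for a learned wavelet/EMA projection, and the splits are fixed halves rather than learnable):

```python
def hierarchical_features(series, depth):
    """Return one feature list per tree level, coarse to fine."""
    features = []
    segments = [series]
    for _ in range(depth + 1):
        # one scale-specific feature per segment: here, the segment mean
        features.append([sum(s) / len(s) for s in segments])
        # split each segment in half to form the next, finer level
        segments = [half for s in segments
                    for half in (s[:len(s) // 2], s[len(s) // 2:]) if half]
    return features

signal = [1.0, 1.0, 2.0, 2.0, 10.0, 10.0, 11.0, 11.0]
feats = hierarchical_features(signal, depth=2)
# feats[0] holds the single global mean; feats[2] holds four local means
```

Each level contributes one feature per segment, so coarse global structure and fine local structure enter the representation side by side, which is the intuition behind aggregating features across the hierarchy.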
Related papers
- S$^2$Transformer: Scalable Structured Transformers for Global Station Weather Forecasting
Existing time series forecasting methods often ignore or unidirectionally model spatial correlation when conducting large-scale global station forecasting. This contradicts the nature of the observations underlying the global weather system, limiting forecast performance. We propose a novel Structured Spatial Attention in this paper. It partitions the spatial graph into a set of subgraphs and instantiates Intra-subgraph Attention to learn local spatial correlation within each subgraph. It aggregates nodes into subgraph representations for message passing among the subgraphs via Inter-subgraph Attention, considering both spatial proximity and global correlation.
arXiv Detail & Related papers (2025-09-10T05:33:28Z) - OneForecast: A Universal Framework for Global and Regional Weather Forecasting
We propose a global-regional nested weather forecasting framework (OneForecast) based on graph neural networks. By combining a dynamic system perspective with multi-grid theory, we construct a multi-scale graph structure and densify the target region. We introduce an adaptive messaging mechanism, using dynamic gating units, to deeply integrate node and edge features for more accurate extreme event forecasting.
arXiv Detail & Related papers (2025-02-01T06:49:16Z) - Enhancing Foundation Models for Time Series Forecasting via Wavelet-based Tokenization
We develop a wavelet-based tokenizer that allows models to learn complex representations directly in the space of time-localized frequencies. Our method first scales and decomposes the input time series, then thresholds and quantizes the wavelet coefficients, and finally pre-trains an autoregressive model to forecast coefficients for the forecast horizon.
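The decompose-threshold-quantize pipeline above can be sketched in a few lines (a simplified illustration, assuming a one-level Haar transform and illustrative threshold and bin-width values; the paper's actual tokenizer is more elaborate):

```python
def haar_step(x):
    """One Haar level: (approximation, detail) coefficient lists."""
    approx = [(x[i] + x[i + 1]) / 2 for i in range(0, len(x), 2)]
    detail = [(x[i] - x[i + 1]) / 2 for i in range(0, len(x), 2)]
    return approx, detail

def tokenize(x, threshold=0.5, bin_width=1.0):
    """Decompose, zero out small details, quantize to integer token ids."""
    approx, detail = haar_step(x)
    detail = [d if abs(d) >= threshold else 0.0 for d in detail]
    return [round(c / bin_width) for c in approx + detail]

tokens = tokenize([2.0, 2.0, 4.0, 8.0])
# the flat segment yields a zeroed detail token; the jump keeps its detail
```

An autoregressive model would then be trained over these integer token sequences rather than over raw values.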
arXiv Detail & Related papers (2024-12-06T18:22:59Z) - Learning Pattern-Specific Experts for Time Series Forecasting Under Patch-level Distribution Shift
Time series often exhibit complex non-uniform distributions with varying patterns across segments, such as season, operating condition, or semantic meaning. Existing approaches, which typically train a single model to capture all these diverse patterns, often struggle with pattern drift between patches. We propose TFPS, a novel architecture that leverages pattern-specific experts for more accurate and adaptable time series forecasting.
arXiv Detail & Related papers (2024-10-13T13:35:29Z) - Hierarchical Classification Auxiliary Network for Time Series Forecasting
We introduce a novel approach that tokenizes time series values to train forecasting models via cross-entropy loss. HCAN integrates multi-granularity high-entropy features at different hierarchy levels. Experiments integrating HCAN with state-of-the-art forecasting models demonstrate substantial improvements over baselines on several real-world datasets.
arXiv Detail & Related papers (2024-05-29T10:38:25Z) - RPMixer: Shaking Up Time Series Forecasting with Random Projections for Large Spatial-Temporal Data
We propose an all-Multi-Layer Perceptron (all-MLP) time series forecasting architecture called RPMixer.
Our method capitalizes on the ensemble-like behavior of deep neural networks, where each individual block behaves like a base learner in an ensemble model.
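As a rough illustration of a random projection in this setting (the layer shape and Gaussian scaling here are generic assumptions, not RPMixer's actual architecture):

```python
import random

def random_projection(x, out_dim, seed=0):
    """Project x to out_dim dimensions with a fixed random Gaussian matrix."""
    rng = random.Random(seed)  # fixed seed: the projection is not trained
    d = len(x)
    # scale entries so projected vectors roughly preserve their norm
    w = [[rng.gauss(0.0, 1.0 / out_dim ** 0.5) for _ in range(d)]
         for _ in range(out_dim)]
    return [sum(row[j] * x[j] for j in range(d)) for row in w]

z = random_projection([1.0, 2.0, 3.0, 4.0], out_dim=2)
# z has length 2 regardless of the input dimension
```

Because the matrix is fixed rather than learned, each such block contributes a cheap, diverse view of the input, consistent with the ensemble-like behavior described above.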
arXiv Detail & Related papers (2024-02-16T07:28:59Z) - Topological Attention for Time Series Forecasting
We study whether $\textit{local topological properties}$, as captured via persistent homology, can serve as a reliable signal.
We propose $\textit{topological attention}$, which allows attending to local topological features within a time horizon of historical data.
arXiv Detail & Related papers (2021-07-19T17:24:05Z) - Deep Autoregressive Models with Spectral Attention
We propose a forecasting architecture that combines deep autoregressive models with a Spectral Attention (SA) module.
By characterizing in the spectral domain the embedding of the time series as occurrences of a random process, our method can identify global trends and seasonality patterns.
Two spectral attention models, global and local to the time series, integrate this information within the forecast and perform spectral filtering to remove noise from the time series.
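The spectral idea, identifying dominant periodic structure in the frequency domain, can be sketched with a naive DFT (a toy illustration, not the paper's Spectral Attention module):

```python
import cmath

def dominant_period(x):
    """Return the period (in samples) of the strongest nonzero frequency."""
    n = len(x)
    mags = []
    for k in range(1, n // 2 + 1):  # skip k=0, the global mean/trend
        coeff = sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n))
        mags.append((abs(coeff), k))
    _, k_best = max(mags)  # frequency bin with the largest magnitude
    return n / k_best

# a pure cycle of period 4 repeated over 12 samples
series = [0.0, 1.0, 0.0, -1.0] * 3
```

A spectral-attention style model would emphasize such dominant bins while attenuating low-magnitude (noise) bins before forecasting.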
arXiv Detail & Related papers (2021-07-13T11:08:47Z) - A machine learning approach for forecasting hierarchical time series
We propose a machine learning approach for forecasting hierarchical time series.
Forecast reconciliation is the process of adjusting forecasts to make them coherent across the hierarchy.
We exploit the ability of a deep neural network to extract information capturing the structure of the hierarchy.
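Forecast reconciliation, as defined above, can be sketched in its simplest bottom-up form (the two-level hierarchy and node names here are illustrative assumptions):

```python
def bottom_up_reconcile(leaf_forecasts, hierarchy):
    """Make forecasts coherent: each aggregate equals the sum of its leaves."""
    coherent = dict(leaf_forecasts)  # leaf forecasts are kept as-is
    for parent, children in hierarchy.items():
        coherent[parent] = sum(leaf_forecasts[c] for c in children)
    return coherent

leaves = {"region_a": 12.0, "region_b": 8.0}
coherent = bottom_up_reconcile(leaves, {"total": ["region_a", "region_b"]})
# coherent["total"] now equals the sum of the regional forecasts
```

More sophisticated reconciliation methods adjust every level jointly rather than only summing upward, but the coherence constraint they enforce is the same.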
arXiv Detail & Related papers (2020-05-31T22:26:16Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences arising from its use.