Hierarchical Time Series Forecasting Via Latent Mean Encoding
- URL: http://arxiv.org/abs/2506.19633v1
- Date: Tue, 24 Jun 2025 13:54:47 GMT
- Title: Hierarchical Time Series Forecasting Via Latent Mean Encoding
- Authors: Alessandro Salatiello, Stefan Birr, Manuel Kunz
- Abstract summary: Coherently forecasting the behaviour of a target variable across both coarse and fine temporal scales is crucial for profit-optimized decision-making in several business applications. We propose a new hierarchical architecture that tackles this problem by leveraging modules that specialize in forecasting the different temporal aggregation levels of interest.
- Score: 44.99833362998488
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Coherently forecasting the behaviour of a target variable across both coarse and fine temporal scales is crucial for profit-optimized decision-making in several business applications, and remains an open research problem in temporal hierarchical forecasting. Here, we propose a new hierarchical architecture that tackles this problem by leveraging modules that specialize in forecasting the different temporal aggregation levels of interest. The architecture, which learns to encode the average behaviour of the target variable within its hidden layers, makes accurate and coherent forecasts across the target temporal hierarchies. We validate our architecture on the challenging, real-world M5 dataset and show that it outperforms established methods, such as the TSMixer model.
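To make the high-level description concrete, here is a minimal PyTorch sketch of one way to realize the general idea: a shared encoder, a latent layer intended to encode the target's average behaviour, and separate heads for the fine-grained and the aggregated forecasts, trained with a loss that couples the two levels. This is an illustrative sketch, not the architecture from the paper; the class, function, and setting choices (LatentMeanForecaster, hierarchical_loss, the 28-day history / 14-day horizon / weekly aggregation) are hypothetical.

```python
# Illustrative sketch only -- not the authors' architecture or code.
import torch
import torch.nn as nn


class LatentMeanForecaster(nn.Module):
    """Toy two-level forecaster: a shared encoder, a latent layer meant to capture
    the average behaviour of the target, and one head per temporal aggregation level."""

    def __init__(self, input_len: int, horizon: int, agg: int, hidden: int = 64):
        super().__init__()
        assert horizon % agg == 0, "horizon must divide evenly into coarse periods"
        self.agg = agg
        self.encoder = nn.Sequential(nn.Linear(input_len, hidden), nn.ReLU())
        self.mean_latent = nn.Linear(hidden, hidden)          # latent "mean" encoding
        self.fine_head = nn.Linear(hidden, horizon)           # fine-grained forecast
        self.coarse_head = nn.Linear(hidden, horizon // agg)  # aggregated forecast

    def forward(self, x: torch.Tensor):
        h = self.encoder(x)
        z = torch.relu(self.mean_latent(h))
        return self.fine_head(h + z), self.coarse_head(z)


def hierarchical_loss(fine, coarse, y_fine, agg, lam=1.0):
    """Fine-level MSE plus a term tying the coarse head to the aggregated target,
    nudging the two outputs toward coherence."""
    y_coarse = y_fine.reshape(y_fine.shape[0], -1, agg).mean(dim=-1)
    return nn.functional.mse_loss(fine, y_fine) + lam * nn.functional.mse_loss(coarse, y_coarse)


# Hypothetical usage: 28 days of history, a 14-day horizon, weekly (7-day) aggregates.
model = LatentMeanForecaster(input_len=28, horizon=14, agg=7)
x, y = torch.randn(32, 28), torch.randn(32, 14)
fine, coarse = model(x)
hierarchical_loss(fine, coarse, y, agg=7).backward()
```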
Related papers
- Learning Time-Aware Causal Representation for Model Generalization in Evolving Domains [50.66049136093248]
We develop a time-aware structural causal model (SCM) that incorporates dynamic causal factors and causal mechanism drifts. We show that our method can yield the optimal causal predictor for each time domain. Results on both synthetic and real-world datasets show that SYNC can achieve superior temporal generalization performance.
arXiv Detail & Related papers (2025-06-21T14:05:37Z) - Temporal-Spectral-Spatial Unified Remote Sensing Dense Prediction [62.376936772702905]
Current deep learning architectures for remote sensing are fundamentally rigid. We introduce the Spatial-Temporal-Spectral Unified Network (STSUN) for unified modeling. STSUN can adapt to input and output data with arbitrary spatial sizes, temporal lengths, and spectral bands, and it unifies disparate dense prediction tasks within a single architecture by conditioning the model on trainable task embeddings.
arXiv Detail & Related papers (2025-05-18T07:39:17Z) - GMP-AR: Granularity Message Passing and Adaptive Reconciliation for Temporal Hierarchy Forecasting [20.56839345239421]
Time series forecasts of different temporal granularity are widely used in real-world applications.
We propose a novel granularity message-passing mechanism (GMP) that leverages temporal hierarchy information to improve forecasting performance.
We also introduce an optimization module to achieve task-based targets while adhering to real-world constraints.
arXiv Detail & Related papers (2024-06-18T03:33:03Z) - Optimizing Time Series Forecasting Architectures: A Hierarchical Neural Architecture Search Approach [17.391148813359088]
We propose a novel hierarchical neural architecture search approach for time series forecasting tasks.
With the design of a hierarchical search space, we incorporate many architecture types designed for forecasting tasks.
Results on long-term time-series forecasting tasks show that our approach can search for lightweight, high-performing forecasting architectures.
arXiv Detail & Related papers (2024-06-07T17:02:37Z) - Embedded feature selection in LSTM networks with multi-objective
evolutionary ensemble learning for time series forecasting [49.1574468325115]
We present a novel feature selection method embedded in Long Short-Term Memory networks.
Our approach optimizes the weights and biases of the LSTM in a partitioned manner.
Experimental evaluations on air quality time series data from Italy and southeast Spain demonstrate that our method substantially improves the generalization ability of conventional LSTMs.
arXiv Detail & Related papers (2023-12-29T08:42:10Z) - $\clubsuit$ CLOVER $\clubsuit$: Probabilistic Forecasting with Coherent Learning Objective Reparameterization [42.215158938066054]
We augment an MQForecaster neural network architecture with a modified multivariate Gaussian factor model that achieves coherence by construction (a generic sketch of this coherence-by-construction idea appears at the end of this list). We call our method the Coherent Learning Objective Reparameterization Neural Network (CLOVER). In comparison to state-of-the-art coherent forecasting methods, CLOVER achieves significant improvements in scaled CRPS forecast accuracy, with average gains of 15%.
arXiv Detail & Related papers (2023-07-19T07:31:37Z) - Temporal Saliency Detection Towards Explainable Transformer-based
Timeseries Forecasting [3.046315755726937]
This paper introduces Temporal Saliency Detection (TSD), an effective approach that builds upon the attention mechanism and applies it to multi-horizon time series prediction.
The TSD approach facilitates the multiresolution analysis of saliency patterns by condensing multi-heads, thereby progressively enhancing the forecasting of complex time series data.
arXiv Detail & Related papers (2022-12-15T12:47:59Z) - Deep Autoregressive Models with Spectral Attention [74.08846528440024]
We propose a forecasting architecture that combines deep autoregressive models with a Spectral Attention (SA) module.
By characterizing the embedding of the time series in the spectral domain as realizations of a random process, our method can identify global trends and seasonality patterns.
Two spectral attention models, global and local to the time series, integrate this information within the forecast and perform spectral filtering to remove noise from the time series.
arXiv Detail & Related papers (2021-07-13T11:08:47Z) - Hierarchically Regularized Deep Forecasting [18.539846932184012]
We propose a new approach for hierarchical forecasting based on decomposing the time series along a global set of basis time series.
Unlike past methods, our approach is scalable at inference-time while preserving coherence among the time series forecasts.
arXiv Detail & Related papers (2021-06-14T17:38:16Z) - Supporting Optimal Phase Space Reconstructions Using Neural Network
Architecture for Time Series Modeling [68.8204255655161]
We propose an artificial neural network with a mechanism to implicitly learn the properties of the phase space.
Our approach is competitive with, or better than, most state-of-the-art strategies.
arXiv Detail & Related papers (2020-06-19T21:04:47Z) - Dynamic Federated Learning [57.14673504239551]
Federated learning has emerged as an umbrella term for centralized coordination strategies in multi-agent environments.
We consider a federated learning model where at every iteration, a random subset of available agents perform local updates based on their data.
Under a non-stationary random walk model on the true minimizer for the aggregate optimization problem, we establish that the performance of the architecture is determined by three factors, namely, the data variability at each agent, the model variability across all agents, and a tracking term that is inversely proportional to the learning rate of the algorithm.
arXiv Detail & Related papers (2020-02-20T15:00:54Z)
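The CLOVER entry above mentions achieving coherence by construction with a multivariate Gaussian factor model. The short NumPy sketch below illustrates only that generic idea under assumed dimensions and randomly drawn parameters: bottom-level forecast samples are drawn from a factor model, and every aggregate is computed from them with a fixed summation matrix, so each sample satisfies the hierarchy exactly. It is not CLOVER's implementation, and all names and shapes are illustrative.

```python
# Generic coherence-by-construction sketch (illustrative, not CLOVER's code).
import numpy as np

rng = np.random.default_rng(0)

n_bottom, n_factors, n_samples = 4, 2, 1000
# Summation matrix: first row is the total, remaining rows are the bottom series.
S = np.vstack([np.ones((1, n_bottom)), np.eye(n_bottom)])

mu = rng.normal(size=n_bottom)                     # assumed bottom-level mean forecasts
B = 0.5 * rng.normal(size=(n_bottom, n_factors))   # assumed factor loadings
sigma = 0.1 * np.ones(n_bottom)                    # assumed idiosyncratic scales

z = rng.standard_normal((n_samples, n_factors))
eps = rng.standard_normal((n_samples, n_bottom)) * sigma
bottom = mu + z @ B.T + eps                        # bottom-level forecast samples
full = bottom @ S.T                                # rows: [total, b1, ..., b4]

# Every sample is coherent: the total equals the sum of the bottom-level series.
assert np.allclose(full[:, 0], full[:, 1:].sum(axis=1))
```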