Stacking for Probabilistic Short-term Load Forecasting
- URL: http://arxiv.org/abs/2406.10718v1
- Date: Sat, 15 Jun 2024 19:05:49 GMT
- Title: Stacking for Probabilistic Short-term Load Forecasting
- Authors: Grzegorz Dudek
- Abstract summary: We introduce both global and local variants of meta-learning.
In the local-learning mode, the meta-model is trained using patterns most similar to the query pattern.
Our findings underscored the superiority of quantile regression forest over its competitors.
- Score: 1.6317061277457001
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: In this study, we delve into the realm of meta-learning to combine point base forecasts for probabilistic short-term electricity demand forecasting. Our approach encompasses the utilization of quantile linear regression, quantile regression forest, and post-processing techniques involving residual simulation to generate quantile forecasts. Furthermore, we introduce both global and local variants of meta-learning. In the local-learning mode, the meta-model is trained using patterns most similar to the query pattern. Through extensive experimental studies across 35 forecasting scenarios and employing 16 base forecasting models, our findings underscored the superiority of quantile regression forest over its competitors.
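To make the stacking idea concrete, here is a minimal sketch, assuming scikit-learn and quantile linear regression as the meta-model (the quantile regression forest and residual-simulation variants described in the paper would slot into the same place); the function names, the Euclidean similarity used in the local variant, and the toy data are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of global vs. local meta-learning for quantile stacking.
# Assumptions (not from the paper): quantile linear regression as meta-model,
# Euclidean nearest neighbours over base forecasts as the similarity measure.
import numpy as np
from sklearn.linear_model import QuantileRegressor
from sklearn.neighbors import NearestNeighbors

def global_quantile_stack(base_forecasts, y, query, quantiles):
    """Global meta-learning: one quantile linear regression per quantile level,
    fitted on all historical patterns (rows = periods, columns = base models)."""
    preds = {}
    for q in quantiles:
        meta = QuantileRegressor(quantile=q, alpha=0.0, solver="highs")
        meta.fit(base_forecasts, y)
        preds[q] = float(meta.predict(query.reshape(1, -1))[0])
    return preds

def local_quantile_stack(base_forecasts, y, query, quantiles, k=100):
    """Local meta-learning: fit the meta-model only on the k historical
    patterns most similar to the query pattern."""
    nn = NearestNeighbors(n_neighbors=k).fit(base_forecasts)
    idx = nn.kneighbors(query.reshape(1, -1), return_distance=False)[0]
    return global_quantile_stack(base_forecasts[idx], y[idx], query, quantiles)

# Toy usage: 16 base models, 500 historical patterns, three quantile levels.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 16))
y = X.mean(axis=1) + rng.normal(scale=0.1, size=500)
print(local_quantile_stack(X, y, X[-1], quantiles=[0.1, 0.5, 0.9], k=50))
```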
Related papers
- Quantile Regression using Random Forest Proximities [0.9423257767158634]
Quantile regression forests estimate the entire conditional distribution of the target variable with a single model (a minimal sketch of the idea appears after this list).
We show that quantile regression using Random Forest proximities outperforms the original QRF in approximating conditional target distributions and prediction intervals.
arXiv Detail & Related papers (2024-08-05T10:02:33Z)
- Predictive Churn with the Set of Good Models [64.05949860750235]
We study the effect of conflicting predictions over the set of near-optimal machine learning models.
We present theoretical results on the expected churn between models within the Rashomon set.
We show how our approach can be used to better anticipate, reduce, and avoid churn in consumer-facing applications.
arXiv Detail & Related papers (2024-02-12T16:15:25Z)
- Engression: Extrapolation through the Lens of Distributional Regression [2.519266955671697]
We propose a neural network-based distributional regression methodology called "engression".
An engression model is generative in the sense that we can sample from the fitted conditional distribution and is also suitable for high-dimensional outcomes.
We show that engression can successfully perform extrapolation under some assumptions such as monotonicity, whereas traditional regression approaches such as least-squares or quantile regression fall short under the same assumptions.
arXiv Detail & Related papers (2023-07-03T08:19:00Z)
- Masked Multi-Step Probabilistic Forecasting for Short-to-Mid-Term Electricity Demand [7.544120398993689]
Masked Multi-Step Multivariate Probabilistic Forecasting (MMMPF) is a novel and general framework for training any neural network model.
It combines both the temporal information from the past and the known information about the future to make probabilistic predictions.
MMMPF can also generate desired quantiles to capture uncertainty and enable probabilistic planning for the grid of the future.
arXiv Detail & Related papers (2023-02-14T04:09:03Z)
- Distributional Gradient Boosting Machines [77.34726150561087]
Our framework is based on XGBoost and LightGBM.
We show that our framework achieves state-of-the-art forecast accuracy.
arXiv Detail & Related papers (2022-04-02T06:32:19Z)
- Probabilistic Time Series Forecasting with Implicit Quantile Networks [0.7249731529275341]
We combine an autoregressive recurrent neural network to model temporal dynamics with Implicit Quantile Networks to learn a large class of distributions over a time-series target.
Our approach is favorable in terms of point-wise prediction accuracy as well as on estimating the underlying temporal distribution.
arXiv Detail & Related papers (2021-07-08T10:37:24Z)
- Flexible Model Aggregation for Quantile Regression [92.63075261170302]
Quantile regression is a fundamental problem in statistical learning motivated by a need to quantify uncertainty in predictions.
We investigate methods for aggregating any number of conditional quantile models.
All of the models we consider in this paper can be fit using modern deep learning toolkits.
arXiv Detail & Related papers (2021-02-26T23:21:16Z)
- Learning Interpretable Deep State Space Model for Probabilistic Time Series Forecasting [98.57851612518758]
Probabilistic time series forecasting involves estimating the distribution of future values based on its history.
We propose a deep state space model for probabilistic time series forecasting in which the non-linear emission model and transition model are parameterized by neural networks.
We show in experiments that our model produces accurate and sharp probabilistic forecasts.
arXiv Detail & Related papers (2021-01-31T06:49:33Z)
- Quantile Surfaces -- Generalizing Quantile Regression to Multivariate Targets [4.979758772307178]
Our approach is based on an extension of single-output quantile regression (QR) to multivariate targets, called quantile surfaces (QS).
We present a novel two-stage process: in the first stage, we perform a deterministic point forecast (i.e., central tendency estimation).
Subsequently, we model the prediction uncertainty using QS involving neural networks called quantile surface regression neural networks (QSNN).
We evaluate our novel approach on synthetic data and two currently researched real-world challenges in two different domains: first, probabilistic forecasting for renewable energy power generation, and second, short-term cyclist trajectory forecasting.
arXiv Detail & Related papers (2020-09-29T16:35:37Z)
- Video Prediction via Example Guidance [156.08546987158616]
In video prediction tasks, one major challenge is to capture the multi-modal nature of future contents and dynamics.
In this work, we propose a simple yet effective framework that can efficiently predict plausible future states.
arXiv Detail & Related papers (2020-07-03T14:57:24Z)
- Meta-learning framework with applications to zero-shot time-series forecasting [82.61728230984099]
This work provides positive evidence that zero-shot time-series forecasting is feasible, using a broad meta-learning framework.
Residual connections act as a meta-learning adaptation mechanism.
We show that it is viable to train a neural network on a source TS dataset and deploy it on a different target TS dataset without retraining.
arXiv Detail & Related papers (2020-02-07T16:39:43Z)
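As referenced in the first related paper above, a minimal sketch of the quantile regression forest idea (after Meinshausen's QRF) follows; it is an illustrative simplification, not the code of the cited papers, and the class name SimpleQRF and its unweighted leaf pooling are assumptions made here.

```python
# Toy quantile regression forest: a standard random forest plus, at prediction
# time, the empirical quantiles of the training targets that share a leaf with
# the query. Unweighted pooling is a simplification of Meinshausen's
# leaf-size-weighted formulation.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

class SimpleQRF:
    def __init__(self, **rf_kwargs):
        self.rf = RandomForestRegressor(**rf_kwargs)

    def fit(self, X, y):
        self.rf.fit(X, y)
        self.y_train_ = np.asarray(y)
        self.train_leaves_ = self.rf.apply(X)  # (n_samples, n_trees) leaf ids
        return self

    def predict_quantiles(self, X, quantiles):
        query_leaves = self.rf.apply(X)
        out = np.empty((X.shape[0], len(quantiles)))
        for i, leaves in enumerate(query_leaves):
            # Pool the training targets that land in the same leaf as the query, per tree.
            pooled = np.concatenate([
                self.y_train_[self.train_leaves_[:, t] == leaf]
                for t, leaf in enumerate(leaves)
            ])
            out[i] = np.quantile(pooled, quantiles)
        return out

# Toy usage
rng = np.random.default_rng(1)
X = rng.uniform(size=(300, 3))
y = X[:, 0] + rng.normal(scale=0.1, size=300)
qrf = SimpleQRF(n_estimators=100, min_samples_leaf=5).fit(X, y)
print(qrf.predict_quantiles(X[:2], quantiles=[0.05, 0.5, 0.95]))
```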