Load Forecasting for Households and Energy Communities: Are Deep Learning Models Worth the Effort?
- URL: http://arxiv.org/abs/2501.05000v2
- Date: Wed, 29 Jan 2025 15:58:28 GMT
- Title: Load Forecasting for Households and Energy Communities: Are Deep Learning Models Worth the Effort?
- Authors: Lukas Moosbrugger, Valentin Seiler, Philipp Wohlgenannt, Sebastian Hegenbart, Sashko Ristov, Peter Kepplinger
- Abstract summary: This study provides an extensive benchmark of state-of-the-art deep learning models for short-term load forecasting in energy communities.
LSTM, xLSTM, and Transformers are compared with benchmarks such as KNNs, synthetic load models, and persistence forecasting models.
- Abstract: Accurate load forecasting is crucial for predictive control in many energy domain applications, with significant economic and ecological implications. To address these implications, this study provides an extensive benchmark of state-of-the-art deep learning models for short-term load forecasting in energy communities. Namely, LSTM, xLSTM, and Transformers are compared with benchmarks such as KNNs, synthetic load models, and persistence forecasting models. This comparison considers different scales of aggregation (e.g., number of household loads) and varying training data availability (e.g., training data time spans). Further, the impact of transfer learning from synthetic (standard) load profiles and of deep learning model size (i.e., parameter count) on forecasting error is investigated. Implementations are publicly available, and other researchers are encouraged to benchmark models using this framework. Additionally, a comprehensive case study, comprising an energy community of 50 households and a battery storage system, demonstrates the beneficial financial implications of accurate predictions. Key findings of this research include: (1) Simple persistence benchmarks outperform deep learning models for short-term load forecasting when the available training data is limited to six months or less; (2) Pretraining with publicly available synthetic load profiles improves the normalized Mean Absolute Error (nMAE) by an average of 1.28%pt during the first nine months of training data; (3) Increased aggregation significantly enhances the performance of deep learning models relative to persistence benchmarks; (4) Improved load forecasting, with an nMAE reduction of 1.1%pt, translates to an economic benefit of approximately 600 EUR per year in an energy community comprising 50 households.
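To make the persistence baseline and the nMAE metric concrete, here is a minimal stdlib-only sketch (my own illustration, not the authors' published implementation; the 24-hour horizon, variable names, and example load values are assumptions):

```python
# Illustrative sketch: a 24-hour persistence forecast and the normalized
# MAE (nMAE) used to score it. Assumes hourly load samples and
# normalization by the mean of the actual load.

def persistence_forecast(load, horizon=24):
    """Predict that the next `horizon` hours repeat the last `horizon` observed hours."""
    return load[-horizon:]

def nmae(actual, predicted):
    """Mean absolute error normalized by the mean actual load."""
    mae = sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)
    return mae / (sum(actual) / len(actual))

history = [1.0, 1.2, 0.9, 1.1] * 6    # 24 h of observed household load (kW), invented
next_day = [1.1, 1.1, 1.0, 1.0] * 6   # the 24 h that actually occurred, invented
forecast = persistence_forecast(history, 24)
error = nmae(next_day, forecast)
```

The appeal of this baseline is that it needs no training data at all, which is why it is hard to beat when only a few months of measurements exist.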
Related papers
- Hourly Short Term Load Forecasting for Residential Buildings and Energy Communities [0.0]
We introduce persistence models, auto-regressive-based machine learning models, and more advanced deep learning models.
We observe a 15-30% increase in the prediction accuracy of the newly introduced hourly-based forecasting models over existing approaches.
arXiv Detail & Related papers (2025-01-31T15:49:09Z)
- Impact of ML Optimization Tactics on Greener Pre-Trained ML Models [46.78148962732881]
This study aims to (i) analyze image classification datasets and pre-trained models, (ii) improve inference efficiency by comparing optimized and non-optimized models, and (iii) assess the economic impact of the optimizations.
We conduct a controlled experiment to evaluate the impact of various PyTorch optimization techniques (dynamic quantization, torch.compile, local pruning, and global pruning) to 42 Hugging Face models for image classification.
Dynamic quantization demonstrates significant reductions in inference time and energy consumption, making it highly suitable for large-scale systems.
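As rough intuition for why dynamic quantization cuts inference cost, the toy sketch below (stdlib-only, my illustration; it is not PyTorch's actual implementation, which `torch.quantization.quantize_dynamic` provides for real models) stores weights as int8 values plus a single float scale:

```python
# Toy illustration of the int8 idea behind dynamic quantization:
# weights are stored as 8-bit integers plus one float scale factor,
# and dequantized on the fly when needed.

def quantize_int8(weights):
    """Map floats to int8 via a symmetric scale derived from the largest magnitude."""
    scale = max(abs(w) for w in weights) / 127 or 1.0
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from int8 values and the scale."""
    return [qi * scale for qi in q]

weights = [0.5, -1.27, 0.03, 1.0]      # invented example weights
q, scale = quantize_int8(weights)
restored = dequantize(q, scale)
```

Storing four times fewer bytes per weight shrinks memory traffic, which is where most of the reported time and energy savings come from.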
arXiv Detail & Related papers (2024-09-19T16:23:03Z)
- Improve Load Forecasting in Energy Communities through Transfer Learning using Open-Access Synthetic Profiles [1.124958340749622]
A 1% reduction in forecast error for a 10 GW energy utility can save up to $1.6 million annually.
We propose to pre-train the load prediction models with open-access synthetic load profiles using transfer learning techniques.
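The transfer-learning idea can be sketched in miniature (a deliberately simplified, stdlib-only illustration of mine; the real models are neural networks, and the profiles, learning rate, and step counts here are invented):

```python
# Sketch of pretrain-then-finetune: fit a one-parameter scaling model on a
# synthetic standard load profile, then fine-tune it briefly on measured data.

def fit_scale(profile, target, init=1.0, lr=0.1, steps=100):
    """Gradient descent on mean squared error for target ≈ scale * profile."""
    s = init
    for _ in range(steps):
        grad = sum(2 * (s * p - t) * p for p, t in zip(profile, target)) / len(profile)
        s -= lr * grad
    return s

synthetic = [0.5, 0.8, 1.2, 0.9]         # synthetic standard load profile (kW), invented
synthetic_target = [1.0, 1.6, 2.4, 1.8]  # pretraining target: twice the profile
measured = [1.1, 1.7, 2.5, 1.9]          # short span of measured household load, invented

pretrained = fit_scale(synthetic, synthetic_target)                    # "pretraining"
finetuned = fit_scale(synthetic, measured, init=pretrained, steps=20)  # "fine-tuning"
```

Starting fine-tuning from the pretrained value means far fewer steps on scarce measured data are needed, which mirrors the benefit the paper reports for short training spans.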
arXiv Detail & Related papers (2024-07-11T12:17:31Z)
- Impact of data for forecasting on performance of model predictive control in buildings with smart energy storage [0.0]
The impact on forecast accuracy of measures to improve model data efficiency is quantified.
The use of more than 2 years of training data for load prediction models provided no significant improvement in forecast accuracy.
Reused models and those trained with 3 months of data had on average 10% higher error than the baseline, indicating that deploying MPC systems without prior data collection may be economical.
arXiv Detail & Related papers (2024-02-19T21:01:11Z)
- Reusing Pretrained Models by Multi-linear Operators for Efficient Training [65.64075958382034]
Training large models from scratch usually costs a substantial amount of resources.
Recent studies such as bert2BERT and LiGO have reused small pretrained models to initialize a large model.
We propose a method that linearly correlates each weight of the target model to all the weights of the pretrained model.
arXiv Detail & Related papers (2023-10-16T06:16:47Z)
- DECODE: Data-driven Energy Consumption Prediction leveraging Historical Data and Environmental Factors in Buildings [1.2891210250935148]
This paper introduces a Long Short-Term Memory (LSTM) model designed to forecast building energy consumption.
The LSTM model provides accurate short, medium, and long-term energy predictions for residential and commercial buildings.
It demonstrates exceptional prediction accuracy, boasting the highest R2 score of 0.97 and the most favorable mean absolute error (MAE) of 0.007.
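For reference, the two metrics quoted above follow their standard definitions; this stdlib-only sketch is mine, and the example values are invented (the paper's exact preprocessing and units are not reproduced):

```python
# Standard definitions of the two accuracy metrics cited above.

def mae(actual, predicted):
    """Mean absolute error."""
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

def r2_score(actual, predicted):
    """Coefficient of determination: 1 minus residual over total sum of squares."""
    mean_a = sum(actual) / len(actual)
    ss_res = sum((a - p) ** 2 for a, p in zip(actual, predicted))
    ss_tot = sum((a - mean_a) ** 2 for a in actual)
    return 1 - ss_res / ss_tot

actual = [1.0, 2.0, 3.0, 4.0]       # invented example consumption values
predicted = [1.1, 1.9, 3.2, 3.8]    # invented example predictions
```

Note that an R2 of 0.97 and an MAE of 0.007 are only comparable across studies when the target scale and normalization match, which is one reason the main paper reports normalized MAE instead.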
arXiv Detail & Related papers (2023-09-06T11:02:53Z)
- Meta-Regression Analysis of Errors in Short-Term Electricity Load Forecasting [0.0]
We present a Meta-Regression Analysis (MRA) that examines factors that influence the accuracy of short-term electricity load forecasts.
We use data from 421 forecast models published in 59 studies.
We found the LSTM approach and a combination of neural networks with other approaches to be the best forecasting methods.
arXiv Detail & Related papers (2023-05-29T18:26:51Z)
- Learning Sample Difficulty from Pre-trained Models for Reliable Prediction [55.77136037458667]
We propose to utilize large-scale pre-trained models to guide downstream model training with sample difficulty-aware entropy regularization.
We simultaneously improve accuracy and uncertainty calibration across challenging benchmarks.
arXiv Detail & Related papers (2023-04-20T07:29:23Z)
- A Hybrid Model for Forecasting Short-Term Electricity Demand [59.372588316558826]
Currently, the UK electricity market is guided by load (demand) forecasts published every thirty minutes by the regulator.
We present HYENA: a hybrid predictive model that combines feature engineering (selection of the candidate predictor features), mobile-window predictors and LSTM encoder-decoders.
arXiv Detail & Related papers (2022-05-20T22:13:25Z)
- Sparse MoEs meet Efficient Ensembles [49.313497379189315]
We study the interplay of two popular classes of such models: ensembles of neural networks and sparse mixtures of experts (sparse MoEs).
We present Efficient Ensemble of Experts (E$3$), a scalable and simple ensemble of sparse MoEs that takes the best of both classes of models, while using up to 45% fewer FLOPs than a deep ensemble.
arXiv Detail & Related papers (2021-10-07T11:58:35Z)
- Back2Future: Leveraging Backfill Dynamics for Improving Real-time Predictions in Future [73.03458424369657]
In real-time forecasting in public health, data collection is a non-trivial and demanding task.
The 'backfill' phenomenon and its effect on model performance have barely been studied in the prior literature.
We formulate a novel problem and neural framework Back2Future that aims to refine a given model's predictions in real-time.
arXiv Detail & Related papers (2021-06-08T14:48:20Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.