A comparative assessment of deep learning models for day-ahead load
forecasting: Investigating key accuracy drivers
- URL: http://arxiv.org/abs/2302.12168v2
- Date: Mon, 25 Sep 2023 13:57:27 GMT
- Title: A comparative assessment of deep learning models for day-ahead load
forecasting: Investigating key accuracy drivers
- Authors: Sotiris Pelekis, Ioannis-Konstantinos Seisopoulos, Evangelos
Spiliotis, Theodosios Pountridis, Evangelos Karakolis, Spiros Mouzakitis,
Dimitris Askounis
- Abstract summary: Short-term load forecasting (STLF) is vital for the effective and economic operation of power grids and energy markets.
Several deep learning models have been proposed in the literature for STLF, reporting promising results.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Short-term load forecasting (STLF) is vital for the effective and economic
operation of power grids and energy markets. However, the non-linearity and
non-stationarity of electricity demand, as well as its dependency on various
external factors, render STLF a challenging task. To that end, several deep
learning models have been proposed in the literature for STLF, reporting
promising results. In order to evaluate the accuracy of said models in
day-ahead forecasting settings, in this paper we focus on the national net
aggregated STLF of Portugal and conduct a comparative study considering a set
of indicative, well-established deep autoregressive models, namely multi-layer
perceptrons (MLP), long short-term memory networks (LSTM), neural basis
expansion coefficient analysis (N-BEATS), temporal convolutional networks
(TCN), and temporal fusion transformers (TFT). Moreover, we identify factors
that significantly affect the demand and investigate their impact on the
accuracy of each model. Our results suggest that N-BEATS consistently
outperforms the rest of the examined models. MLP follows, providing further
evidence towards the use of feed-forward networks over relatively more
sophisticated architectures. Finally, certain calendar and weather features
like the hour of the day and the temperature are identified as key accuracy
drivers, providing insights regarding the forecasting approach that should be
used per case.
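A day-ahead rolling evaluation of the kind described above can be sketched in plain Python; the synthetic hourly load, the seasonal-naive and moving-average baselines, and the choice of MAPE are illustrative assumptions, not the paper's actual models or data:

```python
import math

def mape(actual, forecast):
    """Mean absolute percentage error over paired lists."""
    return 100.0 * sum(abs((a - f) / a) for a, f in zip(actual, forecast)) / len(actual)

def seasonal_naive(history, horizon, season=24):
    """Repeat the last full seasonal cycle as the forecast."""
    cycle = history[-season:]
    return [cycle[i % season] for i in range(horizon)]

def moving_average(history, horizon, window=24):
    """Forecast every step ahead with the mean of the last `window` points."""
    level = sum(history[-window:]) / window
    return [level] * horizon

# Synthetic hourly "load" with a daily cycle plus a mild trend (illustrative only).
series = [100 + 20 * math.sin(2 * math.pi * (t % 24) / 24) + 0.01 * t
          for t in range(24 * 60)]  # 60 days

# Rolling day-ahead backtest: forecast each of the last 7 days from all prior data.
models = {"seasonal_naive": seasonal_naive, "moving_average": moving_average}
scores = {name: [] for name in models}
for day in range(53, 60):
    cutoff = day * 24
    history, actual = series[:cutoff], series[cutoff:cutoff + 24]
    for name, model in models.items():
        scores[name].append(mape(actual, model(history, horizon=24)))

for name, daily in scores.items():
    print(f"{name}: mean day-ahead MAPE = {sum(daily) / len(daily):.2f}%")
```

On this toy series the seasonal-naive baseline wins easily because the signal is dominated by the daily cycle, which mirrors the paper's point that the right approach depends on which calendar and weather drivers shape the demand.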
Related papers
- Improving Time Series Forecasting via Instance-aware Post-hoc Revision [44.90322487625981]
Time series forecasting plays a vital role in various real-world applications. Recent methods have achieved remarkable accuracy by incorporating advanced inductive biases and training strategies. We propose a model-agnostic framework, PIR, designed to enhance forecasting performance through Post-forecasting Identification and Revision.
arXiv Detail & Related papers (2025-05-29T15:56:41Z)
- IISE PG&E Energy Analytics Challenge 2025: Hourly-Binned Regression Models Beat Transformers in Load Forecasting [0.0]
This study evaluates forecasting models ranging from classical regression techniques to advanced deep learning architectures. The dataset includes two years of historical electricity load data, alongside temperature and global horizontal irradiance (GHI) across five sites. Our results reveal that deep learning models, including TimeGPT, fail to consistently outperform simpler statistical and machine learning approaches.
arXiv Detail & Related papers (2025-05-16T15:55:34Z)
- Towards Physically Consistent Deep Learning For Climate Model Parameterizations [46.07009109585047]
Parameterizations are a major source of systematic errors and large uncertainties in climate projections.
Deep learning (DL)-based parameterizations, trained on data from computationally expensive short, high-resolution simulations, have shown great promise for improving climate models.
We propose an efficient supervised learning framework for DL-based parameterizations that leads to physically consistent models.
arXiv Detail & Related papers (2024-06-06T10:02:49Z)
- Weather Prediction with Diffusion Guided by Realistic Forecast Processes [49.07556359513563]
We introduce a novel method that applies diffusion models (DM) for weather forecasting.
Our method can achieve both direct and iterative forecasting with the same modeling framework.
The flexibility and controllability of our model empowers a more trustworthy DL system for the general weather community.
arXiv Detail & Related papers (2024-02-06T21:28:42Z)
- Cumulative Distribution Function based General Temporal Point Processes [49.758080415846884]
The CuFun model represents a novel approach to TPPs that revolves around the Cumulative Distribution Function (CDF).
Our approach addresses several critical issues inherent in traditional TPP modeling.
Our contributions encompass the introduction of a pioneering CDF-based TPP model and the development of a methodology for incorporating past event information into future event prediction.
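As a toy illustration of the CDF-centric idea (not the CuFun architecture itself), the inter-event-time distribution can be modeled through its CDF and the density recovered as its derivative for likelihood computation; the Weibull CDF below is a hypothetical stand-in for a learned monotone network:

```python
import math

def weibull_cdf(t, k=1.5, lam=2.0):
    """Stand-in monotone CDF for the inter-event time (a CDF-based TPP would learn this)."""
    return 1.0 - math.exp(-((t / lam) ** k))

def density(cdf, t, eps=1e-5):
    """Recover the density f = F' by central finite differences."""
    return (cdf(t + eps) - cdf(t - eps)) / (2 * eps)

def log_likelihood(cdf, inter_event_times):
    """Sum of log-densities of observed gaps between consecutive events."""
    return sum(math.log(density(cdf, t)) for t in inter_event_times)

gaps = [0.8, 1.7, 2.3, 0.5, 1.1]  # illustrative inter-event times
print(f"log-likelihood = {log_likelihood(weibull_cdf, gaps):.4f}")
```

Modeling the CDF directly guarantees a valid (monotone, normalized) distribution, which is one of the issues with intensity-based parameterizations that this line of work addresses.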
arXiv Detail & Related papers (2024-02-01T07:21:30Z)
- In Search of Deep Learning Architectures for Load Forecasting: A Comparative Analysis and the Impact of the Covid-19 Pandemic on Model Performance [0.0]
Short-term load forecasting (STLF) is crucial to optimizing the reliability, emissions, and costs of power systems.
This work conducts a comparative study of Deep Learning (DL) architectures, with respect to forecasting accuracy and training sustainability.
The case study focuses on day-ahead forecasts for the Portuguese national 15-minute resolution net load time series.
arXiv Detail & Related papers (2023-02-25T10:08:23Z)
- Towards Long-Term Time-Series Forecasting: Feature, Pattern, and Distribution [57.71199089609161]
Long-term time-series forecasting (LTTF) has become a pressing demand in many applications, such as wind power supply planning.
Transformer models have been adopted to deliver high prediction capacity, but their self-attention mechanism is computationally expensive.
We propose an efficient Transformer-based model, named Conformer, which differentiates itself from existing methods for LTTF in three aspects.
arXiv Detail & Related papers (2023-01-05T13:59:29Z)
- DeepVol: Volatility Forecasting from High-Frequency Data with Dilated Causal Convolutions [53.37679435230207]
We propose DeepVol, a model based on Dilated Causal Convolutions that uses high-frequency data to forecast day-ahead volatility.
Our empirical results suggest that the proposed deep learning-based approach effectively learns global features from high-frequency data.
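The dilated causal convolution underlying such models can be sketched as follows; the kernel weights and dilation schedule here are illustrative, not DeepVol's learned filters:

```python
def dilated_causal_conv(x, weights, dilation):
    """Causal conv: output[t] depends only on x[t], x[t-d], x[t-2d], ...
    Missing (pre-series) inputs are treated as zero (left padding)."""
    out = []
    for t in range(len(x)):
        acc = 0.0
        for i, w in enumerate(weights):
            j = t - i * dilation
            if j >= 0:
                acc += w * x[j]
        out.append(acc)
    return out

# Stacking layers with dilations 1, 2, 4, 8 grows the receptive field
# exponentially: with kernel size 2, four layers see 16 past steps.
x = [float(v) for v in range(16)]
h = x
for d in (1, 2, 4, 8):
    h = dilated_causal_conv(h, weights=[0.5, 0.5], dilation=d)
print(h)  # h[15] now depends on x[0..15] (receptive field 16)
```

The exponential receptive-field growth is what lets such networks digest long high-frequency input windows with few layers while remaining strictly causal.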
arXiv Detail & Related papers (2022-09-23T16:13:47Z)
- Probabilistic AutoRegressive Neural Networks for Accurate Long-range Forecasting [6.295157260756792]
We introduce the Probabilistic AutoRegressive Neural Networks (PARNN).
PARNN is capable of handling complex time series data exhibiting non-stationarity, nonlinearity, non-seasonality, long-range dependence, and chaotic patterns.
We evaluate the performance of PARNN against standard statistical, machine learning, and deep learning models, including Transformers, NBeats, and DeepAR.
arXiv Detail & Related papers (2022-04-01T17:57:36Z)
- Advanced Statistical Learning on Short Term Load Process Forecasting [13.466565318976887]
Short Term Load Forecast (STLF) is necessary for effective scheduling, operation, optimization, trading, and decision-making for electricity consumers.
We propose different statistical nonlinear models to manage these challenges of hard type datasets and forecast 15-min frequency electricity load up to 2-days ahead.
arXiv Detail & Related papers (2021-10-19T12:32:40Z)
- Low-Rank Temporal Attention-Augmented Bilinear Network for financial time-series forecasting [93.73198973454944]
Deep learning models have led to significant performance improvements in many problems coming from different domains, including prediction problems of financial time-series data.
The Temporal Attention-Augmented Bilinear network was recently proposed as an efficient and high-performing model for Limit Order Book time-series forecasting.
In this paper, we propose a low-rank tensor approximation of the model to further reduce the number of trainable parameters and increase its speed.
arXiv Detail & Related papers (2021-07-05T10:15:23Z)
- When in Doubt: Neural Non-Parametric Uncertainty Quantification for Epidemic Forecasting [70.54920804222031]
Most existing forecasting models disregard uncertainty quantification, resulting in mis-calibrated predictions.
Recent works in deep neural models for uncertainty-aware time-series forecasting also have several limitations.
We model the forecasting task as a probabilistic generative process and propose a functional neural process model called EPIFNP.
arXiv Detail & Related papers (2021-06-07T18:31:47Z)
- Learning Interpretable Deep State Space Model for Probabilistic Time Series Forecasting [98.57851612518758]
Probabilistic time series forecasting involves estimating the distribution of future values based on its history.
We propose a deep state space model for probabilistic time series forecasting whereby the non-linear emission model and transition model are parameterized by networks.
We show in experiments that our model produces accurate and sharp probabilistic forecasts.
arXiv Detail & Related papers (2021-01-31T06:49:33Z)
- Energy Forecasting in Smart Grid Systems: A Review of the State-of-the-art Techniques [2.3436632098950456]
This paper presents a review of state-of-the-art forecasting methods for smart grid (SG) systems.
Traditional point forecasting methods including statistical, machine learning (ML), and deep learning (DL) are extensively investigated.
A comparative case study using the Victorian electricity consumption and American Electric Power (AEP) datasets is conducted.
arXiv Detail & Related papers (2020-11-25T09:17:07Z)
- Spatiotemporal Adaptive Neural Network for Long-term Forecasting of Financial Time Series [0.2793095554369281]
We investigate whether deep neural networks (DNNs) can be used to forecast multiple time series (TS) conjointly.
We make use of the dynamic factor graph (DFG) to build a multivariate autoregressive model.
With ACTM, it is possible to vary the autoregressive order of a TS model over time and model a larger set of probability distributions.
arXiv Detail & Related papers (2020-03-27T00:53:11Z)
- Transformer Hawkes Process [79.16290557505211]
We propose a Transformer Hawkes Process (THP) model, which leverages the self-attention mechanism to capture long-term dependencies.
THP outperforms existing models in terms of both likelihood and event prediction accuracy by a notable margin.
We provide a concrete example, where THP achieves improved prediction performance for learning multiple point processes when incorporating their relational information.
arXiv Detail & Related papers (2020-02-21T13:48:13Z)
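The classical self-exciting intensity that a Hawkes process captures, and which THP generalizes by replacing the fixed exponential kernel with self-attention, can be sketched directly; the parameters below are illustrative:

```python
import math

def hawkes_intensity(t, events, mu=0.2, alpha=0.8, beta=1.0):
    """Conditional intensity of a univariate Hawkes process:
    lambda(t) = mu + sum over past events t_i < t of alpha * exp(-beta * (t - t_i))."""
    return mu + sum(alpha * math.exp(-beta * (t - ti)) for ti in events if ti < t)

events = [1.0, 1.5, 3.2]  # illustrative event timestamps
for t in (0.5, 2.0, 4.0):
    print(f"lambda({t}) = {hawkes_intensity(t, events):.4f}")
```

Before the first event the intensity sits at the base rate mu, and each event temporarily raises it; the fixed exponential decay is exactly the inductive bias that attention-based variants like THP relax to capture long-term dependencies.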
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences.