Advanced Statistical Learning on Short Term Load Process Forecasting
- URL: http://arxiv.org/abs/2110.09920v1
- Date: Tue, 19 Oct 2021 12:32:40 GMT
- Title: Advanced Statistical Learning on Short Term Load Process Forecasting
- Authors: Junjie Hu, Brenda López Cabrera, Awdesch Melzer
- Abstract summary: Short Term Load Forecast (STLF) is necessary for effective scheduling, operation optimization, trading, and decision-making for electricity consumers.
We propose different statistical nonlinear models to manage these challenging datasets and forecast 15-min frequency electricity load up to 2 days ahead.
- Score: 13.466565318976887
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Short Term Load Forecast (STLF) is necessary for effective scheduling,
operation optimization, trading, and decision-making for electricity consumers.
Modern and efficient machine learning methods are increasingly employed to manage
large, structurally complex datasets, which are characterized by a nonlinear
temporal dependence structure. We propose different statistical nonlinear models
to address the challenges of such demanding datasets and forecast 15-min
frequency electricity load up to 2 days ahead. We show that the Long Short-Term
Memory (LSTM) and the Gated Recurrent Unit (GRU) models applied to the
production line of a chemical production facility outperform several other
predictive models in terms of out-of-sample forecasting accuracy, as assessed by
the Diebold-Mariano (DM) test under several metrics. This predictive information
is fundamental for the risk and production management of electricity consumers.
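The out-of-sample comparison above relies on the Diebold-Mariano test. As a rough, self-contained sketch (not the paper's implementation; the error series, loss choice, and variable names below are hypothetical), the DM statistic for two competing forecast-error series can be computed as:

```python
import math

def dm_test(e1, e2, h=1, power=2):
    """Diebold-Mariano statistic for equal predictive accuracy.

    e1, e2 : out-of-sample forecast errors of two competing models
    h      : forecast horizon (controls lags in the variance estimate)
    power  : loss exponent (2 = squared-error loss, 1 = absolute-error loss)
    Under the null of equal accuracy the statistic is asymptotically N(0, 1);
    a large positive value favors the second model.
    """
    n = len(e1)
    # Loss differential series d_t = L(e1_t) - L(e2_t)
    d = [abs(a) ** power - abs(b) ** power for a, b in zip(e1, e2)]
    mean_d = sum(d) / n

    def autocov(lag):
        return sum((d[t] - mean_d) * (d[t - lag] - mean_d)
                   for t in range(lag, n)) / n

    # Long-run variance of d with h-1 autocovariance terms
    var_d = autocov(0) + 2.0 * sum(autocov(k) for k in range(1, h))
    return mean_d / math.sqrt(var_d / n)

# Hypothetical errors: a naive model vs. a sharper recurrent model.
e_naive = [1.0, -1.2, 0.9, -1.1, 1.0, -0.8, 1.1, -1.0]
e_rnn = [0.1, -0.2, 0.15, -0.1, 0.2, -0.05, 0.1, -0.15]
stat = dm_test(e_naive, e_rnn)  # positive: the second series is more accurate
```

The statistic is compared against standard-normal critical values; in practice the paper's setting would also involve the small-sample correction and multiple loss metrics.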
Related papers
- Multi-variable Adversarial Time-Series Forecast Model [0.7832189413179361]
Short-term power-system forecasting for industrial enterprises is an important issue for both load control and machine protection.
We propose a new framework, multi-variable adversarial time-series forecasting model, which regularizes Long Short-term Memory (LSTM) models via an adversarial process.
arXiv Detail & Related papers (2024-06-02T02:30:10Z)
- Impact of data for forecasting on performance of model predictive control in buildings with smart energy storage [0.0]
The impact on forecast accuracy of measures to improve model data efficiency is quantified.
The use of more than 2 years of training data for load prediction models provided no significant improvement in forecast accuracy.
Reused models and those trained with 3 months of data had on average 10% higher error than baseline, indicating that deploying MPC systems without prior data collection may be economical.
arXiv Detail & Related papers (2024-02-19T21:01:11Z)
- TranDRL: A Transformer-Driven Deep Reinforcement Learning Enabled Prescriptive Maintenance Framework [58.474610046294856]
Industrial systems demand reliable predictive maintenance strategies to enhance operational efficiency and reduce downtime.
This paper introduces an integrated framework that leverages the capabilities of the Transformer model-based neural networks and deep reinforcement learning (DRL) algorithms to optimize system maintenance actions.
arXiv Detail & Related papers (2023-09-29T02:27:54Z)
- A comparative assessment of deep learning models for day-ahead load forecasting: Investigating key accuracy drivers [2.572906392867547]
Short-term load forecasting (STLF) is vital for the effective and economic operation of power grids and energy markets.
Several deep learning models have been proposed in the literature for STLF, reporting promising results.
arXiv Detail & Related papers (2023-02-23T17:11:04Z)
- Online Evolutionary Neural Architecture Search for Multivariate Non-Stationary Time Series Forecasting [72.89994745876086]
This work presents the Online Neuro-Evolution-based Neural Architecture Search (ONE-NAS) algorithm.
ONE-NAS is a novel neural architecture search method capable of automatically designing and dynamically training recurrent neural networks (RNNs) for online forecasting tasks.
Results demonstrate that ONE-NAS outperforms traditional statistical time series forecasting methods.
arXiv Detail & Related papers (2023-02-20T22:25:47Z)
- Towards Long-Term Time-Series Forecasting: Feature, Pattern, and Distribution [57.71199089609161]
Long-term time-series forecasting (LTTF) has become a pressing demand in many applications, such as wind power supply planning.
Transformer models have been adopted to deliver high prediction capacity through the self-attention mechanism, despite its high computational cost.
We propose an efficient Transformer-based model, named Conformer, which differentiates itself from existing methods for LTTF in three aspects.
arXiv Detail & Related papers (2023-01-05T13:59:29Z)
- Short-term Prediction of Household Electricity Consumption Using Customized LSTM and GRU Models [5.8010446129208155]
This paper proposes a customized GRU (Gated Recurrent Unit) and Long Short-Term Memory (LSTM) architecture to address this challenging problem.
The electricity consumption datasets were obtained from individual household smart meters.
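The customized architectures above build on the standard GRU cell. As a minimal illustrative sketch (scalar input and hidden state, made-up parameter values, not the paper's customized model), one GRU update can be written as:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def gru_step(x, h, p):
    """One standard GRU update for scalar input x and scalar hidden state h.
    p holds the six weights and three biases (names are illustrative)."""
    z = sigmoid(p["wz"] * x + p["uz"] * h + p["bz"])               # update gate
    r = sigmoid(p["wr"] * x + p["ur"] * h + p["br"])               # reset gate
    h_cand = math.tanh(p["wh"] * x + p["uh"] * (r * h) + p["bh"])  # candidate state
    return (1.0 - z) * h + z * h_cand                              # convex combination

# Roll the cell over a toy consumption-like sequence.
params = {"wz": 0.5, "uz": 0.1, "bz": 0.0,
          "wr": 0.5, "ur": 0.1, "br": 0.0,
          "wh": 1.0, "uh": 0.5, "bh": 0.0}
h = 0.0
for x in [0.2, 0.8, 0.5, 0.1]:
    h = gru_step(x, h, params)
```

Because the new state is a convex combination of the previous state and a tanh candidate, the hidden state stays bounded in (-1, 1), which is part of what makes gated recurrent cells stable over long sequences.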
arXiv Detail & Related papers (2022-12-16T23:42:57Z)
- Stabilizing Machine Learning Prediction of Dynamics: Noise and Noise-inspired Regularization [58.720142291102135]
Recent work has shown that machine learning (ML) models can be trained to accurately forecast the dynamics of chaotic dynamical systems.
In the absence of mitigating techniques, however, this approach can result in artificially rapid error growth, leading to inaccurate predictions and/or climate instability.
We introduce Linearized Multi-Noise Training (LMNT), a regularization technique that deterministically approximates the effect of many small, independent noise realizations added to the model input during training.
arXiv Detail & Related papers (2022-11-09T23:40:52Z)
- Grouped self-attention mechanism for a memory-efficient Transformer [64.0125322353281]
Real-world tasks such as forecasting weather, electricity consumption, and stock prices involve predicting data that vary over time.
Time-series data are generally recorded over a long period of observation with long sequences owing to their periodic characteristics and long-range dependencies over time.
We propose two novel modules, Grouped Self-Attention (GSA) and Compressed Cross-Attention (CCA).
Our proposed model exhibited reduced computational complexity and performance comparable to or better than existing methods.
arXiv Detail & Related papers (2022-10-02T06:58:49Z)
- Appliance Level Short-term Load Forecasting via Recurrent Neural Network [6.351541960369854]
We present an STLF algorithm for efficiently predicting the power consumption of individual electrical appliances.
The proposed method builds upon a powerful recurrent neural network (RNN) architecture in deep learning.
arXiv Detail & Related papers (2021-11-23T16:56:37Z)
- Transformer Hawkes Process [79.16290557505211]
We propose a Transformer Hawkes Process (THP) model, which leverages the self-attention mechanism to capture long-term dependencies.
THP outperforms existing models in terms of both likelihood and event prediction accuracy by a notable margin.
We provide a concrete example, where THP achieves improved prediction performance for learning multiple point processes when incorporating their relational information.
arXiv Detail & Related papers (2020-02-21T13:48:13Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.