An Adaptive Deep Learning Framework for Day-ahead Forecasting of
Photovoltaic Power Generation
- URL: http://arxiv.org/abs/2109.13442v1
- Date: Tue, 28 Sep 2021 02:39:56 GMT
- Title: An Adaptive Deep Learning Framework for Day-ahead Forecasting of
Photovoltaic Power Generation
- Authors: Xing Luo, Dongxiao Zhang
- Abstract summary: This paper proposes an adaptive LSTM (AD-LSTM) model, which is a DL framework that can not only acquire general knowledge from historical data, but also dynamically learn specific knowledge from newly-arrived data.
The developed AD-LSTM model demonstrates greater forecasting capability than the offline LSTM model, particularly in the presence of concept drift.
- Score: 0.8702432681310401
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Accurate forecasts of photovoltaic power generation (PVPG) are essential to
optimize operations between energy supply and demand. Recently, the
proliferation of sensors and smart meters has produced an enormous volume of
data, which supports the development of data-driven PVPG forecasting. Although emerging deep
learning (DL) models, such as the long short-term memory (LSTM) model, based on
historical data, have provided effective solutions for PVPG forecasting with
great success, these models rely on offline learning. As a result, DL models
cannot take advantage of the opportunity to learn from newly-arrived data, and
are unable to handle concept drift caused by installing extra PV units and
unforeseen PV unit failures. Consequently, to improve day-ahead PVPG
forecasting accuracy, as well as eliminate the impacts of concept drift, this
paper proposes an adaptive LSTM (AD-LSTM) model, which is a DL framework that
can not only acquire general knowledge from historical data, but also
dynamically learn specific knowledge from newly-arrived data. A two-phase
adaptive learning strategy (TP-ALS) is integrated into AD-LSTM, and a sliding
window (SDWIN) algorithm is proposed, to detect concept drift in PV systems.
Multiple datasets from PV systems are utilized to assess the feasibility and
effectiveness of the proposed approaches. The developed AD-LSTM model
demonstrates greater forecasting capability than the offline LSTM model,
particularly in the presence of concept drift. Additionally, the proposed
AD-LSTM model also achieves superior performance in terms of day-ahead PVPG
forecasting compared to other traditional machine learning models and
statistical models in the literature.
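The abstract describes a sliding window (SDWIN) algorithm for detecting concept drift in PV systems, which in turn triggers adaptive re-learning. A minimal sketch of the sliding-window idea is shown below; the window sizes, the statistic (mean absolute forecast error), the threshold, and the function name `detect_drift` are all illustrative assumptions, not the paper's exact algorithm.

```python
def detect_drift(errors, ref_size=30, cur_size=10, threshold=2.0):
    """Flag concept drift when the mean forecast error in a recent window
    grows far beyond that of a reference window (e.g. after extra PV units
    are installed or a PV unit fails)."""
    if len(errors) < ref_size + cur_size:
        return False  # not enough history yet
    reference = errors[-(ref_size + cur_size):-cur_size]
    current = errors[-cur_size:]
    ref_mean = sum(reference) / len(reference)
    cur_mean = sum(current) / len(current)
    # drift if the current error exceeds threshold x the reference error
    return cur_mean > threshold * max(ref_mean, 1e-9)

# Usage: stable errors, then a jump simulating a PV unit failure
stable = [0.1] * 40
drifted = stable + [0.5] * 10
print(detect_drift(stable))   # False
print(detect_drift(drifted))  # True
```

In an AD-LSTM-style framework, a positive drift signal would trigger the second phase of adaptive learning, i.e. fine-tuning the offline-trained model on the newly-arrived data.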
Related papers
- Self-Augmented Preference Optimization: Off-Policy Paradigms for Language Model Alignment [104.18002641195442]
We introduce Self-Augmented Preference Optimization (SAPO), an effective and scalable training paradigm that does not require existing paired data.
Building on the self-play concept, which autonomously generates negative responses, we further incorporate an off-policy learning pipeline to enhance data exploration and exploitation.
arXiv Detail & Related papers (2024-05-31T14:21:04Z) - Diffusion-Based Neural Network Weights Generation [80.89706112736353]
D2NWG is a diffusion-based neural network weights generation technique that efficiently produces high-performing weights for transfer learning.
Our method extends generative hyper-representation learning to recast the latent diffusion paradigm for neural network weights generation.
Our approach is scalable to large architectures such as large language models (LLMs), overcoming the limitations of current parameter generation techniques.
arXiv Detail & Related papers (2024-02-28T08:34:23Z) - PILOT: A Pre-Trained Model-Based Continual Learning Toolbox [71.63186089279218]
This paper introduces a pre-trained model-based continual learning toolbox known as PILOT.
On the one hand, PILOT implements some state-of-the-art class-incremental learning algorithms based on pre-trained models, such as L2P, DualPrompt, and CODA-Prompt.
On the other hand, PILOT fits typical class-incremental learning algorithms within the context of pre-trained models to evaluate their effectiveness.
arXiv Detail & Related papers (2023-09-13T17:55:11Z) - MATNet: Multi-Level Fusion Transformer-Based Model for Day-Ahead PV
Generation Forecasting [0.47518865271427785]
MATNet is a novel self-attention transformer-based architecture for PV power generation forecasting.
It consists of a hybrid approach that combines the AI paradigm with the prior physical knowledge of PV power generation.
Results show that our proposed architecture significantly outperforms the current state-of-the-art methods.
arXiv Detail & Related papers (2023-06-17T14:03:09Z) - DA-LSTM: A Dynamic Drift-Adaptive Learning Framework for Interval Load
Forecasting with LSTM Networks [1.3342521220589318]
A drift-magnitude threshold must be defined when designing change-detection methods to identify drifts.
We propose a dynamic drift-adaptive Long Short-Term Memory (DA-LSTM) framework that can improve the performance of load forecasting models.
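The DA-LSTM summary notes that a drift-magnitude threshold must be defined for change detection. One simple, illustrative way to set such a threshold dynamically is from the statistics of recent errors; the name `dynamic_threshold` and the mean-plus-k-sigma rule are assumptions for illustration, not the paper's method.

```python
import statistics

def dynamic_threshold(errors, k=3.0):
    """Illustrative drift-magnitude threshold: mean + k * std of
    recent forecast errors, so the threshold adapts to the load's
    own variability instead of being fixed by hand."""
    mean = statistics.fmean(errors)
    std = statistics.pstdev(errors)
    return mean + k * std

# Usage: constant errors give a threshold equal to their mean
print(dynamic_threshold([1.0] * 10))  # 1.0
```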
arXiv Detail & Related papers (2023-05-15T16:26:03Z) - Forecasting Intraday Power Output by a Set of PV Systems using Recurrent Neural Networks and Physical Covariates [0.0]
Accurate forecasts of the power output by PhotoVoltaic (PV) systems are critical to improve the operation of energy distribution grids.
We describe a neural autoregressive model that aims to perform such intraday forecasts.
arXiv Detail & Related papers (2023-03-15T09:03:58Z) - A comparative assessment of deep learning models for day-ahead load
forecasting: Investigating key accuracy drivers [2.572906392867547]
Short-term load forecasting (STLF) is vital for the effective and economic operation of power grids and energy markets.
Several deep learning models have been proposed in the literature for STLF, reporting promising results.
arXiv Detail & Related papers (2023-02-23T17:11:04Z) - Towards Long-Term Time-Series Forecasting: Feature, Pattern, and
Distribution [57.71199089609161]
Long-term time-series forecasting (LTTF) has become a pressing demand in many applications, such as wind power supply planning.
Transformer models have been adopted to deliver high prediction capacity thanks to their self-attention mechanism, despite its high computational cost.
We propose an efficient Transformer-based model, named Conformer, which differentiates itself from existing methods for LTTF in three aspects.
arXiv Detail & Related papers (2023-01-05T13:59:29Z) - How robust are pre-trained models to distribution shift? [82.08946007821184]
We show how spurious correlations affect the performance of popular self-supervised learning (SSL) and auto-encoder-based (AE) models.
We develop a novel evaluation scheme with the linear head trained on out-of-distribution (OOD) data, to isolate the performance of the pre-trained models from a potential bias of the linear head used for evaluation.
arXiv Detail & Related papers (2022-06-17T16:18:28Z) - Interpretable AI-based Large-scale 3D Pathloss Prediction Model for
enabling Emerging Self-Driving Networks [3.710841042000923]
We propose a Machine Learning-based model that leverages novel key predictors for estimating pathloss.
By quantitatively evaluating the ability of various ML algorithms in terms of predictive, generalization and computational performance, our results show that Light Gradient Boosting Machine (LightGBM) algorithm overall outperforms others.
arXiv Detail & Related papers (2022-01-30T19:50:16Z) - Learning representations with end-to-end models for improved remaining
useful life prognostics [64.80885001058572]
The Remaining Useful Life (RUL) of equipment is defined as the duration between the current time and its failure.
We propose an end-to-end deep learning model based on multi-layer perceptron and long short-term memory layers (LSTM) to predict the RUL.
We will discuss how the proposed end-to-end model is able to achieve such good results and compare it to other deep learning and state-of-the-art methods.
arXiv Detail & Related papers (2021-04-11T16:45:18Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.