AutoPV: Automated photovoltaic forecasts with limited information using
an ensemble of pre-trained models
- URL: http://arxiv.org/abs/2212.06797v1
- Date: Tue, 13 Dec 2022 18:29:03 GMT
- Authors: Stefan Meisenbacher, Benedikt Heidrich, Tim Martin, Ralf Mikut, Veit
Hagenmeyer
- Abstract summary: We propose a new method for day-ahead PV power generation forecasts called AutoPV.
AutoPV is a weighted ensemble of forecasting models that represent different PV mounting configurations.
For a real-world data set with 11 PV plants, the accuracy of AutoPV is comparable to a model trained on two years of data and outperforms an incrementally trained model.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Accurate PhotoVoltaic (PV) power generation forecasting is vital for the
efficient operation of Smart Grids. The automated design of such accurate
forecasting models for individual PV plants includes two challenges: First,
information about the PV mounting configuration (i.e. inclination and azimuth
angles) is often missing. Second, for new PV plants, the amount of historical
data available to train a forecasting model is limited (cold-start problem). We
address these two challenges by proposing a new method for day-ahead PV power
generation forecasts called AutoPV. AutoPV is a weighted ensemble of
forecasting models that represent different PV mounting configurations. This
representation is achieved by pre-training each forecasting model on a separate
PV plant and by scaling the model's output with the peak power rating of the
corresponding PV plant. To tackle the cold-start problem, we initially weight
each forecasting model in the ensemble equally. To tackle the problem of
missing information about the PV mounting configuration, we use new data that
become available during operation to adapt the ensemble weights to minimize the
forecasting error. AutoPV is advantageous as the unknown PV mounting
configuration is implicitly reflected in the ensemble weights, and only the PV
plant's peak power rating is required to re-scale the ensemble's output. AutoPV
can also represent PV plants whose panels are distributed across several roofs
with varying alignments, as these mounting configurations are reflected
proportionally in the weighting. Additionally, the required computing memory is
decoupled from the number of PV plants when scaling AutoPV to hundreds of
plants, which is beneficial in Smart Grids with limited computing capabilities.
For a real-world data set with
11 PV plants, the accuracy of AutoPV is comparable to a model trained on two
years of data and outperforms an incrementally trained model.
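The ensemble mechanics described in the abstract can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the function names are invented here, and the weight-adaptation loop uses a simple projected gradient descent as a stand-in for whatever optimizer AutoPV actually employs.

```python
import numpy as np

def autopv_forecast(base_forecasts, weights, peak_power):
    """Weighted ensemble of normalized per-configuration forecasts,
    re-scaled by the target plant's peak power rating."""
    return peak_power * np.average(base_forecasts, axis=0, weights=weights)

def adapt_weights(base_forecasts, actual, peak_power, lr=0.1, steps=500):
    """Adapt ensemble weights to minimize the squared forecasting error
    on newly observed data (projected gradient descent on the simplex)."""
    F = np.asarray(base_forecasts)             # (n_models, horizon), each in [0, 1]
    w = np.full(F.shape[0], 1.0 / F.shape[0])  # cold start: equal weights
    target = np.asarray(actual) / peak_power   # work in normalized space
    for _ in range(steps):
        residual = w @ F - target
        grad = 2.0 * (F @ residual) / F.shape[1]
        w = np.clip(w - lr * grad, 0.0, None)  # project: non-negative weights...
        w /= w.sum()                           # ...that sum to one
    return w
```

Note how the unknown mounting configuration never appears explicitly: it is absorbed into the learned weights, and only the plant's peak power rating enters the re-scaling.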
Related papers
- Cross-variable Linear Integrated ENhanced Transformer for Photovoltaic power forecasting [2.1799192736303783]
PV-Client employs an ENhanced Transformer module to capture complex interactions of various features in PV systems.
PV-Client streamlines the embedding and position encoding layers by replacing the Decoder module with a projection layer.
Experimental results on three real-world PV power datasets affirm PV-Client's state-of-the-art (SOTA) performance in PV power forecasting.
arXiv Detail & Related papers (2024-06-06T07:30:27Z)
- Generalized Predictive Model for Autonomous Driving [75.39517472462089]
We introduce the first large-scale video prediction model in the autonomous driving discipline.
Our model, dubbed GenAD, handles the challenging dynamics in driving scenes with novel temporal reasoning blocks.
It can be adapted into an action-conditioned prediction model or a motion planner, holding great potential for real-world driving applications.
arXiv Detail & Related papers (2024-03-14T17:58:33Z)
- FusionSF: Fuse Heterogeneous Modalities in a Vector Quantized Framework for Robust Solar Power Forecasting [24.57911612111109]
We propose a multi-modality fusion framework to integrate historical power data, numerical weather prediction, and satellite images.
Our framework demonstrates strong zero-shot forecasting capability, which is especially useful for those newly installed plants.
Our model not only operates with robustness but also boosts accuracy in both zero-shot forecasting and scenarios rich with training data, surpassing leading models.
arXiv Detail & Related papers (2024-02-08T17:03:10Z)
- AutoVP: An Automated Visual Prompting Framework and Benchmark [66.5618543577204]
Visual prompting (VP) is an emerging parameter-efficient fine-tuning approach to adapting pre-trained vision models to solve various downstream image-classification tasks.
We propose AutoVP, an end-to-end expandable framework for automating VP design choices, along with 12 downstream image-classification tasks.
Our experimental results show that AutoVP outperforms the best-known current VP methods by a substantial margin.
arXiv Detail & Related papers (2023-10-12T14:55:31Z)
- MATNet: Multi-Level Fusion Transformer-Based Model for Day-Ahead PV Generation Forecasting [0.47518865271427785]
MATNet is a novel self-attention transformer-based architecture for PV power generation forecasting.
It consists of a hybrid approach that combines the AI paradigm with the prior physical knowledge of PV power generation.
Results show that our proposed architecture significantly outperforms the current state-of-the-art methods.
arXiv Detail & Related papers (2023-06-17T14:03:09Z)
- Getting ViT in Shape: Scaling Laws for Compute-Optimal Model Design [84.34416126115732]
Scaling laws have been recently employed to derive compute-optimal model size (number of parameters) for a given compute duration.
We advance and refine such methods to infer compute-optimal model shapes, such as width and depth, and successfully implement this in vision transformers.
Our shape-optimized vision transformer, SoViT, achieves results competitive with models that exceed twice its size, despite being pre-trained with an equivalent amount of compute.
arXiv Detail & Related papers (2023-05-22T13:39:28Z)
- A Comparative Study on Generative Models for High Resolution Solar Observation Imaging [59.372588316558826]
This work investigates capabilities of current state-of-the-art generative models to accurately capture the data distribution behind observed solar activity states.
Using distributed training on supercomputers, we are able to train generative models for up to 1024x1024 resolution that produce high quality samples indistinguishable to human experts.
arXiv Detail & Related papers (2023-04-14T14:40:32Z)
- Forecasting Intraday Power Output by a Set of PV Systems using Recurrent Neural Networks and Physical Covariates [0.0]
Accurate forecasts of the power output by PhotoVoltaic (PV) systems are critical to improve the operation of energy distribution grids.
We describe a neural autoregressive model which aims at performing such intraday forecasts.
arXiv Detail & Related papers (2023-03-15T09:03:58Z)
- Spatio-temporal graph neural networks for multi-site PV power forecasting [0.0]
We present two novel graph neural network models for deterministic multi-site forecasting.
The proposed models outperform state-of-the-art multi-site forecasting methods for prediction horizons of six hours ahead.
arXiv Detail & Related papers (2021-07-29T10:15:01Z)
- Principal Component Density Estimation for Scenario Generation Using Normalizing Flows [62.997667081978825]
We propose a dimensionality-reducing flow layer based on the linear principal component analysis (PCA) that sets up the normalizing flow in a lower-dimensional space.
We train the resulting principal component flow (PCF) on data of PV and wind power generation as well as load demand in Germany in the years 2013 to 2015.
arXiv Detail & Related papers (2021-04-21T08:42:54Z)
- Ensemble Distillation for Robust Model Fusion in Federated Learning [72.61259487233214]
Federated Learning (FL) is a machine learning setting where many devices collaboratively train a machine learning model.
In most of the current training schemes the central model is refined by averaging the parameters of the server model and the updated parameters from the client side.
We propose ensemble distillation for model fusion, i.e. training the central classifier through unlabeled data on the outputs of the models from the clients.
arXiv Detail & Related papers (2020-06-12T14:49:47Z)
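The fusion step summarized in that last entry can be sketched as follows. This is only an illustration of the ensemble-distillation idea under simplifying assumptions: the linear client "models" and all names here are invented, and the real method would further train the central model on the resulting soft labels.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over the last axis."""
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_targets(client_models, unlabeled_x):
    """Ensemble-distillation fusion: each client model scores the
    unlabeled data, and the averaged class distributions become the
    soft labels used to supervise the central classifier (instead of
    averaging the clients' parameters)."""
    probs = [softmax(unlabeled_x @ W) for W in client_models]  # toy linear models
    return np.mean(probs, axis=0)                              # (n_samples, n_classes)
```

Averaging model outputs rather than parameters is what lets the clients have heterogeneous architectures, which parameter averaging cannot accommodate.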
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.