Wind Park Power Prediction: Attention-Based Graph Networks and Deep
Learning to Capture Wake Losses
- URL: http://arxiv.org/abs/2201.03229v1
- Date: Mon, 10 Jan 2022 09:30:40 GMT
- Title: Wind Park Power Prediction: Attention-Based Graph Networks and Deep
Learning to Capture Wake Losses
- Authors: Lars Ødegaard Bentsen, Narada Dilp Warakagoda, Roy Stenbro and Paal Engelstad
- Abstract summary: This paper proposes a modular framework for attention-based graph neural networks (GNNs), where attention can be applied to any desired component of a graph block.
The results show that the model significantly outperforms a multilayer perceptron (MLP) and a bidirectional LSTM (BLSTM) model, while delivering performance on par with a vanilla GNN model.
- Score: 1.278093617645299
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: With the increased penetration of wind energy into the power grid, it has
become increasingly important to be able to predict the expected power
production for larger wind farms. Deep learning (DL) models can learn complex
patterns in the data and have found wide success in predicting wake losses and
expected power production. This paper proposes a modular framework for
attention-based graph neural networks (GNNs), where attention can be applied to
any desired component of a graph block. The results show that the model
significantly outperforms a multilayer perceptron (MLP) and a bidirectional
LSTM (BLSTM) model, while delivering performance on par with a vanilla GNN
model. Moreover, we argue that the proposed graph attention architecture can
easily adapt to different applications by offering flexibility in the choice of
attention operations, which may depend on the specific application. Through
analysis of the attention weights, it was shown that employing attention-based
GNNs can provide insights into what the models learn.
In particular, the attention networks seemed to realise turbine dependencies
that aligned with some physical intuition about wake losses.
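As a concrete illustration of the modular design, the following is a minimal PyTorch-style sketch of a graph block in which attention can be toggled on for the node-aggregation component. The class name, update rules and the choice of where to apply attention are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class GraphBlock(nn.Module):
    """One graph block with optional attention over incoming messages.
    The paper's framework allows attention at any block component; this
    sketch (an assumption) only toggles it on node aggregation."""

    def __init__(self, node_dim, edge_dim, use_attention=True):
        super().__init__()
        self.use_attention = use_attention
        self.edge_mlp = nn.Sequential(
            nn.Linear(2 * node_dim + edge_dim, edge_dim), nn.ReLU())
        self.node_mlp = nn.Sequential(
            nn.Linear(node_dim + edge_dim, node_dim), nn.ReLU())
        self.att = nn.Linear(edge_dim, 1)  # scalar logit per edge

    def forward(self, x, edge_index, edge_attr):
        src, dst = edge_index  # (E,) sender / receiver node indices
        # Edge update: condition each edge on its endpoint features.
        e = self.edge_mlp(torch.cat([x[src], x[dst], edge_attr], dim=-1))
        if self.use_attention:
            # Softmax logits over edges sharing a receiver, so each node
            # (turbine) weighs its neighbours; written for clarity, not speed.
            logits = self.att(e).squeeze(-1)
            w = torch.zeros_like(logits)
            for n in dst.unique():
                m = dst == n
                w[m] = torch.softmax(logits[m], dim=0)
            msgs = w.unsqueeze(-1) * e
        else:
            msgs = e
        agg = torch.zeros(x.size(0), e.size(-1), device=x.device)
        agg.index_add_(0, dst, msgs)  # sum messages per receiver node
        return self.node_mlp(torch.cat([x, agg], dim=-1)), e
```

With attention enabled, the per-edge weights `w` can be inspected after training, mirroring how the abstract relates attention weights to learned turbine dependencies.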
Related papers
- Scalable Message Passing Neural Networks: No Need for Attention in Large Graph Representation Learning [15.317501970096743]
We show that by integrating standard convolutional message passing into a Pre-Layer Normalization Transformer-style block instead of attention, we can produce high-performing deep message-passing-based Graph Neural Networks (GNNs).
Results are competitive with the state-of-the-art in large graph transductive learning, without requiring the computationally and memory-expensive attention mechanism (a sketch of the block structure follows this entry).
arXiv Detail & Related papers (2024-10-29T17:18:43Z)
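The block structure described in the entry above can be sketched as a Pre-Layer-Norm Transformer block whose token-mixing step is a graph convolution instead of self-attention. The layer sizes and the row-normalised aggregation are assumptions for illustration, not the paper's code.

```python
import torch
import torch.nn as nn

class PreLNMessagePassingBlock(nn.Module):
    def __init__(self, dim, expansion=4):
        super().__init__()
        self.norm1 = nn.LayerNorm(dim)
        self.norm2 = nn.LayerNorm(dim)
        self.lin = nn.Linear(dim, dim)  # message transform
        self.ffn = nn.Sequential(       # position-wise feed-forward
            nn.Linear(dim, expansion * dim), nn.GELU(),
            nn.Linear(expansion * dim, dim))

    def forward(self, x, adj_norm):
        # adj_norm: (N, N) row-normalised adjacency; x: (N, dim) features.
        # Mixing step: convolutional message passing where attention would sit.
        x = x + adj_norm @ self.lin(self.norm1(x))
        # Standard Transformer feed-forward with a residual connection.
        return x + self.ffn(self.norm2(x))
```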
- Joint Feature and Differentiable $k$-NN Graph Learning using Dirichlet Energy [103.74640329539389]
We propose a deep method that simultaneously conducts feature selection (FS) and differentiable $k$-NN graph learning.
We employ Optimal Transport theory to address the non-differentiability issue of learning $k$-NN graphs in neural networks.
We validate the effectiveness of our model with extensive experiments on both synthetic and real-world datasets (a toy differentiable relaxation of $k$-NN graph construction is sketched after this entry).
arXiv Detail & Related papers (2023-05-21T08:15:55Z)
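The paper handles the non-differentiability of $k$-NN graph construction with Optimal Transport; the toy surrogate below instead uses a temperature softmax over negative pairwise distances, a simpler (and cruder) way to keep neighbour selection differentiable, shown only to illustrate the problem.

```python
import torch

def soft_knn_adjacency(x, tau=0.1):
    """Row-stochastic soft neighbour weights from pairwise distances.
    As tau -> 0 each row concentrates on the nearest neighbours while
    remaining differentiable, unlike a hard top-k selection."""
    dist = torch.cdist(x, x)            # (N, N) pairwise Euclidean distances
    dist.fill_diagonal_(float("inf"))   # exclude self-loops
    return torch.softmax(-dist / tau, dim=1)
```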
- EasyDGL: Encode, Train and Interpret for Continuous-time Dynamic Graph Learning [92.71579608528907]
This paper aims to design an easy-to-use pipeline (termed EasyDGL) composed of three key modules with both strong fitting ability and interpretability.
EasyDGL can effectively quantify the predictive power of the frequency content that a model learns from evolving graph data.
arXiv Detail & Related papers (2023-03-22T06:35:08Z)
- End-to-end Wind Turbine Wake Modelling with Deep Graph Representation Learning [7.850747042819504]
This work proposes a surrogate model for the representation of wind turbine wakes based on a graph representation learning method, namely a graph neural network.
The proposed end-to-end deep learning model operates directly on unstructured meshes and has been validated against high-fidelity data.
A case study based upon a real-world wind farm further demonstrates the capability of the proposed approach to predict farm-scale power generation (a toy mesh-to-graph conversion is sketched after this entry).
arXiv Detail & Related papers (2022-11-24T15:00:06Z)
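To make "operates directly on unstructured meshes" concrete, below is a toy conversion of a triangular mesh into GNN inputs, with mesh vertices as graph nodes and triangle sides as edges. The function and its edge features are hypothetical, not the paper's pipeline.

```python
import numpy as np

def mesh_to_graph(points, triangles):
    """points: (N, 2) vertex coordinates; triangles: (T, 3) vertex indices."""
    edges = set()
    for a, b, c in triangles:
        for u, v in ((a, b), (b, c), (c, a)):
            edges.add((min(u, v), max(u, v)))   # undirected, deduplicated
    edge_index = np.array(sorted(edges)).T      # (2, E) connectivity
    # Relative displacement between endpoints as a simple edge feature.
    edge_attr = points[edge_index[1]] - points[edge_index[0]]
    return edge_index, edge_attr
```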
- Evaluating Distribution System Reliability with Hyperstructures Graph Convolutional Nets [74.51865676466056]
We show how graph convolutional networks and a hyperstructure representation learning framework can be employed for accurate, reliable, and computationally efficient distribution grid planning.
Our numerical experiments show that the proposed Hyper-GCNNs approach yields substantial gains in computational efficiency.
arXiv Detail & Related papers (2022-11-14T01:29:09Z)
- Spiking Graph Convolutional Networks [19.36064180392385]
SpikingGCN is an end-to-end framework that aims to integrate the embedding of GCNs with the biofidelity characteristics of spiking neural networks (SNNs).
We show that SpikingGCN on a neuromorphic chip can bring a clear advantage of energy efficiency into graph data analysis.
arXiv Detail & Related papers (2022-05-05T16:44:36Z)
- Graph Joint Attention Networks [24.258699912448257]
Graph attention networks (GATs) have been recognized as powerful tools for learning on graph-structured data.
We propose Graph Joint Attention Networks (JATs), which adopt novel joint attention mechanisms that can automatically determine the relative significance between node features.
We theoretically analyze the expressive power of JATs and further propose an improved strategy for the joint attention mechanisms (the standard GAT attention that JATs build on is sketched after this entry).
arXiv Detail & Related papers (2021-02-05T12:51:47Z)
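For context, here is a minimal sketch of the standard GAT attention coefficient that JATs generalise: e_ij = LeakyReLU(a^T [W h_i || W h_j]), normalised over the edges entering each node. The segment softmax is written as a plain loop for clarity rather than speed.

```python
import torch
import torch.nn.functional as F

def gat_attention(h, W, a, src, dst):
    """h: (N, D) node features; W: (D, D'); a: (2*D',) attention vector;
    src/dst: (E,) endpoint indices of each directed edge."""
    z = h @ W                                            # projected features
    e = F.leaky_relu(torch.cat([z[src], z[dst]], dim=-1) @ a, 0.2)
    alpha = torch.zeros_like(e)
    for n in dst.unique():                               # per-receiver softmax
        m = dst == n
        alpha[m] = torch.softmax(e[m], dim=0)
    return alpha                                         # one weight per edge
```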
- Data-Driven Learning of Geometric Scattering Networks [74.3283600072357]
We propose a new graph neural network (GNN) module based on relaxations of recently proposed geometric scattering transforms.
Our learnable geometric scattering (LEGS) module enables adaptive tuning of the wavelets to encourage band-pass features to emerge in learned representations (the underlying diffusion wavelets are sketched after this entry).
arXiv Detail & Related papers (2020-10-06T01:20:27Z)
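A minimal sketch of the diffusion wavelets behind geometric scattering, using one common construction: a lazy random-walk operator P and band-pass filters Psi_j = P^(2^(j-1)) - P^(2^j). LEGS, per the entry above, makes the scale selection learnable; the fixed version here is only illustrative.

```python
import torch

def diffusion_wavelet_features(x, A, J=3):
    """x: (N, D) node signals; A: (N, N) adjacency; returns |Psi_j x| stacked."""
    d = A.sum(dim=1, keepdim=True).clamp(min=1.0)
    P = 0.5 * (torch.eye(A.size(0)) + A / d)  # lazy random-walk operator
    powers = [x]                              # powers[k] holds P^k x
    for _ in range(2 ** J):
        powers.append(P @ powers[-1])
    feats = [(powers[2 ** (j - 1)] - powers[2 ** j]).abs()  # band-pass + abs
             for j in range(1, J + 1)]
    return torch.cat(feats, dim=-1)           # (N, J * D) scattering features
```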
- Attentive Graph Neural Networks for Few-Shot Learning [74.01069516079379]
Graph Neural Networks (GNNs) have demonstrated superior performance in many challenging applications, including few-shot learning tasks.
Despite their powerful capacity to learn and generalize from few samples, GNNs usually suffer from severe over-fitting and over-smoothing as the model becomes deep.
We propose a novel Attentive GNN to tackle these challenges, by incorporating a triple-attention mechanism.
arXiv Detail & Related papers (2020-07-14T07:43:09Z)
- Graph Neural Networks for Leveraging Industrial Equipment Structure: An application to Remaining Useful Life Estimation [21.297461316329453]
We propose to capture the structure of complex equipment in the form of a graph, and use graph neural networks (GNNs) to model multi-sensor time-series data.
We observe that the proposed GNN-based RUL estimation model compares favorably to several strong baselines from the literature, such as those based on RNNs and CNNs.
arXiv Detail & Related papers (2020-06-30T06:38:08Z)
- Spectral Graph Attention Network with Fast Eigen-approximation [103.93113062682633]
Spectral Graph Attention Network (SpGAT) learns representations for different frequency components using weighted filters and graph wavelet bases.
A fast approximation variant, SpGAT-Cheby, is proposed to reduce the computational cost of the eigen-decomposition.
We thoroughly evaluate the performance of SpGAT and SpGAT-Cheby in semi-supervised node classification tasks.
arXiv Detail & Related papers (2020-03-16T21:49:34Z)
This list is automatically generated from the titles and abstracts of the papers on this site.