Learning to Optimise Wind Farms with Graph Transformers
- URL: http://arxiv.org/abs/2311.12750v1
- Date: Tue, 21 Nov 2023 17:51:30 GMT
- Title: Learning to Optimise Wind Farms with Graph Transformers
- Authors: Siyi Li, Arnaud Robert, A. Aldo Faisal, Matthew D. Piggott
- Abstract summary: The proposed model functions by encoding a wind farm into a fully-connected graph and processing the graph representation through a graph transformer.
The graph transformer surrogate is shown to generalise well and is able to uncover latent structural patterns within the graph representation of wind farms.
- Score: 6.519940858545459
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This work proposes a novel data-driven model capable of providing accurate
predictions for the power generation of all wind turbines in wind farms of
arbitrary layout, yaw angle configurations and wind conditions. The proposed
model functions by encoding a wind farm into a fully-connected graph and
processing the graph representation through a graph transformer. The graph
transformer surrogate is shown to generalise well and is able to uncover latent
structural patterns within the graph representation of wind farms. It is
demonstrated how the resulting surrogate model can be used to optimise yaw
angle configurations using genetic algorithms, achieving similar levels of
accuracy to industrially-standard wind farm simulation tools while only taking
a fraction of the computational cost.
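In outline, the pipeline the abstract describes — fully-connected graph encoding, a transformer-style attention pass over all turbine nodes, and a genetic-algorithm search over yaw angles — can be sketched in miniature. Everything below (the feature layout, random weights, linear power head, and GA settings) is a hypothetical stand-in for illustration, not the authors' actual model:

```python
import numpy as np

rng = np.random.default_rng(0)

def encode_farm(xy, yaw, wind_speed, wind_dir):
    """Encode a wind farm as node features of a fully-connected graph:
    each turbine node carries its position, its yaw angle, and the
    global wind conditions (illustrative feature layout)."""
    n = len(xy)
    globals_ = np.tile([wind_speed, wind_dir], (n, 1))
    return np.hstack([xy, yaw[:, None], globals_])   # shape (n, 5)

def attention(X, Wq, Wk, Wv):
    """One scaled dot-product self-attention pass. On a fully-connected
    graph every turbine attends to every other turbine, which is how
    pairwise wake interactions can be captured."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    S = Q @ K.T / np.sqrt(K.shape[1])
    W = np.exp(S - S.max(1, keepdims=True))
    W /= W.sum(1, keepdims=True)                     # row-wise softmax
    return W @ V

xy = rng.uniform(0.0, 2000.0, size=(4, 2))           # 4 turbines, metres
d = 5
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
head = rng.normal(size=d)                            # toy power read-out

def farm_power(yaw):
    X = encode_farm(xy, yaw, wind_speed=9.0, wind_dir=270.0)
    return attention(X, Wq, Wk, Wv) @ head           # per-turbine power

# Toy genetic algorithm over yaw configurations (selection + mutation).
pop = rng.uniform(-30.0, 30.0, size=(20, 4))
for _ in range(25):
    fitness = np.array([farm_power(y).sum() for y in pop])
    elite = pop[np.argsort(fitness)[-5:]]            # keep the best 5
    kids = elite[rng.integers(0, 5, 15)] + rng.normal(scale=2.0, size=(15, 4))
    pop = np.vstack([elite, np.clip(kids, -30.0, 30.0)])

best_yaw = pop[np.argmax([farm_power(y).sum() for y in pop])]
```

Because the surrogate is cheap to evaluate, the GA can afford hundreds of fitness calls per generation — the source of the claimed speed-up over running a full wake simulator inside the optimisation loop.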
Related papers
- Data is missing again -- Reconstruction of power generation data using $k$-Nearest Neighbors and spectral graph theory [0.0]
We propose an imputation method that blends data-driven concepts with expert knowledge, by using the geometry of the wind farm.
Our method relies on learning Laplacian eigenmaps out of the graph of the wind farm through spectral graph theory.
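A minimal version of this geometry-based imputation idea — a Laplacian eigenmap built from the farm's k-nearest-neighbour graph, then a least-squares fit of the observed readings in that eigenbasis — might look as follows. The layout, the "power" signal, and the choice of k are all invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical farm geometry: turbines on a jittered 4x4 grid.
xy = np.array([[i, j] for i in range(4) for j in range(4)], float)
xy += rng.normal(scale=0.05, size=xy.shape)

# k-nearest-neighbour adjacency from the layout (k chosen arbitrarily).
k = 3
d2 = ((xy[:, None, :] - xy[None, :, :]) ** 2).sum(-1)
np.fill_diagonal(d2, np.inf)
A = np.zeros_like(d2)
for i in range(len(xy)):
    A[i, np.argsort(d2[i])[:k]] = 1.0
A = np.maximum(A, A.T)                      # symmetrise

# Unnormalised graph Laplacian and its low-frequency eigenvectors.
L = np.diag(A.sum(1)) - A
eigvals, eigvecs = np.linalg.eigh(L)        # ascending eigenvalues
basis = eigvecs[:, :4]                      # Laplacian eigenmap coords

# A smooth "power" signal over the farm, with one reading missing.
power = np.sin(xy[:, 0]) + 0.5 * xy[:, 1]
missing = 5
obs = np.ones(len(xy), bool)
obs[missing] = False

# Impute: least-squares fit of observed readings in the eigenbasis.
coef, *_ = np.linalg.lstsq(basis[obs], power[obs], rcond=None)
estimate = basis[missing] @ coef
```

The low-frequency eigenvectors vary smoothly over the farm layout, so a signal that is spatially smooth — as wake-coupled power production tends to be — can be reconstructed at an unobserved turbine from a handful of coefficients.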
arXiv Detail & Related papers (2024-08-30T23:58:28Z) - Automatic Graph Topology-Aware Transformer [50.2807041149784]
We build a comprehensive graph Transformer search space with the micro-level and macro-level designs.
EGTAS evolves graph Transformer topologies at the macro level and graph-aware strategies at the micro level.
We demonstrate the efficacy of EGTAS across a range of graph-level and node-level tasks.
arXiv Detail & Related papers (2024-05-30T07:44:31Z) - Deep Prompt Tuning for Graph Transformers [55.2480439325792]
Fine-tuning is resource-intensive and requires storing multiple copies of large models.
We propose a novel approach called deep graph prompt tuning as an alternative to fine-tuning.
By freezing the pre-trained parameters and only updating the added tokens, our approach reduces the number of free parameters and eliminates the need for multiple model copies.
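The freeze-and-prepend idea can be illustrated in a few lines: only the added prompt-token embeddings are trainable, while the "pre-trained" attention weights stay fixed. The dimensions and the way prompts are prepended here are illustrative, not the paper's exact scheme:

```python
import numpy as np

rng = np.random.default_rng(2)
d, n_nodes, n_prompt = 8, 10, 2

# "Pre-trained" transformer weights: frozen, never updated.
frozen = {w: rng.normal(size=(d, d)) for w in ("Wq", "Wk", "Wv")}

# The only trainable parameters: a handful of prompt-token embeddings
# prepended to the graph's node embeddings.
prompt = rng.normal(size=(n_prompt, d)) * 0.01

def forward(X, prompt):
    Z = np.vstack([prompt, X])              # prepend prompt tokens
    Q, K, V = (Z @ frozen[w] for w in ("Wq", "Wk", "Wv"))
    S = Q @ K.T / np.sqrt(d)
    W = np.exp(S - S.max(1, keepdims=True))
    W /= W.sum(1, keepdims=True)
    return (W @ V)[n_prompt:]               # drop prompt rows on output

X = rng.normal(size=(n_nodes, d))
out = forward(X, prompt)

trainable = prompt.size
total = trainable + sum(w.size for w in frozen.values())
```

In this toy setting only 16 of 208 parameters are free, and a single shared backbone serves every downstream task — which is what removes the need to store multiple model copies.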
arXiv Detail & Related papers (2023-09-18T20:12:17Z) - Curve Your Attention: Mixed-Curvature Transformers for Graph Representation Learning [77.1421343649344]
We propose a generalization of Transformers towards operating entirely on the product of constant curvature spaces.
We also provide a kernelized approach to non-Euclidean attention, which enables our model to run in time and memory cost linear to the number of nodes and edges.
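Kernelized (linear) attention reorders the computation so that the n-by-n attention matrix is never formed. A generic sketch using an ELU+1 feature map — a common choice in the linear-attention literature; the paper's non-Euclidean kernel differs — is:

```python
import numpy as np

rng = np.random.default_rng(3)

def phi(x):
    """Positive feature map (ELU + 1); positivity keeps the
    normaliser below strictly greater than zero."""
    return np.where(x > 0, x + 1.0, np.exp(x))

def linear_attention(Q, K, V):
    """Kernelized attention in O(n) time and memory: associating
    phi(K).T @ V first gives a (d, d) summary independent of n,
    instead of the (n, n) score matrix of softmax attention."""
    Qf, Kf = phi(Q), phi(K)
    KV = Kf.T @ V                           # (d, d): independent of n
    Z = Qf @ Kf.sum(0)                      # per-query normaliser, (n,)
    return (Qf @ KV) / Z[:, None]

n, d = 200, 16
Q, K, V = (rng.normal(size=(n, d)) for _ in range(3))
out = linear_attention(Q, K, V)             # (200, 16), no n x n matrix
```

Each output row is a convex combination of the rows of V, so the mechanism retains the averaging behaviour of softmax attention while its cost grows linearly in the number of nodes and edges.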
arXiv Detail & Related papers (2023-09-08T02:44:37Z) - An XAI framework for robust and transparent data-driven wind turbine power curve models [0.8547032097715571]
Wind turbine power curve models translate ambient conditions into turbine power output.
In recent years, increasingly complex machine learning methods have become state-of-the-art for this task.
We introduce an explainable artificial intelligence framework to investigate and validate strategies learned by data-driven power curve models.
arXiv Detail & Related papers (2023-04-19T17:37:58Z) - Modeling Wind Turbine Performance and Wake Interactions with Machine Learning [0.0]
Different machine learning (ML) models are trained on SCADA and meteorological data collected at an onshore wind farm.
ML methods for data quality control and pre-processing are applied to the data set under investigation.
A hybrid model is found to achieve high accuracy for modeling wind turbine power capture.
arXiv Detail & Related papers (2022-12-02T23:07:05Z) - End-to-end Wind Turbine Wake Modelling with Deep Graph Representation Learning [7.850747042819504]
This work proposes a surrogate model for the representation of wind turbine wakes based on a graph representation learning method termed a graph neural network.
The proposed end-to-end deep learning model operates directly on unstructured meshes and has been validated against high-fidelity data.
A case study based upon a real world wind farm further demonstrates the capability of the proposed approach to predict farm scale power generation.
arXiv Detail & Related papers (2022-11-24T15:00:06Z) - Transformer for Graphs: An Overview from Architecture Perspective [86.3545861392215]
It's imperative to sort out the existing Transformer models for graphs and systematically investigate their effectiveness on various graph tasks.
We first disassemble the existing models and conclude three typical ways to incorporate the graph information into the vanilla Transformer.
Our experiments confirm the benefits of current graph-specific modules on Transformer and reveal their advantages on different kinds of graph tasks.
arXiv Detail & Related papers (2022-02-17T06:02:06Z) - Measuring Wind Turbine Health Using Drifting Concepts [55.87342698167776]
We propose two new approaches for the analysis of wind turbine health.
The first method aims at evaluating the decrease or increase in relatively high and low power production.
The second method evaluates the overall drift of the extracted concepts.
arXiv Detail & Related papers (2021-12-09T14:04:55Z) - Wind Power Projection using Weather Forecasts by Novel Deep Neural Networks [0.0]
Using optimized machine learning algorithms, it is possible to find obscured patterns in the observations and obtain meaningful data.
The paper explores the use of both parametric and the non-parametric models for calculating wind power prediction using power curves.
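The parametric-versus-non-parametric contrast for power curves can be made concrete on synthetic data: a cubic-below-rated parametric fit against a plain k-nearest-neighbour average over wind speed. All of the data and constants below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic SCADA-like data: wind speed (m/s) and noisy power (kW),
# following a cubic region capped at rated power (toy ground truth).
v = rng.uniform(3.0, 15.0, 400)
true_power = np.minimum(0.4 * v**3, 1000.0)
p = true_power + rng.normal(scale=30.0, size=v.size)

# Parametric model: P = c * v^3 capped at rated power. Below rated,
# the model is linear in c, so ordinary least squares suffices.
mask = v < 12.0                         # approximately below rated speed
c = (p[mask] * v[mask]**3).sum() / (v[mask]**6).sum()
rated = p[v > 13.5].mean()              # estimate of the rated plateau

def parametric(vq):
    return np.minimum(c * vq**3, rated)

# Non-parametric model: k-nearest-neighbour average over wind speed.
def knn(vq, k=15):
    idx = np.argsort(np.abs(v - vq))[:k]
    return p[idx].mean()
```

The parametric fit is interpretable and extrapolates by construction; the kNN estimate makes no shape assumption and adapts to any curve, at the cost of needing dense data near each query speed — the usual trade-off the abstract alludes to.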
arXiv Detail & Related papers (2021-08-22T17:46:36Z) - Learning Graphon Autoencoders for Generative Graph Modeling [91.32624399902755]
Graphon is a nonparametric model that generates graphs with arbitrary sizes and can be induced from graphs easily.
We propose a novel framework called graphon autoencoder to build an interpretable and scalable graph generative model.
A linear graphon factorization model works as a decoder, leveraging the latent representations to reconstruct the induced graphons.
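The claim that a graphon "can be induced from graphs easily" refers to step-function graphons: sort nodes (here, by degree), block-average the adjacency matrix, and sample new graphs of arbitrary size from the resulting step function. The block model and bin count below are arbitrary choices for illustration, not the paper's encoder/decoder:

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy 2-block stochastic block model graph on 40 nodes.
n = 40
blocks = np.repeat([0, 1], 20)
P = np.where(blocks[:, None] == blocks[None, :], 0.8, 0.1)
A = (rng.uniform(size=(n, n)) < P).astype(float)
A = np.triu(A, 1)
A = A + A.T                               # simple undirected graph

def induce_step_graphon(A, bins=4):
    """Induce a step-function graphon: sort nodes by degree and
    average the adjacency matrix over equal-size blocks of [0,1]^2."""
    order = np.argsort(A.sum(1))[::-1]
    B = A[np.ix_(order, order)]
    m = len(A) // bins
    return np.array([[B[i*m:(i+1)*m, j*m:(j+1)*m].mean()
                      for j in range(bins)] for i in range(bins)])

def sample_graph(W, n):
    """Generate a graph of arbitrary size n from the step graphon W:
    draw latent positions in [0,1], then connect with probability
    W(u_i, u_j)."""
    u = rng.uniform(size=n)
    idx = np.minimum((u * len(W)).astype(int), len(W) - 1)
    probs = W[np.ix_(idx, idx)]
    G = (rng.uniform(size=(n, n)) < probs).astype(float)
    return np.triu(G, 1) + np.triu(G, 1).T

W = induce_step_graphon(A)
G = sample_graph(W, 60)                   # new graph, different size
```

Because the graphon lives on [0,1]^2 rather than on a fixed node set, the same induced W generates graphs of any size — the property the abstract highlights.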
arXiv Detail & Related papers (2021-05-29T08:11:40Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it presents and is not responsible for any consequences arising from its use.