Physically Meaningful Uncertainty Quantification in Probabilistic Wind
Turbine Power Curve Models as a Damage Sensitive Feature
- URL: http://arxiv.org/abs/2209.15579v1
- Date: Fri, 30 Sep 2022 16:45:15 GMT
- Title: Physically Meaningful Uncertainty Quantification in Probabilistic Wind
Turbine Power Curve Models as a Damage Sensitive Feature
- Authors: J.H. Mclean, M.R. Jones, B.J. O'Connell, A.E. Maguire, T.J. Rogers
- Abstract summary: A power curve is a key part of structural health monitoring in wind turbines.
Many probabilistic power curve models have a key limitation in that they are not physically meaningful.
This paper investigates the use of two bounded Gaussian Processes in order to produce physically meaningful probabilistic power curve models.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: A wind turbine's power curve is easily accessible, damage-sensitive data, and
as such is a key part of structural health monitoring in wind turbines. Power
curve models can be constructed in a number of ways, but the authors argue that
probabilistic methods carry inherent benefits in this use case, such as
uncertainty quantification and allowing uncertainty propagation analysis. Many
probabilistic power curve models have a key limitation in that they are not
physically meaningful - they return mean and uncertainty predictions outside of
what is physically possible (the maximum and minimum power outputs of the wind
turbine). This paper investigates the use of two bounded Gaussian Processes in
order to produce physically meaningful probabilistic power curve models. The
first model investigated was a warped heteroscedastic Gaussian process, and was
found to be ineffective due to specific shortcomings of the Gaussian Process in
relation to the warping function. The second model, an approximated Gaussian
Process with a Beta likelihood, was highly successful and demonstrated that a
working bounded probabilistic model results in better predictive uncertainty
than a corresponding unbounded one without meaningful loss in predictive
accuracy. Such a bounded model thus offers increased accuracy for performance
monitoring and increased operator confidence in the model due to guaranteed
physical plausibility.
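
As a rough illustration of the second (successful) approach, the sketch below fits a sparse variational Gaussian process with a Beta likelihood to a power curve rescaled into (0, 1), using GPflow. The synthetic wind-speed/power data, the squared-exponential kernel, the number of inducing points, and the `rated_power` value are illustrative assumptions rather than the authors' exact setup; the point is only that bounding the observation model keeps both the predictive mean and its uncertainty inside the physically possible range.

```python
# Minimal sketch (not the authors' exact model): a sparse variational GP with a
# Beta likelihood over power normalised to (0, 1), so predictions cannot leave
# the physically possible range of the turbine.
import numpy as np
import gpflow

rng = np.random.default_rng(0)
rated_power = 2000.0  # kW, illustrative rated capacity

# Synthetic SCADA-like data: wind speed (m/s) vs. power, for illustration only.
wind_speed = rng.uniform(0.0, 25.0, size=(500, 1))
power = rated_power / (1.0 + np.exp(-(wind_speed - 10.0)))  # sigmoid-shaped curve
power += rng.normal(0.0, 50.0, size=power.shape)            # measurement noise

# Normalise to the open interval (0, 1), as required by the Beta likelihood.
eps = 1e-3
y = np.clip(power / rated_power, eps, 1.0 - eps)

# Sparse variational GP: squared-exponential kernel, Beta observation model.
kernel = gpflow.kernels.SquaredExponential()
likelihood = gpflow.likelihoods.Beta()
inducing = np.linspace(0.0, 25.0, 30)[:, None]
model = gpflow.models.SVGP(kernel=kernel, likelihood=likelihood,
                           inducing_variable=inducing)

# Maximise the variational lower bound (ELBO) with L-BFGS.
gpflow.optimizers.Scipy().minimize(
    model.training_loss_closure((wind_speed, y)),
    model.trainable_variables,
)

# Predictive mean and variance in normalised units; rescale by rated power.
x_test = np.linspace(0.0, 25.0, 100)[:, None]
mean, var = model.predict_y(x_test)
print(mean.numpy().ravel()[:5] * rated_power)
```

Handling the bounds inside the likelihood, rather than warping or clipping an unbounded Gaussian model after the fact, is what lets the uncertainty itself respect the zero and rated-power limits; the warped heteroscedastic GP discussed in the paper pursues the same goal through a warping function but is reported there to be ineffective.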
Related papers
- Prediction of wind turbines power with physics-informed neural networks
and evidential uncertainty quantification [2.126171264016785]
We use physics-informed neural networks to reproduce historical data coming from 4 turbines in a wind farm.
The developed models for regression of the power, torque, and power coefficient showed great accuracy for both real data and physical equations governing the system.
arXiv Detail & Related papers (2023-07-27T07:58:38Z) - Accurate generation of stochastic dynamics based on multi-model
Generative Adversarial Networks [0.0]
Generative Adversarial Networks (GANs) have shown immense potential in fields such as text and image generation.
Here we quantitatively test this approach by applying it to a prototypical process on a lattice.
Importantly, the discreteness of the model is retained despite the noise.
arXiv Detail & Related papers (2023-05-25T10:41:02Z) - Bias in Pruned Vision Models: In-Depth Analysis and Countermeasures [93.17009514112702]
Pruning, setting a significant subset of the parameters of a neural network to zero, is one of the most popular methods of model compression.
Despite existing evidence for this phenomenon, the relationship between neural network pruning and induced bias is not well-understood.
arXiv Detail & Related papers (2023-04-25T07:42:06Z) - Dense Uncertainty Estimation [62.23555922631451]
In this paper, we investigate neural networks and uncertainty estimation techniques to achieve both accurate deterministic prediction and reliable uncertainty estimation.
We work on two types of uncertainty estimations solutions, namely ensemble based methods and generative model based methods, and explain their pros and cons while using them in fully/semi/weakly-supervised framework.
arXiv Detail & Related papers (2021-10-13T01:23:48Z) - Estimation of Bivariate Structural Causal Models by Variational Gaussian
Process Regression Under Likelihoods Parametrised by Normalising Flows [74.85071867225533]
Causal mechanisms can be described by structural causal models.
One major drawback of state-of-the-art artificial intelligence is its lack of explainability.
arXiv Detail & Related papers (2021-09-06T14:52:58Z) - When in Doubt: Neural Non-Parametric Uncertainty Quantification for
Epidemic Forecasting [70.54920804222031]
Most existing forecasting models disregard uncertainty quantification, resulting in mis-calibrated predictions.
Recent works in deep neural models for uncertainty-aware time-series forecasting also have several limitations.
We model the forecasting task as a probabilistic generative process and propose a functional neural process model called EPIFNP.
arXiv Detail & Related papers (2021-06-07T18:31:47Z) - Probabilistic Neural Network to Quantify Uncertainty of Wind Power
Estimation [3.4376560669160385]
A probabilistic neural network with Monte Carlo dropout is considered to quantify the model uncertainty of the power curve estimation.
The developed network captures both model and noise uncertainty which is found to be useful tools in assessing performance.
arXiv Detail & Related papers (2021-06-04T19:15:53Z) - Latent Gaussian Model Boosting [0.0]
Tree-boosting shows excellent predictive accuracy on many data sets.
We obtain increased predictive accuracy compared to existing approaches in both simulated and real-world data experiments.
arXiv Detail & Related papers (2021-05-19T07:36:30Z) - Probabilistic robust linear quadratic regulators with Gaussian processes [73.0364959221845]
Probabilistic models such as Gaussian processes (GPs) are powerful tools to learn unknown dynamical systems from data for subsequent use in control design.
We present a novel controller synthesis for linearized GP dynamics that yields robust controllers with respect to a probabilistic stability margin.
arXiv Detail & Related papers (2021-05-17T08:36:18Z) - Discriminative Jackknife: Quantifying Uncertainty in Deep Learning via
Higher-Order Influence Functions [121.10450359856242]
We develop a frequentist procedure that utilizes influence functions of a model's loss functional to construct a jackknife (or leave-one-out) estimator of predictive confidence intervals.
The DJ satisfies (1) and (2), is applicable to a wide range of deep learning models, is easy to implement, and can be applied in a post-hoc fashion without interfering with model training or compromising its accuracy.
arXiv Detail & Related papers (2020-06-29T13:36:52Z) - Estimation of Accurate and Calibrated Uncertainties in Deterministic
models [0.8702432681310401]
We devise a method to transform a deterministic prediction into a probabilistic one.
We show that for doing so, one has to compromise between the accuracy and the reliability (calibration) of such a model.
We show several examples both with synthetic data, where the underlying hidden noise can accurately be recovered, and with large real-world datasets.
arXiv Detail & Related papers (2020-03-11T04:02:56Z)
This list is automatically generated from the titles and abstracts of the papers on this site.