Adaptation of Engineering Wake Models using Gaussian Process Regression
and High-Fidelity Simulation Data
- URL: http://arxiv.org/abs/2003.13323v1
- Date: Mon, 30 Mar 2020 10:22:57 GMT
- Title: Adaptation of Engineering Wake Models using Gaussian Process Regression
and High-Fidelity Simulation Data
- Authors: Leif Erik Andersson, Bart Doekemeijer, Daan van der Hoek, Jan-Willem
van Wingerden, Lars Imsland
- Abstract summary: This article investigates the optimization of yaw control inputs of a nine-turbine wind farm.
The wind farm is simulated using the high-fidelity simulator SOWFA.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This article investigates the optimization of yaw control inputs of a
nine-turbine wind farm. The wind farm is simulated using the high-fidelity
simulator SOWFA. The optimization is performed with a modifier adaptation
scheme based on Gaussian processes. Modifier adaptation corrects for the
mismatch between plant and model and helps to converge to the actual plant
optimum. In the case study the modifier adaptation approach is compared with
the Bayesian optimization approach. Moreover, the use of two different
covariance functions in the Gaussian process regression is discussed. Practical
recommendations concerning the data preparation and application of the approach
are given. It is shown that both the modifier adaptation and the Bayesian
optimization approach can improve the power production with overall smaller yaw
misalignments in comparison to the Gaussian wake model.
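The abstract contrasts two covariance functions for the Gaussian process surrogate and uses the fitted GP to propose new yaw set-points. A minimal sketch of that idea, assuming scikit-learn and an entirely illustrative quadratic stand-in for the plant (the `plant_power` function, its optimum at 15 deg, and all numeric values are assumptions, not taken from the paper):

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, Matern

# Hypothetical stand-in for the plant: farm power as a function of the
# yaw misalignment (deg) of an upstream turbine. The quadratic shape and
# its optimum at 15 deg are illustrative, not values from the paper.
def plant_power(yaw):
    return 1.0 - 0.002 * (yaw - 15.0) ** 2

rng = np.random.default_rng(0)
yaw_samples = rng.uniform(-10.0, 30.0, size=8).reshape(-1, 1)
power_samples = plant_power(yaw_samples).ravel()

# Two candidate covariance functions, as discussed in the abstract.
kernels = {"RBF": RBF(length_scale=10.0),
           "Matern-5/2": Matern(length_scale=10.0, nu=2.5)}

grid = np.linspace(-10.0, 30.0, 401).reshape(-1, 1)
for name, kernel in kernels.items():
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
    gp.fit(yaw_samples, power_samples)
    mean, std = gp.predict(grid, return_std=True)
    # Upper-confidence-bound acquisition: next yaw set-point to evaluate
    # on the plant. The paper's modifier-adaptation scheme instead uses
    # the GP to correct an engineering wake model; this loop only shows
    # the plain Bayesian-optimization alternative it is compared against.
    ucb = mean + 2.0 * std
    next_yaw = grid[np.argmax(ucb), 0]
    print(f"{name}: next yaw candidate = {next_yaw:.1f} deg")
```

Each new plant evaluation would be appended to the samples and the GP refit, iterating until the proposed yaw converges.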
Related papers
- Enhancing Gaussian Process Surrogates for Optimization and Posterior Approximation via Random Exploration [2.984929040246293]
We propose novel noise-free Bayesian optimization strategies that rely on a random exploration step to enhance the accuracy of Gaussian process surrogate models.
The new algorithms retain the ease of implementation of the classical GP-UCB, but the additional exploration step facilitates their convergence.
arXiv Detail & Related papers (2024-01-30T14:16:06Z)
- Manifold Gaussian Variational Bayes on the Precision Matrix [70.44024861252554]
We propose an optimization algorithm for Variational Inference (VI) in complex models.
We develop an efficient algorithm for Gaussian Variational Inference whose updates satisfy the positive definite constraint on the variational covariance matrix.
Due to its black-box nature, MGVBP stands as a ready-to-use solution for VI in complex models.
arXiv Detail & Related papers (2022-10-26T10:12:31Z)
- Surrogate modeling for Bayesian optimization beyond a single Gaussian process [62.294228304646516]
We propose a novel Bayesian surrogate model to balance exploration with exploitation of the search space.
To endow function sampling with scalability, random feature-based kernel approximation is leveraged per GP model.
To further establish convergence of the proposed EGP-TS to the global optimum, analysis is conducted based on the notion of Bayesian regret.
arXiv Detail & Related papers (2022-05-27T16:43:10Z)
- Hybrid Evolutionary Optimization Approach for Oilfield Well Control Optimization [0.0]
Oilfield production optimization is challenging due to subsurface model complexity and associated non-linearity.
This paper presents efficacy of two hybrid evolutionary optimization approaches for well control optimization of a waterflooding operation.
arXiv Detail & Related papers (2021-03-29T13:36:51Z)
- Hyper-optimization with Gaussian Process and Differential Evolution Algorithm [0.0]
This paper presents specific modifications of Gaussian Process optimization components from available scientific libraries.
The presented modifications were submitted to the BlackBox 2020 challenge, where they outperformed some conventionally available optimization libraries.
arXiv Detail & Related papers (2021-01-26T08:33:00Z)
- Zeroth-Order Hybrid Gradient Descent: Towards A Principled Black-Box Optimization Framework [100.36569795440889]
This work studies zeroth-order (ZO) optimization, which does not require first-order information.
We show that with a graceful design in coordinate importance sampling, the proposed ZO optimization method is efficient in terms of both complexity and function query cost.
arXiv Detail & Related papers (2020-12-21T17:29:58Z)
- Robust Optimal Transport with Applications in Generative Modeling and Domain Adaptation [120.69747175899421]
Optimal Transport (OT) distances such as Wasserstein have been used in several areas such as GANs and domain adaptation.
We propose a computationally-efficient dual form of the robust OT optimization that is amenable to modern deep learning applications.
Our approach can train state-of-the-art GAN models on noisy datasets corrupted with outlier distributions.
arXiv Detail & Related papers (2020-10-12T17:13:40Z)
- Yield Optimization using Hybrid Gaussian Process Regression and a Genetic Multi-Objective Approach [0.0]
We propose a hybrid approach combining the reliability and accuracy of a Monte Carlo analysis with the efficiency of a surrogate model based on Gaussian Process Regression.
We present two optimization approaches. An adaptive Newton-MC to reduce the impact of uncertainty and a genetic multi-objective approach to optimize performance and robustness at the same time.
arXiv Detail & Related papers (2020-10-08T14:44:37Z)
- Real-Time Optimization Meets Bayesian Optimization and Derivative-Free Optimization: A Tale of Modifier Adaptation [0.0]
This paper investigates a new class of modifier-adaptation schemes to overcome plant-model mismatch in real-time optimization of uncertain processes.
The proposed schemes embed a physical model and rely on trust-region ideas to minimize risk during the exploration.
The benefits of using an acquisition function, knowing the process noise level, or specifying a nominal process model are illustrated.
arXiv Detail & Related papers (2020-09-18T12:57:17Z)
- Robust, Accurate Stochastic Optimization for Variational Inference [68.83746081733464]
We show that common optimization methods lead to poor variational approximations if the problem is moderately large.
Motivated by these findings, we develop a more robust and accurate optimization framework by viewing the underlying algorithm as producing a Markov chain.
arXiv Detail & Related papers (2020-09-01T19:12:11Z)
- Global Optimization of Gaussian processes [52.77024349608834]
We propose a reduced-space formulation with Gaussian processes trained on few data points.
The approach also leads to significantly smaller and computationally cheaper sub-solvers for lower bounding.
In total, the proposed method reduces the convergence time by orders of magnitude.
arXiv Detail & Related papers (2020-05-21T20:59:11Z)
This list is automatically generated from the titles and abstracts of the papers in this site.