High-dimensional mixed-categorical Gaussian processes with application
to multidisciplinary design optimization for a green aircraft
- URL: http://arxiv.org/abs/2311.06130v2
- Date: Fri, 2 Feb 2024 09:20:22 GMT
- Title: High-dimensional mixed-categorical Gaussian processes with application
to multidisciplinary design optimization for a green aircraft
- Authors: Paul Saves, Youssef Diouane, Nathalie Bartoli, Thierry Lefebvre,
Joseph Morlier
- Abstract summary: This paper introduces an innovative dimension reduction algorithm that relies on partial least squares regression.
Our goal is to generalize classical dimension reduction techniques to handle mixed-categorical inputs.
The good potential of the proposed method is demonstrated in both structural and multidisciplinary application contexts.
- Score: 0.6749750044497732
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Recently, there has been a growing interest in mixed-categorical metamodels
based on Gaussian Process (GP) for Bayesian optimization. In this context,
different approaches can be used to build the mixed-categorical GP. Many of
these approaches involve a high number of hyperparameters; in fact, the more
general and precise the strategy used to build the GP, the greater the number
of hyperparameters to estimate. This paper introduces an innovative dimension
reduction algorithm that relies on partial least squares regression to reduce
the number of hyperparameters used to build a mixed-variable GP. Our goal is to
generalize classical dimension reduction techniques commonly used within GP
(for continuous inputs) to handle mixed-categorical inputs. The good potential
of the proposed method is demonstrated in both structural and multidisciplinary
application contexts. The targeted applications include the analysis of a
cantilever beam as well as the optimization of a green aircraft, resulting in a
significant 439-kilogram reduction in fuel consumption during a single mission.
Related papers
- Bayesian optimization for mixed variables using an adaptive dimension reduction process: applications to aircraft design [0.5420492913071214]
Multidisciplinary design optimization methods aim at adapting numerical optimization techniques to the design of engineering systems involving multiple disciplines.
Mixed continuous, integer and categorical variables might arise during the optimization process and practical applications involve a large number of design variables.
arXiv Detail & Related papers (2025-04-11T16:43:11Z)
- Weighted Euclidean Distance Matrices over Mixed Continuous and Categorical Inputs for Gaussian Process Models [1.22995445255292]
We introduce the WEighted Euclidean distance matrices Gaussian Process (WEGP).
We construct the kernel function for each categorical input by estimating the Euclidean distance matrix (EDM) among all categorical choices of this input.
We achieve superior performance on both synthetic and real-world optimization problems.
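A minimal sketch of the EDM-to-kernel idea, with a hand-picked illustrative distance matrix rather than one estimated from data as in the paper:

```python
# Toy sketch: turn a Euclidean distance matrix (EDM) over the levels of one
# categorical input into a kernel. The distance values are illustrative only.
import numpy as np

# Symmetric distance matrix among 3 categorical choices (zero diagonal).
D = np.array([[0.0, 1.0, 2.0],
              [1.0, 0.0, 1.5],
              [2.0, 1.5, 0.0]])

def cat_kernel(a, b, D, theta=1.0):
    """Squared-exponential kernel on the categorical distance D[a, b]."""
    return np.exp(-theta * D[a, b] ** 2)

# Kernel matrix over all level pairs: symmetric, with ones on the diagonal.
K = np.array([[cat_kernel(i, j, D) for j in range(3)] for i in range(3)])
print(np.allclose(K, K.T), np.allclose(np.diag(K), 1.0))
```

Because the distances here are Euclidean-embeddable, the resulting Gaussian kernel matrix is positive semi-definite, which is what a GP requires.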
arXiv Detail & Related papers (2025-03-04T13:55:22Z)
- Sparse Gaussian Process Hyperparameters: Optimize or Integrate? [5.949779668853556]
We propose an algorithm for sparse Gaussian process regression which leverages MCMC to sample from the hyperparameter posterior.
We compare this scheme against natural baselines in the literature and against variational GPs (SVGPs), supported by an extensive computational analysis.
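The sampling alternative to point-estimating hyperparameters can be sketched with a toy random-walk Metropolis sampler over the log-lengthscale of an exact (non-sparse) GP; everything below is an illustrative assumption, not the paper's sparse-GP sampler:

```python
# Toy sketch: random-walk Metropolis over a GP log-lengthscale, targeting the
# hyperparameter posterior under a flat prior (exact GP, not a sparse one).
import numpy as np

rng = np.random.default_rng(1)
X = np.linspace(0, 1, 20)[:, None]
y = np.sin(6 * X[:, 0]) + 0.1 * rng.standard_normal(20)

def log_marginal(log_ell):
    """GP log marginal likelihood (up to a constant) for an RBF kernel."""
    ell = np.exp(log_ell)
    K = np.exp(-0.5 * (X - X.T) ** 2 / ell ** 2) + 1e-2 * np.eye(20)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return -0.5 * y @ alpha - np.log(np.diag(L)).sum()

samples, cur = [], 0.0
cur_lp = log_marginal(cur)
for _ in range(500):
    prop = cur + 0.3 * rng.standard_normal()   # random-walk proposal
    prop_lp = log_marginal(prop)
    if np.log(rng.uniform()) < prop_lp - cur_lp:
        cur, cur_lp = prop, prop_lp            # accept
    samples.append(cur)

print(len(samples), np.exp(np.mean(samples[250:])))  # posterior-mean lengthscale
```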
arXiv Detail & Related papers (2022-11-04T14:06:59Z)
- Computationally-efficient initialisation of GPs: The generalised variogram method [1.0312968200748118]
Our strategy can be used as a pretraining stage to find initial conditions for maximum-likelihood (ML) training.
We provide experimental validation in terms of accuracy, consistency with ML and computational complexity for different kernels using synthetic and real-world data.
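A rough sketch of variogram-style pretraining (not the paper's generalised variogram method): compute an empirical semivariogram and read off a crude initial lengthscale to seed maximum-likelihood training. The binning and the 63%-of-sill heuristic below are illustrative assumptions:

```python
# Toy sketch: empirical semivariogram as a cheap initialiser for a lengthscale.
import numpy as np

rng = np.random.default_rng(2)
x = np.sort(rng.uniform(0, 10, 200))
y = np.sin(x) + 0.1 * rng.standard_normal(200)

d = np.abs(x[:, None] - x[None, :])          # all pairwise separations
diffs2 = (y[:, None] - y[None, :]) ** 2      # all squared value differences

lags = np.linspace(0.2, 3.0, 15)
gamma = np.empty_like(lags)
for k, h in enumerate(lags):
    mask = np.abs(d - h) < 0.1               # pairs in a small bin around lag h
    gamma[k] = 0.5 * diffs2[mask].mean()     # empirical semivariance

sill = gamma.max()
ell0 = lags[np.argmax(gamma > 0.63 * sill)]  # crude initial lengthscale
print(round(float(ell0), 2))
```

The value `ell0` would then be passed as the starting point of a standard ML optimiser, which is the pretraining role the paper describes.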
arXiv Detail & Related papers (2022-10-11T12:13:21Z)
- Sparse high-dimensional linear regression with a partitioned empirical Bayes ECM algorithm [62.997667081978825]
We propose a computationally efficient and powerful Bayesian approach for sparse high-dimensional linear regression.
Minimal prior assumptions on the parameters are used through the use of plug-in empirical Bayes estimates.
The proposed approach is implemented in the R package probe.
arXiv Detail & Related papers (2022-09-16T19:15:50Z)
- Surrogate modeling for Bayesian optimization beyond a single Gaussian process [62.294228304646516]
We propose a novel Bayesian surrogate model to balance exploration with exploitation of the search space.
To endow function sampling with scalability, random feature-based kernel approximation is leveraged per GP model.
To further establish convergence of the proposed EGP-TS to the global optimum, analysis is conducted based on the notion of Bayesian regret.
arXiv Detail & Related papers (2022-05-27T16:43:10Z)
- Scaling Gaussian Process Optimization by Evaluating a Few Unique Candidates Multiple Times [119.41129787351092]
We show that sequential black-box optimization based on GPs can be made efficient by sticking to a candidate solution for multiple evaluation steps.
We modify two well-established GP-Opt algorithms, GP-UCB and GP-EI to adapt rules from batched GP-Opt.
arXiv Detail & Related papers (2022-01-30T20:42:14Z)
- Non-Gaussian Gaussian Processes for Few-Shot Regression [71.33730039795921]
We propose an invertible ODE-based mapping that operates on each component of the random variable vectors and shares the parameters across all of them.
NGGPs outperform the competing state-of-the-art approaches on a diversified set of benchmarks and applications.
arXiv Detail & Related papers (2021-10-26T10:45:25Z)
- Parameters Fixing Strategy for Quantum Approximate Optimization Algorithm [0.0]
We propose a strategy to give high approximation ratio on average, even at large circuit depths, by initializing QAOA with the optimal parameters obtained from the previous depths.
We test our strategy on the Max-cut problem for certain classes of graphs, such as 3-regular graphs and Erdős-Rényi graphs.
arXiv Detail & Related papers (2021-08-11T15:44:16Z)
- MuyGPs: Scalable Gaussian Process Hyperparameter Estimation Using Local Cross-Validation [1.2233362977312945]
We present MuyGPs, a novel, efficient GP hyperparameter estimation method.
MuyGPs builds upon prior methods that take advantage of the nearest neighbors structure of the data.
We show that our method outperforms all known competitors both in terms of time-to-solution and the root mean squared error of the predictions.
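The nearest-neighbors structure that MuyGPs builds on can be illustrated with a toy local-GP predictor (this is not the MuyGPs estimator itself; the kernel, lengthscale, nugget, and neighbor count below are assumptions):

```python
# Toy sketch: condition each prediction only on its k nearest training
# neighbors, so each prediction costs O(k^3) instead of O(n^3).
import numpy as np

rng = np.random.default_rng(4)
Xtr = rng.uniform(0, 1, size=(300, 1))
ytr = np.sin(8 * Xtr[:, 0]) + 0.05 * rng.standard_normal(300)
Xte = np.array([[0.25], [0.5], [0.75]])

def rbf(A, B, ell=0.1):
    d2 = (A[:, None, 0] - B[None, :, 0]) ** 2
    return np.exp(-0.5 * d2 / ell ** 2)

k = 30
preds = []
for x in Xte:
    idx = np.argsort(np.abs(Xtr[:, 0] - x[0]))[:k]   # k nearest neighbors
    Knn = rbf(Xtr[idx], Xtr[idx]) + 1e-2 * np.eye(k)  # local kernel + nugget
    kstar = rbf(x[None, :], Xtr[idx])[0]
    preds.append(kstar @ np.linalg.solve(Knn, ytr[idx]))
preds = np.array(preds)
print(np.round(preds, 2))
```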
arXiv Detail & Related papers (2021-04-29T18:10:21Z)
- Adaptive pruning-based optimization of parameterized quantum circuits [62.997667081978825]
Variational hybrid quantum-classical algorithms are powerful tools to maximize the use of Noisy Intermediate-Scale Quantum devices.
We propose a strategy for the ansatze used in variational quantum algorithms, which we call "Parameter-Efficient Circuit Training" (PECT).
Instead of optimizing all of the ansatz parameters at once, PECT launches a sequence of variational algorithms.
arXiv Detail & Related papers (2020-10-01T18:14:11Z)
- Optimizing generalization on the train set: a novel gradient-based framework to train parameters and hyperparameters simultaneously [0.0]
Generalization is a central problem in Machine Learning.
We present a novel approach based on a new measure of risk that allows us to develop novel fully automatic procedures for generalization.
arXiv Detail & Related papers (2020-06-11T18:04:36Z)
- Implicit differentiation of Lasso-type models for hyperparameter optimization [82.73138686390514]
We introduce an efficient implicit differentiation algorithm, without matrix inversion, tailored for Lasso-type problems.
Our approach scales to high-dimensional data by leveraging the sparsity of the solutions.
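For the plain Lasso, the implicit-differentiation idea can be sketched in closed form: on the active set, the stationarity condition gives the derivative of the coefficients with respect to the regularization level. This uses sklearn's (1/2n)-scaled objective; the problem sizes below are illustrative, and the paper's actual algorithm is more general and avoids this explicit solve:

```python
# Toy sketch: implicit differentiation of the Lasso solution w.r.t. alpha.
# sklearn minimizes (1/2n)||y - Xb||^2 + alpha*||b||_1, so on the active set A:
#   X_A^T X_A b_A = X_A^T y - n*alpha*sign(b_A)
#   => d b_A / d alpha = -n * (X_A^T X_A)^{-1} sign(b_A)
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(3)
n, p = 100, 10
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:3] = [2.0, -1.5, 1.0]
y = X @ beta_true + 0.1 * rng.standard_normal(n)

lam = 0.1
b = Lasso(alpha=lam, fit_intercept=False, tol=1e-12, max_iter=50000).fit(X, y).coef_
A = np.flatnonzero(b)                                    # active set
XA = X[:, A]
grad = -n * np.linalg.solve(XA.T @ XA, np.sign(b[A]))    # implicit derivative

# Check against finite differences (the Lasso path is piecewise linear in
# alpha, so this should agree up to solver tolerance).
eps = 1e-3
b2 = Lasso(alpha=lam + eps, fit_intercept=False, tol=1e-12, max_iter=50000).fit(X, y).coef_
fd = (b2[A] - b[A]) / eps
print(np.max(np.abs(grad - fd)))
```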
arXiv Detail & Related papers (2020-02-20T18:43:42Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.