Optimal Design of Electric Machine with Efficient Handling of
Constraints and Surrogate Assistance
- URL: http://arxiv.org/abs/2206.01695v1
- Date: Fri, 3 Jun 2022 17:13:29 GMT
- Title: Optimal Design of Electric Machine with Efficient Handling of
Constraints and Surrogate Assistance
- Authors: Bhuvan Khoshoo, Julian Blank, Thang Q. Pham, Kalyanmoy Deb, Shanelle
N. Foster
- Abstract summary: This article proposes an optimization method built on the widely used evolutionary multi-objective optimization algorithm NSGA-II.
The proposed method exploits the inexpensiveness of geometric constraints, using a custom repair operator to generate feasible designs.
- Score: 5.387300498478744
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Electric machine design optimization is a computationally expensive
multi-objective optimization problem. While the objectives require
time-consuming finite element analysis, the optimization constraints can often
be expressed mathematically, for example as geometric constraints. This article
investigates this optimization problem of mixed computational expense by
proposing an optimization method built into the widely used evolutionary
multi-objective optimization algorithm NSGA-II. The proposed method exploits
the inexpensiveness of the geometric constraints, using a custom repair
operator to generate feasible designs. It also addresses the time-consuming
objective functions by incorporating surrogate models that predict machine
performance. The article establishes the superiority of the proposed method
over the conventional optimization approach. The study demonstrates how a
complex engineering design can be optimized for multiple objectives and
constraints with heterogeneous evaluation times. It further shows how the
resulting optimal solutions can be analyzed to select a single preferred
design and, importantly, harnessed to reveal vital design features common to
optimal solutions, yielding design principles.
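The two key ideas of the abstract, repairing cheap geometric constraints algebraically so that every design sent to the expensive evaluator is feasible, and letting a surrogate shortlist candidates before the expensive objective is called, can be illustrated with a minimal self-contained sketch. The constraint, objective, and surrogate below are hypothetical placeholders, not the paper's actual machine model or its NSGA-II integration:

```python
import random

# Hypothetical geometric constraint for illustration: the rotor radius
# (x[0]) must leave at least MIN_GAP of air gap inside the stator bore (x[1]).
MIN_GAP = 0.5

def repair(x):
    """Cheap algebraic repair: shrink the rotor radius until the
    geometric constraint x[0] + MIN_GAP <= x[1] holds."""
    rotor, bore = x
    if rotor + MIN_GAP > bore:
        rotor = bore - MIN_GAP
    return [rotor, bore]

def expensive_objective(x):
    """Stand-in for a finite element analysis call (seconds to minutes
    in practice); here just an analytic placeholder."""
    rotor, bore = x
    return (rotor - 3.0) ** 2 + (bore - 5.0) ** 2

def surrogate(x):
    """Crude surrogate of the objective; in the paper this would be a
    model fitted to previously evaluated designs."""
    rotor, bore = x
    return abs(rotor - 3.0) + abs(bore - 5.0)

random.seed(0)
# Generate candidates, repair them for free, rank them by the surrogate,
# and spend the expensive evaluation only on the most promising few.
candidates = [repair([random.uniform(0, 6), random.uniform(1, 8)])
              for _ in range(200)]
shortlist = sorted(candidates, key=surrogate)[:5]
best = min(shortlist, key=expensive_objective)
assert best[0] + MIN_GAP <= best[1]  # every evaluated design is feasible
```

Only five expensive calls are made here instead of two hundred; in the paper the same budget shift happens inside the NSGA-II generational loop.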
Related papers
- Learning Joint Models of Prediction and Optimization [56.04498536842065]
The predict-then-optimize framework uses machine learning models to predict the unknown parameters of an optimization problem from features before solving it.
This paper proposes an alternative method, in which optimal solutions are learned directly from the observable features by joint predictive models.
arXiv Detail & Related papers (2024-09-07T19:52:14Z)
- Analyzing and Enhancing the Backward-Pass Convergence of Unrolled Optimization [50.38518771642365]
The integration of constrained optimization models as components in deep networks has led to promising advances on many specialized learning tasks.
A central challenge in this setting is backpropagation through the solution of an optimization problem, which often lacks a closed form.
This paper provides theoretical insights into the backward pass of unrolled optimization, showing that it is equivalent to the solution of a linear system by a particular iterative method.
A system called Folded Optimization is proposed to construct more efficient backpropagation rules from unrolled solver implementations.
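The claimed equivalence can be checked on a toy problem: unrolling gradient descent on a quadratic and accumulating the Jacobian of the iterate with respect to the problem parameters reproduces a fixed-point (Richardson-type) iterative solve of a linear system. This is an illustrative sketch of that observation, not the paper's Folded Optimization system:

```python
ALPHA, K = 0.2, 200
A = [[3.0, 1.0], [1.0, 2.0]]  # SPD matrix of the toy quadratic

def mv(M, v):
    return [sum(M[i][j] * v[j] for j in range(2)) for i in range(2)]

def gd_unrolled(b):
    """Unrolled solver: K steps of gradient descent on
    f(x) = 0.5 x^T A x - b^T x, whose minimizer solves A x = b."""
    x = [0.0, 0.0]
    J = [[0.0, 0.0], [0.0, 0.0]]  # accumulated Jacobian dx/db
    for _ in range(K):
        r = [mv(A, x)[i] - b[i] for i in range(2)]
        x = [x[i] - ALPHA * r[i] for i in range(2)]
        # Backward-pass recurrence: J <- (I - ALPHA*A) J + ALPHA*I,
        # i.e. a Richardson iteration for the linear system A J = I.
        J = [[J[i][j] - ALPHA * sum(A[i][k] * J[k][j] for k in range(2))
              + (ALPHA if i == j else 0.0) for j in range(2)]
             for i in range(2)]
    return x, J

x, J = gd_unrolled([1.0, 1.0])
# Exact inverse of A for comparison (det = 5): J should converge to it.
Ainv = [[0.4, -0.2], [-0.2, 0.6]]
```

As K grows, the accumulated backward pass J converges to A^{-1}, the exact sensitivity of the optimal solution, which is precisely a linear system solved by an iterative method.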
arXiv Detail & Related papers (2023-12-28T23:15:18Z)
- Efficient Inverse Design Optimization through Multi-fidelity Simulations, Machine Learning, and Search Space Reduction Strategies [0.8646443773218541]
This paper introduces a methodology designed to augment the inverse design optimization process in scenarios constrained by limited compute.
The proposed methodology is analyzed on two distinct engineering inverse design problems: airfoil inverse design and the scalar field reconstruction problem.
Notably, this method is adaptable to any inverse design application, facilitating a synergy between a representative low-fidelity ML model and a high-fidelity simulation, and can be seamlessly applied with a variety of population-based optimization algorithms.
arXiv Detail & Related papers (2023-12-06T18:20:46Z)
- Evolutionary Solution Adaption for Multi-Objective Metal Cutting Process Optimization [59.45414406974091]
We introduce a framework for system flexibility that allows us to study the ability of an algorithm to transfer solutions from previous optimization tasks.
We study the flexibility of NSGA-II, which we extend with two variants: 1) varying goals, which optimizes solutions for two tasks simultaneously to obtain in-between source solutions expected to be more adaptable, and 2) active-inactive genotype, which accommodates different design possibilities that can be activated or deactivated.
Results show that adaptation with standard NSGA-II greatly reduces the number of evaluations required to reach a target goal, while the proposed variants further reduce the adaptation costs.
arXiv Detail & Related papers (2023-05-31T12:07:50Z)
- Diffusing the Optimal Topology: A Generative Optimization Approach [6.375982344506753]
Topology optimization seeks to find the best design that satisfies a set of constraints while maximizing system performance.
Traditional iterative optimization methods like SIMP can be computationally expensive and get stuck in local minima.
We propose a Generative Optimization method that integrates classic optimization like SIMP as a refining mechanism for the topology generated by a deep generative model.
arXiv Detail & Related papers (2023-03-17T03:47:10Z)
- Fusion of ML with numerical simulation for optimized propeller design [0.6767885381740952]
We propose an alternative way of using an ML model: as a surrogate for the design process.
By combining this trained surrogate model with a traditional optimization method, we can get the best of both worlds.
Empirical evaluations on propeller design problems show that surrogate-assisted optimization (SAO) finds a more efficient design in fewer evaluations.
arXiv Detail & Related papers (2023-02-28T16:42:07Z)
- Backpropagation of Unrolled Solvers with Folded Optimization [55.04219793298687]
The integration of constrained optimization models as components in deep networks has led to promising advances on many specialized learning tasks.
One typical strategy is algorithm unrolling, which relies on automatic differentiation through the operations of an iterative solver.
This paper provides theoretical insights into the backward pass of unrolled optimization, leading to a system for generating efficiently solvable analytical models of backpropagation.
arXiv Detail & Related papers (2023-01-28T01:50:42Z)
- An Empirical Evaluation of Zeroth-Order Optimization Methods on AI-driven Molecule Optimization [78.36413169647408]
We study the effectiveness of various ZO optimization methods for optimizing molecular objectives.
We show the advantages of ZO sign-based gradient descent (ZO-signGD).
We demonstrate the potential effectiveness of ZO optimization methods on widely used benchmark tasks from the Guacamol suite.
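As a rough illustration of the ZO-signGD idea (not the paper's implementation), the sketch below estimates a gradient from random finite differences of a black-box function and steps along its elementwise sign; the toy quadratic stands in for a molecular objective:

```python
import random, math

def f(x):
    # Toy smooth objective standing in for a black-box molecular score.
    return sum(xi ** 2 for xi in x)

def zo_sign_gd(f, x, steps=200, mu=1e-3, lr=0.05, q=10, seed=1):
    """Zeroth-order sign-based gradient descent (ZO-signGD) sketch:
    estimate the gradient from q random finite differences of f only
    (no analytic gradients), then step along the elementwise sign
    of the estimate."""
    rng = random.Random(seed)
    d = len(x)
    for _ in range(steps):
        grad = [0.0] * d
        for _ in range(q):
            u = [rng.gauss(0, 1) for _ in range(d)]
            # Directional finite difference of f along u.
            fd = (f([xi + mu * ui for xi, ui in zip(x, u)]) - f(x)) / mu
            for i in range(d):
                grad[i] += fd * u[i] / q
        x = [xi - lr * math.copysign(1.0, gi) for xi, gi in zip(x, grad)]
    return x

x = zo_sign_gd(f, [2.0, -3.0, 1.5])
```

The sign step makes the update robust to the scale and noise of the zeroth-order gradient estimate, at the cost of oscillating within a step-size-dependent neighborhood of the optimum.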
arXiv Detail & Related papers (2022-10-27T01:58:10Z)
- Multi-objective robust optimization using adaptive surrogate models for problems with mixed continuous-categorical parameters [0.0]
Robust design optimization is traditionally considered when uncertainties mainly affect the objective function.
The resulting nested optimization problem may be solved using a general-purpose solver, herein the non-dominated sorting genetic algorithm (NSGA-II).
The proposed approach consists of sequentially carrying out NSGA-II while using an adaptively built Kriging model to estimate the quantiles.
arXiv Detail & Related papers (2022-03-03T20:23:18Z)
- Speeding up Computational Morphogenesis with Online Neural Synthetic Gradients [51.42959998304931]
A wide range of modern science and engineering applications are formulated as optimization problems with a system of partial differential equations (PDEs) as constraints.
These PDE-constrained optimization problems are typically solved in a standard discretize-then-optimize approach.
We propose a general framework to speed up PDE-constrained optimization using online neural synthetic gradients (ONSG) with a novel two-scale optimization scheme.
arXiv Detail & Related papers (2021-04-25T22:43:51Z)
- Principal Component Analysis Applied to Gradient Fields in Band Gap Optimization Problems for Metamaterials [0.7699714865575189]
This article describes the application of a related unsupervised machine learning technique, namely, principal component analysis, to approximate the gradient of the objective function of a band gap optimization problem for an acoustic metamaterial.
Numerical results show the effectiveness of the proposed method.
arXiv Detail & Related papers (2021-04-04T11:13:37Z)
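The idea of approximating gradients with principal component analysis can be sketched as follows: collect gradient samples across the design space, extract their dominant direction, and replace each gradient with its rank-1 projection. The toy objective and power-iteration PCA below are illustrative stand-ins for the paper's band-gap setting:

```python
import random

def grad(x):
    # Gradient of f(x) = 3*x0^2 + 0.1*x1^2: strongly anisotropic, so most
    # of its variance lies along the first coordinate.
    return [6.0 * x[0], 0.2 * x[1]]

# Sample "gradient fields": gradients evaluated at many design points.
rng = random.Random(0)
G = [grad([rng.uniform(-1, 1), rng.uniform(-1, 1)]) for _ in range(500)]

def principal_component(samples, iters=100):
    """Dominant direction of the samples via power iteration on the
    (uncentered) second-moment matrix; a minimal stand-in for PCA."""
    d, n = len(samples[0]), len(samples)
    C = [[sum(s[i] * s[j] for s in samples) / n for j in range(d)]
         for i in range(d)]
    v = [1.0] * d
    for _ in range(iters):
        w = [sum(C[i][j] * v[j] for j in range(d)) for i in range(d)]
        norm = sum(wi * wi for wi in w) ** 0.5
        v = [wi / norm for wi in w]
    return v

v = principal_component(G)

def approx_grad(x):
    # Rank-1 approximation: keep only the component of the gradient along v.
    g = grad(x)
    coeff = sum(gi * vi for gi, vi in zip(g, v))
    return [coeff * vi for vi in v]
```

Because the toy gradients are nearly one-dimensional, the rank-1 projection recovers them almost exactly; in higher dimensions one would keep the top few components instead of one.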
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the information and is not responsible for any consequences arising from its use.