Variational Autoencoder based Metamodeling for Multi-Objective Topology
Optimization of Electrical Machines
- URL: http://arxiv.org/abs/2201.08877v1
- Date: Fri, 21 Jan 2022 19:49:54 GMT
- Title: Variational Autoencoder based Metamodeling for Multi-Objective Topology
Optimization of Electrical Machines
- Authors: Vivek Parekh, Dominik Flore, Sebastian Schöps
- Abstract summary: This paper presents a novel method for predicting Key Performance Indicators (KPIs) of differently parameterized electrical machine topologies at the same time.
After training, the decoder and a multi-layer neural network function, via the latent space, as meta-models for sampling new designs and predicting the associated KPIs, respectively.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Conventional magneto-static finite element analysis of electrical machine
models is time-consuming and computationally expensive. Since each machine
topology has a distinct set of parameters, design optimization is commonly
performed independently. This paper presents a novel method for predicting Key
Performance Indicators (KPIs) of differently parameterized electrical machine
topologies at the same time by mapping high-dimensional integrated design
parameters into a lower-dimensional latent space using a variational autoencoder.
After training, the decoder and a multi-layer neural network function, via the
latent space, as meta-models for sampling new designs and predicting the
associated KPIs, respectively. This enables parameter-based concurrent
multi-topology optimization.
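A rough illustrative sketch of the pipeline described above (not the authors' implementation): an encoder compresses the integrated design parameters into a latent distribution, the decoder reconstructs designs from latent samples, and a multi-layer network predicts KPIs from the same latent code. All dimensions, layer sizes, and names (DesignVAE, kpi_net) are assumptions made for illustration.

```python
# Minimal VAE-based meta-modeling sketch (PyTorch); dimensions are hypothetical.
import torch
import torch.nn as nn

N_PARAMS, N_LATENT, N_KPIS = 64, 8, 5  # design parameters, latent size, KPIs

class DesignVAE(nn.Module):
    """Maps high-dimensional design parameters to a shared latent space and back."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(N_PARAMS, 128), nn.ReLU(),
                                     nn.Linear(128, 2 * N_LATENT))  # mean and log-variance
        self.decoder = nn.Sequential(nn.Linear(N_LATENT, 128), nn.ReLU(),
                                     nn.Linear(128, N_PARAMS))

    def forward(self, x):
        mu, logvar = self.encoder(x).chunk(2, dim=-1)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterization
        return self.decoder(z), mu, logvar

# Multi-layer network that predicts KPIs from the latent code.
kpi_net = nn.Sequential(nn.Linear(N_LATENT, 64), nn.ReLU(), nn.Linear(64, N_KPIS))

# After training (reconstruction + KL + KPI regression losses), the decoder and
# kpi_net act as meta-models: sample the latent space, decode candidate designs,
# and predict their KPIs without further finite element simulation.
vae = DesignVAE()
z_new = torch.randn(100, N_LATENT)
candidate_designs = vae.decoder(z_new)
predicted_kpis = kpi_net(z_new)
```

In the paper these meta-models then drive a parameter-based concurrent multi-topology optimization; the last lines above only sketch the sampling and prediction step.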
Related papers
- Vehicle Suspension Recommendation System: Multi-Fidelity Neural Network-based Mechanism Design Optimization [4.038368925548051]
Vehicle suspensions are designed to improve driving performance and ride comfort, but different types are available depending on the environment.
The traditional design process is multi-step, gradually reducing the number of design candidates while performing costly analyses to meet target performance.
Recently, AI models have been used to reduce the computational cost of finite element analysis (FEA).
arXiv Detail & Related papers (2024-10-03T23:54:03Z) - Scaling Exponents Across Parameterizations and Optimizers [94.54718325264218]
We propose a new perspective on parameterization by investigating a key assumption in prior work.
Our empirical investigation includes tens of thousands of models trained with all combinations of three optimizers and four parameterizations.
We find that the best learning rate scaling prescription would often have been excluded by the assumptions in prior work.
arXiv Detail & Related papers (2024-07-08T12:32:51Z) - Parameter Efficient Fine-tuning via Cross Block Orchestration for Segment Anything Model [81.55141188169621]
We equip PEFT with a cross-block orchestration mechanism to enable the adaptation of the Segment Anything Model (SAM) to various downstream scenarios.
We propose an intra-block enhancement module, which introduces a linear projection head whose weights are generated from a hyper-complex layer.
Our proposed approach consistently improves the segmentation performance significantly on novel scenarios with only around 1K additional parameters.
arXiv Detail & Related papers (2023-11-28T11:23:34Z) - Differential Evolution Algorithm based Hyper-Parameters Selection of
Transformer Neural Network Model for Load Forecasting [0.0]
Transformer models have the potential to improve load forecasting because of their ability to learn long-range dependencies via their attention mechanism.
Our work compares the proposed Transformer-based neural network model integrated with different metaheuristic algorithms by their performance in load forecasting, based on numerical metrics such as Mean Squared Error (MSE) and Mean Absolute Percentage Error (MAPE).
arXiv Detail & Related papers (2023-07-28T04:29:53Z) - Multi-Objective Optimization of Electrical Machines using a Hybrid
Data-and Physics-Driven Approach [0.0]
We present the application of a hybrid data- and physics-driven model for the numerical optimization of permanent magnet synchronous machines (PMSM).
Following data-driven supervised training, a deep neural network (DNN) acts as a meta-model to characterize the electromagnetic behavior of the PMSM.
These intermediate measures are then post-processed with various physical models to compute the required key performance indicators; a rough sketch of this pattern appears after this list.
arXiv Detail & Related papers (2023-06-15T12:47:56Z) - Deep learning based Meta-modeling for Multi-objective Technology
Optimization of Electrical Machines [0.0]
We present the application of a variational auto-encoder to optimize two different machine technologies simultaneously.
After training, we employ a deep neural network and a decoder as meta-models to predict global key performance indicators.
arXiv Detail & Related papers (2023-06-15T12:33:39Z) - Scaling Pre-trained Language Models to Deeper via Parameter-efficient
Architecture [68.13678918660872]
We design a more capable parameter-sharing architecture based on the matrix product operator (MPO).
MPO decomposition can reorganize and factorize the information of a parameter matrix into two parts.
Our architecture shares the central tensor across all layers for reducing the model size.
arXiv Detail & Related papers (2023-03-27T02:34:09Z) - A Pareto-optimal compositional energy-based model for sampling and
optimization of protein sequences [55.25331349436895]
Deep generative models have emerged as a popular machine learning-based approach for inverse problems in the life sciences.
These problems often require sampling new designs that satisfy multiple properties of interest in addition to learning the data distribution.
arXiv Detail & Related papers (2022-10-19T19:04:45Z) - Equivariant vector field network for many-body system modeling [65.22203086172019]
The Equivariant Vector Field Network (EVFN) is built on a novel equivariant basis and the associated scalarization and vectorization layers.
We evaluate our method on predicting trajectories of simulated Newton mechanics systems with both full and partially observed data.
arXiv Detail & Related papers (2021-10-26T14:26:25Z) - Deep Learning-based Prediction of Key Performance Indicators for
Electrical Machine [0.0]
A data-aided, deep learning-based meta-model is employed to predict the design of an electrical machine quickly and with high accuracy.
The results show high prediction accuracy and prove the validity of a deep learning-based meta-model for minimizing the optimization time.
arXiv Detail & Related papers (2020-12-16T18:03:58Z) - Adaptive pruning-based optimization of parameterized quantum circuits [62.997667081978825]
Variational hybrid quantum-classical algorithms are powerful tools to maximize the use of Noisy Intermediate-Scale Quantum (NISQ) devices.
We propose a strategy for such ansätze used in variational quantum algorithms, which we call "Parameter-Efficient Circuit Training" (PECT).
Instead of optimizing all of the ansatz parameters at once, PECT launches a sequence of variational algorithms.
arXiv Detail & Related papers (2020-10-01T18:14:11Z)
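The hybrid data- and physics-driven entry above describes a DNN meta-model whose intermediate outputs are post-processed with physical models to obtain KPIs. The sketch below (referenced from that entry) only illustrates this pattern; the intermediate quantities, dimensions, and formulas are placeholders, not taken from the paper.

```python
# Illustrative hybrid data-and-physics sketch (PyTorch): a DNN predicts intermediate
# electromagnetic quantities, and simple physical formulas turn them into KPIs.
import torch
import torch.nn as nn

N_PARAMS = 64        # hypothetical number of geometry/winding parameters
N_INTERMEDIATE = 3   # e.g. torque and d-/q-axis flux linkages (placeholders)

meta_model = nn.Sequential(nn.Linear(N_PARAMS, 128), nn.ReLU(),
                           nn.Linear(128, N_INTERMEDIATE))

def physics_post_processing(intermediate, speed_rpm=3000.0):
    """Toy post-processing: derive KPI-like quantities from the DNN outputs."""
    torque, psi_d, psi_q = intermediate.unbind(dim=-1)
    omega = 2 * torch.pi * speed_rpm / 60.0              # mechanical speed [rad/s]
    shaft_power = torque * omega                          # P = T * omega [W]
    voltage_demand = omega * torch.hypot(psi_d, psi_q)    # rough back-EMF proxy [V]
    return {"shaft_power_W": shaft_power, "voltage_demand_V": voltage_demand}

design = torch.randn(1, N_PARAMS)                         # one candidate design
kpis = physics_post_processing(meta_model(design))
```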
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences.