MO-PaDGAN: Reparameterizing Engineering Designs for Augmented
Multi-objective Optimization
- URL: http://arxiv.org/abs/2009.07110v3
- Date: Tue, 28 Sep 2021 02:21:35 GMT
- Authors: Wei Chen and Faez Ahmed
- Abstract summary: Multi-objective optimization is key to solving many Engineering Design problems.
Deep generative models can learn compact design representations.
MO-PaDGAN adds a Determinantal Point Process-based loss function to the generative adversarial network.
- Score: 13.866787416457454
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Multi-objective optimization is key to solving many Engineering Design
problems, where design parameters are optimized for several performance
indicators. However, optimization results are highly dependent on how the
designs are parameterized. Researchers have shown that deep generative models
can learn compact design representations, providing a new way of parameterizing
designs to achieve faster convergence and improved optimization performance.
Despite their success in capturing complex distributions, existing generative
models face three challenges when used for design problems: 1) generated
designs have limited design space coverage, 2) the generator ignores design
performance, and 3) the new parameterization is unable to represent designs
beyond training data. To address these challenges, we propose MO-PaDGAN, which
adds a Determinantal Point Processes based loss function to the generative
adversarial network to simultaneously model diversity and (multi-variate)
performance. MO-PaDGAN can thus improve the performance and coverage of
generated designs, and even generate designs with performances exceeding those
from training data. When using MO-PaDGAN as a new parameterization in
multi-objective optimization, we can discover much better Pareto fronts even
though the training data do not cover those Pareto fronts. In a real-world
multi-objective airfoil design example, we demonstrate that MO-PaDGAN achieves,
on average, an over 180% improvement in the hypervolume indicator when
compared to the vanilla GAN or other state-of-the-art parameterization methods.
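The DPP-based term at the heart of MO-PaDGAN can be illustrated with a small numerical sketch: a kernel couples a similarity matrix over designs with per-design quality scores, so that its log-determinant is large only when a batch is both diverse and high-performing, and the negative of such a term augments the GAN objective. The RBF similarity, the `length_scale` parameter, and all function names below are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

# Minimal sketch of a quality-weighted DPP kernel in the spirit of MO-PaDGAN.
# The RBF similarity, length_scale, and all names here are illustrative
# assumptions, not the authors' code.

def dpp_kernel(designs, quality, length_scale=1.0):
    """Build L = diag(q) @ S @ diag(q): similarity modulated by quality."""
    d2 = np.sum((designs[:, None, :] - designs[None, :, :]) ** 2, axis=-1)
    S = np.exp(-d2 / (2.0 * length_scale ** 2))  # RBF similarity matrix
    q = np.asarray(quality, dtype=float)
    return q[:, None] * S * q[None, :]

def diversity_quality_score(designs, quality, eps=1e-6):
    """log det of the kernel: large only if the batch is diverse AND high quality.
    The negative of such a term is what would be added to the GAN loss."""
    L = dpp_kernel(designs, quality)
    _, logdet = np.linalg.slogdet(L + eps * np.eye(len(L)))
    return logdet

rng = np.random.default_rng(0)
spread = rng.uniform(-2.0, 2.0, size=(8, 2))          # diverse batch
clumped = spread[0] + 0.01 * rng.normal(size=(8, 2))  # near-duplicate batch
q = np.ones(8)
print(diversity_quality_score(spread, q) > diversity_quality_score(clumped, q))  # True
```

Raising the quality scores or spreading the designs apart both increase the log-determinant, which is why training against its negative pushes a generator toward diverse, high-performing batches rather than mode-collapsed ones.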
Related papers
- Diffusion Model for Data-Driven Black-Box Optimization [54.25693582870226]
We focus on diffusion models, a powerful generative AI technology, and investigate their potential for black-box optimization.
We study two practical types of labels: 1) noisy measurements of a real-valued reward function and 2) human preference based on pairwise comparisons.
Our proposed method reformulates the design optimization problem into a conditional sampling problem, which allows us to leverage the power of diffusion models.
arXiv Detail & Related papers (2024-03-20T00:41:12Z)
- Compositional Generative Inverse Design [69.22782875567547]
Inverse design, where we seek to design input variables in order to optimize an underlying objective function, is an important problem.
We show that by instead optimizing over the learned energy function captured by the diffusion model, we can avoid such adversarial examples.
In an N-body interaction task and a challenging 2D multi-airfoil design task, we demonstrate that by composing the learned diffusion model at test time, our method allows us to design initial states and boundary shapes.
arXiv Detail & Related papers (2024-01-24T01:33:39Z)
- E^2VPT: An Effective and Efficient Approach for Visual Prompt Tuning [55.50908600818483]
Fine-tuning large-scale pretrained vision models for new tasks has become increasingly parameter-intensive.
We propose an Effective and Efficient Visual Prompt Tuning (E2VPT) approach for large-scale transformer-based model adaptation.
Our approach outperforms several state-of-the-art baselines on two benchmarks.
arXiv Detail & Related papers (2023-07-25T19:03:21Z)
- Aligning Optimization Trajectories with Diffusion Models for Constrained Design Generation [17.164961143132473]
We introduce a learning framework that demonstrates the efficacy of aligning the sampling trajectory of diffusion models with the optimization trajectory derived from traditional physics-based methods.
Our method allows for generating feasible and high-performance designs in as few as two steps without the need for expensive preprocessing, external surrogate models, or additional labeled data.
Our results demonstrate that Trajectory Alignment (TA) outperforms state-of-the-art deep generative models on in-distribution configurations and halves the inference computational cost.
arXiv Detail & Related papers (2023-05-29T09:16:07Z)
- Parameter-Efficient Fine-Tuning Design Spaces [63.954953653386106]
We present a parameter-efficient fine-tuning design paradigm and discover design patterns that are applicable to different experimental settings.
We show experimentally that these methods consistently and significantly outperform investigated parameter-efficient fine-tuning strategies across different backbone models and different tasks in natural language processing.
arXiv Detail & Related papers (2023-01-04T21:00:18Z)
- A Pareto-optimal compositional energy-based model for sampling and optimization of protein sequences [55.25331349436895]
Deep generative models have emerged as a popular machine learning-based approach for inverse problems in the life sciences.
These problems often require sampling new designs that satisfy multiple properties of interest in addition to learning the data distribution.
arXiv Detail & Related papers (2022-10-19T19:04:45Z)
- Design Target Achievement Index: A Differentiable Metric to Enhance Deep Generative Models in Multi-Objective Inverse Design [4.091593765662773]
Design Target Achievement Index (DTAI) is a differentiable, tunable metric that scores a design's ability to achieve designer-specified minimum performance targets.
We apply DTAI to a Performance-Augmented Diverse GAN (PaDGAN) and demonstrate superior generative performance compared to a set of baseline Deep Generative Models.
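The idea of a differentiable target-achievement score can be sketched numerically. This is an illustrative stand-in, not the published DTAI formula: below a minimum target the achievement ratio contributes linearly (a strong gradient toward feasibility), above it the contribution saturates so already-met targets stop dominating, and the per-objective weights and `alpha` saturation rate are assumed names:

```python
import numpy as np

# Illustrative, differentiable target-achievement score in the spirit of DTAI.
# NOT the published formula: the saturating form, alpha, and all names below
# are assumptions made for illustration only.

def target_achievement(perf, targets, weights=None, alpha=2.0):
    """Score performance against minimum targets (targets assumed positive).

    Below a target the achievement ratio contributes linearly; above it the
    contribution saturates. Normalized so the supremum of the score is 1."""
    perf = np.asarray(perf, dtype=float)
    targets = np.asarray(targets, dtype=float)
    w = np.ones_like(perf) if weights is None else np.asarray(weights, dtype=float)
    r = perf / targets  # achievement ratio per objective
    s = np.where(r < 1.0, r, 1.0 + (1.0 - np.exp(-alpha * (r - 1.0))) / alpha)
    return float(np.sum(w * s) / np.sum(w * (1.0 + 1.0 / alpha)))

# One objective exceeds its target, the other falls short of it.
print(round(target_achievement([1.2, 0.5], targets=[1.0, 1.0]), 3))  # -> 0.555
```

Because every operation is smooth away from the target boundary, such a score can be backpropagated through when training or guiding a generative model toward designer-specified targets.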
arXiv Detail & Related papers (2022-05-06T04:14:34Z)
- Investigating Positive and Negative Qualities of Human-in-the-Loop Optimization for Designing Interaction Techniques [55.492211642128446]
Designers reportedly struggle with design optimization tasks where they are asked to find a combination of design parameters that maximizes a given set of objectives.
Model-based computational design algorithms assist designers by generating design examples during the design process.
Black box methods for assistance, on the other hand, can work with any design problem.
arXiv Detail & Related papers (2022-04-15T20:40:43Z)
- MO-PaDGAN: Generating Diverse Designs with Multivariate Performance Enhancement [13.866787416457454]
Deep generative models have proven useful for automatic design synthesis and design space exploration.
They face three challenges when applied to engineering design: 1) generated designs lack diversity, 2) it is difficult to explicitly improve all the performance measures of generated designs, and 3) existing models generally do not generate high-performance novel designs.
We propose MO-PaDGAN, which contains a new Determinantal Point Processes based loss function for probabilistic modeling of diversity and performances.
arXiv Detail & Related papers (2020-07-07T21:57:29Z)
- PaDGAN: A Generative Adversarial Network for Performance Augmented Diverse Designs [13.866787416457454]
We develop a variant of the Generative Adversarial Network, named "Performance Augmented Diverse Generative Adversarial Network" or PaDGAN, which can generate novel high-quality designs with good coverage of the design space.
In comparison to a vanilla Generative Adversarial Network, on average, it generates samples with a 28% higher mean quality score with larger diversity and without the mode collapse issue.
arXiv Detail & Related papers (2020-02-26T04:53:39Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.