Aligning Optimization Trajectories with Diffusion Models for Constrained
Design Generation
- URL: http://arxiv.org/abs/2305.18470v1
- Date: Mon, 29 May 2023 09:16:07 GMT
- Title: Aligning Optimization Trajectories with Diffusion Models for Constrained
Design Generation
- Authors: Giorgio Giannone, Akash Srivastava, Ole Winther, Faez Ahmed
- Abstract summary: We introduce a learning framework that demonstrates the efficacy of aligning the sampling trajectory of diffusion models with the optimization trajectory derived from traditional physics-based methods.
Our method allows for generating feasible and high-performance designs in as few as two steps without the need for expensive preprocessing, external surrogate models, or additional labeled data.
Our results demonstrate that TA outperforms state-of-the-art deep generative models on in-distribution configurations and halves the inference computational cost.
- Score: 17.164961143132473
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Generative models have had a profound impact on vision and language, paving
the way for a new era of multimodal generative applications. While these
successes have inspired researchers to explore using generative models in
science and engineering to accelerate the design process and reduce the
reliance on iterative optimization, challenges remain. Specifically,
engineering optimization methods based on physics still outperform generative
models when dealing with constrained environments where data is scarce and
precision is paramount. To address these challenges, we introduce Diffusion
Optimization Models (DOM) and Trajectory Alignment (TA), a learning framework
that demonstrates the efficacy of aligning the sampling trajectory of diffusion
models with the optimization trajectory derived from traditional physics-based
methods. This alignment ensures that the sampling process remains grounded in
the underlying physical principles. Our method allows for generating feasible
and high-performance designs in as few as two steps without the need for
expensive preprocessing, external surrogate models, or additional labeled data.
We apply our framework to structural topology optimization, a fundamental
problem in mechanical design, evaluating its performance on in- and
out-of-distribution configurations. Our results demonstrate that TA outperforms
state-of-the-art deep generative models on in-distribution configurations and
halves the inference computational cost. When coupled with a few steps of
optimization, it also improves manufacturability for out-of-distribution
conditions. By significantly improving performance and inference efficiency,
DOM enables us to generate high-quality designs in just a few steps and guide
them toward regions of high performance and manufacturability, paving the way
for the widespread application of generative models in large-scale data-driven
design.
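The core idea of Trajectory Alignment can be illustrated with a toy sketch: the intermediate denoised estimates produced along a diffusion model's sampling trajectory are matched, step by step, against the iterates of a classical physics-based optimizer, and the mismatch is penalized. The function names, the quadratic stand-in objective, and the synthetic "denoised estimates" below are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def optimizer_trajectory(x0, grad, lr=0.1, steps=10):
    """Iterates of plain gradient descent, standing in for a
    physics-based optimization trajectory (e.g., SIMP iterates)."""
    xs = [x0]
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)
        xs.append(x)
    return xs

def trajectory_alignment_loss(diffusion_preds, opt_traj):
    """Mean squared distance between matched points of the two
    trajectories; minimizing it pulls the sampling path toward
    the optimization path."""
    assert len(diffusion_preds) == len(opt_traj)
    return float(np.mean([np.sum((d - o) ** 2)
                          for d, o in zip(diffusion_preds, opt_traj)]))

# Toy quadratic "physics" objective f(x) = ||x||^2, so grad f = 2x.
x0 = np.ones(4)
opt_traj = optimizer_trajectory(x0, grad=lambda x: 2 * x, steps=5)

# Stand-in for the sampler's intermediate denoised estimates
# (in the real method these come from the diffusion model).
rng = np.random.default_rng(0)
diffusion_preds = [o + 0.01 * rng.standard_normal(4) for o in opt_traj]

loss = trajectory_alignment_loss(diffusion_preds, opt_traj)
print(round(loss, 6))
```

In training, a loss of this shape would be backpropagated into the diffusion model so that few-step sampling lands near late optimizer iterates, which is what enables feasible designs in as few as two steps.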
Related papers
- Bridging Model-Based Optimization and Generative Modeling via Conservative Fine-Tuning of Diffusion Models [54.132297393662654]
We introduce a hybrid method that fine-tunes cutting-edge diffusion models by optimizing reward models through RL.
We demonstrate the capability of our approach to outperform the best designs in offline data, leveraging the extrapolation capabilities of reward models.
arXiv Detail & Related papers (2024-05-30T03:57:29Z)
- Diffusion Model for Data-Driven Black-Box Optimization [54.25693582870226]
We focus on diffusion models, a powerful generative AI technology, and investigate their potential for black-box optimization.
We study two practical types of labels: 1) noisy measurements of a real-valued reward function and 2) human preference based on pairwise comparisons.
Our proposed method reformulates the design optimization problem into a conditional sampling problem, which allows us to leverage the power of diffusion models.
arXiv Detail & Related papers (2024-03-20T00:41:12Z)
- Generative VS non-Generative Models in Engineering Shape Optimization [0.3749861135832073]
We compare the effectiveness and efficiency of generative and non-generative models in constructing design spaces.
Non-generative models produce robust latent spaces with no, or significantly fewer, invalid designs compared to generative models.
arXiv Detail & Related papers (2024-02-13T15:45:20Z)
- Compositional Generative Inverse Design [69.22782875567547]
Inverse design, where we seek to design input variables in order to optimize an underlying objective function, is an important problem.
We show that by instead optimizing over the learned energy function captured by the diffusion model, we can avoid such adversarial examples.
In an N-body interaction task and a challenging 2D multi-airfoil design task, we demonstrate that by composing the learned diffusion model at test time, our method allows us to design initial states and boundary shapes.
arXiv Detail & Related papers (2024-01-24T01:33:39Z)
- Functional Graphical Models: Structure Enables Offline Data-Driven Optimization [111.28605744661638]
We show how structure can enable sample-efficient data-driven optimization.
We also present a data-driven optimization algorithm that infers the FGM structure itself.
arXiv Detail & Related papers (2024-01-08T22:33:14Z)
- Diffusion Generative Inverse Design [28.04683283070957]
Inverse design refers to the problem of optimizing the input of an objective function in order to enact a target outcome.
Recent developments in graph neural networks (GNNs) enable accurate, efficient, and differentiable estimation of simulator dynamics.
We show how denoising diffusion models can be used to solve inverse design problems efficiently and propose a particle sampling algorithm to further improve their efficiency.
arXiv Detail & Related papers (2023-09-05T08:32:07Z)
- Diffusing the Optimal Topology: A Generative Optimization Approach [6.375982344506753]
Topology optimization seeks to find the best design that satisfies a set of constraints while maximizing system performance.
Traditional iterative optimization methods like SIMP can be computationally expensive and get stuck in local minima.
We propose a Generative Optimization method that integrates classic optimization like SIMP as a refining mechanism for the topology generated by a deep generative model.
arXiv Detail & Related papers (2023-03-17T03:47:10Z)
- When to Update Your Model: Constrained Model-based Reinforcement Learning [50.74369835934703]
We propose a novel and general theoretical scheme for a non-decreasing performance guarantee of model-based RL (MBRL).
Our follow-up derived bounds reveal the relationship between model shifts and performance improvement.
A further example demonstrates that learning models from a dynamically-varying number of explorations benefit the eventual returns.
arXiv Detail & Related papers (2022-10-15T17:57:43Z)
- TopoDiff: A Performance and Constraint-Guided Diffusion Model for Topology Optimization [4.091593765662773]
TopoDiff is a conditional diffusion-model-based architecture for performance-aware and manufacturability-aware topology optimization.
Our method significantly outperforms a state-of-the-art conditional GAN, reducing the average error on physical performance by a factor of eight.
arXiv Detail & Related papers (2022-08-20T03:26:00Z)
- Conservative Objective Models for Effective Offline Model-Based Optimization [78.19085445065845]
Computational design problems arise in a number of settings, from synthetic biology to computer architectures.
We propose a method that learns a model of the objective function that lower bounds the actual value of the ground-truth objective on out-of-distribution inputs.
COMs are simple to implement and outperform a number of existing methods on a wide range of MBO problems.
arXiv Detail & Related papers (2021-07-14T17:55:28Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information it contains and is not responsible for any consequences of its use.