Latent Diffusion Models for Structural Component Design
- URL: http://arxiv.org/abs/2309.11601v2
- Date: Sun, 24 Sep 2023 17:22:18 GMT
- Title: Latent Diffusion Models for Structural Component Design
- Authors: Ethan Herron, Jaydeep Rade, Anushrut Jignasu, Baskar
Ganapathysubramanian, Aditya Balu, Soumik Sarkar, Adarsh Krishnamurthy
- Abstract summary: This paper proposes a framework for the generative design of structural components.
We employ a Latent Diffusion model to generate potential designs of a component that can satisfy a set of problem-specific loading conditions.
- Score: 11.342098118480802
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Recent advances in generative modeling, namely diffusion models, have
revolutionized the field, enabling high-quality image generation
tailored to user needs. This paper proposes a framework for the generative
design of structural components. Specifically, we employ a Latent Diffusion
model to generate potential designs of a component that can satisfy a set of
problem-specific loading conditions. One of the distinct advantages our
approach offers over other generative approaches, such as generative
adversarial networks (GANs), is that it permits the editing of existing
designs. We train our model using a dataset of geometries obtained from
structural topology optimization utilizing the SIMP algorithm. Consequently,
our framework generates inherently near-optimal designs. Our work presents
quantitative results that support the structural performance of the generated
designs and the variability in potential candidate designs. Furthermore, we
provide evidence of the scalability of our framework by operating over voxel
domains with resolutions varying from $32^3$ to $128^3$. Our framework can be
used as a starting point for generating novel near-optimal designs similar to
topology-optimized designs.
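The conditional generation described in the abstract can be illustrated with a minimal sketch. This is not the authors' code: the denoiser, the encoding of loading conditions, and the linear noise schedule are all illustrative stand-ins, showing only the standard DDPM-style reverse process on a latent design representation that would later be decoded into a voxel geometry.

```python
import numpy as np

def ddpm_sample(denoise_fn, cond, latent_shape, timesteps=50, seed=0):
    """Reverse-diffusion sampling of a latent design representation.

    denoise_fn(z_t, t, cond) predicts the noise added at step t,
    conditioned on problem-specific loading conditions `cond`
    (a hypothetical encoding, for illustration only).
    """
    rng = np.random.default_rng(seed)
    betas = np.linspace(1e-4, 0.02, timesteps)   # illustrative noise schedule
    alphas = 1.0 - betas
    alpha_bars = np.cumprod(alphas)

    z = rng.standard_normal(latent_shape)        # z_T ~ N(0, I)
    for t in reversed(range(timesteps)):
        eps = denoise_fn(z, t, cond)             # predicted noise
        # Standard DDPM posterior-mean update for z_{t-1} given z_t
        z = (z - betas[t] / np.sqrt(1.0 - alpha_bars[t]) * eps) / np.sqrt(alphas[t])
        if t > 0:                                # no noise at the final step
            z += np.sqrt(betas[t]) * rng.standard_normal(latent_shape)
    return z  # in the paper's setting, a decoder maps this to a voxel geometry

# Toy denoiser standing in for a trained conditional network.
def toy_denoiser(z_t, t, cond):
    return 0.1 * z_t + 0.01 * cond.mean()

loading_condition = np.ones(6)  # placeholder encoded load vector
latent = ddpm_sample(toy_denoiser, loading_condition, latent_shape=(4, 4, 4))
print(latent.shape)
```

The same loop also supports the editing of existing designs mentioned above: instead of starting from pure noise, one can partially noise an existing latent and run only the remaining reverse steps.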
Related papers
- Jet: A Modern Transformer-Based Normalizing Flow [62.2573739835562]
We revisit the design of the coupling-based normalizing flow models by carefully ablating prior design choices.
We achieve state-of-the-art quantitative and qualitative performance with a much simpler architecture.
arXiv Detail & Related papers (2024-12-19T18:09:42Z) - Cliqueformer: Model-Based Optimization with Structured Transformers [102.55764949282906]
Large neural networks excel at prediction tasks, but their application to design problems, such as protein engineering or materials discovery, requires solving offline model-based optimization (MBO) problems.
We present Cliqueformer, a transformer-based architecture that learns the black-box function's structure through functional graphical models (FGMs).
Across various domains, including chemical and genetic design tasks, Cliqueformer demonstrates superior performance compared to existing methods.
arXiv Detail & Related papers (2024-10-17T00:35:47Z) - Evolutive Rendering Models [91.99498492855187]
We present evolutive rendering models, a methodology where rendering models possess the ability to evolve and adapt dynamically throughout the rendering process.
In particular, we present a comprehensive learning framework that enables the optimization of three principal rendering elements.
A detailed analysis of gradient characteristics is performed to facilitate stable, goal-oriented evolution of these elements.
arXiv Detail & Related papers (2024-05-27T17:40:00Z) - Diffusion Model for Data-Driven Black-Box Optimization [54.25693582870226]
We focus on diffusion models, a powerful generative AI technology, and investigate their potential for black-box optimization.
We study two practical types of labels: 1) noisy measurements of a real-valued reward function and 2) human preference based on pairwise comparisons.
Our proposed method reformulates the design optimization problem into a conditional sampling problem, which allows us to leverage the power of diffusion models.
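The reformulation of design optimization as conditional sampling can be sketched as reward-guided diffusion. Everything here is an illustrative assumption, not that paper's method: a toy denoiser, a hand-written reward gradient, and a simple guidance term added to each reverse step.

```python
import numpy as np

def guided_sample(denoise_fn, reward_grad, shape, steps=50, guidance=0.1, seed=0):
    """Bias each reverse-diffusion step toward high reward.

    reward_grad(x) is a stand-in for the gradient of a learned
    (possibly noisy) reward model over designs.
    """
    rng = np.random.default_rng(seed)
    betas = np.linspace(1e-4, 0.02, steps)       # illustrative schedule
    alphas = 1.0 - betas
    alpha_bars = np.cumprod(alphas)

    x = rng.standard_normal(shape)
    for t in reversed(range(steps)):
        eps = denoise_fn(x, t)
        x = (x - betas[t] / np.sqrt(1.0 - alpha_bars[t]) * eps) / np.sqrt(alphas[t])
        x += guidance * betas[t] * reward_grad(x)  # steer toward high reward
        if t > 0:
            x += np.sqrt(betas[t]) * rng.standard_normal(shape)
    return x

# Toy instances: linear denoiser, reward peaked at x = 2.
x = guided_sample(lambda x, t: 0.1 * x, lambda x: -(x - 2.0), shape=(8,))
print(x.shape)
```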
arXiv Detail & Related papers (2024-03-20T00:41:12Z) - DecompOpt: Controllable and Decomposed Diffusion Models for Structure-based Molecular Optimization [49.85944390503957]
DecompOpt is a structure-based molecular optimization method based on a controllable and decomposed diffusion model.
We show that DecompOpt can efficiently generate molecules with properties improved over strong de novo baselines.
arXiv Detail & Related papers (2024-03-07T02:53:40Z) - Generative VS non-Generative Models in Engineering Shape Optimization [0.3749861135832073]
We compare the effectiveness and efficiency of generative and non-generative models in constructing design spaces.
Non-generative models produce robust latent spaces with no or significantly fewer invalid designs compared to generative models.
arXiv Detail & Related papers (2024-02-13T15:45:20Z) - Compositional Generative Inverse Design [69.22782875567547]
Inverse design, where we seek to design input variables in order to optimize an underlying objective function, is an important problem.
We show that by instead optimizing over the learned energy function captured by the diffusion model, we can avoid such adversarial examples.
In an N-body interaction task and a challenging 2D multi-airfoil design task, we demonstrate that by composing the learned diffusion model at test time, our method allows us to design initial states and boundary shapes.
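The idea of optimizing over composed learned energies, rather than a forward surrogate prone to adversarial inputs, can be sketched with toy energies. The quadratics below are hypothetical stand-ins for energy functions captured by diffusion models; the optimizer simply descends their sum via finite-difference gradients.

```python
import numpy as np

def energy_a(x):
    return np.sum((x - 1.0) ** 2)   # toy stand-in, e.g. one design objective

def energy_b(x):
    return np.sum((x + 0.5) ** 2)   # toy stand-in for a second objective

def num_grad(f, x, eps=1e-5):
    """Central finite-difference gradient of f at x."""
    g = np.zeros_like(x)
    for i in range(x.size):
        d = np.zeros_like(x)
        d[i] = eps
        g[i] = (f(x + d) - f(x - d)) / (2.0 * eps)
    return g

x = np.zeros(3)  # design variables, e.g. boundary-shape parameters
for _ in range(500):
    # Descend the *sum* of the learned energies: composition at test time.
    x -= 0.05 * (num_grad(energy_a, x) + num_grad(energy_b, x))
print(x)  # converges to 0.25 per coordinate, the minimizer of the sum
```

For these quadratics the composed gradient is 4x - 1, so the fixed point is x = 0.25, which the loop reaches well within 500 steps.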
arXiv Detail & Related papers (2024-01-24T01:33:39Z) - Aligning Optimization Trajectories with Diffusion Models for Constrained Design Generation [17.164961143132473]
We introduce a learning framework that demonstrates the efficacy of aligning the sampling trajectory of diffusion models with the optimization trajectory derived from traditional physics-based methods.
Our method allows for generating feasible and high-performance designs in as few as two steps without the need for expensive preprocessing, external surrogate models, or additional labeled data.
Our results demonstrate that Trajectory Alignment (TA) outperforms state-of-the-art deep generative models on in-distribution configurations and halves the inference computational cost.
arXiv Detail & Related papers (2023-05-29T09:16:07Z) - Hierarchical Deep Generative Models for Design Under Free-Form Geometric Uncertainty [7.362287148334665]
We propose a Generative Adversarial Network-based Design under Uncertainty Framework (GAN-DUF)
It simultaneously learns a compact representation of nominal (ideal) designs and the conditional distribution of fabricated designs given any nominal design.
We can combine the proposed deep generative model with robust design optimization or reliability-based design optimization for design under uncertainty.
arXiv Detail & Related papers (2022-02-21T22:21:07Z) - PaDGAN: A Generative Adversarial Network for Performance Augmented Diverse Designs [13.866787416457454]
We develop a variant of the Generative Adversarial Network, named "Performance Augmented Diverse Generative Adversarial Network" or PaDGAN, which can generate novel high-quality designs with good coverage of the design space.
In comparison to a vanilla Generative Adversarial Network, on average, it generates samples with a 28% higher mean quality score with larger diversity and without the mode collapse issue.
arXiv Detail & Related papers (2020-02-26T04:53:39Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.