Generative Topology Optimization: Exploring Diverse Solutions in Structural Design
- URL: http://arxiv.org/abs/2502.13174v1
- Date: Mon, 17 Feb 2025 21:24:18 GMT
- Title: Generative Topology Optimization: Exploring Diverse Solutions in Structural Design
- Authors: Andreas Radler, Eric Volkmann, Johannes Brandstetter, Arturs Berzins
- Abstract summary: Topology optimization (TO) is a family of computational methods that derive near-optimal geometries from formal problem descriptions.
We introduce Generative Topology Optimization (GenTO) - a data-free method that trains a neural network to generate structurally compliant shapes.
We show that GenTO produces more diverse solutions than any prior method while maintaining near-optimality.
- Score: 11.585099298173285
- License:
- Abstract: Topology optimization (TO) is a family of computational methods that derive near-optimal geometries from formal problem descriptions. Despite their success, established TO methods are limited to generating single solutions, restricting the exploration of alternative designs. To address this limitation, we introduce Generative Topology Optimization (GenTO) - a data-free method that trains a neural network to generate structurally compliant shapes and explores diverse solutions through an explicit diversity constraint. The network is trained with a solver-in-the-loop, optimizing the material distribution in each iteration. The trained model produces diverse shapes that closely adhere to the design requirements. We validate GenTO on 2D and 3D TO problems. Our results demonstrate that GenTO produces more diverse solutions than any prior method while maintaining near-optimality and being an order of magnitude faster due to inherent parallelism. These findings open new avenues for engineering and design, offering enhanced flexibility and innovation in structural optimization.
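The abstract outlines the mechanism: a network generates material distributions, a physics solver is queried in every training iteration, and an explicit diversity constraint keeps the generated shapes distinct. Below is a minimal, hypothetical PyTorch sketch of such a solver-in-the-loop routine; the network architecture, the placeholder compliance function, the volume penalty, and the pairwise-overlap diversity term are all illustrative assumptions rather than the authors' implementation.

```python
import torch
import torch.nn as nn

# Hypothetical sketch of a GenTO-style, data-free training loop.
# A network maps a latent "solution index" z and query coordinates to a
# material density in [0, 1]. Each iteration calls a (placeholder) solver
# for compliance and adds an explicit diversity penalty that pushes the
# generated shapes apart.

class DensityField(nn.Module):
    def __init__(self, latent_dim=8, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(3 + latent_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1), nn.Sigmoid(),  # density in [0, 1]
        )

    def forward(self, coords, z):
        z = z.expand(coords.shape[0], -1)
        return self.net(torch.cat([coords, z], dim=-1)).squeeze(-1)

def compliance(densities):
    # Placeholder for the solver-in-the-loop (in the real method this
    # would be a finite-element solve); a stand-in so the sketch runs.
    return 1.0 / (densities.mean() + 1e-3)

def train(model, coords, n_solutions=4, vol_target=0.4, steps=200,
          w_vol=10.0, w_div=1.0):
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    zs = torch.randn(n_solutions, 8)  # one latent per desired solution
    for _ in range(steps):
        fields = torch.stack([model(coords, z[None]) for z in zs])
        loss = sum(compliance(f) for f in fields)                       # structural objective
        loss = loss + w_vol * ((fields.mean(dim=1) - vol_target) ** 2).sum()  # volume constraint
        # Explicit diversity term: penalize pairwise overlap of shapes.
        sim = 0.0
        for i in range(n_solutions):
            for j in range(i + 1, n_solutions):
                sim = sim + (fields[i] * fields[j]).mean()
        loss = loss + w_div * sim
        opt.zero_grad()
        loss.backward()
        opt.step()

coords = torch.rand(1024, 3)  # sample points in the design domain
train(DensityField(), coords)
```

In the actual method the compliance placeholder would be an FEM solve whose sensitivities drive the material update each iteration, and the latents can be optimized in parallel, which is where the reported order-of-magnitude speedup comes from.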
Related papers
- Training Deep Learning Models with Norm-Constrained LMOs [56.00317694850397]
We study optimization methods that leverage the linear minimization oracle (LMO) over a norm-ball.
We propose a new family of algorithms that uses the LMO to adapt to the geometry of the problem and, perhaps surprisingly, show that they can be applied to unconstrained problems.
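For intuition, the linear minimization oracle (LMO) over an l2 norm-ball has a closed form: it returns the extreme point of the ball most aligned with the negative gradient. The snippet below is a small illustrative sketch of a Frank-Wolfe-style update built on such an oracle; the radius, the 2/(t+2) step size, and the toy quadratic objective are assumptions for illustration, not the algorithms proposed in the paper.

```python
import numpy as np

def lmo_l2_ball(grad, radius):
    """LMO over an l2 ball: argmin_{||s|| <= radius} <grad, s>
    = -radius * grad / ||grad||."""
    norm = np.linalg.norm(grad)
    return np.zeros_like(grad) if norm == 0 else -radius * grad / norm

def frank_wolfe(grad_fn, x0, radius=1.0, steps=100):
    """Frank-Wolfe-style iterations that move toward the LMO output.
    Placeholder 2/(t+2) step size; the paper's methods differ."""
    x = x0.copy()
    for t in range(steps):
        s = lmo_l2_ball(grad_fn(x), radius)
        gamma = 2.0 / (t + 2.0)
        x = (1 - gamma) * x + gamma * s
    return x

# Toy usage: minimize ||x - (2, 0)||^2 over the unit l2 ball.
x_star = frank_wolfe(lambda x: 2 * (x - np.array([2.0, 0.0])),
                     x0=np.zeros(2))
```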
arXiv Detail & Related papers (2025-02-11T13:10:34Z)
- Integrating Optimization Theory with Deep Learning for Wireless Network Design [38.257335693563554]
Traditional wireless network design relies on optimization algorithms derived from domain-specific mathematical models.
Deep learning has emerged as a promising alternative to overcome complexity and adaptability concerns.
This paper introduces a novel approach that integrates optimization theory with deep learning methodologies to address these issues.
arXiv Detail & Related papers (2024-12-11T20:27:48Z)
- Learning Joint Models of Prediction and Optimization [56.04498536842065]
The Predict-Then-Optimize framework uses machine learning models to predict unknown parameters of an optimization problem from features before solving.
This paper proposes an alternative method, in which optimal solutions are learned directly from the observable features by joint predictive models.
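To make the contrast concrete, here is a small hypothetical sketch: the two-stage (predict-then-optimize) baseline regresses the unknown cost vector from features and then calls an LP solver, while a joint model maps features directly to a feasible solution. The toy LP, data, and models are invented for illustration and are not taken from the paper.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)

# Toy setting: features determine the (unobserved) cost vector of an LP
#   min c^T x  s.t.  x >= 0, sum(x) = 1   (a simple selection problem).
W_true = rng.normal(size=(3, 5))      # hidden feature-to-cost map
feats = rng.normal(size=(200, 3))     # observed features
costs = feats @ W_true                # unknown LP parameters c

# Predict-then-optimize: fit a regressor for c, then solve the LP.
W_hat, *_ = np.linalg.lstsq(feats, costs, rcond=None)

def solve_lp(c):
    res = linprog(c, A_eq=np.ones((1, len(c))), b_eq=[1.0],
                  bounds=[(0, None)] * len(c))
    return res.x

x_two_stage = solve_lp(feats[0] @ W_hat)

# A joint model would instead be trained to output x directly from the
# features (e.g. via a softmax head), skipping the explicit solver call.
def joint_model(f, W):
    scores = -(f @ W)                 # learned scores (here: reuse W_hat)
    e = np.exp(scores - scores.max())
    return e / e.sum()                # feasible by construction

x_joint = joint_model(feats[0], W_hat)
```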
arXiv Detail & Related papers (2024-09-07T19:52:14Z)
- Diffusion Model for Data-Driven Black-Box Optimization [54.25693582870226]
We focus on diffusion models, a powerful generative AI technology, and investigate their potential for black-box optimization.
We study two practical types of labels: 1) noisy measurements of a real-valued reward function and 2) human preference based on pairwise comparisons.
Our proposed method reformulates the design optimization problem into a conditional sampling problem, which allows us to leverage the power of diffusion models.
arXiv Detail & Related papers (2024-03-20T00:41:12Z)
- Evolutionary Alternating Direction Method of Multipliers for Constrained Multi-Objective Optimization with Unknown Constraints [17.392113376816788]
Constrained multi-objective optimization problems (CMOPs) pervade real-world applications in science, engineering, and design.
We present a first-of-its-kind evolutionary optimization framework, inspired by the principles of the alternating direction method of multipliers, that decouples objective and constraint functions.
Our framework tackles CMOPs with unknown constraints by reformulating the original problem into an additive form of two subproblems, each of which is allotted a dedicated evolutionary population.
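For background, the classical ADMM splitting that the framework is inspired by introduces an auxiliary copy of the variables so that the objective and the constraints can be handled in separate subproblems. The generic textbook form is sketched below; the paper's evolutionary, population-based instantiation differs in detail.

```latex
% Textbook ADMM splitting (generic form, not the paper's exact algorithm).
\begin{align*}
  &\min_{x} F(x) \quad \text{s.t.} \quad G(x) \le 0
  && \text{original constrained problem} \\
  &\min_{x,z} F(x) + I_{\{G \le 0\}}(z) \quad \text{s.t.} \quad x = z
  && \text{additive two-block reformulation} \\
  &L_\rho(x,z,\lambda) = F(x) + I_{\{G \le 0\}}(z)
    + \lambda^{\top}(x - z) + \tfrac{\rho}{2}\lVert x - z \rVert^2
  && \text{augmented Lagrangian} \\
  &x^{k+1} = \arg\min_x L_\rho(x, z^k, \lambda^k), \quad
   z^{k+1} = \arg\min_z L_\rho(x^{k+1}, z, \lambda^k), \quad
   \lambda^{k+1} = \lambda^k + \rho\,(x^{k+1} - z^{k+1})
  && \text{alternating updates}
\end{align*}
```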
arXiv Detail & Related papers (2024-01-02T00:38:20Z)
- Analyzing and Enhancing the Backward-Pass Convergence of Unrolled Optimization [50.38518771642365]
The integration of constrained optimization models as components in deep networks has led to promising advances on many specialized learning tasks.
A central challenge in this setting is backpropagation through the solution of an optimization problem, which often lacks a closed form.
This paper provides theoretical insights into the backward pass of unrolled optimization, showing that it is equivalent to the solution of a linear system by a particular iterative method.
A system called Folded Optimization is proposed to construct more efficient backpropagation rules from unrolled solver implementations.
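The "equivalent to the solution of a linear system" statement matches the standard implicit-differentiation view of unrolled optimization, summarized generically below; this is background intuition, not the paper's specific derivation.

```latex
% Standard implicit-differentiation identity behind unrolled backpropagation.
% Inner problem: x^*(\theta) = \arg\min_x f(x, \theta),
% so the stationarity condition \nabla_x f(x^*(\theta), \theta) = 0 holds.
% Differentiating through this condition gives the linear system
\nabla^2_{xx} f\bigl(x^*(\theta), \theta\bigr)\,
\frac{\partial x^*}{\partial \theta}
= -\,\nabla^2_{x\theta} f\bigl(x^*(\theta), \theta\bigr),
% and backpropagating through T unrolled solver iterations can be read as
% an iterative method converging to the solution of this system.
```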
arXiv Detail & Related papers (2023-12-28T23:15:18Z)
- Diffusing the Optimal Topology: A Generative Optimization Approach [6.375982344506753]
Topology optimization seeks to find the best design that satisfies a set of constraints while maximizing system performance.
Traditional iterative optimization methods like SIMP can be computationally expensive and get stuck in local minima.
We propose a Generative Optimization method that integrates classic optimization like SIMP as a refining mechanism for the topology generated by a deep generative model.
arXiv Detail & Related papers (2023-03-17T03:47:10Z)
- Backpropagation of Unrolled Solvers with Folded Optimization [55.04219793298687]
The integration of constrained optimization models as components in deep networks has led to promising advances on many specialized learning tasks.
One typical strategy is algorithm unrolling, which relies on automatic differentiation through the operations of an iterative solver.
This paper provides theoretical insights into the backward pass of unrolled optimization, leading to a system for generating efficiently solvable analytical models of backpropagation.
arXiv Detail & Related papers (2023-01-28T01:50:42Z)
- Automated Circuit Sizing with Multi-objective Optimization based on Differential Evolution and Bayesian Inference [1.1579778934294358]
We introduce a design optimization method based on Generalized Differential Evolution 3 (GDE3) and Gaussian Processes (GPs).
The proposed method is able to perform sizing for complex circuits with a large number of design variables and many conflicting objectives to be optimized.
We evaluate the introduced method on two voltage regulators showing different levels of complexity.
arXiv Detail & Related papers (2022-06-06T06:48:45Z)
- Optimization-Inspired Learning with Architecture Augmentations and Control Mechanisms for Low-Level Vision [74.9260745577362]
This paper proposes a unified optimization-inspired learning framework to aggregate Generative, Discriminative, and Corrective (GDC) principles.
We construct three propagative modules to effectively solve the optimization models with flexible combinations.
Experiments across varied low-level vision tasks validate the efficacy and adaptability of GDC.
arXiv Detail & Related papers (2020-12-10T03:24:53Z)
- Physics-consistent deep learning for structural topology optimization [8.391633158275692]
Topology optimization has emerged as a popular approach to refine a component's design and increase its performance.
Current state-of-the-art topology optimization frameworks are compute-intensive.
In this paper, we explore a deep learning-based framework for performing topology optimization for three-dimensional geometries with a reasonably fine (high) resolution.
arXiv Detail & Related papers (2020-12-09T23:05:55Z)
This list is automatically generated from the titles and abstracts of the papers on this site.