Robust Topology Optimization Using Variational Autoencoders
- URL: http://arxiv.org/abs/2107.10661v1
- Date: Mon, 19 Jul 2021 20:40:51 GMT
- Title: Robust Topology Optimization Using Variational Autoencoders
- Authors: Rini Jasmine Gladstone, Mohammad Amin Nabian, Vahid Keshavarzzadeh,
Hadi Meidani
- Abstract summary: In this work, we use neural network surrogates to enable a faster solution approach via surrogate-based optimization.
We also build a Variational Autoencoder (VAE) to transform the high dimensional design space into a low dimensional one.
The resulting gradient-based optimization algorithm produces optimal designs with lower robust compliances than those observed in the training set.
- Score: 2.580765958706854
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Topology Optimization is the process of finding the optimal arrangement of
materials within a design domain by minimizing a cost function, subject to some
performance constraints. Robust topology optimization (RTO) also incorporates
the effect of input uncertainties and produces a design with the best average
performance of the structure while reducing the response sensitivity to input
uncertainties. It is computationally expensive to carry out RTO using finite
element analysis and Monte Carlo sampling. In this work, we use neural network
surrogates to enable a faster solution approach via surrogate-based
optimization, and build a Variational Autoencoder (VAE) to transform the
high dimensional design space into a low dimensional one. Furthermore, the
finite element solver is replaced by a neural network surrogate. Also, to
further facilitate the design exploration, we limit our search to a subspace,
which consists of designs that are solutions to deterministic topology
optimization problems under different realizations of input uncertainties. With
these neural network approximations, a gradient-based optimization approach is
formed to minimize the predicted objective function over the low dimensional
design subspace. We demonstrate the effectiveness of the proposed approach on
two compliance minimization problems and show that the VAE performs well at
learning the features of the design from minimal training data, and that
converting the design space into a low dimensional latent space makes the
problem computationally efficient. The resulting gradient-based optimization
algorithm produces optimal designs with lower robust compliances than those
observed in the training set.
Related papers
- A surrogate model for topology optimisation of elastic structures via parametric autoencoders [0.0]
Instead of learning the parametric solution of the state (and adjoint) problems, the proposed approach devises a surrogate version of the entire optimisation pipeline. The method predicts a quasi-optimal topology for a given problem configuration as a surrogate model of high-fidelity topologies optimised with the homogenisation method. Different architectures are proposed and the approximation and generalisation capabilities of the resulting models are numerically evaluated.
arXiv Detail & Related papers (2025-07-30T10:07:42Z) - Preference Optimization for Combinatorial Optimization Problems [54.87466279363487]
Reinforcement Learning (RL) has emerged as a powerful tool for neural optimization, enabling models that learn to solve complex problems without requiring expert knowledge. Despite significant progress, existing RL approaches face challenges such as diminishing reward signals and inefficient exploration in vast action spaces. We propose Preference Optimization, a novel method that transforms quantitative reward signals into qualitative preference signals via statistical comparison modeling.
arXiv Detail & Related papers (2025-05-13T16:47:00Z) - Simultaneous and Meshfree Topology Optimization with Physics-informed Gaussian Processes [0.0]
Topology optimization (TO) provides a principled mathematical approach for optimizing the performance of a structure by designing its material spatial distribution in a pre-defined domain and subject to a set of constraints.
We develop a new class of TO methods based on the framework of Gaussian processes (GPs) whose mean functions are parameterized via deep neural networks.
To test our method against conventional TO approaches implemented in commercial software, we evaluate it on four problems involving the minimization of dissipated power in Stokes flow.
arXiv Detail & Related papers (2024-08-07T01:01:35Z) - Federated Conditional Stochastic Optimization [110.513884892319]
Conditional stochastic optimization has found applications in a wide range of machine learning tasks, such as invariant learning, AUPRC maximization, and MAML.
This paper proposes algorithms for distributed federated learning.
arXiv Detail & Related papers (2023-10-04T01:47:37Z) - Bayesian Quality-Diversity approaches for constrained optimization
problems with mixed continuous, discrete and categorical variables [0.3626013617212667]
A new Quality-Diversity methodology based on mixed variables is proposed in the context of a limited simulation budget.
The proposed approach provides valuable trade-offs for decision-makers in complex system design.
arXiv Detail & Related papers (2023-09-11T14:29:47Z) - A Multi-Head Ensemble Multi-Task Learning Approach for Dynamical
Computation Offloading [62.34538208323411]
We propose a multi-head ensemble multi-task learning (MEMTL) approach with a shared backbone and multiple prediction heads (PHs).
MEMTL outperforms benchmark methods in both the inference accuracy and mean square error without requiring additional training data.
arXiv Detail & Related papers (2023-09-02T11:01:16Z) - Backpropagation of Unrolled Solvers with Folded Optimization [55.04219793298687]
The integration of constrained optimization models as components in deep networks has led to promising advances on many specialized learning tasks.
One typical strategy is algorithm unrolling, which relies on automatic differentiation through the operations of an iterative solver.
This paper provides theoretical insights into the backward pass of unrolled optimization, leading to a system for generating efficiently solvable analytical models of backpropagation.
arXiv Detail & Related papers (2023-01-28T01:50:42Z) - Multi-objective robust optimization using adaptive surrogate models for
problems with mixed continuous-categorical parameters [0.0]
Robust design optimization is traditionally considered when uncertainties mainly affect the objective function.
The resulting nested optimization problem may be solved using a general-purpose solver, herein the non-dominated sorting genetic algorithm (NSGA-II).
The proposed approach consists of sequentially carrying out NSGA-II while using an adaptively built Kriging model to estimate the quantiles.
arXiv Detail & Related papers (2022-03-03T20:23:18Z) - Offline Model-Based Optimization via Normalized Maximum Likelihood
Estimation [101.22379613810881]
We consider data-driven optimization problems where one must maximize a function given only queries at a fixed set of points.
This problem setting emerges in many domains where function evaluation is a complex and expensive process.
We propose a tractable approximation that allows us to scale our method to high-capacity neural network models.
arXiv Detail & Related papers (2021-02-16T06:04:27Z) - An AI-Assisted Design Method for Topology Optimization Without
Pre-Optimized Training Data [68.8204255655161]
An AI-assisted design method based on topology optimization is presented, which is able to obtain optimized designs in a direct way.
Designs are provided by an artificial neural network, the predictor, on the basis of boundary conditions and degree of filling as input data.
arXiv Detail & Related papers (2020-12-11T14:33:27Z) - Efficient and Sparse Neural Networks by Pruning Weights in a
Multiobjective Learning Approach [0.0]
We propose a multiobjective perspective on the training of neural networks by treating its prediction accuracy and the network complexity as two individual objective functions.
Preliminary numerical results on exemplary convolutional neural networks confirm that large reductions in the complexity of neural networks with negligible loss of accuracy are possible.
arXiv Detail & Related papers (2020-08-31T13:28:03Z) - Optimal Bayesian experimental design for subsurface flow problems [77.34726150561087]
We propose a novel approach for the development of a polynomial chaos expansion (PCE) surrogate model for the design utility function.
This novel technique enables the derivation of a reasonable quality response surface for the targeted objective function with a computational budget comparable to several single-point evaluations.
arXiv Detail & Related papers (2020-08-10T09:42:59Z) - Objective-Sensitive Principal Component Analysis for High-Dimensional
Inverse Problems [0.0]
We present a novel approach for adaptive, differentiable parameterization of large-scale random fields.
The developed technique is based on principal component analysis (PCA) but modifies a purely data-driven basis of principal components considering objective function behavior.
Three algorithms for optimal parameter decomposition are presented and applied to an objective of 2D synthetic history matching.
arXiv Detail & Related papers (2020-06-02T18:51:17Z)