Topology optimization of 2D structures with nonlinearities using deep
learning
- URL: http://arxiv.org/abs/2002.01896v4
- Date: Mon, 13 Apr 2020 18:51:11 GMT
- Title: Topology optimization of 2D structures with nonlinearities using deep
learning
- Authors: Diab W. Abueidda, Seid Koric, Nahil A. Sobh
- Abstract summary: Cloud computing has made it possible to search for optimal nonlinear structures.
We develop convolutional neural network models to predict optimized designs.
The developed models are capable of accurately predicting the optimized designs without requiring an iterative scheme.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The field of optimal design of linear elastic structures has seen many
exciting successes that resulted in new architected materials and structural
designs. With the availability of cloud computing, including high-performance
computing, machine learning, and simulation, searching for optimal nonlinear
structures is now within reach. In this study, we develop convolutional neural
network models to predict optimized designs for a given set of boundary
conditions, loads, and optimization constraints. We have considered the case of
materials with a linear elastic response with and without stress constraint.
Also, we have considered the case of materials with a hyperelastic response,
where material and geometric nonlinearities are involved. For the nonlinear
elastic case, the neo-Hookean model is utilized. For this purpose, we generate
datasets composed of the optimized designs paired with the corresponding
boundary conditions, loads, and constraints, using a topology optimization
framework to train and validate the neural network models. The developed models
are capable of accurately predicting the optimized designs without requiring an
iterative scheme and with negligible inference computational time. The
suggested pipeline can be generalized to other nonlinear mechanics scenarios
and design domains.
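
The abstract notes that the nonlinear elastic case uses the neo-Hookean model. For reference, one common compressible neo-Hookean strain energy density (the specific variant used in the paper is not stated in the abstract, so this is an illustrative form) is

```latex
W(\mathbf{F}) = \frac{\mu}{2}\left(I_1 - 3\right) - \mu \ln J + \frac{\lambda}{2}\left(\ln J\right)^2,
\qquad I_1 = \operatorname{tr}\!\left(\mathbf{F}^{\mathsf T}\mathbf{F}\right), \quad J = \det \mathbf{F},
```

where \mathbf{F} is the deformation gradient and \mu, \lambda are the Lamé parameters.

The pipeline itself maps per-problem data (boundary conditions, loads, and optimization constraints) to a predicted density field. The PyTorch sketch below illustrates the general idea of such an encoder-decoder CNN surrogate; the input encoding, channel counts, and layer sizes are assumptions for illustration, not the authors' exact architecture.

```python
# Illustrative encoder-decoder CNN: condition fields in, density field out.
import torch
import torch.nn as nn

class TopOptCNN(nn.Module):
    def __init__(self, in_channels: int = 4):
        super().__init__()
        # Encoder: downsample the condition fields to a coarse latent grid.
        self.encoder = nn.Sequential(
            nn.Conv2d(in_channels, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
        )
        # Decoder: upsample back to the design-domain resolution.
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, kernel_size=4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 1, kernel_size=4, stride=2, padding=1),
            nn.Sigmoid(),  # material densities in [0, 1]
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.decoder(self.encoder(x))

# Example: 8 problems on a 64x64 grid with 4 assumed condition channels
# (fixed-DOF mask, load-x, load-y, target volume fraction broadcast per pixel).
model = TopOptCNN(in_channels=4)
conditions = torch.rand(8, 4, 64, 64)
predicted_density = model(conditions)  # shape: (8, 1, 64, 64)
```

In the setting the abstract describes, such a model would be trained with a pixel-wise loss (e.g., mean squared error) against optimized designs produced by a conventional topology optimization solver, so that inference replaces the iterative optimization at negligible computational cost.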
Related papers
- Cliqueformer: Model-Based Optimization with Structured Transformers [102.55764949282906]
We develop a model that learns the structure of an MBO task and empirically leads to improved designs.
We evaluate Cliqueformer on various tasks, ranging from high-dimensional black-box functions to real-world tasks of chemical and genetic design.
arXiv Detail & Related papers (2024-10-17T00:35:47Z) - Diffusion Model for Data-Driven Black-Box Optimization [54.25693582870226]
We focus on diffusion models, a powerful generative AI technology, and investigate their potential for black-box optimization.
We study two practical types of labels: 1) noisy measurements of a real-valued reward function and 2) human preference based on pairwise comparisons.
Our proposed method reformulates the design optimization problem into a conditional sampling problem, which allows us to leverage the power of diffusion models.
arXiv Detail & Related papers (2024-03-20T00:41:12Z) - Functional Graphical Models: Structure Enables Offline Data-Driven Optimization [111.28605744661638]
We show how structure can enable sample-efficient data-driven optimization.
We also present a data-driven optimization algorithm that infers the FGM structure itself.
arXiv Detail & Related papers (2024-01-08T22:33:14Z) - The Convex Landscape of Neural Networks: Characterizing Global Optima
and Stationary Points via Lasso Models [75.33431791218302]
Deep Neural Network (DNN) models are widely used but are trained via non-convex optimization.
In this paper, we examine the use of convex neural recovery models.
We show that all stationary points of the non-convex objective can be characterized as global optima of subsampled convex programs.
arXiv Detail & Related papers (2023-12-19T23:04:56Z) - Neural Metamaterial Networks for Nonlinear Material Design [29.65492571110993]
We propose Metamaterial Networks -- neural representations that encode the nonlinear mechanics of entire metamaterial families.
We use this approach to automatically design materials with desired stress-strain curves, prescribed directional stiffness, and Poisson's ratio profiles.
arXiv Detail & Related papers (2023-09-15T13:50:43Z) - Optimistic Estimate Uncovers the Potential of Nonlinear Models [3.0041514772139166]
We propose an optimistic estimate to evaluate the best possible fitting performance of nonlinear models.
We estimate the optimistic sample sizes for matrix factorization models, deep models, and deep neural networks (DNNs) with fully-connected or convolutional architecture.
arXiv Detail & Related papers (2023-07-18T01:37:57Z) - Deep convolutional neural network for shape optimization using level-set
approach [0.0]
This article presents a reduced-order modeling methodology for shape optimization applications via deep convolutional neural networks (CNNs).
A CNN-based reduced-order model (ROM) is constructed in a completely data-driven manner and is suited for non-intrusive applications.
arXiv Detail & Related papers (2022-01-17T04:41:51Z) - LQF: Linear Quadratic Fine-Tuning [114.3840147070712]
We present the first method for linearizing a pre-trained model that achieves comparable performance to non-linear fine-tuning.
LQF consists of simple modifications to the architecture, loss function and optimization typically used for classification.
arXiv Detail & Related papers (2020-12-21T06:40:20Z) - An AI-Assisted Design Method for Topology Optimization Without
Pre-Optimized Training Data [68.8204255655161]
An AI-assisted design method based on topology optimization is presented, which is able to obtain optimized designs in a direct way.
Designs are provided by an artificial neural network, the predictor, on the basis of boundary conditions and degree of filling as input data.
arXiv Detail & Related papers (2020-12-11T14:33:27Z) - QRnet: optimal regulator design with LQR-augmented neural networks [2.8725913509167156]
We propose a new computational method for designing optimal regulators for high-dimensional nonlinear systems.
The proposed approach leverages physics-informed machine learning to solve high-dimensional Hamilton-Jacobi-Bellman equations.
We train the augmented models on data generated without discretizing the state space, enabling application to high-dimensional problems.
arXiv Detail & Related papers (2020-09-11T23:50:17Z) - The role of optimization geometry in single neuron learning [12.891722496444036]
Recent experiments have demonstrated that the choice of optimization geometry can impact generalization performance when learning expressive neural network models.
We show how the interplay between the optimization geometry and the feature geometry determines out-of-sample performance.
arXiv Detail & Related papers (2020-06-15T17:39:44Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences of its use.