Multi-scale Topology Optimization using Neural Networks
- URL: http://arxiv.org/abs/2404.08708v1
- Date: Thu, 11 Apr 2024 18:00:22 GMT
- Title: Multi-scale Topology Optimization using Neural Networks
- Authors: Hongrui Chen, Xingchen Liu, Levent Burak Kara
- Abstract summary: A long-standing challenge is designing multi-scale structures with good connectivity between cells.
We propose a new method for direct multi-scale topology optimization using neural networks.
- Score: 2.8154992696398784
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: A long-standing challenge is designing multi-scale structures with good connectivity between cells while optimizing each cell to reach close to the theoretical performance limit. We propose a new method for direct multi-scale topology optimization using neural networks. Our approach focuses on inverse homogenization that seamlessly maintains compatibility across neighboring microstructure cells. Our approach consists of a topology neural network that optimizes the microstructure shape and distribution across the design domain as a continuous field. Each microstructure cell is optimized based on a specified elasticity tensor that also accommodates in-plane rotations. The neural network takes as input the local coordinates within a cell to represent the density distribution within a cell, as well as the global coordinates of each cell to design spatially varying microstructure cells. As such, our approach models an n-dimensional multi-scale optimization problem as a 2n-dimensional inverse homogenization problem using neural networks. During the inverse homogenization of each unit cell, we extend the boundary of each cell by scaling the input coordinates such that the boundaries of neighboring cells are combined. Inverse homogenization on the combined cell improves connectivity. We demonstrate our method through the design and optimization of graded multi-scale structures.
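The abstract describes a coordinate-based network: the density at a point is predicted from the global coordinates of its cell concatenated with the local coordinates within the cell, and scaling the local coordinates extends each cell's boundary so neighboring cells overlap. A minimal sketch of that input structure, using a tiny hypothetical numpy MLP (the actual architecture, sizes, and training loop in the paper differ):

```python
import numpy as np

rng = np.random.default_rng(0)

class CoordinateDensityNet:
    """Hypothetical tiny MLP: (global_xy, local_xy) -> density in (0, 1).

    Illustrates how an n-D multi-scale problem becomes a 2n-D field:
    here n = 2, so the network input is 4-dimensional.
    """

    def __init__(self, in_dim=4, hidden=32):
        self.w1 = rng.standard_normal((in_dim, hidden)) * 0.1
        self.b1 = np.zeros(hidden)
        self.w2 = rng.standard_normal((hidden, 1)) * 0.1
        self.b2 = np.zeros(1)

    def __call__(self, global_xy, local_xy, scale=1.0):
        # Scaling the local coordinates beyond the unit cell extends the
        # cell boundary so neighboring cells are combined, which is how
        # the paper encourages connectivity during inverse homogenization.
        x = np.concatenate([global_xy, scale * local_xy], axis=-1)
        h = np.tanh(x @ self.w1 + self.b1)
        return 1.0 / (1.0 + np.exp(-(h @ self.w2 + self.b2)))  # sigmoid

net = CoordinateDensityNet()
g = rng.uniform(0.0, 1.0, size=(8, 2))   # global coordinates of each cell
l = rng.uniform(-1.0, 1.0, size=(8, 2))  # local coordinates within a cell
rho = net(g, l, scale=1.5)               # densities, shape (8, 1)
```

The sigmoid keeps densities strictly between 0 and 1, as is common in density-based topology optimization; the paper's own parameterization and loss are not reproduced here.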
Related papers
- Attending to Topological Spaces: The Cellular Transformer [37.84207797241944]
Topological Deep Learning seeks to enhance the predictive performance of neural network models by harnessing topological structures in input data.
We introduce the Cellular Transformer (CT), a novel architecture that generalizes graph-based transformers to cell complexes.
CT achieves state-of-the-art performance, and it does so without the need for more complex enhancements.
arXiv Detail & Related papers (2024-05-23T01:48:32Z) - Enhancing lattice kinetic schemes for fluid dynamics with Lattice-Equivariant Neural Networks [79.16635054977068]
We present a new class of equivariant neural networks, dubbed Lattice-Equivariant Neural Networks (LENNs)
Our approach develops within a recently introduced framework aimed at learning neural network-based surrogate models of Lattice Boltzmann collision operators.
Our work opens towards practical utilization of machine learning-augmented Lattice Boltzmann CFD in real-world simulations.
arXiv Detail & Related papers (2024-05-22T17:23:15Z) - Topology Optimization using Neural Networks with Conditioning Field Initialization for Improved Efficiency [2.575019247624295]
We show that by using a prior initial field on the unoptimized domain, the efficiency of neural network based topology optimization can be further improved.
We employ the strain energy field calculated on the initial design domain as an additional conditioning field input to the neural network throughout the optimization.
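The conditioning-field idea above can be sketched simply: a field computed once on the unoptimized domain is appended to each sample point's coordinates as an extra, fixed network input. A minimal illustration with hypothetical names (the paper's actual FE computation of the strain energy field is not reproduced):

```python
import numpy as np

def build_network_input(coords, strain_energy):
    """Append a precomputed conditioning field to coordinate inputs.

    coords: (N, 2) sample points in the design domain.
    strain_energy: (N,) field evaluated once on the initial design.
    Returns an (N, 3) array fed to the topology network each iteration.
    """
    # Normalize the conditioning field so its scale matches the coordinates.
    e = (strain_energy - strain_energy.min()) / (np.ptp(strain_energy) + 1e-12)
    return np.concatenate([coords, e[:, None]], axis=1)

coords = np.random.default_rng(1).uniform(0.0, 1.0, size=(5, 2))
energy = np.linspace(0.0, 10.0, 5)  # stand-in for an FE strain energy field
x = build_network_input(coords, energy)  # shape (5, 3)
```

Because the field is computed only once, the per-iteration cost of the optimization is unchanged; only the network's input dimension grows.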
arXiv Detail & Related papers (2023-05-17T07:42:24Z) - Multi-agent Reinforcement Learning with Graph Q-Networks for Antenna Tuning [60.94661435297309]
The scale of mobile networks makes it challenging to optimize antenna parameters using manual intervention or hand-engineered strategies.
We propose a new multi-agent reinforcement learning algorithm to optimize mobile network configurations globally.
We empirically demonstrate the performance of the algorithm on an antenna tilt tuning problem and a joint tilt and power control problem in a simulated environment.
arXiv Detail & Related papers (2023-01-20T17:06:34Z) - Concurrent build direction, part segmentation, and topology optimization for additive manufacturing using neural networks [2.2911466677853065]
We propose a neural network-based approach to topology optimization that aims to reduce the use of support structures in additive manufacturing.
Our approach uses a network architecture that allows the simultaneous determination of (1) an optimized part segmentation, (2) the topology of each part, and (3) the build direction of each part.
arXiv Detail & Related papers (2022-10-04T02:17:54Z) - Learning Autonomy in Management of Wireless Random Networks [102.02142856863563]
This paper presents a machine learning strategy that tackles a distributed optimization task in a wireless network with an arbitrary number of randomly interconnected nodes.
We develop a flexible deep neural network formalism termed distributed message-passing neural network (DMPNN) with forward and backward computations independent of the network topology.
arXiv Detail & Related papers (2021-06-15T09:03:28Z) - Data-Driven Multiscale Design of Cellular Composites with Multiclass Microstructures for Natural Frequency Maximization [14.337297795182181]
We propose a data-driven topology optimization (TO) approach to enable the multiscale design of cellular structures.
The framework can be easily extended to other multi-scale TO problems, such as thermal compliance and dynamic response optimization.
arXiv Detail & Related papers (2021-06-11T15:59:33Z) - EBM-Fold: Fully-Differentiable Protein Folding Powered by Energy-based Models [53.17320541056843]
We propose a fully-differentiable approach for protein structure optimization, guided by a data-driven generative network.
Our EBM-Fold approach can efficiently produce high-quality decoys, compared against traditional Rosetta-based structure optimization routines.
arXiv Detail & Related papers (2021-05-11T03:40:29Z) - IH-GAN: A Conditional Generative Model for Implicit Surface-Based Inverse Design of Cellular Structures [15.540823405781337]
We propose a deep generative model that generates diverse cellular unit cells conditioned on desired material properties.
Results show that our method can 1) generate various unit cells that satisfy given material properties with high accuracy (relative error 5%), 2) create functionally graded cellular structures with high-quality interface connectivity (98.7% average overlap area at interfaces), and 3) improve the structural performance over the conventional topology-optimized variable-density structure.
arXiv Detail & Related papers (2021-03-03T18:39:25Z) - Multipole Graph Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data in a form suitable for neural networks.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z) - Communication-Efficient Distributed Stochastic AUC Maximization with Deep Neural Networks [50.42141893913188]
We study distributed stochastic AUC maximization at large scale with a deep neural network.
Our method requires a much smaller number of communication rounds in theory.
Our experiments on several datasets confirm the theory and demonstrate the method's effectiveness.
arXiv Detail & Related papers (2020-05-05T18:08:23Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.