Speeding up Computational Morphogenesis with Online Neural Synthetic
Gradients
- URL: http://arxiv.org/abs/2104.12282v2
- Date: Tue, 27 Apr 2021 01:27:57 GMT
- Title: Speeding up Computational Morphogenesis with Online Neural Synthetic
Gradients
- Authors: Yuyu Zhang, Heng Chi, Binghong Chen, Tsz Ling Elaine Tang, Lucia
Mirabella, Le Song, Glaucio H. Paulino
- Abstract summary: A wide range of modern science and engineering applications are formulated as optimization problems with a system of partial differential equations (PDEs) as constraints.
These PDE-constrained optimization problems are typically solved in a standard discretize-then-optimize approach.
We propose a general framework to speed up PDE-constrained optimization using online neural synthetic gradients (ONSG) with a novel two-scale optimization scheme.
- Score: 51.42959998304931
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: A wide range of modern science and engineering applications are formulated as
optimization problems with a system of partial differential equations (PDEs) as
constraints. These PDE-constrained optimization problems are typically solved
in a standard discretize-then-optimize approach. In many industry applications
that require high-resolution solutions, the discretized constraints can easily
have millions or even billions of variables, making it very slow for the
standard iterative optimizer to compute the exact gradients. In this work, we
propose a general framework to speed up PDE-constrained optimization using
online neural synthetic gradients (ONSG) with a novel two-scale optimization
scheme. We successfully apply our ONSG framework to computational
morphogenesis, a representative and challenging class of PDE-constrained
optimization problems. Extensive experiments have demonstrated that our method
can significantly speed up computational morphogenesis (also known as topology
optimization), and meanwhile maintain the quality of final solution compared to
the standard optimizer. On a large-scale 3D optimal design problem with around
1,400,000 design variables, our method achieves up to 7.5x speedup while
producing optimized designs with comparable objectives.
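The two-scale scheme can be sketched in miniature: an online model is fit to (design, exact gradient) pairs, and cheap predicted gradients replace the expensive exact computation on most iterations. This is an illustrative reconstruction, not the authors' implementation — the real ONSG uses a deep network and FEM-based gradients, whereas the toy objective, the linear gradient model, and all names below (`exact_gradient`, `SyntheticGradientModel`, `K`) are assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a PDE-constrained objective: f(x) = ||A x - b||^2.
# In computational morphogenesis the exact gradient requires a full
# FEM solve per iteration; here it is a cheap closed form.
n = 20
A = rng.standard_normal((n, n)) / np.sqrt(n)
b = rng.standard_normal(n)

def objective(x):
    r = A @ x - b
    return r @ r

def exact_gradient(x):  # the "expensive" computation
    return 2.0 * A.T @ (A @ x - b)

class SyntheticGradientModel:
    """Online linear model g(x) ~ W x + c, refit by least squares
    on all (design, exact gradient) pairs collected so far."""
    def __init__(self, dim):
        self.W, self.c = np.zeros((dim, dim)), np.zeros(dim)
        self.xs, self.gs = [], []

    def update(self, x, g):
        self.xs.append(x.copy()); self.gs.append(g)
        X = np.hstack([np.array(self.xs), np.ones((len(self.xs), 1))])
        sol, *_ = np.linalg.lstsq(X, np.array(self.gs), rcond=None)
        self.W, self.c = sol[:-1].T, sol[-1]

    def predict(self, x):
        return self.W @ x + self.c

# Two-scale loop: exact gradient (plus a model refit) every K steps,
# cheap synthetic gradient on the remaining steps.
model, x, lr, K = SyntheticGradientModel(n), np.zeros(n), 0.05, 5
for it in range(300):
    if it % K == 0 or it <= n:          # "fine" scale: exact, expensive
        g = exact_gradient(x)
        model.update(x, g)
    else:                               # "coarse" scale: predicted, cheap
        g = model.predict(x)
    x -= lr * g
```

Because the toy gradient is linear in x, the least-squares model becomes exact after a short warm-up; in the paper's setting the model is a neural network trained online and only approximates the FEM gradient.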
Related papers
- Variational Quantum Framework for Partial Differential Equation Constrained Optimization [0.6138671548064355]
We present a novel variational quantum framework for PDE constrained optimization problems.
The proposed framework utilizes the variational quantum linear solver (VQLS) algorithm and a black box as its main building blocks.
arXiv Detail & Related papers (2024-05-26T18:06:43Z)
- Gradient-free neural topology optimization [0.0]
Gradient-free algorithms require many more iterations to converge when compared to gradient-based algorithms.
This has made them unviable for topology optimization due to the high computational cost per iteration and high dimensionality of these problems.
We propose a pre-trained neural reparameterization strategy that leads to at least one order of magnitude decrease in iteration count when optimizing the designs in latent space.
arXiv Detail & Related papers (2024-03-07T23:00:49Z) - An Empirical Evaluation of Zeroth-Order Optimization Methods on
AI-driven Molecule Optimization [78.36413169647408]
We study the effectiveness of various ZO optimization methods for optimizing molecular objectives.
We show the advantages of ZO sign-based gradient descent (ZO-signGD).
We demonstrate the potential effectiveness of ZO optimization methods on widely used benchmark tasks from the Guacamol suite.
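ZO-signGD can be sketched concretely: estimate the gradient of a black-box objective from function evaluations along random directions, then step using only the sign of the estimate. The objective `f` and all constants below are illustrative placeholders (a molecule scoring function would be the real black box), not code from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def f(x):
    # Black-box objective; only function evaluations are available.
    return np.sum((x - 3.0) ** 2)

def zo_gradient(f, x, mu=1e-3, num_dirs=20):
    """Two-point zeroth-order estimate averaged over random Gaussian
    directions u: (f(x + mu*u) - f(x - mu*u)) / (2*mu) * u."""
    g = np.zeros_like(x)
    for _ in range(num_dirs):
        u = rng.standard_normal(x.shape)
        g += (f(x + mu * u) - f(x - mu * u)) / (2.0 * mu) * u
    return g / num_dirs

x, lr = np.zeros(5), 0.1
for _ in range(200):
    x -= lr * np.sign(zo_gradient(f, x))  # sign-based update: ZO-signGD
```

Using only the sign discards the noisy magnitude of the estimate, which is what gives sign-based ZO methods their robustness to gradient-estimation error.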
arXiv Detail & Related papers (2022-10-27T01:58:10Z)
- Bi-level Physics-Informed Neural Networks for PDE Constrained Optimization using Broyden's Hypergradients [29.487375792661005]
We present a novel bi-level optimization framework to solve PDE constrained optimization problems.
For the inner loop optimization, we adopt PINNs to solve the PDE constraints only.
For the outer loop, we design a novel method using Broyden's method based on the Implicit Function Theorem.
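The outer-loop hypergradient can be illustrated on a toy bi-level problem where everything is quadratic and the Implicit Function Theorem gives a closed form. The paper approximates the inverse-Hessian-vector product with Broyden's method to avoid costly retraining; this sketch solves the small linear system directly, and all problem data (`lam`, `t`, the quadratic inner objective) are invented for illustration.

```python
import numpy as np

# Toy bi-level problem:
#   outer:  F(theta) = f(w*(theta)) = 0.5 * ||w*(theta) - t||^2
#   inner:  w*(theta) = argmin_w g(w, theta),
#           g = 0.5 * ||w - theta||^2 + 0.5 * lam * ||w||^2
# Implicit Function Theorem hypergradient:
#   dF/dtheta = -(d^2 g/dtheta dw) @ inv(d^2 g/dw^2) @ df/dw
lam = 0.5
t = np.array([1.0, -2.0, 0.5])

def inner_solution(theta):
    return theta / (1.0 + lam)            # closed form for this quadratic g

def hypergradient(theta):
    w = inner_solution(theta)
    dfdw = w - t                          # gradient of outer objective in w
    H = (1.0 + lam) * np.eye(len(theta))  # d^2 g / dw^2
    J = -np.eye(len(theta))               # d^2 g / dtheta dw (cross term)
    # The paper uses Broyden's method for the inverse-Hessian-vector
    # product; for this small system we solve it exactly instead.
    v = np.linalg.solve(H, dfdw)
    return -J @ v                         # equals dfdw / (1 + lam)

theta = np.zeros(3)
# Gradient descent on the outer problem drives w*(theta) toward t.
for _ in range(100):
    theta -= 0.5 * hypergradient(theta)
```

For high-dimensional inner problems one would replace `np.linalg.solve` with a matrix-free iterative scheme such as Broyden's method, which is exactly the paper's point.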
arXiv Detail & Related papers (2022-09-15T06:21:24Z)
- Optimal Design of Electric Machine with Efficient Handling of Constraints and Surrogate Assistance [5.387300498478744]
This article proposes an optimization method incorporated into NSGA-II, a widely used evolutionary multi-objective optimization algorithm.
The proposed method exploits the inexpensiveness of geometric constraints to generate feasible designs by using a custom repair operator.
arXiv Detail & Related papers (2022-06-03T17:13:29Z)
- Neural Stochastic Dual Dynamic Programming [99.80617899593526]
We introduce a trainable neural model that learns to map problem instances to a piece-wise linear value function.
$\nu$-SDDP can significantly reduce problem solving cost without sacrificing solution quality.
arXiv Detail & Related papers (2021-12-01T22:55:23Z)
- Physics-informed neural networks with hard constraints for inverse design [3.8191831921441337]
We propose a new deep learning method -- physics-informed neural networks with hard constraints (hPINNs) -- for solving topology optimization.
We demonstrate the effectiveness of hPINN for a holography problem in optics and a fluid problem of Stokes flow.
arXiv Detail & Related papers (2021-02-09T03:18:15Z)
- A Primer on Zeroth-Order Optimization in Signal Processing and Machine Learning [95.85269649177336]
ZO optimization iteratively performs three major steps: gradient estimation, descent direction, and solution update.
We demonstrate promising applications of ZO optimization, such as evaluating and generating explanations from black-box deep learning models, and efficient online sensor management.
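The three steps named above — gradient estimation, descent direction, and solution update — map directly onto a few lines of code. This minimal sketch uses a coordinate-wise two-point estimator on an invented smooth test function; it is a generic illustration of the primer's loop, not code from the paper.

```python
import numpy as np

def f(x):
    # Black-box function: only evaluations are available, no gradients.
    return np.sum(x ** 2) + np.sum(np.cos(x))

def zo_step(f, x, mu=1e-4, lr=0.1):
    """One ZO iteration over the three steps from the primer."""
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = mu
        g[i] = (f(x + e) - f(x - e)) / (2 * mu)  # step 1: gradient estimation
    d = -g                                        # step 2: descent direction
    return x + lr * d                             # step 3: solution update

x = np.array([2.0, -1.5, 0.7])
for _ in range(100):
    x = zo_step(f, x)
```

Coordinate-wise estimation costs two evaluations per dimension per step; the random-direction estimators surveyed in the primer trade accuracy for far fewer queries in high dimensions.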
arXiv Detail & Related papers (2020-06-11T06:50:35Z)
- Cross Entropy Hyperparameter Optimization for Constrained Problem Hamiltonians Applied to QAOA [68.11912614360878]
Hybrid quantum-classical algorithms such as the Quantum Approximate Optimization Algorithm (QAOA) are considered among the most promising approaches for exploiting near-term quantum computers in practical applications.
Such algorithms are usually implemented in a variational form, combining a classical optimization method with a quantum machine to find good solutions to an optimization problem.
In this study we apply a Cross-Entropy method to shape this landscape, which allows the classical optimizer to find better parameters more easily and hence results in improved performance.
arXiv Detail & Related papers (2020-03-11T13:52:41Z)
- Self-Directed Online Machine Learning for Topology Optimization [58.920693413667216]
Self-directed Online Learning Optimization integrates Deep Neural Network (DNN) with Finite Element Method (FEM) calculations.
Our algorithm was tested on four types of problems: compliance minimization, fluid-structure optimization, heat transfer enhancement, and truss optimization.
It reduced the computational time by 2 to 5 orders of magnitude compared with directly using heuristic methods, and outperformed all state-of-the-art algorithms tested in our experiments.
arXiv Detail & Related papers (2020-02-04T20:00:28Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences arising from its use.