Amortized Synthesis of Constrained Configurations Using a Differentiable
Surrogate
- URL: http://arxiv.org/abs/2106.09019v1
- Date: Wed, 16 Jun 2021 17:59:45 GMT
- Title: Amortized Synthesis of Constrained Configurations Using a Differentiable
Surrogate
- Authors: Xingyuan Sun, Tianju Xue, Szymon M. Rusinkiewicz, Ryan P. Adams
- Abstract summary: In design, fabrication, and control problems, we are often faced with the task of synthesis.
The physical process typically admits many different realizations of the same goal, and this many-to-one map presents challenges to the supervised learning of feed-forward synthesis; many physical simulations are also non-differentiable, preventing direct optimization.
We address both of these problems with a two-stage neural network architecture that we may consider to be an autoencoder.
- Score: 25.125736560730864
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In design, fabrication, and control problems, we are often faced with the
task of synthesis, in which we must generate an object or configuration that
satisfies a set of constraints while maximizing one or more objective
functions. The synthesis problem is typically characterized by a physical
process in which many different realizations may achieve the goal. This
many-to-one map presents challenges to the supervised learning of feed-forward
synthesis, as the set of viable designs may have a complex structure. In
addition, the non-differentiable nature of many physical simulations prevents
direct optimization. We address both of these problems with a two-stage neural
network architecture that we may consider to be an autoencoder. We first learn
the decoder: a differentiable surrogate that approximates the many-to-one
physical realization process. We then learn the encoder, which maps from goal
to design, while using the fixed decoder to evaluate the quality of the
realization. We evaluate the approach on two case studies: extruder path
planning in additive manufacturing and constrained soft robot inverse
kinematics. We compare our approach to direct optimization of design using the
learned surrogate, and to supervised learning of the synthesis problem. We find
that our approach produces higher quality solutions than supervised learning,
while being competitive in quality with direct optimization, at a greatly
reduced computational cost.
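The two-stage architecture described in the abstract can be pictured with a minimal sketch. Below is an illustrative PyTorch version: the network sizes, MSE losses, and training loops are assumptions chosen for concreteness and are not taken from the paper; only the overall structure (fit the surrogate decoder first, then train the encoder through the frozen decoder) follows the abstract.

```python
import torch
import torch.nn as nn

DESIGN_DIM, GOAL_DIM = 16, 8  # hypothetical sizes of the design and goal vectors

# Stage 1: the decoder is a differentiable surrogate of the (possibly
# non-differentiable) physical realization process, fit on (design, outcome) pairs.
decoder = nn.Sequential(nn.Linear(DESIGN_DIM, 64), nn.ReLU(), nn.Linear(64, GOAL_DIM))

def train_decoder(designs, outcomes, steps=1000, lr=1e-3):
    opt = torch.optim.Adam(decoder.parameters(), lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = nn.functional.mse_loss(decoder(designs), outcomes)
        loss.backward()
        opt.step()

# Stage 2: the encoder maps a goal to a design; the frozen decoder scores how well
# that design would be realized, so encoder + decoder act like an autoencoder on goals.
encoder = nn.Sequential(nn.Linear(GOAL_DIM, 64), nn.ReLU(), nn.Linear(64, DESIGN_DIM))

def train_encoder(goals, steps=1000, lr=1e-3):
    for p in decoder.parameters():
        p.requires_grad_(False)  # decoder stays fixed while the encoder is trained
    opt = torch.optim.Adam(encoder.parameters(), lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = nn.functional.mse_loss(decoder(encoder(goals)), goals)
        loss.backward()
        opt.step()
```

At synthesis time a single forward pass through the encoder amortizes the search, which is where the reported speed advantage over per-instance direct optimization comes from.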
Related papers
- Automated Placement of Analog Integrated Circuits using Priority-based Constructive Heuristic [0.0]
We focus on the specific class of analog placement, which requires so-called pockets, their possible merging, and parametrizable minimum distances between devices.
Our solution minimizes the perimeter of the circuit's bounding box and the approximated wire length.
We show the quality of the proposed method on both synthetically generated and real-life industrial instances accompanied by manually created designs.
arXiv Detail & Related papers (2024-10-18T07:16:59Z)
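For a rough sense of the objectives named in the placement entry above, here is an illustrative computation of a bounding-box perimeter and a half-perimeter wire length (HPWL) estimate. The HPWL choice and the data layout are assumptions made for this sketch, not details from the paper.

```python
def bounding_box_perimeter(cells):
    # cells: iterable of (x, y, w, h) device placements
    xs = [x for x, _, w, _ in cells] + [x + w for x, _, w, _ in cells]
    ys = [y for _, y, _, h in cells] + [y + h for _, y, _, h in cells]
    return 2 * ((max(xs) - min(xs)) + (max(ys) - min(ys)))

def approx_wire_length(nets, centers):
    # nets: iterable of lists of cell indices; centers: list of (cx, cy) per cell.
    # Half-perimeter wire length is a common cheap proxy for routed wire length.
    total = 0.0
    for net in nets:
        cxs = [centers[i][0] for i in net]
        cys = [centers[i][1] for i in net]
        total += (max(cxs) - min(cxs)) + (max(cys) - min(cys))
    return total
```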
- LInK: Learning Joint Representations of Design and Performance Spaces through Contrastive Learning for Mechanism Synthesis [15.793704096341523]
In this paper, we introduce LInK, a novel framework that integrates contrastive learning of performance and design space with optimization techniques.
By leveraging a multimodal and transformation-invariant contrastive learning framework, LInK learns a joint representation that captures complex physics and design representations of mechanisms.
Our results demonstrate that LInK not only advances the field of mechanism design but also broadens the applicability of contrastive learning and optimization to other areas of engineering.
arXiv Detail & Related papers (2024-05-31T03:04:57Z)
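The contrastive pairing of design and performance spaces mentioned in the LInK entry above can be sketched with a standard CLIP-style InfoNCE loss. The encoders, temperature, and symmetric form are assumptions; LInK's actual loss and its transformation-invariance machinery may differ.

```python
import torch
import torch.nn.functional as F

def contrastive_loss(design_emb, perf_emb, temperature=0.07):
    # design_emb, perf_emb: (batch, dim) embeddings of matched design/performance pairs
    d = F.normalize(design_emb, dim=-1)
    p = F.normalize(perf_emb, dim=-1)
    logits = d @ p.t() / temperature                   # similarity of every pair
    targets = torch.arange(d.size(0), device=d.device)
    # pull matched pairs together in both directions, push mismatched pairs apart
    return 0.5 * (F.cross_entropy(logits, targets) + F.cross_entropy(logits.t(), targets))
```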
- Improving Subject-Driven Image Synthesis with Subject-Agnostic Guidance [62.15866177242207]
We show that, by constructing a subject-agnostic condition, one can obtain outputs consistent with both the given subject and the input text prompts.
Our approach is conceptually simple and requires only minimal code modifications, but leads to substantial quality improvements.
arXiv Detail & Related papers (2024-05-02T15:03:41Z)
- Compositional Generative Inverse Design [69.22782875567547]
Inverse design, where we seek to design input variables in order to optimize an underlying objective function, is an important problem.
We show that by instead optimizing over the learned energy function captured by the diffusion model, we can avoid the adversarial examples that arise when optimizing directly over a learned forward model.
In an N-body interaction task and a challenging 2D multi-airfoil design task, we demonstrate that by composing the learned diffusion model at test time, our method allows us to design initial states and boundary shapes.
arXiv Detail & Related papers (2024-01-24T01:33:39Z)
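The idea of optimizing a design against a learned energy rather than a raw forward model, as in the compositional inverse design entry above, reduces in the simplest case to gradient descent on a composed objective. The energy interfaces and optimizer settings below are assumptions; the paper's diffusion-based energies and composition rules are more involved.

```python
import torch

def design_by_energy_descent(energy_terms, design_objective, x_init, steps=200, lr=1e-2):
    # energy_terms: callables returning learned energies (e.g., from diffusion models
    # over sub-systems); keeping their sum low keeps the design near the data manifold,
    # while design_objective pushes it toward the goal.
    x = x_init.clone().requires_grad_(True)
    opt = torch.optim.Adam([x], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = design_objective(x) + sum(e(x) for e in energy_terms)
        loss.backward()
        opt.step()
    return x.detach()
```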
- End-to-End Meta-Bayesian Optimisation with Transformer Neural Processes [52.818579746354665]
This paper proposes the first end-to-end differentiable meta-BO framework that generalises neural processes to learn acquisition functions via transformer architectures.
We enable this end-to-end framework with reinforcement learning (RL) to tackle the lack of labelled acquisition data.
arXiv Detail & Related papers (2023-05-25T10:58:46Z)
- An End-to-End Differentiable Framework for Contact-Aware Robot Design [37.715596272425316]
We build an end-to-end differentiable framework for contact-aware robot design.
A novel deformation-based parameterization allows for the design of articulated rigid robots with arbitrary, complex geometry.
A differentiable rigid body simulator can handle contact-rich scenarios and computes analytical gradients for a full spectrum of kinematic and dynamic parameters.
arXiv Detail & Related papers (2021-07-15T17:53:44Z)
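The design loop implied by the contact-aware entry above, with analytical gradients from a differentiable simulator flowing back into design parameters, is in its simplest form just the sketch below. The simulator interface, loss, and optimizer are placeholders; the paper's deformation-based parameterization and contact handling are not reproduced here.

```python
import torch

def optimize_design(simulate, init_params, target, steps=100, lr=1e-2):
    # simulate: a differentiable rollout mapping design parameters to an outcome
    # (e.g., an end-effector trajectory); autograd supplies the design gradients.
    params = init_params.clone().requires_grad_(True)
    opt = torch.optim.Adam([params], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = torch.nn.functional.mse_loss(simulate(params), target)
        loss.backward()
        opt.step()
    return params.detach()
```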
- Collaborative Multidisciplinary Design Optimization with Neural Networks [1.2691047660244335]
We show that, in the case of Collaborative Optimization, faster and more reliable convergence can be obtained by solving an interesting instance of binary classification.
We propose to train a neural network with an asymmetric loss function, a structure that guarantees Lipschitz continuity, and a regularization towards respecting basic distance function properties.
arXiv Detail & Related papers (2021-06-11T00:03:47Z)
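The asymmetric loss mentioned in the Collaborative Optimization entry above can be illustrated with a class-weighted binary cross-entropy in which the two error types are priced differently. The weights and the weighting scheme are assumptions; the paper's Lipschitz-constrained architecture and distance-function regularizer are not shown.

```python
import torch
import torch.nn.functional as F

def asymmetric_bce(logits, labels, w_pos=1.0, w_neg=5.0):
    # labels: float tensor of 0/1 targets. Samples whose true label is 0 are
    # weighted more heavily, so false positives cost more than false negatives.
    weights = torch.where(labels > 0.5,
                          torch.full_like(labels, w_pos),
                          torch.full_like(labels, w_neg))
    return F.binary_cross_entropy_with_logits(logits, labels, weight=weights)
```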
- Machine Learning Framework for Quantum Sampling of Highly-Constrained, Continuous Optimization Problems [101.18253437732933]
We develop a generic, machine learning-based framework for mapping continuous-space inverse design problems into surrogate unconstrained binary optimization problems.
We showcase the framework's performance on two inverse design problems by optimizing (i) thermal emitter topologies for thermophotovoltaic applications and (ii) diffractive meta-gratings for highly efficient beam steering.
arXiv Detail & Related papers (2021-05-06T02:22:23Z)
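The mapping from continuous design variables to a surrogate binary problem, as in the quantum-sampling entry above, can be sketched in two small pieces: a fixed-point binary encoding of each variable, and the energy of a quadratic unconstrained binary optimization (QUBO) model. How the surrogate matrix Q is learned from simulations is paper-specific and omitted; the bit width and encoding are assumptions.

```python
import numpy as np

def encode_continuous(x, lo, hi, bits=8):
    # map a continuous design variable in [lo, hi] to a fixed-point bit string
    q = round((x - lo) / (hi - lo) * (2 ** bits - 1))
    return np.array([(q >> i) & 1 for i in range(bits)], dtype=np.int8)

def qubo_energy(b, Q):
    # E(b) = b^T Q b; a quantum annealer or classical sampler searches for the
    # bit string b that minimizes this surrogate energy
    return b @ Q @ b
```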
- Offline Model-Based Optimization via Normalized Maximum Likelihood Estimation [101.22379613810881]
We consider data-driven optimization problems where one must maximize a function given only queries at a fixed set of points.
This problem setting emerges in many domains where function evaluation is a complex and expensive process.
We propose a tractable approximation that allows us to scale our method to high-capacity neural network models.
arXiv Detail & Related papers (2021-02-16T06:04:27Z)
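For intuition on the normalized maximum likelihood idea in the offline model-based optimization entry above, the sketch below discretizes the label space and, for each candidate label, briefly refits a copy of the model on the dataset augmented with that label before normalizing. This is the expensive construction the paper approximates tractably; the Gaussian likelihood, label grid, and refit budget here are assumptions.

```python
import copy
import torch

def cnml_predict(model, xs, ys, x_query, y_grid, refit_steps=20, lr=1e-3):
    # conditional NML (sketch): refitting on (x_query, y) for every candidate y and
    # normalizing over y down-weights over-optimistic predictions far from the data.
    scores = []
    for y in y_grid:
        m = copy.deepcopy(model)
        opt = torch.optim.Adam(m.parameters(), lr=lr)
        xs_aug = torch.cat([xs, x_query[None]], dim=0)
        ys_aug = torch.cat([ys, y[None]], dim=0)
        for _ in range(refit_steps):
            opt.zero_grad()
            loss = torch.nn.functional.mse_loss(m(xs_aug).squeeze(-1), ys_aug)
            loss.backward()
            opt.step()
        pred = m(x_query[None]).squeeze()
        scores.append(torch.exp(-0.5 * (pred - y) ** 2))  # unit-variance Gaussian likelihood
    scores = torch.stack(scores)
    return scores / scores.sum()  # normalized distribution over y_grid
```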
- Automatically Learning Compact Quality-aware Surrogates for Optimization Problems [55.94450542785096]
Solving optimization problems with unknown parameters requires learning a predictive model to predict the values of the unknown parameters and then solving the problem using these values.
Recent work has shown that including the optimization problem as a layer in the model training pipeline results in predictions of the unobserved parameters that lead to higher decision quality.
We show that we can improve solution quality by learning a low-dimensional surrogate model of a large optimization problem.
arXiv Detail & Related papers (2020-06-18T19:11:54Z)
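The "low-dimensional surrogate of a large optimization problem" in the last entry above can be illustrated by reparameterizing the decision vector as x = P z with a small latent z, so the embedded optimization layer stays cheap and differentiable. The quadratic objective below is chosen only because it has a closed-form optimum; the paper's actual surrogate, objective, and training pipeline are not reproduced.

```python
import torch

def surrogate_qp_layer(theta, P, reg=1e-2):
    # Maximize theta^T x - (reg/2) ||x||^2 subject to x = P z, where P is (n, k)
    # with k << n. Substituting x = P z gives a small unconstrained problem whose
    # optimum solves (reg P^T P) z = P^T theta; the whole layer is differentiable
    # in theta, so a predictive model producing theta can be trained end to end.
    A = reg * (P.t() @ P)
    b = P.t() @ theta
    z_star = torch.linalg.solve(A, b)
    return P @ z_star  # decision in the original (high-dimensional) space
```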
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences of its use.