LinEasyBO: Scalable Bayesian Optimization Approach for Analog Circuit
Synthesis via One-Dimensional Subspaces
- URL: http://arxiv.org/abs/2109.00617v1
- Date: Wed, 1 Sep 2021 21:25:25 GMT
- Title: LinEasyBO: Scalable Bayesian Optimization Approach for Analog Circuit
Synthesis via One-Dimensional Subspaces
- Authors: Shuhan Zhang, Fan Yang, Changhao Yan, Dian Zhou, Xuan Zeng
- Abstract summary: We propose a fast and robust Bayesian optimization approach via one-dimensional subspaces for analog circuit synthesis.
Our proposed algorithm can accelerate the optimization procedure by up to 9x and 38x compared to LP-EI and REMBOpBO respectively when the batch size is 15.
- Score: 11.64233949999656
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: A large body of literature has shown that the Bayesian optimization
framework is especially efficient and effective in analog circuit synthesis.
However, most previous work focuses only on designing informative surrogate
models or efficient acquisition functions. Searching for the global optimum
over the acquisition function surface is itself a difficult task, yet it has
been largely ignored. In this paper, we propose a fast and robust Bayesian
optimization approach via one-dimensional subspaces for analog circuit
synthesis. By optimizing only a one-dimensional subspace at each iteration, we
greatly reduce the computational overhead of the Bayesian optimization
framework while safely maximizing the acquisition function. By combining the
benefits of different dimension selection strategies, we adaptively balance
between global and local search. By leveraging the batch Bayesian optimization
framework, we further accelerate the optimization procedure by making full use
of hardware resources. Experimental results quantitatively show that our
proposed algorithm can accelerate the optimization procedure by up to 9x and
38x compared to LP-EI and REMBOpBO, respectively, when the batch size is 15.
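The per-iteration recipe is simple enough to prototype. The sketch below is a hypothetical Python reconstruction, not the authors' implementation: fit a Gaussian-process surrogate, pick one coordinate (alternating a random choice for global search with a steepest-coordinate choice for local search, a plausible stand-in for the paper's dimension-selection strategies), then maximize expected improvement exhaustively along that axis-aligned line through the incumbent. All names and settings are illustrative.

```python
# Minimal sketch of Bayesian optimization over one-dimensional subspaces,
# in the spirit of LinEasyBO. Illustrative only: the dimension-selection
# rule and all names here are assumptions, not the authors' code.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def expected_improvement(gp, X, y_best):
    # EI for minimization: expected amount by which X improves on y_best.
    mu, sigma = gp.predict(X, return_std=True)
    sigma = np.maximum(sigma, 1e-9)
    z = (y_best - mu) / sigma
    return sigma * (z * norm.cdf(z) + norm.pdf(z))

def local_dim(gp, x, eps=1e-4):
    # Local strategy: coordinate along which the posterior mean changes fastest.
    grads = []
    for i in range(len(x)):
        xp, xm = x.copy(), x.copy()
        xp[i] += eps
        xm[i] -= eps
        grads.append(abs(gp.predict(xp[None])[0] - gp.predict(xm[None])[0]))
    return int(np.argmax(grads))

def propose_along_axis(gp, x_center, dim, bounds, y_best, n_grid=512):
    # In 1-D, a dense grid maximizes EI cheaply and effectively globally,
    # which is what makes the per-iteration inner problem "easy".
    line = np.tile(x_center, (n_grid, 1))
    line[:, dim] = np.linspace(bounds[dim, 0], bounds[dim, 1], n_grid)
    return line[np.argmax(expected_improvement(gp, line, y_best))]

def line_bo(f, bounds, n_init=5, n_iter=40, seed=0):
    rng = np.random.default_rng(seed)
    d = bounds.shape[0]
    X = rng.uniform(bounds[:, 0], bounds[:, 1], size=(n_init, d))
    y = np.array([f(x) for x in X])
    for t in range(n_iter):
        gp = GaussianProcessRegressor(Matern(nu=2.5), normalize_y=True).fit(X, y)
        x_inc = X[np.argmin(y)]  # incumbent
        # Alternate global (random dimension) and local (steepest coordinate).
        dim = rng.integers(d) if t % 2 == 0 else local_dim(gp, x_inc)
        x_new = propose_along_axis(gp, x_inc, dim, bounds, y.min())
        X, y = np.vstack([X, x_new]), np.append(y, f(x_new))
    return X[np.argmin(y)], y.min()

# Example: a 10-D quadratic stands in for an expensive circuit simulation.
bounds = np.array([[-5.0, 5.0]] * 10)
x_opt, f_opt = line_bo(lambda x: float(np.sum(x ** 2)), bounds)
```

Because the inner acquisition problem is one-dimensional, the grid maximization is both cheap and effectively global, which is the source of the claimed speedup over full-dimensional acquisition optimization.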
Related papers
- Provably Faster Algorithms for Bilevel Optimization via Without-Replacement Sampling [96.47086913559289]
Gradient-based algorithms are widely used in bilevel optimization.
We introduce a without-replacement sampling based algorithm which achieves a faster convergence rate.
We validate our algorithms over both synthetic and real-world applications.
arXiv Detail & Related papers (2024-11-07T17:05:31Z)
- Optimizing Posterior Samples for Bayesian Optimization via Rootfinding [2.94944680995069]
We introduce an efficient global optimization strategy for posterior samples based on global rootfinding.
We demonstrate remarkable improvement in both inner- and outer-loop optimization.
We also propose a sample-average formulation of GP-TS, which has a parameter to explicitly control exploitation.
arXiv Detail & Related papers (2024-10-29T17:57:16Z)
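As a rough illustration of what the inner loop of GP Thompson sampling must solve, the snippet below draws one Gaussian-process posterior sample path and minimizes it by brute force on a grid; the grid stands in for the paper's rootfinding-based global optimizer, and all names and data are illustrative.

```python
# One GP Thompson-sampling draw, minimized naively. The dense grid is a
# stand-in for the rootfinding-based global optimizer proposed above.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

X_train = np.array([[0.1], [0.4], [0.9]])   # toy observations
y_train = np.array([0.3, -0.2, 0.5])
gp = GaussianProcessRegressor(normalize_y=True).fit(X_train, y_train)

X_grid = np.linspace(0.0, 1.0, 2001)[:, None]
path = gp.sample_y(X_grid, n_samples=1, random_state=0).ravel()  # one TS draw
x_next = X_grid[np.argmin(path)]  # global minimizer of this sample path
```

The grid scales poorly with dimension, which is exactly the gap a rootfinding strategy for posterior samples is meant to close.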
- Localized Zeroth-Order Prompt Optimization [54.964765668688806]
We propose a novel algorithm, namely localized zeroth-order prompt optimization (ZOPO)
ZOPO incorporates a Neural Tangent Kernel-based derived Gaussian process into standard zeroth-order optimization for an efficient search of well-performing local optima in prompt optimization.
Remarkably, ZOPO outperforms existing baselines in terms of both the optimization performance and the query efficiency.
arXiv Detail & Related papers (2024-03-05T14:18:15Z)
- Analyzing and Enhancing the Backward-Pass Convergence of Unrolled Optimization [50.38518771642365]
The integration of constrained optimization models as components in deep networks has led to promising advances on many specialized learning tasks.
A central challenge in this setting is backpropagation through the solution of an optimization problem, which often lacks a closed form.
This paper provides theoretical insights into the backward pass of unrolled optimization, showing that it is equivalent to the solution of a linear system by a particular iterative method.
A system called Folded Optimization is proposed to construct more efficient backpropagation rules from unrolled solver implementations.
arXiv Detail & Related papers (2023-12-28T23:15:18Z)
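A toy instance of the setting, sketched in PyTorch (illustrative, not the Folded Optimization system): the inner problem min_y (y - x)^2 is solved by unrolled gradient descent, and autograd runs the backward pass through every unrolled step.

```python
# Backpropagation through an unrolled inner solver.
import torch

x = torch.tensor(2.0, requires_grad=True)  # outer variable
y = torch.tensor(0.0)                      # inner variable
for _ in range(50):                        # unrolled inner solver
    y = y - 0.1 * 2.0 * (y - x)            # one GD step on (y - x)^2
y.backward()                               # backward pass through the unroll
print(x.grad)                              # ~1.0, matching dy*/dx for y* = x
```

Each backward step here applies the same linear update, a small-scale view of the paper's observation that the backward pass amounts to solving a linear system by a particular iterative method.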
- Parallel Bayesian Optimization Using Satisficing Thompson Sampling for Time-Sensitive Black-Box Optimization [0.0]
We propose satisficing Thompson sampling-based parallel BO approaches, including synchronous and asynchronous versions.
We shift the target from an optimal solution to a satisficing solution that is easier to learn.
The effectiveness of the proposed methods is demonstrated on a fast-charging design problem of Lithium-ion batteries.
arXiv Detail & Related papers (2023-10-19T07:03:51Z)
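A minimal sketch of the satisficing idea, assuming a maximization problem with a user-chosen threshold s (the fallback rule and all names are illustrative, and the paper's synchronous/asynchronous batch machinery is omitted):

```python
# Satisficing Thompson sampling, in sketch form: instead of chasing the
# argmax of a posterior draw, accept any candidate whose draw clears a
# satisficing threshold s, an easier target to learn than the optimum.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def satisficing_ts_pick(gp, candidates, s, rng):
    draw = gp.sample_y(candidates, n_samples=1,
                       random_state=rng.integers(2 ** 31)).ravel()
    ok = np.flatnonzero(draw >= s)        # candidates deemed "good enough"
    if ok.size:
        return candidates[rng.choice(ok)]
    return candidates[np.argmax(draw)]    # fall back to plain TS (assumption)
```

A batch version would give each worker its own posterior draw; that parallel bookkeeping is what the synchronous and asynchronous variants manage.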
- Learning Regions of Interest for Bayesian Optimization with Adaptive Level-Set Estimation [84.0621253654014]
We propose a framework, called BALLET, which adaptively filters for a high-confidence region of interest.
We show theoretically that BALLET can efficiently shrink the search space, and can exhibit a tighter regret bound than standard BO.
arXiv Detail & Related papers (2023-07-25T09:45:47Z)
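The filtering step can be sketched in a few lines, assuming a maximization convention and an illustrative confidence parameter beta (a paraphrase of the idea, not BALLET's exact rule): a candidate stays in the region of interest only if its upper confidence bound beats the best lower confidence bound, so the maximizer is retained with high probability.

```python
# High-confidence region-of-interest filter, sketched (not BALLET's exact rule).
import numpy as np

def roi_filter(gp, candidates, beta=2.0):
    mu, sigma = gp.predict(candidates, return_std=True)
    ucb, lcb = mu + beta * sigma, mu - beta * sigma
    return candidates[ucb >= lcb.max()]  # adaptively shrunk search space
```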
- SOBER: Highly Parallel Bayesian Optimization and Bayesian Quadrature over Discrete and Mixed Spaces [6.573393706476156]
We present a novel diversified global optimisation quadrature with arbitrary kernels over discrete and mixed spaces.
Batch quadrature can efficiently solve both tasks by balancing the merits of exploitative Bayesian quadrature.
We show that SOBER outperforms competitive baselines on efficient batch and scalable real-world tasks.
arXiv Detail & Related papers (2023-01-27T16:36:33Z)
- Optimistic Optimization of Gaussian Process Samples [30.226274682578172]
A competing, computationally more efficient, global optimization framework is optimistic optimization, which exploits prior knowledge about the geometry of the search space in the form of a dissimilarity function.
We argue that there is a new research domain between geometric and probabilistic search, i.e., methods that run drastically faster than traditional Bayesian optimization, while retaining some of the crucial functionality of Bayesian optimization.
arXiv Detail & Related papers (2022-09-02T09:06:24Z)
- Non-Convex Optimization with Certificates and Fast Rates Through Kernel Sums of Squares [68.8204255655161]
We consider potentially non-convex optimization problems.
In this paper, we propose an algorithm that achieves close to optimal a priori computational guarantees.
arXiv Detail & Related papers (2022-04-11T09:37:04Z)
- An Efficient Batch Constrained Bayesian Optimization Approach for Analog Circuit Synthesis via Multi-objective Acquisition Ensemble [11.64233949999656]
We propose an efficient parallelizable Bayesian optimization algorithm via Multi-objective ACquisition function Ensemble (MACE).
Our proposed algorithm can reduce the overall simulation time by up to 74 times compared to differential evolution (DE) for the unconstrained optimization problem when the batch size is 15.
For the constrained optimization problem, our proposed algorithm can speed up the optimization process by up to 15 times compared to the weighted expected improvement based Bayesian optimization (WEIBO) approach, when the batch size is 15.
arXiv Detail & Related papers (2021-06-28T13:21:28Z)
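A sketch of the ensemble idea, assuming EI, PI, and LCB as the member acquisitions (every name and rule below is illustrative rather than the paper's code): score all candidates under each acquisition, keep the Pareto-optimal ones, and draw the batch from that front.

```python
# Multi-objective acquisition ensemble, sketched: batch points are sampled
# from the Pareto front over several acquisition functions.
import numpy as np
from scipy.stats import norm

def acquisition_matrix(gp, X, y_best, beta=2.0):
    mu, sigma = gp.predict(X, return_std=True)
    sigma = np.maximum(sigma, 1e-9)
    z = (y_best - mu) / sigma                     # minimization convention
    ei = sigma * (z * norm.cdf(z) + norm.pdf(z))  # expected improvement
    pi = norm.cdf(z)                              # probability of improvement
    lcb = -(mu - beta * sigma)                    # negated LCB: larger = better
    return np.column_stack([ei, pi, lcb])

def pareto_mask(A):
    # A point is Pareto-optimal if no other point dominates it in all columns.
    mask = np.ones(len(A), dtype=bool)
    for i in range(len(A)):
        dominated = np.all(A >= A[i], axis=1) & np.any(A > A[i], axis=1)
        if dominated.any():
            mask[i] = False
    return mask

def mace_batch(gp, candidates, y_best, batch_size, rng):
    front = candidates[pareto_mask(acquisition_matrix(gp, candidates, y_best))]
    idx = rng.choice(len(front), size=min(batch_size, len(front)), replace=False)
    return front[idx]
```

Sampling the batch from the Pareto front lets the ensemble hedge across acquisition functions instead of committing to any single one.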
- Bilevel Optimization: Convergence Analysis and Enhanced Design [63.64636047748605]
Bilevel optimization is a tool for many machine learning problems.
We propose a novel stochastic gradient estimator named stoc-BiO.
arXiv Detail & Related papers (2020-10-15T18:09:48Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the content (including all information) and is not responsible for any consequences.