A Framework for Discovering Optimal Solutions in Photonic Inverse Design
- URL: http://arxiv.org/abs/2106.08419v1
- Date: Thu, 3 Jun 2021 22:11:03 GMT
- Title: A Framework for Discovering Optimal Solutions in Photonic Inverse Design
- Authors: Jagrit Digani, Phillip Hon, Artur R. Davoyan
- Abstract summary: Photonic inverse design has emerged as an indispensable engineering tool for complex optical systems.
Finding solutions approaching the global optimum may present a computationally intractable task.
We develop a framework that expedites the search for solutions close to the global optimum in complex optimization spaces.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Photonic inverse design has emerged as an indispensable engineering tool for
complex optical systems. In many instances it is important to optimize for both
material and geometry configurations, which results in complex non-smooth
search spaces with multiple local minima. Finding solutions approaching the
global optimum may present a computationally intractable task. Here, we develop
a framework that expedites the search for solutions close to the global
optimum in complex optimization spaces. We study the way representative black
box optimization algorithms work, including genetic algorithm (GA), particle
swarm optimization (PSO), simulated annealing (SA), and mesh adaptive direct
search (NOMAD). We then propose and utilize a two-step approach that identifies
best performance algorithms on arbitrarily complex search spaces. We reveal a
connection between the search space complexity and algorithm performance and
find that PSO and NOMAD consistently deliver better performance for mixed
integer problems encountered in photonic inverse design, particularly when
material combinations are taken into account. Our results differ from the
commonly anticipated advantage of GA. Our findings will foster more efficient design of
photonic systems with optimal performance.
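The abstract compares black-box optimizers such as PSO on mixed-integer search spaces (continuous geometry parameters plus discrete material choices). A minimal sketch of particle swarm optimization with a round-and-clip handling of integer dimensions is given below; the toy cost function and the specific integer encoding are illustrative assumptions, not the paper's photonic model.

```python
# Minimal PSO sketch for a mixed-integer objective. The first n_cont
# dimensions are continuous; the remaining n_int dimensions are treated as
# integers in [int_lo, int_hi] via rounding and clipping (an assumed
# encoding; the paper's exact scheme may differ).
import random

def pso(cost, n_cont, n_int, int_lo, int_hi, swarm=20, iters=100, seed=0):
    rng = random.Random(seed)
    dim = n_cont + n_int

    def clip_round(x):
        # Round and clip the integer dimensions before evaluating the cost.
        out = list(x)
        for i in range(n_cont, dim):
            out[i] = min(int_hi, max(int_lo, round(out[i])))
        return out

    pos = [[rng.uniform(-1.0, 1.0) for _ in range(dim)] for _ in range(swarm)]
    vel = [[0.0] * dim for _ in range(swarm)]
    pbest = [clip_round(p) for p in pos]
    pval = [cost(p) for p in pbest]
    g = min(range(swarm), key=lambda i: pval[i])
    gbest, gval = pbest[g][:], pval[g]
    w, c1, c2 = 0.7, 1.5, 1.5  # standard inertia / attraction weights
    for _ in range(iters):
        for i in range(swarm):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            cand = clip_round(pos[i])
            v = cost(cand)
            if v < pval[i]:
                pbest[i], pval[i] = cand, v
                if v < gval:
                    gbest, gval = cand[:], v
    return gbest, gval
```

For example, `pso(lambda x: (x[0] - 0.5) ** 2 + (x[1] - 2) ** 2, 1, 1, 0, 5)` searches one continuous and one integer dimension; the round-and-clip relaxation is a common but crude way to expose an integer variable to a continuous swarm update.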
Related papers
- Halfway Escape Optimization: A Quantum-Inspired Solution for Complex Optimization Problems [6.3816899727206895]
The Halfway Escape Optimization (HEO) algorithm is a novel quantum-inspired metaheuristic designed to address complex optimization problems characterized by rugged landscapes and high dimensionality, with an efficient convergence rate.
The study presents a comprehensive comparative evaluation of HEO's performance against established optimization algorithms, including Particle Swarm Optimization (PSO), Genetic Algorithm (GA), Artificial Fish Swarm Algorithm (AFSA), Grey Wolf Optimizer (GWO), and Quantum-behaved Particle Swarm Optimization (QPSO).
A simple test of HEO on the Traveling Salesman Problem (TSP) also suggests its feasibility in real-time applications.
arXiv Detail & Related papers (2024-05-05T08:43:07Z) - Federated Conditional Stochastic Optimization [110.513884892319]
Conditional stochastic optimization has found applications in a wide range of machine learning tasks, such as invariant learning, AUPRC maximization, and MAML.
This paper proposes conditional stochastic optimization algorithms for the distributed federated learning setting.
arXiv Detail & Related papers (2023-10-04T01:47:37Z) - Sample Complexity for Quadratic Bandits: Hessian Dependent Bounds and
Optimal Algorithms [64.10576998630981]
We show the first tight characterization of the optimal Hessian-dependent sample complexity.
A Hessian-independent algorithm universally achieves the optimal sample complexities for all Hessian instances.
The optimal sample complexities achieved by our algorithm remain valid for heavy-tailed noise distributions.
arXiv Detail & Related papers (2023-06-21T17:03:22Z) - Approaching Globally Optimal Energy Efficiency in Interference Networks
via Machine Learning [22.926877147296594]
This work presents a machine learning approach to optimize the energy efficiency (EE) in a multi-cell wireless network.
Results show that the method achieves an EE close to the optimum computed by a branch-and-bound based method.
arXiv Detail & Related papers (2022-11-25T08:36:34Z) - An Empirical Evaluation of Zeroth-Order Optimization Methods on
AI-driven Molecule Optimization [78.36413169647408]
We study the effectiveness of various ZO optimization methods for optimizing molecular objectives.
We show the advantages of ZO sign-based gradient descent (ZO-signGD).
We demonstrate the potential effectiveness of ZO optimization methods on widely used benchmark tasks from the Guacamol suite.
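ZO-signGD, named above, estimates gradients from function evaluations alone and steps against their sign. A minimal sketch on a toy quadratic (not a molecular objective) follows; the forward-difference estimator and all parameter values are illustrative assumptions.

```python
# Sketch of zeroth-order sign-based gradient descent (ZO-signGD): estimate
# the gradient by averaging random forward-difference probes, then take a
# fixed-size step against the sign of each estimated component.
import random

def zo_sign_gd(f, x0, iters=200, mu=1e-3, lr=0.05, n_dirs=10, seed=0):
    rng = random.Random(seed)
    x = list(x0)
    dim = len(x)
    for _ in range(iters):
        fx = f(x)
        grad = [0.0] * dim
        for _ in range(n_dirs):
            u = [rng.gauss(0.0, 1.0) for _ in range(dim)]
            xp = [xi + mu * ui for xi, ui in zip(x, u)]
            # Directional forward difference along the random probe u.
            d = (f(xp) - fx) / mu
            for j in range(dim):
                grad[j] += d * u[j] / n_dirs
        # Only the sign of the estimated gradient is used for the update.
        x = [xi - lr * (1 if gj > 0 else -1 if gj < 0 else 0)
             for xi, gj in zip(x, grad)]
    return x
```

With a fixed step size, the iterate oscillates within roughly `lr` of the minimizer; annealing `lr` is the usual refinement.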
arXiv Detail & Related papers (2022-10-27T01:58:10Z) - Efficient Non-Parametric Optimizer Search for Diverse Tasks [93.64739408827604]
We present the first efficient, scalable, and general framework that can directly search on the tasks of interest.
Inspired by the innate tree structure of the underlying math expressions, we re-arrange the spaces into a super-tree.
We adopt an adaptation of the Monte Carlo method to tree search, equipped with rejection sampling and equivalent-form detection.
arXiv Detail & Related papers (2022-09-27T17:51:31Z) - Fighting the curse of dimensionality: A machine learning approach to
finding global optima [77.34726150561087]
This paper shows how to find global optima in structural optimization problems.
By exploiting certain cost functions we either obtain the global optimum at best or, at worst, obtain results superior to those of established optimization procedures.
arXiv Detail & Related papers (2021-10-28T09:50:29Z) - LinEasyBO: Scalable Bayesian Optimization Approach for Analog Circuit
Synthesis via One-Dimensional Subspaces [11.64233949999656]
We propose a fast and robust Bayesian optimization approach via one-dimensional subspaces for analog circuit synthesis.
Our proposed algorithm can accelerate the optimization procedure by up to 9x and 38x compared to LP-EI and REMBOpBO respectively when the batch size is 15.
arXiv Detail & Related papers (2021-09-01T21:25:25Z) - Robust Topology Optimization Using Variational Autoencoders [2.580765958706854]
In this work, we use neural network surrogates to enable a faster solution approach via surrogate-based optimization.
We also build a Variational Autoencoder (VAE) to transform the high dimensional design space into a low dimensional one.
The resulting gradient-based optimization algorithm produces optimal designs with lower robust compliances than those observed in the training set.
arXiv Detail & Related papers (2021-07-19T20:40:51Z) - Manifold Interpolation for Large-Scale Multi-Objective Optimization via
Generative Adversarial Networks [12.18471608552718]
Large-scale multiobjective optimization problems (LSMOPs) are characterized as involving hundreds or even thousands of decision variables and multiple conflicting objectives.
Previous research has shown that these optimal solutions are uniformly distributed on the manifold structure in the low-dimensional space.
In this work, a generative adversarial network (GAN)-based manifold framework is proposed to learn the manifold and generate high-quality solutions.
arXiv Detail & Related papers (2021-01-08T09:38:38Z) - Towards Optimally Efficient Tree Search with Deep Learning [76.64632985696237]
This paper investigates the classical integer least-squares problem, which estimates integer signals from linear models.
The problem is NP-hard and often arises in diverse applications such as signal processing, bioinformatics, communications and machine learning.
We propose a general hyper-accelerated tree search (HATS) algorithm by employing a deep neural network to estimate the optimal heuristic for the underlying simplified memory-bounded A* algorithm.
arXiv Detail & Related papers (2021-01-07T08:00:02Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.