Principal Component Analysis Applied to Gradient Fields in Band Gap
Optimization Problems for Metamaterials
- URL: http://arxiv.org/abs/2104.02588v1
- Date: Sun, 4 Apr 2021 11:13:37 GMT
- Title: Principal Component Analysis Applied to Gradient Fields in Band Gap
Optimization Problems for Metamaterials
- Authors: Giorgio Gnecco, Andrea Bacigalupo, Francesca Fantoni, and Daniela
Selvi
- Abstract summary: This article describes the application of a related unsupervised machine learning technique, namely, principal component analysis, to approximate the gradient of the objective function of a band gap optimization problem for an acoustic metamaterial.
Numerical results show the effectiveness of the proposed method.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: A promising technique for the spectral design of acoustic metamaterials is
based on the formulation of suitable constrained nonlinear optimization
problems. Unfortunately, the straightforward application of classical
gradient-based iterative optimization algorithms to the numerical solution of
such problems is typically highly demanding, due to the complexity of the
underlying physical models. Nevertheless, supervised machine learning
techniques can reduce such a computational effort, e.g., by replacing the
original objective functions of such optimization problems with more-easily
computable approximations. In this framework, the present article describes the
application of a related unsupervised machine learning technique, namely,
principal component analysis, to approximate the gradient of the objective
function of a band gap optimization problem for an acoustic metamaterial, with
the aim of making the successive application of a gradient-based iterative
optimization algorithm faster. Numerical results show the effectiveness of the
proposed method.
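To make the idea concrete, here is a minimal NumPy sketch of the general strategy described in the abstract, not the authors' implementation: exact gradients are sampled offline at a few designs, PCA extracts a low-dimensional basis for the gradient field, and gradients during optimization are then reconstructed from only a handful of cheap directional finite differences. The `objective` and `exact_grad` functions and all dimensions are placeholder assumptions standing in for the expensive metamaterial model.

```python
import numpy as np

# --- Hypothetical stand-ins for the expensive physical model ---------------
def objective(x):                 # e.g. negative band-gap width
    return np.sum(np.sin(x) ** 2)

def exact_grad(x):                # assumed expensive in the real problem
    return 2.0 * np.sin(x) * np.cos(x)

# 1) Offline: sample exact gradients at a few designs and fit PCA.
dim, n_samples, k = 20, 50, 3
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(n_samples, dim))
G = np.vstack([exact_grad(x) for x in X])
mean = G.mean(axis=0)
_, _, Vt = np.linalg.svd(G - mean, full_matrices=False)
V = Vt[:k]                        # k leading principal directions

# 2) Online: approximate the gradient from k directional derivatives only,
#    each obtained by one cheap finite difference along a principal direction.
def pca_grad(x, eps=1e-5):
    f0 = objective(x)
    coeffs = np.array([(objective(x + eps * v) - f0) / eps - v @ mean
                       for v in V])
    return mean + V.T @ coeffs

# 3) Plain gradient descent driven by the PCA-approximated gradient.
x = rng.uniform(-1.0, 1.0, size=dim)
for _ in range(200):
    x -= 0.1 * pca_grad(x)
print("final objective:", objective(x))
```

Under these assumptions, each descent step costs k + 1 objective evaluations instead of one full gradient of the physical model.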
Related papers
- Analyzing and Enhancing the Backward-Pass Convergence of Unrolled
Optimization [50.38518771642365]
The integration of constrained optimization models as components in deep networks has led to promising advances on many specialized learning tasks.
A central challenge in this setting is backpropagation through the solution of an optimization problem, which often lacks a closed form.
This paper provides theoretical insights into the backward pass of unrolled optimization, showing that it is equivalent to the solution of a linear system by a particular iterative method.
A system called Folded Optimization is proposed to construct more efficient backpropagation rules from unrolled solver implementations.
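A toy demonstration of this equivalence, using a quadratic inner problem rather than anything from the paper: the Jacobian accumulated by unrolling gradient descent obeys a fixed-point recursion that is exactly an iterative (Richardson) solver for the linear system A J = I, so it converges to the implicit-differentiation answer A^{-1}.

```python
import numpy as np

# Toy inner problem: x*(theta) = argmin_x 0.5 x'Ax - theta'x  =>  x* = A^{-1} theta.
rng = np.random.default_rng(0)
M = rng.normal(size=(5, 5))
A = M @ M.T + 5 * np.eye(5)           # symmetric positive definite
alpha = 1.0 / np.linalg.norm(A, 2)    # step size below 1/L

# Forward unrolling of gradient descent, accumulating dx_k/dtheta.
theta = rng.normal(size=5)
x = np.zeros(5)
J = np.zeros((5, 5))
for _ in range(500):
    x = x - alpha * (A @ x - theta)                       # solver iteration
    J = (np.eye(5) - alpha * A) @ J + alpha * np.eye(5)   # unrolled Jacobian

# The Jacobian recursion is a fixed-point (Richardson) iteration for A J = I,
# so the unrolled backward pass converges to the implicit-differentiation answer.
print(np.allclose(x, np.linalg.solve(A, theta)),
      np.allclose(J, np.linalg.inv(A)))                   # True True
```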
arXiv Detail & Related papers (2023-12-28T23:15:18Z)
- Linearization Algorithms for Fully Composite Optimization [61.20539085730636]
This paper studies first-order algorithms for solving fully composite optimization problems over convex compact sets.
We leverage the structure of the objective by handling its differentiable and non-differentiable parts separately, linearizing only the smooth components.
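The following sketch illustrates the classical form of this idea (a generalized conditional-gradient step, not necessarily the paper's algorithm) on an l1-regularized least-squares problem over a box: only the smooth term is linearized, while the nonsmooth term enters the subproblem exactly.

```python
import numpy as np

# Generalized conditional-gradient (Frank-Wolfe type) step: linearize only the
# smooth term f, keep the nonsmooth term h exact inside the subproblem.
rng = np.random.default_rng(1)
B = rng.normal(size=(30, 10))
y = rng.normal(size=30)
lam = 0.5                                   # weight of the l1 term

grad_f = lambda x: B.T @ (B @ x - y)        # smooth part: 0.5 * ||Bx - y||^2
F = lambda x: 0.5 * np.sum((B @ x - y) ** 2) + lam * np.sum(np.abs(x))

x = np.zeros(10)                            # feasible set: the box [-1, 1]^10
for k in range(200):
    g = grad_f(x)
    # Linearized subproblem  min_{s in box} <g, s> + lam * ||s||_1  is separable:
    s = np.where(np.abs(g) > lam, -np.sign(g), 0.0)
    gamma = 2.0 / (k + 2)                   # standard open-loop step size
    x = x + gamma * (s - x)
print("objective:", F(x))
```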
arXiv Detail & Related papers (2023-02-24T18:41:48Z)
- Backpropagation of Unrolled Solvers with Folded Optimization [55.04219793298687]
The integration of constrained optimization models as components in deep networks has led to promising advances on many specialized learning tasks.
One typical strategy is algorithm unrolling, which relies on automatic differentiation through the operations of an iterative solver.
This paper provides theoretical insights into the backward pass of unrolled optimization, leading to a system for generating efficiently solvable analytical models of backpropagation.
arXiv Detail & Related papers (2023-01-28T01:50:42Z)
- An Empirical Evaluation of Zeroth-Order Optimization Methods on AI-driven Molecule Optimization [78.36413169647408]
We study the effectiveness of various ZO optimization methods for optimizing molecular objectives.
We show the advantages of ZO sign-based gradient descent (ZO-signGD).
We demonstrate the potential effectiveness of ZO optimization methods on widely used benchmark tasks from the Guacamol suite.
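A minimal sketch of ZO-signGD as commonly described in the zeroth-order literature (the molecular objective is replaced by a toy quadratic, and all hyperparameters are illustrative): the gradient is estimated from random-direction finite differences, and the update follows only its sign.

```python
import numpy as np

def zo_sign_gd(f, x0, steps=300, q=20, mu=1e-3, lr=0.05, seed=0):
    """Zeroth-order sign gradient descent: estimate the gradient from q
    random-direction finite differences, then step along its sign."""
    rng = np.random.default_rng(seed)
    x = x0.copy()
    for _ in range(steps):
        g = np.zeros_like(x)
        for _ in range(q):
            u = rng.normal(size=x.shape)          # random probe direction
            g += (f(x + mu * u) - f(x - mu * u)) / (2 * mu) * u
        x -= lr * np.sign(g / q)                  # sign of the ZO estimate
    return x

# Toy black-box objective standing in for a molecular score.
f = lambda x: np.sum((x - 1.0) ** 2)
x = zo_sign_gd(f, np.zeros(8))
print("f(x) =", f(x))                             # close to the optimum at 1
```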
arXiv Detail & Related papers (2022-10-27T01:58:10Z)
- Meta-Learning Digitized-Counterdiabatic Quantum Optimization [3.0638256603183054]
We tackle the problem of finding suitable initial parameters for variational optimization by employing a meta-learning technique using recurrent neural networks.
We investigate this technique with the recently proposed digitized-counterdiabatic quantum approximate optimization algorithm (DC-QAOA)
The combination of meta-learning and DC-QAOA enables us to find optimal initial parameters for different models, such as the MaxCut problem and the Sherrington-Kirkpatrick model.
arXiv Detail & Related papers (2022-06-20T18:57:50Z)
- Optimal Design of Electric Machine with Efficient Handling of Constraints and Surrogate Assistance [5.387300498478744]
This article proposes an optimization method incorporated into a popularly-used evolutionary multi-objective optimization algorithm - NSGA-II.
The proposed method exploits the inexpensiveness of geometric constraints to generate feasible designs by using a custom repair operator.
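The repair idea can be illustrated with a hedged sketch; the constraint set below is a hypothetical machine geometry, not the one from the paper. The point is that feasibility is restored with cheap arithmetic before any expensive simulation is spent on an infeasible design.

```python
import numpy as np

# Illustrative repair operator: geometric feasibility is cheap to check, so
# infeasible offspring are repaired before any expensive evaluation.
# Hypothetical design vector x = (inner_r, outer_r, gap), with bound
# constraints and the geometric constraint outer_r >= inner_r + gap.
LOW  = np.array([0.01, 0.02, 0.001])
HIGH = np.array([0.10, 0.20, 0.010])

def repair(x):
    x = np.clip(x, LOW, HIGH)                # cheap bound repair
    if x[1] < x[0] + x[2]:                   # cheap geometric check
        x[1] = x[0] + x[2]                   # push outer radius to feasibility
    return x

def make_offspring(parent, rng, sigma=0.01):
    child = parent + rng.normal(scale=sigma, size=parent.shape)
    return repair(child)                     # repair before costly simulation

rng = np.random.default_rng(2)
parent = np.array([0.05, 0.08, 0.005])
child = make_offspring(parent, rng)
print(child, child[1] >= child[0] + child[2])    # always feasible
```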
arXiv Detail & Related papers (2022-06-03T17:13:29Z)
- Amortized Implicit Differentiation for Stochastic Bilevel Optimization [53.12363770169761]
We study a class of algorithms for solving bilevel optimization problems in both deterministic and stochastic settings.
We exploit a warm-start strategy to amortize the estimation of the exact gradient.
By using this framework, our analysis shows these algorithms to match the computational complexity of methods that have access to an unbiased estimate of the gradient.
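A minimal sketch of the warm-start idea on a toy ridge-regression bilevel problem (an illustration, not the paper's setting): each outer step reuses the previous inner solution, so a few inner iterations suffice, and the hypergradient is obtained by implicit differentiation.

```python
import numpy as np

rng = np.random.default_rng(3)
Xtr, ytr = rng.normal(size=(50, 10)), rng.normal(size=50)
Xva, yva = rng.normal(size=(50, 10)), rng.normal(size=50)

def inner_solve(lam, w, iters):
    """Inner problem: ridge regression, solved by gradient descent
    starting from the warm-start w (instead of from scratch)."""
    L = np.linalg.norm(Xtr.T @ Xtr, 2) + lam          # gradient Lipschitz const.
    for _ in range(iters):
        w = w - (Xtr.T @ (Xtr @ w - ytr) + lam * w) / L
    return w

lam, w = 1.0, np.zeros(10)
for _ in range(100):
    w = inner_solve(lam, w, iters=10)        # warm start: few steps suffice
    # Hypergradient of the validation loss via implicit differentiation:
    # dw/dlam = -(Xtr'Xtr + lam*I)^{-1} w  at the (approximate) inner solution.
    v = np.linalg.solve(Xtr.T @ Xtr + lam * np.eye(10), w)
    hypergrad = -(Xva.T @ (Xva @ w - yva)) @ v
    lam = max(lam - 0.1 * hypergrad, 1e-3)   # outer gradient step
print("lambda:", lam)
```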
arXiv Detail & Related papers (2021-11-29T15:10:09Z)
- Quantum variational optimization: The role of entanglement and problem hardness [0.0]
We study the role of entanglement, the structure of the variational quantum circuit, and the structure of the optimization problem.
Our numerical results indicate an advantage in adapting the distribution of entangling gates to the problem's topology.
We find evidence that applying conditional value at risk type cost functions improves the optimization, increasing the probability of overlap with the optimal solutions.
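A CVaR-type cost of this kind is easy to state: instead of the mean energy over all measurement shots, average only the best alpha-fraction. A small sketch with hypothetical shot values:

```python
import numpy as np

def cvar_cost(energies, alpha=0.2):
    """Conditional value at risk of measured energies: average only the
    best (lowest) alpha-fraction of the samples instead of the full mean."""
    energies = np.sort(np.asarray(energies))
    k = max(1, int(np.ceil(alpha * len(energies))))
    return energies[:k].mean()

# Example: shot outcomes from a variational circuit (hypothetical values).
shots = np.array([3.0, 1.0, 4.0, 1.0, 5.0, 9.0, 2.0, 6.0])
print(cvar_cost(shots, alpha=0.25))   # mean of the 2 lowest energies -> 1.0
```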
arXiv Detail & Related papers (2021-03-26T14:06:54Z)
- A Primer on Zeroth-Order Optimization in Signal Processing and Machine Learning [95.85269649177336]
ZO optimization iteratively performs three major steps: gradient estimation, descent direction, and solution update.
We demonstrate promising applications of ZO optimization, such as evaluating and generating explanations from black-box deep learning models, and efficient online sensor management.
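The three steps can be shown in a few lines; the quadratic objective below is a stand-in for a black-box model query, and the two-point estimator is one common choice among several.

```python
import numpy as np

def zo_step(f, x, rng, mu=1e-4, lr=0.05):
    # Step 1: gradient estimation (two-point random-direction estimator).
    u = rng.normal(size=x.shape)
    g = (f(x + mu * u) - f(x - mu * u)) / (2 * mu) * u
    # Step 2: descent direction (here simply the negative estimate).
    d = -g
    # Step 3: solution update.
    return x + lr * d

f = lambda x: np.sum(x ** 2)          # stand-in for a black-box model query
rng = np.random.default_rng(4)
x = np.ones(5)
for _ in range(2000):
    x = zo_step(f, x, rng)
print("f(x) =", f(x))                 # decreases toward 0
```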
arXiv Detail & Related papers (2020-06-11T06:50:35Z)
- Objective-Sensitive Principal Component Analysis for High-Dimensional Inverse Problems [0.0]
We present a novel approach for adaptive, differentiable parameterization of large-scale random fields.
The developed technique is based on principal component analysis (PCA) but modifies a purely data-driven basis of principal components considering objective function behavior.
Three algorithms for optimal parameter decomposition are presented and applied to an objective of 2D synthetic history matching.
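One hedged reading of this construction (an illustration only, not one of the paper's three algorithms): fit ordinary PCA to field snapshots, then rank the components by a cheap finite-difference sensitivity of the objective and keep the most influential ones. The snapshot ensemble and `objective` below are synthetic placeholders.

```python
import numpy as np

# Plain PCA parameterization of a random field, then a simple
# objective-sensitive twist: rank components by how strongly the
# objective reacts to each of them (finite-difference sensitivity).
rng = np.random.default_rng(5)
snapshots = rng.normal(size=(100, 64))          # realizations of the field
mean = snapshots.mean(axis=0)
_, _, Vt = np.linalg.svd(snapshots - mean, full_matrices=False)

def objective(field):                            # stand-in for the simulator
    return np.sum(field[:32] ** 2)

base = objective(mean)
sens = np.array([abs(objective(mean + 0.1 * v) - base) for v in Vt])
order = np.argsort(-sens)                        # most objective-relevant first
basis = Vt[order[:10]]                           # keep 10 sensitive components
```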
arXiv Detail & Related papers (2020-06-02T18:51:17Z)
This list is automatically generated from the titles and abstracts of the papers on this site.