Approximation of optimization problems with constraints through kernel Sum-Of-Squares
- URL: http://arxiv.org/abs/2301.06339v2
- Date: Wed, 21 Feb 2024 15:14:39 GMT
- Title: Approximation of optimization problems with constraints through kernel Sum-Of-Squares
- Authors: Pierre-Cyril Aubin-Frankowski and Alessandro Rudi
- Abstract summary: We show that pointwise inequalities are turned into equalities within a class of nonnegative kSoS functions.
We also show that focusing on pointwise equality constraints enables the use of scattering inequalities to mitigate the curse of dimensionality in sampling the constraints.
- Score: 77.27820145069515
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Handling an infinite number of inequality constraints in infinite-dimensional
spaces occurs in many fields, from global optimization to optimal transport.
These problems have been tackled individually in several previous articles
through kernel Sum-Of-Squares (kSoS) approximations. We propose here a unified
theorem to prove convergence guarantees for these schemes. Pointwise
inequalities are turned into equalities within a class of nonnegative kSoS
functions. Assuming further that the functions appearing in the problem are
smooth, focusing on pointwise equality constraints enables the use of
scattering inequalities to mitigate the curse of dimensionality in sampling the
constraints. Our approach is illustrated in learning vector fields with side
information, here the invariance of a set.
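To make the construction concrete, here is a minimal numerical sketch (not the authors' code; the Gaussian kernel, anchor points, and the parametrization A = B^T B are illustrative assumptions) of a kernel Sum-Of-Squares model f_A(x) = phi(x)^T A phi(x). With A positive semidefinite, f_A is nonnegative everywhere by construction, which is how a pointwise inequality constraint can be replaced by membership in the kSoS class:

```python
# Minimal kSoS sketch: f_A(x) = phi(x)^T A phi(x) with A = B^T B >= 0,
# so f_A(x) >= 0 for every x by construction (a sum of squares).
import numpy as np

def gaussian_features(x, anchors, sigma=1.0):
    """Feature map phi(x) = (k(x, x_1), ..., k(x, x_n)) for a Gaussian kernel."""
    d2 = np.sum((anchors - x) ** 2, axis=1)
    return np.exp(-d2 / (2 * sigma**2))

def ksos_value(x, anchors, B, sigma=1.0):
    """Evaluate f_A(x) = ||B phi(x)||^2, i.e. A = B^T B is positive semidefinite."""
    phi = gaussian_features(x, anchors, sigma)
    v = B @ phi
    return float(v @ v)

rng = np.random.default_rng(0)
anchors = rng.uniform(-1, 1, size=(20, 2))  # sample points x_1..x_n (illustrative)
B = rng.normal(size=(5, 20))                # any B yields a valid PSD matrix A
x = np.array([0.3, -0.7])
print(ksos_value(x, anchors, B) >= 0)       # True by construction
```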
Related papers
- Convergence guarantee for linearly-constrained combinatorial optimization with a quantum alternating operator ansatz [0.0]
We present a quantum alternating operator ansatz (QAOA$+$) that solves a class of linearly constrained optimization problems.
For problems in this class, we devise circuits that provably converge to the optimal solution as the number of circuit layers increases.
This analysis extends QAOA$+$ performance guarantees to a more general set of linearly-constrained problems and provides tools for future generalizations.
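As a rough illustration of the alternating-operator structure (a numpy sketch under assumptions: the cost values, angles, and XY mixer below are illustrative, not the paper's circuits), each layer applies a cost phase followed by a mixer that preserves the linear constraint, here Hamming weight one:

```python
# Alternating operator ansatz sketch: layers of exp(-i*gamma*H_C) exp(-i*beta*H_M),
# with an XY mixer that preserves the feasible (Hamming-weight-1) subspace.
import numpy as np
from scipy.linalg import expm

n = 3
I2 = np.eye(2); X = np.array([[0, 1], [1, 0]]); Y = np.array([[0, -1j], [1j, 0]])

def op(single, i):
    mats = [single if j == i else I2 for j in range(n)]
    out = mats[0]
    for m in mats[1:]:
        out = np.kron(out, m)
    return out

costs = np.array([9, 3, 5, 9, 1, 9, 9, 9], dtype=float)  # illustrative costs
H_M = sum(0.5 * (op(X, i) @ op(X, j) + op(Y, i) @ op(Y, j))
          for i in range(n) for j in range(i + 1, n))

state = np.zeros(8, dtype=complex); state[4] = 1.0       # feasible start |100>
for gamma, beta in [(0.7, 0.4), (0.9, 0.2)]:             # illustrative angles
    state = np.exp(-1j * gamma * costs) * state          # cost phase separation
    state = expm(-1j * beta * H_M) @ state               # constraint-preserving mix
probs = np.abs(state) ** 2
print(np.argmax(probs), probs[[1, 2, 4]].sum())  # all mass stays on feasible states
```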
arXiv Detail & Related papers (2024-09-27T15:23:47Z)
- Primal Methods for Variational Inequality Problems with Functional Constraints [25.261426717550293]
We propose a primal method, termed Constrained Gradient Method (CGM), for addressing functional constrained variational inequality problems.
We establish a non-asymptotic convergence analysis of the algorithm for variational inequality problems with monotone operators under smooth constraints.
Our algorithms match the complexity of projection-based methods in terms of operator queries for both monotone and strongly monotone settings.
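The summary does not spell out the update rule; the following is a hedged sketch of one plausible primal scheme for a VI with a functional constraint, using a penalty on constraint violation rather than a projection (the paper's CGM update may differ, and all constants here are illustrative):

```python
# Primal sketch for a VI: find x with <F(x), y - x> >= 0 for all feasible y,
# handling g(x) <= 0 by a gradient penalty instead of projecting onto {g <= 0}.
import numpy as np

b = np.array([2.0, 1.0])
F = lambda x: x - b                  # strongly monotone operator
g = lambda x: np.sum(x**2) - 1.0     # functional constraint: ||x||^2 <= 1
grad_g = lambda x: 2.0 * x

x = np.zeros(2)
eta, rho = 0.01, 10.0                # step size and penalty weight (assumed)
for _ in range(3000):
    x = x - eta * (F(x) + rho * max(g(x), 0.0) * grad_g(x))

# Close to the projection of b onto the unit ball; the penalty leaves a
# slight infeasibility that shrinks as rho grows.
print(x, b / np.linalg.norm(b))
```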
arXiv Detail & Related papers (2024-03-19T16:03:03Z)
- High-Probability Bounds for Stochastic Optimization and Variational Inequalities: the Case of Unbounded Variance [59.211456992422136]
We propose algorithms with high-probability convergence results under less restrictive assumptions.
These results justify the usage of the considered methods for solving problems that do not fit standard functional classes in optimization.
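One standard device for high-probability guarantees under heavy-tailed noise is gradient clipping; the sketch below illustrates that idea under the assumption that clipping is the relevant mechanism here (the paper's exact algorithms may differ):

```python
# Clipped SGD sketch on f(x) = (x - 3)^2 with infinite-variance noise.
import numpy as np

rng = np.random.default_rng(1)
grad = lambda x: 2.0 * (x - 3.0)

x, eta, lam = 0.0, 0.05, 2.0              # lam is the clipping level (assumed)
for _ in range(2000):
    noise = rng.standard_t(df=1.5)        # heavy-tailed: variance is unbounded
    g = grad(x) + noise
    g = g * min(1.0, lam / abs(g))        # clip: |g_clipped| <= lam
    x -= eta * g
print(x)  # ends up near the minimizer 3.0 despite the unbounded-variance noise
```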
arXiv Detail & Related papers (2023-02-02T10:37:23Z)
- Fixed-Time Convergence for a Class of Nonconvex-Nonconcave Min-Max Problems [5.787117733071416]
We develop a fixed-time convergent saddle point dynamical system for solving min-max problems.
The proposed method achieves fast convergence compared to other state-of-the-art methods.
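A generic fixed-time construction combines a sublinear and a superlinear power of the gradient field; the Euler-discretized sketch below (exponents and test problem assumed, not taken from the paper) shows the shape of such dynamics:

```python
# Fixed-time saddle-point flow sketch: two power-scaled gradient terms, one
# sublinear (fast near the saddle) and one superlinear (fast far away).
import numpy as np

# f(x, y) = x^2 - y^2 + x*y has its saddle point at the origin.
F = lambda z: np.array([2*z[0] + z[1], 2*z[1] - z[0]])  # (df/dx, -df/dy)

z, h, eps = np.array([5.0, -4.0]), 0.01, 1e-12
for _ in range(5000):
    v = F(z); n = np.linalg.norm(v) + eps
    z = z - h * (v * n**(-0.5) + v * n**(0.5))   # exponents 0.5 are assumed
print(z)  # ends within a tiny neighborhood of the saddle (0, 0)
```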
arXiv Detail & Related papers (2022-07-26T12:25:05Z) - Minimax Optimization: The Case of Convex-Submodular [50.03984152441271]
Minimax problems extend beyond the continuous domain to mixed continuous-discrete domains or even fully discrete domains.
We introduce the class of convex-submodular minimax problems, where the objective is convex with respect to the continuous variable and submodular with respect to the discrete variable.
Our proposed algorithms are iterative and combine tools from both discrete and continuous optimization.
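The sketch below illustrates this interplay under an assumed weighted-coverage coupling (not the paper's algorithms): the convex player takes projected gradient steps, and the discrete player replies with the greedy rule for monotone submodular maximization:

```python
# Convex-submodular minimax sketch: F(x, S) is convex in x for fixed S and
# monotone submodular in S for x >= 0 (weighted coverage, assumed coupling).
import numpy as np

groups = [[0, 1], [1, 2], [2, 3], [0, 3]]     # candidate sets over 4 elements
c = np.array([1.0, 2.0, 1.5, 0.5])
cover = lambda S: set().union(*[groups[i] for i in S])
F = lambda x, S: 0.5 * np.sum((x - c)**2) + sum(x[e] for e in cover(S))

def greedy(x, k):
    """Greedy maximization of F(x, .) under a cardinality constraint |S| <= k."""
    S = []
    for _ in range(k):
        gains = {i: F(x, S + [i]) - F(x, S) for i in range(len(groups)) if i not in S}
        S.append(max(gains, key=gains.get))
    return S

x = c.copy()
for _ in range(200):
    S = greedy(x, k=2)                                    # discrete best response
    grad = (x - c) + np.isin(np.arange(4), list(cover(S)))
    x = np.maximum(x - 0.1 * grad, 0.0)                   # projected descent on x
print(x, greedy(x, 2))
```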
arXiv Detail & Related papers (2021-11-01T21:06:35Z) - Lifting the Convex Conjugate in Lagrangian Relaxations: A Tractable
Approach for Continuous Markov Random Fields [53.31927549039624]
We show that a piecewise discretization preserves contrast better than existing discretizations.
We apply this theory to the problem of matching two images.
arXiv Detail & Related papers (2021-07-13T12:31:06Z) - Optimal oracle inequalities for solving projected fixed-point equations [53.31620399640334]
We study methods that use a collection of random observations to compute approximate solutions by searching over a known low-dimensional subspace of the Hilbert space.
We show how our results precisely characterize the error of a class of temporal difference learning methods for the policy evaluation problem with linear function approximation.
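For concreteness, here is a textbook TD(0) run with linear value function approximation, the setting whose error the entry characterizes; the iterates approximate the projected fixed point Phi*theta = Proj T(Phi*theta). The 3-state chain, features, and step sizes below are illustrative assumptions, not taken from the paper:

```python
# TD(0) with linear function approximation for policy evaluation.
import numpy as np

rng = np.random.default_rng(0)
P = np.array([[0.9, 0.1, 0.0],     # fixed-policy transition matrix (assumed)
              [0.1, 0.8, 0.1],
              [0.0, 0.1, 0.9]])
r = np.array([0.0, 0.0, 1.0])      # rewards
Phi = np.array([[1.0, 0.0], [0.5, 0.5], [0.0, 1.0]])  # feature matrix
gamma, theta, s = 0.95, np.zeros(2), 0

for t in range(1, 100001):
    s_next = rng.choice(3, p=P[s])
    # TD(0): move theta along the temporal-difference error at (s, r, s')
    delta = r[s] + gamma * Phi[s_next] @ theta - Phi[s] @ theta
    theta += (1.0 / np.sqrt(t)) * delta * Phi[s]
    s = s_next
print(Phi @ theta)  # approximate state values learned from random observations
```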
arXiv Detail & Related papers (2020-12-09T20:19:32Z) - Conditional gradient methods for stochastically constrained convex
minimization [54.53786593679331]
We propose two novel conditional gradient-based methods for solving structured convex optimization problems.
The most important feature of our framework is that only a subset of the constraints is processed at each iteration.
Our algorithms rely on variance reduction and smoothing used in conjunction with conditional gradient steps, and are accompanied by rigorous convergence guarantees.
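A simplified sketch of that feature is given below: one randomly sampled constraint is penalized per iteration while Frank-Wolfe steps are taken over a simple base set. The paper's smoothing and variance reduction are omitted, and all constants are illustrative:

```python
# Conditional gradient with one sampled functional constraint per iteration.
import numpy as np

rng = np.random.default_rng(0)
d, m = 5, 50
A = rng.normal(size=(m, d)); b = np.ones(m)   # many constraints a_i @ x <= b_i
c = np.full(d, 2.0)                            # objective: 0.5 * ||x - c||^2

x = np.zeros(d)
for t in range(1, 5001):
    i = rng.integers(m)                        # process ONE sampled constraint
    grad = (x - c) + 10.0 * max(A[i] @ x - b[i], 0.0) * A[i]
    v = (grad < 0).astype(float)               # linear minimization over [0, 1]^d
    x += (2.0 / (t + 2)) * (v - x)             # classical Frank-Wolfe step
print(x, np.mean(A @ x <= b + 1e-2))           # near-feasible for most constraints
```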
arXiv Detail & Related papers (2020-07-07T21:26:35Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.