The polarization hierarchy for polynomial optimization over convex bodies, with applications to nonnegative matrix rank
- URL: http://arxiv.org/abs/2406.09506v1
- Date: Thu, 13 Jun 2024 18:00:09 GMT
- Title: The polarization hierarchy for polynomial optimization over convex bodies, with applications to nonnegative matrix rank
- Authors: Martin Plávala, Laurens T. Ligthart, David Gross
- Abstract summary: We construct a convergent family of outer approximations for the problem of optimizing polynomial functions over convex bodies subject to polynomial constraints.
A numerical implementation of the third level of the hierarchy is shown to give rise to a very tight approximation for this problem.
- Score: 0.6963971634605796
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We construct a convergent family of outer approximations for the problem of optimizing polynomial functions over convex bodies subject to polynomial constraints. This is achieved by generalizing the polarization hierarchy, which has previously been introduced for the study of polynomial optimization problems over state spaces of $C^*$-algebras, to convex cones in finite dimensions. If the convex bodies can be characterized by linear or semidefinite programs, then the same is true for our hierarchy. Convergence is proven by relating the problem to a certain de Finetti theorem for general probabilistic theories, which are studied as possible generalizations of quantum mechanics. We apply the method to the problem of nonnegative matrix factorization, and in particular to the nested rectangles problem. A numerical implementation of the third level of the hierarchy is shown to give rise to a very tight approximation for this problem.
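For orientation on the nonnegative matrix rank application, the following minimal Python sketch (not taken from the paper) runs a standard multiplicative-update NMF heuristic: it searches for a rank-r nonnegative factorization and therefore only probes upper bounds on the nonnegative rank, whereas the paper's hierarchy yields convergent outer approximations of the problem. The test matrix is the slack matrix of the unit square, a standard example with ordinary rank 3 but nonnegative rank 4.

```python
# Illustrative only: a Lee-Seung-style multiplicative-update NMF heuristic used to
# probe rank-r nonnegative factorizations numerically. It gives upper bounds on the
# nonnegative rank; the polarization hierarchy of the paper works on the relaxation
# (outer approximation) side instead.
import numpy as np

def nmf_residual(M, r, iters=2000, seed=0):
    """Best relative Frobenius residual found for M ~ W @ H with W, H >= 0."""
    rng = np.random.default_rng(seed)
    m, n = M.shape
    W = rng.random((m, r)) + 1e-3
    H = rng.random((r, n)) + 1e-3
    eps = 1e-12
    for _ in range(iters):
        H *= (W.T @ M) / (W.T @ W @ H + eps)   # multiplicative updates keep entries >= 0
        W *= (M @ H.T) / (W @ H @ H.T + eps)
    return np.linalg.norm(M - W @ H) / np.linalg.norm(M)

# Slack matrix of the unit square: ordinary rank 3, nonnegative rank 4.
M = np.array([[0, 0, 1, 1],
              [1, 0, 0, 1],
              [1, 1, 0, 0],
              [0, 1, 1, 0]], dtype=float)

for r in (3, 4):
    print(f"r = {r}: relative residual = {nmf_residual(M, r):.4f}")
# Expectation (local minima permitting): the residual stays bounded away from zero
# for r = 3 and becomes small for r = 4, consistent with nonnegative rank 4.
```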
Related papers
- Decomposition Polyhedra of Piecewise Linear Functions [4.594829845106234]
We contribute to the frequently studied question of how to decompose a continuous piecewise linear (CPWL) function into a difference of two convex CPWL functions.
Every CPWL function has infinitely many such decompositions.
We apply our framework to find decompositions with as few linear pieces as possible.
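To make the decomposition statement above concrete, here is a small numerical check on an example chosen for illustration (not taken from the paper): the non-convex CPWL function f(x) = min(|x|, 1) is a difference of two convex CPWL functions, via the identity min(a, b) = a + b - max(a, b).

```python
# Small numerical check of the "difference of two convex CPWL functions" claim
# for the illustrative example f(x) = min(|x|, 1):
#   g(x) = |x| + 1        (convex, piecewise linear)
#   h(x) = max(|x|, 1)    (convex, piecewise linear)
#   f = g - h             (by min(a, b) = a + b - max(a, b))
import numpy as np

x = np.linspace(-3.0, 3.0, 601)
f = np.minimum(np.abs(x), 1.0)
g = np.abs(x) + 1.0
h = np.maximum(np.abs(x), 1.0)

assert np.allclose(f, g - h)  # f is indeed a difference of two convex CPWL functions
print("max deviation:", np.abs(f - (g - h)).max())
```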
arXiv Detail & Related papers (2024-10-07T10:48:36Z) - Covering Number of Real Algebraic Varieties and Beyond: Improved Bounds and Applications [8.438718130535296]
We prove upper bounds on the covering number of sets in Euclidean space.
We show that our bounds improve on the best known general bound due to Yomdin-Comte.
We illustrate the power of the result on three computational applications.
arXiv Detail & Related papers (2023-11-09T03:06:59Z) - Hierarchies for Semidefinite Optimization in
$\mathcal{C}^\star$-Algebras [0.0]
This paper presents a way to construct finite-dimensional relaxations of general conic programs on $\mathcal{C}^\star$-algebras.
We show that this framework encompasses well-known hierarchies for generalized problems such as NPA, as well as Lasserre's hierarchy and, to some extent, symmetry reductions of generic SDPs.
arXiv Detail & Related papers (2023-09-25T09:01:30Z) - Linearization Algorithms for Fully Composite Optimization [61.20539085730636]
This paper studies first-order algorithms for solving fully composite optimization problems over convex compact sets.
We leverage the structure of the objective by handling its differentiable and non-differentiable parts separately, linearizing only the smooth parts.
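As an illustration of this "linearize only the smooth part" template, here is a hedged sketch of a generic generalized conditional-gradient iteration; it is not the specific algorithm of the paper, and the problem instance, step-size rule, and names are assumptions made purely for illustration.

```python
# Generic generalized conditional-gradient sketch: only the smooth term f is
# linearized; the non-differentiable term g is kept exactly in the subproblem.
# Assumed toy instance:
#   minimize  f(x) + g(x)  over the box C = [-1, 1]^n,
#   f(x) = 0.5 * ||A x - b||^2  (smooth),   g(x) = lam * ||x||_1  (non-smooth).
import numpy as np

rng = np.random.default_rng(1)
n, m, lam = 50, 30, 0.1
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)

def grad_f(x):                       # gradient of the smooth part only
    return A.T @ (A @ x - b)

def subproblem(c):
    # argmin_{s in [-1,1]^n}  <c, s> + lam * ||s||_1   (separable, closed form)
    return np.where(c > lam, -1.0, np.where(c < -lam, 1.0, 0.0))

x = np.zeros(n)
for k in range(200):
    s = subproblem(grad_f(x))        # linearize f, keep g and the box C exact
    gamma = 2.0 / (k + 2)            # standard open-loop step size
    x = x + gamma * (s - x)          # convex combination, so x stays in the box

print("objective:", 0.5 * np.linalg.norm(A @ x - b) ** 2 + lam * np.abs(x).sum())
```

The key point of the sketch is that the non-differentiable term enters the subproblem exactly, while only the smooth term is replaced by its linearization at the current iterate.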
arXiv Detail & Related papers (2023-02-24T18:41:48Z) - State polynomials: positivity, optimization and nonlinear Bell
inequalities [3.9692590090301683]
This paper introduces state polynomials, i.e., polynomials in noncommuting variables and formal states of their products.
It shows that state polynomials, positive over all states and matricial states, are sums of squares with denominators.
It is also established that a Krivine-Stengle Positivstellensatz fails to hold in the state setting.
arXiv Detail & Related papers (2023-01-29T18:52:21Z) - Approximation of optimization problems with constraints through kernel
Sum-Of-Squares [77.27820145069515]
We show that pointwise inequalities are turned into equalities within a class of nonnegative kSoS functions.
We also show that focusing on pointwise equality constraints enables the use of scattering inequalities to mitigate the curse of dimensionality in sampling the constraints.
arXiv Detail & Related papers (2023-01-16T10:30:04Z) - Learning Graphical Factor Models with Riemannian Optimization [70.13748170371889]
This paper proposes a flexible algorithmic framework for graph learning under low-rank structural constraints.
The problem is expressed as penalized maximum likelihood estimation of an elliptical distribution.
We leverage geometries of positive definite matrices and positive semi-definite matrices of fixed rank that are well suited to elliptical models.
arXiv Detail & Related papers (2022-10-21T13:19:45Z) - Sparse Polynomial Optimization: Theory and Practice [5.27013884159732]
This book presents several efforts to tackle the scalability challenge of polynomial optimization, with important scientific implications.
It provides alternative optimization schemes that scale well in terms of computational complexity.
We present sparsity-exploiting hierarchies of relaxations, for either unconstrained or constrained problems.
arXiv Detail & Related papers (2022-08-23T18:56:05Z) - Partial Counterfactual Identification from Observational and
Experimental Data [83.798237968683]
We develop effective Monte Carlo algorithms to approximate the optimal bounds from an arbitrary combination of observational and experimental data.
Our algorithms are validated extensively on synthetic and real-world datasets.
arXiv Detail & Related papers (2021-10-12T02:21:30Z) - Conditional gradient methods for stochastically constrained convex
minimization [54.53786593679331]
We propose two novel conditional gradient-based methods for solving structured convex optimization problems.
The most important feature of our framework is that only a subset of the constraints is processed at each iteration.
Our algorithms rely on variance reduction and smoothing used in conjunction with conditional gradient steps, and are accompanied by rigorous convergence guarantees.
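As a toy illustration of the "only a subset of the constraints is processed at each iteration" idea, the sketch below touches a single randomly sampled constraint per conditional-gradient step through a quadratic penalty. This is a simplified stand-in, not the paper's method, which additionally relies on variance reduction and smoothing for its convergence guarantees; the problem instance and parameters are assumptions for illustration only.

```python
# Toy sketch: per-iteration work involves only one sampled constraint.
# Assumed instance:
#   minimize 0.5 * ||x - y||^2  over the l1 ball  {||x||_1 <= 1},
#   subject to m linear constraints a_i^T x <= b_i, handled by the quadratic
#   penalty (rho/2) * sum_i max(0, a_i^T x - b_i)^2.
import numpy as np

rng = np.random.default_rng(0)
n, m, rho = 40, 1000, 50.0
y = rng.standard_normal(n)
A = rng.standard_normal((m, n))
b = np.abs(rng.standard_normal(m))           # so x = 0 is feasible

def lmo_l1(g):
    # Linear minimization oracle for the unit l1 ball: a signed coordinate vertex.
    i = np.argmax(np.abs(g))
    s = np.zeros_like(g)
    s[i] = -np.sign(g[i])
    return s

x = np.zeros(n)
for k in range(5000):
    j = rng.integers(m)                      # only one constraint is processed here
    viol = max(0.0, A[j] @ x - b[j])
    grad_est = (x - y) + rho * m * viol * A[j]   # unbiased estimate of the penalized gradient
    s = lmo_l1(grad_est)
    x += 2.0 / (k + 2) * (s - x)             # convex combination keeps x in the l1 ball

print("max constraint violation:", np.maximum(A @ x - b, 0.0).max())
```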
arXiv Detail & Related papers (2020-07-07T21:26:35Z) - Convex Geometry and Duality of Over-parameterized Neural Networks [70.15611146583068]
We develop a convex analytic approach to analyze finite width two-layer ReLU networks.
We show that an optimal solution to the regularized training problem can be characterized as extreme points of a convex set.
In higher dimensions, we show that the training problem can be cast as a finite dimensional convex problem with infinitely many constraints.
arXiv Detail & Related papers (2020-02-25T23:05:33Z)