On the convex hull of convex quadratic optimization problems with
indicators
- URL: http://arxiv.org/abs/2201.00387v1
- Date: Sun, 2 Jan 2022 18:04:52 GMT
- Title: On the convex hull of convex quadratic optimization problems with
indicators
- Authors: Linchuan Wei, Alper Atamtürk, Andrés Gómez, Simge Küçükyavuz
- Abstract summary: We show that a convex hull description of the associated mixed-integer set in an extended space with a quadratic number of additional variables consists of a single positive semidefinite constraint and linear constraints.
The new theory presented here paves the way toward utilizing polyhedral methods to analyze the convex hull of mixed-integer nonlinear sets.
- Score: 2.867517731896504
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We consider the convex quadratic optimization problem with indicator
variables and arbitrary constraints on the indicators. We show that a convex
hull description of the associated mixed-integer set in an extended space with
a quadratic number of additional variables consists of a single positive
semidefinite constraint (explicitly stated) and linear constraints. In
particular, convexification of this class of problems reduces to describing a
polyhedral set in an extended formulation. We also give descriptions in the
original space of variables: we provide a description based on an infinite
number of conic-quadratic inequalities, which are "finitely generated." In
particular, it is possible to characterize whether a given inequality is
necessary to describe the convex hull. The new theory presented here unifies
several previously established results, and paves the way toward utilizing
polyhedral methods to analyze the convex hull of mixed-integer nonlinear sets.
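For intuition, the single-variable case can be sketched as follows. This is a minimal illustrative example, not the paper's general construction: with one continuous variable x, indicator z, and epigraph variable t, the convex hull of {(x, z, t) : t >= x^2, x(1 - z) = 0, z in {0, 1}} is cut out by a single 2x2 positive semidefinite constraint, which is equivalent to the perspective inequality t*z >= x^2.

```python
# Hedged one-variable illustration (not the paper's general result):
# the convex hull of { (x, z, t) : t >= x**2, x = 0 if z = 0, z in {0, 1} }
# over 0 <= z <= 1 is described by the single PSD constraint
#     [[z, x],
#      [x, t]]  >= 0   (positive semidefinite),
# which for a symmetric 2x2 matrix reduces to z >= 0, t >= 0 and
# z*t - x**2 >= 0, i.e. the perspective inequality t*z >= x**2.
def psd_2x2(z, x, t):
    """True iff [[z, x], [x, t]] is positive semidefinite."""
    return z >= 0 and t >= 0 and z * t - x * x >= 0

# Points of the original mixed-integer set are feasible ...
assert psd_2x2(z=1, x=2.0, t=4.0)    # z = 1, t = x**2
assert psd_2x2(z=0, x=0.0, t=0.0)    # z = 0 forces x = 0
# ... while a point allowed by the naive relaxation t >= x**2,
# 0 <= z <= 1 is cut off by the PSD constraint:
assert not psd_2x2(z=0.1, x=1.0, t=1.0)   # 0.1 * 1 - 1 < 0
```

The 2x2 determinant test is what makes the "single PSD constraint" claim concrete in this toy case; the paper's contribution is the analogous statement in an extended space with quadratically many variables.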
Related papers
- Constrained Optimization of Rank-One Functions with Indicator Variables [0.0]
Optimization problems involving a rank-one convex function over constraints modeling restrictions on the support of the decision variables emerge in various machine learning applications.
We propose a constructive approach that exploits a hidden conic structure induced by perspective functions.
This enables us to systematically give perspective formulations for the convex hull descriptions of sets with nonlinear separable or non-separable objective functions, sign constraints on continuous variables, and constraints on indicator variables.
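The perspective functions mentioned above can be sketched in the simplest case. As a hedged example (assuming the standard definition, not this paper's specific formulations): the perspective of f(x) = x^2 is g(x, z) = x^2 / z for z > 0, with g(0, 0) = 0, and on the relaxed range 0 < z < 1 it dominates the naive term x^2, which is what tightens relaxations of indicator models.

```python
# Minimal sketch of the perspective of f(x) = x**2 (a standard
# construction; the function name is ours, not from the paper).
def perspective_sq(x, z):
    """Perspective of f(x) = x**2: x**2 / z for z > 0, 0 at (0, 0)."""
    if z == 0:
        if x != 0:
            raise ValueError("infeasible: x must be 0 when z = 0")
        return 0.0
    return x * x / z

# For 0 < z < 1 the perspective gives a strictly larger lower bound
# than the naive term x**2 (since 1/z > 1), so the relaxation is tighter.
for z in (0.25, 0.5, 0.75):
    assert perspective_sq(1.0, z) >= 1.0   # naive bound: x**2 = 1.0
```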
arXiv Detail & Related papers (2023-03-31T15:51:56Z)
- Hessian Based Smoothing Splines for Manifold Learning [0.228438857884398]
We propose a multidimensional smoothing spline algorithm in the context of manifold learning.
We generalize the bending energy penalty of thin-plate splines to a quadratic form on the Sobolev space of a flat manifold.
The existence and uniqueness of the solution is shown by applying the theory of reproducing kernel Hilbert spaces.
arXiv Detail & Related papers (2023-02-10T02:49:05Z)
- Approximation of optimization problems with constraints through kernel Sum-Of-Squares [77.27820145069515]
We show that pointwise inequalities are turned into equalities within a class of nonnegative kSoS functions.
We also show that focusing on pointwise equality constraints enables the use of scattering inequalities to mitigate the curse of dimensionality in sampling the constraints.
arXiv Detail & Related papers (2023-01-16T10:30:04Z)
- Learning Graphical Factor Models with Riemannian Optimization [70.13748170371889]
This paper proposes a flexible algorithmic framework for graph learning under low-rank structural constraints.
The problem is expressed as penalized maximum likelihood estimation of an elliptical distribution.
We leverage geometries of positive definite matrices and positive semi-definite matrices of fixed rank that are well suited to elliptical models.
arXiv Detail & Related papers (2022-10-21T13:19:45Z)
- Lifting the Convex Conjugate in Lagrangian Relaxations: A Tractable Approach for Continuous Markov Random Fields [53.31927549039624]
We show that a piecewise discretization preserves contrast better than existing discretizations.
We apply this theory to the problem of matching two images.
arXiv Detail & Related papers (2021-07-13T12:31:06Z)
- Supermodularity and valid inequalities for quadratic optimization with indicators [3.8073142980733]
We show that the underlying set function for the rank-one quadratic can be minimized in linear time.
Explicit forms of the convex hull are given, both in the original space of variables and in an extended formulation via conic quadratic-representable inequalities.
Experiments indicate that the lifted supermodular inequalities in conic quadratic form are quite effective in reducing the integrality gap for quadratic optimization with indicators.
arXiv Detail & Related papers (2020-12-29T07:03:09Z)
- Optimal oracle inequalities for solving projected fixed-point equations [53.31620399640334]
We study methods that use a collection of random observations to compute approximate solutions by searching over a known low-dimensional subspace of the Hilbert space.
We show how our results precisely characterize the error of a class of temporal difference learning methods for the policy evaluation problem with linear function approximation.
arXiv Detail & Related papers (2020-12-09T20:19:32Z)
- Conditional gradient methods for stochastically constrained convex minimization [54.53786593679331]
We propose two novel conditional gradient-based methods for solving structured convex optimization problems.
The most important feature of our framework is that only a subset of the constraints is processed at each iteration.
Our algorithms rely on variance reduction and smoothing used in conjunction with conditional gradient steps, and are accompanied by rigorous convergence guarantees.
arXiv Detail & Related papers (2020-07-07T21:26:35Z)
- Ideal formulations for constrained convex optimization problems with indicator variables [2.578242050187029]
We study the convexification of a class of convex optimization problems with indicator variables and constraints on the indicators.
Unlike most of the previous work on convexification of sparse regression problems, we simultaneously consider the nonlinear non-separable objective, indicator variables, and constraints.
We derive ideal convexifications for problems with hierarchy, multi-collinearity, and sparsity constraints.
arXiv Detail & Related papers (2020-06-30T21:07:10Z)
- Convex Geometry and Duality of Over-parameterized Neural Networks [70.15611146583068]
We develop a convex analytic approach to analyze finite width two-layer ReLU networks.
We show that an optimal solution to the regularized training problem can be characterized as extreme points of a convex set.
In higher dimensions, we show that the training problem can be cast as a finite dimensional convex problem with infinitely many constraints.
arXiv Detail & Related papers (2020-02-25T23:05:33Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.