Mixed-Projection Conic Optimization: A New Paradigm for Modeling Rank Constraints
- URL: http://arxiv.org/abs/2009.10395v2
- Date: Fri, 2 Apr 2021 20:45:08 GMT
- Title: Mixed-Projection Conic Optimization: A New Paradigm for Modeling Rank Constraints
- Authors: Dimitris Bertsimas, Ryan Cory-Wright, Jean Pauphilet
- Abstract summary: We provide a framework for solving low-rank optimization problems to certifiable optimality.
Our framework also provides near-optimal solutions through rounding and local search techniques.
- Score: 3.179831861897336
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose a framework for modeling and solving low-rank optimization
problems to certifiable optimality. We introduce symmetric projection matrices
that satisfy $Y^2=Y$, the matrix analog of binary variables that satisfy
$z^2=z$, to model rank constraints. By leveraging regularization and strong
duality, we prove that this modeling paradigm yields tractable convex
optimization problems over the non-convex set of orthogonal projection
matrices. Furthermore, we design outer-approximation algorithms to solve
low-rank problems to certifiable optimality, compute lower bounds via their
semidefinite relaxations, and provide near-optimal solutions through rounding
and local search techniques. We implement these numerical ingredients and, for
the first time, solve low-rank optimization problems to certifiable optimality.
Using currently available spatial branch-and-bound codes, not tailored to
projection matrices, we can scale our exact (resp. near-exact) algorithms to
matrices with up to 30 (resp. 600) rows/columns. Our algorithms also supply
certifiably near-optimal solutions for larger problem sizes and outperform
existing heuristics, by deriving an alternative to the popular nuclear norm
relaxation which generalizes the perspective relaxation from vectors to
matrices. All in all, our framework, which we name Mixed-Projection Conic
Optimization, solves low-rank problems to certifiable optimality in a tractable
and unified fashion.
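To make the rounding ingredient concrete, here is a minimal NumPy sketch (an illustration of the idea, not the authors' implementation): given a symmetric matrix produced by a relaxation, the nearest rank-$k$ orthogonal projection in Frobenius norm comes from its top-$k$ eigenvectors, yielding a feasible $Y$ with $Y^2=Y$. The function name `round_to_projection` and the toy data are ours.

```python
import numpy as np

def round_to_projection(Y_relaxed, k):
    """Round a symmetric matrix (e.g., a relaxation's output) to the
    nearest rank-k orthogonal projection in Frobenius norm: keep the
    eigenvectors of the k largest eigenvalues."""
    # np.linalg.eigh returns eigenvalues in ascending order
    _, eigvecs = np.linalg.eigh((Y_relaxed + Y_relaxed.T) / 2)
    U = eigvecs[:, -k:]          # top-k eigenvectors
    return U @ U.T               # Y = U U^T satisfies Y^2 = Y, tr(Y) = k

# Toy usage: round a symmetric matrix to a rank-2 projection
rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
Y = round_to_projection((M + M.T) / 2, k=2)
print(np.allclose(Y @ Y, Y), round(np.trace(Y)))  # True 2: idempotent, rank 2
```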
Related papers
- Global Optimization: A Machine Learning Approach [7.052596485478637]
Bertsimas and Ozturk (2023) proposed OCTHaGOn as a way of solving black-box global optimization problems.
We provide extensions to this approach by approximating the original problem using other MIO-representable ML models.
We show improvements in solution feasibility and optimality in the majority of instances.
arXiv Detail & Related papers (2023-11-03T06:33:38Z)
- Accelerating Cutting-Plane Algorithms via Reinforcement Learning Surrogates [49.84541884653309]
A current standard approach to solving convex discrete optimization problems is the use of cutting-plane algorithms.
Despite the existence of a number of general-purpose cut-generating algorithms, large-scale discrete optimization problems continue to suffer from intractability.
We propose a method for accelerating cutting-plane algorithms via reinforcement learning.
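For context, the sketch below is a plain Kelley cutting-plane loop for a one-dimensional convex function over an interval, with the LP master problem solved by `scipy.optimize.linprog`; the paper's reinforcement-learning surrogates, which accelerate loops of this kind, are not modeled here, and the function names are ours.

```python
import numpy as np
from scipy.optimize import linprog

def kelley(f, grad, lo, hi, x0, iters=30):
    """Kelley's cutting-plane method for 1-D convex f on [lo, hi]:
    accumulate subgradient cuts t >= f(x_i) + g_i*(x - x_i) and
    minimize the piecewise-linear model with an LP each round."""
    cuts, x = [], x0
    best_x, best_f = x0, f(x0)
    for _ in range(iters):
        fx, g = f(x), grad(x)
        if fx < best_f:
            best_x, best_f = x, fx
        cuts.append((g, g * x - fx))  # cut: g_i*x - t <= g_i*x_i - f(x_i)
        res = linprog(c=[0.0, 1.0],                        # minimize t
                      A_ub=[[gi, -1.0] for gi, _ in cuts],
                      b_ub=[rhs for _, rhs in cuts],
                      bounds=[(lo, hi), (None, None)])     # x boxed, t free
        x = res.x[0]
    return best_x

# Toy usage: minimize (x - 1)^2 on [-5, 5]; result is close to 1
print(kelley(lambda x: (x - 1) ** 2, lambda x: 2 * (x - 1), -5, 5, x0=-4.0))
```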
arXiv Detail & Related papers (2023-07-17T20:11:56Z)
- Optimal Low-Rank Matrix Completion: Semidefinite Relaxations and Eigenvector Disjunctions [6.537257913467247]
Low-rank matrix completion seeks a matrix of minimal complexity that recovers a given set of observations as accurately as possible.
The new convex relaxations decrease the optimality gap by orders of magnitude compared to existing methods.
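To make the problem concrete, here is a classical SoftImpute-style heuristic in NumPy (a standard baseline, not the paper's relaxations or eigenvector disjunctions): fill the missing entries with the current estimate, then shrink the singular values.

```python
import numpy as np

def soft_impute(X_obs, mask, lam=0.1, iters=100):
    """Heuristic matrix completion: alternate between imputing missing
    entries with the current estimate and soft-thresholding singular
    values (a nuclear-norm proximal step)."""
    Z = np.where(mask, X_obs, 0.0)
    for _ in range(iters):
        U, s, Vt = np.linalg.svd(np.where(mask, X_obs, Z), full_matrices=False)
        Z = (U * np.maximum(s - lam, 0.0)) @ Vt  # shrink singular values
    return Z

# Toy usage: recover a rank-1 matrix from 60% of its entries
rng = np.random.default_rng(0)
A = np.outer(rng.standard_normal(8), rng.standard_normal(6))
mask = rng.random(A.shape) < 0.6
A_hat = soft_impute(A, mask)
print(np.linalg.norm(A_hat - A) / np.linalg.norm(A))  # relative recovery error
```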
arXiv Detail & Related papers (2023-05-20T22:04:34Z)
- Symmetric Tensor Networks for Generative Modeling and Constrained Combinatorial Optimization [72.41480594026815]
Constrained optimization problems abound in industry, from portfolio optimization to logistics.
One of the major roadblocks in solving these problems is the presence of non-trivial hard constraints which limit the valid search space.
In this work, we encode arbitrary integer-valued equality constraints of the form Ax=b directly into U(1) symmetric tensor networks (TNs) and leverage their applicability as quantum-inspired generative models.
arXiv Detail & Related papers (2022-11-16T18:59:54Z)
- Optimal Diagonal Preconditioning: Theory and Practice [23.79536881427839]
This paper presents the problem of optimal diagonal preconditioning: achieving the maximal reduction in the condition number of a full-rank matrix by scaling its rows, its columns, or both simultaneously.
We provide a baseline bisection algorithm that is easy to implement in practice.
Next, we specialize to one-sided optimal diagonal preconditioning problems, and demonstrate that they can be formulated as standard dual SDP problems.
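A toy illustration of what one-sided diagonal preconditioning does (column equilibration, a common heuristic baseline, not the paper's bisection or SDP formulation):

```python
import numpy as np

rng = np.random.default_rng(0)
# Columns at wildly different scales give a large condition number
A = rng.standard_normal((50, 5)) * np.array([1.0, 10.0, 100.0, 1e3, 1e4])

# One-sided diagonal preconditioner: scale each column to unit norm,
# i.e., replace A by A @ diag(d)
d = 1.0 / np.linalg.norm(A, axis=0)
print(np.linalg.cond(A))      # large
print(np.linalg.cond(A * d))  # much smaller after column scaling
```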
arXiv Detail & Related papers (2022-09-02T04:21:28Z)
- Low-Rank Mirror-Prox for Nonsmooth and Low-Rank Matrix Optimization Problems [19.930021245647705]
Low-rank and nonsmooth matrix optimization problems capture many fundamental tasks in statistics and machine learning.
In this paper we consider standard convex relaxations for such problems.
We prove that, under a strict complementarity condition and the relatively mild assumption that the nonsmooth objective can be written as a maximum of smooth functions, approximated variants of two popular mirror-prox methods converge while requiring only low-rank SVD computations per iteration.
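For orientation, the simplest mirror-prox instance is the Euclidean extragradient method sketched below on a toy constrained problem; the paper's variants replace exact projections with low-rank SVD computations, which this sketch does not attempt.

```python
import numpy as np

def extragradient(grad, project, x0, step=0.1, iters=200):
    """Euclidean extragradient (the simplest mirror-prox instance):
    take an exploratory step, then update using the gradient evaluated
    at the explored point; `project` maps back to the feasible set."""
    x = x0
    for _ in range(iters):
        y = project(x - step * grad(x))  # exploration step
        x = project(x - step * grad(y))  # update step
    return x

# Toy usage: minimize ||x - c||^2 over the unit Euclidean ball
c = np.array([2.0, 0.0])
project = lambda v: v / max(1.0, np.linalg.norm(v))
print(extragradient(lambda x: 2 * (x - c), project, np.zeros(2)))  # ~[1, 0]
```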
arXiv Detail & Related papers (2022-06-23T08:10:54Z)
- Sparse Quadratic Optimisation over the Stiefel Manifold with Application to Permutation Synchronisation [71.27989298860481]
We address the non-convex optimisation problem of finding a matrix on the Stiefel manifold that maximises a quadratic objective function.
We propose a simple yet effective sparsity-promoting algorithm for finding the dominant eigenspace matrix.
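As a baseline for what maximising a quadratic over the Stiefel manifold computes, here is plain orthogonal (block power) iteration, which recovers the dominant eigenspace; the paper's sparsity-promoting modification is not included.

```python
import numpy as np

def orthogonal_iteration(A, k, iters=200):
    """Maximise tr(X^T A X) over the Stiefel manifold {X : X^T X = I_k}
    for symmetric A by block power iteration with re-orthonormalisation."""
    n = A.shape[0]
    # Shift the spectrum to be nonnegative so the dominant |eigenvalues|
    # targeted by power iteration are also the largest eigenvalues.
    B = A + np.linalg.norm(A, 2) * np.eye(n)
    rng = np.random.default_rng(0)
    X, _ = np.linalg.qr(rng.standard_normal((n, k)))
    for _ in range(iters):
        X, _ = np.linalg.qr(B @ X)  # power step, then re-orthonormalise
    return X

rng = np.random.default_rng(1)
M = rng.standard_normal((8, 8))
X = orthogonal_iteration(M + M.T, k=2)
print(np.allclose(X.T @ X, np.eye(2)))  # True: X lies on the Stiefel manifold
```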
arXiv Detail & Related papers (2021-09-30T19:17:35Z)
- Convergence of adaptive algorithms for weakly convex constrained optimization [59.36386973876765]
We prove the $\tilde{\mathcal{O}}(t^{-1/4})$ rate of convergence for the norm of the gradient of the Moreau envelope.
Our analysis works with mini-batch size of $1$, constant first and second order moment parameters, and possibly unbounded optimization domains.
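For reference, the standard definitions behind these quantities (textbook identities, not quoted from the paper): the Moreau envelope smooths $f$, and the norm of its gradient serves as a near-stationarity measure.

```latex
% Moreau envelope of f with smoothing parameter \lambda > 0:
f_\lambda(x) = \min_{y} \Big\{ f(y) + \tfrac{1}{2\lambda}\,\|y - x\|^2 \Big\},
\qquad
\nabla f_\lambda(x) = \tfrac{1}{\lambda}\big(x - \mathrm{prox}_{\lambda f}(x)\big),
% so the quoted rate says the iterates satisfy
\|\nabla f_\lambda(x_t)\| = \tilde{\mathcal{O}}\big(t^{-1/4}\big).
```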
arXiv Detail & Related papers (2020-06-11T17:43:19Z)
- Effective Dimension Adaptive Sketching Methods for Faster Regularized Least-Squares Optimization [56.05635751529922]
We propose a new randomized algorithm for solving L2-regularized least-squares problems based on sketching.
We consider two of the most popular random embeddings, namely, Gaussian embeddings and the Subsampled Randomized Hadamard Transform (SRHT).
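A minimal NumPy sketch of the idea with a Gaussian embedding (the SRHT variant and the paper's effective-dimension-adaptive choice of sketch size are omitted; `sketched_ridge` is our illustrative name):

```python
import numpy as np

def sketched_ridge(A, b, lam, m, rng):
    """Solve min_x ||S A x - S b||^2 + lam*||x||^2 with a Gaussian
    sketch S of m rows: a cheap surrogate for the full n-row problem."""
    S = rng.standard_normal((m, A.shape[0])) / np.sqrt(m)
    SA, Sb = S @ A, S @ b
    # Normal equations of the sketched ridge problem
    return np.linalg.solve(SA.T @ SA + lam * np.eye(A.shape[1]), SA.T @ Sb)

rng = np.random.default_rng(0)
A = rng.standard_normal((5000, 20))
x_true = rng.standard_normal(20)
b = A @ x_true + 0.01 * rng.standard_normal(5000)
x_hat = sketched_ridge(A, b, lam=1e-3, m=200, rng=rng)
print(np.linalg.norm(x_hat - x_true))  # small: the sketch preserves the solution
```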
arXiv Detail & Related papers (2020-06-10T15:00:09Z)
- Global Optimization of Gaussian processes [52.77024349608834]
We propose a reduced-space formulation for deterministic global optimization with Gaussian processes trained on few data points.
The approach also leads to significantly smaller and computationally cheaper subproblems for lower bounding.
In total, the proposed method reduces the time to converge by orders of magnitude.
arXiv Detail & Related papers (2020-05-21T20:59:11Z)
- Stochastic Coordinate Minimization with Progressive Precision for Stochastic Convex Optimization [16.0251555430107]
A framework based on iterative coordinate minimization (CM) is developed for convex optimization.
We establish the optimal precision control and the resulting order-optimal regret performance.
The proposed algorithm is amenable to online implementation and inherits the scalability and parallelizability properties of CM for large-scale optimization.
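To fix ideas, a deterministic cyclic coordinate-minimization loop on a quadratic with exact per-coordinate solves (the paper's progressive control of stochastic per-coordinate precision is not modeled):

```python
import numpy as np

def coordinate_minimization(Q, c, sweeps=200):
    """Minimize 0.5*x^T Q x + c^T x (Q symmetric positive definite) by
    cyclically minimizing exactly over one coordinate at a time."""
    x = np.zeros(len(c))
    for _ in range(sweeps):
        for i in range(len(c)):
            # Zero the i-th partial derivative (Q x)_i + c_i over x[i]
            x[i] -= (Q[i] @ x + c[i]) / Q[i, i]
    return x

rng = np.random.default_rng(0)
M = rng.standard_normal((6, 6))
Q = M @ M.T + np.eye(6)
c = rng.standard_normal(6)
x = coordinate_minimization(Q, c)
print(np.allclose(x, -np.linalg.solve(Q, c)))  # True: matches the exact solve
```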
arXiv Detail & Related papers (2020-03-11T18:42:40Z)
This list is automatically generated from the titles and abstracts of the papers in this site.