A Class of Linear Programs Solvable by Coordinate-Wise Minimization
- URL: http://arxiv.org/abs/2001.10467v5
- Date: Mon, 14 Sep 2020 10:47:55 GMT
- Title: A Class of Linear Programs Solvable by Coordinate-Wise Minimization
- Authors: Tomáš Dlask, Tomáš Werner
- Abstract summary: We present a class of linear programs that coordinate-wise minimization solves exactly.
We show that dual LP relaxations of several well-known optimization problems are in this class.
Our results are theoretically non-trivial and can lead to new large-scale optimization algorithms in the future.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Coordinate-wise minimization is a simple popular method for large-scale
optimization. Unfortunately, for general (non-differentiable) convex problems
it may not find global minima. We present a class of linear programs that
coordinate-wise minimization solves exactly. We show that dual LP relaxations
of several well-known combinatorial optimization problems are in this class and
the method finds a global minimum with sufficient accuracy in reasonable
runtimes. Moreover, for extensions of these problems that are no longer in this
class, the method yields reasonably good suboptima. Though the presented LP
relaxations can be solved by more efficient methods (such as max-flow), our
results are theoretically non-trivial and can lead to new large-scale
optimization algorithms in the future.
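For intuition, the sketch below runs plain coordinate-wise minimization on a small non-differentiable convex (piecewise-linear) objective, updating each coordinate by an exact one-dimensional minimization. The random problem data and the breakpoint-based 1-D solver are illustrative assumptions, not the paper's LP class or benchmarks; the paper's contribution is characterizing LPs for which every point such sweeps can stop at (a coordinate-wise minimum) is globally optimal.

```python
import numpy as np

# Toy non-differentiable convex objective: f(x) = sum_j max(0, a_j.x + b_j) + c.x.
# The data are random and illustrative; the paper's class consists of certain
# dual LP relaxations, not this instance.
rng = np.random.default_rng(0)
m, n = 20, 5
A = rng.normal(size=(m, n))
b = rng.normal(size=m)
c = 0.1 * rng.normal(size=n)

def f(x):
    return np.maximum(0.0, A @ x + b).sum() + c @ x

def minimize_coordinate(x, i):
    # The restriction of f to x[i] is convex piecewise linear and (for this
    # data) bounded below, so its global minimum is attained at a breakpoint
    # where some a_j.x + b_j = 0; try them all and keep the best.
    r = A @ x + b - A[:, i] * x[i]        # residuals without x[i]'s contribution
    mask = np.abs(A[:, i]) > 1e-12
    best_t, best_val = x[i], f(x)
    for t in -r[mask] / A[mask, i]:       # breakpoints along coordinate i
        x[i] = t
        val = f(x)
        if val < best_val:
            best_t, best_val = t, val
    x[i] = best_t

x = np.zeros(n)
for sweep in range(50):
    prev = f(x)
    for i in range(n):
        minimize_coordinate(x, i)
    if prev - f(x) < 1e-12:               # stopped at a coordinate-wise minimum
        break
print("coordinate-wise minimum, f(x) =", f(x))
```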
Related papers
- An Accelerated Gradient Method for Convex Smooth Simple Bilevel Optimization [16.709026203727007]
We present a novel bilevel optimization method that locally approximates the solution set of the lower-level problem.
We measure the performance of our method in terms of suboptimality and infeasibility errors.
arXiv Detail & Related papers (2024-02-12T22:34:53Z)
- Optimal Guarantees for Algorithmic Reproducibility and Gradient Complexity in Convex Optimization [55.115992622028685]
Previous work suggests that first-order methods would need to trade off convergence rate (gradient complexity) for better reproducibility.
We demonstrate that both optimal complexity and near-optimal convergence guarantees can be achieved for smooth convex minimization and smooth convex-concave minimax problems.
arXiv Detail & Related papers (2023-10-26T19:56:52Z)
- Minimizing Quasi-Self-Concordant Functions by Gradient Regularization of Newton Method [4.62316736194615]
We study the composite convex optimization problems with a Quasi-Self-Concordant smooth component.
This problem class naturally interpolates between classic Self-Concordant functions and functions with Lipschitz continuous Hessian.
We show that for minimizing Quasi-Self-Concordant functions we can instead use the basic Newton Method with Gradient Regularization (a sketch of such a step follows this entry).
arXiv Detail & Related papers (2023-08-28T17:43:04Z)
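Below is a hedged sketch of one standard gradient-regularized Newton step, where the damping lambda = sqrt(M * ||grad f(x)||) shrinks with the gradient norm. The constant M, the log-sum-exp test objective, and the stopping rule are illustrative assumptions, not the paper's exact scheme or constants for quasi-self-concordant functions.

```python
import numpy as np

def newton_grad_reg(grad, hess, x, M=1.0, n_iters=100, tol=1e-10):
    """Newton steps with a gradient-norm-scaled regularizer (illustrative)."""
    for _ in range(n_iters):
        g = grad(x)
        gnorm = np.linalg.norm(g)
        if gnorm < tol:
            break
        lam = np.sqrt(M * gnorm)            # regularizer shrinks as we converge
        x = x - np.linalg.solve(hess(x) + lam * np.eye(x.size), g)
    return x

# Illustrative test objective: log-sum-exp of Ax, a standard example of a
# quasi-self-concordant function (this choice is ours, not the paper's).
A = np.array([[1.0, 2.0], [-1.0, 0.5], [0.3, -2.0]])

def softmax(z):
    w = np.exp(z - z.max())
    return w / w.sum()

def grad(x):
    return A.T @ softmax(A @ x)

def hess(x):
    w = softmax(A @ x)
    return A.T @ (np.diag(w) - np.outer(w, w)) @ A

x_star = newton_grad_reg(grad, hess, np.array([1.0, -1.0]))
print("grad norm at solution:", np.linalg.norm(grad(x_star)))
```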
- Accelerating Cutting-Plane Algorithms via Reinforcement Learning Surrogates [49.84541884653309]
A current standard approach to solving convex discrete optimization problems is the use of cutting-plane algorithms.
Despite the existence of a number of general-purpose cut-generating algorithms, large-scale discrete optimization problems continue to suffer from intractability.
We propose a method for accelerating cutting-plane algorithms via reinforcement learning.
arXiv Detail & Related papers (2023-07-17T20:11:56Z)
- Linearization Algorithms for Fully Composite Optimization [61.20539085730636]
This paper studies first-order algorithms for solving fully composite optimization problems over convex compact sets.
We leverage the structure of the objective by handling its differentiable and non-differentiable parts separately, linearizing only the smooth components (a sketch of one such step follows this entry).
arXiv Detail & Related papers (2023-02-24T18:41:48Z)
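The sketch below shows the linearization idea as a generalized conditional-gradient step for min over x in X of f(x) + h(x) on a compact convex set: the subproblem replaces the smooth f by its linearization at the current point while keeping the non-smooth h exact. The box constraint X = [-1,1]^n, the L1 term, and the 2/(k+2) step size are illustrative assumptions, not the paper's algorithm or setting.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(30, 8))
b = rng.normal(size=30)
lam = 0.5

# Composite objective F(x) = f(x) + h(x): smooth f(x) = 0.5*||Ax - b||^2,
# non-smooth h(x) = lam*||x||_1, over the compact set X = [-1, 1]^n.
def f_grad(x):
    return A.T @ (A @ x - b)

def F(x):
    return 0.5 * np.sum((A @ x - b) ** 2) + lam * np.abs(x).sum()

def subproblem(g):
    # min_{s in [-1,1]^n} <g, s> + lam*||s||_1: f is linearized, h kept exact.
    # Separable: each term g_i*s + lam*|s| is minimized at s in {-1, 0, +1}.
    s = np.zeros_like(g)
    s[g > lam] = -1.0
    s[g < -lam] = 1.0
    return s

x = np.zeros(8)
for k in range(500):
    s = subproblem(f_grad(x))
    x += (2.0 / (k + 2.0)) * (s - x)     # standard open-loop step size
print("F(x) =", F(x))
```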
- Optimal Algorithms for Stochastic Complementary Composite Minimization [55.26935605535377]
Inspired by regularization techniques in statistics and machine learning, we study complementary composite minimization.
We provide novel excess risk bounds, both in expectation and with high probability.
Our algorithms are nearly optimal, which we prove via novel lower complexity bounds for this class of problems.
arXiv Detail & Related papers (2022-11-03T12:40:24Z)
- A Constrained Optimization Approach to Bilevel Optimization with Multiple Inner Minima [49.320758794766185]
We propose a new approach that converts the bilevel problem to an equivalent constrained optimization problem, which can then be solved by a primal-dual algorithm.
Such an approach enjoys a few advantages: (a) it addresses the multiple-inner-minima challenge; (b) it is fully first-order and efficient, requiring no Jacobian computations.
arXiv Detail & Related papers (2022-03-01T18:20:01Z)
- Coordinate Descent Methods for DC Minimization [12.284934135116515]
Difference-of-Convex (DC) minimization refers to the problem of minimizing the difference of two convex functions.
This paper proposes a coordinate descent method that solves a one-dimensional subproblem globally, and it is guaranteed to converge to a coordinate-wise stationary point (a sketch follows this entry).
arXiv Detail & Related papers (2021-09-09T12:44:06Z)
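A rough sketch of the idea: coordinate descent on a DC objective f = g - h where each coordinate's one-dimensional restriction is minimized globally. The dense-grid 1-D solver and the L1-minus-L2 least-squares instance are crude illustrative stand-ins, not the paper's subproblem solver or benchmarks.

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.normal(size=(15, 6))
b = rng.normal(size=15)
lam = 0.3

def f(x):
    # DC decomposition f = g - h with both parts convex:
    # g(x) = 0.5*||Ax - b||^2 + lam*||x||_1,  h(x) = lam*||x||_2.
    return 0.5 * np.sum((A @ x - b) ** 2) + lam * (np.abs(x).sum() - np.linalg.norm(x))

grid = np.linspace(-5.0, 5.0, 1001)      # naive global 1-D solver: dense grid
x = np.zeros(6)
for sweep in range(20):
    for i in range(x.size):
        best_t, best_v = x[i], f(x)
        for t in grid:                   # globally minimize the 1-D restriction
            x[i] = t
            v = f(x)
            if v < best_v:
                best_t, best_v = t, v
        x[i] = best_t                    # accept the global 1-D minimizer
print("coordinate-wise stationary point, f(x) =", f(x))
```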
- Complementary Composite Minimization, Small Gradients in General Norms, and Applications to Regression Problems [14.759688428864157]
Composite minimization is a powerful framework in large-scale convex optimization.
We introduce a new algorithmic framework for complementary composite minimization.
We prove that the algorithms resulting from our framework are near-optimal in most of the standard optimization settings.
arXiv Detail & Related papers (2021-01-26T19:21:28Z)
- Recent Theoretical Advances in Non-Convex Optimization [56.88981258425256]
Motivated by recent increased interest in the analysis of optimization algorithms for non-convex optimization in deep networks and other problems in data science, we give an overview of recent theoretical results on optimization algorithms for non-convex optimization.
arXiv Detail & Related papers (2020-12-11T08:28:51Z)
- Meta-learning based Alternating Minimization Algorithm for Non-convex Optimization [9.774392581946108]
We propose a novel solution for challenging non-convex problems of multiple variables.
Our proposed approach is able to achieve effective iterations in cases where other methods would typically fail.
arXiv Detail & Related papers (2020-09-09T10:45:00Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the automatically generated content (including all information) and is not responsible for any consequences.