An Inertial Block Majorization Minimization Framework for Nonsmooth
Nonconvex Optimization
- URL: http://arxiv.org/abs/2010.12133v3
- Date: Tue, 20 Sep 2022 12:28:52 GMT
- Title: An Inertial Block Majorization Minimization Framework for Nonsmooth
Nonconvex Optimization
- Authors: Le Thi Khanh Hien, Duy Nhat Phan, Nicolas Gillis
- Abstract summary: We introduce TITAN, a novel inerTIal block majorizaTion minimizAtioN framework for non-smooth non-convex optimization problems.
We illustrate the effectiveness of TITAN on two important machine learning problems.
- Score: 17.49766938060264
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we introduce TITAN, a novel inerTIal block majorizaTion
minimizAtioN framework for non-smooth non-convex optimization problems. To the
best of our knowledge, TITAN is the first framework of block-coordinate update
method that relies on the majorization-minimization framework while embedding
inertial force to each step of the block updates. The inertial force is
obtained via an extrapolation operator that subsumes heavy-ball and
Nesterov-type accelerations for block proximal gradient methods as special
cases. By choosing various surrogate functions, such as proximal, Lipschitz
gradient, Bregman, quadratic, and composite surrogate functions, and by varying
the extrapolation operator, TITAN produces a rich set of inertial
block-coordinate update methods. We study sub-sequential convergence as well as
global convergence for the generated sequence of TITAN. We illustrate the
effectiveness of TITAN on two important machine learning problems, namely
sparse non-negative matrix factorization and matrix completion.
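As a concrete illustration, one of the inertial block-coordinate updates that TITAN subsumes is the extrapolated (heavy-ball-style) block proximal gradient step, applied here to plain non-negative matrix factorization. This is a minimal sketch under our own assumptions, not the authors' implementation: the fixed extrapolation weight `beta` and the function/variable names are ours, and TITAN's actual parameter choices and convergence conditions are given in the paper.

```python
import numpy as np

def inertial_block_pg_nmf(M, r, iters=200, beta=0.3, seed=0):
    """Sketch of an inertial (extrapolated) block proximal-gradient scheme
    for min (1/2)||M - W H||_F^2 subject to W >= 0, H >= 0.
    Each block update: extrapolate using the previous iterate, take a
    gradient step with a 1/L step size, then project onto the nonnegative
    orthant (the prox of the indicator function of W, H >= 0)."""
    rng = np.random.default_rng(seed)
    m, n = M.shape
    W = rng.random((m, r)); H = rng.random((r, n))
    W_prev, H_prev = W.copy(), H.copy()
    for _ in range(iters):
        # --- W block ---
        Wbar = W + beta * (W - W_prev)                 # inertial extrapolation
        L_W = max(np.linalg.norm(H @ H.T, 2), 1e-12)   # Lipschitz constant of the W-gradient
        grad_W = (Wbar @ H - M) @ H.T
        W_prev, W = W, np.maximum(Wbar - grad_W / L_W, 0.0)  # prox = projection onto >= 0
        # --- H block ---
        Hbar = H + beta * (H - H_prev)
        L_H = max(np.linalg.norm(W.T @ W, 2), 1e-12)
        grad_H = W.T @ (W @ Hbar - M)
        H_prev, H = H, np.maximum(Hbar - grad_H / L_H, 0.0)
    return W, H
```

Setting `beta = 0` recovers the standard (non-inertial) block proximal gradient method; in TITAN the same template also covers Nesterov-type extrapolation and other surrogate choices (Bregman, composite), which change the prox step and step size accordingly.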
Related papers
- Submodular Framework for Structured-Sparse Optimal Transport [7.030105924295838]
Unbalanced optimal transport (UOT) has recently gained much attention due to its flexible framework for handling unnormalized measures and its robustness.
In this work, we explore learning (structured) sparse transport plans in the UOT setting, i.e., transport plans have an upper bound on the number of non-sparse entries in each column.
We propose novel sparsity-constrained UOT formulations building on the recently explored mean discrepancy based UOT.
arXiv Detail & Related papers (2024-06-07T13:11:04Z) - Energy-Guided Continuous Entropic Barycenter Estimation for General Costs [95.33926437521046]
We propose a novel algorithm for approximating the continuous Entropic OT (EOT) barycenter for arbitrary OT cost functions.
Our approach is built upon the dual reformulation of the EOT problem based on weak OT.
arXiv Detail & Related papers (2023-10-02T11:24:36Z) - Multiblock ADMM for nonsmooth nonconvex optimization with nonlinear
coupling constraints [3.2815423235774634]
We propose a method of multipliers for solving a class of multiblock nonsmooth nonconvex optimization problems with nonlinear coupling constraints.
We employ a majorization procedure in the update of each block of the primal variables.
arXiv Detail & Related papers (2022-01-19T15:31:30Z) - Revisiting and Advancing Fast Adversarial Training Through The Lens of
Bi-Level Optimization [60.72410937614299]
We propose a new tractable bi-level optimization problem, design and analyze a new set of algorithms termed Bi-level AT (FAST-BAT)
FAST-BAT is capable of defending against sign-based projected gradient descent (PGD) attacks without calling any gradient sign method or explicit robust regularization.
arXiv Detail & Related papers (2021-12-23T06:25:36Z) - Value-Function-based Sequential Minimization for Bi-level Optimization [52.39882976848064]
Gradient-based Bi-Level Optimization (BLO) methods have been widely applied to handle modern learning tasks.
There are almost no gradient-based methods able to solve BLO in challenging scenarios, such as BLO with functional constraints and pessimistic BLO.
We provide Bi-level Value-Function-based Sequential Minimization (BVFSM) to address the above issues.
arXiv Detail & Related papers (2021-10-11T03:13:39Z) - Optimization on manifolds: A symplectic approach [127.54402681305629]
We propose a dissipative extension of Dirac's theory of constrained Hamiltonian systems as a general framework for solving optimization problems.
Our class of (accelerated) algorithms are not only simple and efficient but also applicable to a broad range of contexts.
arXiv Detail & Related papers (2021-07-23T13:43:34Z) - Block Alternating Bregman Majorization Minimization with Extrapolation [16.04690575393738]
We consider a class of nonsmooth nonconvex optimization problems whose objective is the sum of a block relatively smooth function and a proper separable function.
We propose a block alternating Bregman Majorization-Minimization framework with Extrapolation (BMME)
We prove the effectiveness of BMME on the penalized nonnegative matrix factorization problem under stronger conditions.
arXiv Detail & Related papers (2021-07-09T12:47:00Z) - A Stochastic Composite Augmented Lagrangian Method For Reinforcement
Learning [9.204659134755795]
We consider the linear programming (LP) formulation for deep reinforcement learning.
The augmented Lagrangian method suffers the double-sampling obstacle in solving the LP.
A deep parameterized augmented Lagrangian method is proposed.
arXiv Detail & Related papers (2021-05-20T13:08:06Z) - Conditional gradient methods for stochastically constrained convex
minimization [54.53786593679331]
We propose two novel conditional gradient-based methods for solving structured convex optimization problems.
The most important feature of our framework is that only a subset of the constraints is processed at each iteration.
Our algorithms rely on variance reduction and smoothing used in conjunction with conditional gradient steps, and are accompanied by rigorous convergence guarantees.
arXiv Detail & Related papers (2020-07-07T21:26:35Z) - Cogradient Descent for Bilinear Optimization [124.45816011848096]
We introduce a Cogradient Descent algorithm (CoGD) to address the bilinear problem.
We solve one variable by considering its coupling relationship with the other, leading to a synchronous gradient descent.
Our algorithm is applied to solve problems with one variable under the sparsity constraint.
arXiv Detail & Related papers (2020-06-16T13:41:54Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.