Primal-dual extrapolation methods for monotone inclusions under local
Lipschitz continuity with applications to variational inequality, conic
constrained saddle point, and convex conic optimization problems
- URL: http://arxiv.org/abs/2206.00973v1
- Date: Thu, 2 Jun 2022 10:31:45 GMT
- Authors: Zhaosong Lu and Sanyou Mei
- Abstract summary: We consider a class of structured monotone inclusion (MI) problems that consist of finding a zero in the sum of two monotone operators.
We first propose a primal-dual extrapolation (PDE) method for solving a structured strongly MI problem by modifying the classical forward-backward splitting method.
We then propose another PDE method for solving a structured non-strongly MI problem by applying the above PDE method to approximately solve a sequence of structured strongly MI problems.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper we consider a class of structured monotone inclusion (MI)
problems that consist of finding a zero in the sum of two monotone operators,
in which one is maximal monotone while the other is locally Lipschitz continuous.
In particular, we first propose a primal-dual extrapolation (PDE) method for
solving a structured strongly MI problem, obtained by modifying the classical
forward-backward splitting method with a point and operator extrapolation
technique whose parameters are adaptively updated by a backtracking
line search scheme. The proposed PDE method is almost parameter-free, equipped
with a verifiable termination criterion, and enjoys an operation complexity of
${\cal O}(\log \epsilon^{-1})$, measured by the amount of fundamental
operations consisting only of evaluations of one operator and resolvent of
another operator, for finding an $\epsilon$-residual solution of the structured
strongly MI problem. We then propose another PDE method for solving a
structured non-strongly MI problem by applying the above PDE method to
approximately solve a sequence of structured strongly MI problems. The
resulting PDE method is parameter-free, equipped with a verifiable termination
criterion, and enjoys an operation complexity of ${\cal O}(\epsilon^{-1}\log
\epsilon^{-1})$ for finding an $\epsilon$-residual solution of the structured
non-strongly MI problem. As a consequence, we apply the latter PDE method to
convex conic optimization, conic constrained saddle point, and variational
inequality problems, and obtain complexity results for finding an
$\epsilon$-KKT or $\epsilon$-residual solution of them under local Lipschitz
continuity. To the best of our knowledge, no prior work has investigated
methods with complexity guarantees for solving the aforementioned problems
under local Lipschitz continuity. All the complexity results obtained in this
paper are entirely new.
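As a generic illustration (not the authors' PDE method, which further adds point and operator extrapolation), the classical forward-backward splitting scheme that the paper builds on iterates $x_{k+1} = J_{\lambda B}(x_k - \lambda F(x_k))$, where $J_{\lambda B} = (I + \lambda B)^{-1}$ is the resolvent of the maximal monotone operator $B$, and a backtracking line search can stand in for an unknown (local) Lipschitz constant of $F$. The sketch below uses a hypothetical toy instance: $F$ is the gradient of a least-squares term and the resolvent is soft-thresholding, so zeros of $F + B$ are lasso solutions.

```python
import numpy as np

def soft_threshold(v, t):
    # Resolvent (prox) of B = ∂(t·||·||_1): componentwise soft-thresholding.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def forward_backward(F, resolvent, x0, lam=1.0, theta=0.9, tol=1e-8, max_iter=500):
    """Forward-backward splitting for 0 ∈ F(x) + B(x).

    The step size lam is halved by backtracking until the local
    Lipschitz-type test lam*||F(x_new) - F(x)|| <= theta*||x_new - x||
    holds, so no global Lipschitz constant of F is needed.
    """
    x = x0
    for _ in range(max_iter):
        Fx = F(x)
        while True:
            x_new = resolvent(x - lam * Fx, lam)  # backward (resolvent) step
            if lam * np.linalg.norm(F(x_new) - Fx) <= theta * np.linalg.norm(x_new - x) + 1e-16:
                break
            lam *= 0.5  # backtrack: shrink the step size and retry
        if np.linalg.norm(x_new - x) <= tol:
            return x_new
        x = x_new
    return x

# Toy instance: F(x) = A^T (A x - b) (gradient of a least-squares term),
# B = ∂(mu * ||x||_1), so zeros of F + B are lasso solutions.
rng = np.random.default_rng(0)
A, b, mu = rng.standard_normal((20, 5)), rng.standard_normal(20), 0.1
F = lambda x: A.T @ (A @ x - b)
resolvent = lambda v, lam: soft_threshold(v, lam * mu)
x_star = forward_backward(F, resolvent, np.zeros(5))
```

The backtracking test is one simple choice; the paper's PDE method instead extrapolates both the iterate and the operator value and updates its parameters adaptively, which is what yields the stated operation complexities.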
Related papers
- Accelerated Stochastic Min-Max Optimization Based on Bias-corrected Momentum [30.01198677588252]
First-order algorithms require at least $\mathcal{O}(\varepsilon^{-4})$ complexity to find an $\varepsilon$-stationary point.
We introduce novel momentum algorithms utilizing efficient variable complexity.
The effectiveness of the method is validated through robust logistic regression using real-world datasets.
arXiv Detail & Related papers (2024-06-18T20:14:52Z) - Accelerated Variance-Reduced Forward-Reflected Methods for Root-Finding Problems [8.0153031008486]
We propose a novel class of Nesterov's accelerated forward-reflected-based methods with variance reduction to solve root-finding problems.
Our algorithm is single-loop and leverages a new family of unbiased variance-reduced estimators specifically designed for root-finding problems.
arXiv Detail & Related papers (2024-06-04T15:23:29Z) - A Newton-CG based barrier-augmented Lagrangian method for general nonconvex conic optimization [53.044526424637866]
In this paper we consider finding an approximate second-order stationary point (SOSP) of a general nonconvex conic optimization problem that minimizes a twice differentiable function subject to conic constraints.
In particular, we propose a Newton-CG based barrier-augmented Lagrangian method for finding an approximate SOSP.
arXiv Detail & Related papers (2023-01-10T20:43:29Z) - Explicit Second-Order Min-Max Optimization Methods with Optimal Convergence Guarantee [86.05440220344755]
We propose and analyze inexact regularized Newton-type methods for finding a global saddle point of convex-concave unconstrained min-max optimization problems.
We show that the proposed methods generate iterates that remain within a bounded set and converge to an $\epsilon$-saddle point within $O(\epsilon^{-2/3})$ iterations in terms of a restricted gap function.
arXiv Detail & Related papers (2022-10-23T21:24:37Z) - Multi-block-Single-probe Variance Reduced Estimator for Coupled
Compositional Optimization [49.58290066287418]
We propose a novel method named Multi-block-Single-probe Variance Reduced (MSVR) estimator to alleviate the complexity of coupled compositional problems.
Our results improve upon prior ones in several aspects, including the order of sample complexities and the dependence on strong convexity.
arXiv Detail & Related papers (2022-07-18T12:03:26Z) - Perseus: A Simple and Optimal High-Order Method for Variational
Inequalities [81.32967242727152]
A VI involves finding $x^\star \in \mathcal{X}$ such that $\langle F(x), x - x^\star \rangle \geq 0$ for all $x \in \mathcal{X}$.
We propose a $p$-th-order method that does not require any line search procedure and provably converges to a weak solution at a rate of $O(\epsilon^{-2/(p+1)})$.
arXiv Detail & Related papers (2022-05-06T13:29:14Z) - A Stochastic Halpern Iteration with Variance Reduction for Stochastic
Monotone Inclusion Problems [17.597000481404883]
We study monotone inclusion problems, which widely appear in machine learning applications.
Our algorithm attains a point with operator norm at most $\epsilon$ using $\mathcal{O}(1/\epsilon^{3})$ operator evaluations.
arXiv Detail & Related papers (2022-03-17T16:48:57Z) - A Projection-free Algorithm for Constrained Stochastic Multi-level
Composition Optimization [12.096252285460814]
We propose a projection-free conditional gradient-type algorithm for constrained stochastic multi-level composition optimization.
We show that the number of calls to the stochastic oracle and to the linear-minimization oracle required by the proposed algorithm are of order $\mathcal{O}_T(\epsilon^{-2})$ and $\mathcal{O}_T(\epsilon^{-3})$, respectively.
arXiv Detail & Related papers (2022-02-09T06:05:38Z) - Second-Order Information in Non-Convex Stochastic Optimization: Power
and Limitations [54.42518331209581]
We find an algorithm which finds an $\epsilon$-approximate stationary point (with $\|\nabla F(x)\| \le \epsilon$) using $(\epsilon,\gamma)$-approximate random points.
Our lower bounds here are novel even in the noiseless case.
arXiv Detail & Related papers (2020-06-24T04:41:43Z) - Complexity of Finding Stationary Points of Nonsmooth Nonconvex Functions [84.49087114959872]
We provide the first non-asymptotic analysis for finding stationary points of nonsmooth, nonconvex functions.
In particular, we study Hadamard semi-differentiable functions, perhaps the largest class of nonsmooth functions.
arXiv Detail & Related papers (2020-02-10T23:23:04Z)