Sion's Minimax Theorem in Geodesic Metric Spaces and a Riemannian
Extragradient Algorithm
- URL: http://arxiv.org/abs/2202.06950v2
- Date: Mon, 29 May 2023 00:45:03 GMT
- Title: Sion's Minimax Theorem in Geodesic Metric Spaces and a Riemannian
Extragradient Algorithm
- Authors: Peiyuan Zhang, Jingzhao Zhang, Suvrit Sra
- Abstract summary: This paper takes a step towards understanding nonconvex-nonconcave minimax problems that remain tractable.
The first main result is a geodesic metric space version of Sion's minimax theorem.
The second main result specializes to geodesically complete Riemannian manifolds, where the complexity of first-order methods is analyzed.
- Score: 46.97925335733651
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Deciding whether saddle points exist or are approximable for
nonconvex-nonconcave problems is usually intractable. This paper takes a step
towards understanding a broad class of nonconvex-nonconcave minimax problems
that do remain tractable. Specifically, it studies minimax problems over
geodesic metric spaces, which provide a vast generalization of the usual
convex-concave saddle point problems. The first main result of the paper is a
geodesic metric space version of Sion's minimax theorem; we believe our proof
is novel and broadly accessible as it relies on the finite intersection
property alone. The second main result is a specialization to geodesically
complete Riemannian manifolds: here, we devise and analyze the complexity of
first-order methods for smooth minimax problems.
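The first-order methods analyzed in the paper build on the extragradient idea: take a gradient "probe" step, then update using the gradient at the probe point. As a rough illustration only, here is a minimal Euclidean sketch (not the paper's Riemannian algorithm; the function, step size, and iteration count are illustrative assumptions):

```python
import numpy as np

def extragradient(grad_x, grad_y, x0, y0, step=0.1, iters=2000):
    """Euclidean extragradient for min_x max_y f(x, y).

    grad_x, grad_y: gradients of f w.r.t. x and y.
    The extrapolation ("look-ahead") step is what distinguishes
    this scheme from plain gradient descent-ascent.
    """
    x, y = np.asarray(x0, dtype=float), np.asarray(y0, dtype=float)
    for _ in range(iters):
        # Extrapolation step: probe the gradient field ahead.
        x_half = x - step * grad_x(x, y)
        y_half = y + step * grad_y(x, y)
        # Update step: move using the gradient at the probe point.
        x = x - step * grad_x(x_half, y_half)
        y = y + step * grad_y(x_half, y_half)
    return x, y

# Bilinear toy problem f(x, y) = x * y with saddle point (0, 0);
# plain gradient descent-ascent spirals outward here, while
# extragradient converges.
x_star, y_star = extragradient(lambda x, y: y, lambda x, y: x, 1.0, 0.5)
```

On a Riemannian manifold, the subtractions above would be replaced by exponential (or retraction) maps and gradients would live in tangent spaces, which is where the geometric difficulties analyzed in the paper arise.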
Related papers
- Riemannian stochastic optimization methods avoid strict saddle points [68.80251170757647]
We show that policies under study avoid strict saddle points / submanifolds with probability 1.
This result provides an important sanity check as it shows that, almost always, the limit state of an algorithm can only be a local minimizer.
arXiv Detail & Related papers (2023-11-04T11:12:24Z) - Extragradient Type Methods for Riemannian Variational Inequality Problems [25.574847201669144]
We show that the average-iterate convergence rate of both REG and RPEG is $O\left(\frac{1}{T}\right)$, matching the Euclidean case.
These results are enabled by judiciously addressing the holonomy effect so that additional complications can be reduced.
arXiv Detail & Related papers (2023-09-25T14:08:02Z) - Curvature-Independent Last-Iterate Convergence for Games on Riemannian
Manifolds [77.4346324549323]
We show that a step size agnostic to the curvature of the manifold achieves a curvature-independent and linear last-iterate convergence rate.
To the best of our knowledge, the possibility of curvature-independent rates and/or last-iterate convergence has not been considered before.
arXiv Detail & Related papers (2023-06-29T01:20:44Z) - Escaping limit cycles: Global convergence for constrained
nonconvex-nonconcave minimax problems [46.71914490521821]
This paper introduces a new extragradient-type algorithm for a class of nonconvex-nonconcave minimax problems.
The proposed algorithm is applicable to constrained and regularized problems.
arXiv Detail & Related papers (2023-02-20T08:37:08Z) - Decentralized Riemannian Algorithm for Nonconvex Minimax Problems [82.50374560598493]
Minimax optimization algorithms have been developed to solve many machine learning problems, such as training neural networks.
In this paper, we propose two types of decentralized Riemannian minimax algorithms.
For the stochastic setting, we propose DRSGDA and prove a gradient-complexity guarantee for our method.
arXiv Detail & Related papers (2023-02-08T01:42:45Z) - What is a Good Metric to Study Generalization of Minimax Learners? [24.577243536475233]
Minimax optimization has served as the backbone of many machine learning (ML) problems.
How the solution trained on training data performs on unseen test data has been relatively underexplored.
We propose a new metric for the generalization of minimax learners, the primal gap, to answer these questions.
arXiv Detail & Related papers (2022-06-09T13:39:06Z) - A singular Riemannian geometry approach to Deep Neural Networks I.
Theoretical foundations [77.86290991564829]
Deep Neural Networks are widely used for solving complex problems in several scientific areas, such as speech recognition, machine translation, image analysis.
We study a particular sequence of maps between manifolds, with the last manifold of the sequence equipped with a Riemannian metric.
We investigate the theoretical properties of the maps of such a sequence, eventually focusing on the case of maps implementing neural networks of practical interest.
arXiv Detail & Related papers (2021-12-17T11:43:30Z) - Minimax Optimization: The Case of Convex-Submodular [50.03984152441271]
Minimax problems extend beyond the continuous domain to mixed continuous-discrete domains or even fully discrete domains.
We introduce the class of convex-submodular minimax problems, where the objective is convex with respect to the continuous variable and submodular with respect to the discrete variable.
Our proposed algorithms are iterative and combine tools from both discrete and continuous optimization.
arXiv Detail & Related papers (2021-11-01T21:06:35Z) - On the Local Linear Rate of Consensus on the Stiefel Manifold [39.750623187256735]
We study the convergence properties of Riemannian gradient descent for the consensus problem on the Stiefel manifold (for a connected network).
The main challenges include (i) developing new analysis techniques, and (ii) identifying the conditions (e.g., a suitable step size and initialization) under which the iterates always stay in a local region.
arXiv Detail & Related papers (2021-01-22T21:52:38Z) - On the Distribution of Minima in Intrinsic-Metric Rotation Averaging [14.226664427133901]
This paper studies the spatial distribution of local minima in a Rotation Averaging problem.
By recognizing the quotient manifold geometry of the problem, we achieve an improvement for general $l_p$ costs.
Our results suggest using connectivity as an indicator of problem difficulty.
arXiv Detail & Related papers (2020-03-18T16:03:22Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.