Differentiating Nonsmooth Solutions to Parametric Monotone Inclusion Problems
- URL: http://arxiv.org/abs/2212.07844v1
- Date: Thu, 15 Dec 2022 14:05:32 GMT
- Authors: Jérôme Bolte (TSE-R), Edouard Pauwels (IRIT-ADRIA), Antonio José Silveti-Falls (CVN, OPIS)
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We leverage path differentiability and a recent result on nonsmooth implicit
differentiation calculus to give sufficient conditions ensuring that the
solution to a monotone inclusion problem will be path differentiable, with
formulas for computing its generalized gradient. A direct consequence of our
result is that these solutions happen to be differentiable almost everywhere.
Our approach is fully compatible with automatic differentiation and comes with
assumptions which are easy to check, roughly speaking: semialgebraicity and
strong monotonicity. We illustrate the scope of our results by considering
three fundamental composite problem settings: strongly convex problems, dual
solutions to convex minimization problems and primal-dual solutions to min-max
problems.
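As a toy illustration of the kind of nonsmooth solution map the abstract describes, consider soft thresholding: the solution of a strongly convex, semialgebraic problem whose solution map is path differentiable and differentiable almost everywhere, with a kink at |theta| = lambda. This is a hypothetical sketch, not code from the paper; the function names are chosen for illustration.

```python
import numpy as np

def soft_threshold(theta, lam):
    """Solution x*(theta) of min_x 0.5*(x - theta)**2 + lam*|x|,
    a strongly convex, semialgebraic problem (the prox of lam*|.|)."""
    return np.sign(theta) * np.maximum(np.abs(theta) - lam, 0.0)

def solution_derivative(theta, lam):
    """An element of the generalized gradient of theta -> x*(theta):
    1 where |theta| > lam, 0 where |theta| < lam (kink at |theta| = lam)."""
    return float(np.abs(theta) > lam)

lam = 1.0
for theta in (2.0, 0.5, -3.0):
    # Away from the kink, a centered finite difference matches the formula.
    eps = 1e-6
    fd = (soft_threshold(theta + eps, lam)
          - soft_threshold(theta - eps, lam)) / (2 * eps)
    print(theta, solution_derivative(theta, lam), round(fd, 6))
```

The solution map is globally Lipschitz and nonsmooth, yet differentiable everywhere except on the measure-zero set |theta| = lambda, matching the "differentiable almost everywhere" conclusion.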
Related papers
- Augmented neural forms with parametric boundary-matching operators for solving ordinary differential equations
This paper introduces a formalism for systematically crafting proper neural forms with boundary matches that are amenable to optimization.
It describes a novel technique for converting problems with Neumann or Robin conditions into equivalent problems with parametric Dirichlet conditions.
The proposed augmented neural forms approach was tested on a set of diverse problems, encompassing first- and second-order ordinary differential equations, as well as first-order systems.
arXiv Detail & Related papers (2024-04-30T11:10:34Z)
- High-Probability Bounds for Stochastic Optimization and Variational Inequalities: the Case of Unbounded Variance
We propose algorithms with high-probability convergence results under less restrictive assumptions.
These results justify the usage of the considered methods for solving problems that do not fit standard functional classes in optimization.
arXiv Detail & Related papers (2023-02-02T10:37:23Z)
- A Deep Double Ritz Method for solving Partial Differential Equations
Residual minimization is a widely used technique for solving Partial Differential Equations in variational form.
It minimizes the dual norm of the residual, which naturally yields a saddle-point (min-max) problem over the so-called trial and test spaces.
arXiv Detail & Related papers (2022-11-07T15:34:07Z)
- Learning Proximal Operators to Discover Multiple Optima
We present an end-to-end method to learn the proximal operator across a family of problems.
We show that for weakly convex objectives and under mild conditions, the method converges globally.
arXiv Detail & Related papers (2022-01-28T05:53:28Z)
- Minimax Optimization: The Case of Convex-Submodular
Minimax problems extend beyond the continuous domain to mixed continuous-discrete domains or even fully discrete domains.
We introduce the class of convex-submodular minimax problems, where the objective is convex with respect to the continuous variable and submodular with respect to the discrete variable.
Our proposed algorithms are iterative and combine tools from both discrete and continuous optimization.
arXiv Detail & Related papers (2021-11-01T21:06:35Z)
- Efficient and Modular Implicit Differentiation
We propose a unified, efficient and modular approach for implicit differentiation of optimization problems.
We show that seemingly simple principles allow one to recover many recently proposed implicit differentiation methods and to create new ones easily.
arXiv Detail & Related papers (2021-05-31T17:45:58Z)
- Stability and Generalization of Stochastic Gradient Methods for Minimax Problems
Many machine learning problems can be formulated as minimax problems, such as the training of Generative Adversarial Networks (GANs).
We provide a comprehensive generalization analysis of stochastic gradient methods for minimax problems.
arXiv Detail & Related papers (2021-05-08T22:38:00Z)
- Stochastic Variance Reduction for Variational Inequality Methods
We propose variance reduced algorithms for solving convex-concave saddle point problems, monotone variational inequalities, and monotone inclusions.
Our framework applies to extragradient, forward-backward-forward, and forward-reflected-backward methods in both the Euclidean and Bregman setups.
arXiv Detail & Related papers (2021-02-16T18:39:16Z) - Conditional gradient methods for stochastically constrained convex
minimization [54.53786593679331]
We propose two novel conditional gradient-based methods for solving structured convex optimization problems.
The most important feature of our framework is that only a subset of the constraints is processed at each iteration.
Our algorithms rely on variance reduction and smoothing used in conjunction with conditional gradient steps, and are accompanied by rigorous convergence guarantees.
arXiv Detail & Related papers (2020-07-07T21:26:35Z) - Halpern Iteration for Near-Optimal and Parameter-Free Monotone Inclusion
and Strong Solutions to Variational Inequalities [14.848525762485872]
We leverage the connections between nonexpansive maps, monotone Lipschitz operators, and proximal mappings to obtain near-optimal solutions to monotone inclusion problems.
These results translate into near-optimal guarantees for approximating strong solutions to variational inequality problems, approximating convex-concave min-max optimization problems, and minimizing the norm of the gradient in min-max optimization problems.
arXiv Detail & Related papers (2020-02-20T17:12:49Z)
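The Halpern iteration mentioned above can be sketched in a few lines: it anchors each step of a nonexpansive map back toward the starting point with a vanishing weight. This is a minimal toy sketch under the standard scheme x_{k+1} = a_k x_0 + (1 - a_k) T(x_k) with a_k = 1/(k+2), not code from the listed paper; the choice of math.cos as the nonexpansive map is purely illustrative.

```python
import math

def halpern(T, x0, iters=2000):
    """Halpern iteration x_{k+1} = a_k * x0 + (1 - a_k) * T(x_k),
    with anchoring weights a_k = 1/(k+2); converges to a fixed
    point of a nonexpansive map T."""
    x = x0
    for k in range(iters):
        a = 1.0 / (k + 2)
        x = a * x0 + (1 - a) * T(x)
    return x

# math.cos is nonexpansive on the reals (|cos'| = |sin| <= 1); its
# unique fixed point is the Dottie number, approximately 0.739085.
x_star = halpern(math.cos, 1.0)
print(x_star)
```

The anchoring term a_k * x0 is what distinguishes Halpern iteration from the plain Picard iteration x_{k+1} = T(x_k) and underlies the near-optimal residual rates discussed in the paper.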
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.