Mirrorless Mirror Descent: A Natural Derivation of Mirror Descent
- URL: http://arxiv.org/abs/2004.01025v3
- Date: Fri, 2 Jul 2021 00:33:01 GMT
- Title: Mirrorless Mirror Descent: A Natural Derivation of Mirror Descent
- Authors: Suriya Gunasekar, Blake Woodworth, Nathan Srebro
- Abstract summary: We present a primal-only derivation of Mirror Descent as a "partial" discretization of gradient flow on a Riemannian manifold whose metric tensor is the Hessian of the Mirror Descent potential.
We contrast this discretization with Natural Gradient Descent, which is obtained by a "full" forward Euler discretization.
- Score: 29.502762393574702
- License: http://creativecommons.org/publicdomain/zero/1.0/
- Abstract: We present a primal only derivation of Mirror Descent as a "partial"
discretization of gradient flow on a Riemannian manifold where the metric
tensor is the Hessian of the Mirror Descent potential. We contrast this
discretization to Natural Gradient Descent, which is obtained by a "full"
forward Euler discretization. This view helps shed light on the relationship
between the methods and allows generalizing Mirror Descent to general
Riemannian geometries, even when the metric tensor is not a Hessian, and
thus there is no "dual."
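The two discretizations can be contrasted in a toy sketch (our own illustration, not code from the paper): take the negative-entropy potential phi(x) = sum_i x_i*log(x_i) on the simplex, whose Hessian diag(1/x) is the metric tensor. Mirror descent steps in the dual via the gradient of phi, while natural gradient descent takes a forward Euler step preconditioned by the inverse Hessian.

```python
import numpy as np

# Toy illustration (our own, not code from the paper): on the simplex with
# the negative-entropy potential phi(x) = sum_i x_i*log(x_i), the metric
# tensor (Hessian of phi) is diag(1/x). Mirror descent discretizes the
# gradient flow "partially" by stepping in the dual; natural gradient
# descent is a "full" forward Euler step preconditioned by the inverse
# Hessian.

def mirror_descent_step(x, grad, eta):
    # Dual step: nabla phi(x) = 1 + log(x); the constant cancels after
    # normalization, giving the familiar exponentiated-gradient update.
    y = np.log(x) - eta * grad
    x_new = np.exp(y)
    return x_new / x_new.sum()

def natural_gradient_step(x, grad, eta):
    # Forward Euler on the manifold: x <- x - eta * H(x)^{-1} grad,
    # where H(x)^{-1} = diag(x) for the entropy potential.
    x_new = x - eta * x * grad
    return x_new / x_new.sum()
```

For small eta the two updates agree to first order; the paper makes precise which parts of the underlying flow each method discretizes.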
Related papers
- Entropic Mirror Descent for Linear Systems: Polyak's Stepsize and Implicit Bias [55.72269695392027]
This paper focuses on applying entropic mirror descent to solve linear systems.
The main challenge for the convergence analysis stems from the unboundedness of the domain.
To overcome this without imposing restrictive assumptions, we introduce a variant of Polyak-type stepsizes.
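The ingredients can be sketched as follows (an illustration under our own assumptions: a consistent system so the optimal value f* = 0, and the classical Polyak stepsize rather than the paper's specific variant):

```python
import numpy as np

# Sketch (our assumptions: consistent system, so f* = 0, and the classical
# Polyak stepsize rather than the paper's variant): entropic mirror descent
# on f(x) = 0.5*||Ax - b||^2. The multiplicative update keeps iterates in
# the positive orthant, whose unboundedness is the difficulty the paper
# addresses.

def entropic_md_polyak(A, b, x0, iters=200):
    x = x0.copy()
    for _ in range(iters):
        r = A @ x - b
        fval = 0.5 * (r @ r)
        if fval < 1e-16:
            break
        g = A.T @ r                      # gradient of f at x
        eta = fval / (g @ g)             # Polyak stepsize with f* = 0
        x = x * np.exp(-eta * g)         # entropic mirror descent step
    return x
```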
arXiv Detail & Related papers (2025-05-05T12:33:18Z) - MirrorGaussian: Reflecting 3D Gaussians for Reconstructing Mirror Reflections [58.003014868772254]
MirrorGaussian is the first method for mirror scene reconstruction with real-time rendering based on 3D Gaussian Splatting.
We introduce an intuitive dual-rendering strategy that enables differentiable rasterization of both the real-world 3D Gaussians and their mirrored counterparts.
Our approach significantly outperforms existing methods, achieving state-of-the-art results.
arXiv Detail & Related papers (2024-05-20T09:58:03Z) - Intrinsic Bayesian Cramér-Rao Bound with an Application to Covariance Matrix Estimation [49.67011673289242]
This paper presents a new performance bound for estimation problems where the parameter to estimate lies in a smooth manifold.
It induces a geometry for the parameter manifold, as well as an intrinsic notion of the estimation error measure.
arXiv Detail & Related papers (2023-11-08T15:17:13Z) - Curvature-Independent Last-Iterate Convergence for Games on Riemannian
Manifolds [77.4346324549323]
We show that a step size agnostic to the curvature of the manifold achieves a curvature-independent and linear last-iterate convergence rate.
To the best of our knowledge, the possibility of curvature-independent rates and/or last-iterate convergence has not been considered before.
arXiv Detail & Related papers (2023-06-29T01:20:44Z) - Implicit Bias of Gradient Descent on Reparametrized Models: On
Equivalence to Mirror Descent [64.26008239544085]
Gradient flow with any commuting parametrization is equivalent to continuous mirror descent with a related Legendre function.
Conversely, continuous mirror descent with any Legendre function can be viewed as gradient flow with a related commuting parametrization.
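A minimal numerical illustration of this correspondence (a toy sketch of our own with the elementwise quadratic parametrization x = w*w, not the paper's general statement): one gradient step on w induces a multiplicative update on x that matches an exponentiated-gradient mirror descent step up to O(eta^2).

```python
import numpy as np

# Toy numerical check (ours, not the paper's general theorem): for the
# commuting quadratic parametrization x = w*w (elementwise), a gradient
# step on w induces a multiplicative update on x that agrees with an
# exponentiated-gradient mirror descent step up to O(eta^2).

def gd_on_reparam(w, grad_x, eta):
    # Chain rule: d/dw f(w*w) = 2*w*grad_x.
    w_new = w - eta * 2.0 * w * grad_x
    return w_new * w_new            # induced update on x = w*w

def exp_grad_step(x, grad_x, eta):
    # Mirror descent with an entropy-type Legendre function (scaled so the
    # first-order terms match): x <- x * exp(-4*eta*grad_x).
    return x * np.exp(-4.0 * eta * grad_x)
```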
arXiv Detail & Related papers (2022-07-08T17:47:11Z) - Mirror Descent with Relative Smoothness in Measure Spaces, with
application to Sinkhorn and EM [11.007661197604065]
This paper studies the convergence of the mirror descent algorithm in an infinite-dimensional setting.
Applying our result to joint distributions and the Kullback--Leibler divergence, we show that Sinkhorn's primal iterations for optimal transport correspond to a mirror descent.
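For reference, here is a sketch of Sinkhorn's standard primal iterations (a generic textbook version; the paper's contribution is interpreting these iterations as mirror descent with the Kullback-Leibler divergence):

```python
import numpy as np

# Generic textbook Sinkhorn iterations (the paper interprets these primal
# iterations as mirror descent with the KL divergence). Computes an
# entropically regularized transport plan between histograms a and b.

def sinkhorn(a, b, C, eps=0.1, iters=500):
    K = np.exp(-C / eps)                 # Gibbs kernel from the cost matrix
    u = np.ones_like(a)
    v = np.ones_like(b)
    for _ in range(iters):
        v = b / (K.T @ u)                # match column marginals
        u = a / (K @ v)                  # match row marginals
    return u[:, None] * K * v[None, :]   # plan diag(u) @ K @ diag(v)
```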
arXiv Detail & Related papers (2022-06-17T16:19:47Z) - Implicit Regularization in Matrix Sensing via Mirror Descent [29.206451882562867]
We study discrete-time mirror descent applied to the unregularized empirical risk in matrix sensing.
We show that gradient descent with full-rank factorized parametrization is a first-order approximation to mirror descent.
arXiv Detail & Related papers (2021-05-28T13:46:47Z) - Nearly Minimax-Optimal Rates for Noisy Sparse Phase Retrieval via
Early-Stopped Mirror Descent [29.206451882562867]
This paper studies early-stopped mirror descent applied to noisy sparse phase retrieval.
We find that this simple algorithm promotes sparsity without relying on explicit regularization or thresholding steps.
arXiv Detail & Related papers (2021-05-08T11:22:19Z) - A Continuous-Time Mirror Descent Approach to Sparse Phase Retrieval [24.17778927729799]
We analyze continuous-time mirror descent applied to sparse phase retrieval.
This is the problem of recovering sparse signals from a set of magnitude-only measurements.
We provide a convergence analysis of the algorithm for this problem.
arXiv Detail & Related papers (2020-10-20T10:03:44Z) - Disentangling by Subspace Diffusion [72.1895236605335]
We show that fully unsupervised factorization of a data manifold is possible if the true metric of the manifold is known.
Our work reduces the question of whether unsupervised disentangling is possible to that of whether unsupervised metric learning is possible, providing a unifying insight into the geometric nature of representation learning.
arXiv Detail & Related papers (2020-06-23T13:33:19Z) - Quantum Geometric Confinement and Dynamical Transmission in Grushin
Cylinder [68.8204255655161]
We classify the self-adjoint realisations of the Laplace-Beltrami operator minimally defined on an infinite cylinder.
We retrieve those distinguished extensions previously identified in the recent literature, namely the most confining and the most transmitting.
arXiv Detail & Related papers (2020-03-16T11:37:23Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this content (including all information) and is not responsible for any consequences of its use.