A Continuous-Time Mirror Descent Approach to Sparse Phase Retrieval
- URL: http://arxiv.org/abs/2010.10168v1
- Date: Tue, 20 Oct 2020 10:03:44 GMT
- Title: A Continuous-Time Mirror Descent Approach to Sparse Phase Retrieval
- Authors: Fan Wu and Patrick Rebeschini
- Abstract summary: We analyze continuous-time mirror descent applied to sparse phase retrieval,
the problem of recovering sparse signals from a set of magnitude-only measurements.
We provide a convergence analysis of the algorithm for this problem.
- Score: 24.17778927729799
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We analyze continuous-time mirror descent applied to sparse phase retrieval,
which is the problem of recovering sparse signals from a set of magnitude-only
measurements. We apply mirror descent to the unconstrained empirical risk
minimization problem (batch setting), using the square loss and square
measurements. We provide a convergence analysis of the algorithm in this
non-convex setting and prove that, with the hypentropy mirror map, mirror
descent recovers any $k$-sparse vector $\mathbf{x}^\star\in\mathbb{R}^n$ with
minimum (in modulus) non-zero entry on the order of $\| \mathbf{x}^\star
\|_2/\sqrt{k}$ from $k^2$ Gaussian measurements, modulo logarithmic terms. This
yields a simple algorithm which, unlike most existing approaches to sparse
phase retrieval, adapts to the sparsity level, without including thresholding
steps or adding regularization terms. Our results also provide a principled
theoretical understanding for Hadamard Wirtinger flow [58], as Euclidean
gradient descent applied to the empirical risk problem with Hadamard
parametrization can be recovered as a first-order approximation to mirror
descent in discrete time.
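To make the analyzed algorithm concrete, below is a minimal discrete-time sketch of mirror descent with the hypentropy mirror map on this empirical risk. It is a sketch under stated assumptions, not the paper's exact procedure: the paper studies the continuous-time dynamics, and the initialization, step size, `beta`, and iteration count here are illustrative choices.

```python
import numpy as np

def mirror_descent_phase_retrieval(A, y, beta=1e-3, step=0.1, n_iter=10000):
    """Mirror descent with the hypentropy mirror map
    Phi_beta(x) = sum_i [x_i * arcsinh(x_i / beta) - sqrt(x_i**2 + beta**2)],
    so grad Phi_beta(x) = arcsinh(x / beta) and its inverse is beta * sinh(u).

    A: (m, n) Gaussian measurement matrix; y: squared measurements (A @ x_star)**2.
    A small beta strengthens the implicit bias toward sparse solutions.
    """
    m, n = A.shape
    x = np.full(n, beta)          # small nonzero start (illustrative choice)
    u = np.arcsinh(x / beta)      # dual iterate u = grad Phi_beta(x)
    for _ in range(n_iter):
        r = A @ x
        grad = A.T @ ((r**2 - y) * r) / m  # grad of (1/(4m)) sum((r^2 - y)^2)
        u -= step * grad                   # gradient step in the dual space
        x = beta * np.sinh(u)              # map back to the primal space
    return x

# Toy usage: recover a 3-sparse unit-norm vector up to a global sign.
rng = np.random.default_rng(0)
n, k, m = 200, 3, 120
x_star = np.zeros(n)
x_star[:k] = 1.0 / np.sqrt(k)
A = rng.standard_normal((m, n))
y = (A @ x_star) ** 2
x_hat = mirror_descent_phase_retrieval(A, y)
print(min(np.linalg.norm(x_hat - x_star), np.linalg.norm(x_hat + x_star)))
```

The dual update is a plain gradient step on the unregularized loss; any bias toward sparsity comes from the geometry of the mirror map alone, which is what lets the method adapt to the sparsity level without thresholding or explicit regularization. This is also where the connection to Hadamard Wirtinger flow enters: expanding the update to first order in the step size recovers Euclidean gradient descent under a Hadamard parametrization.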
Related papers
- A Sample Efficient Alternating Minimization-based Algorithm For Robust Phase Retrieval [56.67706781191521]
In this work, we study a robust phase retrieval problem in which the task is to recover an unknown signal.
Our proposed oracle avoids the need for computationally expensive spectral initialization, using a simple gradient descent step robust to outliers.
arXiv Detail & Related papers (2024-09-07T06:37:23Z)
- Stable Phase Retrieval with Mirror Descent [0.5312303275762104]
We show that mirror descent converges to a critical point of the phase retrieval problem.
We provide global convergence guarantees that ensure that with high probability, mirror descent converges to a global minimizer near the true vector.
arXiv Detail & Related papers (2024-05-17T13:07:14Z)
- A Mean-Field Analysis of Neural Stochastic Gradient Descent-Ascent for Functional Minimax Optimization [90.87444114491116]
This paper studies minimax optimization problems defined over infinite-dimensional function classes of overparameterized two-layer neural networks.
We address (i) the convergence of the gradient descent-ascent algorithm and (ii) the representation learning of the neural networks.
Results show that the feature representation induced by the neural networks is allowed to deviate from the initial one by a magnitude of $O(\alpha^{-1})$, measured in terms of the Wasserstein distance.
arXiv Detail & Related papers (2024-04-18T16:46:08Z)
- Provably Accelerating Ill-Conditioned Low-rank Estimation via Scaled Gradient Descent, Even with Overparameterization [48.65416821017865]
This chapter introduces a new algorithmic approach, dubbed scaled gradient descent (ScaledGD).
It converges linearly at a constant rate independent of the condition number of the low-rank object.
It maintains the low per-iteration cost of gradient descent for a variety of tasks (a minimal sketch of the update rule appears after this list).
arXiv Detail & Related papers (2023-10-09T21:16:57Z)
- Provable Phase Retrieval with Mirror Descent [1.1662472705038338]
We consider the problem of phase retrieval, which consists of recovering an $n$-dimensional real vector from the magnitude of its $m$ linear measurements.
For two measurement models, we show that when the number of measurements $m$ is large enough, then with high probability, for almost all initializers, the original vector is recovered up to a sign.
arXiv Detail & Related papers (2022-10-17T16:40:02Z)
- Nonconvex Stochastic Scaled-Gradient Descent and Generalized Eigenvector Problems [98.34292831923335]
Motivated by the problem of online correlation analysis, we propose the Stochastic Scaled-Gradient Descent (SSD) algorithm.
We bring these ideas together in an application to online correlation analysis, deriving for the first time an optimal one-time-scale algorithm with an explicit rate of local convergence to normality.
arXiv Detail & Related papers (2021-12-29T18:46:52Z)
- Towards Sample-Optimal Compressive Phase Retrieval with Sparse and Generative Priors [59.33977545294148]
We show that $O(k \log L)$ samples suffice to guarantee that the signal is close to any vector that minimizes an amplitude-based empirical loss function.
We adapt this result to sparse phase retrieval, and show that $O(s \log n)$ samples are sufficient for a similar guarantee when the underlying signal is $s$-sparse and $n$-dimensional.
arXiv Detail & Related papers (2021-06-29T12:49:54Z)
- Nearly Minimax-Optimal Rates for Noisy Sparse Phase Retrieval via Early-Stopped Mirror Descent [29.206451882562867]
This paper studies early-stopped mirror descent applied to noisy sparse phase retrieval.
We find that a simple algorithm, which does not rely on explicit regularization or thresholding steps to promote sparsity, achieves nearly minimax-optimal rates.
arXiv Detail & Related papers (2021-05-08T11:22:19Z)
- Learning Minimax Estimators via Online Learning [55.92459567732491]
We consider the problem of designing minimax estimators for estimating parameters of a probability distribution.
We construct an algorithm for finding a mixed-strategy Nash equilibrium.
arXiv Detail & Related papers (2020-06-19T22:49:42Z)
- The Statistical Complexity of Early-Stopped Mirror Descent [27.393821783237186]
We study the statistical guarantees on the excess risk achieved by early-stopped unconstrained mirror descent algorithms.
By completing an inequality that characterizes convexity for the squared loss, we identify an intrinsic link between offset Rademacher complexities and potential-based convergence analysis of mirror descent methods.
arXiv Detail & Related papers (2020-02-01T11:05:08Z)
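As referenced in the ScaledGD entry above, here is a minimal sketch of its update rule, specialized to plain low-rank matrix factorization under the square loss. The function name, the random initialization, and the hyperparameters are illustrative assumptions; the paper treats more general low-rank estimation tasks and uses a spectral initialization.

```python
import numpy as np

def scaled_gd(M, r, step=0.5, n_iter=200, seed=0):
    """Sketch of ScaledGD for factorizing M ~ L @ R.T with rank r.

    Each factor's Euclidean gradient is preconditioned by the inverse
    Gram matrix of the other factor, a cheap (r x r) inversion, which is
    what makes the rate independent of the condition number of M.
    """
    rng = np.random.default_rng(seed)
    m, n = M.shape
    L = rng.standard_normal((m, r))  # illustrative random init
    R = rng.standard_normal((n, r))  # (the paper uses spectral initialization)
    for _ in range(n_iter):
        res = L @ R.T - M            # residual of the current fit
        gL = res @ R                 # gradient of 0.5 * ||L @ R.T - M||_F^2 in L
        gR = res.T @ L               # ... and in R
        pL = np.linalg.inv(R.T @ R)  # preconditioners, both computed at the
        pR = np.linalg.inv(L.T @ L)  # current iterate before either update
        L, R = L - step * gL @ pL, R - step * gR @ pR
    return L, R

# Toy usage: an ill-conditioned rank-2 matrix (condition number 1000).
rng = np.random.default_rng(1)
U0 = np.linalg.qr(rng.standard_normal((60, 2)))[0]
V0 = np.linalg.qr(rng.standard_normal((40, 2)))[0]
M = U0 @ np.diag([1000.0, 1.0]) @ V0.T
L, R = scaled_gd(M, r=2)
print(np.linalg.norm(L @ R.T - M) / np.linalg.norm(M))  # relative error -> ~0
```

Per-iteration cost stays at that of gradient descent plus two r-by-r inversions, matching the entry's claim of low per-iteration cost.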