A Non-Asymptotic Framework for Approximate Message Passing in Spiked
Models
- URL: http://arxiv.org/abs/2208.03313v1
- Date: Fri, 5 Aug 2022 17:59:06 GMT
- Title: A Non-Asymptotic Framework for Approximate Message Passing in Spiked
Models
- Authors: Gen Li, Yuting Wei
- Abstract summary: Approximate message passing (AMP) emerges as an effective iterative paradigm for solving high-dimensional statistical problems.
Prior AMP theory fell short of predicting the AMP dynamics when the number of iterations surpasses $o\big(\frac{\log n}{\log\log n}\big)$.
This paper develops a non-asymptotic framework for understanding AMP in spiked matrix estimation.
- Score: 24.786030482013437
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Approximate message passing (AMP) emerges as an effective iterative paradigm
for solving high-dimensional statistical problems. However, prior AMP theory --
which focused mostly on high-dimensional asymptotics -- fell short of
predicting the AMP dynamics when the number of iterations surpasses
$o\big(\frac{\log n}{\log\log n}\big)$ (with $n$ the problem dimension). To
address this inadequacy, this paper develops a non-asymptotic framework for
understanding AMP in spiked matrix estimation. Built upon a new decomposition of the
AMP updates and controllable residual terms, we lay out an analysis recipe to
characterize the finite-sample behavior of AMP in the presence of an
independent initialization, which is further generalized to allow for spectral
initialization. As two concrete consequences of the proposed analysis recipe:
(i) when solving $\mathbb{Z}_2$ synchronization, we predict the behavior of
spectrally initialized AMP for up to $O\big(\frac{n}{\mathrm{poly}\log n}\big)$
iterations, showing that the algorithm succeeds without the need for a
subsequent refinement stage (as conjectured recently by
\citet{celentano2021local}); (ii) we characterize the non-asymptotic behavior
of AMP in sparse PCA (in the spiked Wigner model) for a broad range of
signal-to-noise ratios.
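For concreteness, the following is a minimal simulation sketch of AMP for $\mathbb{Z}_2$ synchronization in a spiked Wigner model with spectral initialization. It is illustrative only and not the exact algorithm or constants analyzed in the paper: the model normalization $M = \frac{\lambda}{n} x^\star x^{\star\top} + W$, the fixed $\tanh(\lambda\,\cdot)$ denoiser, the rescaling of the leading eigenvector, and the omitted memory term at the first iteration are all simplifying assumptions.

```python
# Minimal sketch (assumed setup, not the paper's exact algorithm):
# AMP for Z2 synchronization in a spiked Wigner model with spectral initialization.
import numpy as np

rng = np.random.default_rng(0)
n, lam, T = 2000, 1.5, 25          # dimension, SNR lambda > 1, number of AMP iterations

# Spiked Wigner observation: M = (lam/n) x* x*^T + W, W symmetric with
# off-diagonal entries of variance 1/n (an assumed normalization).
x_star = rng.choice([-1.0, 1.0], size=n)                 # +/-1 signal, ||x*||^2 = n
G = rng.normal(scale=1.0 / np.sqrt(n), size=(n, n))
W = (G + G.T) / np.sqrt(2.0)
M = (lam / n) * np.outer(x_star, x_star) + W

# Spectral initialization: leading eigenvector of M, rescaled to norm sqrt(n).
_, eigvecs = np.linalg.eigh(M)                           # eigenvalues returned in ascending order
x_curr = np.sqrt(n) * eigvecs[:, -1]
f_prev = np.zeros(n)                                     # simplification: no memory term at the first step

for t in range(T):
    f_curr = np.tanh(lam * x_curr)                       # entrywise denoiser for a +/-1 signal in Gaussian noise
    onsager = lam * np.mean(1.0 - f_curr**2)             # Onsager correction: (1/n) * sum_i f'(x_i^t)
    x_curr = M @ f_curr - onsager * f_prev               # AMP update
    f_prev = f_curr

# Overlap with the truth (x* is only identifiable up to a global sign flip)
overlap = abs(np.dot(np.sign(x_curr), x_star)) / n
print(f"overlap between sign(AMP iterate) and x*: {overlap:.3f}")
```

For $\lambda > 1$, the reported overlap should settle well above the value attained by random guessing, consistent with the state-evolution heuristics that the non-asymptotic analyses above aim to justify at finite $n$.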
Related papers
- Non-asymptotic bounds for forward processes in denoising diffusions: Ornstein-Uhlenbeck is hard to beat [49.1574468325115]
This paper presents explicit non-asymptotic bounds on the forward diffusion error in total variation (TV).
We parametrise multi-modal data distributions in terms of the distance $R$ to their furthest modes and consider forward diffusions with additive and multiplicative noise.
arXiv Detail & Related papers (2024-08-25T10:28:31Z)
- A Random Matrix Approach to Low-Multilinear-Rank Tensor Approximation [24.558241146742205]
We characterize the large-dimensional spectral behavior of the unfoldings of the data tensor and exhibit relevant signal-to-noise ratios governing the detectability of the principal directions of the signal.
These results allow one to accurately predict the reconstruction performance of truncated multilinear SVD (MLSVD) in the non-trivial regime.
arXiv Detail & Related papers (2024-02-05T16:38:30Z)
- A non-asymptotic distributional theory of approximate message passing
for sparse and robust regression [20.830017611900832]
This paper develops non-asymptotic distributional characterizations for approximate message passing (AMP).
AMP is a family of iterative algorithms that prove effective as both fast estimators and powerful theoretical machinery.
arXiv Detail & Related papers (2024-01-08T14:34:35Z)
- Restoration-Degradation Beyond Linear Diffusions: A Non-Asymptotic
Analysis For DDIM-Type Samplers [90.45898746733397]
We develop a framework for non-asymptotic analysis of deterministic samplers used for diffusion generative modeling.
We show that one step along the probability flow ODE can be expressed as two steps: 1) a restoration step that runs ascent on the conditional log-likelihood at some infinitesimally previous time, and 2) a degradation step that runs the forward process using noise pointing back towards the current gradient.
arXiv Detail & Related papers (2023-03-06T18:59:19Z)
- Approximate message passing from random initialization with applications
to $\mathbb{Z}_{2}$ synchronization [25.511475484340156]
This paper is concerned with the problem of reconstructing an unknown rank-one matrix with prior structural information from observations.
We provide the first non-asymptotic characterization of Approximate Message Passing (AMP) in this model.
arXiv Detail & Related papers (2023-02-07T18:58:08Z)
- Heavy-tailed Streaming Statistical Estimation [58.70341336199497]
We consider the task of heavy-tailed statistical estimation given streaming $p$ samples.
We design a clipped gradient descent and provide an improved analysis under a more nuanced condition on the noise of gradients.
arXiv Detail & Related papers (2021-08-25T21:30:27Z)
- Towards Sample-Optimal Compressive Phase Retrieval with Sparse and
Generative Priors [59.33977545294148]
We show that $O(k \log L)$ samples suffice to guarantee that the signal is close to any vector that minimizes an amplitude-based empirical loss function.
We adapt this result to sparse phase retrieval, and show that $O(s \log n)$ samples are sufficient for a similar guarantee when the underlying signal is $s$-sparse and $n$-dimensional.
arXiv Detail & Related papers (2021-06-29T12:49:54Z)
- PCA Initialization for Approximate Message Passing in Rotationally
Invariant Models [29.039655256171088]
Principal Component Analysis provides a natural estimator, and sharp results on its performance have been obtained in the high-dimensional regime.
Recently, an Approximate Message Passing (AMP) algorithm has been proposed as an alternative estimator with the potential to improve the accuracy of PCA.
In this work, we combine the two methods, initialize AMP with PCA, and propose a rigorous characterization of the performance of this estimator.
arXiv Detail & Related papers (2021-06-04T09:13:51Z)
- Approximate Message Passing with Spectral Initialization for Generalized
Linear Models [35.618694363241744]
We focus on estimators based on approximate message passing (AMP).
We propose an AMP algorithm with a spectral estimator.
We also provide numerical results that demonstrate the validity of the proposed approach.
arXiv Detail & Related papers (2020-10-07T14:52:35Z)
- A maximum-entropy approach to off-policy evaluation in average-reward
MDPs [54.967872716145656]
This work focuses on off-policy evaluation (OPE) with function approximation in infinite-horizon undiscounted Markov decision processes (MDPs).
We provide the first finite-sample OPE error bound, extending existing results beyond the episodic and discounted cases.
We show that this results in an exponential-family distribution whose sufficient statistics are the features, paralleling maximum-entropy approaches in supervised learning.
arXiv Detail & Related papers (2020-06-17T18:13:37Z)
- Is Temporal Difference Learning Optimal? An Instance-Dependent Analysis [102.29671176698373]
We address the problem of policy evaluation in discounted Markov decision processes, and provide instance-dependent guarantees on the $\ell_\infty$ error under a generative model.
We establish both asymptotic and non-asymptotic versions of local minimax lower bounds for policy evaluation, thereby providing an instance-dependent baseline by which to compare algorithms.
arXiv Detail & Related papers (2020-03-16T17:15:28Z)