PCA Initialization for Approximate Message Passing in Rotationally
Invariant Models
- URL: http://arxiv.org/abs/2106.02356v1
- Date: Fri, 4 Jun 2021 09:13:51 GMT
- Title: PCA Initialization for Approximate Message Passing in Rotationally
Invariant Models
- Authors: Marco Mondelli and Ramji Venkataramanan
- Abstract summary: Principal Component Analysis provides a natural estimator, and sharp results on its performance have been obtained in the high-dimensional regime.
Recently, an Approximate Message Passing (AMP) algorithm has been proposed as an alternative estimator with the potential to improve the accuracy of PCA.
In this work, we combine the two methods, initializing AMP with PCA, and establish a rigorous characterization of the performance of this estimator.
- Score: 29.039655256171088
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We study the problem of estimating a rank-$1$ signal in the presence of
rotationally invariant noise, a class of perturbations more general than
Gaussian noise. Principal Component Analysis (PCA) provides a natural
estimator, and sharp results on its performance have been obtained in the
high-dimensional regime. Recently, an Approximate Message Passing (AMP)
algorithm has been proposed as an alternative estimator with the potential to
improve the accuracy of PCA. However, the existing analysis of AMP requires an
initialization that is both correlated with the signal and independent of the
noise, which is often unrealistic in practice. In this work, we combine the two
methods, and propose to initialize AMP with PCA. Our main result is a rigorous
asymptotic characterization of the performance of this estimator. Both the AMP
algorithm and its analysis differ from those previously derived in the Gaussian
setting: at every iteration, our AMP algorithm requires a specific term to
account for PCA initialization, while in the Gaussian case, PCA initialization
affects only the first iteration of AMP. The proof is based on a two-phase
artificial AMP that first approximates the PCA estimator and then mimics the
true AMP. Our numerical simulations show an excellent agreement between AMP
results and theoretical predictions, and suggest an interesting open direction
on achieving Bayes-optimal performance.
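To make the setup concrete, below is a minimal, self-contained Python/NumPy sketch of AMP with PCA (spectral) initialization, written for the simpler Gaussian spiked model with a Rademacher signal. It is an illustration only, not the paper's algorithm: in the rotationally invariant setting the recursion carries an additional PCA-dependent correction term at every iteration, which is omitted here, and the fixed denoiser scaling is a simplifying assumption.

```python
import numpy as np

# Illustrative sketch, NOT the algorithm from the paper: AMP with PCA
# initialization for the Gaussian spiked model
#   Y = (lam / n) * x x^T + W,  W ~ GOE,  x_i in {-1, +1}.
# The rotationally invariant case studied in the paper needs an extra
# PCA-dependent correction at every iteration, omitted here.

def amp_with_pca_init(Y, lam, n_iter=20):
    n = Y.shape[0]
    # PCA initialization: leading eigenvector of Y, scaled to squared norm n.
    _, eigvecs = np.linalg.eigh(Y)
    x = np.sqrt(n) * eigvecs[:, -1]

    # Stylized posterior-mean denoiser for a Rademacher prior; the fixed
    # scaling lam inside tanh is a simplification (state evolution would
    # track the correct scaling at each iteration).
    eta = lambda u: np.tanh(lam * u)
    eta_prime = lambda u: lam * (1.0 - np.tanh(lam * u) ** 2)

    m_prev = np.zeros(n)
    for _ in range(n_iter):
        m = eta(x)               # denoised estimate of the signal
        b = eta_prime(x).mean()  # Onsager coefficient
        x = Y @ m - b * m_prev   # AMP update with memory correction
        m_prev = m
    return eta(x)

# Hypothetical usage on synthetic data:
rng = np.random.default_rng(0)
n, lam = 2000, 2.0
x_true = rng.choice([-1.0, 1.0], size=n)
G = rng.normal(size=(n, n)) / np.sqrt(n)
Y = (lam / n) * np.outer(x_true, x_true) + (G + G.T) / np.sqrt(2)
x_est = amp_with_pca_init(Y, lam)
overlap = abs(x_est @ x_true) / (np.linalg.norm(x_est) * np.linalg.norm(x_true))
print(f"overlap with the planted signal: {overlap:.3f}")
```

The sign of the PCA eigenvector is arbitrary, which is why the usage example measures the absolute overlap with the planted signal.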
Related papers
- Unrolled denoising networks provably learn optimal Bayesian inference [54.79172096306631]
We prove the first rigorous learning guarantees for neural networks based on unrolling approximate message passing (AMP).
For compressed sensing, we prove that when trained on data drawn from a product prior, the layers of the network converge to the same denoisers used in Bayes AMP.
arXiv Detail & Related papers (2024-09-19T17:56:16Z)
- A non-asymptotic distributional theory of approximate message passing for sparse and robust regression [20.830017611900832]
This paper develops non-asymptotic distributional characterizations for approximate message passing (AMP).
AMP is a family of iterative algorithms that prove effective as both fast estimators and powerful theoretical machinery.
arXiv Detail & Related papers (2024-01-08T14:34:35Z)
- Optimal Algorithms for the Inhomogeneous Spiked Wigner Model [89.1371983413931]
We derive an approximate message-passing (AMP) algorithm for the inhomogeneous problem.
In particular, we identify the existence of a statistical-to-computational gap, where known algorithms require a signal-to-noise ratio larger than the information-theoretic threshold to perform better than random.
arXiv Detail & Related papers (2023-02-13T19:57:17Z)
- Approximate message passing from random initialization with applications to $\mathbb{Z}_{2}$ synchronization [25.511475484340156]
This paper is concerned with the problem of reconstructing an unknown rank-one matrix with prior structural information from observations.
We provide the first non-asymptotic characterization of Approximate Message Passing (AMP) in this model.
arXiv Detail & Related papers (2023-02-07T18:58:08Z)
- Optimization of Annealed Importance Sampling Hyperparameters [77.34726150561087]
Annealed Importance Sampling (AIS) is a popular algorithm used to estimate the intractable marginal likelihood of deep generative models.
We present a parametric AIS process with flexible intermediary distributions and optimize the bridging distributions to use fewer sampling steps.
We assess the performance of our optimized AIS for marginal likelihood estimation of deep generative models and compare it to other estimators.
arXiv Detail & Related papers (2022-09-27T07:58:25Z)
- Sparse high-dimensional linear regression with a partitioned empirical Bayes ECM algorithm [62.997667081978825]
We propose a computationally efficient and powerful Bayesian approach for sparse high-dimensional linear regression.
Minimal prior assumptions on the parameters are imposed through plug-in empirical Bayes estimates.
The proposed approach is implemented in the R package probe.
arXiv Detail & Related papers (2022-09-16T19:15:50Z)
- A Non-Asymptotic Framework for Approximate Message Passing in Spiked Models [24.786030482013437]
Approximate message passing (AMP) emerges as an effective iterative paradigm for solving high-dimensional statistical problems.
Prior AMP theory fell short of predicting the AMP dynamics when the number of iterations surpasses $o\big(\frac{\log n}{\log\log n}\big)$.
This paper develops a non-asymptotic framework for understanding AMP in spiked matrix estimation.
arXiv Detail & Related papers (2022-08-05T17:59:06Z)
- Statistical Approach to Quantum Phase Estimation [62.92678804023415]
We introduce a new statistical and variational approach to the phase estimation algorithm (PEA).
Unlike the traditional and iterative PEAs, which return only an eigenphase estimate, the proposed method can determine any unknown eigenstate-eigenphase pair.
We show the simulation results of the method with the Qiskit package on the IBM Q platform and on a local computer.
arXiv Detail & Related papers (2021-04-21T00:02:00Z)
- Empirical Bayes PCA in high dimensions [11.806200054814772]
Principal Components Analysis is known to exhibit problematic phenomena in the presence of high-dimensional noise.
We propose an Empirical Bayes PCA method that reduces this noise by estimating a structural prior for the joint distributions of the principal components.
arXiv Detail & Related papers (2020-12-21T20:43:44Z)
- Plug-And-Play Learned Gaussian-mixture Approximate Message Passing [71.74028918819046]
We propose a plug-and-play compressed sensing (CS) recovery algorithm suitable for any i.i.d. source prior.
Our algorithm builds upon Borgerding's learned AMP (LAMP), yet significantly improves it by adopting a universal denoising function within the algorithm.
Numerical evaluation shows that the L-GM-AMP algorithm achieves state-of-the-art performance without any knowledge of the source prior.
arXiv Detail & Related papers (2020-11-18T16:40:45Z)
- Approximate Message Passing with Spectral Initialization for Generalized Linear Models [35.618694363241744]
We focus on estimators based on approximate message passing (AMP).
We propose an AMP algorithm with a spectral estimator.
We also provide numerical results that demonstrate the validity of the proposed approach.
arXiv Detail & Related papers (2020-10-07T14:52:35Z)