Memory Approximate Message Passing
- URL: http://arxiv.org/abs/2012.10861v2
- Date: Wed, 10 Feb 2021 14:31:38 GMT
- Title: Memory Approximate Message Passing
- Authors: Lei Liu, Shunqi Huang and Brian M. Kurkoski
- Abstract summary: Approximate message passing (AMP) is a low-cost iterative parameter-estimation technique.
This paper proposes a low-complexity memory AMP (MAMP) for unitarily-invariant matrices.
- Score: 9.116196799517262
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Approximate message passing (AMP) is a low-cost iterative
parameter-estimation technique for certain high-dimensional linear systems with
non-Gaussian distributions. However, AMP applies only to independent
identically distributed (IID) transform matrices and may become unreliable
(e.g., perform poorly or even diverge) for other matrix ensembles, especially
ill-conditioned ones. To handle this difficulty, orthogonal/vector AMP
(OAMP/VAMP) was proposed for general bi-unitarily-invariant matrices. However,
the Bayes-optimal OAMP/VAMP requires a high-complexity linear minimum mean
square error (LMMSE) estimator, which limits the application of OAMP/VAMP to
large-scale systems.
To overcome the disadvantages of AMP and OAMP/VAMP, this paper proposes a
low-complexity memory AMP (MAMP) for unitarily-invariant matrices. MAMP
consists of an orthogonal non-linear estimator (NLE) for denoising (the same
as in OAMP/VAMP) and an orthogonal long-memory matched filter (MF) for
interference suppression. The orthogonality principle is used to guarantee the
asymptotic Gaussianity of the estimation errors in MAMP. A state evolution is
derived to asymptotically characterize the performance of MAMP. The relaxation
parameters and damping vector in MAMP are analytically optimized based on the
state evolution to guarantee and speed up convergence. MAMP has complexity
comparable to that of AMP. Furthermore, for all unitarily-invariant matrices,
the optimized MAMP converges to the high-complexity OAMP/VAMP and is thus
Bayes-optimal if it has a unique fixed point. Finally, simulations are
provided to verify the validity and accuracy of the theoretical results.
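For concreteness, below is a minimal sketch of the baseline AMP iteration that the abstract contrasts with, applied to sparse recovery from y = A x + noise with an IID Gaussian A and a soft-threshold denoiser. The threshold rule, iteration count, and problem sizes are illustrative assumptions, not values from the paper; MAMP replaces the memoryless matched-filter step below with an orthogonal long-memory matched filter whose relaxation parameters and damping vector are optimized via state evolution.

```python
import numpy as np

def soft_threshold(v, tau):
    """Componentwise soft threshold: the non-linear denoiser for a sparse prior."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def amp(y, A, n_iter=30):
    """Baseline AMP for y = A x + noise with an IID Gaussian A (sketch)."""
    m, n = A.shape
    x = np.zeros(n)
    z = y.copy()                            # Onsager-corrected residual
    for _ in range(n_iter):
        tau = np.sqrt(np.mean(z ** 2))      # effective noise-level estimate
        r = x + A.T @ z                     # memoryless matched-filter step
        x = soft_threshold(r, tau)          # denoising (NLE)
        # Onsager correction: keeps the denoiser input r distributed as
        # signal plus approximately IID Gaussian noise across iterations
        z = y - A @ x + (np.count_nonzero(x) / m) * z
    return x

# Usage on a synthetic sparse-recovery instance (sizes are illustrative)
rng = np.random.default_rng(0)
m, n, k = 250, 500, 25
A = rng.normal(size=(m, n)) / np.sqrt(m)    # IID ensemble, where AMP is reliable
x0 = np.zeros(n)
x0[rng.choice(n, size=k, replace=False)] = rng.normal(size=k)
y = A @ x0 + 0.01 * rng.normal(size=m)
x_hat = amp(y, A)
print("relative error:", np.linalg.norm(x_hat - x0) / np.linalg.norm(x0))
```

In this IID setting the denoiser input r behaves like the true signal plus Gaussian noise of variance tau_t^2, and the standard scalar state evolution tau_{t+1}^2 = sigma^2 + (n/m) E[(eta(X + tau_t Z) - X)^2] tracks the error across iterations; the covariance-form state evolution derived in the paper plays the analogous role for MAMP under unitarily-invariant matrices.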
Related papers
- Optimal Algorithms for the Inhomogeneous Spiked Wigner Model [89.1371983413931]
We derive an approximate message-passing algorithm (AMP) for the inhomogeneous problem.
In particular, we identify a statistical-to-computational gap in which known algorithms require a signal-to-noise ratio larger than the information-theoretic threshold to perform better than random.
arXiv Detail & Related papers (2023-02-13T19:57:17Z) - Optimization of Annealed Importance Sampling Hyperparameters [77.34726150561087]
Annealed Importance Sampling (AIS) is a popular algorithm used to estimate the intractable marginal likelihood of deep generative models.
We present a parametric AIS process with flexible intermediary distributions and optimize the bridging distributions to use fewer sampling steps.
We assess the performance of our optimized AIS for marginal likelihood estimation of deep generative models and compare it to other estimators.
arXiv Detail & Related papers (2022-09-27T07:58:25Z) - Sparse high-dimensional linear regression with a partitioned empirical
Bayes ECM algorithm [62.997667081978825]
We propose a computationally efficient and powerful Bayesian approach for sparse high-dimensional linear regression.
Minimal prior assumptions on the parameters are made via plug-in empirical Bayes estimates.
The proposed approach is implemented in the R package probe.
arXiv Detail & Related papers (2022-09-16T19:15:50Z) - Unitary Approximate Message Passing for Matrix Factorization [90.84906091118084]
We consider matrix factorization (MF) with certain constraints, which finds wide application in various areas.
We develop a Bayesian approach to MF with an efficient message passing implementation, called UAMPMF.
We show that UAMPMF significantly outperforms state-of-the-art algorithms in terms of recovery accuracy, robustness and computational complexity.
arXiv Detail & Related papers (2022-07-31T12:09:32Z) - Sufficient Statistic Memory Approximate Message Passing [5.708490209087275]
A key feature of the AMP-type algorithms is that their dynamics can be correctly described by state evolution.
This paper proposes a memory AMP (MAMP) under a sufficient statistic condition, named sufficient statistic MAMP (SS-MAMP).
arXiv Detail & Related papers (2022-06-23T13:06:00Z) - Sufficient-Statistic Memory AMP [12.579567275436343]
A key feature of the AMP-type algorithms is that their dynamics can be correctly described by state evolution.
This paper proposes a sufficient-statistic memory AMP (SS-MAMP) algorithm framework.
arXiv Detail & Related papers (2021-12-31T07:25:18Z) - Estimation in Rotationally Invariant Generalized Linear Models via
Approximate Message Passing [21.871513580418604]
We propose a novel family of approximate message passing (AMP) algorithms for signal estimation.
We rigorously characterize their performance in the high-dimensional limit via a state evolution recursion.
arXiv Detail & Related papers (2021-12-08T15:20:04Z) - Memory Approximate Message Passing [9.116196799517262]
This paper proposes a memory AMP (MAMP) in which a long-memory matched filter is used for interference suppression.
A state evolution is derived to characterize the performance of MAMP.
For all right-unitarily-invariant matrices, the optimized MAMP converges to OAMP/VAMP, and thus is Bayes-optimal if it has a unique fixed point.
arXiv Detail & Related papers (2021-06-04T03:37:14Z) - Plug-And-Play Learned Gaussian-mixture Approximate Message Passing [71.74028918819046]
We propose a plug-and-play compressed sensing (CS) recovery algorithm suitable for any i.i.d. source prior.
Our algorithm builds upon Borgerding's learned AMP (LAMP), yet significantly improves it by adopting a universal denoising function within the algorithm.
Numerical evaluation shows that the L-GM-AMP algorithm achieves state-of-the-art performance without any knowledge of the source prior.
arXiv Detail & Related papers (2020-11-18T16:40:45Z) - Approximate Message Passing with Spectral Initialization for Generalized
Linear Models [35.618694363241744]
We focus on estimators based on approximate message passing (AMP).
We propose an AMP algorithm with a spectral estimator.
We also provide numerical results that demonstrate the validity of the proposed approach.
arXiv Detail & Related papers (2020-10-07T14:52:35Z) - Modal Regression based Structured Low-rank Matrix Recovery for
Multi-view Learning [70.57193072829288]
Low-rank Multi-view Subspace Learning has shown great potential in cross-view classification in recent years.
Existing LMvSL-based methods cannot handle view discrepancy and discriminancy well simultaneously.
We propose Structured Low-rank Matrix Recovery (SLMR), a unique method of effectively removing view discrepancy and improving discriminancy.
arXiv Detail & Related papers (2020-03-22T03:57:38Z)