Sufficient Statistic Memory Approximate Message Passing
- URL: http://arxiv.org/abs/2206.11674v1
- Date: Thu, 23 Jun 2022 13:06:00 GMT
- Title: Sufficient Statistic Memory Approximate Message Passing
- Authors: Lei Liu, Shunqi Huang, and Brian M. Kurkoski
- Abstract summary: A key feature of the AMP-type algorithms is that their dynamics can be correctly described by state evolution.
This paper proposes a memory AMP (MAMP) under a sufficient statistic condition, named sufficient statistic MAMP (SS-MAMP).
- Score: 5.708490209087275
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Approximate message passing (AMP) type algorithms have been widely used in
the signal reconstruction of certain large random linear systems. A key feature
of the AMP-type algorithms is that their dynamics can be correctly described by
state evolution. However, state evolution does not necessarily guarantee the
convergence of iterative algorithms. To solve the convergence problem of
AMP-type algorithms in principle, this paper proposes a memory AMP (MAMP) under
a sufficient statistic condition, named sufficient statistic MAMP (SS-MAMP). We
show that the covariance matrices of SS-MAMP are L-banded and convergent. Given
an arbitrary MAMP, we can construct the SS-MAMP by damping, which not only
ensures convergence but also preserves orthogonality, i.e., its
dynamics can still be correctly described by state evolution.
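To make the damping idea concrete, here is a small numerical sketch. It is an illustrative assumption, not the paper's exact SS-MAMP construction: it assumes the damping vector is the minimum-variance combination zeta = V^{-1}1 / (1^T V^{-1} 1) of the past memory estimates, computed from an (estimated) error covariance matrix V; the function names (mvu_damping, damp) and the toy data are hypothetical.
```python
import numpy as np

def mvu_damping(V, eps=1e-12):
    """Minimum-variance damping weights for combining past estimates.

    Given an (estimated) error covariance matrix V of the memory
    estimates, return weights zeta with sum(zeta) = 1 that minimize
    zeta^T V zeta, i.e. zeta = V^{-1} 1 / (1^T V^{-1} 1). This is a
    generic combining rule in the spirit of the paper's damping, not
    its exact construction.
    """
    t = V.shape[0]
    ones = np.ones(t)
    # Small regularization for numerical stability (illustrative choice).
    w = np.linalg.solve(V + eps * np.eye(t), ones)
    return w / (ones @ w)

def damp(estimates, V):
    """Combine past estimates r_1..r_t into a single damped estimate."""
    zeta = mvu_damping(V)
    R = np.stack(estimates, axis=0)   # shape (t, n)
    return zeta @ R                   # weighted combination

# Toy usage: three noisy estimates of the same signal.
rng = np.random.default_rng(0)
x = rng.standard_normal(1000)
noise = rng.standard_normal((3, 1000)) * np.array([[1.0], [0.7], [0.5]])
estimates = [x + noise[i] for i in range(3)]
V = noise @ noise.T / 1000            # empirical error covariance
x_damped = damp(estimates, V)
print("per-estimate MSE:", [float(np.mean((e - x) ** 2)) for e in estimates])
print("damped MSE:      ", float(np.mean((x_damped - x) ** 2)))
```
On this toy example the damped combination attains a lower empirical MSE than any individual estimate, which is the variance-reducing effect that the sufficient-statistic damping is designed to guarantee in general.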
Related papers
- Non-stationary Reinforcement Learning under General Function
Approximation [60.430936031067006]
We first propose a new complexity metric called dynamic Bellman Eluder (DBE) dimension for non-stationary MDPs.
Based on the proposed complexity metric, we propose a novel confidence-set based model-free algorithm called SW-OPEA.
We show that SW-OPEA is provably efficient as long as the variation budget is not too large.
arXiv Detail & Related papers (2023-06-01T16:19:37Z) - Optimal Algorithms for the Inhomogeneous Spiked Wigner Model [89.1371983413931]
We derive an approximate message-passing algorithm (AMP) for the inhomogeneous problem.
In particular, we identify a statistical-to-computational gap: known algorithms require a signal-to-noise ratio larger than the information-theoretic threshold to perform better than random.
arXiv Detail & Related papers (2023-02-13T19:57:17Z) - Unitary Approximate Message Passing for Matrix Factorization [90.84906091118084]
We consider matrix factorization (MF) with certain constraints, which finds wide applications in various areas.
We develop a Bayesian approach to MF with an efficient message passing implementation, called UAMPMF.
We show that UAMPMF significantly outperforms state-of-the-art algorithms in terms of recovery accuracy, robustness and computational complexity.
arXiv Detail & Related papers (2022-07-31T12:09:32Z) - Sufficient-Statistic Memory AMP [12.579567275436343]
A key feature of the AMP-type algorithms is that their dynamics can be correctly described by state evolution.
This paper proposes a sufficient-statistic memory AMP (SS-MAMP) algorithm framework.
arXiv Detail & Related papers (2021-12-31T07:25:18Z) - Estimation in Rotationally Invariant Generalized Linear Models via
Approximate Message Passing [21.871513580418604]
We propose a novel family of approximate message passing (AMP) algorithms for signal estimation.
We rigorously characterize their performance in the high-dimensional limit via a state evolution recursion.
arXiv Detail & Related papers (2021-12-08T15:20:04Z) - Memory Approximate Message Passing [9.116196799517262]
This paper proposes a memory AMP (MAMP), in which a long-memory matched filter is used for interference suppression.
A state evolution is derived to characterize the performance of MAMP (a generic state-evolution sketch is given after this list).
For all right-unitarily-invariant matrices, the optimized MAMP converges to OAMP/VAMP, and thus is Bayes-optimal if it has a unique fixed point.
arXiv Detail & Related papers (2021-06-04T03:37:14Z) - Sampling in Combinatorial Spaces with SurVAE Flow Augmented MCMC [83.48593305367523]
Hybrid Monte Carlo is a powerful Markov Chain Monte Carlo method for sampling from complex continuous distributions.
We introduce a new approach based on augmenting Monte Carlo methods with SurVAE Flows to sample from discrete distributions.
We demonstrate the efficacy of our algorithm on a range of examples from statistics, computational physics and machine learning, and observe improvements compared to alternative algorithms.
arXiv Detail & Related papers (2021-02-04T02:21:08Z) - Memory Approximate Message Passing [9.116196799517262]
Approximate message passing (AMP) is a low-cost iterative parameter-estimation technique.
This paper proposes a low-complexity memory AMP (MAMP) for unitarily-invariant matrices.
arXiv Detail & Related papers (2020-12-20T07:42:15Z) - Plug-And-Play Learned Gaussian-mixture Approximate Message Passing [71.74028918819046]
We propose a plug-and-play compressed sensing (CS) recovery algorithm suitable for any i.i.d. source prior.
Our algorithm builds upon Borgerding's learned AMP (LAMP), yet significantly improves it by adopting a universal denoising function within the algorithm.
Numerical evaluation shows that the L-GM-AMP algorithm achieves state-of-the-art performance without any knowledge of the source prior.
arXiv Detail & Related papers (2020-11-18T16:40:45Z) - Amortized Conditional Normalized Maximum Likelihood: Reliable Out of
Distribution Uncertainty Estimation [99.92568326314667]
We propose the amortized conditional normalized maximum likelihood (ACNML) method as a scalable general-purpose approach for uncertainty estimation.
Our algorithm builds on the conditional normalized maximum likelihood (CNML) coding scheme, which has minimax optimal properties according to the minimum description length principle.
We demonstrate that ACNML compares favorably to a number of prior techniques for uncertainty estimation in terms of calibration on out-of-distribution inputs.
arXiv Detail & Related papers (2020-11-05T08:04:34Z) - Rigorous State Evolution Analysis for Approximate Message Passing with
Side Information [15.90775344965397]
A novel framework that incorporates side information (SI) into approximate message passing, referred to as AMP with Side Information (AMP-SI), has been introduced.
We provide rigorous performance guarantees for AMP-SI when there are statistical dependencies between the signal and SI pairs.
We show that the state evolution can predict the AMP-SI mean square error accurately.
arXiv Detail & Related papers (2020-03-25T16:11:18Z)
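Several entries above (the memory AMP and rotationally invariant GLM papers) rely on a state-evolution recursion to track the per-iteration MSE. The snippet below is a generic, illustrative Monte Carlo implementation of the textbook state evolution for plain (memoryless) AMP with an i.i.d. Gaussian sensing matrix and a soft-threshold denoiser; it is not the memory-AMP state evolution of any paper listed here, and all names and parameter values (state_evolution, soft_threshold, delta, sigma2, alpha) are assumptions for illustration only.
```python
import numpy as np

def state_evolution(delta, sigma2, prior_sample, denoiser,
                    n_iter=20, n_mc=100_000, seed=0):
    """Monte Carlo state-evolution recursion for plain (memoryless) AMP.

    tau_{t+1}^2 = sigma^2 + (1/delta) * E[(eta(X + tau_t Z) - X)^2],
    with X ~ prior and Z ~ N(0, 1). This is the standard recursion for
    i.i.d. Gaussian sensing matrices; memory AMP generalizes the scalar
    tau_t^2 to a covariance over all past iterates (see the papers above).
    """
    rng = np.random.default_rng(seed)
    x = prior_sample(rng, n_mc)
    tau2 = sigma2 + np.mean(x**2) / delta   # standard initialization
    history = [tau2]
    for _ in range(n_iter):
        z = rng.standard_normal(n_mc)
        mse = np.mean((denoiser(x + np.sqrt(tau2) * z, tau2) - x) ** 2)
        tau2 = sigma2 + mse / delta
        history.append(tau2)
    return history

# Example: Bernoulli-Gaussian prior with a soft-threshold denoiser.
sparsity = 0.1

def prior_sample(rng, n):
    return rng.standard_normal(n) * (rng.random(n) < sparsity)

def soft_threshold(r, tau2, alpha=1.5):
    thr = alpha * np.sqrt(tau2)
    return np.sign(r) * np.maximum(np.abs(r) - thr, 0.0)

taus = state_evolution(delta=0.5, sigma2=0.01,
                       prior_sample=prior_sample, denoiser=soft_threshold)
print([round(t, 4) for t in taus])
```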
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences.