Sufficient-Statistic Memory AMP
- URL: http://arxiv.org/abs/2112.15327v4
- Date: Fri, 30 Jun 2023 03:26:48 GMT
- Title: Sufficient-Statistic Memory AMP
- Authors: Lei Liu, Shunqi Huang, YuZhi Yang, Zhaoyang Zhang and Brian M.
Kurkoski
- Abstract summary: A key feature of the AMP-type algorithms is that their dynamics can be correctly described by state evolution.
This paper proposes a sufficient-statistic memory AMP (SS-MAMP) algorithm framework.
- Score: 12.579567275436343
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Approximate message passing (AMP) type algorithms have been widely used in
the signal reconstruction of certain large random linear systems. A key feature
of the AMP-type algorithms is that their dynamics can be correctly described by
state evolution. While state evolution is a useful analytic tool, its
convergence is not guaranteed. To solve, in principle, the convergence problem
of the state evolution of AMP-type algorithms, this paper proposes a
sufficient-statistic memory AMP (SS-MAMP) algorithm framework under the
conditions of right-unitarily invariant sensing matrices, Lipschitz-continuous
local processors and the sufficient-statistic constraint (i.e., the current
message of each local processor is a sufficient statistic of the signal vector
given the current and all preceding messages). We show that the covariance
matrices of SS-MAMP are L-banded and convergent, making SS-MAMP an optimal
framework (from the local MMSE/LMMSE perspective) for AMP-type algorithms with
Lipschitz-continuous local processors. Given an arbitrary MAMP, we can
construct an SS-MAMP by damping, which not only ensures the convergence of the
state evolution, but also preserves the orthogonality, i.e., its dynamics can
be correctly described by state evolution. As a byproduct, we prove that the
Bayes-optimal orthogonal/vector AMP (BO-OAMP/VAMP) is an SS-MAMP. As an
example, we construct a sufficient-statistic Bayes-optimal MAMP (SS-BO-MAMP)
whose state evolution converges to the minimum (i.e., Bayes-optimal) mean
square error (MSE) predicted by replica methods when it has a unique fixed
point. In addition, the MSE of SS-BO-MAMP is not worse than that of the
original BO-MAMP. Finally, simulations are provided to support the theoretical results.
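To make the damping construction concrete, below is a minimal NumPy sketch. It assumes the standard closed-form optimal convex damping used in the MAMP literature (weights C^{-1}1 / (1^T C^{-1} 1), which minimize the damped error variance subject to the weights summing to one) and the L-banded property described in the abstract (every entry equals the diagonal entry indexed by the larger of its two indices). Function names and the toy covariance are illustrative, not from the paper.

```python
import numpy as np

def optimal_damping(C):
    """Optimal convex damping over the current and all preceding messages.

    Solves min_z z^T C z subject to 1^T z = 1, where C is the error
    covariance of the messages being combined; the closed form is
    z = C^{-1} 1 / (1^T C^{-1} 1), with damped variance 1 / (1^T C^{-1} 1).
    """
    ones = np.ones(C.shape[0])
    w = np.linalg.solve(C, ones)      # C^{-1} 1
    s = ones @ w                      # 1^T C^{-1} 1
    return w / s, 1.0 / s             # (damping weights, damped variance)

def is_l_banded(C, tol=1e-9):
    """L-banded check: C[i, j] == C[max(i, j), max(i, j)] for all i, j."""
    n = C.shape[0]
    return all(abs(C[i, j] - C[max(i, j), max(i, j)]) <= tol
               for i in range(n) for j in range(n))

# Toy L-banded covariance with a decreasing diagonal (hypothetical values).
d = [4.0, 2.5, 1.2]
C = np.array([[d[max(i, j)] for j in range(3)] for i in range(3)])
assert is_l_banded(C)
zeta, v = optimal_damping(C)          # zeta -> [0, 0, 1], v -> 1.2
```

In this toy run the weights collapse onto the newest message, illustrating the point that once the covariance is L-banded (as in SS-MAMP), further damping brings no gain.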
Related papers
- Online Variational Sequential Monte Carlo [49.97673761305336] (2023-12-19)
We build upon the variational sequential Monte Carlo (VSMC) method, which provides computationally efficient and accurate model parameter estimation and Bayesian latent-state inference.
Online VSMC is capable of performing both parameter estimation and particle proposal adaptation efficiently and entirely on the fly.
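For background on the building block this entry extends, here is a minimal bootstrap particle filter, the plain (non-variational) sequential Monte Carlo routine that VSMC wraps in a variational objective. The model functions and names below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def bootstrap_particle_filter(ys, init_sample, trans_sample, log_obs, n, rng):
    """Bootstrap SMC: propagate particles with the transition model,
    weight by the observation likelihood, resample, and accumulate an
    estimate of the marginal log-likelihood."""
    x = init_sample(n, rng)                 # particles at t = 0
    log_z = 0.0
    for y in ys:
        x = trans_sample(x, rng)            # proposal = prior transition
        logw = log_obs(y, x)
        m = logw.max()
        w = np.exp(logw - m)
        log_z += m + np.log(w.mean())       # log-likelihood increment
        x = x[rng.choice(n, size=n, p=w / w.sum())]   # multinomial resampling
    return log_z

# Toy linear-Gaussian state-space model (illustrative only).
rng = np.random.default_rng(0)
log_z = bootstrap_particle_filter(
    ys=rng.normal(size=20),
    init_sample=lambda n, r: r.normal(size=n),
    trans_sample=lambda x, r: 0.9 * x + r.normal(scale=0.5, size=x.shape),
    log_obs=lambda y, x: -0.5 * (y - x) ** 2 - 0.5 * np.log(2 * np.pi),
    n=500, rng=rng,
)
```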
- Optimal Algorithms for the Inhomogeneous Spiked Wigner Model [89.1371983413931] (2023-02-13)
We derive an approximate message-passing algorithm (AMP) for the inhomogeneous problem.
We identify, in particular, the existence of a statistical-to-computational gap where known algorithms require a signal-to-noise ratio larger than the information-theoretic threshold to perform better than random.
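For intuition, here is a sketch of the classical AMP iteration for the homogeneous rank-one spiked Wigner model with a Rademacher (±1) prior; the paper's inhomogeneous variant adapts the denoiser and Onsager correction to block-dependent noise levels. The scaling convention and names here are assumptions, not the paper's algorithm.

```python
import numpy as np

def spiked_wigner_amp(Y, lam, iters=50, seed=0):
    """AMP for the homogeneous model Y = (lam / n) x x^T + W with x_i = ±1
    and W symmetric Gaussian (variance 1/n off-diagonal). tanh(lam * .)
    is the posterior-mean denoiser suggested by state evolution; b is the
    Onsager correction that keeps the effective noise Gaussian."""
    n = Y.shape[0]
    x = np.random.default_rng(seed).normal(scale=0.1, size=n)
    m_prev = np.zeros(n)
    for _ in range(iters):
        m = np.tanh(lam * x)
        b = (lam / n) * np.sum(1.0 - m ** 2)
        x, m_prev = Y @ m - b * m_prev, m
    return np.sign(x)   # sign estimate of the spike, up to a global flip
```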
- Optimization of Annealed Importance Sampling Hyperparameters [77.34726150561087] (2022-09-27)
Annealed Importance Sampling (AIS) is a popular algorithm used to estimate the intractable marginal likelihood of deep generative models.
We present a parametric AIS process with flexible intermediary distributions and optimize the bridging distributions to reduce the number of sampling steps.
We assess the performance of our optimized AIS for marginal likelihood estimation of deep generative models and compare it to other estimators.
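A minimal sketch of standard AIS with a fixed geometric bridging path and a random-walk Metropolis kernel is given below; the paper's contribution is precisely to replace this fixed schedule with parametric, optimized bridging distributions. All function names are illustrative.

```python
import numpy as np

def ais_log_evidence(log_p0, sample_p0, log_p1, betas, n_chains, mh_steps, step, rng):
    """AIS along geometric bridges pi_k ~ p0^(1 - beta_k) * p1^(beta_k).

    Accumulates importance weights over the schedule `betas` (0 -> 1) and
    returns a log-mean-exp estimate of log(Z1 / Z0)."""
    x = sample_p0(n_chains, rng)
    logw = np.zeros(n_chains)
    for k in range(1, len(betas)):
        logw += (betas[k] - betas[k - 1]) * (log_p1(x) - log_p0(x))
        log_pi = lambda z: (1 - betas[k]) * log_p0(z) + betas[k] * log_p1(z)
        for _ in range(mh_steps):     # random-walk Metropolis under pi_k
            prop = x + step * rng.normal(size=x.shape)
            accept = np.log(rng.uniform(size=x.shape[0])) < log_pi(prop) - log_pi(x)
            x = np.where(accept, prop, x)
    m = logw.max()
    return m + np.log(np.mean(np.exp(logw - m)))
```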
- Capacity Optimality of OAMP in Coded Large Unitarily Invariant Systems [9.101719525164803] (2022-06-23)
We investigate a large unitarily invariant system (LUIS) involving a unitarily invariant sensing matrix, an arbitrary fixed signal distribution, and forward error control (FEC) coding.
We show that OAMP with optimized codes significantly outperforms both un-optimized codes and the well-known Turbo linear MMSE algorithm.
- Sufficient Statistic Memory Approximate Message Passing [5.708490209087275] (2022-06-23)
A key feature of the AMP-type algorithms is that their dynamics can be correctly described by state evolution.
This paper proposes a memory AMP (MAMP) under a sufficient statistic condition, named sufficient statistic MAMP (SS-MAMP).
- Fast Batch Nuclear-norm Maximization and Minimization for Robust Domain Adaptation [154.2195491708548] (2021-07-13)
We study prediction discriminability and diversity through the structure of the classification output matrix of a randomly selected data batch.
We propose Batch Nuclear-norm Maximization and Minimization, which applies nuclear-norm maximization and minimization to the target output matrix to enhance the target prediction ability.
Experiments show that our method could boost the adaptation accuracy and robustness under three typical domain adaptation scenarios.
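The core operation is compact enough to sketch: form the batch's softmax output matrix and use its (negated) nuclear norm as a loss on target data. A minimal PyTorch sketch under assumed names follows; the paper's "fast" variant additionally replaces the exact nuclear norm with a cheaper approximation.

```python
import torch

def batch_nuclear_norm_loss(logits: torch.Tensor) -> torch.Tensor:
    """Negated batch nuclear norm of the softmax output matrix (B x C).

    Maximizing the nuclear norm encourages confident predictions
    (discriminability) and varied predicted classes (diversity)."""
    probs = torch.softmax(logits, dim=1)
    return -torch.linalg.matrix_norm(probs, ord='nuc') / probs.shape[0]

# Usage: add this term to the task loss on unlabeled target batches.
logits = torch.randn(32, 10, requires_grad=True)
batch_nuclear_norm_loss(logits).backward()
```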
- Memory Approximate Message Passing [9.116196799517262] (2021-06-04)
This paper proposes a memory AMP (MAMP), in which a long-memory matched filter is proposed for interference suppression.
A state evolution is derived to characterize the performance of MAMP.
For all right-unitarily-invariant matrices, the optimized MAMP converges to OAMP/VAMP, and thus is Bayes-optimal if it has a unique fixed point.
- Memory Approximate Message Passing [9.116196799517262] (2020-12-20)
Approximate message passing (AMP) is a low-cost iterative parameter-estimation technique.
This paper proposes a low-complexity memory AMP (MAMP) for unitarily-invariant matrices.
- Plug-And-Play Learned Gaussian-mixture Approximate Message Passing [71.74028918819046] (2020-11-18)
We propose a plug-and-play compressed sensing (CS) recovery algorithm suitable for any i.i.d. source prior.
Our algorithm builds upon Borgerding's learned AMP (LAMP), yet significantly improves it by adopting a universal denoising function within the algorithm.
Numerical evaluation shows that the L-GM-AMP algorithm achieves state-of-the-art performance without any knowledge of the source prior.
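The plug-and-play structure, a standard AMP loop whose denoiser is a swappable module, can be sketched as follows; soft-thresholding stands in where L-GM-AMP would plug in its learned Gaussian-mixture denoiser, and all names are illustrative.

```python
import numpy as np

def amp(A, y, denoiser, iters=30):
    """AMP for y = A x + noise with a pluggable denoiser module.

    The denoiser returns (estimate, mean derivative); the mean derivative
    drives the Onsager correction that keeps the pseudo-data Gaussian."""
    m, n = A.shape
    x, z = np.zeros(n), y.copy()
    for _ in range(iters):
        tau = np.linalg.norm(z) / np.sqrt(m)    # effective noise level
        r = x + A.T @ z                         # pseudo-data: signal + noise
        x, div = denoiser(r, tau)
        z = y - A @ x + (n / m) * div * z       # Onsager-corrected residual
    return x

def soft_threshold(r, tau, alpha=1.5):
    """Placeholder denoiser; L-GM-AMP plugs a learned GM denoiser in here."""
    t = alpha * tau
    return np.sign(r) * np.maximum(np.abs(r) - t, 0.0), np.mean(np.abs(r) > t)
```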
- Amortized Conditional Normalized Maximum Likelihood: Reliable Out of Distribution Uncertainty Estimation [99.92568326314667] (2020-11-05)
We propose the amortized conditional normalized maximum likelihood (ACNML) method as a scalable general-purpose approach for uncertainty estimation.
Our algorithm builds on the conditional normalized maximum likelihood (CNML) coding scheme, which has minimax optimal properties according to the minimum description length principle.
We demonstrate that ACNML compares favorably to a number of prior techniques for uncertainty estimation in terms of calibration on out-of-distribution inputs.
- Rigorous State Evolution Analysis for Approximate Message Passing with Side Information [15.90775344965397] (2020-03-25)
A novel framework that incorporates side information into approximate message passing, termed AMP with Side Information (AMP-SI), has been introduced.
We provide rigorous performance guarantees for AMP-SI when there are statistical dependencies between the signal and SI pairs.
We show that state evolution can predict the AMP-SI mean square error accurately.
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.