Unifying AMP Algorithms for Rotationally-Invariant Models
- URL: http://arxiv.org/abs/2412.01574v1
- Date: Mon, 02 Dec 2024 14:56:35 GMT
- Title: Unifying AMP Algorithms for Rotationally-Invariant Models
- Authors: Songbin Liu, Junjie Ma
- Abstract summary: This paper presents a unified framework for constructing Approximate Message Passing (AMP) algorithms for rotationally-invariant models.
By employing a general iterative algorithm template and reducing it to long-memory Orthogonal AMP (OAMP), we systematically derive the correct Onsager terms of AMP algorithms.
- Score: 8.133739801185271
- Abstract: This paper presents a unified framework for constructing Approximate Message Passing (AMP) algorithms for rotationally-invariant models. By employing a general iterative algorithm template and reducing it to long-memory Orthogonal AMP (OAMP), we systematically derive the correct Onsager terms of AMP algorithms. This approach allows us to rederive an AMP algorithm introduced by Fan and Opper et al., while shedding new light on the role of free cumulants of the spectral law. The free cumulants arise naturally from a recursive centering operation, potentially of independent interest beyond the scope of AMP. To illustrate the flexibility of our framework, we introduce two novel AMP variants and apply them to estimation in spiked models.
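To make the role of free cumulants concrete, below is a minimal Python sketch, not the paper's recursive centering construction, that recovers the free cumulants of a spectral law from its moments via the standard non-crossing moment-cumulant recursion of free probability; the function name `free_cumulants` and the semicircle sanity check are illustrative assumptions.

```python
import numpy as np

def free_cumulants(moments):
    """Free cumulants kappa_1..kappa_N from moments m_1..m_N of a spectral law.

    Uses the textbook non-crossing moment-cumulant recursion
        m_n = sum_{s=1}^{n} kappa_s * sum_{i_1+...+i_s = n-s} m_{i_1} * ... * m_{i_s},
    with m_0 = 1.  (This is the standard free-probability identity, not the
    paper's recursive centering construction.)
    """
    m = [1.0] + [float(x) for x in moments]        # m[0] = 1
    N = len(moments)
    kappa = [0.0] * (N + 1)                        # kappa[0] is unused
    for n in range(1, N + 1):
        conv = [1.0] + [0.0] * n                   # s = 0: unit for convolution
        total = 0.0
        for s in range(1, n + 1):
            # After this line, conv[j] = sum over i_1+...+i_s = j of m_{i_1}*...*m_{i_s}.
            conv = list(np.convolve(conv, m[: n + 1])[: n + 1])
            if s < n:
                total += kappa[s] * conv[n - s]
        kappa[n] = m[n] - total                    # at s = n the coefficient of kappa_n is m_0^n = 1
    return kappa[1:]

# Sanity check: the semicircle law has moments (0, 1, 0, 2, ...) and free cumulants (0, 1, 0, 0).
# Empirical spectral moments, e.g. np.mean(np.linalg.eigvalsh(W) ** k), can be plugged in the same way.
print(free_cumulants([0.0, 1.0, 0.0, 2.0]))        # -> [0.0, 1.0, 0.0, 0.0]
```

In AMP constructions of this kind, such cumulants typically enter the coefficients of the Onsager corrections for long-memory iterations; the exact coefficients used by the paper's algorithms are not reproduced here.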
Related papers
- Amortizing intractable inference in large language models [56.92471123778389]
We use amortized Bayesian inference to sample from intractable posterior distributions.
We empirically demonstrate that this distribution-matching paradigm of LLM fine-tuning can serve as an effective alternative to maximum-likelihood training.
As an important application, we interpret chain-of-thought reasoning as a latent variable modeling problem.
arXiv Detail & Related papers (2023-10-06T16:36:08Z) - Optimal Algorithms for the Inhomogeneous Spiked Wigner Model [89.1371983413931]
We derive an approximate message-passing algorithm (AMP) for the inhomogeneous problem.
We identify in particular the existence of a statistical-to-computational gap, where known algorithms require a signal-to-noise ratio larger than the information-theoretic threshold to perform better than random.
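For orientation, here is a minimal sketch of the classical AMP iteration for the homogeneous rank-one spiked Wigner model, showing where the Onsager correction enters; it is not the inhomogeneous algorithm of the paper, and the tanh denoiser, spike strength, and dimensions are illustrative assumptions.

```python
import numpy as np

def amp_spiked_wigner(Y, f, fprime, x0, iters=30):
    """Classical AMP for the rank-one spiked Wigner model Y ~ (lam/n) x x^T + W.

    Iteration:  x^{t+1} = Y f(x^t) - b_t f(x^{t-1}),
    where the Onsager coefficient b_t = mean_i f'(x_i^t) removes the bias
    created by reusing Y across iterations.  f is an elementwise denoiser.
    """
    n = Y.shape[0]
    x_prev = np.zeros(n)
    x = x0.copy()
    for _ in range(iters):
        b = np.mean(fprime(x))                     # Onsager correction coefficient
        x_new = Y @ f(x) - b * f(x_prev)
        x_prev, x = x, x_new
    return f(x)                                    # final estimate of the spike direction

# Toy run with hypothetical parameters: Rademacher spike, tanh denoiser.
rng = np.random.default_rng(0)
n, lam = 2000, 2.0
x_star = rng.choice([-1.0, 1.0], size=n)
W = rng.normal(size=(n, n)) / np.sqrt(n)
Y = (lam / n) * np.outer(x_star, x_star) + (W + W.T) / np.sqrt(2)
f = lambda u: np.tanh(lam * u)
fprime = lambda u: lam * (1.0 - np.tanh(lam * u) ** 2)
x_hat = amp_spiked_wigner(Y, f, fprime, x0=0.1 * rng.normal(size=n))
print(abs(x_hat @ x_star) / (np.linalg.norm(x_hat) * np.linalg.norm(x_star)))  # overlap with the truth
```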
arXiv Detail & Related papers (2023-02-13T19:57:17Z) - Approximate Message Passing for Multi-Layer Estimation in Rotationally Invariant Models [15.605031496980775]
We present a new class of approximate message passing (AMP) algorithms and give a state evolution recursion.
Our results show that this reduction in complexity comes at little to no cost in the performance of the algorithm.
arXiv Detail & Related papers (2022-12-03T08:10:35Z) - Variational Laplace Autoencoders [53.08170674326728]
Variational autoencoders employ an amortized inference model to approximate the posterior of latent variables.
We present a novel approach that addresses the limited posterior expressiveness of fully-factorized Gaussian assumption.
We also present a general framework named Variational Laplace Autoencoders (VLAEs) for training deep generative models.
arXiv Detail & Related papers (2022-11-30T18:59:27Z) - GEC: A Unified Framework for Interactive Decision Making in MDP, POMDP, and Beyond [101.5329678997916]
We study sample efficient reinforcement learning (RL) under the general framework of interactive decision making.
We propose a novel complexity measure, generalized eluder coefficient (GEC), which characterizes the fundamental tradeoff between exploration and exploitation.
We show that RL problems with low GEC form a remarkably rich class, which subsumes low Bellman eluder dimension problems, bilinear class, low witness rank problems, PO-bilinear class, and generalized regular PSR.
arXiv Detail & Related papers (2022-11-03T16:42:40Z) - A Non-Asymptotic Framework for Approximate Message Passing in Spiked Models [24.786030482013437]
Approximate message passing (AMP) emerges as an effective iterative paradigm for solving high-dimensional statistical problems.
Prior AMP theory fell short of predicting the AMP dynamics when the number of iterations surpasses $o\big(\frac{\log n}{\log\log n}\big)$.
This paper develops a non-asymptotic framework for understanding AMP in spiked matrix estimation.
arXiv Detail & Related papers (2022-08-05T17:59:06Z) - Estimation in Rotationally Invariant Generalized Linear Models via Approximate Message Passing [21.871513580418604]
We propose a novel family of approximate message passing (AMP) algorithms for signal estimation.
We rigorously characterize their performance in the high-dimensional limit via a state evolution recursion.
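To illustrate what a state evolution recursion tracks in its simplest form, here is a hedged sketch of the classical recursion for AMP with soft thresholding under an i.i.d. Gaussian design, with the expectation estimated by Monte Carlo; it is not the rotationally-invariant GLM recursion of the paper, and the Bernoulli-Gaussian prior and parameter values are assumptions.

```python
import numpy as np

def soft_threshold(u, t):
    """Soft-thresholding denoiser eta(u; t) = sign(u) * max(|u| - t, 0)."""
    return np.sign(u) * np.maximum(np.abs(u) - t, 0.0)

def state_evolution(sigma2, delta, eps, alpha, iters=25, mc=200_000, seed=0):
    """Classical AMP state evolution for a linear model with i.i.d. Gaussian design.

    Tracks the effective noise level tau_t^2 of the AMP iterate via
        tau_{t+1}^2 = sigma^2 + (1/delta) * E[(eta(X + tau_t Z; alpha*tau_t) - X)^2],
    where X follows a Bernoulli(eps)-Gaussian prior, Z ~ N(0, 1), and delta = n/p
    is the sampling ratio.  The expectation is estimated by Monte Carlo.
    """
    rng = np.random.default_rng(seed)
    X = rng.normal(size=mc) * (rng.random(mc) < eps)   # samples from the signal prior
    Z = rng.normal(size=mc)
    tau2 = sigma2 + np.mean(X ** 2) / delta            # state at iteration 0
    trace = [tau2]
    for _ in range(iters):
        tau = np.sqrt(tau2)
        mse = np.mean((soft_threshold(X + tau * Z, alpha * tau) - X) ** 2)
        tau2 = sigma2 + mse / delta                    # one step of the recursion
        trace.append(tau2)
    return trace

# Example with hypothetical parameters: 10% sparsity, delta = 0.5, noise variance 0.01.
print(state_evolution(sigma2=0.01, delta=0.5, eps=0.1, alpha=1.5)[-1])
```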
arXiv Detail & Related papers (2021-12-08T15:20:04Z) - Sampling in Combinatorial Spaces with SurVAE Flow Augmented MCMC [83.48593305367523]
Hybrid Monte Carlo is a powerful Markov Chain Monte Carlo method for sampling from complex continuous distributions.
We introduce a new approach based on augmenting Monte Carlo methods with SurVAE Flows to sample from discrete distributions.
We demonstrate the efficacy of our algorithm on a range of examples from statistics, computational physics and machine learning, and observe improvements compared to alternative algorithms.
arXiv Detail & Related papers (2021-02-04T02:21:08Z) - Plug-And-Play Learned Gaussian-mixture Approximate Message Passing [71.74028918819046]
We propose a plug-and-play compressed sensing (CS) recovery algorithm suitable for any i.i.d. source prior.
Our algorithm builds upon Borgerding's learned AMP (LAMP), yet significantly improves it by adopting a universal denoising function within the algorithm.
Numerical evaluation shows that the L-GM-AMP algorithm achieves state-of-the-art performance without any knowledge of the source prior.
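A minimal sketch of the plug-and-play idea, assuming a generic AMP loop for a linear model in which the denoiser is passed in as a function; this is neither Borgerding's LAMP nor the paper's learned Gaussian-mixture denoiser, and the soft-thresholding denoiser and toy dimensions are illustrative.

```python
import numpy as np

def amp_linear(y, A, denoise, iters=30):
    """Generic AMP for y = A x + w with a pluggable elementwise denoiser.

    `denoise(r, tau)` must return (x_hat, d_avg): the denoised estimate of x from the
    pseudo-data r (approximately x plus Gaussian noise of std tau) and the average
    derivative of the denoiser with respect to r, which drives the Onsager term.
    """
    n, p = A.shape
    x = np.zeros(p)
    z = y.copy()
    for _ in range(iters):
        tau = np.linalg.norm(z) / np.sqrt(n)           # empirical effective noise level
        x, d_avg = denoise(x + A.T @ z, tau)           # denoising step
        z = y - A @ x + (p / n) * d_avg * z            # residual with Onsager correction
    return x

def soft_denoiser(r, tau, alpha=1.5):
    """Soft-thresholding denoiser; any denoiser reporting its average derivative can be swapped in."""
    x_hat = np.sign(r) * np.maximum(np.abs(r) - alpha * tau, 0.0)
    return x_hat, np.mean(np.abs(r) > alpha * tau)     # derivative is 1 on the support, 0 elsewhere

# Toy run with hypothetical dimensions: sparse signal, i.i.d. Gaussian A with roughly unit-norm columns.
rng = np.random.default_rng(1)
n, p = 500, 1000
A = rng.normal(size=(n, p)) / np.sqrt(n)
x_star = rng.normal(size=p) * (rng.random(p) < 0.1)
y = A @ x_star + 0.01 * rng.normal(size=n)
x_hat = amp_linear(y, A, soft_denoiser)
print(np.linalg.norm(x_hat - x_star) / np.linalg.norm(x_star))   # relative reconstruction error
```

Swapping `soft_denoiser` for a learned denoiser that also reports its average derivative is the only change the loop requires.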
arXiv Detail & Related papers (2020-11-18T16:40:45Z) - Approximate Message Passing with Spectral Initialization for Generalized Linear Models [35.618694363241744]
We focus on estimators based on approximate message passing (AMP).
We propose an AMP algorithm with a spectral estimator.
We also provide numerical results that demonstrate the validity of the proposed approach.
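As a sketch of the spectral-initialization idea, the snippet below builds the standard spectral estimator for a generalized linear model, assuming a phase-retrieval-style output channel and a simple trimming preprocessing function; the AMP refinement stage and the paper's specific preprocessing and tuning are not reproduced.

```python
import numpy as np

def spectral_init(y, A, preprocess=lambda t: t):
    """Spectral estimator for a generalized linear model y_i ~ q(<a_i, x>).

    Returns the principal eigenvector of D = (1/n) * sum_i T(y_i) a_i a_i^T,
    where T is a preprocessing function chosen for the output channel q.
    """
    n, _ = A.shape
    D = (A.T * preprocess(y)) @ A / n          # p x p weighted second-moment matrix
    eigvals, eigvecs = np.linalg.eigh(D)       # eigenvalues in ascending order
    return eigvecs[:, -1]                      # eigenvector of the largest eigenvalue

# Toy phase-retrieval-style example with hypothetical parameters: y_i = <a_i, x>^2 + noise.
rng = np.random.default_rng(2)
n, p = 4000, 200
A = rng.normal(size=(n, p))                    # rows a_i ~ N(0, I_p)
x_star = rng.normal(size=p)
x_star /= np.linalg.norm(x_star)
y = (A @ x_star) ** 2 + 0.05 * rng.normal(size=n)
x0 = spectral_init(y, A, preprocess=lambda t: np.minimum(t, 3.0))   # trim heavy tails
print(abs(x0 @ x_star))                        # overlap with the truth (the sign is unidentifiable)
```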
arXiv Detail & Related papers (2020-10-07T14:52:35Z) - Rigorous State Evolution Analysis for Approximate Message Passing with Side Information [15.90775344965397]
A novel framework, Approximate Message Passing with Side Information (AMP-SI), has been introduced to incorporate side information (SI) into AMP.
We provide rigorous performance guarantees for AMP-SI when there are statistical dependencies between the signal and SI pairs.
We show that the AMP-SI state evolution can accurately predict the AMP-SI mean square error.
arXiv Detail & Related papers (2020-03-25T16:11:18Z)