Convergence Rates of Empirical Bayes Posterior Distributions: A
Variational Perspective
- URL: http://arxiv.org/abs/2009.03969v1
- Date: Tue, 8 Sep 2020 19:35:27 GMT
- Title: Convergence Rates of Empirical Bayes Posterior Distributions: A
Variational Perspective
- Authors: Fengshuo Zhang and Chao Gao
- Abstract summary: We study the convergence rates of empirical Bayes posterior distributions for nonparametric and high-dimensional inference.
We show that the empirical Bayes posterior distribution induced by the maximum marginal likelihood estimator can be regarded as a variational approximation to a hierarchical Bayes posterior distribution.
- Score: 20.51199643121034
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We study the convergence rates of empirical Bayes posterior distributions for
nonparametric and high-dimensional inference. We show that as long as the
hyperparameter set is discrete, the empirical Bayes posterior distribution
induced by the maximum marginal likelihood estimator can be regarded as a
variational approximation to a hierarchical Bayes posterior distribution. This
connection between empirical Bayes and variational Bayes allows us to leverage
recent results in the variational Bayes literature and to directly obtain the
convergence rates of empirical Bayes posterior distributions from a
variational perspective. For a more general hyperparameter set that is not
necessarily discrete, we introduce a new technique called "prior decomposition"
to deal with prior distributions that can be written as convex combinations of
probability measures whose supports are low-dimensional subspaces. This leads
to generalized versions of the classical "prior mass and testing" conditions
for the convergence rates of empirical Bayes. Our theory is applied to a number
of statistical estimation problems including nonparametric density estimation
and sparse linear regression.
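As a minimal sketch of the identity behind this connection (our notation: a
hyperprior $\gamma$ on a discrete set $\Lambda$, marginal likelihood
$m_\lambda$, hierarchical posterior $\Pi_{\mathrm{HB}}$; a standard
computation, not a quotation from the paper), consider the hierarchical model
$\lambda \sim \gamma$, $\theta \mid \lambda \sim \Pi_\lambda$,
$X \mid \theta \sim p_\theta$, and write
$m_\lambda(X) = \int p_\theta(X)\, d\Pi_\lambda(\theta)$ and
$m(X) = \sum_{\lambda \in \Lambda} \gamma(\lambda)\, m_\lambda(X)$. For
variational candidates of the form
$Q_\lambda = \Pi_\lambda(\cdot \mid X) \otimes \delta_\lambda$,
\[
\mathrm{KL}\bigl(Q_\lambda \,\big\|\, \Pi_{\mathrm{HB}}(\cdot \mid X)\bigr)
  = \log m(X) - \log \gamma(\lambda) - \log m_\lambda(X),
\]
so minimizing the Kullback-Leibler divergence over $\lambda$ is the same as
maximizing $\gamma(\lambda)\, m_\lambda(X)$; with a uniform hyperprior on a
discrete $\Lambda$, the minimizer is the maximum marginal likelihood estimator
$\hat\lambda$, and $Q_{\hat\lambda}$ is the empirical Bayes posterior.
As an illustration of prior decomposition (our example, in the spirit of the
sparse linear regression application), a sparsity-inducing prior on
$\theta \in \mathbb{R}^p$ can be written as a convex combination over support
sets,
\[
\Pi = \sum_{S \subseteq [p]} w(S)\, \Pi_S, \qquad
\Pi_S\bigl(\{\theta : \theta_j = 0 \text{ for all } j \notin S\}\bigr) = 1,
\]
with each component $\Pi_S$ supported on the $|S|$-dimensional coordinate
subspace indexed by $S$.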
Related papers
- Predictive variational inference: Learn the predictively optimal posterior distribution [1.7648680700685022]
Vanilla variational inference finds an optimal approximation to the Bayesian posterior distribution, but even the exact Bayesian posterior is often not meaningful under model misspecification.
We propose predictive variational inference (PVI): a general inference framework that seeks and samples from an optimal posterior density.
This framework applies to both likelihood-exact and likelihood-free models.
arXiv Detail & Related papers (2024-10-18T19:44:57Z)
- Quasi-Bayes meets Vines [2.3124143670964448]
We propose a different way to extend Quasi-Bayesian prediction to high dimensions through the use of Sklar's theorem.
We show that our proposed Quasi-Bayesian Vine (QB-Vine) is a fully non-parametric density estimator with an analytical form; Sklar's theorem is stated in a sketch after this list.
arXiv Detail & Related papers (2024-06-18T16:31:02Z)
- A Mean Field Approach to Empirical Bayes Estimation in High-dimensional Linear Regression [8.345523969593492]
We study empirical Bayes estimation in high-dimensional linear regression.
We adopt a variational empirical Bayes approach, introduced originally in Carbonetto and Stephens (2012) and Kim et al. (2022); the generic mean-field variational family is recalled in a sketch after this list.
This provides the first rigorous empirical Bayes method in a high-dimensional regression setting without sparsity.
arXiv Detail & Related papers (2023-09-28T20:51:40Z)
- Variational Prediction [95.00085314353436]
We present a technique for learning a variational approximation to the posterior predictive distribution using a variational bound.
This approach can provide good predictive distributions without test-time marginalization costs (the marginalization in question is written out in a sketch after this list).
arXiv Detail & Related papers (2023-07-14T18:19:31Z)
- Bayesian Renormalization [68.8204255655161]
We present a fully information-theoretic approach to renormalization inspired by Bayesian statistical inference.
The main insight of Bayesian Renormalization is that the Fisher metric defines a correlation length that plays the role of an emergent RG scale.
We provide insight into how the Bayesian Renormalization scheme relates to existing methods for data compression and data generation.
arXiv Detail & Related papers (2023-05-17T18:00:28Z)
- Posterior concentration and fast convergence rates for generalized Bayesian learning [4.186575888568896]
We study the learning rate of generalized Bayes estimators in a general setting.
We prove that under the multi-scale Bernstein's condition, the generalized posterior distribution (recalled in generic form in a sketch after this list) concentrates around the set of optimal hypotheses.
arXiv Detail & Related papers (2021-11-19T14:25:21Z)
- Variational Refinement for Importance Sampling Using the Forward Kullback-Leibler Divergence [77.06203118175335]
Variational Inference (VI) is a popular alternative to exact sampling in Bayesian inference.
Importance sampling (IS) is often used to fine-tune and de-bias the estimates of approximate Bayesian inference procedures; the basic IS identity is recalled in a sketch after this list.
We propose a novel combination of optimization and sampling techniques for approximate Bayesian inference.
arXiv Detail & Related papers (2021-06-30T11:00:24Z)
- Neural Empirical Bayes: Source Distribution Estimation and its Applications to Simulation-Based Inference [9.877509217895263]
We show that a neural empirical Bayes approach recovers ground truth source distributions.
We also show the applicability of Neural Empirical Bayes to an inverse problem from collider physics.
arXiv Detail & Related papers (2020-11-11T14:59:34Z)
- Bayesian Deep Learning and a Probabilistic Perspective of Generalization [56.69671152009899]
We show that deep ensembles provide an effective mechanism for approximate Bayesian marginalization.
We also propose a related approach that further improves the predictive distribution by marginalizing within basins of attraction.
arXiv Detail & Related papers (2020-02-20T15:13:27Z)
- Decision-Making with Auto-Encoding Variational Bayes [71.44735417472043]
We show that a posterior approximation distinct from the variational distribution should be used for making decisions.
Motivated by these theoretical results, we propose learning several approximate proposals for the best model.
In addition to toy examples, we present a full-fledged case study of single-cell RNA sequencing.
arXiv Detail & Related papers (2020-02-17T19:23:36Z)
- Distributionally Robust Bayesian Quadrature Optimization [60.383252534861136]
We study BQO under distributional uncertainty in which the underlying probability distribution is unknown except for a limited set of its i.i.d. samples.
A standard BQO approach maximizes the Monte Carlo estimate of the true expected objective given the fixed sample set (written out in a sketch after this list).
We propose a novel posterior sampling-based algorithm, namely distributionally robust BQO (DRBQO), for this purpose.
arXiv Detail & Related papers (2020-01-19T12:00:33Z)
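Sketches referenced in the list above
- Sklar's theorem (for the QB-Vine entry): as we recall it (a textbook
statement, not the paper's formulation), any joint distribution function $H$
on $\mathbb{R}^d$ with marginals $F_1, \dots, F_d$ can be written as
\[
H(x_1, \dots, x_d) = C\bigl(F_1(x_1), \dots, F_d(x_d)\bigr)
\]
for some copula $C$, unique when the marginals are continuous. This is what
lets a high-dimensional density be assembled from one-dimensional marginal
estimates plus a dependence model.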
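- Mean-field variational family (for the mean field entry): the generic naive
mean-field device, assumed here rather than quoted from the paper,
approximates a posterior over $\beta = (\beta_1, \dots, \beta_p)$ by the best
product measure,
\[
\widehat{Q} = \operatorname*{arg\,min}_{Q = \prod_{j=1}^{p} Q_j}
  \mathrm{KL}\bigl(Q \,\big\|\, \Pi(\cdot \mid \mathrm{data})\bigr),
\]
turning posterior computation into a coordinate-wise optimization problem.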
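- Posterior predictive marginalization (for the Variational Prediction
entry): in standard Bayesian notation, prediction requires the integral
\[
p(y^\ast \mid x^\ast, \mathcal{D})
  = \int p(y^\ast \mid x^\ast, \theta)\, p(\theta \mid \mathcal{D})\, d\theta,
\]
ordinarily approximated by sampling over $\theta$ at test time; learning a
direct variational approximation to the left-hand side removes that cost.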
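- Generalized (Gibbs) posterior (for the generalized Bayesian learning
entry): the standard template, not the paper's exact setup, is built from a
loss $\ell$, a prior $\pi$, and a learning rate $\eta > 0$:
\[
\pi_n(\theta) \propto
  \exp\Bigl(-\eta \sum_{i=1}^{n} \ell(\theta, X_i)\Bigr)\, \pi(\theta),
\]
recovering the ordinary posterior when $\ell$ is the negative log-likelihood
and $\eta = 1$.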
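- Importance sampling identity (for the variational refinement entry): for a
target $p$ known up to normalization as $\tilde p$, a proposal $q$, and any
integrable $f$ (standard, not paper-specific),
\[
\mathbb{E}_{p}[f(\theta)]
  = \mathbb{E}_{q}\Bigl[f(\theta)\,\frac{p(\theta)}{q(\theta)}\Bigr]
  \approx \frac{\sum_{s=1}^{S} w_s f(\theta_s)}{\sum_{s=1}^{S} w_s},
\qquad \theta_s \sim q, \quad w_s = \frac{\tilde p(\theta_s)}{q(\theta_s)}.
\]
Fitting $q$ with the forward KL divergence $\mathrm{KL}(p \,\|\, q)$ is
mass-covering, which discourages $q$ from missing regions where $p$ has mass
and thus stabilizes the weights.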
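- Sample-average objective (for the DRBQO entry): in our generic notation,
given i.i.d. samples $w_1, \dots, w_n$ of the uncertain variable, the
standard approach solves
\[
\max_{x} \; \frac{1}{n} \sum_{i=1}^{n} f(x, w_i)
  \;\approx\; \max_{x} \; \mathbb{E}_{w}\bigl[f(x, w)\bigr],
\]
whereas a distributionally robust variant typically optimizes the worst case
over a set of distributions consistent with the samples.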
This list is automatically generated from the titles and abstracts of the papers on this site.