Posterior concentration and fast convergence rates for generalized Bayesian learning
- URL: http://arxiv.org/abs/2111.10243v1
- Date: Fri, 19 Nov 2021 14:25:21 GMT
- Title: Posterior concentration and fast convergence rates for generalized Bayesian learning
- Authors: Lam Si Tung Ho, Binh T. Nguyen, Vu Dinh, Duy Nguyen
- Abstract summary: We study the learning rate of generalized Bayes estimators in a general setting.
We prove that under the multi-scale Bernstein's condition, the generalized posterior distribution concentrates around the set of optimal hypotheses.
- Score: 4.186575888568896
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we study the learning rate of generalized Bayes estimators in
a general setting where the hypothesis class can be uncountable and have an
irregular shape, the loss function can have heavy tails, and the optimal
hypothesis may not be unique. We prove that under the multi-scale Bernstein's
condition, the generalized posterior distribution concentrates around the set
of optimal hypotheses and the generalized Bayes estimator can achieve a fast
learning rate. Our results are applied to show that standard Bayesian
linear regression is robust to heavy-tailed distributions.
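For orientation, the generalized (Gibbs) posterior studied in this line of work is standardly obtained by tempering the empirical loss with a learning rate; in common notation (ours, not quoted from the paper):

```latex
% Generalized (Gibbs) posterior over a hypothesis class \mathcal{H}:
% prior \pi, loss \ell, data X_1, \dots, X_n, learning rate \eta > 0.
\pi_n(\mathrm{d}h) \;\propto\; \exp\!\Big( -\eta \sum_{i=1}^{n} \ell(h, X_i) \Big)\, \pi(\mathrm{d}h)
```

With \ell the negative log-likelihood and \eta = 1 this reduces to the ordinary Bayesian posterior, which is how results of this type specialize to the Bayesian linear regression setting mentioned above.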
Related papers
- Predictive variational inference: Learn the predictively optimal posterior distribution [1.7648680700685022]
Vanilla variational inference finds an optimal approximation to the Bayesian posterior distribution, but even the exact Bayesian posterior is often not meaningful under model misspecification.
We propose predictive variational inference (PVI): a general inference framework that seeks and samples from an optimal posterior density.
This framework applies to both likelihood-exact and likelihood-free models.
arXiv Detail & Related papers (2024-10-18T19:44:57Z)
- Generalizing to any diverse distribution: uniformity, gentle finetuning and rebalancing [55.791818510796645]
We aim to develop models that generalize well to any diverse test distribution, even if the latter deviates significantly from the training data.
Various approaches like domain adaptation, domain generalization, and robust optimization attempt to address the out-of-distribution challenge.
We adopt a more conservative perspective by accounting for the worst-case error across all sufficiently diverse test distributions within a known domain.
arXiv Detail & Related papers (2024-10-08T12:26:48Z)
- Generalized Laplace Approximation [23.185126261153236]
We introduce a unified theoretical framework to attribute Bayesian inconsistency to model misspecification and inadequate priors.
We propose the generalized Laplace approximation, which involves a simple adjustment to the Hessian matrix of the regularized loss function.
We assess the performance and properties of the generalized Laplace approximation on state-of-the-art neural networks and real-world datasets.
arXiv Detail & Related papers (2024-05-22T11:11:42Z)
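A minimal sketch of the mechanics behind the Laplace item above: fit the MAP, take the Hessian of the regularized loss, and read off a Gaussian. The scalar factor `beta` below is a hypothetical stand-in for the paper's Hessian adjustment, which is not reproduced here.

```python
# Hedged sketch of a Laplace approximation; `beta` is our illustrative
# stand-in for the paper's Hessian adjustment, not its actual formula.
import numpy as np
from scipy.optimize import minimize

def regularized_loss(theta, X, y, lam=1.0):
    # Ridge-regularized squared loss for a toy linear model.
    resid = X @ theta - y
    return 0.5 * resid @ resid + 0.5 * lam * theta @ theta

def laplace_posterior(X, y, lam=1.0, beta=1.0):
    d = X.shape[1]
    # 1) MAP estimate: minimize the regularized loss.
    theta_map = minimize(regularized_loss, np.zeros(d), args=(X, y, lam)).x
    # 2) Hessian of the regularized loss (exact for this quadratic loss).
    H = X.T @ X + lam * np.eye(d)
    # 3) Gaussian approximation N(theta_map, (beta * H)^{-1}).
    return theta_map, np.linalg.inv(beta * H)
```

Here beta > 1 merely sharpens the Gaussian around the MAP; per the summary above, the paper's actual adjustment is motivated by model misspecification and inadequate priors.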
- Bayesian Renormalization [68.8204255655161]
We present a fully information theoretic approach to renormalization inspired by Bayesian statistical inference.
The main insight of Bayesian Renormalization is that the Fisher metric defines a correlation length that plays the role of an emergent RG scale.
We provide insight into how the Bayesian Renormalization scheme relates to existing methods for data compression and data generation.
arXiv Detail & Related papers (2023-05-17T18:00:28Z)
- Variational Refinement for Importance Sampling Using the Forward Kullback-Leibler Divergence [77.06203118175335]
Variational Inference (VI) is a popular alternative to exact sampling in Bayesian inference.
Importance sampling (IS) is often used to fine-tune and de-bias the estimates of approximate Bayesian inference procedures.
We propose a novel combination of optimization and sampling techniques for approximate Bayesian inference.
arXiv Detail & Related papers (2021-06-30T11:00:24Z)
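To make the importance-sampling step above concrete, here is a self-normalized IS sketch that de-biases an expectation under a Gaussian variational proposal; the one-dimensional target and all names are illustrative assumptions, not the paper's algorithm:

```python
# Hedged sketch: self-normalized importance sampling with a Gaussian
# variational proposal q to de-bias posterior expectations.
import numpy as np

def snis_expectation(log_target, q_mean, q_std, f, n=10_000, seed=0):
    rng = np.random.default_rng(seed)
    theta = rng.normal(q_mean, q_std, size=n)        # draws from proposal q
    log_q = -0.5 * ((theta - q_mean) / q_std) ** 2 - np.log(q_std * np.sqrt(2 * np.pi))
    log_w = log_target(theta) - log_q                # unnormalized log weights
    w = np.exp(log_w - log_w.max())                  # stabilized weights
    w /= w.sum()                                     # self-normalization
    return np.sum(w * f(theta))

# Example: de-bias the posterior mean under a heavy-tailed target that
# the Gaussian proposal only roughly matches.
log_target = lambda t: -np.log1p(t ** 2)             # Cauchy-like log density
post_mean = snis_expectation(log_target, q_mean=0.5, q_std=1.0, f=lambda t: t)
```

Fitting the proposal with the forward (inclusive) KL, as in the paper's title, tends to produce mass-covering proposals, which keeps importance weights like these well behaved.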
- Robust Generalised Bayesian Inference for Intractable Likelihoods [9.77823546576708]
We consider generalised Bayesian inference with a Stein discrepancy as a loss function.
This is motivated by applications in which the likelihood contains an intractable normalisation constant.
We show consistency, normality and bias-robustness of the posterior, highlighting how these properties are impacted by the choice of Stein discrepancy.
arXiv Detail & Related papers (2021-04-15T10:31:22Z)
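Schematically, the Stein-discrepancy posterior above swaps the likelihood for a discrepancy term; in generic notation (ours, with \beta a loss weight and D a Stein discrepancy between the model p_\theta and the empirical distribution \hat{p}_n):

```latex
% Generalized posterior with a Stein discrepancy loss (schematic form;
% \beta and the exact discrepancy D are paper-specific choices):
\pi_n(\theta) \;\propto\; \pi(\theta)\, \exp\!\big( -\beta\, n\, \mathrm{D}(p_\theta, \hat{p}_n) \big)
```

Since Stein discrepancies depend on p_\theta only through its score \nabla_x \log p_\theta(x), the intractable normalisation constant drops out, which is what makes the construction tractable.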
- Convergence Rates of Empirical Bayes Posterior Distributions: A Variational Perspective [20.51199643121034]
We study the convergence rates of empirical Bayes posterior distributions for nonparametric and high-dimensional inference.
We show that the empirical Bayes posterior distribution induced by the maximum marginal likelihood estimator can be regarded as a variational approximation to a hierarchical Bayes posterior distribution.
arXiv Detail & Related papers (2020-09-08T19:35:27Z)
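In standard empirical Bayes notation (ours), the construction referred to above plugs the maximum marginal likelihood estimate \hat{\alpha} of the hyperparameter back into the conditional posterior:

```latex
\hat{\alpha} \;=\; \arg\max_{\alpha} \int p(X \mid \theta)\, \pi(\theta \mid \alpha)\, \mathrm{d}\theta,
\qquad
\pi_{\mathrm{EB}}(\theta \mid X) \;=\; \pi(\theta \mid X, \hat{\alpha})
```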
- Efficiently Sampling Functions from Gaussian Process Posteriors [76.94808614373609]
We propose an easy-to-use and general-purpose approach for fast posterior sampling.
We demonstrate how decoupled sample paths accurately represent Gaussian process posteriors at a fraction of the usual cost.
arXiv Detail & Related papers (2020-02-21T14:03:16Z)
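The decoupling above follows Matheron's rule: a posterior path equals a prior path plus a data-driven correction. A minimal sketch assuming an RBF kernel and exact joint prior draws for clarity (the paper's speedup comes from replacing these with cheap approximate prior samples, e.g. random features):

```python
# Hedged sketch of pathwise (decoupled) GP posterior sampling on 1-d inputs.
import numpy as np

def rbf(a, b, ls=0.5):
    # Squared-exponential kernel matrix between input vectors a and b.
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls ** 2)

def decoupled_posterior_sample(X, y, Xs, noise=0.1, seed=0):
    rng = np.random.default_rng(seed)
    # Joint prior sample over training inputs X and test inputs Xs.
    Z = np.concatenate([X, Xs])
    L = np.linalg.cholesky(rbf(Z, Z) + 1e-8 * np.eye(len(Z)))
    f = L @ rng.normal(size=len(Z))
    fX, fs = f[:len(X)], f[len(X):]
    eps = noise * rng.normal(size=len(X))        # simulated observation noise
    # Matheron's rule: posterior path = prior path + data-driven correction.
    A = rbf(X, X) + noise ** 2 * np.eye(len(X))
    v = np.linalg.solve(A, y - (fX + eps))
    return fs + rbf(Xs, X) @ v
```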
- Bayesian Deep Learning and a Probabilistic Perspective of Generalization [56.69671152009899]
We show that deep ensembles provide an effective mechanism for approximate Bayesian marginalization.
We also propose a related approach that further improves the predictive distribution by marginalizing within basins of attraction.
arXiv Detail & Related papers (2020-02-20T15:13:27Z)
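As a minimal sketch of the marginalization view above: an ensemble predictive is a uniform mixture of per-member predictives, i.e. a crude Monte Carlo approximation of the Bayesian model average (names here are ours):

```python
import numpy as np

def ensemble_predictive(models, x):
    # p(y | x, D) ~= (1/M) * sum_m p(y | x, theta_m), where each theta_m
    # is one trained ensemble member; `models` returns class probabilities.
    probs = np.stack([m(x) for m in models])   # shape (M, n_classes)
    return probs.mean(axis=0)
```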
- Distributionally Robust Bayesian Quadrature Optimization [60.383252534861136]
We study BQO under distributional uncertainty in which the underlying probability distribution is unknown except for a limited set of its i.i.d. samples.
A standard BQO approach maximizes the Monte Carlo estimate of the true expected objective given the fixed sample set.
We propose a novel posterior sampling based algorithm, namely distributionally robust BQO (DRBQO) for this purpose.
arXiv Detail & Related papers (2020-01-19T12:00:33Z)
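Schematically, the contrast described above, in generic notation (ours; the uncertainty set \mathcal{P}_\epsilon over reweightings of the m samples is a paper-specific choice):

```latex
% Standard BQO objective vs. its distributionally robust counterpart:
\max_{x}\; \frac{1}{m} \sum_{i=1}^{m} f(x, w_i)
\qquad \text{vs.} \qquad
\max_{x}\; \min_{p \,\in\, \mathcal{P}_\epsilon}\; \sum_{i=1}^{m} p_i\, f(x, w_i)
```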
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the accuracy of this information and is not responsible for any consequences of its use.