Sequential Gibbs Posteriors with Applications to Principal Component
Analysis
- URL: http://arxiv.org/abs/2310.12882v1
- Date: Thu, 19 Oct 2023 16:36:18 GMT
- Title: Sequential Gibbs Posteriors with Applications to Principal Component
Analysis
- Authors: Steven Winter, Omar Melikechi, David B. Dunson
- Abstract summary: Gibbs posteriors provide a principled framework for likelihood-free Bayesian inference.
However, in many situations, relying on a single tuning parameter inevitably leads to poor uncertainty quantification.
We propose a sequential extension to Gibbs posteriors to address this problem.
- Score: 8.90721241624138
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Gibbs posteriors are proportional to a prior distribution multiplied by an
exponentiated loss function, with a key tuning parameter weighting information
in the loss relative to the prior and providing a control of posterior
uncertainty. Gibbs posteriors provide a principled framework for
likelihood-free Bayesian inference, but in many situations, relying on a single
tuning parameter inevitably leads to poor uncertainty quantification. In
particular, regardless of the value of the parameter, credible regions have
coverage far from the nominal frequentist level even in large samples. We propose a
sequential extension to Gibbs posteriors to address this problem. We prove the
proposed sequential posterior exhibits concentration and a Bernstein-von Mises
theorem, which holds under easy-to-verify conditions in Euclidean space and on
manifolds. As a byproduct, we obtain the first Bernstein-von Mises theorem for
traditional likelihood-based Bayesian posteriors on manifolds. All methods are
illustrated with an application to principal component analysis.
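To fix notation for the construction described in the abstract (the notation here is illustrative, not taken from the paper), a Gibbs posterior has the form
$$\pi_\omega(\theta \mid x_{1:n}) \;\propto\; \pi(\theta)\,\exp\Big\{-\omega \sum_{i=1}^{n} \ell(\theta; x_i)\Big\},$$
where $\pi(\theta)$ is the prior, $\ell$ is a loss function standing in for the negative log-likelihood, and $\omega > 0$ is the tuning parameter weighting the loss against the prior: larger $\omega$ concentrates the posterior, smaller $\omega$ inflates it. The abstract's point is that no single fixed $\omega$ yields credible regions with nominal frequentist coverage, which is the problem the sequential extension addresses.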
Related papers
- Reproducible Parameter Inference Using Bagged Posteriors [9.975422461924705]
Under model misspecification, it is known that Bayesian posteriors often do not properly quantify uncertainty about true or pseudo-true parameters.
We consider the probability that two confidence sets constructed from independent data sets have nonempty overlap, and establish a lower bound on this probability for valid confidence sets.
We show that credible sets from the standard posterior can strongly violate this bound, particularly in high-dimensional settings.
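As a concrete illustration of the overlap-based reproducibility criterion above, here is a minimal, self-contained Python sketch (a toy Gaussian-mean example; the model and function names are mine, not from the paper) that estimates how often intervals built from two independent data sets of the same size overlap:

    import numpy as np

    def credible_interval(x):
        """95% equal-tailed interval for a normal mean (flat prior, plug-in scale)."""
        n, xbar, s = len(x), x.mean(), x.std(ddof=1)
        half = 1.96 * s / np.sqrt(n)
        return xbar - half, xbar + half

    def overlap_rate(mu, sigma, n, reps=2000, seed=0):
        """Monte Carlo estimate of P(intervals from two independent data sets overlap)."""
        rng = np.random.default_rng(seed)
        hits = 0
        for _ in range(reps):
            lo1, hi1 = credible_interval(rng.normal(mu, sigma, n))
            lo2, hi2 = credible_interval(rng.normal(mu, sigma, n))
            hits += (lo1 <= hi2) and (lo2 <= hi1)
        return hits / reps

    print(overlap_rate(mu=0.0, sigma=1.0, n=100))

Under a well-specified model the overlap rate stays high; the paper's concern is that under misspecification the analogous rate for standard posterior credible sets can fall far below the bound, whereas the bagged posterior is designed to restore it.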
arXiv Detail & Related papers (2023-11-03T16:28:16Z) - Calibrating Neural Simulation-Based Inference with Differentiable
Coverage Probability [50.44439018155837]
We propose to include a calibration term directly into the training objective of the neural model.
By introducing a relaxation of the classical formulation of calibration error, we enable end-to-end backpropagation.
It is directly applicable to existing computational pipelines allowing reliable black-box posterior inference.
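A heavily simplified sketch of what such a differentiable calibration term might look like, written in PyTorch; this is my own toy reconstruction, not the paper's objective, and the function name, sigmoid relaxation, and temperature tau are assumptions:

    import torch

    def soft_coverage_penalty(log_q_true, log_q_samples, level=0.9, tau=0.1):
        """Differentiable surrogate for a coverage-calibration penalty.

        log_q_true:    (batch,) log-density of the approximate posterior at the true parameters
        log_q_samples: (batch, S) log-density at S samples drawn from the approximate posterior
        Sigmoids with temperature tau replace 0/1 indicators so the term can be backpropagated.
        """
        # Soft rank: fraction of posterior samples more probable than the true parameter.
        soft_rank = torch.sigmoid((log_q_samples - log_q_true.unsqueeze(1)) / tau).mean(dim=1)
        # Soft indicator that the true parameter lies inside the `level` highest-density region.
        inside = torch.sigmoid((level - soft_rank) / tau)
        # Penalize the gap between empirical coverage and the nominal level.
        return (inside.mean() - level).abs()

    # Schematic training objective: loss = nll + lam * soft_coverage_penalty(...)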
arXiv Detail & Related papers (2023-10-20T10:20:45Z) - Curvature-Independent Last-Iterate Convergence for Games on Riemannian
Manifolds [77.4346324549323]
We show that a step size agnostic to the curvature of the manifold achieves a curvature-independent and linear last-iterate convergence rate.
To the best of our knowledge, the possibility of curvature-independent rates and/or last-iterate convergence has not been considered before.
arXiv Detail & Related papers (2023-06-29T01:20:44Z) - Bayesian Renormalization [68.8204255655161]
We present a fully information theoretic approach to renormalization inspired by Bayesian statistical inference.
The main insight of Bayesian Renormalization is that the Fisher metric defines a correlation length that plays the role of an emergent RG scale.
We provide insight into how the Bayesian Renormalization scheme relates to existing methods for data compression and data generation.
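For reference, the Fisher metric mentioned above is, in standard notation,
$$g_{ij}(\theta) = \mathbb{E}_{x \sim p(x \mid \theta)}\big[\partial_{\theta_i} \log p(x \mid \theta)\, \partial_{\theta_j} \log p(x \mid \theta)\big];$$
roughly, parameter directions in which $g$ is small are hard to distinguish from finite data and can be coarse-grained away, which is the sense in which the metric supplies an emergent correlation length and RG scale.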
arXiv Detail & Related papers (2023-05-17T18:00:28Z) - A Robustness Analysis of Blind Source Separation [91.3755431537592]
Blind source separation (BSS) aims to recover an unobserved signal $S$ from its mixture $X=f(S)$ under the condition that the transformation $f$ is invertible but unknown.
We present a general framework for analysing such violations and quantifying their impact on the blind recovery of $S$ from $X$.
We show that the behaviour of a generic BSS solution under general deviations from its defining structural assumptions can be profitably analysed in the form of explicit continuity guarantees.
arXiv Detail & Related papers (2023-03-17T16:30:51Z) - Semiparametric inference using fractional posteriors [3.9599054392856483]
We show that fractional posterior credible sets can provide reliable semiparametric uncertainty quantification, but have inflated size.
We further propose a shifted-and-rescaled fractional posterior set that is an efficient confidence set having optimal size under regularity conditions.
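For context, a fractional (tempered) posterior with exponent $\alpha \in (0,1)$ takes the form, in standard notation,
$$\pi_\alpha(\theta \mid x_{1:n}) \;\propto\; \pi(\theta) \prod_{i=1}^{n} p(x_i \mid \theta)^{\alpha},$$
i.e. the likelihood is raised to a power $\alpha < 1$ before being combined with the prior; this robustifies inference but typically widens credible sets, which is the size inflation the shifted-and-rescaled correction is meant to remove.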
arXiv Detail & Related papers (2023-01-19T16:23:36Z) - Posterior samples of source galaxies in strong gravitational lenses with
score-based priors [107.52670032376555]
We use a score-based model to encode the prior for the inference of undistorted images of background galaxies.
We show how the balance between the likelihood and the prior meets our expectations in an experiment with out-of-distribution data.
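The role of the score-based prior can be made explicit: posterior sampling only needs the posterior score
$$\nabla_x \log p(x \mid y) = \nabla_x \log p(y \mid x) + \nabla_x \log p(x),$$
where the first term typically comes from the physical forward model of the lensing observation and the second is supplied by the pretrained score model, so no explicit density for the image prior is ever required.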
arXiv Detail & Related papers (2022-11-07T19:00:42Z) - Large deviations rates for stochastic gradient descent with strongly
convex functions [11.247580943940916]
We provide a formal framework for the study of general high probability bounds with stochastic gradient descent (SGD).
We find an upper large deviations bound for SGD with strongly convex functions.
arXiv Detail & Related papers (2022-11-02T09:15:26Z) - Robust Generalised Bayesian Inference for Intractable Likelihoods [9.77823546576708]
We consider generalised Bayesian inference with a Stein discrepancy as a loss function.
This is motivated by applications in which the likelihood contains an intractable normalisation constant.
We show consistency, normality and bias-robustness of the posterior, highlighting how these properties are impacted by the choice of Stein discrepancy.
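Schematically, the generalised posterior studied here has the Gibbs form
$$\pi_\beta(\theta \mid x_{1:n}) \;\propto\; \pi(\theta)\,\exp\{-\beta\, \mathrm{SD}(p_\theta, \hat p_n)\},$$
where $\mathrm{SD}$ is a (kernelised) Stein discrepancy between the model $p_\theta$ and the empirical distribution $\hat p_n$, and $\beta > 0$ is a weight (notation mine). The key point is that Stein discrepancies depend on $p_\theta$ only through the score $\nabla_x \log p_\theta(x)$, so the intractable normalisation constant cancels and never has to be computed.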
arXiv Detail & Related papers (2021-04-15T10:31:22Z) - Efficiently Sampling Functions from Gaussian Process Posteriors [76.94808614373609]
We propose an easy-to-use and general-purpose approach for fast posterior sampling.
We demonstrate how decoupled sample paths accurately represent Gaussian process posteriors at a fraction of the usual cost.
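A minimal numpy sketch of the decoupled (pathwise) construction behind this kind of fast posterior sampling, assuming 1-D inputs and an RBF kernel; the paper additionally approximates the prior draw (e.g. with random Fourier features) to obtain the speedup, whereas this toy version draws the prior exactly:

    import numpy as np

    def rbf(a, b, ls=1.0):
        """Squared-exponential kernel for 1-D inputs."""
        return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls**2)

    def pathwise_posterior_sample(x_train, y_train, x_test, noise=1e-2, seed=0):
        """One GP posterior sample via a Matheron-style pathwise update:
        draw a joint prior sample, then correct it with the training residuals."""
        rng = np.random.default_rng(seed)
        x_all = np.concatenate([x_train, x_test])
        K_all = rbf(x_all, x_all) + 1e-8 * np.eye(len(x_all))
        f_all = rng.multivariate_normal(np.zeros(len(x_all)), K_all)  # joint prior draw
        f_train, f_test = f_all[:len(x_train)], f_all[len(x_train):]
        eps = rng.normal(0.0, np.sqrt(noise), len(x_train))           # simulated obs. noise
        K_nn = rbf(x_train, x_train) + noise * np.eye(len(x_train))
        # posterior path = prior path + K(test, train) (K + noise I)^{-1} (y - prior(train) - eps)
        return f_test + rbf(x_test, x_train) @ np.linalg.solve(K_nn, y_train - f_train - eps)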
arXiv Detail & Related papers (2020-02-21T14:03:16Z) - A deep-learning based Bayesian approach to seismic imaging and
uncertainty quantification [0.4588028371034407]
Uncertainty is essential when dealing with ill-conditioned inverse problems.
It is often not possible to formulate a prior distribution that precisely encodes our prior knowledge about the unknown.
We propose to use the functional form of a randomly initialized convolutional neural network as an implicit structured prior.
arXiv Detail & Related papers (2020-01-13T23:46:18Z)