Variational Bayesian Monte Carlo with Noisy Likelihoods
- URL: http://arxiv.org/abs/2006.08655v3
- Date: Mon, 19 Oct 2020 05:54:29 GMT
- Title: Variational Bayesian Monte Carlo with Noisy Likelihoods
- Authors: Luigi Acerbi
- Abstract summary: We introduce new `global' acquisition functions, such as expected information gain (EIG) and variational interquantile range (VIQR).
VBMC+VIQR achieves state-of-the-art performance in recovering the ground-truth posteriors and model evidence.
- Score: 11.4219428942199
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Variational Bayesian Monte Carlo (VBMC) is a recently introduced framework
that uses Gaussian process surrogates to perform approximate Bayesian inference
in models with black-box, non-cheap likelihoods. In this work, we extend VBMC
to deal with noisy log-likelihood evaluations, such as those arising from
simulation-based models. We introduce new `global' acquisition functions, such
as expected information gain (EIG) and variational interquantile range (VIQR),
which are robust to noise and can be efficiently evaluated within the VBMC
setting. In a novel, challenging noisy-inference benchmark comprising a
variety of models with real datasets from computational and cognitive
neuroscience, VBMC+VIQR achieves state-of-the-art performance in recovering the
ground-truth posteriors and model evidence. In particular, our method vastly
outperforms `local' acquisition functions and other surrogate-based inference
methods at a small algorithmic cost. Our benchmark corroborates VBMC as a
general-purpose technique for sample-efficient black-box Bayesian inference,
even with noisy models.
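To make the role of a noise-robust `global' acquisition concrete, here is a minimal, hypothetical sketch of GP-surrogate active sampling for a noisy log-likelihood. It uses scikit-learn rather than the paper's code, and the lognormal-IQR heuristic below is an illustrative stand-in for VIQR, not the paper's actual acquisition.

```python
# Illustrative sketch: actively sample a noisy log-likelihood with a GP
# surrogate and an interquantile-range (IQR) style acquisition.  NOT the
# paper's exact VIQR; it only conveys the idea of preferring points where
# the surrogate posterior mass is both large and uncertain.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def noisy_loglik(theta, sigma_obs=0.5):
    """Toy black-box: quadratic log-likelihood plus simulation noise."""
    return -0.5 * theta**2 + sigma_obs * np.random.randn()

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(5, 1))            # initial design
y = np.array([noisy_loglik(x[0]) for x in X])

u = norm.ppf(0.75)                             # 75% quantile of N(0, 1)
for _ in range(20):
    gp = GaussianProcessRegressor(RBF(1.0) + WhiteKernel(0.25)).fit(X, y)
    cand = np.linspace(-3, 3, 200).reshape(-1, 1)
    m, s = gp.predict(cand, return_std=True)
    # IQR of a lognormal proxy for the unnormalised posterior,
    # exp(m) * 2*sinh(u*s), shifted by m.max() for numerical stability.
    acq = np.exp(m - m.max()) * np.sinh(u * s)
    x_new = cand[np.argmax(acq)]
    X = np.vstack([X, x_new])
    y = np.append(y, noisy_loglik(x_new[0]))
```

The property this shares with EIG/VIQR is that the acquisition weighs surrogate uncertainty by where posterior mass plausibly lies, rather than chasing pointwise variance alone.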
Related papers
- Online Variational Sequential Monte Carlo [49.97673761305336]
We build upon the variational sequential Monte Carlo (VSMC) method, which provides computationally efficient and accurate model parameter estimation and Bayesian latent-state inference.
Online VSMC performs both parameter estimation and particle proposal adaptation efficiently and entirely on-the-fly.
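For context, a hedged sketch of the bootstrap SMC step that VSMC builds on; the variational proposal optimisation that makes it "VSMC", and the online adaptation, are not reproduced here. The linear-Gaussian model is a toy assumption.

```python
# Minimal bootstrap particle filter step (the SMC core beneath VSMC).
import numpy as np

def smc_step(particles, weights, y_t, rng, trans_sd=1.0, obs_sd=1.0):
    """One resample/propagate/reweight step for a linear-Gaussian toy model."""
    n = len(particles)
    idx = rng.choice(n, size=n, p=weights)        # resample by weight
    particles = particles[idx]
    particles = particles + trans_sd * rng.standard_normal(n)  # transition
    logw = -0.5 * ((y_t - particles) / obs_sd) ** 2  # observation likelihood
    weights = np.exp(logw - logw.max())
    return particles, weights / weights.sum()

rng = np.random.default_rng(0)
particles = rng.standard_normal(500)
weights = np.full(500, 1 / 500)
for y_t in [0.3, 0.5, 0.1]:                       # toy observation stream
    particles, weights = smc_step(particles, weights, y_t, rng)
```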
arXiv Detail & Related papers (2023-12-19T21:45:38Z)
- Consensus-Adaptive RANSAC [104.87576373187426]
We propose a new RANSAC framework that learns to explore the parameter space by considering the residuals seen so far via a novel attention layer.
The attention mechanism operates on a batch of point-to-model residuals, and updates a per-point estimation state to take into account the consensus found through a lightweight one-step transformer.
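As a reference point, a plain RANSAC skeleton for line fitting; CA-RANSAC's contribution is to replace the blind uniform sampling below with a learned attention module over the residuals, which is not reproduced in this sketch.

```python
# Classic RANSAC skeleton for 2-D line fitting.
import numpy as np

def ransac_line(pts, iters=200, thresh=0.1, rng=None):
    if rng is None:
        rng = np.random.default_rng(0)
    best_inliers, best_model = 0, None
    for _ in range(iters):
        i, j = rng.choice(len(pts), size=2, replace=False)
        (x1, y1), (x2, y2) = pts[i], pts[j]
        if x1 == x2:
            continue                              # degenerate minimal sample
        a = (y2 - y1) / (x2 - x1)                 # slope
        b = y1 - a * x1                           # intercept
        residuals = np.abs(pts[:, 1] - (a * pts[:, 0] + b))
        n_inliers = int((residuals < thresh).sum())
        if n_inliers > best_inliers:
            best_inliers, best_model = n_inliers, (a, b)
    return best_model, best_inliers

rng = np.random.default_rng(1)
x = rng.uniform(-1, 1, 100)
pts = np.column_stack([x, 2 * x + 0.5 + 0.05 * rng.standard_normal(100)])
model, n_in = ransac_line(pts, rng=rng)
```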
arXiv Detail & Related papers (2023-07-26T08:25:46Z)
- Value function estimation using conditional diffusion models for control [62.27184818047923]
We propose a simple algorithm called Diffused Value Function (DVF).
It learns a joint multi-step model of the environment-robot interaction dynamics using a diffusion model.
We show how DVF can be used to efficiently capture the state visitation measure for multiple controllers.
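A hedged sketch of the identity this rests on: a state's value equals the expected reward under the normalised discounted state-visitation (occupancy) measure, scaled by 1/(1-gamma). DVF learns that measure with a diffusion model; below, plain rollouts with geometric horizons stand in for it, and the dynamics and reward are toy assumptions.

```python
# Identity: V(s) = (1 / (1 - gamma)) * E_{s' ~ d_s}[ r(s') ], where d_s is
# the normalised discounted occupancy measure started at s.
import numpy as np

def sample_occupancy(s0, step, gamma, rng, n=1000):
    """Sample from the discounted occupancy via geometric horizons."""
    out = []
    for _ in range(n):
        s, k = s0, rng.geometric(1 - gamma)   # horizon K ~ Geometric(1-gamma)
        for _ in range(k - 1):                # state after K-1 steps ~ d_s
            s = step(s, rng)
        out.append(s)
    return np.array(out)

step = lambda s, rng: 0.9 * s + 0.1 * rng.standard_normal()  # toy dynamics
reward = lambda s: -s**2                                     # toy reward
rng = np.random.default_rng(0)
gamma = 0.95
states = sample_occupancy(1.0, step, gamma, rng)
v_hat = reward(states).mean() / (1 - gamma)
```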
arXiv Detail & Related papers (2023-06-09T18:40:55Z)
- PyVBMC: Efficient Bayesian inference in Python [8.924669503280333]
PyVBMC is a Python implementation of the Variational Bayesian Monte Carlo (VBMC) algorithm for posterior and model inference.
VBMC is designed for efficient parameter estimation and model assessment when model evaluations are mildly-to-very expensive.
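A quick-start sketch based on PyVBMC's documented interface (`pip install pyvbmc`); the toy target and bounds are illustrative, and exact signatures should be checked against the project docs.

```python
# Sketch of the documented PyVBMC quick-start pattern.
import numpy as np
from pyvbmc import VBMC

def log_joint(theta):
    """Unnormalised target: log prior + log likelihood (toy Gaussian)."""
    return -0.5 * np.sum(theta**2)

D = 2
x0 = np.zeros((1, D))                                      # starting point
LB, UB = np.full((1, D), -10.0), np.full((1, D), 10.0)     # hard bounds
PLB, PUB = np.full((1, D), -3.0), np.full((1, D), 3.0)     # plausible bounds

vbmc = VBMC(log_joint, x0, LB, UB, PLB, PUB)
vp, results = vbmc.optimize()        # variational posterior + diagnostics
samples, _ = vp.sample(1000)         # draw from the fitted posterior
```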
arXiv Detail & Related papers (2023-03-16T17:37:22Z)
- Quasi Black-Box Variational Inference with Natural Gradients for Bayesian Learning [84.90242084523565]
We develop an optimization algorithm suitable for Bayesian learning in complex models.
Our approach relies on natural gradient updates within a general black-box framework for efficient training with limited model-specific derivations.
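To illustrate the natural-gradient idea in miniature: precondition the ELBO gradient by the inverse Fisher information of q. The Gaussian/Gaussian toy below keeps every gradient in closed form; it is a sketch of the principle, not the paper's algorithm.

```python
# Natural-gradient ascent on the ELBO for q(x) = N(mu, v) fit to a Gaussian
# target N(m0, s0^2).  The Fisher information of q in (mu, v) coordinates is
# diag(1/v, 1/(2 v^2)), so the natural gradient rescales the plain gradient
# by its inverse, diag(v, 2 v^2).
m0, s0sq = 2.0, 0.25                  # toy target N(2, 0.5^2)
mu, v = 0.0, 1.0                      # variational mean and variance
lr = 0.1

for _ in range(200):
    g_mu = -(mu - m0) / s0sq          # d ELBO / d mu
    g_v = -0.5 / s0sq + 0.5 / v       # d ELBO / d v (includes entropy term)
    mu += lr * v * g_mu               # preconditioned (natural) updates
    v += lr * 2 * v**2 * g_v

print(mu, v)                          # -> approaches (2.0, 0.25)
```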
arXiv Detail & Related papers (2022-05-23T18:54:27Z)
- Scalable Stochastic Parametric Verification with Stochastic Variational Smoothed Model Checking [1.5293427903448025]
Smoothed model checking (smMC) aims at inferring the satisfaction function over the entire parameter space from a limited set of observations.
In this paper, we exploit recent advances in probabilistic machine learning to push past the scalability limits of smMC.
We compare the performance of smMC against that of SV-smMC in terms of scalability, computational efficiency, and accuracy of the reconstructed satisfaction function.
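A minimal sketch of the smMC flavour, assuming scikit-learn's GP classifier: estimate a satisfaction-probability surface over parameters from Boolean simulation outcomes. The stochastic-variational machinery that distinguishes SV-smMC is not shown.

```python
# Smooth a Bernoulli satisfaction signal over a parameter grid with a GP.
import numpy as np
from sklearn.gaussian_process import GaussianProcessClassifier

rng = np.random.default_rng(0)
theta = rng.uniform(0, 1, size=(200, 1))          # sampled parameters
# Toy ground-truth satisfaction probability, observed only as 0/1 outcomes.
p_true = 1 / (1 + np.exp(-10 * (theta[:, 0] - 0.5)))
y = rng.binomial(1, p_true)

gpc = GaussianProcessClassifier().fit(theta, y)
grid = np.linspace(0, 1, 50).reshape(-1, 1)
p_hat = gpc.predict_proba(grid)[:, 1]             # smoothed satisfaction fn
```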
arXiv Detail & Related papers (2022-05-11T10:43:23Z)
- Cyclical Variational Bayes Monte Carlo for Efficient Multi-Modal Posterior Distributions Evaluation [0.0]
Variational inference is an alternative to sampling methods for estimating posterior approximations.
The Variational Bayesian Monte Carlo (VBMC) method is investigated with the purpose of dealing with statistical model updating problems.
arXiv Detail & Related papers (2022-02-23T17:31:42Z)
- Sample-Efficient Reinforcement Learning via Conservative Model-Based Actor-Critic [67.00475077281212]
Model-based reinforcement learning algorithms are more sample-efficient than their model-free counterparts.
We propose Conservative Model-Based Actor-Critic (CMBAC), a novel approach that achieves high sample efficiency without relying strongly on accurate learned models.
We show that CMBAC significantly outperforms state-of-the-art approaches in terms of sample efficiency on several challenging tasks.
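The summary above does not spell out CMBAC's estimator, so the following is an assumption-labelled sketch of the generic conservative trick in this family: trust a bottom-k average of value estimates across an ensemble of learned models rather than their mean.

```python
# Assumption-labelled sketch: CMBAC's actual estimator may differ; this is
# the generic "pessimistic ensemble" device for blunting model error.
import numpy as np

def conservative_q(q_estimates, k=2):
    """Average the k smallest Q estimates across the model ensemble."""
    q = np.sort(np.asarray(q_estimates))
    return q[:k].mean()

# Q-value for one (state, action) pair under 5 learned dynamics models:
print(conservative_q([3.1, 2.7, 3.4, 1.9, 2.8], k=2))   # -> 2.3
```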
arXiv Detail & Related papers (2021-12-16T15:33:11Z)
- Evaluating State-of-the-Art Classification Models Against Bayes Optimality [106.50867011164584]
We show that we can compute the exact Bayes error of generative models learned using normalizing flows.
We use our approach to conduct a thorough investigation of state-of-the-art classification models.
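The quantity in question can be written down directly: with exact class-conditional densities p(x|c) (normalizing flows in the paper), the Bayes error is E_x[1 - max_c p(c|x)]. A Monte Carlo sketch with Gaussians standing in for the flows:

```python
# Estimate the Bayes error E_x[1 - max_c p(c|x)] from exact densities.
import numpy as np
from scipy.stats import norm

priors = np.array([0.5, 0.5])
dists = [norm(-1, 1), norm(1, 1)]               # stand-ins for per-class flows

rng = np.random.default_rng(0)
n = 100_000
cls = rng.choice(2, size=n, p=priors)
x = rng.normal(0.0, 1.0, n) + np.where(cls == 0, -1.0, 1.0)
like = np.stack([priors[c] * dists[c].pdf(x) for c in range(2)])
post = like / like.sum(axis=0)                   # exact class posteriors
bayes_err = (1.0 - post.max(axis=0)).mean()
print(bayes_err)                                 # -> approx 0.159 here
```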
arXiv Detail & Related papers (2021-06-07T06:21:20Z)
- Approximate Bayesian inference from noisy likelihoods with Gaussian process emulated MCMC [0.24275655667345403]
We model the log-likelihood function using a Gaussian process (GP).
The main methodological innovation is to apply this model to emulate the progression that an exact Metropolis-Hastings (MH) sampler would take.
The resulting approximate sampler is conceptually simple and sample-efficient.
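A hedged sketch of the general pattern: run Metropolis-Hastings against a GP surrogate of the log-likelihood. The paper emulates the exact MH sampler's decisions far more carefully; this toy simply plugs in the GP posterior mean.

```python
# MH sampling where a GP surrogate stands in for the expensive noisy target.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
# Pretend these came from a handful of expensive, noisy simulator calls.
X = rng.uniform(-3, 3, size=(30, 1))
y = -0.5 * X[:, 0] ** 2 + 0.3 * rng.standard_normal(30)
gp = GaussianProcessRegressor(RBF(1.0) + WhiteKernel(0.1)).fit(X, y)

def surrogate_logpost(theta):
    return gp.predict(np.array([[theta]]))[0]    # GP mean as log target

theta, lp = 0.0, surrogate_logpost(0.0)
chain = []
for _ in range(2000):
    prop = theta + 0.5 * rng.standard_normal()
    lp_prop = surrogate_logpost(prop)
    if np.log(rng.uniform()) < lp_prop - lp:     # MH accept/reject
        theta, lp = prop, lp_prop
    chain.append(theta)
```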
arXiv Detail & Related papers (2021-04-08T17:38:02Z)
- Fast Bayesian Estimation of Spatial Count Data Models [0.0]
We introduce Variational Bayes (VB), which recasts posterior inference as an optimisation problem rather than a simulation problem.
A VB method is derived for posterior inference in negative binomial models with unobserved parameter heterogeneity and spatial dependence.
The VB approach is around 45 to 50 times faster than MCMC on a regular eight-core processor in a simulation and an empirical study.
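To make "optimisation instead of simulation" concrete: a generic sketch that fits a Gaussian q by maximising a Monte Carlo ELBO estimate. The paper derives specialised updates for its negative binomial model; the target below is a toy assumption.

```python
# Inference as optimisation: maximise the ELBO over Gaussian q parameters.
import numpy as np
from scipy.optimize import minimize

def log_post(theta):                       # toy unnormalised log posterior
    return -0.5 * (theta - 1.5) ** 2

eps = np.random.default_rng(0).standard_normal(512)   # fixed base draws

def neg_elbo(params):
    mu, log_sd = params
    sd = np.exp(log_sd)
    theta = mu + sd * eps                  # reparameterised samples from q
    entropy = 0.5 * np.log(2 * np.pi * np.e) + log_sd
    return -(log_post(theta).mean() + entropy)

res = minimize(neg_elbo, x0=[0.0, 0.0])    # default BFGS; objective is smooth
mu_hat, sd_hat = res.x[0], np.exp(res.x[1])
print(mu_hat, sd_hat)                      # -> approx (1.5, 1.0)
```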
arXiv Detail & Related papers (2020-07-07T10:24:45Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.