PyVBMC: Efficient Bayesian inference in Python
- URL: http://arxiv.org/abs/2303.09519v2
- Date: Tue, 27 Jun 2023 20:14:04 GMT
- Title: PyVBMC: Efficient Bayesian inference in Python
- Authors: Bobby Huggins, Chengkun Li, Marlon Tobaben, Mikko J. Aarnos, Luigi
Acerbi
- Abstract summary: PyVBMC is a Python implementation of the Variational Bayesian Monte Carlo (VBMC) algorithm for posterior and model inference.
VBMC is designed for efficient parameter estimation and model assessment when model evaluations are mildly-to-very expensive.
- Score: 8.924669503280333
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: PyVBMC is a Python implementation of the Variational Bayesian Monte Carlo
(VBMC) algorithm for posterior and model inference for black-box computational
models (Acerbi, 2018, 2020). VBMC is an approximate inference method designed
for efficient parameter estimation and model assessment when model evaluations
are mildly-to-very expensive (e.g., a second or more) and/or noisy.
Specifically, VBMC computes:
- a flexible (non-Gaussian) approximate posterior distribution of the model
parameters, from which statistics and posterior samples can be easily
extracted;
- an approximation of the model evidence or marginal likelihood, a metric
used for Bayesian model selection (see the Bayes-factor identity sketched
just below this list).
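For context on the second point, the marginal likelihood enters Bayesian model selection through the Bayes factor. This is the standard identity, not anything PyVBMC-specific; VBMC supplies the estimate of each p(D | M_i):

```latex
% Bayes factor comparing models M_1 and M_2 on data \mathcal{D};
% K > 1 favors M_1, K < 1 favors M_2.
K = \frac{p(\mathcal{D} \mid M_1)}{p(\mathcal{D} \mid M_2)}
```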
PyVBMC can be applied to any computational or statistical model with up to
roughly 10-15 continuous parameters; the only requirement is that the user
can provide a Python function that computes the target log likelihood of the
model, or an approximation thereof (e.g., an estimate of the likelihood
obtained via simulation or Monte Carlo methods). PyVBMC is particularly
effective when the model takes more than about a second per evaluation, with
dramatic speed-ups of 1-2 orders of magnitude when compared to traditional
approximate inference methods.
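To make this requirement concrete, here is a minimal sketch of the intended workflow on a toy two-parameter Gaussian model. The VBMC class, its bound arguments, and the optimize() method follow PyVBMC's documented interface; the model, data, and bound values are illustrative assumptions:

```python
# A minimal sketch assuming PyVBMC's documented interface; the model, data,
# and bounds are hypothetical.
import numpy as np
from scipy import stats
from pyvbmc import VBMC

data = np.array([1.0, 1.5, 2.2, 0.7])  # hypothetical observations
D = 2                                   # parameters: mean and log std. dev.

def log_joint(theta):
    """Unnormalized log posterior: Gaussian log likelihood plus smooth priors."""
    mu, log_sigma = np.ravel(theta)     # accept (D,) or (1, D) input
    log_lik = np.sum(stats.norm.logpdf(data, loc=mu, scale=np.exp(log_sigma)))
    log_prior = stats.norm.logpdf(mu, 0.0, 10.0) + stats.norm.logpdf(log_sigma, 0.0, 2.0)
    return log_lik + log_prior

x0 = np.zeros((1, D))                                    # starting point
lb, ub = np.full((1, D), -10.0), np.full((1, D), 10.0)   # hard bounds
plb, pub = np.full((1, D), -2.0), np.full((1, D), 2.0)   # plausible bounds

vbmc = VBMC(log_joint, x0, lb, ub, plb, pub)
vp, results = vbmc.optimize()               # vp: variational posterior object
print(results["elbo"], results["elbo_sd"])  # ELBO approximates the log model evidence
```

The plausible bounds are not hard constraints; they mark the region where most posterior mass is expected and help initialize the algorithm.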
Extensive benchmarks on both artificial test problems and a large number of
real models from the computational sciences, particularly computational and
cognitive neuroscience, show that VBMC generally - and often vastly -
outperforms alternative methods for sample-efficient Bayesian inference, and is
applicable to both exact and simulator-based models (Acerbi, 2018, 2019, 2020).
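For the simulator-based (noisy) setting, the sketch below assumes PyVBMC's documented specify_target_noise option, under which the target returns a tuple of a noisy log-density estimate and its standard deviation. The constant-noise estimator here is a stand-in for a real Monte Carlo or synthetic-likelihood estimate:

```python
# A sketch of the noisy-target case, assuming the documented
# `specify_target_noise` option; the estimator below is a placeholder.
import numpy as np
from pyvbmc import VBMC

D = 2

def noisy_log_joint(theta):
    theta = np.ravel(theta)
    exact = -0.5 * np.sum(theta**2)   # placeholder "true" log density
    noise_sd = 1.0                    # standard deviation of the estimate
    # Return the noisy estimate together with its noise level.
    return exact + noise_sd * np.random.randn(), noise_sd

x0 = np.zeros((1, D))
lb, ub = np.full((1, D), -10.0), np.full((1, D), 10.0)
plb, pub = np.full((1, D), -3.0), np.full((1, D), 3.0)

vbmc = VBMC(noisy_log_joint, x0, lb, ub, plb, pub,
            options={"specify_target_noise": True})
vp, results = vbmc.optimize()
```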
PyVBMC brings this state-of-the-art inference algorithm to Python, along with
an easy-to-use Pythonic interface for running the algorithm and manipulating
and visualizing its results.
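As a small example of manipulating the results, posterior statistics can be computed from samples drawn from the returned variational posterior. vp.sample is part of PyVBMC's documented interface; per the docs it also returns each sample's mixture-component index, ignored here:

```python
# Posterior summaries from the variational posterior returned above.
samples, _ = vp.sample(100_000)               # draw 100,000 posterior samples
print("posterior mean:", samples.mean(axis=0))
print("posterior std: ", samples.std(axis=0))
```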
Related papers
- Computation-Aware Gaussian Processes: Model Selection And Linear-Time Inference [55.150117654242706]
We show that model selection for computation-aware GPs trained on 1.8 million data points can be done within a few hours on a single GPU.
As a result of this work, Gaussian processes can be trained on large-scale datasets without significantly compromising their ability to quantify uncertainty.
(arXiv 2024-11-01)
- Scalable Inference for Bayesian Multinomial Logistic-Normal Dynamic Linear Models [0.5735035463793009]
This article develops an efficient and accurate approach to posterior state estimation, called Fenrir.
Our experiments suggest that Fenrir can be three orders of magnitude more efficient than Stan.
Our methods are made available to the community as a user-friendly software library written in C++ with an R interface.
(arXiv 2024-10-07)
- Simulation-based inference with the Python Package sbijax [0.7499722271664147]
sbijax is a Python package that implements a wide variety of state-of-the-art methods in neural simulation-based inference.
The package provides functionality for approximate Bayesian computation, for computing model diagnostics, and for automatically estimating summary statistics.
(arXiv 2024-09-28)
- Poisson Process for Bayesian Optimization [126.51200593377739]
We propose a ranking-based surrogate model based on the Poisson process and introduce an efficient BO framework, namely Poisson Process Bayesian Optimization (PoPBO).
Compared to the classic GP-BO method, our PoPBO has lower costs and better robustness to noise, which is verified by abundant experiments.
(arXiv 2024-02-05)
- Sparse high-dimensional linear regression with a partitioned empirical Bayes ECM algorithm [62.997667081978825]
We propose a computationally efficient and powerful Bayesian approach for sparse high-dimensional linear regression.
Minimal prior assumptions on the parameters are made through the use of plug-in empirical Bayes estimates.
The proposed approach is implemented in the R package probe.
(arXiv 2022-09-16)
- Model Comparison in Approximate Bayesian Computation [0.456877715768796]
A common problem in natural sciences is the comparison of competing models in the light of observed data.
Standard Bayesian model comparison relies on the calculation of likelihood functions, which are intractable for most models used in practice.
I propose a new efficient method to perform Bayesian model comparison in ABC.
(arXiv 2022-03-15)
- MoEfication: Conditional Computation of Transformer Models for Efficient Inference [66.56994436947441]
Transformer-based pre-trained language models can achieve superior performance on most NLP tasks due to their large parameter capacity, but they also incur huge computational costs.
We explore accelerating large-model inference through conditional computation based on the sparse-activation phenomenon.
We propose to transform a large model into its mixture-of-experts (MoE) version with equal model size, namely MoEfication.
(arXiv 2021-10-05)
- Evaluating State-of-the-Art Classification Models Against Bayes Optimality [106.50867011164584]
We show that we can compute the exact Bayes error of generative models learned using normalizing flows.
We use our approach to conduct a thorough investigation of state-of-the-art classification models.
(arXiv 2021-06-07)
- Fast Bayesian Estimation of Spatial Count Data Models [0.0]
We introduce Variational Bayes (VB), which treats posterior inference as an optimisation problem instead of a simulation problem.
A VB method is derived for posterior inference in negative binomial models with unobserved parameter heterogeneity and spatial dependence.
The VB approach is around 45 to 50 times faster than MCMC on a regular eight-core processor in a simulation and an empirical study.
(arXiv 2020-07-07)
- Variational Bayesian Monte Carlo with Noisy Likelihoods [11.4219428942199]
We introduce new 'global' acquisition functions, such as expected information gain (EIG) and variational interquantile range (VIQR).
VBMC+VIQR achieves state-of-the-art performance in recovering the ground-truth posteriors and model evidence.
(arXiv 2020-06-15)
- Nonparametric Estimation in the Dynamic Bradley-Terry Model [69.70604365861121]
We develop a novel estimator that relies on kernel smoothing to pre-process the pairwise comparisons over time.
We derive time-varying oracle bounds for both the estimation error and the excess risk in the model-agnostic setting.
(arXiv 2020-02-28)