Robust Expected Information Gain for Optimal Bayesian Experimental Design Using Ambiguity Sets
- URL: http://arxiv.org/abs/2205.09914v1
- Date: Fri, 20 May 2022 01:07:41 GMT
- Title: Robust Expected Information Gain for Optimal Bayesian Experimental Design Using Ambiguity Sets
- Authors: Jinwoo Go, Tobin Isaac
- Abstract summary: We define and analyze robust expected information gain (REIG).
REIG is a modification of the EIG objective, obtained by minimizing an affine relaxation of EIG over an ambiguity set of perturbed distributions.
We show that, when combined with a sampling-based approach to estimating EIG, REIG corresponds to a 'log-sum-exp' stabilization of the samples used to estimate EIG.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The ranking of experiments by expected information gain (EIG) in Bayesian
experimental design is sensitive to changes in the model's prior distribution,
and the approximation of EIG yielded by sampling will have errors similar to
the use of a perturbed prior. We define and analyze robust expected
information gain (REIG), a modification of the objective in EIG maximization
by minimizing an affine relaxation of EIG over an ambiguity set of
distributions that are close to the original prior in KL-divergence. We show
that, when combined with a sampling-based approach to estimating EIG, REIG
corresponds to a 'log-sum-exp' stabilization of the samples used to estimate
EIG, meaning that it can be efficiently implemented in practice. Numerical
tests combining REIG with variational nested Monte Carlo (VNMC), adaptive
contrastive estimation (ACE) and mutual information neural estimation (MINE)
suggest that in practice REIG also compensates for the variability of
under-sampled estimators.
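To make the 'log-sum-exp' stabilization concrete, here is a hypothetical sketch (not the authors' code): it estimates EIG for a toy linear-Gaussian model by nested Monte Carlo, then robustifies the sample average using the standard dual of a worst-case expectation over a KL ball, min_{KL(Q||P)<=eps} E_Q[g] = max_{lam>0} -lam log E_P[exp(-g/lam)] - lam*eps. The toy model, the function names, and the radius eps are illustrative assumptions.

```python
# Hypothetical REIG sketch: nested Monte Carlo EIG samples, then a
# 'log-sum-exp' robustification over a KL ambiguity ball (illustrative only).
import numpy as np
from scipy.special import logsumexp
from scipy.optimize import minimize_scalar

SIGMA = 0.5  # noise std of the toy model y = theta + noise

def eig_samples(rng, n_outer=256, n_inner=256):
    """Per-sample information gains g_i = log p(y_i|theta_i) - log p(y_i)."""
    theta = rng.normal(0.0, 1.0, size=n_outer)            # prior draws
    y = theta + rng.normal(0.0, SIGMA, size=n_outer)      # simulated data
    log_lik = -0.5 * ((y - theta) / SIGMA) ** 2 - np.log(SIGMA * np.sqrt(2 * np.pi))
    theta_in = rng.normal(0.0, 1.0, size=(n_inner, 1))    # fresh prior draws
    log_lik_in = (-0.5 * ((y[None, :] - theta_in) / SIGMA) ** 2
                  - np.log(SIGMA * np.sqrt(2 * np.pi)))
    log_evid = logsumexp(log_lik_in, axis=0) - np.log(n_inner)
    return log_lik - log_evid

def reig_estimate(g, eps):
    """Worst-case mean of g over a KL ball of radius eps, via the dual
    max_{lam>0} -lam*log mean(exp(-g/lam)) - lam*eps (a log-sum-exp)."""
    n = len(g)
    def neg_dual(lam):
        return -(-lam * (logsumexp(-g / lam) - np.log(n)) - lam * eps)
    res = minimize_scalar(neg_dual, bounds=(1e-3, 1e3), method="bounded")
    return -res.fun

rng = np.random.default_rng(0)
g = eig_samples(rng)
print("EIG  estimate:", g.mean())
print("REIG estimate:", reig_estimate(g, eps=0.1))
```

By construction the robust value is never larger than the plain EIG estimate, which is the intended conservatism of REIG.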
Related papers
- Bayesian Experimental Design via Contrastive Diffusions [2.2186678387006435]
Bayesian Optimal Experimental Design (BOED) is a powerful tool to reduce the cost of running a sequence of experiments.
We introduce an expected posterior distribution with cost-effective properties and provide tractable access to the EIG contrast.
By incorporating generative models into the BOED framework, we expand its scope and its use in scenarios that were previously impractical.
arXiv Detail & Related papers (2024-10-15T17:53:07Z)
- Hyperspectral Unmixing Under Endmember Variability: A Variational Inference Framework [22.114121550108344]
This work proposes a variational inference framework for hyperspectral unmixing in the presence of endmember variability (HU-EV).
An EV-accounted noisy linear mixture model (LMM) is considered, and the presence of outliers is also incorporated into the model.
The effectiveness of the proposed framework is demonstrated through synthetic, semi-real, and real-data experiments.
arXiv Detail & Related papers (2024-07-20T15:16:14Z)
- Variational Bayesian Optimal Experimental Design with Normalizing Flows [0.837622912636323]
Variational OED estimates a lower bound of the EIG without likelihood evaluations.
We introduce the use of normalizing flows for representing variational distributions in vOED.
We show that a composition of 4-5 flow layers achieves lower EIG estimation bias.
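As a hedged illustration of that bound, the Barber-Agakov posterior bound EIG >= E_{p(theta,y)}[log q(theta|y)] + H[p(theta)] holds for any conditional density q; the sketch below stands in a linear-Gaussian q for the paper's normalizing flow, and the toy model is an assumption.

```python
# Hypothetical sketch of a variational (posterior-based) EIG lower bound;
# a linear-Gaussian q(theta|y) stands in for a normalizing flow.
import numpy as np

rng = np.random.default_rng(0)
n = 20_000
theta = rng.normal(0.0, 1.0, size=n)        # prior N(0, 1)
y = theta + rng.normal(0.0, 0.5, size=n)    # simulator: y = theta + noise

a, b = np.polyfit(y, theta, 1)              # fit q(theta|y) = N(a*y + b, s2)
resid = theta - (a * y + b)
s2 = resid.var()

log_q = -0.5 * resid**2 / s2 - 0.5 * np.log(2 * np.pi * s2)
prior_entropy = 0.5 * np.log(2 * np.pi * np.e)          # H[N(0,1)]
print("lower bound:", log_q.mean() + prior_entropy)
print("exact EIG  :", 0.5 * np.log(1 + 1.0 / 0.25))     # ~ 0.805
```

Because the exact posterior of this toy model is itself linear-Gaussian, the bound is tight here; for non-Gaussian posteriors a flow-based q would tighten the gap.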
arXiv Detail & Related papers (2024-04-08T14:44:21Z)
- On Estimating the Gradient of the Expected Information Gain in Bayesian Experimental Design [5.874142059884521]
We develop methods for estimating the gradient of EIG, which, combined with gradient descent algorithms, result in efficient optimization of EIG.
Based on this, we propose two methods for estimating the EIG gradient: UEEG-MCMC, which leverages posterior samples to estimate the EIG gradient, and BEEG-AP, which focuses on achieving high simulation efficiency by repeatedly using parameter samples.
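Neither UEEG-MCMC nor BEEG-AP is reproduced below; as a minimal stand-in, this sketch runs gradient ascent on the closed-form EIG of a toy linear-Gaussian design y = d*theta + noise, the kind of loop those gradient estimators enable when EIG has no closed form. The step size and toy model are assumptions.

```python
# Hypothetical toy: gradient ascent on a closed-form EIG (illustrative only;
# the paper's estimators approximate this gradient when no closed form exists).
import numpy as np

R = 1.0 / 0.25  # sigma_theta^2 / sigma_noise^2 for y = d*theta + noise

def eig(d):
    return 0.5 * np.log(1.0 + R * d**2)

def eig_grad(d):
    return R * d / (1.0 + R * d**2)   # derivative of eig(d)

d = 0.1
for _ in range(200):
    d += 0.5 * eig_grad(d)            # plain gradient ascent step
print(d, eig(d))  # unconstrained d grows; real designs carry constraints
```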
arXiv Detail & Related papers (2023-08-19T02:48:44Z)
- Scalable method for Bayesian experimental design without integrating over posterior distribution [0.0]
We address the computational efficiency of solving A-optimal Bayesian experimental design problems.
A-optimality is a widely used and easy-to-interpret criterion for Bayesian experimental design.
This study presents a novel likelihood-free approach to the A-optimal experimental design.
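For orientation, the sketch below shows only the A-optimality criterion itself, the trace of the posterior covariance, for a linear-Gaussian model; the paper's likelihood-free estimator is not reproduced and the names are illustrative.

```python
# Hypothetical sketch of the A-optimality criterion (posterior covariance
# trace) for a linear-Gaussian model y = G @ theta + noise.
import numpy as np

rng = np.random.default_rng(0)
prior_prec = np.eye(3)   # prior precision (inverse covariance)
noise_var = 0.1

def a_criterion(G):
    post_prec = G.T @ G / noise_var + prior_prec
    return np.trace(np.linalg.inv(post_prec))   # smaller is better

candidates = [rng.normal(size=(5, 3)) for _ in range(10)]
best = min(candidates, key=a_criterion)
print("best A-criterion:", a_criterion(best))
```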
arXiv Detail & Related papers (2023-06-30T12:40:43Z)
- Federated Conformal Predictors for Distributed Uncertainty Quantification [83.50609351513886]
Conformal prediction is emerging as a popular paradigm for providing rigorous uncertainty quantification in machine learning.
In this paper, we extend conformal prediction to the federated learning setting.
We propose a weaker notion of partial exchangeability, better suited to the FL setting, and use it to develop the Federated Conformal Prediction framework.
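For orientation, a minimal sketch of single-client split conformal prediction follows; the paper's actual contribution, aggregating calibration across clients under partial exchangeability, is not reproduced here.

```python
# Hypothetical sketch of split conformal prediction on a single client.
import numpy as np

def conformal_interval(cal_residuals, y_hat, alpha=0.1):
    """Interval from calibration-set residuals |y - model(x)|."""
    n = len(cal_residuals)
    q_level = min(np.ceil((n + 1) * (1 - alpha)) / n, 1.0)
    q = np.quantile(cal_residuals, q_level, method="higher")
    return y_hat - q, y_hat + q

rng = np.random.default_rng(0)
cal_res = np.abs(rng.normal(0.0, 1.0, size=500))  # stand-in residuals
print(conformal_interval(cal_res, y_hat=2.0))     # ~90% coverage interval
```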
arXiv Detail & Related papers (2023-05-27T19:57:27Z)
- Prediction-Oriented Bayesian Active Learning [51.426960808684655]
Expected predictive information gain (EPIG) is an acquisition function that measures information gain in the space of predictions rather than parameters.
EPIG leads to stronger predictive performance compared with BALD across a range of datasets and models.
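A hedged sketch of an EPIG-style score follows: the mutual information between the label at a candidate input and the labels at sampled target inputs, computed from an ensemble's predictive probabilities. The shapes, names, and Dirichlet stand-ins for real predictions are assumptions, not the paper's code.

```python
# Hypothetical EPIG-style acquisition score from ensemble predictions.
import numpy as np

def epig_score(probs_x, probs_targets):
    """probs_x: (S, C) predictions at candidate x over S parameter samples;
    probs_targets: (S, T, C) predictions at T target points."""
    S = probs_x.shape[0]
    # Joint predictive p(y, y*) = mean_s p(y|theta_s) p(y*|theta_s)
    joint = np.einsum('sc,std->tcd', probs_x, probs_targets) / S
    p_y = joint.sum(axis=2)                       # marginal over y*
    p_ystar = joint.sum(axis=1)                   # marginal over y
    indep = p_y[:, :, None] * p_ystar[:, None, :]
    mi = np.sum(joint * np.log((joint + 1e-12) / (indep + 1e-12)), axis=(1, 2))
    return mi.mean()                              # average over targets

rng = np.random.default_rng(0)
probs_x = rng.dirichlet(np.ones(3), size=50)          # (S=50, C=3)
probs_t = rng.dirichlet(np.ones(3), size=(50, 8))     # (S=50, T=8, C=3)
print("EPIG score:", epig_score(probs_x, probs_t))
```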
arXiv Detail & Related papers (2023-04-17T10:59:57Z)
- Optimization of Annealed Importance Sampling Hyperparameters [77.34726150561087]
Annealed Importance Sampling (AIS) is a popular algorithm used to estimate the intractable marginal likelihood of deep generative models.
We present a parametric AIS process with flexible intermediary distributions and optimize the bridging distributions to use fewer sampling steps.
We assess the performance of our optimized AIS for marginal likelihood estimation of deep generative models and compare it to other estimators.
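As background, a minimal AIS sketch on a 1-D toy problem follows, with the usual geometric path p_beta proportional to prior * likelihood^beta and one Metropolis step per temperature; the paper's learned parametric bridging distributions are not reproduced.

```python
# Hypothetical minimal AIS run estimating a 1-D toy log evidence.
import numpy as np
from scipy.special import logsumexp

rng = np.random.default_rng(0)
log_prior = lambda x: -0.5 * x**2 - 0.5 * np.log(2 * np.pi)               # N(0,1)
log_lik = lambda x: -0.5 * (x - 2.0)**2 / 0.25 - 0.5 * np.log(2 * np.pi * 0.25)

def ais_log_evidence(n_chains=2000, n_steps=100, step=0.6):
    betas = np.linspace(0.0, 1.0, n_steps + 1)
    x = rng.normal(size=n_chains)               # exact draws from the prior
    log_w = np.zeros(n_chains)
    for b_prev, b in zip(betas[:-1], betas[1:]):
        log_w += (b - b_prev) * log_lik(x)      # importance-weight increment
        prop = x + step * rng.normal(size=n_chains)   # Metropolis proposal
        log_acc = (log_prior(prop) + b * log_lik(prop)
                   - log_prior(x) - b * log_lik(x))
        accept = np.log(rng.uniform(size=n_chains)) < log_acc
        x = np.where(accept, prop, x)
    return logsumexp(log_w) - np.log(n_chains)

exact = -0.5 * 4.0 / 1.25 - 0.5 * np.log(2 * np.pi * 1.25)  # N(2; 0, 1.25)
print("AIS:", ais_log_evidence(), "exact:", exact)
```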
arXiv Detail & Related papers (2022-09-27T07:58:25Z)
- Regularizing Variational Autoencoder with Diversity and Uncertainty Awareness [61.827054365139645]
The Variational Autoencoder (VAE) approximates the posterior of latent variables via amortized variational inference.
We propose an alternative model, DU-VAE, for learning a more Diverse and less Uncertain latent space.
arXiv Detail & Related papers (2021-10-24T07:58:13Z)
- Variational Refinement for Importance Sampling Using the Forward Kullback-Leibler Divergence [77.06203118175335]
Variational Inference (VI) is a popular alternative to exact sampling in Bayesian inference.
Importance sampling (IS) is often used to fine-tune and de-bias the estimates of approximate Bayesian inference procedures.
We propose a novel combination of optimization and sampling techniques for approximate Bayesian inference.
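For orientation, a minimal sketch of the de-biasing step follows: self-normalized importance sampling correcting a deliberately crude Gaussian approximation of a posterior mean; the paper's forward-KL refinement of the proposal itself is not reproduced.

```python
# Hypothetical sketch: self-normalized importance sampling de-biases a
# crude Gaussian approximation q of an unnormalized posterior.
import numpy as np
from scipy.special import logsumexp

rng = np.random.default_rng(0)
log_post = lambda x: -0.5 * (x - 1.0)**2 / 0.5    # unnormalized target
mu_q, s_q = 0.0, 1.5                              # crude approximation q

x = rng.normal(mu_q, s_q, size=10_000)            # draws from q
log_q = -0.5 * ((x - mu_q) / s_q)**2 - np.log(s_q * np.sqrt(2 * np.pi))
log_w = log_post(x) - log_q
w = np.exp(log_w - logsumexp(log_w))              # self-normalized weights
print("IS posterior mean:", np.sum(w * x))        # ~ 1.0, vs q's mean 0.0
print("effective sample size:", 1.0 / np.sum(w**2))
```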
arXiv Detail & Related papers (2021-06-30T11:00:24Z)
- Cauchy-Schwarz Regularized Autoencoder [68.80569889599434]
Variational autoencoders (VAE) are a powerful and widely-used class of generative models.
We introduce a new constrained objective based on the Cauchy-Schwarz divergence, which can be computed analytically for GMMs.
Our objective improves upon variational auto-encoding models in density estimation, unsupervised clustering, semi-supervised learning, and face analysis.
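The Cauchy-Schwarz divergence D_CS(p, q) = -log(int pq / sqrt(int p^2 * int q^2)) is analytic for Gaussian mixtures because int N(x; m1, v1) N(x; m2, v2) dx = N(m1; m2, v1 + v2); a minimal 1-D sketch with illustrative names follows (not the paper's code).

```python
# Hypothetical closed-form Cauchy-Schwarz divergence between 1-D GMMs.
import numpy as np

def normal_pdf(x, m, v):
    return np.exp(-0.5 * (x - m)**2 / v) / np.sqrt(2 * np.pi * v)

def gmm_cross(w1, m1, v1, w2, m2, v2):
    """Closed-form integral of the product of two Gaussian mixtures."""
    return sum(wi * wj * normal_pdf(mi, mj, vi + vj)
               for wi, mi, vi in zip(w1, m1, v1)
               for wj, mj, vj in zip(w2, m2, v2))

def cs_divergence(p, q):
    return -np.log(gmm_cross(*p, *q)
                   / np.sqrt(gmm_cross(*p, *p) * gmm_cross(*q, *q)))

p = ([0.5, 0.5], [-1.0, 1.0], [0.3, 0.3])   # weights, means, variances
q = ([1.0], [0.5], [0.5])
print("D_CS(p, q):", cs_divergence(p, q))   # zero iff p == q
```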
arXiv Detail & Related papers (2021-01-06T17:36:26Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this information and is not responsible for any consequences of its use.