Sequential Bayesian Experimental Design for Implicit Models via Mutual Information
- URL: http://arxiv.org/abs/2003.09379v1
- Date: Fri, 20 Mar 2020 16:52:10 GMT
- Title: Sequential Bayesian Experimental Design for Implicit Models via Mutual Information
- Authors: Steven Kleinegesse, Christopher Drovandi, Michael U. Gutmann
- Abstract summary: A class of models of particular interest for the natural and medical sciences are implicit models.
We devise a novel sequential design framework for parameter estimation that uses the Mutual Information (MI) between model parameters and simulated data as a utility function.
We find that our framework is efficient for the various implicit models tested, yielding accurate parameter estimates after only a few iterations.
- Score: 12.68659360172393
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Bayesian experimental design (BED) is a framework that uses statistical
models and decision making under uncertainty to optimise the cost and
performance of a scientific experiment. Sequential BED, as opposed to static
BED, considers the scenario where we can sequentially update our beliefs about
the model parameters through data gathered in the experiment. A class of models
of particular interest for the natural and medical sciences are implicit
models, where the data generating distribution is intractable, but sampling
from it is possible. Even though there has been a lot of work on static BED for
implicit models in the past few years, the notoriously difficult problem of
sequential BED for implicit models has barely been touched upon. We address
this gap in the literature by devising a novel sequential design framework for
parameter estimation that uses the Mutual Information (MI) between model
parameters and simulated data as a utility function to find optimal
experimental designs, which has not been done before for implicit models. Our
approach uses likelihood-free inference by ratio estimation to simultaneously
estimate posterior distributions and the MI. During the sequential BED
procedure we utilise Bayesian optimisation to help us optimise the MI utility.
We find that our framework is efficient for the various implicit models tested,
yielding accurate parameter estimates after only a few iterations.
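To make the pipeline concrete, here is a minimal sketch of this kind of sequential loop, under illustrative assumptions: a hypothetical one-dimensional simulator, a logistic classifier with quadratic features as a crude stand-in for the paper's ratio estimator (LFIRE), and a grid search over designs as a stand-in for Bayesian optimisation.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def simulate(theta, d):
    # Hypothetical implicit model: easy to sample from, no tractable likelihood.
    return theta * d + rng.normal()

def features(th, y):
    # Quadratic features let a linear classifier represent the log-ratio
    # of this Gaussian toy model.
    return np.column_stack([th, y, th * y, th**2, y**2])

def fit_ratio(thetas, d):
    # Ratio estimation (stand-in for LFIRE): a classifier separates joint
    # samples (theta, y) from shuffled pairs, so its logit approximates
    # log p(y | theta, d) / p(y | d); averaging the logit over joint samples
    # gives a Monte Carlo estimate of the MI utility U(d) = I(theta; y | d).
    ys = np.array([simulate(t, d) for t in thetas])
    X = np.vstack([features(thetas, ys), features(thetas, rng.permutation(ys))])
    labels = np.r_[np.ones(len(ys)), np.zeros(len(ys))]
    clf = LogisticRegression(max_iter=1000).fit(X, labels)
    mi = clf.decision_function(features(thetas, ys)).mean()
    log_r = lambda th, y: clf.decision_function(features(th, np.full_like(th, y)))
    return mi, log_r

theta_true = 1.5                             # ground truth, unknown to us
particles = rng.normal(0.0, 2.0, size=500)   # samples from the prior
for it in range(4):
    designs = np.linspace(0.1, 5.0, 15)      # grid search stands in for BayesOpt
    utilities = [fit_ratio(particles, d)[0] for d in designs]
    d_star = designs[int(np.argmax(utilities))]
    _, log_r = fit_ratio(particles, d_star)  # refit at the chosen design
    y_obs = simulate(theta_true, d_star)     # run the "experiment"
    w = np.exp(log_r(particles, y_obs))      # posterior weights: prior x ratio
    particles = rng.choice(particles, size=particles.size, p=w / w.sum())
    print(f"iter {it}: d* = {d_star:.2f}, posterior mean = {particles.mean():.2f}")
```

Each iteration reuses the same ratio estimator twice: its average logit scores candidate designs, and its logit at the observed outcome reweights the particle approximation of the posterior.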
Related papers
- SMILE: Zero-Shot Sparse Mixture of Low-Rank Experts Construction From Pre-Trained Foundation Models [85.67096251281191]
We present an innovative approach to model fusion called zero-shot Sparse MIxture of Low-rank Experts (SMILE) construction.
SMILE allows for the upscaling of source models into an MoE model without extra data or further training.
We conduct extensive experiments across diverse scenarios, such as image classification and text generation tasks, using full fine-tuning and LoRA fine-tuning.
arXiv Detail & Related papers (2024-08-19T17:32:15Z)
- Diffusion posterior sampling for simulation-based inference in tall data settings [53.17563688225137]
Simulation-based inference (SBI) is capable of approximating the posterior distribution that relates input parameters to a given observation.
In this work, we consider a tall data extension in which multiple observations are available to better infer the parameters of the model.
We compare our method to recently proposed competing approaches on various numerical experiments and demonstrate its superiority in terms of numerical stability and computational cost.
arXiv Detail & Related papers (2024-04-11T09:23:36Z)
- On the Effectiveness of Parameter-Efficient Fine-Tuning [79.6302606855302]
Many recent works propose fine-tuning only a small portion of the parameters while keeping most of the parameters shared across different tasks.
We show that all of these methods are in fact sparse fine-tuned models and conduct a novel theoretical analysis of them.
Despite the effectiveness of sparsity grounded by our theory, how to choose the tunable parameters remains an open problem.
arXiv Detail & Related papers (2022-11-28T17:41:48Z)
- Correcting Model Bias with Sparse Implicit Processes [0.9187159782788579]
We show that Sparse Implicit Processes (SIP) is capable of correcting model bias when the data generating mechanism differs strongly from the one implied by the model.
We use synthetic datasets to show that SIP is capable of providing predictive distributions that reflect the data better than the exact predictions of the initial, wrongly assumed model.
arXiv Detail & Related papers (2022-07-21T18:00:01Z)
- Model Comparison in Approximate Bayesian Computation [0.456877715768796]
A common problem in the natural sciences is the comparison of competing models in the light of observed data.
The standard Bayesian framework for this task relies on the calculation of likelihood functions, which are intractable for most models used in practice.
I propose a new efficient method to perform Bayesian model comparison in ABC.
arXiv Detail & Related papers (2022-03-15T10:24:16Z)
- MINIMALIST: Mutual INformatIon Maximization for Amortized Likelihood Inference from Sampled Trajectories [61.3299263929289]
Simulation-based inference enables learning the parameters of a model even when its likelihood cannot be computed in practice.
One class of methods uses data simulated with different parameters to infer an amortized estimator for the likelihood-to-evidence ratio.
We show that this approach can be formulated in terms of mutual information between model parameters and simulated data.
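As a rough illustration of that idea (a sketch, not the paper's method), the snippet below trains a single amortized classifier on joint versus shuffled (theta, y) pairs from a hypothetical Gaussian simulator; its logit approximates the likelihood-to-evidence ratio for any observation, and averaging it over joint samples gives a mutual-information estimate.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Hypothetical simulator: theta ~ N(0, 1), y | theta ~ N(theta, 0.5^2)
theta = rng.normal(size=5000)
y = theta + 0.5 * rng.normal(size=5000)

# Amortized ratio estimator: classify joint pairs against shuffled pairs.
X = np.column_stack([np.r_[theta, theta], np.r_[y, rng.permutation(y)]])
labels = np.r_[np.ones(5000), np.zeros(5000)]
clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500).fit(X, labels)

def log_ratio(th, y_obs):
    # Classifier logit approximates log p(y | theta) / p(y), reusable
    # for any new observation without retraining.
    p = clf.predict_proba(np.column_stack([th, np.full_like(th, y_obs)]))[:, 1]
    return np.log(p) - np.log1p(-p)

# Monte Carlo MI estimate: expected log-ratio under the joint distribution.
p_joint = clf.predict_proba(X[:5000])[:, 1]
mi = (np.log(p_joint) - np.log1p(-p_joint)).mean()
print(f"MI estimate: {mi:.2f}  (analytic value is 0.5 * ln 5, about 0.80)")

# Amortized posterior at a new observation: prior density times ratio.
grid = np.linspace(-3, 3, 61)
post = np.exp(-0.5 * grid**2 + log_ratio(grid, y_obs=1.0))
post /= post.sum()
```

Because the classifier is amortized over all (theta, y) pairs, evaluating the posterior at a new observation needs no further simulation or training.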
arXiv Detail & Related papers (2021-06-03T12:59:16Z)
- Gradient-based Bayesian Experimental Design for Implicit Models using Mutual Information Lower Bounds [20.393359858407162]
We introduce a framework for Bayesian experimental design (BED) with implicit models, where the data-generating distribution is intractable but sampling from it is still possible.
In order to find optimal experimental designs for such models, our approach maximises mutual information lower bounds that are parametrised by neural networks.
By training a neural network on sampled data, we simultaneously update the network parameters and the designs using gradient ascent.
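A rough sketch of that recipe (not the paper's implementation): the snippet below maximises the NWJ lower bound on MI for a hypothetical differentiable simulator, updating the critic network and the design jointly by gradient ascent; network size, learning rates, and the design domain are illustrative assumptions.

```python
import torch

torch.manual_seed(0)

# Critic T(d, theta, y): a small network whose outputs define an NWJ
# lower bound on I(theta; y | d).
critic = torch.nn.Sequential(
    torch.nn.Linear(3, 64), torch.nn.ReLU(), torch.nn.Linear(64, 1))
d = torch.tensor([1.0], requires_grad=True)      # the experimental design
opt = torch.optim.Adam([{"params": critic.parameters(), "lr": 1e-3},
                        {"params": [d], "lr": 1e-2}])

for step in range(2000):
    theta = 2.0 * torch.randn(256, 1)            # prior samples
    y = theta * d + torch.randn(256, 1)          # reparametrised simulator:
    dd = d.expand(256, 1)                        # y is differentiable in d
    t_joint = critic(torch.cat([dd, theta, y], dim=1))
    t_marg = critic(torch.cat([dd, theta[torch.randperm(256)], y], dim=1))
    mi_lb = t_joint.mean() - torch.exp(t_marg - 1.0).mean()  # NWJ bound
    opt.zero_grad()
    (-mi_lb).backward()                          # ascend the bound in both
    opt.step()                                   # the critic and the design
    with torch.no_grad():
        d.clamp_(0.0, 10.0)                      # keep d in the design domain

print(f"design after optimisation: {d.item():.2f}, MI bound: {mi_lb.item():.2f}")
```

In this toy simulator the MI grows with |d|, so the design drifts to the boundary of its domain; in realistic simulators the noise structure typically yields an interior optimum.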
arXiv Detail & Related papers (2021-05-10T13:59:25Z)
- Efficient Ensemble Model Generation for Uncertainty Estimation with Bayesian Approximation in Segmentation [74.06904875527556]
We propose a generic and efficient segmentation framework to construct ensemble segmentation models.
In the proposed method, ensemble models can be efficiently generated by using the layer selection method.
We also devise a new pixel-wise uncertainty loss, which improves the predictive performance.
arXiv Detail & Related papers (2020-05-21T16:08:38Z)
- Amortized Bayesian model comparison with evidential deep learning [0.12314765641075436]
We propose a novel method for performing Bayesian model comparison using specialized deep learning architectures.
Our method is purely simulation-based and circumvents the step of explicitly fitting all alternative models under consideration to each observed dataset.
We show that our method achieves excellent results in terms of accuracy, calibration, and efficiency across the examples considered in this work.
arXiv Detail & Related papers (2020-04-22T15:15:46Z)
- Nonparametric Estimation in the Dynamic Bradley-Terry Model [69.70604365861121]
We develop a novel estimator that relies on kernel smoothing to pre-process the pairwise comparisons over time.
We derive time-varying oracle bounds for both the estimation error and the excess risk in the model-agnostic setting.
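A minimal sketch of this two-stage idea, under illustrative assumptions (synthetic win counts, a Gaussian kernel, and the classical Bradley-Terry minorise-maximise updates; the paper's actual estimator and theory are more refined):

```python
import numpy as np

rng = np.random.default_rng(1)
T, K = 50, 4
# Synthetic time-varying skills; wins[t, i, j] counts how often item i
# beat item j at time t (10 comparisons per pair per time point).
drift = np.array([1.0, -1.0, 0.5, 0.0])
skill = np.linspace(0.0, 1.0, K)[None, :] + 0.02 * np.arange(T)[:, None] * drift
p_win = 1.0 / (1.0 + np.exp(skill[:, None, :] - skill[:, :, None]))
wins = rng.binomial(10, p_win) * (1 - np.eye(K, dtype=int))

def smoothed_bt(wins, t0, h=5.0, iters=200):
    # Kernel-smooth the comparison counts around time t0, then run the
    # standard Bradley-Terry MM updates on the smoothed counts.
    times = np.arange(wins.shape[0])
    kern = np.exp(-0.5 * ((times - t0) / h) ** 2)
    W = np.tensordot(kern, wins, axes=1)         # smoothed win matrix
    N = W + W.T                                  # smoothed comparison counts
    pi = np.ones(wins.shape[1])
    for _ in range(iters):
        pi = W.sum(axis=1) / (N / (pi[:, None] + pi[None, :])).sum(axis=1)
        pi = pi / pi.sum()
    return pi

print(smoothed_bt(wins, t0=25))                  # skill estimates at time 25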
arXiv Detail & Related papers (2020-02-28T21:52:49Z)
- Bayesian Experimental Design for Implicit Models by Mutual Information Neural Estimation [16.844481439960663]
Implicit models, where the data-generation distribution is intractable but sampling is possible, are ubiquitous in the natural sciences.
A fundamental question is how to design experiments so that the collected data are most useful.
In the BED framework this amounts to maximising the mutual information (MI) between parameters and data; for implicit models, however, this approach is severely hampered by the high computational cost of computing posteriors.
We show that training a neural network to maximise a lower bound on MI allows us to jointly determine the optimal design and the posterior.
arXiv Detail & Related papers (2020-02-19T12:09:42Z)