Efficient identification of informative features in simulation-based
inference
- URL: http://arxiv.org/abs/2210.11915v1
- Date: Fri, 21 Oct 2022 12:35:46 GMT
- Title: Efficient identification of informative features in simulation-based
inference
- Authors: Jonas Beck, Michael Deistler, Yves Bernaerts, Jakob Macke, Philipp
Berens
- Abstract summary: We show that one can marginalize the trained surrogate likelihood post-hoc before inferring the posterior to assess the contribution of a feature.
We demonstrate the usefulness of our method by identifying the most important features for inferring parameters of an example HH neuron model.
- Score: 5.538076164981993
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Simulation-based Bayesian inference (SBI) can be used to estimate the
parameters of complex mechanistic models given observed model outputs without
requiring access to explicit likelihood evaluations. A prime example for the
application of SBI in neuroscience involves estimating the parameters governing
the response dynamics of Hodgkin-Huxley (HH) models from electrophysiological
measurements, by inferring a posterior over the parameters that is consistent
with a set of observations. To this end, many SBI methods employ a set of
summary statistics or scientifically interpretable features to estimate a
surrogate likelihood or posterior. However, currently, there is no way to
identify how much each summary statistic or feature contributes to reducing
posterior uncertainty. To address this challenge, one could simply compare the
posteriors with and without a given feature included in the inference process.
However, for large or nested feature sets, this would necessitate repeatedly
estimating the posterior, which is computationally expensive or even
prohibitive. Here, we provide a more efficient approach based on the SBI method
neural likelihood estimation (NLE): We show that one can marginalize the
trained surrogate likelihood post-hoc before inferring the posterior to assess
the contribution of a feature. We demonstrate the usefulness of our method by
identifying the most important features for inferring parameters of an example
HH neuron model. Beyond neuroscience, our method is generally applicable to
SBI workflows in other scientific fields that rely on data features for
inference.
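The post-hoc marginalization idea can be illustrated with a toy conjugate-Gaussian stand-in for the trained surrogate likelihood (a minimal numpy sketch under assumed toy distributions, not the authors' flow-based NLE implementation): dropping a feature's dimension from the likelihood before computing the posterior shows how much that feature tightens the posterior.

```python
import numpy as np

# Hypothetical toy setup (not the paper's HH model): scalar parameter theta
# with a standard-normal prior, and two conditionally independent Gaussian
# features x_i | theta ~ N(theta, sigma_i^2). The "trained surrogate
# likelihood" is taken to be this exact Gaussian for illustration.
prior_var = 1.0
sigma = np.array([0.2, 2.0])   # feature 1 is precise, feature 2 is noisy
x_obs = np.array([0.5, 0.4])   # observed feature values

def gaussian_posterior(which):
    """Conjugate posterior over theta using only the listed features.

    Dropping a feature here is the analytic analogue of marginalising the
    surrogate likelihood post-hoc before inferring the posterior."""
    prec = 1.0 / prior_var + np.sum(1.0 / sigma[which] ** 2)
    mean = np.sum(x_obs[which] / sigma[which] ** 2) / prec
    return mean, 1.0 / prec  # posterior mean and variance

m_full, v_full = gaussian_posterior([0, 1])
m_no2, v_no2 = gaussian_posterior([0])   # feature 2 marginalised out
m_no1, v_no1 = gaussian_posterior([1])   # feature 1 marginalised out

# Removing the precise feature widens the posterior far more than removing
# the noisy one, quantifying each feature's contribution.
print(v_full, v_no2, v_no1)
```

In this toy case the posterior variance barely changes when the noisy feature is dropped, but grows by a factor of about twenty when the precise feature is dropped.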
Related papers
- Preconditioned Neural Posterior Estimation for Likelihood-free Inference [5.651060979874024]
We show in this paper that neural posterior estimation (NPE) methods are not guaranteed to be highly accurate, even on problems with low dimension.
We propose preconditioned NPE and its sequential version (PSNPE), which uses a short run of ABC to effectively eliminate regions of parameter space that produce large discrepancy between simulations and data.
We present comprehensive empirical evidence that this melding of neural and statistical SBI methods improves performance over a range of examples.
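The preconditioning step can be sketched as follows (a toy numpy illustration with a hypothetical one-parameter simulator, not the paper's code): a short rejection-ABC run discards parameter regions whose simulations are far from the data, so the subsequent neural estimator only trains where it matters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative stand-in for the preconditioning idea: a short rejection-ABC
# run prunes parameter regions that produce large discrepancy between
# simulations and data, before any neural training takes place.
def simulator(theta):
    return theta + rng.normal(0.0, 0.1, size=theta.shape)

x_obs = 1.3
theta = rng.uniform(-5.0, 5.0, size=5000)   # draws from a broad prior
x_sim = simulator(theta)
keep = np.abs(x_sim - x_obs) < 0.5          # ABC acceptance, tolerance 0.5

# The surviving parameters define a much narrower region on which a neural
# posterior estimator would then be trained.
lo, hi = theta[keep].min(), theta[keep].max()
print(lo, hi, keep.mean())
```

Here the accepted parameters span roughly a tenth of the original prior range, which is the "effective elimination" of implausible regions the summary describes.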
arXiv Detail & Related papers (2024-04-21T07:05:38Z)
- Diffusion posterior sampling for simulation-based inference in tall data settings [53.17563688225137]
Simulation-based inference (SBI) is capable of approximating the posterior distribution that relates input parameters to a given observation.
In this work, we consider a tall data extension in which multiple observations are available to better infer the parameters of the model.
We compare our method to recently proposed competing approaches on various numerical experiments and demonstrate its superiority in terms of numerical stability and computational cost.
arXiv Detail & Related papers (2024-04-11T09:23:36Z)
- Calibrating Neural Simulation-Based Inference with Differentiable Coverage Probability [50.44439018155837]
We propose to include a calibration term directly into the training objective of the neural model.
By introducing a relaxation of the classical formulation of calibration error we enable end-to-end backpropagation.
It is directly applicable to existing computational pipelines allowing reliable black-box posterior inference.
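The relaxation idea can be illustrated on probability integral transform (PIT) values (a numpy sketch under the assumption that coverage is measured via PIT; the paper's exact loss may differ): the hard coverage indicator has zero gradient everywhere, while a steep sigmoid surrogate is differentiable and can enter a training objective.

```python
import numpy as np

rng = np.random.default_rng(1)

# Sketch (not the paper's objective): empirical coverage at level alpha uses
# a hard indicator 1[PIT < alpha], which blocks backpropagation; replacing
# it with a steep sigmoid gives a differentiable relaxation.
pit = rng.uniform(0.0, 1.0, size=10_000)  # PIT values of a calibrated posterior
alpha = 0.8
tau = 0.01  # relaxation temperature

hard = np.mean(pit < alpha)
soft = np.mean(1.0 / (1.0 + np.exp(-(alpha - pit) / tau)))

# For small tau, the relaxed coverage approaches the hard empirical coverage,
# so minimising |soft - alpha| end-to-end targets calibration directly.
print(hard, soft)
```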
arXiv Detail & Related papers (2023-10-20T10:20:45Z)
- Misspecification-robust Sequential Neural Likelihood for Simulation-based Inference [0.20971479389679337]
We propose a novel SNL method, which through the incorporation of additional adjustment parameters, is robust to model misspecification.
We demonstrate the efficacy of our approach through several illustrative examples.
arXiv Detail & Related papers (2023-01-31T02:28:18Z)
- Investigating the Impact of Model Misspecification in Neural Simulation-based Inference [1.933681537640272]
We study the behaviour of neural SBI algorithms in the presence of various forms of model misspecification.
We find that misspecification can have a profoundly deleterious effect on performance.
We conclude that new approaches are required to address model misspecification if neural SBI algorithms are to be relied upon to derive accurate conclusions.
arXiv Detail & Related papers (2022-09-05T09:08:16Z)
- Neural Posterior Estimation with Differentiable Simulators [58.720142291102135]
We present a new method to perform Neural Posterior Estimation (NPE) with a differentiable simulator.
We demonstrate how gradient information helps constrain the shape of the posterior and improves sample-efficiency.
arXiv Detail & Related papers (2022-07-12T16:08:04Z)
- Inverting brain grey matter models with likelihood-free inference: a tool for trustable cytoarchitecture measurements [62.997667081978825]
Characterisation of the brain grey matter cytoarchitecture with quantitative sensitivity to soma density and volume remains an unsolved challenge in dMRI.
We propose a new forward model, specifically a new system of equations, requiring a few relatively sparse b-shells.
We then apply modern tools from Bayesian analysis known as likelihood-free inference (LFI) to invert our proposed model.
arXiv Detail & Related papers (2021-11-15T09:08:27Z)
- MINIMALIST: Mutual INformatIon Maximization for Amortized Likelihood Inference from Sampled Trajectories [61.3299263929289]
Simulation-based inference enables learning the parameters of a model even when its likelihood cannot be computed in practice.
One class of methods uses data simulated with different parameters to infer an amortized estimator for the likelihood-to-evidence ratio.
We show that this approach can be formulated in terms of mutual information between model parameters and simulated data.
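The link between classification and the likelihood-to-evidence ratio can be checked analytically in a toy Gaussian model (an illustrative sketch of the underlying density-ratio trick, not the paper's estimator): the Bayes-optimal classifier between joint samples and product-of-marginals samples recovers the ratio exactly.

```python
import numpy as np

# Toy check of the density-ratio trick behind amortized likelihood-to-evidence
# ratio estimation: the optimal classifier d between joint samples (theta, x)
# and product-of-marginals samples satisfies d / (1 - d) = p(x|theta) / p(x).
sigma = 0.5  # likelihood std; the prior is theta ~ N(0, 1)

def normal_pdf(v, mean, var):
    return np.exp(-0.5 * (v - mean) ** 2 / var) / np.sqrt(2 * np.pi * var)

theta, x = 0.7, 1.2
p_joint = normal_pdf(theta, 0.0, 1.0) * normal_pdf(x, theta, sigma**2)
p_prod = normal_pdf(theta, 0.0, 1.0) * normal_pdf(x, 0.0, 1.0 + sigma**2)

d = p_joint / (p_joint + p_prod)   # Bayes-optimal classifier output
ratio_from_d = d / (1.0 - d)       # recovered likelihood-to-evidence ratio
ratio_exact = normal_pdf(x, theta, sigma**2) / normal_pdf(x, 0.0, 1.0 + sigma**2)
print(ratio_from_d, ratio_exact)
```

In practice a neural classifier is trained on simulated pairs to approximate d, and the recovered ratio amortizes likelihood evaluation across observations.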
arXiv Detail & Related papers (2021-06-03T12:59:16Z)
- Leveraging Global Parameters for Flow-based Neural Posterior Estimation [90.21090932619695]
Inferring the parameters of a model based on experimental observations is central to the scientific method.
A particularly challenging setting is when the model is strongly indeterminate, i.e., when distinct sets of parameters yield identical observations.
We present a method for cracking such indeterminacy by exploiting additional information conveyed by an auxiliary set of observations sharing global parameters.
arXiv Detail & Related papers (2021-02-12T12:23:13Z)
- SBI -- A toolkit for simulation-based inference [0.0]
Simulation-based inference (SBI) seeks to identify parameter sets that a) are compatible with prior knowledge and b) match empirical observations.
We present sbi, a PyTorch-based package that implements SBI algorithms based on neural networks.
arXiv Detail & Related papers (2020-07-17T16:53:51Z)