Truncated Marginal Neural Ratio Estimation
- URL: http://arxiv.org/abs/2107.01214v1
- Date: Fri, 2 Jul 2021 18:00:03 GMT
- Title: Truncated Marginal Neural Ratio Estimation
- Authors: Benjamin Kurt Miller, Alex Cole, Patrick Forré, Gilles Louppe,
Christoph Weniger
- Abstract summary: We present a neural simulator-based inference algorithm which simultaneously offers simulation efficiency and fast empirical posterior testability.
Our approach is simulation efficient by simultaneously estimating low-dimensional marginal posteriors instead of the joint posterior.
By estimating a locally amortized posterior our algorithm enables efficient empirical tests of the robustness of the inference results.
- Score: 5.438798591410838
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Parametric stochastic simulators are ubiquitous in science, often featuring
high-dimensional input parameters and/or an intractable likelihood. Performing
Bayesian parameter inference in this context can be challenging. We present a
neural simulator-based inference algorithm which simultaneously offers
simulation efficiency and fast empirical posterior testability, which is unique
among modern algorithms. Our approach is simulation efficient by simultaneously
estimating low-dimensional marginal posteriors instead of the joint posterior
and by proposing simulations targeted to an observation of interest via a prior
suitably truncated by an indicator function. Furthermore, by estimating a
locally amortized posterior our algorithm enables efficient empirical tests of
the robustness of the inference results. Such tests are important for
sanity-checking inference in real-world applications, which do not feature a
known ground truth. We perform experiments on a marginalized version of the
simulation-based inference benchmark and two complex and narrow posteriors,
highlighting the simulator efficiency of our algorithm as well as the quality
of the estimated marginal posteriors. An implementation is available on GitHub.
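The abstract describes two ingredients: classifier-based estimates of low-dimensional (e.g. one-dimensional) marginal likelihood-to-evidence ratios, and a prior truncated by an indicator function so that simulations are targeted at the observation of interest. The sketch below illustrates only the first ingredient under toy assumptions; the simulator, the network architecture, and names such as `toy_simulator` and `MarginalRatioEstimator` are invented for illustration and are not the authors' swyft implementation.

```python
# Minimal sketch of 1D marginal likelihood-to-evidence ratio estimation.
# All names and hyperparameters here are illustrative assumptions; this is
# NOT the authors' swyft implementation.
import torch
import torch.nn as nn

def toy_simulator(theta):
    # Hypothetical stochastic simulator: x = theta + Gaussian noise.
    return theta + 0.1 * torch.randn_like(theta)

class MarginalRatioEstimator(nn.Module):
    """Classifier whose logit approximates log p(theta_k | x) / p(theta_k)."""
    def __init__(self, x_dim, marginal_dim=1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(x_dim + marginal_dim, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, 1),
        )

    def forward(self, x, theta_k):
        return self.net(torch.cat([x, theta_k], dim=-1)).squeeze(-1)

# Simulate a training set from the prior U(-1, 1)^2.
n, x_dim, k = 4096, 2, 0              # k: index of the parameter whose marginal we estimate
theta = torch.rand(n, 2) * 2 - 1
x = toy_simulator(theta)

model = MarginalRatioEstimator(x_dim)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

for epoch in range(200):
    theta_k = theta[:, k:k + 1]
    # Jointly drawn pairs (x, theta_k) get label 1; pairs with theta_k shuffled
    # across the batch (i.e. drawn from p(x) p(theta_k)) get label 0.
    logits_joint = model(x, theta_k)
    logits_marginal = model(x, theta_k[torch.randperm(n)])
    loss = bce(logits_joint, torch.ones(n)) + bce(logits_marginal, torch.zeros(n))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

Evaluating the trained logit on a grid of parameter values for a fixed observation gives the 1D marginal posterior up to the prior; a rough sketch of the outer truncation loop mentioned in the abstract appears after the related-papers list below.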
Related papers
- Accelerated zero-order SGD under high-order smoothness and overparameterized regime [79.85163929026146]
We present a novel gradient-free algorithm to solve convex optimization problems.
Such problems are encountered in medicine, physics, and machine learning.
We provide convergence guarantees for the proposed algorithm under both types of noise.
arXiv Detail & Related papers (2024-11-21T10:26:17Z)
- Eliminating Ratio Bias for Gradient-based Simulated Parameter Estimation [0.7673339435080445]
This article addresses the challenge of parameter calibration in models where the likelihood function is not analytically available.
We propose a gradient-based simulated parameter estimation framework, leveraging a multi-time-scale approach that tackles the issue of ratio bias in both maximum likelihood estimation and posterior density estimation problems.
arXiv Detail & Related papers (2024-11-20T02:46:15Z)
- Compositional simulation-based inference for time series [21.9975782468709]
Simulators frequently emulate real-world dynamics through thousands of single-state transitions over time.
We propose an SBI framework that can exploit such Markovian simulators by locally identifying parameters consistent with individual state transitions.
We then compose these local results to obtain a posterior over parameters that align with the entire time series observation.
arXiv Detail & Related papers (2024-11-05T01:55:07Z)
- A variational neural Bayes framework for inference on intractable posterior distributions [1.0801976288811024]
Posterior distributions of model parameters are efficiently obtained by feeding observed data into a trained neural network.
We show theoretically that our posteriors converge to the true posteriors in Kullback-Leibler divergence.
arXiv Detail & Related papers (2024-04-16T20:40:15Z)
- Diffusion posterior sampling for simulation-based inference in tall data settings [53.17563688225137]
Simulation-based inference (SBI) is capable of approximating the posterior distribution that relates input parameters to a given observation.
In this work, we consider a tall data extension in which multiple observations are available to better infer the parameters of the model.
We compare our method to recently proposed competing approaches on various numerical experiments and demonstrate its superiority in terms of numerical stability and computational cost.
arXiv Detail & Related papers (2024-04-11T09:23:36Z)
- Calibrating Neural Simulation-Based Inference with Differentiable Coverage Probability [50.44439018155837]
We propose to include a calibration term directly into the training objective of the neural model.
By introducing a relaxation of the classical formulation of calibration error we enable end-to-end backpropagation.
It is directly applicable to existing computational pipelines allowing reliable black-box posterior inference.
arXiv Detail & Related papers (2023-10-20T10:20:45Z)
- Learning Unnormalized Statistical Models via Compositional Optimization [73.30514599338407]
Noise-contrastive estimation (NCE) was proposed by formulating the objective as the logistic loss between real data and artificial noise.
In this paper, we study a direct approach for optimizing the negative log-likelihood of unnormalized models.
arXiv Detail & Related papers (2023-06-13T01:18:16Z)
- Promises and Pitfalls of the Linearized Laplace in Bayesian Optimization [73.80101701431103]
The linearized-Laplace approximation (LLA) has been shown to be effective and efficient in constructing Bayesian neural networks.
We study the usefulness of the LLA in Bayesian optimization and highlight its strong performance and flexibility.
arXiv Detail & Related papers (2023-04-17T14:23:43Z)
- Neural Posterior Estimation with Differentiable Simulators [58.720142291102135]
We present a new method to perform Neural Posterior Estimation (NPE) with a differentiable simulator.
We demonstrate how gradient information helps constrain the shape of the posterior and improves sample-efficiency.
arXiv Detail & Related papers (2022-07-12T16:08:04Z)
- Simulation-efficient marginal posterior estimation with swyft: stop wasting your precious time [5.533353383316288]
We present algorithms for nested neural likelihood-to-evidence ratio estimation and simulation reuse.
Together, these algorithms enable automatic and extremely simulation-efficient estimation of marginal and joint posteriors; a rough sketch of such a nested truncation loop appears after this list.
arXiv Detail & Related papers (2020-11-27T19:00:07Z)
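The abstract's second ingredient, targeting simulations via a prior truncated by an indicator function, together with the nested scheme mentioned in the swyft entry above, can be pictured as an outer loop over truncation rounds. The sketch below only conveys that loop structure: `estimate_marginal` is a deliberately crude stand-in (a weighted histogram) for a trained neural ratio estimator, and the simulator, thresholds, and names are assumptions for illustration, not the algorithm as published.

```python
# Rough sketch of a nested truncation loop: simulate from the current
# truncated prior, estimate each 1D marginal, and shrink the per-parameter
# bounds to the region where that marginal exceeds a small threshold.
# estimate_marginal() is a crude stand-in for a trained neural ratio
# estimator; every name and constant here is an illustrative assumption.
import numpy as np

rng = np.random.default_rng(0)
x_obs = np.array([0.3, -0.2])          # the observation of interest

def simulate(theta):
    # Hypothetical stochastic simulator: x = theta + Gaussian noise.
    return theta + 0.1 * rng.normal(size=theta.shape)

def estimate_marginal(theta_k, x, x_obs, bins=40):
    # Weight each sample by how close its simulation is to the observation,
    # then histogram parameter k; a neural ratio estimator would replace this.
    w = np.exp(-0.5 * np.sum((x - x_obs) ** 2, axis=1) / 0.1 ** 2)
    density, edges = np.histogram(theta_k, bins=bins, weights=w, density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    return centers, density

bounds = np.array([[-1.0, 1.0], [-1.0, 1.0]])  # initial prior box
epsilon = 1e-2                                  # truncation threshold

for round_idx in range(3):
    # (1) Simulate from the prior restricted to the current box.
    theta = rng.uniform(bounds[:, 0], bounds[:, 1], size=(2000, 2))
    x = simulate(theta)
    for k in range(2):
        # (2) Estimate the 1D marginal of parameter k.
        centers, density = estimate_marginal(theta[:, k], x, x_obs)
        # (3) Indicator-function truncation: keep the region where the
        # estimated marginal exceeds epsilon times its maximum.
        kept = centers[density > epsilon * density.max()]
        bounds[k] = [kept.min(), kept.max()]
    print(f"round {round_idx}: bounds = {bounds.tolist()}")
```

Roughly speaking, in the paper's method the crude histogram above would be replaced by the neural ratio estimators sketched earlier, and the locally amortized marginal posteriors from the final round support the empirical robustness checks described in the abstract.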