Arbitrary Marginal Neural Ratio Estimation for Simulation-based
Inference
- URL: http://arxiv.org/abs/2110.00449v1
- Date: Fri, 1 Oct 2021 14:35:46 GMT
- Title: Arbitrary Marginal Neural Ratio Estimation for Simulation-based
Inference
- Authors: François Rozet and Gilles Louppe
- Abstract summary: We present a novel method that enables amortized inference over arbitrary subsets of the parameters, without resorting to numerical integration.
We demonstrate the applicability of the method on parameter inference of binary black hole systems from gravitational-wave observations.
- Score: 7.888755225607877
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In many areas of science, complex phenomena are modeled by stochastic
parametric simulators, often featuring high-dimensional parameter spaces and
intractable likelihoods. In this context, performing Bayesian inference can be
challenging. In this work, we present a novel method that enables amortized
inference over arbitrary subsets of the parameters, without resorting to
numerical integration, which makes interpretation of the posterior more
convenient. Our method is efficient and can be implemented with arbitrary
neural network architectures. We demonstrate the applicability of the method on
parameter inference of binary black hole systems from gravitational-wave
observations.
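To make the idea concrete, the following PyTorch sketch shows one way an amortized, mask-conditioned ratio estimator could be set up: a single classifier receives the observation, a random binary mask selecting a parameter subset, and the masked parameters, so that its logit approximates the log likelihood-to-evidence ratio for that subset. The architecture, mask distribution, and training loop are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class MaskedRatioEstimator(nn.Module):
    """Classifier d(theta * b, b, x) whose logit approximates the log
    likelihood-to-evidence ratio of the parameter subset selected by b."""

    def __init__(self, theta_dim: int, x_dim: int, hidden: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2 * theta_dim + x_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, theta, mask, x):
        # Zero out the parameters outside the subset and feed the mask as well,
        # so one network can serve every marginal.
        return self.net(torch.cat([theta * mask, mask, x], dim=-1)).squeeze(-1)


def training_step(estimator, optimizer, theta, x):
    """One update on a batch of (theta, x) pairs drawn from the simulator's joint."""
    # Random binary mask per sample: which parameter subset this example trains.
    mask = (torch.rand_like(theta) < 0.5).to(theta.dtype)
    # Shuffling theta across the batch breaks the pairing, yielding samples
    # from the product of marginals p(theta) p(x).
    theta_shuffled = theta[torch.randperm(theta.shape[0])]
    logits_joint = estimator(theta, mask, x)
    logits_indep = estimator(theta_shuffled, mask, x)
    # Binary cross-entropy: joint pairs are the positive class.
    loss = F.binary_cross_entropy_with_logits(logits_joint, torch.ones_like(logits_joint)) \
         + F.binary_cross_entropy_with_logits(logits_indep, torch.zeros_like(logits_indep))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

Because the mask is an input, a single trained network can be queried at inference time with a mask selecting only the parameters of interest, which is what avoids integrating out the remaining parameters numerically.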
Related papers
- Nonparametric estimation of Hawkes processes with RKHSs [1.775610745277615]
This paper addresses nonparametric estimation of nonlinear Hawkes processes, where the interaction functions are assumed to lie in a reproducing kernel Hilbert space (RKHS).
Motivated by applications in neuroscience, the model allows complex interaction functions, in order to express exciting and inhibiting effects, but also a combination of both.
The method is shown to achieve better performance than related nonparametric estimation techniques and to suit neuronal applications.
arXiv Detail & Related papers (2024-11-01T14:26:50Z)
- Physics and geometry informed neural operator network with application to acoustic scattering [0.0]
We propose a physics-informed deep operator network (DeepONet) capable of predicting the scattered pressure field for arbitrarily shaped scatterers.
Our trained model learns a solution operator that approximates a physically consistent scattered pressure field in just a few seconds.
arXiv Detail & Related papers (2024-06-02T03:41:52Z)
- Diffusion posterior sampling for simulation-based inference in tall data settings [53.17563688225137]
Simulation-based inference (SBI) approximates the posterior distribution that relates input parameters to a given observation.
In this work, we consider a tall data extension in which multiple observations are available to better infer the parameters of the model.
We compare our method to recently proposed competing approaches on various numerical experiments and demonstrate its superiority in terms of numerical stability and computational cost.
arXiv Detail & Related papers (2024-04-11T09:23:36Z)
- Simulation-based inference using surjective sequential neural likelihood estimation [50.24983453990065]
Surjective Sequential Neural Likelihood (SSNL) estimation is a novel method for simulation-based inference.
By embedding the data in a low-dimensional space, SSNL solves several issues previous likelihood-based methods had when applied to high-dimensional data sets.
arXiv Detail & Related papers (2023-08-02T10:02:38Z)
- Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which can then be applied in real time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z)
- Inverting brain grey matter models with likelihood-free inference: a tool for trustable cytoarchitecture measurements [62.997667081978825]
Characterising the brain grey matter cytoarchitecture with quantitative sensitivity to soma density and volume remains an unsolved challenge in diffusion MRI (dMRI).
We propose a new forward model, specifically a new system of equations, requiring a few relatively sparse b-shells.
We then apply modern tools from Bayesian analysis known as likelihood-free inference (LFI) to invert our proposed model.
arXiv Detail & Related papers (2021-11-15T09:08:27Z)
- Probabilistic Inference of Simulation Parameters via Parallel Differentiable Simulation [34.30381620584878]
To accurately reproduce measurements from the real world, simulators need to have an adequate model of the physical system.
We address the problem of estimating the simulation parameters through a Bayesian inference approach.
We leverage GPU code generation and differentiable simulation to evaluate the likelihood and its gradient for many particles in parallel.
arXiv Detail & Related papers (2021-09-18T03:05:44Z)
- MINIMALIST: Mutual INformatIon Maximization for Amortized Likelihood Inference from Sampled Trajectories [61.3299263929289]
Simulation-based inference enables learning the parameters of a model even when its likelihood cannot be computed in practice.
One class of methods uses data simulated with different parameters to infer an amortized estimator for the likelihood-to-evidence ratio.
We show that this approach can be formulated in terms of mutual information between the model parameters and the simulated data; a generic sketch of this view appears after this list.
arXiv Detail & Related papers (2021-06-03T12:59:16Z)
- Leveraging Global Parameters for Flow-based Neural Posterior Estimation [90.21090932619695]
Inferring the parameters of a model based on experimental observations is central to the scientific method.
A particularly challenging setting is when the model is strongly indeterminate, i.e., when distinct sets of parameters yield identical observations.
We present a method for cracking such indeterminacy by exploiting additional information conveyed by an auxiliary set of observations sharing global parameters.
arXiv Detail & Related papers (2021-02-12T12:23:13Z)
- Resampling with neural networks for stochastic parameterization in multiscale systems [0.0]
We present a machine-learning method for the conditional resampling of observations or reference data from a fully resolved simulation.
It is based on the probabilistic classification of subsets of reference data, conditioned on macroscopic variables.
We validate our approach on the Lorenz 96 system, using two different parameter settings.
arXiv Detail & Related papers (2020-04-03T10:09:18Z)
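As a companion to the training sketch above, the snippet below illustrates the mutual-information view referenced in the MINIMALIST entry: for an exact likelihood-to-evidence ratio, the expectation of the log ratio under the joint distribution of parameters and data equals their mutual information, so averaging a learned log ratio over simulated pairs gives a rough estimate. This is a generic illustration, not that paper's specific objective; `log_ratio_fn` is a hypothetical callable standing in for any trained ratio estimator.

```python
import torch
from typing import Callable


def estimate_mutual_information(
    log_ratio_fn: Callable[[torch.Tensor, torch.Tensor], torch.Tensor],
    theta: torch.Tensor,
    x: torch.Tensor,
) -> float:
    """Monte-Carlo estimate of E_{p(theta, x)}[log r(theta, x)].

    If log_ratio_fn were the exact log p(theta, x) / (p(theta) p(x)), this
    expectation would equal the mutual information I(theta; x); with a
    learned classifier it is only an approximation.
    """
    with torch.no_grad():
        return log_ratio_fn(theta, x).mean().item()


# Hypothetical usage with paired samples from a simulator, reusing the
# mask-conditioned estimator sketched earlier (full mask = all parameters):
# theta, x = simulate(batch_size)  # placeholder simulator call
# mi = estimate_mutual_information(
#     lambda t, o: ratio_net(t, torch.ones_like(t), o), theta, x)
```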