Leveraging Global Parameters for Flow-based Neural Posterior Estimation
- URL: http://arxiv.org/abs/2102.06477v1
- Date: Fri, 12 Feb 2021 12:23:13 GMT
- Title: Leveraging Global Parameters for Flow-based Neural Posterior Estimation
- Authors: Pedro L. C. Rodrigues, Thomas Moreau, Gilles Louppe, Alexandre
Gramfort
- Abstract summary: Inferring the parameters of a model based on experimental observations is central to the scientific method.
A particularly challenging setting is when the model is strongly indeterminate, i.e., when distinct sets of parameters yield identical observations.
We present a method for cracking such indeterminacy by exploiting additional information conveyed by an auxiliary set of observations sharing global parameters.
- Score: 90.21090932619695
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Inferring the parameters of a stochastic model based on experimental
observations is central to the scientific method. A particularly challenging
setting is when the model is strongly indeterminate, i.e., when distinct sets
of parameters yield identical observations. This arises in many practical
situations, such as when inferring the distance and power of a radio source (is
the source close and weak or far and strong?) or when estimating the amplifier
gain and underlying brain activity of an electrophysiological experiment. In
this work, we present a method for cracking such indeterminacy by exploiting
additional information conveyed by an auxiliary set of observations sharing
global parameters. Our method extends recent developments in simulation-based
inference (SBI) based on normalizing flows to Bayesian hierarchical models. We
quantitatively validate our proposal on a motivating example amenable to
analytical solutions, and then apply it to invert a well-known non-linear model
from computational neuroscience.
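The indeterminacy described in the abstract can be illustrated with a small numerical sketch. The simulator below is a hypothetical toy (not the paper's actual model): the observation depends only on the product of a local "power" and a global "gain", so distinct parameter pairs produce identical data, while an auxiliary observation sharing the global gain breaks the tie.

```python
import numpy as np

def simulate(power, gain, rng, n=1000, noise=0.1):
    # Toy simulator: the signal depends only on the product power * gain,
    # so (power, gain) is identifiable only up to that product.
    return power * gain + noise * rng.standard_normal(n)

# Two distinct parameter sets yield identical observations
# (same noise realization for a like-for-like comparison).
rng = np.random.default_rng(0)
x_a = simulate(power=2.0, gain=3.0, rng=rng)
rng = np.random.default_rng(0)
x_b = simulate(power=3.0, gain=2.0, rng=rng)
assert np.allclose(x_a, x_b)  # indistinguishable from one observation alone

# An auxiliary observation with known local power but the *shared*
# global gain distinguishes the two hypotheses: its mean differs.
rng = np.random.default_rng(1)
aux_a = simulate(power=1.0, gain=3.0, rng=rng)  # mean near 3 if gain = 3
rng = np.random.default_rng(1)
aux_b = simulate(power=1.0, gain=2.0, rng=rng)  # mean near 2 if gain = 2
assert abs(aux_a.mean() - aux_b.mean()) > 0.5
```

This is only the identifiability argument in miniature; the paper's actual contribution is a flow-based neural posterior estimator that exploits such auxiliary observations within a Bayesian hierarchical model.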
Related papers
- A Likelihood Based Approach to Distribution Regression Using Conditional Deep Generative Models [6.647819824559201]
We study the large-sample properties of a likelihood-based approach for estimating conditional deep generative models.
Our results lead to the convergence rate of a sieve maximum likelihood estimator for estimating the conditional distribution.
arXiv Detail & Related papers (2024-10-02T20:46:21Z)
- Estimation of spatio-temporal extremes via generative neural networks [0.0]
We provide a unified approach for analyzing spatial extremes with little available data.
By employing recent developments in generative neural networks we predict a full sample-based distribution.
We validate our method by fitting several simulated max-stable processes, showing a high accuracy of the approach.
arXiv Detail & Related papers (2024-07-11T16:57:17Z)
- Diffusion posterior sampling for simulation-based inference in tall data settings [53.17563688225137]
Simulation-based inference (SBI) is capable of approximating the posterior distribution that relates input parameters to a given observation.
In this work, we consider a tall data extension in which multiple observations are available to better infer the parameters of the model.
We compare our method to recently proposed competing approaches on various numerical experiments and demonstrate its superiority in terms of numerical stability and computational cost.
arXiv Detail & Related papers (2024-04-11T09:23:36Z)
- An information field theory approach to Bayesian state and parameter estimation in dynamical systems [0.0]
This paper develops a scalable Bayesian approach to state and parameter estimation suitable for continuous-time, deterministic dynamical systems.
We construct a physics-informed prior probability measure on the function space of system responses so that functions that satisfy the physics are more likely.
arXiv Detail & Related papers (2023-06-03T16:36:43Z)
- Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z)
- Efficient identification of informative features in simulation-based inference [5.538076164981993]
We show that one can marginalize the trained surrogate likelihood post-hoc before inferring the posterior to assess the contribution of a feature.
We demonstrate the usefulness of our method by identifying the most important features for inferring parameters of an example HH neuron model.
arXiv Detail & Related papers (2022-10-21T12:35:46Z)
- Spherical Poisson Point Process Intensity Function Modeling and Estimation with Measure Transport [0.20305676256390934]
We present a new approach for modeling non-homogeneous Poisson process intensity functions on the sphere.
The central idea of this framework is to build, and estimate, a flexible bijective map that transforms the underlying intensity function of interest on the sphere into a simpler reference intensity function, also on the sphere.
arXiv Detail & Related papers (2022-01-24T06:46:22Z)
- Inverting brain grey matter models with likelihood-free inference: a tool for trustable cytoarchitecture measurements [62.997667081978825]
Characterisation of the brain grey matter cytoarchitecture with quantitative sensitivity to soma density and volume remains an unsolved challenge in dMRI.
We propose a new forward model, specifically a new system of equations, requiring a few relatively sparse b-shells.
We then apply modern tools from Bayesian analysis known as likelihood-free inference (LFI) to invert our proposed model.
arXiv Detail & Related papers (2021-11-15T09:08:27Z)
- Multiplicative noise and heavy tails in stochastic optimization [62.993432503309485]
Empirical optimization is central to modern machine learning, but its role in that success is still unclear.
We show that heavy tails commonly arise in the parameters due to multiplicative noise driven by gradient variance.
We conduct a detailed analysis of key factors, including step size and data, and observe similar results on state-of-the-art neural network models.
arXiv Detail & Related papers (2020-06-11T09:58:01Z)
- SLEIPNIR: Deterministic and Provably Accurate Feature Expansion for Gaussian Process Regression with Derivatives [86.01677297601624]
We propose a novel approach for scaling GP regression with derivatives based on quadrature Fourier features.
We prove deterministic, non-asymptotic and exponentially fast decaying error bounds which apply for both the approximated kernel as well as the approximated posterior.
arXiv Detail & Related papers (2020-03-05T14:33:20Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.