Likelihood-Free Parameter Estimation with Neural Bayes Estimators
- URL: http://arxiv.org/abs/2208.12942v5
- Date: Wed, 4 Oct 2023 06:08:20 GMT
- Title: Likelihood-Free Parameter Estimation with Neural Bayes Estimators
- Authors: Matthew Sainsbury-Dale, Andrew Zammit-Mangion, and Raphaël Huser
- Abstract summary: Neural point estimators are neural networks that map data to parameter point estimates.
We aim to raise awareness of this relatively new inferential tool among statisticians, and to facilitate its adoption by providing user-friendly open-source software.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neural point estimators are neural networks that map data to parameter point estimates.
They are fast, likelihood-free and, due to their amortised nature, amenable to fast bootstrap-based uncertainty quantification.
In this paper, we aim to raise awareness of this relatively new inferential tool among statisticians, and to facilitate its adoption by providing user-friendly open-source software.
We also give attention to the ubiquitous problem of making inference from replicated data, which we address in the neural setting using permutation-invariant neural networks.
Through extensive simulation studies we show that these neural point estimators can quickly and optimally (in a Bayes sense) estimate parameters in weakly identified and highly parameterised models.
We demonstrate their applicability through an analysis of extreme sea-surface temperature in the Red Sea where, after training, we obtain parameter estimates and bootstrap-based confidence intervals from hundreds of spatial fields in a fraction of a second.
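The recipe described in the abstract (simulate parameter-data pairs from the model, pool replicate-level summaries with a permutation-invariant network, and train under an empirical Bayes risk) can be sketched compactly. The following is a minimal illustrative sketch in PyTorch, not the authors' open-source software; the toy model (iid Gaussian replicates), the uniform priors, and the architecture sizes are assumptions made only for illustration.
```python
# Minimal sketch of a permutation-invariant ("DeepSets"-style) neural Bayes
# estimator for replicated data, trained on simulated (parameter, data) pairs
# under squared-error loss. The toy model (iid N(mu, sigma^2) replicates), the
# priors, and the architecture are illustrative assumptions, not the paper's
# exact setup.
import torch
import torch.nn as nn

m = 30  # number of iid replicates per data set

psi = nn.Sequential(nn.Linear(1, 64), nn.ReLU(),
                    nn.Linear(64, 64), nn.ReLU())       # per-replicate summary network
phi = nn.Sequential(nn.Linear(64, 64), nn.ReLU(),
                    nn.Linear(64, 2))                    # maps pooled summary to (mu, log sigma)

def estimator(Z):                           # Z: (batch, m, 1) replicated data
    pooled = psi(Z).mean(dim=1)             # mean-pool over replicates => permutation invariance
    return phi(pooled)

def simulate(batch):
    mu = torch.rand(batch, 1) * 4 - 2              # assumed prior: mu ~ Unif(-2, 2)
    sigma = torch.rand(batch, 1) * 0.9 + 0.1       # assumed prior: sigma ~ Unif(0.1, 1)
    Z = mu.unsqueeze(1) + sigma.unsqueeze(1) * torch.randn(batch, m, 1)
    return Z, torch.cat([mu, sigma.log()], dim=1)

opt = torch.optim.Adam(list(psi.parameters()) + list(phi.parameters()), lr=1e-3)
for step in range(2000):
    Z, theta = simulate(256)                       # fresh simulations each step
    loss = ((estimator(Z) - theta) ** 2).mean()    # Monte Carlo estimate of the Bayes risk
    opt.zero_grad(); loss.backward(); opt.step()

# Amortisation: after training, a point estimate is one forward pass, so a
# parametric bootstrap (simulate at the estimate, re-estimate) is cheap.
with torch.no_grad():
    Z_obs, _ = simulate(1)
    theta_hat = estimator(Z_obs)            # estimates of (mu, log sigma)
```
Mean pooling over the replicate dimension is what makes the estimator permutation-invariant, and because estimation for a new data set is a single forward pass, bootstrap-based uncertainty quantification (re-simulating data at the estimate and re-applying the network) comes almost for free.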
Related papers
- Trade-Offs of Diagonal Fisher Information Matrix Estimators [53.35448232352667]
The Fisher information matrix can be used to characterize the local geometry of the parameter space of neural networks.
We examine two popular estimators whose accuracy and sample complexity depend on their associated variances.
We derive bounds on the variances and instantiate them in neural networks for regression and classification; a minimal sketch of a diagonal Fisher estimate appears after this list.
arXiv Detail & Related papers (2024-02-08T03:29:10Z)
- Neural Bayes Estimators for Irregular Spatial Data using Graph Neural Networks [0.0]
We employ graph neural networks to tackle the problem of parameter point estimation from data collected over arbitrary spatial locations.
In addition to extending neural Bayes estimation to irregular spatial data, our architecture leads to substantial computational benefits.
arXiv Detail & Related papers (2023-10-04T06:13:22Z)
- Neural Bayes estimators for censored inference with peaks-over-threshold models [0.0]
Building on advances in likelihood-free inference with neural Bayes estimators, we develop highly efficient estimators for censored peaks-over-threshold models.
Our new method challenges traditional censored likelihood-based inference methods for spatial extremal dependence models.
arXiv Detail & Related papers (2023-06-27T17:36:14Z)
- Learning to Learn with Generative Models of Neural Network Checkpoints [71.06722933442956]
We construct a dataset of neural network checkpoints and train a generative model on the parameters.
We find that our approach successfully generates parameters for a wide range of loss prompts.
We apply our method to different neural network architectures and tasks in supervised and reinforcement learning.
arXiv Detail & Related papers (2022-09-26T17:59:58Z)
- Can pruning improve certified robustness of neural networks? [106.03070538582222]
We show that neural network pruning can improve the empirical robustness of deep neural networks (NNs).
Our experiments show that, by appropriately pruning an NN, its certified accuracy can be boosted by up to 8.2% under standard training.
We additionally observe the existence of certified lottery tickets that can match both standard and certified robust accuracies of the original dense models.
arXiv Detail & Related papers (2022-06-15T05:48:51Z)
- Deep Impulse Responses: Estimating and Parameterizing Filters with Deep Networks [76.830358429947]
Impulse response estimation in high noise and in-the-wild settings is a challenging problem.
We propose a novel framework for parameterizing and estimating impulse responses based on recent advances in neural representation learning.
arXiv Detail & Related papers (2022-02-07T18:57:23Z)
- Neural Networks for Parameter Estimation in Intractable Models [0.0]
We show how to estimate parameters from max-stable processes, where inference is exceptionally challenging.
We use data from model simulations as input and train deep neural networks to learn statistical parameters.
arXiv Detail & Related papers (2021-07-29T21:59:48Z)
- Real-time gravitational-wave science with neural posterior estimation [64.67121167063696]
We demonstrate unprecedented accuracy for rapid gravitational-wave parameter estimation with deep learning.
We analyze eight gravitational-wave events from the first LIGO-Virgo Gravitational-Wave Transient Catalog.
We find very close quantitative agreement with standard inference codes, but with inference times reduced from O(day) to a minute per event.
arXiv Detail & Related papers (2021-06-23T18:00:05Z)
- The Compact Support Neural Network [6.47243430672461]
We present a neuron generalization that has the standard dot-product-based neuron and the RBF neuron as two extreme cases of a shape parameter.
We show how to avoid difficulties in training a neural network with such neurons, by starting with a trained standard neural network and gradually increasing the shape parameter to the desired value.
arXiv Detail & Related papers (2021-04-01T06:08:09Z)
- Robust and integrative Bayesian neural networks for likelihood-free parameter inference [0.0]
State-of-the-art neural network-based methods for learning summary statistics have delivered promising results for simulation-based likelihood-free parameter inference.
This work proposes a robust integrated approach that learns summary statistics using Bayesian neural networks, and directly estimates the posterior density using categorical distributions.
arXiv Detail & Related papers (2021-02-12T13:45:23Z)
- Unlabelled Data Improves Bayesian Uncertainty Calibration under Covariate Shift [100.52588638477862]
We develop an approximate Bayesian inference scheme based on posterior regularisation.
We demonstrate the utility of our method in the context of transferring prognostic models of prostate cancer across globally diverse populations.
arXiv Detail & Related papers (2020-06-26T13:50:19Z)
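For the "Trade-Offs of Diagonal Fisher Information Matrix Estimators" entry above, the quantity under study can be illustrated with a generic empirical (diagonal) Fisher computation. This is a minimal sketch of the general idea only, under an assumed toy regression model with a unit-variance Gaussian likelihood; it is not one of the two specific estimators analysed in that paper.
```python
# Minimal sketch: diagonal of the empirical Fisher information for a small
# regression network, accumulated as the average of squared per-sample score
# (log-likelihood gradient) entries. The toy data and unit-variance Gaussian
# likelihood are assumptions for illustration only.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(5, 16), nn.Tanh(), nn.Linear(16, 1))
X = torch.randn(200, 5)                                       # toy inputs (assumed)
y = X.sum(dim=1, keepdim=True) + 0.1 * torch.randn(200, 1)    # toy targets (assumed)

fisher_diag = [torch.zeros_like(p) for p in model.parameters()]
for x_i, y_i in zip(X, y):
    model.zero_grad()
    log_lik = -0.5 * (model(x_i) - y_i).pow(2).sum()   # Gaussian log-likelihood up to a constant
    log_lik.backward()
    for buf, p in zip(fisher_diag, model.parameters()):
        buf += p.grad.pow(2)                           # squared score entries for this sample
fisher_diag = [buf / len(X) for buf in fisher_diag]    # average over the data set
```
The "empirical" Fisher used here (gradients evaluated at the observed labels) is only one of several common diagonal estimators; the entry above examines two popular estimators and bounds their variances.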