Learning to discover: expressive Gaussian mixture models for
multi-dimensional simulation and parameter inference in the physical sciences
- URL: http://arxiv.org/abs/2108.11481v1
- Date: Wed, 25 Aug 2021 21:27:29 GMT
- Title: Learning to discover: expressive Gaussian mixture models for
multi-dimensional simulation and parameter inference in the physical sciences
- Authors: Stephen B. Menary and Darren D. Price
- Abstract summary: We show that density models describing multiple observables may be created using an auto-regressive Gaussian mixture model.
The model is designed to capture how observable spectra are deformed by hypothesis variations.
It may be used as a statistical model for scientific discovery in interpreting experimental observations.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We show that density models describing multiple observables with (i) hard
boundaries and (ii) dependence on external parameters may be created using an
auto-regressive Gaussian mixture model. The model is designed to capture how
observable spectra are deformed by hypothesis variations, and is made more
expressive by projecting data onto a configurable latent space. It may be used
as a statistical model for scientific discovery in interpreting experimental
observations, for example when constraining the parameters of a physical model
or tuning simulation parameters according to calibration data. The model may
also be sampled for use within a Monte Carlo simulation chain, or used to
estimate likelihood ratios for event classification. The method is demonstrated
on simulated high-energy particle physics data considering the anomalous
electroweak production of a $Z$ boson in association with a dijet system at the
Large Hadron Collider, and the accuracy of inference is tested using a
realistic toy example. The developed methods are domain agnostic; they may be
used within any field to perform simulation or inference where a dataset
consisting of many real-valued observables has conditional dependence on
external parameters.
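The core idea of the abstract, a density over observables factorised auto-regressively into one-dimensional Gaussian mixture conditionals whose parameters depend on the preceding observables and an external parameter, can be sketched as follows. This is a minimal illustrative toy, not the paper's architecture: the class name, the tiny linear (untrained) parameter map, and the fixed component count are all assumptions made for the example.

```python
import numpy as np

# Sketch of an auto-regressive Gaussian mixture density model:
#   p(x_1, ..., x_D | theta) = prod_d p(x_d | x_{<d}, theta),
# where each conditional is a 1-D Gaussian mixture whose weights, means and
# widths are produced by a function of the context (previous observables plus
# the external hypothesis parameter theta). Here that function is a random,
# untrained linear map, purely for illustration.

rng = np.random.default_rng(0)

class AutoregressiveGMM:
    def __init__(self, dim, n_components=3):
        self.dim = dim
        self.k = n_components
        # One small linear map per dimension: context -> 3k mixture parameters.
        # The context for dimension d is (x_1, ..., x_{d-1}, theta).
        self.weights = [rng.normal(scale=0.1, size=(d + 1, 3 * n_components))
                        for d in range(dim)]

    def _mixture_params(self, d, context):
        raw = context @ self.weights[d]
        logits, means, log_stds = np.split(raw, 3)
        w = np.exp(logits - logits.max())
        w /= w.sum()                           # softmax -> mixture weights
        stds = np.exp(np.clip(log_stds, -5, 5))  # keep widths positive, bounded
        return w, means, stds

    def log_prob(self, x, theta):
        """Joint log-density via the chain rule over dimensions."""
        lp = 0.0
        for d in range(self.dim):
            context = np.concatenate([x[:d], [theta]])
            w, mu, sd = self._mixture_params(d, context)
            comp = -0.5 * ((x[d] - mu) / sd) ** 2 - np.log(sd * np.sqrt(2 * np.pi))
            lp += np.log(np.sum(w * np.exp(comp)))
        return lp

    def sample(self, theta):
        """Ancestral sampling: draw each observable given the previous ones."""
        x = np.zeros(self.dim)
        for d in range(self.dim):
            context = np.concatenate([x[:d], [theta]])
            w, mu, sd = self._mixture_params(d, context)
            j = rng.choice(self.k, p=w)
            x[d] = rng.normal(mu[j], sd[j])
        return x

model = AutoregressiveGMM(dim=2)
x = model.sample(theta=0.5)
print(model.log_prob(x, theta=0.5))
```

Because `theta` enters every conditional, evaluating `log_prob(x, theta)` at different `theta` values yields the likelihood curves needed for parameter inference, and ratios of such densities under two hypotheses give the likelihood ratios mentioned for event classification.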
Related papers
- Diffusion-HMC: Parameter Inference with Diffusion Model driven Hamiltonian Monte Carlo [2.048226951354646]
This work uses a single diffusion generative model to address the interlinked objectives of generating predictions for observed astrophysical fields from theory and constraining physical models from observations using these predictions.
We leverage the approximate likelihood of the diffusion generative model to derive tight constraints on cosmology by using the Hamiltonian Monte Carlo method to sample the posterior on cosmological parameters for a given test image.
arXiv Detail & Related papers (2024-05-08T17:59:03Z)
- Diffusion posterior sampling for simulation-based inference in tall data settings [53.17563688225137]
Simulation-based inference (SBI) is capable of approximating the posterior distribution that relates input parameters to a given observation.
In this work, we consider a tall data extension in which multiple observations are available to better infer the parameters of the model.
We compare our method to recently proposed competing approaches on various numerical experiments and demonstrate its superiority in terms of numerical stability and computational cost.
arXiv Detail & Related papers (2024-04-11T09:23:36Z)
- Simulation-based inference using surjective sequential neural likelihood estimation [50.24983453990065]
Surjective Sequential Neural Likelihood estimation is a novel method for simulation-based inference.
By embedding the data in a low-dimensional space, SSNL solves several issues previous likelihood-based methods had when applied to high-dimensional data sets.
arXiv Detail & Related papers (2023-08-02T10:02:38Z)
- Geometric Neural Diffusion Processes [55.891428654434634]
We extend the framework of diffusion models to incorporate a series of geometric priors in infinite-dimension modelling.
We show that with these conditions, the generative functional model admits the same symmetry.
arXiv Detail & Related papers (2023-07-11T16:51:38Z)
- Mixed Effects Neural ODE: A Variational Approximation for Analyzing the Dynamics of Panel Data [50.23363975709122]
We propose a probabilistic model called ME-NODE to incorporate (fixed + random) mixed effects for analyzing panel data.
We show that our model can be derived using smooth approximations of SDEs provided by the Wong-Zakai theorem.
We then derive Evidence-Based Lower Bounds for ME-NODE and develop efficient training algorithms.
arXiv Detail & Related papers (2022-02-18T22:41:51Z)
- Inverting brain grey matter models with likelihood-free inference: a tool for trustable cytoarchitecture measurements [62.997667081978825]
Characterisation of the brain grey matter cytoarchitecture with quantitative sensitivity to soma density and volume remains an unsolved challenge in dMRI.
We propose a new forward model, specifically a new system of equations, requiring a few relatively sparse b-shells.
We then apply modern tools from Bayesian analysis known as likelihood-free inference (LFI) to invert our proposed model.
arXiv Detail & Related papers (2021-11-15T09:08:27Z)
- Likelihood-Free Inference in State-Space Models with Unknown Dynamics [71.94716503075645]
We introduce a method for inferring and predicting latent states in state-space models where observations can only be simulated, and transition dynamics are unknown.
We propose a way of doing likelihood-free inference (LFI) of states and state prediction with a limited number of simulations.
arXiv Detail & Related papers (2021-11-02T12:33:42Z)
- Probabilistic Inference of Simulation Parameters via Parallel Differentiable Simulation [34.30381620584878]
To accurately reproduce measurements from the real world, simulators need to have an adequate model of the physical system.
We address the problem of estimating simulation parameters through a Bayesian inference approach.
We leverage GPU code generation and differentiable simulation to evaluate the likelihood and its gradient for many particles in parallel.
arXiv Detail & Related papers (2021-09-18T03:05:44Z)
- Gaussian Function On Response Surface Estimation [12.35564140065216]
We propose a new framework for interpreting black-box machine learning models (both their features and samples) via a metamodeling technique.
The metamodel can be estimated from data generated via a trained complex model by running the computer experiment on samples of data in the region of interest.
arXiv Detail & Related papers (2021-01-04T04:47:00Z)
- Score Matched Conditional Exponential Families for Likelihood-Free Inference [0.0]
Likelihood-Free Inference (LFI) relies on simulations from the model.
We generate parameter-simulation pairs from the model independently of the observation.
We use Neural Networks whose weights are tuned with Score Matching to learn a conditional exponential family likelihood approximation.
arXiv Detail & Related papers (2020-12-20T11:57:30Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information (including all content) and is not responsible for any consequences of its use.