Unifying Simulation and Inference with Normalizing Flows
- URL: http://arxiv.org/abs/2404.18992v2
- Date: Thu, 9 May 2024 21:41:49 GMT
- Title: Unifying Simulation and Inference with Normalizing Flows
- Authors: Haoxing Du, Claudius Krause, Vinicius Mikuni, Benjamin Nachman, Ian Pang, David Shih
- Abstract summary: We show that these two tasks can be unified by using maximum likelihood estimation (MLE) from conditional generative models for energy regression.
Using an ATLAS-like calorimeter simulation, we demonstrate this concept in the context of calorimeter energy calibration.
- Score: 0.08796261172196743
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: There have been many applications of deep neural networks to detector calibrations and a growing number of studies that propose deep generative models as automated fast detector simulators. We show that these two tasks can be unified by using maximum likelihood estimation (MLE) from conditional generative models for energy regression. Unlike direct regression techniques, the MLE approach is prior-independent and non-Gaussian resolutions can be determined from the shape of the likelihood near the maximum. Using an ATLAS-like calorimeter simulation, we demonstrate this concept in the context of calorimeter energy calibration.
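As a rough illustration of the calibration procedure the abstract describes, the sketch below assumes a trained conditional generative model that exposes a log-likelihood log p(x | E_true). The grid scan, the Δlog L = 1/2 interval, and the toy Gaussian stand-in are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def mle_calibrate(log_prob, x, e_grid):
    """Maximum-likelihood energy calibration for a single shower.

    log_prob(x, e): hypothetical interface to a trained conditional
        generative model (e.g. a normalizing flow), returning log p(x | E=e).
    x: observed detector features for one shower.
    e_grid: candidate true energies to scan.
    """
    ll = np.array([log_prob(x, e) for e in e_grid])
    i_max = int(np.argmax(ll))
    e_hat = e_grid[i_max]

    # Read the resolution off the likelihood shape near the maximum:
    # keep the interval where log L stays within 1/2 of its peak value.
    # The interval may be asymmetric, i.e. a non-Gaussian resolution.
    half = ll[i_max] - 0.5
    below = e_grid[:i_max][ll[:i_max] >= half]
    above = e_grid[i_max:][ll[i_max:] >= half]
    e_lo = below[0] if below.size else e_grid[0]
    e_hi = above[-1] if above.size else e_grid[-1]
    return e_hat, (e_hat - e_lo, e_hi - e_hat)

if __name__ == "__main__":
    # Toy Gaussian response standing in for a trained flow (assumption).
    toy_log_prob = lambda x, e: -0.5 * ((x - 0.9 * e) / (0.1 * e)) ** 2
    e_hat, (dn, up) = mle_calibrate(toy_log_prob, x=45.0,
                                    e_grid=np.linspace(20.0, 120.0, 2001))
    print(f"E_MLE = {e_hat:.1f}  (-{dn:.1f} / +{up:.1f})")
```

Because the estimate comes from the conditional likelihood alone, it does not depend on the energy spectrum of the training sample, which is the prior-independence claimed in the abstract.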
Related papers
- Diffusion posterior sampling for simulation-based inference in tall data settings [53.17563688225137]
Simulation-based inference (SBI) is capable of approximating the posterior distribution that relates input parameters to a given observation.
In this work, we consider a tall data extension in which multiple observations are available to better infer the parameters of the model.
We compare our method to recently proposed competing approaches on various numerical experiments and demonstrate its superiority in terms of numerical stability and computational cost.
arXiv Detail & Related papers (2024-04-11T09:23:36Z)
- Simulation-based inference using surjective sequential neural likelihood estimation [50.24983453990065]
Surjective Sequential Neural Likelihood (SSNL) estimation is a novel method for simulation-based inference.
By embedding the data in a low-dimensional space, SSNL solves several issues previous likelihood-based methods had when applied to high-dimensional data sets.
arXiv Detail & Related papers (2023-08-02T10:02:38Z)
- Conditional Karhunen-Loève regression model with Basis Adaptation for high-dimensional problems: uncertainty quantification and inverse modeling [62.997667081978825]
We propose a methodology for improving the accuracy of surrogate models of the observable response of physical systems.
We apply the proposed methodology to constructing surrogate models of the stationary hydraulic head response via the Basis Adaptation (BA) method.
arXiv Detail & Related papers (2023-07-05T18:14:38Z)
- Machine Learning methods for simulating particle response in the Zero Degree Calorimeter at the ALICE experiment, CERN [8.980453507536017]
Currently, over half of the computing power at CERN GRID is used to run High Energy Physics simulations.
The recent updates at the Large Hadron Collider (LHC) create the need for developing more efficient simulation methods.
We propose an alternative approach to the problem that leverages machine learning.
arXiv Detail & Related papers (2023-06-23T16:45:46Z)
- Probabilistic Unrolling: Scalable, Inverse-Free Maximum Likelihood Estimation for Latent Gaussian Models [69.22568644711113]
We introduce probabilistic unrolling, a method that combines Monte Carlo sampling with iterative linear solvers to circumvent matrix inversions.
Our theoretical analyses reveal that unrolling and backpropagation through the iterations of the solver can accelerate gradient estimation for maximum likelihood estimation.
In experiments on simulated and real data, we demonstrate that probabilistic unrolling learns latent Gaussian models up to an order of magnitude faster than gradient EM, with minimal losses in model performance.
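The summary above names two ingredients, Monte Carlo sampling and iterative linear solvers in place of matrix inversions. The snippet below is a generic sketch of just those ingredients (Hutchinson probe vectors plus conjugate gradients) applied to a trace term of the kind that appears in Gaussian marginal-likelihood gradients; it is not the authors' probabilistic-unrolling algorithm, and the matrix K and its derivative dK are made-up stand-ins.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

def trace_of_inverse_product(K, dK, n_probes=32, seed=0):
    """Estimate tr(K^{-1} dK) without ever forming K^{-1}.

    Each Rademacher probe z gives one Monte Carlo sample of z^T K^{-1} dK z,
    and the solve K w = z is done with conjugate gradients.
    """
    n = K.shape[0]
    rng = np.random.default_rng(seed)
    apply_K = LinearOperator((n, n), matvec=lambda v: K @ v)
    total = 0.0
    for _ in range(n_probes):
        z = rng.choice([-1.0, 1.0], size=n)   # Rademacher probe vector
        w, _ = cg(apply_K, z, maxiter=200)    # w ~= K^{-1} z, no explicit inverse
        total += w @ (dK @ z)                 # one sample of z^T K^{-1} dK z
    return total / n_probes

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    A = rng.normal(size=(200, 200))
    K = A @ A.T + 200.0 * np.eye(200)         # SPD stand-in covariance
    dK = np.eye(200)                          # stand-in derivative of K
    print(trace_of_inverse_product(K, dK),    # stochastic estimate
          np.trace(np.linalg.solve(K, dK)))   # exact value for comparison
```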
arXiv Detail & Related papers (2023-06-05T21:08:34Z)
- Inductive Simulation of Calorimeter Showers with Normalizing Flows [0.0]
iCaloFlow is a framework for fast detector simulation based on an inductive series of normalizing flows trained on the pattern of energy depositions in pairs of consecutive calorimeter layers.
As we demonstrate, iCaloFlow can realize the potential of normalizing flows in performing fast, high-fidelity simulation on detector geometries with 10 - 100 times higher granularity than previously considered.
arXiv Detail & Related papers (2023-05-19T18:00:00Z)
- Geometry-aware Autoregressive Models for Calorimeter Shower Simulations [6.01665219244256]
We develop a geometry-aware autoregressive model on a range of calorimeter geometries.
This is a key proof-of-concept step towards building a model that can generalize to new unseen calorimeter geometries.
Such a model can replace the hundreds of generative models used for calorimeter simulation in a Large Hadron Collider experiment.
arXiv Detail & Related papers (2022-12-16T01:45:17Z)
- Aspects of scaling and scalability for flow-based sampling of lattice QCD [137.23107300589385]
Recent applications of machine-learned normalizing flows to sampling in lattice field theory suggest that such methods may be able to mitigate critical slowing down and topological freezing.
It remains to be determined whether they can be applied to state-of-the-art lattice quantum chromodynamics calculations.
arXiv Detail & Related papers (2022-11-14T17:07:37Z)
- Maximum Likelihood Learning of Unnormalized Models for Simulation-Based Inference [44.281860162298564]
We introduce two synthetic likelihood methods for Simulation-Based Inference.
We learn a conditional energy-based model (EBM) of the likelihood using synthetic data generated by the simulator.
We demonstrate the properties of both methods on a range of synthetic datasets, and apply them to a neuroscience model of the pyloric network in the crab.
arXiv Detail & Related papers (2022-10-26T14:38:24Z)
- Inverting brain grey matter models with likelihood-free inference: a tool for trustable cytoarchitecture measurements [62.997667081978825]
Characterisation of the brain grey matter cytoarchitecture with quantitative sensitivity to soma density and volume remains an unsolved challenge in dMRI.
We propose a new forward model, specifically a new system of equations, requiring a few relatively sparse b-shells.
We then apply modern tools from Bayesian analysis known as likelihood-free inference (LFI) to invert our proposed model.
arXiv Detail & Related papers (2021-11-15T09:08:27Z)
- DCTRGAN: Improving the Precision of Generative Models with Reweighting [1.2622634782102324]
We introduce a post-hoc correction to deep generative models to further improve their fidelity.
The correction takes the form of a reweighting function that can be applied to generated examples.
We show that the weighted GAN examples significantly improve the accuracy of the generated samples without a large loss in statistical power.
arXiv Detail & Related papers (2020-09-03T18:00:27Z)
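To make the reweighting idea in the DCTRGAN entry concrete, here is a minimal sketch of classifier-based reweighting on toy one-dimensional samples: a classifier is trained to separate reference ("truth-level") examples from generated ones, and its output is turned into per-event weights w(x) ≈ f(x)/(1 - f(x)). The toy data, the choice of classifier, and the weight clipping are assumptions for illustration, not the paper's setup.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(1)
x_true = rng.normal(0.0, 1.0, size=(20000, 1))   # reference simulator (toy)
x_gen = rng.normal(0.2, 1.1, size=(20000, 1))    # slightly mismodelled generator (toy)

# Train a classifier to distinguish generated from reference examples.
X = np.vstack([x_true, x_gen])
y = np.concatenate([np.ones(len(x_true)), np.zeros(len(x_gen))])
clf = GradientBoostingClassifier(max_depth=3, n_estimators=200).fit(X, y)

# Per-event weights for the generated sample: with balanced classes,
# f/(1 - f) approximates the likelihood ratio p_true(x) / p_gen(x).
f = clf.predict_proba(x_gen)[:, 1]
weights = np.clip(f / (1.0 - f), 0.0, 50.0)      # clip to tame tail weights

# The weighted generated sample should track the reference distribution:
print(x_true.mean(), x_gen.mean(), np.average(x_gen.ravel(), weights=weights))
```

The trade-off noted in that entry is visible here: large weights pull the generated distribution toward the reference but inflate the variance of weighted statistics, which is the loss in statistical power the summary refers to.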
This list is automatically generated from the titles and abstracts of the papers on this site.