Multidimensional Deconvolution with Profiling
- URL: http://arxiv.org/abs/2409.10421v1
- Date: Mon, 16 Sep 2024 15:52:28 GMT
- Title: Multidimensional Deconvolution with Profiling
- Authors: Huanbiao Zhu, Krish Desai, Mikael Kuusela, Vinicius Mikuni, Benjamin Nachman, Larry Wasserman
- Abstract summary: In many experimental contexts, it is necessary to statistically remove the impact of instrumental effects in order to physically interpret measurements.
We propose a new algorithm called Profile OmniFold (POF), which works in a similar iterative manner as the OmniFold (OF) algorithm while being able to simultaneously profile the nuisance parameters.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In many experimental contexts, it is necessary to statistically remove the impact of instrumental effects in order to physically interpret measurements. This task has been extensively studied in particle physics, where the deconvolution task is called unfolding. A number of recent methods have shown how to perform high-dimensional, unbinned unfolding using machine learning. However, one of the assumptions in all of these methods is that the detector response is accurately modeled in the Monte Carlo simulation. In practice, the detector response depends on a number of nuisance parameters that can be constrained with data. We propose a new algorithm called Profile OmniFold (POF), which works in a similar iterative manner as the OmniFold (OF) algorithm while being able to simultaneously profile the nuisance parameters. We illustrate the method with a Gaussian example as a proof of concept highlighting its promising capabilities.
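The abstract describes an iterative reweighting scheme in the style of OmniFold, illustrated on a Gaussian example. The sketch below is not the paper's POF implementation (it does not profile nuisance parameters, and it replaces the neural-classifier likelihood-ratio steps with binned histogram ratios); it is a minimal toy of the two-step iteration on a Gaussian smearing example, with all names and numbers illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# "Monte Carlo": truth ~ N(0, 1); the detector adds Gaussian smearing.
mc_truth = rng.normal(0.0, 1.0, n)
mc_reco = mc_truth + rng.normal(0.0, 0.5, n)

# "Data": the true distribution is actually N(0.4, 1), observed only after
# the same smearing; the unfolding task is to recover the 0.4 shift.
data_reco = rng.normal(0.4, 1.0, n) + rng.normal(0.0, 0.5, n)

bins = np.linspace(-5.0, 5.0, 51)

def ratio_weights(x_num, x_den, w_num, w_den, x_eval):
    """Binned density ratio evaluated at x_eval; a histogram stand-in for
    the classifier-based likelihood-ratio step in OmniFold."""
    h_num, _ = np.histogram(x_num, bins=bins, weights=w_num, density=True)
    h_den, _ = np.histogram(x_den, bins=bins, weights=w_den, density=True)
    r = np.divide(h_num, h_den, out=np.ones_like(h_num), where=h_den > 0)
    idx = np.clip(np.digitize(x_eval, bins) - 1, 0, len(r) - 1)
    return r[idx]

w = np.ones(n)  # per-event truth-level weights
for _ in range(5):
    # Step 1: reweight reco-level MC toward the observed data.
    nu = w * ratio_weights(data_reco, mc_reco, np.ones(n), w, mc_reco)
    # Step 2: pull the reweighting back to a function of the truth only.
    w = w * ratio_weights(mc_truth, mc_truth, nu, w, mc_truth)

unfolded_mean = np.average(mc_truth, weights=w)
print(f"unfolded truth mean: {unfolded_mean:.3f}")  # should approach 0.4
```

For a pure mean shift, each iteration closes a fixed fraction of the remaining gap (here the fraction is set by the smearing-to-total variance ratio), so a handful of iterations suffices in this toy.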
Related papers
- Machine Learning-based Unfolding for Cross Section Measurements in the Presence of Nuisance Parameters [0.15325041686671656]
In particle physics, the distortions introduced by the detector are often known only implicitly through simulations. Modern machine learning has enabled efficient simulation-based approaches for unfolding high-dimensional data. We show how to extend machine learning-based unfolding to incorporate nuisance parameters.
arXiv Detail & Related papers (2025-12-08T01:21:34Z) - Inference-Time Alignment in Diffusion Models with Reward-Guided Generation: Tutorial and Review [59.856222854472605]
This tutorial provides an in-depth guide on inference-time guidance and alignment methods for optimizing downstream reward functions in diffusion models.
Practical applications in fields such as biology often require sample generation that maximizes specific metrics.
We discuss (1) fine-tuning methods combined with inference-time techniques, (2) inference-time algorithms based on search algorithms such as Monte Carlo tree search, and (3) connections between inference-time algorithms in language models and diffusion models.
arXiv Detail & Related papers (2025-01-16T17:37:35Z) - Accelerated zero-order SGD under high-order smoothness and overparameterized regime [79.85163929026146]
We present a novel gradient-free algorithm to solve convex optimization problems.
Such problems are encountered in medicine, physics, and machine learning.
We provide convergence guarantees for the proposed algorithm under both types of noise.
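This entry concerns gradient-free (zeroth-order) convex optimization. The sketch below is not the paper's accelerated method; it is a minimal two-point zeroth-order gradient estimator applied to a smooth convex quadratic, with the objective, dimension, and step size chosen purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

d = 5
x_star = np.arange(1.0, d + 1.0)  # minimizer of the toy objective

def f(x):
    """Smooth convex objective; only function values are available."""
    return float(np.dot(x - x_star, x - x_star))

x = np.zeros(d)
h, lr = 1e-3, 0.05
for _ in range(3000):
    u = rng.normal(size=d)
    u /= np.linalg.norm(u)  # random direction on the unit sphere
    # Two-point estimate: an unbiased gradient of the smoothed objective.
    g = d * (f(x + h * u) - f(x - h * u)) / (2.0 * h) * u
    x -= lr * g

print(np.round(x, 3))  # converges to x_star
```

Each step queries the objective only twice, which is the point of the zeroth-order setting: convergence is driven entirely by function evaluations, never by an explicit gradient.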
arXiv Detail & Related papers (2024-11-21T10:26:17Z) - Computation-Aware Gaussian Processes: Model Selection And Linear-Time Inference [55.150117654242706]
We show that model selection for computation-aware GPs trained on 1.8 million data points can be done within a few hours on a single GPU.
As a result of this work, Gaussian processes can be trained on large-scale datasets without significantly compromising their ability to quantify uncertainty.
arXiv Detail & Related papers (2024-11-01T21:11:48Z) - Parameters estimation by fitting correlation functions of continuous quantum measurement [0.0]
We propose a simple method to estimate the parameters of a continuously measured quantum system, by fitting correlation functions of the measured signal.
We demonstrate the approach in simulation, both on toy examples and on a recent superconducting-circuit experiment.
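The idea of estimating parameters by fitting correlation functions of a measured signal has a simple classical analogue. The sketch below is not the paper's quantum setting; it fits the empirical autocorrelation of an AR(1) signal, whose correlation decays as a**lag, to recover the parameter a (all values illustrative).

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy analogue of a continuously monitored system: an AR(1) signal whose
# autocorrelation at lag k is a_true**k, with a_true the parameter to infer.
a_true, n = 0.9, 100_000
x = np.empty(n)
x[0] = 0.0
for t in range(1, n):
    x[t] = a_true * x[t - 1] + rng.normal()

# Empirical autocorrelation at a few lags.
x = x - x.mean()
var = np.dot(x, x) / n
lags = np.arange(1, 16)
rho = np.array([np.dot(x[:-k], x[k:]) / (n * var) for k in lags])

# Fit log(rho_k) = k * log(a) by least squares to recover a.
slope = np.dot(lags, np.log(rho)) / np.dot(lags, lags)
a_hat = np.exp(slope)
print(f"estimated a = {a_hat:.3f} (true {a_true})")
```

Fitting several lags at once, rather than reading off a single lag, averages down the estimation noise in the same spirit as fitting a whole correlation function.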
arXiv Detail & Related papers (2024-10-15T18:00:08Z) - von Mises Quasi-Processes for Bayesian Circular Regression [57.88921637944379]
We explore a family of expressive and interpretable distributions over circle-valued random functions.
The resulting probability model has connections with continuous spin models in statistical physics.
For posterior inference, we introduce a new Stratonovich-like augmentation that lends itself to fast Markov Chain Monte Carlo sampling.
arXiv Detail & Related papers (2024-06-19T01:57:21Z) - Gaussian Process Probes (GPP) for Uncertainty-Aware Probing [61.91898698128994]
We introduce a unified and simple framework for probing and measuring uncertainty about concepts represented by models.
Our experiments show it can (1) probe a model's representations of concepts even with a very small number of examples, (2) accurately measure both epistemic uncertainty (how confident the probe is) and aleatory uncertainty (how fuzzy the concepts are to the model), and (3) detect out of distribution data using those uncertainty measures as well as classic methods do.
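The separation of epistemic uncertainty (reducible with more examples) from aleatory noise can be seen in plain Gaussian process regression. The sketch below is not the GPP framework; it is a minimal numpy GP posterior-variance computation showing the epistemic term shrinking as nearby examples are added while a fixed noise term (standing in for aleatory uncertainty) remains.

```python
import numpy as np

def rbf(a, b, ell=1.0):
    """Squared-exponential kernel matrix between 1-D point sets a and b."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

def posterior_var(x_train, x_test, noise=0.1):
    """GP posterior variance at x_test: epistemic part plus noise floor."""
    K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
    k_star = rbf(x_train, x_test)
    k_ss = rbf(x_test, x_test)
    v = np.linalg.solve(K, k_star)
    return np.diag(k_ss - k_star.T @ v) + noise

x_test = np.array([0.0])
few = np.array([2.0])                    # one distant example
many = np.array([-0.5, -0.2, 0.1, 0.4])  # several nearby examples

v_few = posterior_var(few, x_test)[0]
v_many = posterior_var(many, x_test)[0]
print(f"variance with 1 example: {v_few:.3f}, with 4 examples: {v_many:.3f}")
# Epistemic uncertainty shrinks with more nearby examples; the noise term
# (a stand-in for aleatory uncertainty) never goes away.
```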
arXiv Detail & Related papers (2023-05-29T17:00:16Z) - Inverse Dynamics Pretraining Learns Good Representations for Multitask Imitation [66.86987509942607]
We evaluate how such a paradigm should be done in imitation learning.
We consider a setting where the pretraining corpus consists of multitask demonstrations.
We argue that inverse dynamics modeling is well-suited to this setting.
arXiv Detail & Related papers (2023-05-26T14:40:46Z) - Score-based Diffusion Models in Function Space [140.792362459734]
Diffusion models have recently emerged as a powerful framework for generative modeling.
We introduce a mathematically rigorous framework called Denoising Diffusion Operators (DDOs) for training diffusion models in function space.
We show that the corresponding discretized algorithm generates accurate samples at a fixed cost independent of the data resolution.
arXiv Detail & Related papers (2023-02-14T23:50:53Z) - Unbinned Profiled Unfolding [2.0813318162800707]
Unfolding is an important procedure in particle physics experiments which corrects for detector effects.
We propose a new machine learning-based unfolding method that results in an unbinned differential cross section.
arXiv Detail & Related papers (2023-02-10T17:28:01Z) - RMFGP: Rotated Multi-fidelity Gaussian process with Dimension Reduction for High-dimensional Uncertainty Quantification [12.826754199680474]
Multi-fidelity modelling enables accurate inference even when only a small set of accurate data is available.
By combining the realizations of the high-fidelity model with one or more low-fidelity models, the multi-fidelity method can make accurate predictions of quantities of interest.
This paper proposes a new dimension reduction framework based on rotated multi-fidelity Gaussian process regression and a Bayesian active learning scheme.
arXiv Detail & Related papers (2022-04-11T01:20:35Z) - Probabilistic Inference of Simulation Parameters via Parallel Differentiable Simulation [34.30381620584878]
To accurately reproduce measurements from the real world, simulators need to have an adequate model of the physical system.
We address the latter problem of estimating parameters through a Bayesian inference approach.
We leverage GPU code generation and differentiable simulation to evaluate the likelihood and its gradient for many particles in parallel.
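Evaluating a likelihood for many inference particles at once is the core trick this entry describes. The sketch below has no GPU code generation or differentiable simulator; it uses numpy broadcasting as a CPU stand-in, with a trivial Gaussian "simulator" and all parameter values chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Observations from a "simulator" with one unknown parameter theta:
# y_i = theta + noise. The true value is 1.5.
theta_true, sigma = 1.5, 0.3
y = theta_true + sigma * rng.normal(size=50)

# A population of candidate parameters ("particles"). Broadcasting
# evaluates every particle's log-likelihood in one vectorized pass --
# the CPU analogue of evaluating many particles in parallel on a GPU.
particles = np.linspace(0.0, 3.0, 10_001)
resid = y[None, :] - particles[:, None]            # (particles, data)
log_lik = -0.5 * np.sum((resid / sigma) ** 2, axis=1)

best = particles[np.argmax(log_lik)]
print(f"maximum-likelihood particle: {best:.3f}")  # near theta_true
```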
arXiv Detail & Related papers (2021-09-18T03:05:44Z) - Scaffolding Simulations with Deep Learning for High-dimensional Deconvolution [0.3078691410268859]
We propose a simulation-based maximum likelihood deconvolution approach in this setting called OmniFold.
Deep learning enables this approach to be naturally unbinned as well as variable- and high-dimensional.
We show that OmniFold not only removes detector distortions but also accounts for noise processes and acceptance effects.
arXiv Detail & Related papers (2021-05-10T15:16:18Z) - Leveraging Global Parameters for Flow-based Neural Posterior Estimation [90.21090932619695]
Inferring the parameters of a model based on experimental observations is central to the scientific method.
A particularly challenging setting is when the model is strongly indeterminate, i.e., when distinct sets of parameters yield identical observations.
We present a method for cracking such indeterminacy by exploiting additional information conveyed by an auxiliary set of observations sharing global parameters.
arXiv Detail & Related papers (2021-02-12T12:23:13Z) - Preprocessing noisy functional data using factor models [0.0]
We consider functional data which are measured on a discrete set of observation points.
Signal and noise can be naturally represented as the common and idiosyncratic component of a factor model.
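Representing signal as the common component and noise as the idiosyncratic component of a factor model has a compact linear-algebra illustration. The sketch below is a simple stand-in for the paper's factor-model preprocessing: it recovers the common component of simulated functional data by truncating an SVD at the (known, illustrative) factor rank.

```python
import numpy as np

rng = np.random.default_rng(4)

# Functional data on a grid: each curve is a combination of two smooth
# factors (the "common component") plus idiosyncratic noise.
t = np.linspace(0.0, 1.0, 100)
factors = np.vstack([np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
loadings = rng.normal(size=(200, 2))
clean = loadings @ factors                 # (curves, grid points)
noisy = clean + 0.5 * rng.normal(size=clean.shape)

# Estimate the common component by truncating the SVD at the factor rank.
U, s, Vt = np.linalg.svd(noisy, full_matrices=False)
denoised = U[:, :2] * s[:2] @ Vt[:2]

err_noisy = np.mean((noisy - clean) ** 2)
err_denoised = np.mean((denoised - clean) ** 2)
print(f"MSE before: {err_noisy:.3f}, after: {err_denoised:.3f}")
```

The truncation keeps only the low-rank structure shared across curves, so most of the idiosyncratic noise, which spreads over all singular directions, is discarded.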
arXiv Detail & Related papers (2020-12-10T16:54:44Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this information and is not responsible for any consequences of its use.