Simulation-based inference for Precision Neutrino Physics through Neural Monte Carlo tuning
- URL: http://arxiv.org/abs/2507.23297v1
- Date: Thu, 31 Jul 2025 07:33:05 GMT
- Title: Simulation-based inference for Precision Neutrino Physics through Neural Monte Carlo tuning
- Authors: A. Gavrikov, A. Serafini, D. Dolzhikov, A. Garfagnini, M. Gonchar, M. Grassi, R. Brugnera, V. Cerrone, L. V. D'Auria, R. M. Guizzetti, L. Lastrucci, G. Andronico, V. Antonelli, A. Barresi, D. Basilico, M. Beretta, A. Bergnoli, M. Borghesi, A. Brigatti, R. Bruno, A. Budano, B. Caccianiga, A. Cammi, R. Caruso, D. Chiesa, C. Clementi, C. Coletta, S. Dusini, A. Fabbri, G. Felici, G. Ferrante, M. G. Giammarchi, N. Giudice, N. Guardone, F. Houria, C. Landini, I. Lippi, L. Loi, P. Lombardi, F. Mantovani, S. M. Mari, A. Martini, L. Miramonti, M. Montuschi, M. Nastasi, D. Orestano, F. Ortica, A. Paoloni, L. Pelicci, E. Percalli, F. Petrucci, E. Previtali, G. Ranucci, A. C. Re, B. Ricci, A. Romani, C. Sirignano, M. Sisti, L. Stanco, E. Stanescu Farilla, V. Strati, M. D. C Torri, C. Tuvè, C. Venettacci, G. Verde, L. Votano,
- Abstract summary: We propose a solution using neural likelihood estimation within the simulation-based inference framework. We develop two complementary neural density estimators that model likelihoods of calibration data. Our framework offers flexibility to choose the most appropriate method for specific needs.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Precise modeling of detector energy response is crucial for next-generation neutrino experiments, which present computational challenges due to the lack of analytical likelihoods. We propose a solution using neural likelihood estimation within the simulation-based inference framework. We develop two complementary neural density estimators that model likelihoods of calibration data: conditional normalizing flows and a transformer-based regressor. We adopt JUNO - a large neutrino experiment - as a case study. The energy response of JUNO depends on several parameters, all of which should be tuned, given their non-linear behavior and strong correlations in the calibration data. To this end, we integrate the modeled likelihoods with Bayesian nested sampling for parameter inference, achieving uncertainties limited only by statistics with near-zero systematic biases. The normalizing-flow model enables unbinned likelihood analysis, while the transformer provides an efficient binned alternative. By providing both options, our framework offers flexibility to choose the most appropriate method for specific needs. Finally, our approach establishes a template for similar applications across experimental neutrino and broader particle physics.
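To make the workflow concrete, below is a minimal, illustrative sketch of the pipeline the abstract describes: a conditional neural density estimator is trained on simulated calibration-like events as a surrogate for the likelihood of reconstructed energy given true energy and detector parameters, and the learned log-likelihood is then passed to Bayesian nested sampling. The toy simulator, the parameter names (scale, resolution), the use of dynesty, and a small Gaussian mixture network standing in for the paper's conditional normalizing flow are assumptions made for brevity; this is not the authors' JUNO implementation.

```python
# Illustrative sketch only (not the paper's JUNO code): learn a neural
# surrogate for p(E_rec | E_true, theta) from simulated calibration-like
# events, then infer theta with Bayesian nested sampling.
import numpy as np
import torch
import torch.nn as nn
from dynesty import NestedSampler  # assumes the dynesty package is installed

rng = np.random.default_rng(0)

def simulate(theta, n_events=512):
    """Toy detector response: theta = (energy scale, resolution)."""
    scale, res = theta
    e_true = rng.uniform(1.0, 8.0, n_events)           # calibration energies
    e_rec = scale * e_true + res * np.sqrt(e_true) * rng.standard_normal(n_events)
    return e_true, e_rec

class MixtureDensityNet(nn.Module):
    """Conditional Gaussian mixture for p(E_rec | E_true, theta); a simple
    stand-in for the paper's conditional normalizing flow."""
    def __init__(self, n_comp=5, hidden=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(3, hidden), nn.ReLU(),
                                 nn.Linear(hidden, hidden), nn.ReLU(),
                                 nn.Linear(hidden, 3 * n_comp))

    def log_prob(self, e_rec, cond):
        logits, mu, log_sig = self.net(cond).chunk(3, dim=-1)
        comp = torch.distributions.Normal(mu, log_sig.clamp(-5, 3).exp())
        return torch.logsumexp(torch.log_softmax(logits, -1)
                               + comp.log_prob(e_rec[:, None]), dim=-1)

# 1) Simulate a training set over a box of detector parameters.
conds, targets = [], []
for _ in range(300):
    th = rng.uniform([0.9, 0.01], [1.1, 0.10])
    e_true, e_rec = simulate(th)
    conds.append(np.column_stack([e_true, np.tile(th, (len(e_true), 1))]))
    targets.append(e_rec)
cond = torch.tensor(np.concatenate(conds), dtype=torch.float32)
target = torch.tensor(np.concatenate(targets), dtype=torch.float32)

# 2) Fit the surrogate likelihood by maximum likelihood.
model = MixtureDensityNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(2000):
    idx = torch.randint(0, len(target), (1024,))
    loss = -model.log_prob(target[idx], cond[idx]).mean()
    opt.zero_grad(); loss.backward(); opt.step()

# 3) Unbinned log-likelihood of pseudo-observed calibration data.
obs_true, obs_rec = simulate((1.02, 0.06))             # pretend these are data
obs_true_t = torch.tensor(obs_true, dtype=torch.float32)
obs_rec_t = torch.tensor(obs_rec, dtype=torch.float32)

def log_like(theta):
    th = torch.tensor(theta, dtype=torch.float32).repeat(len(obs_true_t), 1)
    c = torch.cat([obs_true_t[:, None], th], dim=1)
    with torch.no_grad():
        return float(model.log_prob(obs_rec_t, c).sum())

def prior_transform(u):                                # uniform box prior
    return np.array([0.9 + 0.2 * u[0], 0.01 + 0.09 * u[1]])

# 4) Bayesian nested sampling over the two detector parameters.
sampler = NestedSampler(log_like, prior_transform, ndim=2, nlive=200)
sampler.run_nested(print_progress=False)
res = sampler.results
print("posterior samples:", res.samples.shape, "logZ:", res.logz[-1])
```

In the paper's setup the unbinned likelihood comes from conditional normalizing flows and a binned alternative from a transformer-based regressor; the surrogate above only illustrates how either estimator would plug into the nested-sampling step.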
Related papers
- A multi-dimensional quantum estimation and model learning framework based on variational Bayesian inference [3.7127285734321203]
We present a joint model selection and parameter estimation algorithm that is fast and operable on a large number of model parameters. The algorithm is based on variational Bayesian inference (VBI), which approximates the target posterior distribution. We show how a regularizing prior can be used to select between competing models, each comprising a different number of parameters.
arXiv Detail & Related papers (2025-07-30T22:18:27Z)
- Dynamical Measure Transport and Neural PDE Solvers for Sampling [77.38204731939273]
We approach the task of sampling from a probability density as transporting a tractable density function to the target.
We employ physics-informed neural networks (PINNs) to approximate the respective partial differential equations (PDEs) solutions.
PINNs allow for simulation- and discretization-free optimization and can be trained very efficiently.
arXiv Detail & Related papers (2024-07-10T17:39:50Z)
- Bayesian Circular Regression with von Mises Quasi-Processes [57.88921637944379]
In this work we explore a family of expressive and interpretable distributions over circle-valued random functions. For posterior inference, we introduce a new Stratonovich-like augmentation that lends itself to fast Gibbs sampling. We present experiments applying this model to the prediction of wind directions and the percentage of the running gait cycle as a function of joint angles.
arXiv Detail & Related papers (2024-06-19T01:57:21Z)
- Scaling and renormalization in high-dimensional regression [72.59731158970894]
We present a unifying perspective on recent results on ridge regression. We use the basic tools of random matrix theory and free probability, aimed at readers with backgrounds in physics and deep learning. Our results extend and provide a unifying perspective on earlier models of scaling laws.
arXiv Detail & Related papers (2024-05-01T15:59:00Z)
- Machine learning enabled experimental design and parameter estimation for ultrafast spin dynamics [54.172707311728885]
We introduce a methodology that combines machine learning with Bayesian optimal experimental design (BOED).
Our method employs a neural network model for large-scale spin dynamics simulations for precise distribution and utility calculations in BOED.
Our numerical benchmarks demonstrate the superior performance of our method in guiding XPFS experiments, predicting model parameters, and yielding more informative measurements within limited experimental time.
arXiv Detail & Related papers (2023-06-03T06:19:20Z)
- Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z)
- Optimised Bayesian system identification in quantum devices [3.72081359624651]
We present a closed-loop Bayesian learning algorithm for estimating unknown parameters in a dynamical model.
We demonstrate the performance of the algorithm in both simulated calibration tasks and in an experimental single-qubit ion-trap system.
arXiv Detail & Related papers (2022-11-16T18:12:46Z)
- Leveraging Global Parameters for Flow-based Neural Posterior Estimation [90.21090932619695]
Inferring the parameters of a model based on experimental observations is central to the scientific method.
A particularly challenging setting is when the model is strongly indeterminate, i.e., when distinct sets of parameters yield identical observations.
We present a method for cracking such indeterminacy by exploiting additional information conveyed by an auxiliary set of observations sharing global parameters.
arXiv Detail & Related papers (2021-02-12T12:23:13Z)
- Exhaustive Neural Importance Sampling applied to Monte Carlo event generation [0.0]
Exhaustive Neural Importance Sampling (ENIS) is a method based on normalizing flows to find a suitable proposal density for rejection sampling automatically and efficiently.
We discuss how this technique solves common issues of the rejection algorithm (a minimal sketch of a flow-style proposal plugged into rejection sampling follows this list).
arXiv Detail & Related papers (2020-05-26T13:45:45Z)
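As a companion illustration of the ENIS entry above, the sketch below shows how any density that exposes sample and log_prob access, such as a trained normalizing flow, can serve as the proposal for rejection sampling of an unnormalized target. The bimodal toy target, the fixed Gaussian mixture standing in for a trained flow, and the bound M are assumptions chosen for brevity, not the method of the cited paper.

```python
# Minimal sketch of flow-based rejection sampling in the spirit of ENIS.
# A trained normalizing flow would provide .sample() and .log_prob(); here a
# fixed Gaussian mixture stands in for it, and the target is a toy density.
import torch
from torch.distributions import Categorical, MixtureSameFamily, Normal

def log_target(x):
    """Unnormalized, bimodal toy target density."""
    return torch.logaddexp(Normal(-2.0, 0.5).log_prob(x),
                           Normal(1.5, 0.8).log_prob(x))

# Placeholder proposal with the same interface as a trained flow.
proposal = MixtureSameFamily(
    Categorical(probs=torch.tensor([0.5, 0.5])),
    Normal(torch.tensor([-2.0, 1.5]), torch.tensor([0.7, 1.0])),
)

def rejection_sample(n, log_m=1.1):
    """Accept x ~ q with probability f(x) / (M q(x)); the closer the learned
    proposal q is to the target f, the smaller M can be chosen and the higher
    the acceptance efficiency."""
    x = proposal.sample((n,))
    log_accept = log_target(x) - (log_m + proposal.log_prob(x))
    keep = torch.rand(n).log() < log_accept
    return x[keep], keep.float().mean()

samples, efficiency = rejection_sample(100_000)
print(f"accepted {samples.numel()} draws, efficiency {efficiency.item():.1%}")
```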