Beyond Cuts in Small Signal Scenarios -- Enhanced Sneutrino
Detectability Using Machine Learning
- URL: http://arxiv.org/abs/2108.03125v4
- Date: Fri, 7 Jul 2023 11:12:29 GMT
- Title: Beyond Cuts in Small Signal Scenarios -- Enhanced Sneutrino
Detectability Using Machine Learning
- Authors: Daniel Alvestad, Nikolai Fomin, Jörn Kersten, Steffen Maeland, Inga
Strümke
- Abstract summary: We use two different models, XGBoost and a deep neural network, to exploit correlations between observables.
We consider different methods to analyze the models' output, finding that a template fit generally performs better than a simple cut.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We investigate enhancing the sensitivity of new physics searches at the LHC
by machine learning in the case of background dominance and a high degree of
overlap between the observables for signal and background. We use two different
models, XGBoost and a deep neural network, to exploit correlations between
observables and compare this approach to the traditional cut-and-count method.
We consider different methods to analyze the models' output, finding that a
template fit generally performs better than a simple cut. By means of a Shapley
decomposition, we gain additional insight into the relationship between event
kinematics and the machine learning model output. We consider a supersymmetric
scenario with a metastable sneutrino as a concrete example, but the methodology
can be applied to a much wider class of models.
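The abstract's central comparison, a simple cut versus a template fit on the machine learning model's output, can be sketched with synthetic numbers (a minimal illustration, not the authors' analysis; the beta-distributed classifier scores, binning, and chi-square fit are all assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical classifier outputs: background peaks low, signal peaks high,
# with strong overlap -- the background-dominated regime the paper discusses.
bkg = rng.beta(2, 3, size=100_000)
sig = rng.beta(3, 2, size=1_000)

edges = np.linspace(0, 1, 21)
t_bkg, _ = np.histogram(bkg, bins=edges)
t_sig, _ = np.histogram(sig, bins=edges)
data = t_bkg + t_sig  # pseudo-data with true signal strength mu = 1

def chi2(mu):
    # Binned chi-square between pseudo-data and the mu-scaled templates.
    expected = t_bkg + mu * t_sig
    return np.sum((data - expected) ** 2 / np.maximum(expected, 1))

# Template fit: scan the signal strength and keep the best-fitting value,
# using the full shape of the output distribution.
mus = np.linspace(0, 3, 301)
mu_hat = mus[np.argmin([chi2(m) for m in mus])]

# Simple cut: count events above a threshold on the classifier output,
# discarding all shape information below the cut.
cut = 0.8
s = (sig > cut).sum()
b = (bkg > cut).sum()
significance = s / np.sqrt(b)  # naive cut-and-count estimate
```

The template fit uses every bin of the output distribution, which is why it tends to outperform a single cut when signal and background overlap strongly.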
Related papers
- Designing Observables for Measurements with Deep Learning [0.12277343096128711]
We propose to design targeted observables with machine learning.
Unfolded, differential cross sections in a neural network output contain the most information about parameters of interest.
We demonstrate this idea in simulation using two physics models for inclusive measurements in deep inelastic scattering.
arXiv Detail & Related papers (2023-10-12T20:54:34Z) - A Deep Dive into the Connections Between the Renormalization Group and
Deep Learning in the Ising Model [0.0]
Renormalization group (RG) is an essential technique in statistical physics and quantum field theory.
We develop extensive renormalization techniques for the 1D and 2D Ising model to provide a baseline for comparison.
For the 2D Ising model, we successfully generated Ising model samples using the Wolff algorithm, and performed the renormalization group flow using a quasi-deterministic method.
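The Wolff cluster algorithm mentioned in the summary above can be sketched in a few lines (an illustrative single-update implementation; the lattice size, temperature, and seed below are placeholders, not the paper's settings):

```python
import numpy as np

def wolff_update(spins, beta, rng):
    """One Wolff cluster flip on a 2D Ising lattice with periodic boundaries."""
    L = spins.shape[0]
    p_add = 1.0 - np.exp(-2.0 * beta)  # bond-activation probability
    i, j = rng.integers(L, size=2)     # random seed site
    seed = spins[i, j]
    cluster = {(i, j)}
    stack = [(i, j)]
    while stack:
        x, y = stack.pop()
        # Try to add each aligned neighbour to the cluster with prob. p_add.
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nx, ny = (x + dx) % L, (y + dy) % L
            if (nx, ny) not in cluster and spins[nx, ny] == seed \
                    and rng.random() < p_add:
                cluster.add((nx, ny))
                stack.append((nx, ny))
    for x, y in cluster:  # flip the whole cluster at once
        spins[x, y] *= -1
    return spins

# In the low-temperature (ordered) phase, |magnetization| stays near 1
# even though entire clusters flip sign each update.
rng = np.random.default_rng(1)
spins = np.ones((16, 16), dtype=int)
for _ in range(50):
    wolff_update(spins, beta=1.0, rng=rng)
magnetization = abs(spins.mean())
```

Flipping whole clusters rather than single spins is what lets Wolff updates decorrelate samples quickly near the critical point.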
arXiv Detail & Related papers (2023-08-21T22:50:54Z) - Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z) - On the Influence of Enforcing Model Identifiability on Learning dynamics
of Gaussian Mixture Models [14.759688428864159]
We propose a technique for extracting submodels from singular models.
Our method enforces model identifiability during training.
We show how the method can be applied to more complex models like deep neural networks.
arXiv Detail & Related papers (2022-06-17T07:50:22Z) - Stochastic Parameterizations: Better Modelling of Temporal Correlations
using Probabilistic Machine Learning [1.5293427903448025]
We show that by using a physically informed recurrent neural network within a probabilistic framework, our model for the Lorenz 96 atmospheric simulation is competitive.
This is due to a superior ability to model temporal correlations compared to standard first-order autoregressive schemes.
We evaluate across a number of metrics from the literature, but also discuss how the probabilistic metric of likelihood may be a unifying choice for future climate models.
arXiv Detail & Related papers (2022-03-28T14:51:42Z) - Dynamically-Scaled Deep Canonical Correlation Analysis [77.34726150561087]
Canonical Correlation Analysis (CCA) is a method for extracting features from two views by finding maximally correlated linear projections of them.
We introduce a novel dynamic scaling method for training an input-dependent canonical correlation model.
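For context, classical (linear, static) CCA, the baseline that input-dependent dynamic scaling builds on, can be computed from whitened cross-covariances (a small illustrative sketch; the function name and synthetic test data are assumptions):

```python
import numpy as np

def canonical_correlations(X, Y):
    """Classical CCA: singular values of the whitened cross-covariance."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)

    def inv_sqrt(C):
        # Inverse matrix square root via eigendecomposition (C symmetric PD).
        w, V = np.linalg.eigh(C)
        return V @ np.diag(1.0 / np.sqrt(w)) @ V.T

    n = len(X)
    Cxx, Cyy, Cxy = X.T @ X / n, Y.T @ Y / n, X.T @ Y / n
    K = inv_sqrt(Cxx) @ Cxy @ inv_sqrt(Cyy)
    return np.linalg.svd(K, compute_uv=False)  # sorted descending

# Two "views" sharing one latent variable z: the top canonical correlation
# should be close to 1, the second close to 0.
rng = np.random.default_rng(0)
z = rng.normal(size=5000)
X = np.column_stack([z + 0.1 * rng.normal(size=5000), rng.normal(size=5000)])
Y = np.column_stack([z + 0.1 * rng.normal(size=5000), rng.normal(size=5000)])
corrs = canonical_correlations(X, Y)
```

A dynamically-scaled variant replaces these fixed linear projections with projections whose parameters depend on the input itself.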
arXiv Detail & Related papers (2022-03-23T12:52:49Z) - Gone Fishing: Neural Active Learning with Fisher Embeddings [55.08537975896764]
There is an increasing need for active learning algorithms that are compatible with deep neural networks.
This article introduces BAIT, a practical, tractable, and high-performing active learning algorithm for neural networks.
arXiv Detail & Related papers (2021-06-17T17:26:31Z) - Firearm Detection via Convolutional Neural Networks: Comparing a
Semantic Segmentation Model Against End-to-End Solutions [68.8204255655161]
Threat detection of weapons and aggressive behavior from live video can be used for rapid detection and prevention of potentially deadly incidents.
One way for achieving this is through the use of artificial intelligence and, in particular, machine learning for image analysis.
We compare a traditional monolithic end-to-end deep learning model and a previously proposed model based on an ensemble of simpler neural networks detecting firearms via semantic segmentation.
arXiv Detail & Related papers (2020-12-17T15:19:29Z) - Improving the Reconstruction of Disentangled Representation Learners via Multi-Stage Modeling [54.94763543386523]
Current autoencoder-based disentangled representation learning methods achieve disentanglement by penalizing the (aggregate) posterior to encourage statistical independence of the latent factors.
We present a novel multi-stage modeling approach where the disentangled factors are first learned using a penalty-based disentangled representation learning method.
Then, the low-quality reconstruction is improved with another deep generative model that is trained to model the missing correlated latent variables.
arXiv Detail & Related papers (2020-10-25T18:51:15Z) - Towards Novel Insights in Lattice Field Theory with Explainable Machine
Learning [1.5854412882298003]
We propose representation learning in combination with interpretability methods as a framework for the identification of observables.
The approach is put to work in the context of a scalar Yukawa model in (2+1)d.
Based on our results, we argue that due to its broad applicability, attribution methods such as LRP could prove a useful and versatile tool in our search for new physical insights.
arXiv Detail & Related papers (2020-03-03T13:56:58Z) - Kernel and Rich Regimes in Overparametrized Models [69.40899443842443]
We show that gradient descent on overparametrized multilayer networks can induce rich implicit biases that are not RKHS norms.
We also demonstrate this transition empirically for more complex matrix factorization models and multilayer non-linear networks.
arXiv Detail & Related papers (2020-02-20T15:43:02Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.