Do graph neural networks learn traditional jet substructure?
- URL: http://arxiv.org/abs/2211.09912v1
- Date: Thu, 17 Nov 2022 22:08:10 GMT
- Title: Do graph neural networks learn traditional jet substructure?
- Authors: Farouk Mokhtar, Raghav Kansal, Javier Duarte
- Abstract summary: Graph neural networks have been used to treat jets as point clouds with underlying, learnable, edge connections between the particles inside.
We explore the decision-making process for one such state-of-the-art network, ParticleNet, by looking for relevant edge connections identified using the layerwise-relevance propagation technique.
As the model is trained, we observe changes in the distribution of relevant edges connecting different intermediate clusters of particles, known as subjets.
- Score: 11.562331287684541
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: At the CERN LHC, the task of jet tagging, whose goal is to infer the origin
of a jet given a set of final-state particles, is dominated by machine learning
methods. Graph neural networks have been used to address this task by treating
jets as point clouds with underlying, learnable, edge connections between the
particles inside. We explore the decision-making process for one such
state-of-the-art network, ParticleNet, by looking for relevant edge connections
identified using the layerwise-relevance propagation technique. As the model is
trained, we observe changes in the distribution of relevant edges connecting
different intermediate clusters of particles, known as subjets. The resulting
distribution of subjet connections is different for signal jets originating
from top quarks, whose subjets typically correspond to their three decay
products, and background jets originating from lighter quarks and gluons. This
behavior indicates that the model is using traditional jet substructure
observables, such as the number of prongs -- energetic particle clusters --
within a jet, when identifying jets.
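To make the analysis concrete, here is a minimal sketch (not the authors' code) of the bookkeeping the abstract describes: build ParticleNet-style k-nearest-neighbor edges in the (eta, phi) plane and split per-edge relevance scores into intra-subjet and inter-subjet contributions. The particle coordinates, subjet labels, and relevance scores below are random placeholders; in the paper they would come from real jet constituents, a subjet clustering, and a layerwise-relevance propagation pass, respectively.
```python
# Hedged sketch: tally edge relevance within vs. between subjets.
import numpy as np

def knn_edges(eta, phi, k=16):
    """Build ParticleNet-style k-nearest-neighbor edges in the (eta, phi) plane.

    Note: ignores phi periodicity for brevity.
    """
    coords = np.stack([eta, phi], axis=1)                # (N, 2)
    d2 = ((coords[:, None, :] - coords[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)                         # no self-loops
    nbrs = np.argsort(d2, axis=1)[:, :k]                 # (N, k) neighbor indices
    src = np.repeat(np.arange(len(eta)), k)
    dst = nbrs.reshape(-1)
    return src, dst

def subjet_relevance_summary(src, dst, subjet_label, edge_relevance):
    """Split total edge relevance into intra- and inter-subjet contributions."""
    same = subjet_label[src] == subjet_label[dst]
    intra = edge_relevance[same].sum()
    inter = edge_relevance[~same].sum()
    return intra, inter

# Toy usage with random inputs standing in for a real jet and real LRP scores.
rng = np.random.default_rng(0)
N = 60
eta, phi = rng.normal(size=N), rng.normal(size=N)
subjet_label = rng.integers(0, 3, size=N)                # e.g., 3 prongs for a top jet
src, dst = knn_edges(eta, phi, k=8)
edge_relevance = rng.random(len(src))                    # placeholder for LRP output
print(subjet_relevance_summary(src, dst, subjet_label, edge_relevance))
```
Tracking how the inter-subjet share of relevance evolves during training is the kind of quantity the paper uses to argue that the model picks up prong-like substructure.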
Related papers
- A multicategory jet image classification framework using deep neural network [0.9350546589421261]
The authors focus on jet category separability through particle- and jet-level feature extraction, resulting in a computationally efficient, interpretable model for jet classification.
This work demonstrates that high dimensional datasets represented in separable latent spaces lead to simpler architectures for jet classification.
arXiv Detail & Related papers (2024-07-03T22:00:35Z)
- Flow Matching Beyond Kinematics: Generating Jets with Particle-ID and Trajectory Displacement Information [0.0]
We introduce the first generative model trained on the JetClass dataset.
Our model generates jets at the constituent level, and it is a permutation-equivariant continuous normalizing flow (CNF) trained with the flow matching technique.
For the first time, we also introduce a generative model that goes beyond the kinematic features of jet constituents.
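As a rough illustration of the flow-matching technique mentioned here, the sketch below shows a generic flow-matching training objective with a linear interpolation path; it is not the paper's model, and `velocity_net` stands in for any permutation-equivariant network mapping a set of constituent features plus a time scalar to per-constituent velocities.
```python
# Minimal flow-matching training loss (generic sketch, not the paper's code).
import torch

def flow_matching_loss(velocity_net, x1):
    """Conditional flow-matching loss with a straight-line probability path.

    x1: (batch, n_particles, n_features) real jet constituents.
    """
    x0 = torch.randn_like(x1)                      # noise sample
    t = torch.rand(x1.shape[0], 1, 1, device=x1.device)
    xt = (1 - t) * x0 + t * x1                     # point on the straight-line path
    target_velocity = x1 - x0                      # velocity of that path
    pred = velocity_net(xt, t)
    return ((pred - target_velocity) ** 2).mean()
```
Generation then corresponds to integrating dx/dt = velocity_net(x, t) from t = 0 to t = 1 with an ODE solver, starting from Gaussian noise.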
arXiv Detail & Related papers (2023-11-30T19:00:02Z)
- PCN: A Deep Learning Approach to Jet Tagging Utilizing Novel Graph Construction Methods and Chebyshev Graph Convolutions [0.0]
Jet tagging is a classification problem in high-energy physics experiments.
Current approaches use deep learning to uncover hidden patterns in complex collision data.
We propose a graph-based representation of a jet that encodes the most information possible.
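For readers unfamiliar with Chebyshev graph convolutions, the layer below is a generic sketch of the operation (filters as Chebyshev polynomials of a rescaled graph Laplacian); it is not the PCN implementation, and the class name, initialization, and rescaling convention are illustrative assumptions.
```python
# Generic Chebyshev graph convolution layer (sketch, not the PCN code).
import torch
import torch.nn as nn

class ChebConv(nn.Module):
    def __init__(self, in_dim, out_dim, K):
        super().__init__()
        # One weight matrix per Chebyshev order.
        self.weights = nn.Parameter(torch.randn(K, in_dim, out_dim) * 0.1)

    def forward(self, x, L_scaled):
        # x: (n_nodes, in_dim); L_scaled: Laplacian rescaled to [-1, 1],
        # e.g. 2 L / lambda_max - I.
        Tx_prev, Tx = x, L_scaled @ x                    # T_0(L)x, T_1(L)x
        out = Tx_prev @ self.weights[0]
        if self.weights.shape[0] > 1:
            out = out + Tx @ self.weights[1]
        for k in range(2, self.weights.shape[0]):
            Tx_next = 2 * (L_scaled @ Tx) - Tx_prev      # Chebyshev recurrence
            out = out + Tx_next @ self.weights[k]
            Tx_prev, Tx = Tx, Tx_next
        return out
```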
arXiv Detail & Related papers (2023-09-12T23:20:19Z)
- Geometric Graph Filters and Neural Networks: Limit Properties and Discriminability Trade-offs [122.06927400759021]
We study the relationship between a graph neural network (GNN) and a manifold neural network (MNN) when the graph is constructed from a set of points sampled from the manifold.
We prove non-asymptotic error bounds showing that convolutional filters and neural networks on these graphs converge to convolutional filters and neural networks on the continuous manifold.
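A toy version of the construction studied there, under illustrative assumptions (points sampled from the unit circle, a Gaussian-kernel graph, fixed filter coefficients): build a geometric graph from manifold samples and apply a polynomial graph filter h(L)x; as the number of samples grows, such filters approach their manifold counterparts.
```python
# Toy geometric graph filter on points sampled from a manifold (sketch only).
import numpy as np

rng = np.random.default_rng(1)
n = 200
theta = rng.uniform(0, 2 * np.pi, n)
pts = np.stack([np.cos(theta), np.sin(theta)], axis=1)   # samples on the unit circle

# Gaussian-kernel adjacency and symmetric normalized Laplacian.
d2 = ((pts[:, None, :] - pts[None, :, :]) ** 2).sum(-1)
W = np.exp(-d2 / 0.1)
np.fill_diagonal(W, 0.0)
deg = W.sum(1)
L = np.eye(n) - W / np.sqrt(np.outer(deg, deg))

# Polynomial (convolutional) graph filter with fixed coefficients h_k.
h = [1.0, -0.5, 0.25]
x = np.sin(3 * theta)                                    # smooth signal on the manifold
y = sum(hk * np.linalg.matrix_power(L, k) @ x for k, hk in enumerate(h))
```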
arXiv Detail & Related papers (2023-05-29T08:27:17Z)
- Interpretable Joint Event-Particle Reconstruction for Neutrino Physics at NOvA with Sparse CNNs and Transformers [124.29621071934693]
We present a novel neural network architecture that combines the spatial learning enabled by convolutions with the contextual learning enabled by attention.
TransformerCVN simultaneously classifies each event and reconstructs every individual particle's identity.
This architecture enables us to perform several interpretability studies which provide insights into the network's predictions.
arXiv Detail & Related papers (2023-03-10T20:36:23Z)
- Structure Embedded Nucleus Classification for Histopathology Images [51.02953253067348]
Most neural-network-based methods are limited by the local receptive field of convolutions.
We propose a novel polygon-structure feature learning mechanism that transforms a nucleus contour into a sequence of points sampled in order.
Next, we convert a histopathology image into a graph structure with nuclei as nodes, and build a graph neural network to embed the spatial distribution of nuclei into their representations.
arXiv Detail & Related papers (2023-02-22T14:52:06Z)
- Tangent Bundle Filters and Neural Networks: from Manifolds to Cellular Sheaves and Back [114.01902073621577]
We use a convolution operation over the tangent bundle to define tangent bundle filters and tangent bundle neural networks (TNNs).
We discretize TNNs both in space and time domains, showing that their discrete counterpart is a principled variant of the recently introduced Sheaf Neural Networks.
We numerically evaluate the effectiveness of the proposed architecture on a denoising task of a tangent vector field over the unit 2-sphere.
arXiv Detail & Related papers (2022-10-26T21:55:45Z)
- Transformer with Implicit Edges for Particle-based Physics Simulation [135.77656965678196]
Transformer with Implicit Edges (TIE) captures the rich semantics of particle interactions in an edge-free manner.
We evaluate our model on diverse domains of varying complexity and materials.
arXiv Detail & Related papers (2022-07-22T03:45:29Z)
- Particle Cloud Generation with Message Passing Generative Adversarial Networks [14.737885252814273]
In high energy physics, jets are collections of correlated particles produced ubiquitously in particle collisions.
Machine-learning-based generative models, such as generative adversarial networks (GANs), have the potential to significantly accelerate LHC jet simulations.
We introduce a new particle cloud dataset (JetNet), and, due to similarities between particle and point clouds, apply to it existing point cloud GANs.
arXiv Detail & Related papers (2021-06-22T04:21:16Z)
- Image-Based Jet Analysis [2.5382095320488665]
Image-based jet analysis is built upon the jet image representation, which enables a direct connection between high energy physics and the fields of computer vision and deep learning.
We review methods to understand what these models have learned and their sensitivity to uncertainties.
Beyond jet classification, several other applications of jet image based techniques, including energy estimation, pileup noise reduction, data generation, and anomaly detection are discussed.
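A minimal sketch of the jet image representation itself: a pT-weighted 2D histogram of constituents in the (eta, phi) plane around the jet axis. The function name, pixel count, and image width below are illustrative choices, and the inputs are random placeholders rather than real jet data.
```python
# Minimal construction of a jet image (illustrative sketch).
import numpy as np

def jet_image(pt, eta, phi, jet_eta, jet_phi, npix=32, width=0.8):
    """Pixelate constituents into an (npix x npix) pT-weighted image."""
    deta = eta - jet_eta
    dphi = (phi - jet_phi + np.pi) % (2 * np.pi) - np.pi   # wrap phi into [-pi, pi)
    bins = np.linspace(-width, width, npix + 1)
    image, _, _ = np.histogram2d(deta, dphi, bins=(bins, bins), weights=pt)
    return image

# Toy usage with random constituents.
rng = np.random.default_rng(2)
pt = rng.exponential(10.0, size=50)
eta = rng.normal(0.0, 0.3, size=50)
phi = rng.normal(0.0, 0.3, size=50)
img = jet_image(pt, eta, phi, jet_eta=0.0, jet_phi=0.0)
```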
arXiv Detail & Related papers (2020-12-17T16:42:29Z)
- Spatio-Temporal Inception Graph Convolutional Networks for Skeleton-Based Action Recognition [126.51241919472356]
We design a simple and highly modularized graph convolutional network architecture for skeleton-based action recognition.
Our network is constructed by repeating a building block that aggregates multi-granularity information from both the spatial and temporal paths.
arXiv Detail & Related papers (2020-11-26T14:43:04Z)