2D Convolutional Neural Network for Event Reconstruction in IceCube DeepCore
- URL: http://arxiv.org/abs/2307.16373v1
- Date: Mon, 31 Jul 2023 02:37:36 GMT
- Title: 2D Convolutional Neural Network for Event Reconstruction in IceCube DeepCore
- Authors: J.H. Peterson, M. Prado Rodriguez, K. Hanson (for the IceCube Collaboration)
- Abstract summary: IceCube DeepCore is an extension of the IceCube Neutrino Observatory designed to measure GeV scale atmospheric neutrino interactions.
Distinguishing muon neutrinos from other flavors and reconstructing inelasticity are especially difficult tasks at GeV scale energies.
We present a new CNN model that exploits time and depth translational symmetry in IceCube DeepCore data.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: IceCube DeepCore is an extension of the IceCube Neutrino Observatory designed
to measure GeV scale atmospheric neutrino interactions for the purpose of
neutrino oscillation studies. Distinguishing muon neutrinos from other flavors
and reconstructing inelasticity are especially difficult tasks at GeV scale
energies in IceCube DeepCore due to sparse instrumentation. Convolutional
neural networks (CNNs) have been found to have better success at neutrino event
reconstruction than conventional likelihood-based methods. In this
contribution, we present a new CNN model that exploits time and depth
translational symmetry in IceCube DeepCore data and present the model's
performance, specifically for flavor identification and inelasticity
reconstruction.
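The key idea, exploiting translational symmetry along the depth and time axes through shared convolutional weights, can be illustrated with a minimal sketch. The event-image shape and the photoelectron-count encoding below are illustrative assumptions, not the paper's actual preprocessing:

```python
import numpy as np

def conv2d_valid(image, kernel):
    """Valid-mode 2D cross-correlation of `kernel` over `image`.

    The same kernel weights are applied at every (depth, time) offset;
    this weight sharing is what encodes translational symmetry along
    the depth and time axes.
    """
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.empty((ih - kh + 1, iw - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# Hypothetical event image: rows = DOM depth positions along a string,
# columns = time bins, values = photoelectron counts (illustrative only).
rng = np.random.default_rng(0)
event = rng.poisson(0.3, size=(60, 128)).astype(float)
kernel = rng.normal(size=(3, 5))  # small (depth x time) filter

fmap = conv2d_valid(event, kernel)
print(fmap.shape)  # (58, 124)
```

A useful sanity check of the symmetry is equivariance: cropping the event by one time bin shifts the feature map by exactly one column.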
Related papers
- Recent neutrino oscillation result with the IceCube experiment
The IceCube South Pole Neutrino Observatory is a Cherenkov detector instrumented in a cubic kilometer of ice at the South Pole.
IceCube's primary scientific goal is the detection of TeV neutrino emissions from astrophysical sources.
Advances in physics sensitivity have recently been achieved by employing Convolutional Neural Networks to reconstruct neutrino interactions in the DeepCore detector.
arXiv Detail & Related papers (2023-07-29T01:12:26Z)
- Trigger-Level Event Reconstruction for Neutrino Telescopes Using Sparse Submanifold Convolutional Neural Networks
Convolutional neural networks (CNNs) have seen extensive applications in scientific data analysis, including in neutrino telescopes.
We propose sparse submanifold convolutions (SSCNNs) to address the inefficiency of applying dense CNNs to such sparse data.
We show that the SSCNN event reconstruction performance is comparable to or better than traditional and machine learning algorithms.
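The defining property of a submanifold convolution, that outputs are computed only at already-active sites so the sparsity pattern survives the layer, can be sketched in a few lines. This is a toy dictionary-based illustration, not the SSCNN library code:

```python
import numpy as np

def sparse_submanifold_conv2d(active, kernel):
    """Toy 2D submanifold convolution over a sparse hit pattern.

    `active` maps (row, col) -> feature value. Outputs are computed
    only at sites that are already active, so the sparsity pattern is
    preserved through the layer -- the defining SSCNN property.
    """
    kh, kw = kernel.shape
    oh, ow = kh // 2, kw // 2
    out = {}
    for (r, c) in active:
        acc = 0.0
        for dr in range(-oh, oh + 1):
            for dc in range(-ow, ow + 1):
                v = active.get((r + dr, c + dc))
                if v is not None:
                    acc += v * kernel[dr + oh, dc + ow]
        out[(r, c)] = acc
    return out

# Illustrative sparse event: three hit DOMs in a (depth, time) grid.
kernel = np.array([[0., 1., 0.],
                   [1., 2., 1.],
                   [0., 1., 0.]])
hits = {(3, 4): 1.0, (3, 5): 2.0, (10, 20): 1.5}
out = sparse_submanifold_conv2d(hits, kernel)
print(sorted(out))  # output sites are exactly the input's active sites
```

Because only active sites are visited, the cost scales with the number of hits rather than with the full detector volume.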
arXiv Detail & Related papers (2023-03-15T17:59:01Z)
- Interpretable Joint Event-Particle Reconstruction for Neutrino Physics at NOvA with Sparse CNNs and Transformers
We present a novel neural network architecture that combines the spatial learning enabled by convolutions with the contextual learning enabled by attention.
TransformerCVN simultaneously classifies each event and reconstructs every individual particle's identity.
This architecture enables us to perform several interpretability studies which provide insights into the network's predictions.
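The attention ingredient that supplies the contextual learning in such hybrid architectures is standard scaled dot-product attention. A minimal numpy sketch follows; the token count and feature size are arbitrary placeholders, not details of TransformerCVN:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V.

    Every query attends to every key, supplying the global contextual
    learning that complements the local receptive fields of convolutions.
    """
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)
    scores -= scores.max(axis=-1, keepdims=True)   # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

# Illustrative self-attention over 6 "particle candidate" tokens with
# 8 features each (sizes are hypothetical).
rng = np.random.default_rng(1)
tokens = rng.normal(size=(6, 8))
out = scaled_dot_product_attention(tokens, tokens, tokens)
print(out.shape)  # (6, 8)
```

Each output row is a convex combination of the value rows, which is also what makes the attention weights directly inspectable in interpretability studies.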
arXiv Detail & Related papers (2023-03-10T20:36:23Z)
- Gradient Descent in Neural Networks as Sequential Learning in RKBS
We construct an exact power-series representation of the neural network in a finite neighborhood of the initial weights.
We prove that, regardless of width, the training sequence produced by gradient descent can be exactly replicated by regularized sequential learning.
arXiv Detail & Related papers (2023-02-01T03:18:07Z)
- GraphNeT: Graph neural networks for neutrino telescope event reconstruction
GraphNeT is an open-source Python framework for performing reconstruction tasks at neutrino telescopes using graph neural networks (GNNs).
GNNs from GraphNeT are flexible enough to be applied to data from all neutrino telescopes, including future projects such as IceCube extensions or P-ONE.
This means that GNN-based reconstruction can be used to provide state-of-the-art performance on most reconstruction tasks in neutrino telescopes, at real-time event rates, across experiments and physics analyses, with vast potential impact for neutrino and astro-particle physics.
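The basic message-passing step underlying GNN reconstruction can be sketched as a mean-aggregation layer. This is a generic GraphSAGE-style sketch, not the GraphNeT API; all names and shapes are illustrative:

```python
import numpy as np

def gnn_layer(node_feats, edges, W_self, W_nbr):
    """One mean-aggregation message-passing layer (GraphSAGE-style).

    node_feats: (N, F) array of per-node features; edges: directed
    (src, dst) pairs. Generic sketch only -- not the GraphNeT API.
    """
    n, f = node_feats.shape
    agg = np.zeros((n, f))
    deg = np.zeros(n)
    for s, d in edges:
        agg[d] += node_feats[s]   # message from src to dst
        deg[d] += 1
    agg /= np.maximum(deg, 1)[:, None]          # mean over neighbours
    return np.maximum(node_feats @ W_self + agg @ W_nbr, 0.0)  # ReLU

# Illustrative graph: 4 DOM hits as nodes with 3 features each,
# connected by a few directed edges (all shapes are placeholders).
rng = np.random.default_rng(2)
x = rng.normal(size=(4, 3))
edges = [(0, 1), (1, 0), (1, 2), (2, 3)]
W_self = rng.normal(size=(3, 5))
W_nbr = rng.normal(size=(3, 5))
h = gnn_layer(x, edges, W_self, W_nbr)
print(h.shape)  # (4, 5)
```

Because the layer operates on an arbitrary edge list rather than a fixed grid, the same weights apply to events of any size, which is what makes GNNs natural for telescopes with irregular sensor geometries.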
arXiv Detail & Related papers (2022-10-21T18:43:50Z)
- An advanced spatio-temporal convolutional recurrent neural network for storm surge predictions
We study the capability of artificial neural network models to emulate storm surge based on the storm track/size/intensity history.
This study presents a neural network model that can predict storm surge, informed by a database of synthetic storm simulations.
arXiv Detail & Related papers (2022-04-18T23:42:18Z)
- A Convolutional Neural Network based Cascade Reconstruction for the IceCube Neutrino Observatory
Deep neural networks can be extremely powerful, and their usage is computationally inexpensive once the networks are trained.
A reconstruction method based on convolutional architectures and hexagonally shaped kernels is presented.
It can improve upon the reconstruction accuracy, while reducing the time necessary to run the reconstruction by two to three orders of magnitude.
arXiv Detail & Related papers (2021-01-27T18:34:58Z)
- Neural Networks with Recurrent Generative Feedback
We instantiate this design on convolutional neural networks (CNNs), yielding the CNN-F model.
In the experiments, CNN-F shows considerably improved adversarial robustness over conventional feedforward CNNs on standard benchmarks.
arXiv Detail & Related papers (2020-07-17T19:32:48Z)
- A Generalized Neural Tangent Kernel Analysis for Two-layer Neural Networks
We show that noisy gradient descent with weight decay can still exhibit a "kernel-like" behavior.
This implies that the training loss converges linearly up to a certain accuracy.
We also establish a novel generalization error bound for two-layer neural networks trained by noisy gradient descent with weight decay.
arXiv Detail & Related papers (2020-02-10T18:56:15Z)
- On Random Kernels of Residual Architectures
We derive finite width and depth corrections for the Neural Tangent Kernel (NTK) of ResNets and DenseNets.
Our findings show that in ResNets, convergence to the NTK may occur when depth and width simultaneously tend to infinity.
In DenseNets, however, convergence of the NTK to its limit as the width tends to infinity is guaranteed.
arXiv Detail & Related papers (2020-01-28T16:47:53Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.