Gamma/hadron separation in the TAIGA experiment with neural network methods
- URL: http://arxiv.org/abs/2502.01500v1
- Date: Mon, 03 Feb 2025 16:34:44 GMT
- Title: Gamma/hadron separation in the TAIGA experiment with neural network methods
- Authors: E. O. Gres, A. P. Kryukov, P. A. Volchugov, J. J. Dubenskaya, D. P. Zhurov, S. P. Polyakov, E. B. Postnikov, A. A. Vlaskina,
- Abstract summary: The ability of neural network methods to select rare VHE gamma rays is investigated in the case when the cosmic-ray flux strongly prevails.
It is demonstrated that a signal above 5.5σ was obtained in 21 hours of Crab Nebula observations.
- Abstract: In this work, the ability of neural network methods to select rare VHE gamma rays is investigated in the case when the cosmic-ray flux strongly prevails (a ratio of up to 10^4 over the gamma-ray flux from a point source). This ratio holds for the Crab Nebula in the TeV energy range; the Crab is a well-studied source used for calibration and testing of various methods and installations in gamma-ray astronomy. The part of the TAIGA experiment that includes three Imaging Atmospheric Cherenkov Telescopes also observes this gamma-ray source. Cherenkov telescopes obtain images of Extensive Air Showers. In the standard processing method the images are analysed with Hillas parameters; alternatively, the images can be processed with convolutional neural networks. In this work we describe the main steps and results of the gamma/hadron separation task for the Crab Nebula with neural network methods. The results are compared with the standard processing method applied in the TAIGA collaboration, which uses Hillas parameter cuts. It is demonstrated that a signal above 5.5σ was obtained in 21 hours of Crab Nebula observations after processing the experimental data with the neural network method.
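The 5.5σ figure is a detection significance. In IACT analyses such significances are conventionally computed with the Li & Ma (1983) formula from on-source and off-source event counts; a minimal sketch is below (the counts and the exposure ratio alpha are illustrative placeholders, not values from the paper, and whether this analysis uses exactly this estimator is an assumption):

```python
import numpy as np

def li_ma_significance(n_on, n_off, alpha):
    """Li & Ma (1983), Eq. 17: significance of an on-source excess.

    n_on  -- event count in the on-source region
    n_off -- event count in the off-source (background) region
    alpha -- ratio of on-source to off-source exposure
    """
    n_on, n_off = float(n_on), float(n_off)
    term_on = n_on * np.log((1.0 + alpha) / alpha * n_on / (n_on + n_off))
    term_off = n_off * np.log((1.0 + alpha) * n_off / (n_on + n_off))
    return np.sqrt(2.0 * (term_on + term_off))

# Illustrative numbers only: 130 on-source events against an expected
# background of 0.1 * 1000 = 100 gives roughly a 2.7 sigma excess.
print(li_ma_significance(n_on=130, n_off=1000, alpha=0.1))
```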
Related papers
- Selection of gamma events from IACT images with deep learning methods [0.0]
Imaging Atmospheric Cherenkov Telescopes (IACTs) of the gamma-ray observatory TAIGA detect Extensive Air Showers (EASs).
The ability to separate gamma-ray images from the hadronic cosmic-ray background is one of the main features of this type of detector.
In actual IACT observations, simultaneous observation of the background and the gamma-ray source is required.
This observation mode (called wobbling) modifies images of events, which affects the quality of selection by neural networks.
arXiv Detail & Related papers (2024-01-30T13:07:24Z)
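The entry above separates gamma-ray images from the hadronic background with a neural network. A minimal PyTorch sketch of such a binary gamma/hadron classifier follows; the architecture and the 31x31 single-channel input are illustrative assumptions, not the network from the paper:

```python
import torch
import torch.nn as nn

class GammaHadronCNN(nn.Module):
    """Toy binary gamma/hadron classifier for IACT camera images."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, 1)  # single logit: gamma vs. hadron

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

model = GammaHadronCNN()
batch = torch.randn(8, 1, 31, 31)      # 8 fake camera images
p_gamma = torch.sigmoid(model(batch))  # per-image gamma probability
```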
- Generating Images of the M87* Black Hole Using GANs [1.0532948482859532]
We introduce Conditional Progressive Generative Adversarial Networks (CPGAN) to generate diverse black hole images.
GANs can be employed as cost-effective models for black hole image generation and reliably augment training datasets for other parameterization algorithms.
arXiv Detail & Related papers (2023-12-02T02:47:34Z)
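The entry above generates images conditioned on physical parameters. A minimal sketch of a conditional generator in that spirit (dimensions and architecture are assumptions; the paper's CPGAN is a progressive GAN, which is not reproduced here):

```python
import torch
import torch.nn as nn

class CondGenerator(nn.Module):
    """Toy conditional generator: noise + condition vector -> image."""
    def __init__(self, z_dim=64, cond_dim=3, img_size=32):
        super().__init__()
        self.img_size = img_size
        self.net = nn.Sequential(
            nn.Linear(z_dim + cond_dim, 256), nn.ReLU(),
            nn.Linear(256, img_size * img_size), nn.Tanh(),
        )

    def forward(self, z, cond):
        out = self.net(torch.cat([z, cond], dim=1))
        return out.view(-1, 1, self.img_size, self.img_size)

gen = CondGenerator()
# 4 images for 4 different (illustrative) physical parameter vectors.
imgs = gen(torch.randn(4, 64), torch.rand(4, 3))
```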
- Energy Reconstruction in Analysis of Cherenkov Telescopes Images in TAIGA Experiment Using Deep Learning Methods [0.0]
This paper presents the analysis of simulated Monte Carlo images by several deep learning methods for a single telescope (mono mode) and multiple IACT telescopes (stereo mode).
The quality of the energy reconstruction was estimated, and the energy spectra were analyzed using several types of neural networks.
arXiv Detail & Related papers (2022-11-16T15:24:32Z)
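For the energy-reconstruction entry above, a common formulation is a CNN that regresses log10 of the energy from the camera image and is trained against the Monte Carlo truth; a minimal sketch (the layer sizes and the log-energy target are assumptions):

```python
import torch
import torch.nn as nn

# Toy energy regressor: camera image -> log10(E/TeV); sizes illustrative.
energy_net = nn.Sequential(
    nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, 1),
)
images = torch.randn(8, 1, 31, 31)      # fake camera images
log10_e = energy_net(images)            # predicted log-energies, shape (8, 1)
mc_truth = torch.zeros(8, 1)            # placeholder MC true log-energies
loss = nn.MSELoss()(log10_e, mc_truth)  # regression loss for training
```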
- Supernova Light Curves Approximation based on Neural Network Models [53.180678723280145]
Photometric data-driven classification of supernovae has become a challenge with the advent of real-time processing of big data in astronomy.
Recent studies have demonstrated the superior quality of solutions based on various machine learning models.
We study the application of a multilayer perceptron (MLP), a Bayesian neural network (BNN), and normalizing flows (NF) to approximate observations for a single light curve.
arXiv Detail & Related papers (2022-06-27T13:46:51Z)
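For the light-curve entry above, the MLP variant amounts to fitting flux as a smooth function of time for one light curve; a minimal scikit-learn sketch on synthetic data (the bell-shaped curve and network size are illustrative, not a real supernova):

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Synthetic single light curve: noisy flux samples vs. time.
t = np.linspace(0.0, 100.0, 40)
flux = np.exp(-((t - 30.0) ** 2) / 200.0) + 0.05 * np.random.randn(t.size)

# Fit an MLP to the sparse samples, then evaluate on a dense time grid.
mlp = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000, random_state=0)
mlp.fit(t.reshape(-1, 1), flux)
flux_dense = mlp.predict(np.linspace(0.0, 100.0, 400).reshape(-1, 1))
```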
- Processing Images from Multiple IACTs in the TAIGA Experiment with Convolutional Neural Networks [62.997667081978825]
We use convolutional neural networks (CNNs) to analyze Monte Carlo-simulated images from the TAIGA experiment.
The analysis includes selecting the images corresponding to showers caused by gamma rays and estimating the energy of the gamma rays.
arXiv Detail & Related papers (2021-12-31T10:49:11Z)
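The multi-IACT entry above combines images from several telescopes; one common design is a per-telescope CNN with shared weights whose features are concatenated before the final prediction. A minimal sketch of that design (an assumption, not the paper's exact network):

```python
import torch
import torch.nn as nn

class StereoNet(nn.Module):
    """Toy stereo-mode model: shared encoder per telescope, features merged."""
    def __init__(self, n_tel=2):
        super().__init__()
        self.encoder = nn.Sequential(   # weights shared across telescopes
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.head = nn.Linear(16 * n_tel, 1)  # e.g. a gamma/hadron logit

    def forward(self, views):           # views: (batch, n_tel, 1, H, W)
        feats = [self.encoder(views[:, i]) for i in range(views.shape[1])]
        return self.head(torch.cat(feats, dim=1))

out = StereoNet(n_tel=2)(torch.randn(4, 2, 1, 31, 31))  # (4, 1) logits
```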
- Analysis of the HiSCORE Simulated Events in TAIGA Experiment Using Convolutional Neural Networks [77.34726150561087]
We propose to consider the use of convolutional neural networks in the task of determining air shower characteristics.
We use CNN to analyze HiSCORE events, treating them like images.
In addition, we present some preliminary results on the determination of the parameters of air showers.
arXiv Detail & Related papers (2021-12-19T15:18:56Z)
- The Preliminary Results on Analysis of TAIGA-IACT Images Using Convolutional Neural Networks [68.8204255655161]
The aim of the work is to study the possibility of applying machine learning to solve the tasks set for TAIGA-IACT.
The method of Convolutional Neural Networks (CNN) was applied to process and analyze Monte-Carlo events simulated with CORSIKA.
arXiv Detail & Related papers (2021-12-19T15:17:20Z)
- CNN Filter Learning from Drawn Markers for the Detection of Suggestive Signs of COVID-19 in CT Images [58.720142291102135]
We propose a method that does not require either large annotated datasets or backpropagation to estimate the filters of a convolutional neural network (CNN).
For a few CT images, the user draws markers at representative normal and abnormal regions.
The method generates a feature extractor composed of a sequence of convolutional layers, whose kernels are specialized in enhancing regions similar to the marked ones.
arXiv Detail & Related papers (2021-11-16T15:03:42Z)
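The marker-based entry above estimates CNN kernels directly from image patches around user-drawn markers instead of learning them by backpropagation. A minimal NumPy sketch of that idea (the patch size and the mean/norm normalization are assumptions):

```python
import numpy as np
from scipy.signal import correlate2d

def kernels_from_markers(image, marker_coords, k=5):
    """Build k x k conv kernels from normalized patches around marker pixels."""
    r = k // 2
    kernels = []
    for (y, x) in marker_coords:
        patch = image[y - r:y + r + 1, x - r:x + r + 1].astype(float)
        patch -= patch.mean()
        norm = np.linalg.norm(patch)
        kernels.append(patch / norm if norm > 0 else patch)
    return kernels

img = np.random.rand(64, 64)
kernels = kernels_from_markers(img, [(10, 12), (40, 33)])
# Each kernel responds most strongly to regions similar to its marked patch.
responses = [correlate2d(img, kern, mode="same") for kern in kernels]
```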
- Depth Estimation from Monocular Images and Sparse Radar Data [93.70524512061318]
In this paper, we explore the possibility of achieving a more accurate depth estimation by fusing monocular images and Radar points using a deep neural network.
We find that the noise in Radar measurements is one of the key reasons preventing existing fusion methods from being applied.
The experiments are conducted on the nuScenes dataset, one of the first datasets featuring Camera, Radar, and LiDAR recordings in diverse scenes and weather conditions.
arXiv Detail & Related papers (2020-09-30T19:01:33Z)
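For the depth entry above, a simple baseline is early fusion: a sparse radar depth channel concatenated with the RGB image before encoding. A minimal sketch of such a baseline (an assumption for illustration, not the paper's model):

```python
import torch
import torch.nn as nn

# Toy early-fusion net: RGB (3 ch) + sparse radar depth (1 ch) -> dense depth.
fusion_net = nn.Sequential(
    nn.Conv2d(4, 32, 3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 1, 3, padding=1),
)
rgb = torch.randn(2, 3, 64, 64)
radar = torch.zeros(2, 1, 64, 64)   # sparse: most pixels have no radar return
radar[:, :, ::16, ::16] = 10.0      # a few projected radar depths, in meters
depth = fusion_net(torch.cat([rgb, radar], dim=1))  # dense map, (2, 1, 64, 64)
```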
- A study of Neural networks point source extraction on simulated Fermi/LAT Telescope images [1.4528756508275622]
We present a method for point source extraction using a convolutional neural network (CNN) trained on our own artificial data set.
These images are raw photon count maps of 10x10 degrees covering energies from 1 to 10 GeV.
arXiv Detail & Related papers (2020-07-08T17:47:31Z)
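The last entry trains on artificial photon count maps; such a map can be simulated as Poisson counts from a smooth diffuse background plus a PSF-blurred point source. A minimal sketch (the pixel scale, count rates, and Gaussian PSF are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
npix = 100                               # 10x10 deg at 0.1 deg/pixel (assumed)
background = np.full((npix, npix), 2.0)  # mean diffuse counts per pixel

# One point source: a Gaussian PSF blob at a random position.
y0, x0 = rng.integers(20, 80, size=2)
yy, xx = np.mgrid[0:npix, 0:npix]
psf_sigma = 2.0                          # PSF width in pixels
source = 50.0 * np.exp(-((yy - y0) ** 2 + (xx - x0) ** 2) / (2 * psf_sigma ** 2))

counts = rng.poisson(background + source)  # raw photon count map
```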