Morphological Classification of Radio Galaxies using Semi-Supervised
Group Equivariant CNNs
- URL: http://arxiv.org/abs/2306.00031v1
- Date: Wed, 31 May 2023 06:50:32 GMT
- Title: Morphological Classification of Radio Galaxies using Semi-Supervised
Group Equivariant CNNs
- Authors: Mir Sazzat Hossain (1), Sugandha Roy (1), K. M. B. Asad (1 and 2 and
3), Arshad Momen (1 and 2), Amin Ahsan Ali (1), M Ashraful Amin (1), A. K. M.
Mahbubur Rahman (1) ((1) Center for Computational & Data Sciences,
Independent University, Bangladesh, (2) Department of Physical Sciences,
Independent University, Bangladesh, (3) Astronomy and Radio Research Group,
SETS, Independent University, Bangladesh)
- Abstract summary: Out of the estimated few trillion galaxies, only around a million have been detected through radio frequencies.
We employ a semi-supervised learning approach to classify them into the known Fanaroff-Riley Type I (FRI) and Type II (FRII) categories.
A Group Equivariant Convolutional Neural Network (G-CNN) was used as an encoder of the state-of-the-art self-supervised methods SimCLR and BYOL.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Out of the estimated few trillion galaxies, only around a million have been
detected through radio frequencies, and only a tiny fraction, approximately a
thousand, have been manually classified. We have addressed this disparity
between labeled and unlabeled images of radio galaxies by employing a
semi-supervised learning approach to classify them into the known
Fanaroff-Riley Type I (FRI) and Type II (FRII) categories. A Group Equivariant
Convolutional Neural Network (G-CNN) was used as an encoder of the
state-of-the-art self-supervised methods SimCLR (A Simple Framework for
Contrastive Learning of Visual Representations) and BYOL (Bootstrap Your Own
Latent). The G-CNN preserves the equivariance for the Euclidean Group E(2),
enabling it to effectively learn the representation of globally oriented
feature maps. After representation learning, we trained a fully-connected
classifier and fine-tuned the trained encoder with labeled data. Our findings
demonstrate that our semi-supervised approach outperforms existing
state-of-the-art methods across several metrics, including cluster quality,
convergence rate, accuracy, precision, recall, and the F1-score. Moreover,
statistical significance testing via a t-test revealed that our method
surpasses the performance of a fully supervised G-CNN. This study emphasizes
the importance of semi-supervised learning in radio galaxy classification,
where labeled data are still scarce, but the prospects for discovery are
immense.
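To make the training recipe above concrete, here is a minimal, illustrative sketch in PyTorch of the two-stage pipeline the abstract describes: SimCLR-style contrastive pre-training on unlabeled images, followed by supervised fine-tuning on the small labeled FRI/FRII set. The plain CNN encoder, the `augment` callable, and all hyperparameters are placeholders of my own, not the authors' settings; in the paper the encoder is an E(2)-equivariant G-CNN (e.g., built with a library such as e2cnn/escnn), and BYOL is used as an alternative to SimCLR.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class Encoder(nn.Module):
    """Stand-in encoder; the paper uses an E(2)-equivariant G-CNN instead."""
    def __init__(self, feat_dim=128):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(1, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # SimCLR projection head (used only during pre-training).
        self.projector = nn.Sequential(
            nn.Linear(64, 64), nn.ReLU(), nn.Linear(64, feat_dim)
        )

    def forward(self, x, project=True):
        h = self.backbone(x)                      # (B, 64) pooled features
        return self.projector(h) if project else h


def nt_xent(z1, z2, temperature=0.5):
    """SimCLR's NT-Xent contrastive loss for two batches of augmented views."""
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)   # (2B, D)
    sim = z @ z.t() / temperature                        # cosine similarities
    sim.masked_fill_(torch.eye(len(z), dtype=torch.bool, device=z.device),
                     float("-inf"))                      # drop self-pairs
    b = z1.size(0)
    targets = torch.cat([torch.arange(b, 2 * b),         # positive of i is i+B
                         torch.arange(0, b)]).to(z.device)
    return F.cross_entropy(sim, targets)


def pretrain(encoder, unlabeled_loader, augment, epochs=10, lr=1e-3):
    """Self-supervised stage: learn representations from unlabeled cutouts."""
    opt = torch.optim.Adam(encoder.parameters(), lr=lr)
    for _ in range(epochs):
        for x in unlabeled_loader:                # x: (B, 1, H, W) image batch
            loss = nt_xent(encoder(augment(x)), encoder(augment(x)))
            opt.zero_grad(); loss.backward(); opt.step()


def finetune(encoder, labeled_loader, num_classes=2, epochs=10, lr=1e-4):
    """Supervised stage: attach an FRI/FRII head and fine-tune the encoder."""
    head = nn.Linear(64, num_classes)
    opt = torch.optim.Adam(list(encoder.parameters()) + list(head.parameters()),
                           lr=lr)
    for _ in range(epochs):
        for x, y in labeled_loader:
            logits = head(encoder(x, project=False))
            loss = F.cross_entropy(logits, y)
            opt.zero_grad(); loss.backward(); opt.step()
    return head
```

In practice one would run the supervised stage several times with different seeds and compare the resulting accuracies of the semi-supervised and fully supervised models with a two-sample t-test (e.g., scipy.stats.ttest_ind), mirroring the significance test reported in the abstract.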
Related papers
- Enhancing Visual Continual Learning with Language-Guided Supervision [76.38481740848434]
Continual learning aims to empower models to learn new tasks without forgetting previously acquired knowledge.
We argue that the scarce semantic information conveyed by the one-hot labels hampers the effective knowledge transfer across tasks.
Specifically, we use pre-trained language models (PLMs) to generate semantic targets for each class, which are frozen and serve as supervision signals.
arXiv Detail & Related papers (2024-03-24T12:41:58Z)
- Generalized Category Discovery with Clustering Assignment Consistency [56.92546133591019]
Generalized category discovery (GCD) is a recently proposed open-world task.
We propose a co-training-based framework that encourages clustering consistency.
Our method achieves state-of-the-art performance on three generic benchmarks and three fine-grained visual recognition datasets.
arXiv Detail & Related papers (2023-10-30T00:32:47Z)
- GenCo: An Auxiliary Generator from Contrastive Learning for Enhanced Few-Shot Learning in Remote Sensing [9.504503675097137]
We introduce a generator-based contrastive learning framework (GenCo) that pre-trains backbones and simultaneously explores variants of feature samples.
In fine-tuning, the auxiliary generator can be used to enrich limited labeled data samples in feature space.
We demonstrate the effectiveness of our method in improving few-shot learning performance on two key remote sensing datasets.
arXiv Detail & Related papers (2023-07-27T03:59:19Z)
- PromptCAL: Contrastive Affinity Learning via Auxiliary Prompts for Generalized Novel Category Discovery [39.03732147384566]
The Generalized Novel Category Discovery (GNCD) setting aims to categorize unlabeled training data coming from known and novel classes.
We propose Contrastive Affinity Learning method with auxiliary visual Prompts, dubbed PromptCAL, to address this challenging problem.
Our approach discovers reliable pairwise sample affinities to learn better semantic clustering of both known and novel classes for the class token and visual prompts.
arXiv Detail & Related papers (2022-12-11T20:06:14Z)
- Semi-Supervised Domain Adaptation for Cross-Survey Galaxy Morphology Classification and Anomaly Detection [57.85347204640585]
We develop a Universal Domain Adaptation method, DeepAstroUDA.
It can be applied to datasets with different types of class overlap.
For the first time, we demonstrate the successful use of domain adaptation on two very different observational datasets.
arXiv Detail & Related papers (2022-11-01T18:07:21Z)
- SCARF: Self-Supervised Contrastive Learning using Random Feature Corruption [72.35532598131176]
We propose SCARF, a technique for contrastive learning, where views are formed by corrupting a random subset of features.
We show that SCARF complements existing strategies and outperforms alternatives like autoencoders.
arXiv Detail & Related papers (2021-06-29T08:08:33Z)
- No Fear of Heterogeneity: Classifier Calibration for Federated Learning with Non-IID Data [78.69828864672978]
A central challenge in training classification models in the real-world federated system is learning with non-IID data.
We propose a novel and simple algorithm called Classifier Calibration with Virtual Representations (CCVR), which adjusts the classifier using virtual representations sampled from an approximated Gaussian mixture model.
Experimental results demonstrate that CCVR achieves state-of-the-art performance on popular federated learning benchmarks including CIFAR-10, CIFAR-100, and CINIC-10.
arXiv Detail & Related papers (2021-06-09T12:02:29Z)
- Binary Classification from Multiple Unlabeled Datasets via Surrogate Set Classification [94.55805516167369]
We propose a new approach for binary classification from $m$ U-sets for $m \ge 2$.
Our key idea is to consider an auxiliary classification task called surrogate set classification (SSC).
arXiv Detail & Related papers (2021-02-01T07:36:38Z)
- EC-GAN: Low-Sample Classification using Semi-Supervised Algorithms and GANs [0.0]
Semi-supervised learning has been gaining attention as it allows for performing image analysis tasks such as classification with limited labeled data.
Some popular algorithms using Generative Adversarial Networks (GANs) for semi-supervised classification share a single architecture for classification and discrimination.
This may require a model to converge to a separate data distribution for each task, which may reduce overall performance.
We propose a novel GAN model, namely External Classifier GAN (EC-GAN), that utilizes GANs and semi-supervised algorithms to improve classification in fully-supervised tasks.
arXiv Detail & Related papers (2020-12-26T05:58:00Z)
- Data-Efficient Classification of Radio Galaxies [0.0]
In this paper, we explore the task of radio galaxy classification based on morphology using deep learning methods.
We apply few-shot learning techniques based on Twin Networks and transfer learning techniques using a pre-trained DenseNet model.
We achieve a classification accuracy of over 92% with our best-performing model, the biggest source of confusion being between Bent and FRII type galaxies.
arXiv Detail & Related papers (2020-11-26T14:28:19Z)
- DeepMerge: Classifying High-redshift Merging Galaxies with Deep Neural Networks [0.0]
We show the use of convolutional neural networks (CNNs) for the task of distinguishing between merging and non-merging galaxies in simulated images.
We extract images of merging and non-merging galaxies from the Illustris-1 cosmological simulation and apply observational and experimental noise.
The test-set classification accuracy of the CNN is 79% for pristine images and 76% for noisy images.
arXiv Detail & Related papers (2020-04-24T20:36:06Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the information and is not responsible for any consequences arising from its use.