Bayesian and Convolutional Networks for Hierarchical Morphological Classification of Galaxies
- URL: http://arxiv.org/abs/2405.02366v1
- Date: Fri, 3 May 2024 06:48:53 GMT
- Title: Bayesian and Convolutional Networks for Hierarchical Morphological Classification of Galaxies
- Authors: Jonathan Serrano-Pérez, Raquel Díaz Hernández, L. Enrique Sucar
- Abstract summary: This work is focused on the morphological classification of galaxies following the Hubble sequence in which the different classes are arranged in a hierarchy.
The proposed method, BCNN, is composed of two main modules.
BCNN performed better than several CNNs on multiple evaluation measures, reaching the following scores: 67% exact match, 78% accuracy, and 83% hierarchical F-measure.
- Score: 1.474723404975345
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: This work focuses on the morphological classification of galaxies following the Hubble sequence, in which the different classes are arranged in a hierarchy. The proposed method, BCNN, is composed of two main modules. First, a convolutional neural network (CNN) is trained with images of the different classes of galaxies (image augmentation is carried out to balance some classes); the CNN outputs a probability for each class in the hierarchy, and these predictions feed the second module. The second module is a Bayesian network that represents the hierarchy and improves prediction accuracy: it combines the predictions of the first phase through probabilistic inference over the network, so that a consistent prediction is obtained while the hierarchical constraint is maintained (in a hierarchy, an instance associated with a node must also be associated with all of its ancestors). Images from the Hubble telescope were collected and labeled by experts and used to perform the experiments. The results show that BCNN performed better than several CNNs on multiple evaluation measures, reaching the following scores: 67% exact match, 78% accuracy, and 83% hierarchical F-measure.
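As a rough illustration of the two-module idea described in the abstract, the following Python sketch combines hypothetical per-node CNN probabilities with a consistency step over a toy Hubble-like hierarchy, and computes a hierarchical F-measure over ancestor-closed label sets. It is not the authors' implementation: the hierarchy, class names, threshold, and function names are invented for illustration, and the paper's Bayesian-network inference is replaced here by a much simpler rule (a node's score is capped by its ancestors' scores) that merely preserves the hierarchical constraint.

```python
# Minimal sketch of a BCNN-style hierarchical prediction step.
# NOT the authors' implementation: the hierarchy, class names, threshold, and the
# consistency rule below are illustrative assumptions; the paper instead performs
# probabilistic inference over a Bayesian network that represents the hierarchy.
from typing import Dict, List, Optional

# Hypothetical two-level Hubble-like hierarchy: child -> parent (None = top level).
PARENT: Dict[str, Optional[str]] = {
    "elliptical": None, "spiral": None,
    "E0": "elliptical", "E7": "elliptical",
    "Sa": "spiral", "Sb": "spiral",
}

def ancestors(node: str) -> List[str]:
    """Chain of ancestors of `node`, nearest parent first."""
    chain, parent = [], PARENT[node]
    while parent is not None:
        chain.append(parent)
        parent = PARENT[parent]
    return chain

def consistent_prediction(cnn_probs: Dict[str, float], threshold: float = 0.5) -> List[str]:
    """Turn per-node CNN probabilities into a hierarchy-consistent label set.

    Simplification: each node's score is capped by its ancestors' scores, and every
    predicted node drags in its ancestors, so a node is never predicted without them.
    """
    capped = {n: min([p] + [cnn_probs[a] for a in ancestors(n)]) for n, p in cnn_probs.items()}
    predicted = {n for n, p in capped.items() if p >= threshold}
    for n in list(predicted):
        predicted.update(ancestors(n))  # enforce the hierarchical constraint
    return sorted(predicted)

def hierarchical_f1(pred: List[str], true: List[str]) -> float:
    """Hierarchical F-measure: F1 computed over ancestor-closed predicted/true label sets."""
    p, t = set(pred), set(true)
    if not p or not t:
        return 0.0
    overlap = len(p & t)
    hp, hr = overlap / len(p), overlap / len(t)
    return 0.0 if hp + hr == 0 else 2 * hp * hr / (hp + hr)

if __name__ == "__main__":
    # Stand-in for the CNN module's per-node outputs on one galaxy image.
    cnn_probs = {"elliptical": 0.2, "spiral": 0.8, "E0": 0.1, "E7": 0.05, "Sa": 0.7, "Sb": 0.3}
    pred = consistent_prediction(cnn_probs)
    print("consistent prediction:", pred)                                      # ['Sa', 'spiral']
    print("hF1 vs {'spiral','Sb'}:", hierarchical_f1(pred, ["spiral", "Sb"]))  # 0.5
```

With the toy inputs above, the spiral branch is kept (prediction ['Sa', 'spiral']) and the hierarchical F-measure against a ground truth of {spiral, Sb} is 0.5: the superclass matches but the leaf does not, which is exactly the kind of partial credit the hierarchical measures in the abstract are meant to capture.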
Related papers
- The Heterophilic Snowflake Hypothesis: Training and Empowering GNNs for Heterophilic Graphs [59.03660013787925]
We introduce the Heterophily Snowflake Hypothesis and provide an effective solution to guide and facilitate research on heterophilic graphs.
Our observations show that our framework acts as a versatile operator for diverse tasks.
It can be integrated into various GNN frameworks, boosting performance in-depth and offering an explainable approach to choosing the optimal network depth.
arXiv Detail & Related papers (2024-06-18T12:16:00Z) - RankDNN: Learning to Rank for Few-shot Learning [70.49494297554537]
This paper introduces a new few-shot learning pipeline that casts relevance ranking for image retrieval as binary ranking relation classification.
It provides a new perspective on few-shot learning and is complementary to state-of-the-art methods.
arXiv Detail & Related papers (2022-11-28T13:59:31Z) - Do We Really Need a Learnable Classifier at the End of Deep Neural Network? [118.18554882199676]
We study the potential of learning a neural network for classification with the classifier randomly initialized as an ETF and fixed during training.
Our experimental results show that our method is able to achieve similar performance on image classification for balanced datasets.
arXiv Detail & Related papers (2022-03-17T04:34:28Z) - Rethinking Nearest Neighbors for Visual Classification [56.00783095670361]
k-NN is a lazy learning method that aggregates the distance between the test image and top-k neighbors in a training set.
We adopt k-NN with pre-trained visual representations produced by either supervised or self-supervised methods in two steps; a minimal sketch of this k-NN-over-features step appears after this list.
Via extensive experiments on a wide range of classification tasks, our study reveals the generality and flexibility of k-NN integration.
arXiv Detail & Related papers (2021-12-15T20:15:01Z) - Learning Hierarchical Graph Neural Networks for Image Clustering [81.5841862489509]
We propose a hierarchical graph neural network (GNN) model that learns how to cluster a set of images into an unknown number of identities.
Our hierarchical GNN uses a novel approach to merge connected components predicted at each level of the hierarchy to form a new graph at the next level.
arXiv Detail & Related papers (2021-07-03T01:28:42Z) - Making CNNs Interpretable by Building Dynamic Sequential Decision Forests with Top-down Hierarchy Learning [62.82046926149371]
We propose a generic model transfer scheme to make Convolutional Neural Networks (CNNs) interpretable.
We achieve this by building a differentiable decision forest on top of CNNs.
We name the transferred model deep Dynamic Sequential Decision Forest (dDSDF).
arXiv Detail & Related papers (2021-06-05T07:41:18Z) - An evidential classifier based on Dempster-Shafer theory and deep learning [6.230751621285322]
We propose a new classification system based on Dempster-Shafer (DS) theory and a convolutional neural network (CNN) architecture for set-valued classification.
Experiments on image recognition, signal processing, and semantic-relationship classification tasks demonstrate that the proposed combination of deep CNN, DS layer, and expected utility layer makes it possible to improve classification accuracy.
arXiv Detail & Related papers (2021-03-25T01:29:05Z) - DeepMerge: Classifying High-redshift Merging Galaxies with Deep Neural Networks [0.0]
We show the use of convolutional neural networks (CNNs) for the task of distinguishing between merging and non-merging galaxies in simulated images.
We extract images of merging and non-merging galaxies from the Illustris-1 cosmological simulation and apply observational and experimental noise.
The test-set classification accuracy of the CNN is 79% for pristine images and 76% for noisy images.
arXiv Detail & Related papers (2020-04-24T20:36:06Z) - Multilayer Dense Connections for Hierarchical Concept Classification [3.6093339545734886]
We propose a multilayer dense connectivity for concurrent prediction of category and its conceptual superclasses in hierarchical order by the same CNN.
We experimentally demonstrate that our proposed network can simultaneously predict both the coarse superclasses and finer categories better than several existing algorithms in multiple datasets.
arXiv Detail & Related papers (2020-03-19T20:56:09Z)
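For the "Rethinking Nearest Neighbors" entry above, the following is a minimal sketch of k-NN classification over pre-trained features. It is illustrative only and not that paper's pipeline: the feature arrays, the value of k, and the cosine-similarity majority vote are assumptions, standing in for embeddings produced offline by any frozen supervised or self-supervised backbone.

```python
# Minimal k-NN-over-pretrained-features sketch (illustrative; not the pipeline of
# "Rethinking Nearest Neighbors for Visual Classification"). Assumes the rows of
# `train_feats` / `test_feats` are L2-normalised embeddings from a frozen backbone.
import numpy as np

def knn_predict(train_feats: np.ndarray, train_labels: np.ndarray,
                test_feats: np.ndarray, k: int = 5) -> np.ndarray:
    """Label each test row by majority vote over its k most cosine-similar training rows."""
    sims = test_feats @ train_feats.T        # (n_test, n_train) cosine similarities
    topk = np.argsort(-sims, axis=1)[:, :k]  # indices of the k nearest neighbours
    return np.array([np.bincount(train_labels[row]).argmax() for row in topk])

# Toy usage with random stand-in features (2 classes, 16-dim embeddings).
rng = np.random.default_rng(0)
train = rng.normal(size=(20, 16))
train /= np.linalg.norm(train, axis=1, keepdims=True)
labels = np.array([0] * 10 + [1] * 10)
test = train[[0, 5, 15]] + 0.01 * rng.normal(size=(3, 16))
test /= np.linalg.norm(test, axis=1, keepdims=True)
print(knn_predict(train, labels, test, k=1))  # [0 0 1]: each test point recovers its source row's label
```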