Multilayer Dense Connections for Hierarchical Concept Classification
- URL: http://arxiv.org/abs/2003.09015v2
- Date: Mon, 22 Feb 2021 19:20:39 GMT
- Title: Multilayer Dense Connections for Hierarchical Concept Classification
- Authors: Toufiq Parag and Hongcheng Wang
- Abstract summary: We propose a multilayer dense connectivity for concurrent prediction of category and its conceptual superclasses in hierarchical order by the same CNN.
We experimentally demonstrate that our proposed network can simultaneously predict both the coarse superclasses and finer categories better than several existing algorithms in multiple datasets.
- Score: 3.6093339545734886
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Classification is a pivotal function for many computer vision tasks such as
object classification, detection, and scene segmentation. Multinomial logistic
regression with a single final layer of dense connections has become the
ubiquitous technique for CNN-based classification. While these classifiers
learn a mapping between the input and a set of output category classes, they
do not typically yield a comprehensive description of the category. In
particular, when a CNN-based image classifier correctly identifies an image of
a Chimpanzee, its output does not clarify that a Chimpanzee is a member of the
Primate, Mammal, and Chordate families, and is a living thing. We propose a
multilayer dense connectivity for concurrent prediction of a category and its
conceptual superclasses in hierarchical order by the same CNN. We
experimentally demonstrate that our proposed network can simultaneously
predict both the coarse superclasses and finer categories better than several
existing algorithms on multiple datasets.
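The paper does not publish code here; the following is a minimal numpy sketch of the core idea — a shared feature vector feeding separate dense heads that predict at each level of the hierarchy — with a hypothetical two-superclass taxonomy and random weights standing in for trained parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    # Numerically stable softmax over the last axis.
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# Hypothetical taxonomy: superclass id -> fine-category ids.
hierarchy = {0: [0, 1], 1: [2, 3]}

feat_dim, n_super, n_fine = 8, 2, 4
W_super = rng.normal(size=(feat_dim, n_super))  # dense head for superclasses
W_fine = rng.normal(size=(feat_dim, n_fine))    # dense head for fine categories

features = rng.normal(size=(1, feat_dim))       # stand-in for CNN features

# Both levels are predicted concurrently from the same features.
p_super = softmax(features @ W_super)
p_fine = softmax(features @ W_fine)

super_pred = int(p_super.argmax())
fine_pred = int(p_fine.argmax())
```

With trained heads, `fine_pred` would be expected to fall under `super_pred` in the taxonomy; the actual paper enforces this hierarchical consistency through its connectivity, which this sketch omits.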
Related papers
- Bayesian and Convolutional Networks for Hierarchical Morphological Classification of Galaxies [1.474723404975345]
This work is focused on the morphological classification of galaxies following the Hubble sequence in which the different classes are arranged in a hierarchy.
The proposed method, BCNN, is composed of two main modules.
BCNN performed better than several CNNs on multiple evaluation measures, achieving the following scores: 67% in exact match, 78% in accuracy, and 83% in hierarchical F-measure.
arXiv Detail & Related papers (2024-05-03T06:48:53Z) - Fine-grained Recognition with Learnable Semantic Data Augmentation [68.48892326854494]
Fine-grained image recognition is a longstanding computer vision challenge.
We propose diversifying the training data at the feature-level to alleviate the discriminative region loss problem.
Our method significantly improves the generalization performance on several popular classification networks.
arXiv Detail & Related papers (2023-09-01T11:15:50Z) - A Capsule Network for Hierarchical Multi-Label Image Classification [2.507647327384289]
Hierarchical multi-label classification applies when a multi-class image classification problem is arranged into smaller ones based upon a hierarchy or taxonomy.
We propose a multi-label capsule network (ML-CapsNet) for hierarchical classification.
arXiv Detail & Related papers (2022-09-13T04:17:08Z) - Do We Really Need a Learnable Classifier at the End of Deep Neural
Network? [118.18554882199676]
We study the potential of learning a neural network for classification with the classifier randomly initialized as an equiangular tight frame (ETF) and fixed during training.
Our experimental results show that our method is able to achieve similar performances on image classification for balanced datasets.
arXiv Detail & Related papers (2022-03-17T04:34:28Z) - Deep ensembles in bioimage segmentation [74.01883650587321]
In this work, we propose an ensemble of convolutional neural networks (CNNs).
In ensemble methods, many different models are trained and then used for classification; the ensemble aggregates the outputs of the individual classifiers.
The proposed ensemble is implemented by combining different backbone networks within the DeepLabV3+ and HarDNet frameworks.
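The aggregation step described above can be sketched as simple soft voting — averaging the per-model class probabilities — which is one common choice; the paper's exact aggregation rule may differ. Model outputs here are random stand-ins.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

rng = np.random.default_rng(1)
n_models, n_classes = 3, 5
logits = rng.normal(size=(n_models, n_classes))  # one logit vector per model

# Soft voting: average the individual classifiers' probability outputs.
probs = softmax(logits)
ensemble_prob = probs.mean(axis=0)
ensemble_pred = int(ensemble_prob.argmax())
```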
arXiv Detail & Related papers (2021-12-24T05:54:21Z) - An evidential classifier based on Dempster-Shafer theory and deep
learning [6.230751621285322]
We propose a new classification system based on Dempster-Shafer (DS) theory and a convolutional neural network (CNN) architecture for set-valued classification.
Experiments on image recognition, signal processing, and semantic-relationship classification tasks demonstrate that the proposed combination of deep CNN, DS layer, and expected utility layer makes it possible to improve classification accuracy.
arXiv Detail & Related papers (2021-03-25T01:29:05Z) - Forest R-CNN: Large-Vocabulary Long-Tailed Object Detection and Instance
Segmentation [75.93960390191262]
We exploit prior knowledge of the relations among object categories to cluster fine-grained classes into coarser parent classes.
We propose a simple yet effective resampling method, NMS Resampling, to re-balance the data distribution.
Our method, termed Forest R-CNN, can serve as a plug-and-play module applicable to most object recognition models.
arXiv Detail & Related papers (2020-08-13T03:52:37Z) - Conditional Classification: A Solution for Computational Energy
Reduction [2.182419181054266]
We propose a novel solution to reduce the computational complexity of convolutional neural network models.
Our proposed technique breaks the classification task into two steps: 1) coarse-grain classification, in which the input samples are classified among a set of hyper-classes, and 2) fine-grain classification, in which the final labels are predicted within the hyper-class detected in the first step.
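The two-step scheme above can be sketched as follows — a coarse head selects a hyper-class, and only that hyper-class's small fine-grained head is then evaluated, which is where the computational saving comes from. The grouping and weights here are hypothetical, not from the paper.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

rng = np.random.default_rng(2)
feat_dim = 8
# Hypothetical partition of fine labels into hyper-classes.
hyper_to_fine = {0: [0, 1, 2], 1: [3, 4]}

W_coarse = rng.normal(size=(feat_dim, len(hyper_to_fine)))
# One small fine-grained head per hyper-class; only the selected one runs.
W_fine = {h: rng.normal(size=(feat_dim, len(ids)))
          for h, ids in hyper_to_fine.items()}

x = rng.normal(size=(1, feat_dim))

# Step 1: coarse-grain classification over hyper-classes.
h = int(softmax(x @ W_coarse).argmax())
# Step 2: fine-grain classification only within the chosen hyper-class.
fine_local = int(softmax(x @ W_fine[h]).argmax())
fine_label = hyper_to_fine[h][fine_local]
```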
arXiv Detail & Related papers (2020-06-29T03:50:39Z) - Fine-Grained Visual Classification with Efficient End-to-end
Localization [49.9887676289364]
We present an efficient localization module that can be fused with a classification network in an end-to-end setup.
We evaluate the new model on the three benchmark datasets CUB200-2011, Stanford Cars and FGVC-Aircraft.
arXiv Detail & Related papers (2020-05-11T14:07:06Z) - A Systematic Evaluation: Fine-Grained CNN vs. Traditional CNN
Classifiers [54.996358399108566]
We investigate the performance of landmark general-purpose CNN classifiers, which achieved top-notch results on large-scale classification datasets.
We compare them against state-of-the-art fine-grained classifiers.
We present an extensive evaluation on six datasets to determine whether the fine-grained classifiers are able to outperform the general-purpose baselines in their experiments.
arXiv Detail & Related papers (2020-03-24T23:49:14Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.