Label Relation Graphs Enhanced Hierarchical Residual Network for
Hierarchical Multi-Granularity Classification
- URL: http://arxiv.org/abs/2201.03194v2
- Date: Tue, 11 Jan 2022 06:57:52 GMT
- Title: Label Relation Graphs Enhanced Hierarchical Residual Network for
Hierarchical Multi-Granularity Classification
- Authors: Jingzhou Chen, Peng Wang, Jian Liu, Yuntao Qian
- Abstract summary: We study the HMC problem in which objects are labeled at any level of the hierarchy.
We propose a hierarchical residual network (HRN) in which residual connections are added to features of children levels.
- Score: 10.449261628173229
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Hierarchical multi-granularity classification (HMC) assigns hierarchical
multi-granularity labels to each object and focuses on encoding the label
hierarchy, e.g., ["Albatross", "Laysan Albatross"] from coarse-to-fine levels.
However, the definition of what is fine-grained is subjective, and the image
quality may affect the identification. Thus, samples could be observed at any
level of the hierarchy, e.g., ["Albatross"] or ["Albatross", "Laysan
Albatross"], and examples discerned at coarse categories are often neglected in
the conventional setting of HMC. In this paper, we study the HMC problem in
which objects are labeled at any level of the hierarchy. The essential designs
of the proposed method are derived from two motivations: (1) learning with
objects labeled at various levels should transfer hierarchical knowledge
between levels; (2) lower-level classes should inherit attributes related to
upper-level superclasses. The proposed combinatorial loss maximizes the
marginal probability of the observed ground truth label by aggregating
information from related labels defined in the tree hierarchy. If the observed
label is at the leaf level, the combinatorial loss further imposes the
multi-class cross-entropy loss to increase the weight of fine-grained
classification loss. Considering the hierarchical feature interaction, we
propose a hierarchical residual network (HRN), in which granularity-specific
features from parent levels acting as residual connections are added to
features of children levels. Experiments on three commonly used datasets
demonstrate the effectiveness of our approach compared to the state-of-the-art
HMC approaches and fine-grained visual classification (FGVC) methods exploiting
the label hierarchy.
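For a concrete picture of the training signal, the sketch below illustrates one plausible reading of the combinatorial loss described in the abstract: the marginal probability of an observed (possibly coarse) label is obtained by aggregating predicted leaf probabilities over its subtree in the label tree, and an extra multi-class cross-entropy term is added when the label is observed at the leaf level. The toy hierarchy, class names, and the exact aggregation rule here are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only (hypothetical toy hierarchy; not the paper's exact
# formulation): a marginal-probability loss over a label tree, where the score
# of a coarse label aggregates the predicted leaf probabilities in its subtree,
# plus an extra cross-entropy term when the label is observed at the leaf level.
import torch
import torch.nn.functional as F

SUBTREE_LEAVES = {
    "Albatross": [0, 1],            # internal node -> indices of its leaf descendants
    "Laysan Albatross": [0],        # a leaf is its own singleton subtree
    "Black-footed Albatross": [1],
    "Sooty Tern": [2],
}

def combinatorial_style_loss(leaf_logits: torch.Tensor, observed: str) -> torch.Tensor:
    """leaf_logits: raw scores over leaf classes for one sample, shape (num_leaves,)."""
    leaf_probs = F.softmax(leaf_logits, dim=-1)
    subtree = SUBTREE_LEAVES[observed]
    # Marginal probability of the observed label = sum over its leaf descendants.
    marginal = leaf_probs[subtree].sum()
    loss = -torch.log(marginal + 1e-12)
    if len(subtree) == 1:
        # Label observed at the finest level: add standard multi-class
        # cross-entropy to up-weight the fine-grained classification loss.
        target = torch.tensor([subtree[0]])
        loss = loss + F.cross_entropy(leaf_logits.unsqueeze(0), target)
    return loss

# A coarse-only sample (["Albatross"]) still yields a usable training signal,
# while a fully annotated sample (["Albatross", "Laysan Albatross"]) gets the
# extra fine-grained term.
logits = torch.tensor([2.0, 0.5, -1.0])
print(combinatorial_style_loss(logits, "Albatross"))
print(combinatorial_style_loss(logits, "Laysan Albatross"))
```

The hierarchical residual connections described above would, in a similarly minimal sketch, simply add the parent level's granularity-specific features to the child level's features before each level's classifier.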
Related papers
- LayerMatch: Do Pseudo-labels Benefit All Layers? [77.59625180366115]
Semi-supervised learning offers a promising solution to mitigate the dependency on labeled data.
We develop two layer-specific pseudo-label strategies, termed Grad-ReLU and Avg-Clustering.
Our approach consistently demonstrates exceptional performance on standard semi-supervised learning benchmarks.
arXiv Detail & Related papers (2024-06-20T11:25:50Z)
- Semantic Guided Level-Category Hybrid Prediction Network for Hierarchical Image Classification [8.456482280676884]
Hierarchical classification (HC) assigns each object multiple labels organized into a hierarchical structure.
We propose a novel semantic guided level-category hybrid prediction network (SGLCHPN) that can jointly perform the level and category prediction in an end-to-end manner.
arXiv Detail & Related papers (2022-11-22T13:49:10Z)
- A Capsule Network for Hierarchical Multi-Label Image Classification [2.507647327384289]
Hierarchical multi-label classification applies when a multi-class image classification problem is arranged into smaller ones based upon a hierarchy or taxonomy.
We propose a multi-label capsule network (ML-CapsNet) for hierarchical classification.
arXiv Detail & Related papers (2022-09-13T04:17:08Z)
- Use All The Labels: A Hierarchical Multi-Label Contrastive Learning Framework [75.79736930414715]
We present a hierarchical multi-label representation learning framework that can leverage all available labels and preserve the hierarchical relationship between classes.
We introduce novel hierarchy-preserving losses that jointly apply a hierarchical penalty to the contrastive loss and enforce the hierarchy constraint.
arXiv Detail & Related papers (2022-04-27T21:41:44Z)
- Deep Hierarchical Semantic Segmentation [76.40565872257709]
Hierarchical semantic segmentation (HSS) aims at structured, pixel-wise description of visual observation in terms of a class hierarchy.
HSSN casts HSS as a pixel-wise multi-label classification task, bringing only minimal architecture changes to current segmentation models.
With hierarchy-induced margin constraints, HSSN reshapes the pixel embedding space, so as to generate well-structured pixel representations.
arXiv Detail & Related papers (2022-03-27T15:47:44Z)
- The Overlooked Classifier in Human-Object Interaction Recognition [82.20671129356037]
We encode the semantic correlation among classes into the classification head by initializing the weights with language embeddings of HOIs.
We propose a new loss named LSE-Sign to enhance multi-label learning on a long-tailed dataset.
Our simple yet effective method enables detection-free HOI classification, outperforming state-of-the-art methods that require object detection and human pose by a clear margin.
arXiv Detail & Related papers (2022-03-10T23:35:00Z)
- Label Hierarchy Transition: Delving into Class Hierarchies to Enhance Deep Classifiers [40.993137740456014]
We propose a unified probabilistic framework based on deep learning to address the challenges of hierarchical classification.
The proposed framework can be readily adapted to any existing deep network with only minor modifications.
We extend our proposed LHT framework to the skin lesion diagnosis task and validate its great potential in computer-aided diagnosis.
arXiv Detail & Related papers (2021-12-04T14:58:36Z)
- Joint Learning of Hyperbolic Label Embeddings for Hierarchical Multi-label Classification [9.996804039553858]
We consider the problem of multi-label classification where the labels lie in a hierarchy.
We propose a novel formulation for the joint learning and empirically evaluate its efficacy.
arXiv Detail & Related papers (2021-01-13T10:58:54Z)
- Coherent Hierarchical Multi-Label Classification Networks [56.41950277906307]
C-HMCNN(h) is a novel approach for HMC problems, which exploits hierarchy information to produce predictions coherent with the hierarchy constraint and improve performance.
We conduct an extensive experimental analysis showing the superior performance of C-HMCNN(h) when compared to state-of-the-art models.
arXiv Detail & Related papers (2020-10-20T09:37:02Z)
- Exploring the Hierarchy in Relation Labels for Scene Graph Generation [75.88758055269948]
Experiments show that the proposed simple yet effective method can improve several state-of-the-art baselines by a large margin (up to 33% relative gain) in terms of Recall@50.
arXiv Detail & Related papers (2020-09-12T17:36:53Z)
- Joint Embedding of Words and Category Labels for Hierarchical Multi-label Text Classification [4.2750700546937335]
Hierarchical text classification (HTC) has received extensive attention and has broad application prospects.
We propose a joint embedding of text and parent category based on hierarchical fine-tuning ordered neurons LSTM (HFT-ONLSTM) for HTC.
arXiv Detail & Related papers (2020-04-06T11:06:08Z)