Ranking hierarchical multi-label classification results with mLPRs
- URL: http://arxiv.org/abs/2205.07833v2
- Date: Sun, 02 Nov 2025 08:51:23 GMT
- Title: Ranking hierarchical multi-label classification results with mLPRs
- Authors: Yuting Ye, Christine Ho, Ci-Ren Jiang, Wayne Tai Lee, Haiyan Huang
- Abstract summary: We focus on the less attended second-stage question while adhering to the given class hierarchy. We introduce a new objective function, called CATCH, to ensure reasonable classification performance. Our approach was evaluated on a synthetic dataset and two real datasets.
- Score: 4.869182515096001
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Hierarchical multi-label classification (HMC) has gained considerable attention in recent decades. A seminal line of HMC research addresses the problem in two stages: first, training individual classifiers for each class, then integrating these classifiers to provide a unified set of classification results across classes while respecting the given hierarchy. In this article, we focus on the less attended second-stage question while adhering to the given class hierarchy. This involves addressing a key challenge: how to manage the hierarchical constraint and account for statistical differences in the first-stage classifier scores across different classes to make classification decisions that are optimal under a justifiable criterion. To address this challenge, we introduce a new objective function, called CATCH, to ensure reasonable classification performance. To optimize this function, we propose a decision strategy built on a novel metric, the multidimensional Local Precision Rate (mLPR), which reflects the membership chance of an object in a class given all classifier scores and the class hierarchy. Particularly, we demonstrate that, under certain conditions, transforming the classifier scores into mLPRs and comparing mLPR values for all objects against all classes can, in theory, ensure the class hierarchy and maximize CATCH. In practice, we propose an algorithm HierRank to rank estimated mLPRs under the hierarchical constraint, leading to a ranking that maximizes an empirical version of CATCH. Our approach was evaluated on a synthetic dataset and two real datasets, exhibiting superior performance compared to several state-of-the-art methods in terms of improved decision accuracy.
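The abstract's core mechanism is ranking (object, class) scores while ensuring the class hierarchy, i.e. a child class is never selected before its parent. The paper's HierRank algorithm and mLPR estimation are not detailed in the abstract, so the sketch below is only a hypothetical greedy illustration of ranking under a hierarchy constraint, not the authors' method: nodes become eligible only once their parent has been ranked, and eligible nodes are popped in descending score order.

```python
import heapq

def hierarchical_rank(scores, parent):
    """Rank classes by score while ensuring every parent precedes
    its children (a greedy, heap-based sketch; NOT the paper's
    HierRank algorithm)."""
    # Build a child list from the parent map.
    children = {}
    for node, p in parent.items():
        children.setdefault(p, []).append(node)
    # Start with the roots (nodes whose parent is None); use negated
    # scores because heapq is a min-heap.
    heap = [(-scores[n], n) for n, p in parent.items() if p is None]
    heapq.heapify(heap)
    order = []
    while heap:
        _, node = heapq.heappop(heap)
        order.append(node)
        # A node's children become eligible only after the node itself.
        for c in children.get(node, []):
            heapq.heappush(heap, (-scores[c], c))
    return order

# Toy example: A1 has the highest score but cannot outrank its parent A.
scores = {"root": 0.9, "A": 0.8, "B": 0.3, "A1": 0.95}
parent = {"root": None, "A": "root", "B": "root", "A1": "A"}
print(hierarchical_rank(scores, parent))  # ['root', 'A', 'A1', 'B']
```

Note how the hierarchy constraint changes the order: sorted purely by score, A1 would come second, but the constraint forces it after A.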
Related papers
- Feature Identification for Hierarchical Contrastive Learning [7.655211354400059]
We propose two novel hierarchical contrastive learning (HMLC) methods. Our approach explicitly models inter-class relationships and imbalanced class distribution at higher hierarchy levels. Our method achieves state-of-the-art performance in linear evaluation, outperforming existing hierarchical contrastive learning methods by 2 percentage points in accuracy.
arXiv Detail & Related papers (2025-10-01T12:46:47Z) - Hierarchical Representation Matching for CLIP-based Class-Incremental Learning [80.2317078787969]
Class-Incremental Learning (CIL) aims to endow models with the ability to continuously adapt to evolving data streams. Recent advances in pre-trained vision-language models (e.g., CLIP) provide a powerful foundation for this task. We introduce HiErarchical Representation MAtchiNg (HERMAN) for CLIP-based CIL.
arXiv Detail & Related papers (2025-09-26T17:59:51Z) - AHDMIL: Asymmetric Hierarchical Distillation Multi-Instance Learning for Fast and Accurate Whole-Slide Image Classification [51.525891360380285]
AHDMIL is an Asymmetric Hierarchical Distillation Multi-Instance Learning framework. It eliminates irrelevant patches through a two-step training process. It consistently outperforms previous state-of-the-art methods in both classification performance and inference speed.
arXiv Detail & Related papers (2025-08-07T07:47:16Z) - Class-Independent Increment: An Efficient Approach for Multi-label Class-Incremental Learning [49.65841002338575]
This paper focuses on the challenging yet practical multi-label class-incremental learning (MLCIL) problem.
We propose a novel class-independent incremental network (CINet) to extract multiple class-level embeddings for multi-label samples.
It learns and preserves the knowledge of different classes by constructing class-specific tokens.
arXiv Detail & Related papers (2025-03-01T14:40:52Z) - Performance Improvement in Multi-class Classification via Automated Hierarchy Generation and Exploitation through Extended LCPN Schemes [0.0]
This study explores the performance of hierarchical classification (HC) through a comprehensive analysis.
Two novel hierarchy exploitation schemes, LCPN+ and LCPN+F, have been introduced and evaluated.
The findings reveal the consistent superiority of LCPN+F, which outperforms other schemes across various datasets and scenarios.
arXiv Detail & Related papers (2023-10-31T17:11:29Z) - Generating Hierarchical Structures for Improved Time Series Classification Using Stochastic Splitting Functions [0.0]
This study introduces a novel hierarchical divisive clustering approach with stochastic splitting functions (SSFs) to enhance classification performance in multi-class datasets through hierarchical classification (HC).
The method has the unique capability of generating hierarchy without requiring explicit information, making it suitable for datasets lacking prior knowledge of hierarchy.
arXiv Detail & Related papers (2023-09-21T10:34:50Z) - Bipartite Ranking Fairness through a Model Agnostic Ordering Adjustment [54.179859639868646]
We propose a model agnostic post-processing framework xOrder for achieving fairness in bipartite ranking.
xOrder is compatible with various classification models and ranking fairness metrics, including supervised and unsupervised fairness metrics.
We evaluate our proposed algorithm on four benchmark data sets and two real-world patient electronic health record repositories.
arXiv Detail & Related papers (2023-07-27T07:42:44Z) - Enhancing Classification with Hierarchical Scalable Query on Fusion Transformer [0.4129225533930965]
This paper proposes a method to boost fine-grained classification through a hierarchical approach via learnable independent query embeddings.
We exploit the idea of hierarchy to learn query embeddings that are scalable across all levels.
Our method outperforms existing methods with an 11% improvement in fine-grained classification.
arXiv Detail & Related papers (2023-02-28T11:00:55Z) - Task-Specific Embeddings for Ante-Hoc Explainable Text Classification [6.671252951387647]
We propose an alternative training objective in which we learn task-specific embeddings of text.
Our proposed objective learns embeddings such that all texts that share the same target class label should be close together.
We present extensive experiments which show that the benefits of ante-hoc explainability and incremental learning come at no cost in overall classification accuracy.
arXiv Detail & Related papers (2022-11-30T19:56:25Z) - Class-Incremental Lifelong Learning in Multi-Label Classification [3.711485819097916]
This paper studies Lifelong Multi-Label (LML) classification, which builds an online class-incremental classifier in a sequential multi-label classification data stream.
To solve the problem, the study proposes an Augmented Graph Convolutional Network (AGCN) with a built Augmented Correlation Matrix (ACM) across sequential partial-label tasks.
arXiv Detail & Related papers (2022-07-16T05:14:07Z) - The Overlooked Classifier in Human-Object Interaction Recognition [82.20671129356037]
We encode the semantic correlation among classes into the classification head by initializing the weights with language embeddings of HOIs.
We propose a new loss named LSE-Sign to enhance multi-label learning on a long-tailed dataset.
Our simple yet effective method enables detection-free HOI classification, outperforming state-of-the-art methods that require object detection and human pose by a clear margin.
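The entry above describes initializing a classification head with language embeddings of the class names so that semantically related classes start with correlated weights. The paper's exact embedding model and head design are not given in the summary, so the following is only a minimal numpy sketch with random stand-in vectors playing the role of the language embeddings:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 4 HOI classes, 8-dim features. In the paper the
# initial weights come from language embeddings of the HOI class names;
# here random vectors stand in for those embeddings.
num_classes, dim = 4, 8
language_embeddings = rng.normal(size=(num_classes, dim))

# Initialize the classification head with L2-normalized class embeddings,
# so classes whose names embed similarly begin with correlated weights.
W = language_embeddings / np.linalg.norm(
    language_embeddings, axis=1, keepdims=True
)

# One logit per class for a given image feature vector.
image_feature = rng.normal(size=dim)
logits = W @ image_feature
print(logits.shape)  # (4,)
```

The point of the initialization is that semantic correlation among classes is present in the head from the first training step, rather than having to be learned from scratch.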
arXiv Detail & Related papers (2022-03-10T23:35:00Z) - Rank4Class: A Ranking Formulation for Multiclass Classification [26.47229268790206]
Multiclass classification (MCC) is a fundamental machine learning problem.
We show that it is easy to boost MCC performance with a novel formulation through the lens of ranking.
arXiv Detail & Related papers (2021-12-17T19:22:37Z) - PLM: Partial Label Masking for Imbalanced Multi-label Classification [59.68444804243782]
Neural networks trained on real-world datasets with long-tailed label distributions are biased towards frequent classes and perform poorly on infrequent classes.
We propose a method, Partial Label Masking (PLM), which utilizes the ratio between positive and negative labels of each class during training.
Our method achieves strong performance when compared to existing methods on both multi-label (MultiMNIST and MSCOCO) and single-label (imbalanced CIFAR-10 and CIFAR-100) image classification datasets.
arXiv Detail & Related papers (2021-05-22T18:07:56Z) - Inducing a hierarchy for multi-class classification problems [11.58041597483471]
In applications where categorical labels follow a natural hierarchy, classification methods that exploit the label structure often outperform those that do not.
In this paper, we investigate a class of methods that induce a hierarchy that can similarly improve classification performance over flat classifiers.
We demonstrate the effectiveness of the class of methods both for discovering a latent hierarchy and for improving accuracy in principled simulation settings and three real data applications.
arXiv Detail & Related papers (2021-02-20T05:40:42Z) - Binary Classification from Multiple Unlabeled Datasets via Surrogate Set Classification [94.55805516167369]
We propose a new approach for binary classification from $m$ U-sets for $m \ge 2$.
Our key idea is to consider an auxiliary classification task called surrogate set classification (SSC).
arXiv Detail & Related papers (2021-02-01T07:36:38Z) - Pitfalls of Assessing Extracted Hierarchies for Multi-Class Classification [4.89253144446913]
We identify some common pitfalls that may lead practitioners to make misleading conclusions about their methods.
We show how the hierarchy's quality can become irrelevant depending on the experimental setup.
Our results confirm that datasets with a high number of classes generally present complex structures in how these classes relate to each other.
arXiv Detail & Related papers (2021-01-26T21:50:57Z) - Multitask Learning for Class-Imbalanced Discourse Classification [74.41900374452472]
We show that a multitask approach can improve Micro F1-score by 7% over current state-of-the-art benchmarks.
We also offer a comparative review of additional techniques proposed to address resource-poor problems in NLP.
arXiv Detail & Related papers (2021-01-02T07:13:41Z) - Theoretical Insights Into Multiclass Classification: A High-dimensional Asymptotic View [82.80085730891126]
We provide the first modern, precise analysis of linear multiclass classification.
Our analysis reveals that the classification accuracy is highly distribution-dependent.
The insights gained may pave the way for a precise understanding of other classification algorithms.
arXiv Detail & Related papers (2020-11-16T05:17:29Z) - Learning and Evaluating Representations for Deep One-class Classification [59.095144932794646]
We present a two-stage framework for deep one-class classification.
We first learn self-supervised representations from one-class data, and then build one-class classifiers on learned representations.
In experiments, we demonstrate state-of-the-art performance on visual domain one-class classification benchmarks.
arXiv Detail & Related papers (2020-11-04T23:33:41Z) - Self-Weighted Robust LDA for Multiclass Classification with Edge Classes [111.5515086563592]
A novel self-weighted robust LDA with an l2,1-norm-based between-class distance criterion, called SWRLDA, is proposed for multi-class classification.
The proposed SWRLDA is easy to implement, and converges fast in practice.
arXiv Detail & Related papers (2020-09-24T12:32:55Z) - Equivalent Classification Mapping for Weakly Supervised Temporal Action Localization [92.58946210982411]
Weakly supervised temporal action localization is a newly emerging yet widely studied topic in recent years.
The pre-classification pipeline first performs classification on each video snippet and then aggregates the snippet-level classification scores to obtain the video-level classification score.
The post-classification pipeline aggregates the snippet-level features first and then predicts the video-level classification score based on the aggregated feature.
arXiv Detail & Related papers (2020-08-18T03:54:56Z)
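The last entry's two pipelines differ only in the order of aggregation and classification. A minimal numpy sketch, with random stand-in snippet features and a stand-in linear classifier (none of which come from the paper), makes the distinction concrete; because the classifier is nonlinear after softmax, the two orders generally yield different video-level scores:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical inputs: 10 video snippets, 16-dim features, 5 action classes.
snippet_feats = rng.normal(size=(10, 16))
W = rng.normal(size=(16, 5))  # a stand-in linear classifier

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# Pre-classification pipeline: classify each snippet first, then average
# the snippet-level class scores into a video-level score.
pre_scores = softmax(snippet_feats @ W).mean(axis=0)

# Post-classification pipeline: average snippet features first, then
# classify the aggregated feature once.
post_scores = softmax(snippet_feats.mean(axis=0) @ W)

print(pre_scores.shape, post_scores.shape)  # (5,) (5,)
```

Both pipelines produce one probability vector per video; the choice is whether the nonlinearity is applied per snippet or once on the pooled feature.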
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.