TaxoKnow: Taxonomy as Prior Knowledge in the Loss Function of
Multi-class Classification
- URL: http://arxiv.org/abs/2305.16341v1
- Date: Wed, 24 May 2023 08:08:56 GMT
- Title: TaxoKnow: Taxonomy as Prior Knowledge in the Loss Function of
Multi-class Classification
- Authors: Mohsen Pourvali, Yao Meng, Chen Sheng, Yangzhou Du
- Abstract summary: We introduce two methods to integrate the hierarchical taxonomy as an explicit regularizer into the loss function of learning algorithms.
By reasoning over the hierarchical taxonomy, a neural network regularizes its output distribution over the classes, allowing a minority class to be conditioned on its upper-level concepts.
- Score: 1.130757825611188
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In this paper, we investigate the effectiveness of integrating a hierarchical
taxonomy of labels as prior knowledge into the learning algorithm of a flat
classifier. We introduce two methods to integrate the hierarchical taxonomy as
an explicit regularizer into the loss function of learning algorithms. By
reasoning over the hierarchical taxonomy, a neural network regularizes its output
distribution over the classes, allowing a minority class to be conditioned on its
upper-level concepts. We limit ourselves to the flat classification task and provide
our experimental results on two industrial in-house datasets and two public
benchmarks, RCV1 and Amazon product reviews. Our obtained results show the
significant effect of a taxonomy in increasing the performance of a learner in
semi-supervised multi-class classification, as well as the considerable results
obtained in a fully supervised setting.
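The paper's core idea, a taxonomy-aware term added to the flat cross-entropy loss, can be sketched as follows. This is a minimal illustration, not the authors' exact formulation: summing child probabilities into parent probabilities and the weighting factor `alpha` are assumptions made here for clarity.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def taxonomy_loss(logits, leaf_label, parent_of, n_parents, alpha=0.5):
    """Flat cross-entropy plus a parent-level regularizer.

    A parent's probability is the sum of its children's probabilities,
    so the model is also penalized at the upper level of the taxonomy.
    """
    p = softmax(logits)                      # leaf-class distribution
    leaf_ce = -np.log(p[leaf_label])         # standard flat loss
    # Aggregate leaf probabilities into parent probabilities.
    parent_p = np.zeros(n_parents)
    for leaf, parent in enumerate(parent_of):
        parent_p[parent] += p[leaf]
    parent_ce = -np.log(parent_p[parent_of[leaf_label]])
    return leaf_ce + alpha * parent_ce

# Toy taxonomy: leaves 0 and 1 share parent 0; leaves 2 and 3 share parent 1.
parent_of = [0, 0, 1, 1]
logits = np.array([2.0, 0.1, -1.0, -1.0])
loss = taxonomy_loss(logits, leaf_label=1, parent_of=parent_of, n_parents=2)
```

Under this sketch, confusing the correct leaf with a sibling under the same parent is penalized less than predicting under the wrong parent, which is one way to read the abstract's "conditioning on upper concepts for a minority class".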
Related papers
- Multi-Label Requirements Classification with Large Taxonomies [40.588683959176116]
Multi-label requirements classification with large taxonomies could aid requirements traceability but is prohibitively costly with supervised training.
We associated 129 requirements with 769 labels from taxonomies ranging between 250 and 1183 classes.
The sentence-based classification had a significantly higher recall compared to the word-based classification.
The hierarchical classification strategy did not always improve the performance of requirements classification.
arXiv Detail & Related papers (2024-06-07T09:53:55Z) - Weakly-supervised Action Localization via Hierarchical Mining [76.00021423700497]
Weakly-supervised action localization aims to localize and classify action instances in the given videos temporally with only video-level categorical labels.
We propose a hierarchical mining strategy under video-level and snippet-level manners, i.e., hierarchical supervision and hierarchical consistency mining.
We show that HiM-Net outperforms existing methods on THUMOS14 and ActivityNet1.3 datasets with large margins by hierarchically mining the supervision and consistency.
arXiv Detail & Related papers (2022-06-22T12:19:09Z) - Fine-Grained Visual Classification using Self Assessment Classifier [12.596520707449027]
Extracting discriminative features plays a crucial role in the fine-grained visual classification task.
In this paper, we introduce a Self Assessment, which simultaneously leverages the representation of the image and top-k prediction classes.
We show that our method achieves new state-of-the-art results on CUB200-2011, Stanford Dog, and FGVC Aircraft datasets.
arXiv Detail & Related papers (2022-05-21T07:41:27Z) - Self-Supervised Class Incremental Learning [51.62542103481908]
Existing Class Incremental Learning (CIL) methods are based on a supervised classification framework sensitive to data labels.
When updating them based on the new class data, they suffer from catastrophic forgetting: the model cannot discern old class data clearly from the new.
In this paper, we explore the performance of Self-Supervised representation learning in Class Incremental Learning (SSCIL) for the first time.
arXiv Detail & Related papers (2021-11-18T06:58:19Z) - ECKPN: Explicit Class Knowledge Propagation Network for Transductive
Few-shot Learning [53.09923823663554]
Class-level knowledge can be easily learned by humans from just a handful of samples.
We propose an Explicit Class Knowledge Propagation Network (ECKPN) to address this problem.
We conduct extensive experiments on four few-shot classification benchmarks, and the experimental results show that the proposed ECKPN significantly outperforms the state-of-the-art methods.
arXiv Detail & Related papers (2021-06-16T02:29:43Z) - No Fear of Heterogeneity: Classifier Calibration for Federated Learning
with Non-IID Data [78.69828864672978]
A central challenge in training classification models in the real-world federated system is learning with non-IID data.
We propose a novel and simple algorithm called Classifier Calibration with Virtual Representations (CCVR), which adjusts the classifier using virtual representations sampled from an approximated Gaussian mixture model.
Experimental results demonstrate that CCVR achieves state-of-the-art performance on popular federated learning benchmarks including CIFAR-10, CIFAR-100, and CINIC-10.
arXiv Detail & Related papers (2021-06-09T12:02:29Z) - Inducing a hierarchy for multi-class classification problems [11.58041597483471]
In applications where categorical labels follow a natural hierarchy, classification methods that exploit the label structure often outperform those that do not.
In this paper, we investigate a class of methods that induce a hierarchy that can similarly improve classification performance over flat classifiers.
We demonstrate the effectiveness of the class of methods both for discovering a latent hierarchy and for improving accuracy in principled simulation settings and three real data applications.
arXiv Detail & Related papers (2021-02-20T05:40:42Z) - Binary Classification from Multiple Unlabeled Datasets via Surrogate Set
Classification [94.55805516167369]
We propose a new approach for binary classification from $m$ unlabeled sets (U-sets) for $m \ge 2$.
Our key idea is to consider an auxiliary classification task called surrogate set classification (SSC).
arXiv Detail & Related papers (2021-02-01T07:36:38Z) - Learning and Evaluating Representations for Deep One-class
Classification [59.095144932794646]
We present a two-stage framework for deep one-class classification.
We first learn self-supervised representations from one-class data, and then build one-class classifiers on learned representations.
In experiments, we demonstrate state-of-the-art performance on visual domain one-class classification benchmarks.
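The two-stage framework above (learn representations from one-class data, then fit a one-class classifier on them) can be sketched in miniature. The representation stage is replaced here by the raw features, and the one-class classifier by a centroid-distance rule; both are illustrative stand-ins, not the paper's method.

```python
import numpy as np

def fit_one_class(features):
    """Stage two: build a simple one-class scorer on learned features.

    In the paper, `features` would come from self-supervised pre-training;
    scoring by distance to the class centroid stands in for a one-class
    classifier such as OC-SVM or KDE.
    """
    mu = features.mean(axis=0)
    # Threshold: largest training distance (all training data are inliers).
    radius = np.max(np.linalg.norm(features - mu, axis=1))
    return mu, radius

def is_inlier(x, mu, radius):
    """Accept a sample if it falls within the training radius."""
    return np.linalg.norm(x - mu) <= radius

rng = np.random.default_rng(0)
train = rng.normal(0.0, 1.0, size=(200, 8))   # one-class training data
mu, radius = fit_one_class(train)
```

The point of the two-stage split is that the scorer stays simple and swappable, while all the heavy lifting happens in the representation stage.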
arXiv Detail & Related papers (2020-11-04T23:33:41Z) - Learn Class Hierarchy using Convolutional Neural Networks [0.9569316316728905]
We propose a new architecture for hierarchical classification of images, introducing a stack of deep linear layers with cross-entropy loss functions and center loss combined.
We experimentally show that our hierarchical classifier presents advantages to the traditional classification approaches finding application in computer vision tasks.
arXiv Detail & Related papers (2020-05-18T12:06:43Z) - Efficient strategies for hierarchical text classification: External
knowledge and auxiliary tasks [3.5557219875516655]
We perform a sequence of inference steps to predict the category of a document from top to bottom of a given class taxonomy.
With our efficient approaches, we outperform previous studies, using a drastically reduced number of parameters, in two well-known English datasets.
arXiv Detail & Related papers (2020-05-05T20:22:18Z)
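The top-down inference described in the last entry, predicting a document's category by descending a class taxonomy node by node, can be sketched as a chain of per-node decisions. The taxonomy and the keyword-overlap scorer below are toy assumptions; in the paper each step would be a trained classifier.

```python
# Top-down inference over a class taxonomy: at each internal node, pick
# the best-scoring child and descend until a leaf is reached.
taxonomy = {
    "root": ["news", "reviews"],
    "news": ["sports", "politics"],
    "reviews": ["books", "movies"],
}

def classify_top_down(doc, taxonomy, score):
    """`score(doc, label)` is assumed to wrap a per-node classifier."""
    node = "root"
    while node in taxonomy:                       # descend until a leaf
        node = max(taxonomy[node], key=lambda c: score(doc, c))
    return node

# Toy scorer: keyword overlap between the document and the label.
def score(doc, label):
    keywords = {"news": {"election"}, "reviews": {"plot"},
                "sports": set(), "politics": {"election"},
                "books": set(), "movies": {"plot"}}
    return len(keywords[label] & set(doc.split()))

print(classify_top_down("the election results", taxonomy, score))  # politics
```

Each inference step only discriminates among a node's children, which is what lets such approaches use drastically fewer parameters than one flat classifier over all leaves.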
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.