Exploiting Class Similarity for Machine Learning with Confidence Labels
and Projective Loss Functions
- URL: http://arxiv.org/abs/2103.13607v1
- Date: Thu, 25 Mar 2021 04:49:44 GMT
- Title: Exploiting Class Similarity for Machine Learning with Confidence Labels
and Projective Loss Functions
- Authors: Gautam Rajendrakumar Gare and John Michael Galeotti
- Abstract summary: Class labels are related to each other, with certain class labels being more similar to each other than others.
Current labeling techniques fail to explicitly capture such similarity information.
We use our approach to train neural networks with noisy labels, as we believe noisy labels are partly a result of confusability arising from class similarity.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Class labels used for machine learning are related to each other,
with certain class labels being more similar to each other than others (e.g.
images of cats and dogs are more similar to each other than those of cats and
cars). Such similarity among classes is often a cause of poor model
performance, because models confuse similar classes with one another. Current
labeling techniques fail to explicitly capture such similarity information.
In this paper, we instead
exploit the similarity between classes by capturing the similarity information
with our novel confidence labels. Confidence labels are probabilistic labels
denoting the likelihood of similarity, or confusability, between the classes.
Often, even after models are trained to differentiate between classes in the
feature space, similar classes still remain clustered in the latent space. We
view this type of clustering as valuable information and exploit it with our
novel projective loss functions. Our projective loss functions are designed to
work with confidence labels, with the ability to relax the loss penalty for
errors that confuse similar classes. We use our approach to train neural
networks with noisy labels, as we believe noisy labels are partly a result of
confusability arising from class similarity. We show improved performance
compared to the use of standard loss functions. We conduct a detailed analysis
using the CIFAR-10 dataset and show our proposed methods' applicability to
larger datasets, such as ImageNet and Food-101N.
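The abstract does not spell out the construction, but the idea lends itself to a minimal sketch. The following PyTorch snippet shows one way confidence labels and a similarity-relaxed loss could look, assuming a hand-specified class-similarity matrix; `confidence_labels`, `soft_target_xent`, and the similarity values are illustrative names and numbers, not the authors' API, and the soft-target cross-entropy stands in for (it is not) the paper's projective loss functions.

```python
import torch
import torch.nn.functional as F

def confidence_labels(y, similarity):
    """Turn hard labels y (N,) into soft targets (N, C) by spreading
    probability mass onto classes similar to the true class."""
    soft = similarity[y]                      # (N, C) unnormalized confidences
    return soft / soft.sum(dim=1, keepdim=True)

def soft_target_xent(logits, soft_targets):
    """Cross-entropy against soft targets: errors landing on classes the
    target already assigns mass to are penalized less than unrelated errors."""
    return -(soft_targets * F.log_softmax(logits, dim=1)).sum(dim=1).mean()

C = 4
similarity = torch.eye(C)
similarity[0, 1] = similarity[1, 0] = 0.3     # e.g. cat <-> dog are confusable
logits, y = torch.randn(8, C), torch.randint(0, C, (8,))
loss = soft_target_xent(logits, confidence_labels(y, similarity))
```

Because the target already places some mass on confusable classes, a cat-vs-dog error incurs a smaller penalty than a cat-vs-car one, which is the relaxation the abstract describes.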
Related papers
- Multi-Label Contrastive Learning: A Comprehensive Study [48.81069245141415]
Multi-label classification has emerged as a key area in both research and industry.
Applying contrastive learning to multi-label classification presents unique challenges.
We conduct an in-depth study of contrastive learning loss for multi-label classification across diverse settings.
arXiv Detail & Related papers (2024-11-27T20:20:06Z)
- Not All Negatives are Equal: Label-Aware Contrastive Loss for Fine-grained Text Classification [0.0]
We analyse the contrastive fine-tuning of pre-trained language models on two fine-grained text classification tasks.
We adaptively embed class relationships into a contrastive objective function to weigh the positives and negatives differently.
We find that Label-aware Contrastive Loss outperforms previous contrastive methods.
arXiv Detail & Related papers (2021-09-12T04:19:17Z)
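As a rough illustration of weighting negatives by class relationship (not the authors' exact objective), here is a supervised InfoNCE variant in which each pair's contribution to the denominator is scaled by an assumed class-relation matrix; `relation` and `label_aware_contrastive` are hypothetical names.

```python
import torch
import torch.nn.functional as F

def label_aware_contrastive(z, y, relation, tau=0.1):
    """Supervised InfoNCE where each pair (i, j) in the denominator is
    scaled by relation[y_i, y_j] (hypothetical (C, C) matrix; e.g. >1 for
    confusable classes so they are pushed apart harder). Assumes every
    anchor has at least one positive in the batch."""
    z = F.normalize(z, dim=1)
    n = z.size(0)
    sim = torch.exp(z @ z.t() / tau)             # (N, N) similarity kernel
    eye = torch.eye(n, dtype=torch.bool)
    pos = (y.unsqueeze(0) == y.unsqueeze(1)) & ~eye
    w = relation[y][:, y]                        # (N, N) per-pair weights
    denom = (w * sim).masked_fill(eye, 0).sum(dim=1)
    return -torch.log(sim / denom.unsqueeze(1))[pos].mean()
```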
- MSE Loss with Outlying Label for Imbalanced Classification [10.305130700118399]
We propose mean squared error (MSE) loss with an outlying label for class-imbalanced classification.
MSE loss makes it possible to equalize the number of back-propagation updates across all classes and to learn a feature space that, like metric learning, reflects the relationships between classes.
It can also shape the feature space to separate high-difficulty classes from low-difficulty ones.
arXiv Detail & Related papers (2021-07-06T05:17:00Z)
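A minimal sketch of the MSE-for-classification idea under stated assumptions: the target is a one-hot vector scaled by an "outlying" value placed away from the usual 0/1 range. `outlying_value` and its setting are hypothetical stand-ins; the summary does not specify the paper's exact placement.

```python
import torch
import torch.nn.functional as F

def mse_outlying_loss(logits, y, num_classes, outlying_value=5.0):
    """MSE against scaled one-hot targets. Every output unit receives a
    gradient on every example (rather than the true-class-dominated gradient
    of cross-entropy), and squared distances in output space act like a
    metric over classes. outlying_value is illustrative only."""
    target = F.one_hot(y, num_classes).float() * outlying_value
    return F.mse_loss(logits, target)

loss = mse_outlying_loss(torch.randn(8, 10), torch.randint(0, 10, (8,)), 10)
```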
- A Theory-Driven Self-Labeling Refinement Method for Contrastive Representation Learning [111.05365744744437]
Unsupervised contrastive learning labels crops of the same image as positives, and other image crops as negatives.
In this work, we first prove that for contrastive learning, inaccurate label assignment heavily impairs its generalization for semantic instance discrimination.
Inspired by this theory, we propose a novel self-labeling refinement approach for contrastive learning.
arXiv Detail & Related papers (2021-06-28T14:24:52Z)
- All Labels Are Not Created Equal: Enhancing Semi-supervision via Label Grouping and Co-training [32.45488147013166]
Pseudo-labeling is a key component in semi-supervised learning (SSL).
We propose SemCo, a method which leverages label semantics and co-training to address the shortcomings of pseudo-labeling.
We show that our method achieves state-of-the-art performance across various SSL tasks including 5.6% accuracy improvement on Mini-ImageNet dataset with 1000 labeled examples.
arXiv Detail & Related papers (2021-04-12T07:33:16Z)
- Class-Similarity Based Label Smoothing for Confidence Calibration [2.055949720959582]
We propose a novel form of label smoothing to improve confidence calibration.
Since different classes have different intrinsic similarities to one another, more similar classes should receive closer probability values in the final output.
This motivates the development of a new smooth label where the label values are based on similarities with the reference class.
arXiv Detail & Related papers (2020-06-24T20:26:22Z)
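A sketch of what similarity-based smoothing could look like, assuming a (C, C) class-similarity matrix is available; the function name and the eps split are illustrative, and the paper defines its own construction.

```python
import torch

def similarity_smoothed_labels(y, similarity, eps=0.1):
    """Classic label smoothing spreads eps uniformly over the wrong classes;
    here it is spread in proportion to each class's similarity to the true
    (reference) class, so similar classes get closer probability values."""
    sim = similarity[y].clone()                  # (N, C) rows of the true classes
    sim.scatter_(1, y.unsqueeze(1), 0)           # drop the true class itself
    sim = sim / sim.sum(dim=1, keepdim=True)     # normalize the smoothed mass
    target = sim * eps
    target.scatter_(1, y.unsqueeze(1), 1 - eps)  # true class keeps 1 - eps
    return target

sim = torch.tensor([[1.0, 0.6, 0.1], [0.6, 1.0, 0.1], [0.1, 0.1, 1.0]])
print(similarity_smoothed_labels(torch.tensor([0]), sim))
# tensor([[0.9000, 0.0857, 0.0143]]) -- class 1 gets most of the smoothing
```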
- Class2Simi: A Noise Reduction Perspective on Learning with Noisy Labels [98.13491369929798]
We propose a framework called Class2Simi, which transforms data points with noisy class labels to data pairs with noisy similarity labels.
Class2Simi is computationally efficient because the transformation is performed on-the-fly within mini-batches, and it merely changes the loss on top of the model's predictions into a pairwise form.
arXiv Detail & Related papers (2020-06-14T07:55:32Z)
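The transformation itself is simple enough to sketch. This hypothetical `class_to_simi` turns a mini-batch of class labels into pairwise similarity labels; the paper's loss change and its noise-rate analysis are not reproduced here.

```python
import torch

def class_to_simi(y):
    """Turn (noisy) class labels into pairwise similarity labels on-the-fly:
    a pair is 'similar' (1) iff the two labels agree. Label noise becomes
    pairwise noise, which the paper's title frames as a noise reduction."""
    simi = (y.unsqueeze(0) == y.unsqueeze(1)).float()   # (N, N) pair labels
    idx = torch.triu_indices(len(y), len(y), offset=1)  # unique unordered pairs
    return simi[idx[0], idx[1]]

y_noisy = torch.tensor([0, 0, 2, 1])
print(class_to_simi(y_noisy))  # tensor([1., 0., 0., 0., 0., 0.])
```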
- Multi-Class Classification from Noisy-Similarity-Labeled Data [98.13491369929798]
We propose a method for learning from only noisy-similarity-labeled data.
We use a noise transition matrix to bridge the class-posterior probability between clean and noisy data.
We build a novel learning system which can assign noise-free class labels for instances.
arXiv Detail & Related papers (2020-02-16T05:10:21Z)
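The standard way a noise transition matrix bridges clean and noisy posteriors is forward loss correction; here is a sketch under the assumption of plain class labels (the paper itself works with noisy similarity labels, so this shows only the underlying idea).

```python
import torch
import torch.nn.functional as F

def forward_corrected_nll(logits, y_noisy, T):
    """Loss correction with a noise transition matrix T, where
    T[i, j] ~= P(noisy label = j | clean label = i). Mapping the model's
    clean posterior through T lets it fit the noisy labels while the raw
    softmax estimates the clean class posterior."""
    p_clean = F.softmax(logits, dim=1)           # model's clean posterior
    p_noisy = p_clean @ T                        # predicted noisy posterior
    return F.nll_loss(torch.log(p_noisy + 1e-8), y_noisy)
```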
- Automatically Discovering and Learning New Visual Categories with Ranking Statistics [145.89790963544314]
We tackle the problem of discovering novel classes in an image collection given labelled examples of other classes.
We learn a general-purpose clustering model and use it to identify the new classes in the unlabelled data.
We evaluate our approach on standard classification benchmarks and outperform current methods for novel category discovery by a significant margin.
arXiv Detail & Related papers (2020-02-13T18:53:32Z)
- Rethinking Class Relations: Absolute-relative Supervised and Unsupervised Few-shot Learning [157.62595449130973]
We study the fundamental problem of simplistic class modeling in current few-shot learning methods.
We propose a novel Absolute-relative Learning paradigm to fully take advantage of label information to refine the image representations.
arXiv Detail & Related papers (2020-01-12T12:25:46Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.