Complementary to Multiple Labels: A Correlation-Aware Correction Approach
- URL: http://arxiv.org/abs/2302.12987v1
- Date: Sat, 25 Feb 2023 04:48:48 GMT
- Title: Complementary to Multiple Labels: A Correlation-Aware Correction Approach
- Authors: Yi Gao, Miao Xu, Min-Ling Zhang
- Abstract summary: We show theoretically how the estimated transition matrix in multi-class CLL could be distorted in multi-labeled cases.
We propose a two-step method to estimate the transition matrix from candidate labels.
- Score: 65.59584909436259
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: \textit{Complementary label learning} (CLL) requires annotators to give
\emph{irrelevant} labels instead of relevant labels for instances. Currently,
CLL has shown its promising performance on multi-class data by estimating a
transition matrix. However, current multi-class CLL techniques cannot work well
on multi-labeled data since they assume each instance is associated with one
label while each multi-labeled instance is relevant to multiple labels. Here,
we show theoretically how the estimated transition matrix in multi-class CLL
could be distorted in multi-labeled cases as they ignore co-existing relevant
labels. Moreover, our theoretical findings reveal that calculating a transition
matrix from label correlations in \textit{multi-labeled CLL} (ML-CLL) requires
multi-labeled data, which is unavailable in the ML-CLL setting. To solve this issue,
we propose a two-step method to estimate the transition matrix from candidate
labels. Specifically, we first estimate an initial transition matrix by
decomposing the multi-label problem into a series of binary classification
problems; the initial transition matrix is then corrected using label
correlations to incorporate relationships among labels. We further show that
the proposed method is classifier-consistent, and we introduce an MSE-based
regularizer to alleviate the tendency of the BCE loss to overfit to noisy labels.
Experimental results have demonstrated the effectiveness of the proposed
method.
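The MSE-based regularizer described in the abstract can be sketched as a simple additive term on top of binary cross-entropy. The mixing weight `lam` and the exact combination below are assumptions for illustration, not the paper's formulation:

```python
import numpy as np

def bce_mse_loss(probs, targets, lam=1.0, eps=1e-7):
    """Binary cross-entropy with an MSE regularizer (illustrative sketch).

    The MSE term grows more slowly than BCE on confidently wrong
    predictions, which is the intuition behind using it to dampen
    overfitting to noisy labels. `lam` is an assumed mixing weight.
    """
    probs = np.clip(probs, eps, 1 - eps)  # avoid log(0)
    bce = -np.mean(targets * np.log(probs) + (1 - targets) * np.log(1 - probs))
    mse = np.mean((probs - targets) ** 2)
    return bce + lam * mse
```

Accurate predictions (e.g. probabilities near the 0/1 targets) yield a small combined loss, while confident mistakes are penalized by both terms.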
Related papers
- Scalable Label Distribution Learning for Multi-Label Classification [43.52928088881866]
Multi-label classification (MLC) refers to the problem of tagging a given instance with a set of relevant labels.
Most existing MLC methods are based on the assumption that the correlation of two labels in each label pair is symmetric.
Most existing methods design learning processes associated with the number of labels, which makes their computational complexity a bottleneck when scaling up to large-scale output space.
arXiv Detail & Related papers (2023-11-28T06:52:53Z)
- Multi-Label Noise Transition Matrix Estimation with Label Correlations: Theory and Algorithm [73.94839250910977]
Noisy multi-label learning has garnered increasing attention due to the challenges posed by collecting large-scale accurate labels.
The introduction of transition matrices can help model multi-label noise and enable the development of statistically consistent algorithms.
We propose a novel estimator that leverages label correlations without the need for anchor points or precise fitting of noisy class posteriors.
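The forward noise model that such transition-matrix methods build on can be sketched in a few lines; the 3-class matrix `T` below is a toy assumption, not an estimate from either paper:

```python
import numpy as np

# T[i, j] is the probability that true class i is observed as class j,
# so each row of T sums to 1.
T = np.array([[0.8, 0.1, 0.1],
              [0.1, 0.8, 0.1],
              [0.1, 0.1, 0.8]])

# A clean-label posterior for one instance.
clean_posterior = np.array([0.7, 0.2, 0.1])

# Forward model: the noisy-label posterior is T transposed times the
# clean posterior, which is what consistent estimators invert.
noisy_posterior = T.T @ clean_posterior
```

Because the rows of `T` sum to 1, the resulting noisy posterior is still a valid probability distribution.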
arXiv Detail & Related papers (2023-09-22T08:35:38Z)
- Towards Imbalanced Large Scale Multi-label Classification with Partially Annotated Labels [8.977819892091]
Multi-label classification is a widely encountered problem in daily life, where an instance can be associated with multiple classes.
In this work, we address the issue of label imbalance and investigate how to train neural networks using partial labels.
arXiv Detail & Related papers (2023-07-31T21:50:48Z)
- Positive Label Is All You Need for Multi-Label Classification [3.354528906571718]
Multi-label classification (MLC) faces challenges from label noise in training data.
Our paper addresses label noise in MLC by introducing a positive and unlabeled multi-label classification (PU-MLC) method.
PU-MLC employs positive-unlabeled learning, training the model with only positive labels and unlabeled data.
arXiv Detail & Related papers (2023-06-28T08:44:00Z)
- Multi-Instance Partial-Label Learning: Towards Exploiting Dual Inexact Supervision [53.530957567507365]
In some real-world tasks, each training sample is associated with a candidate label set that contains one ground-truth label and some false positive labels.
In this paper, we formalize such problems as multi-instance partial-label learning (MIPL).
Existing multi-instance learning algorithms and partial-label learning algorithms are suboptimal for solving MIPL problems.
arXiv Detail & Related papers (2022-12-18T03:28:51Z)
- Multi-label Classification with High-rank and High-order Label Correlations [62.39748565407201]
Previous methods capture the high-order label correlations mainly by transforming the label matrix to a latent label space with low-rank matrix factorization.
We propose a simple yet effective method to depict the high-order label correlations explicitly, and at the same time maintain the high-rank of the label matrix.
Comparative studies over twelve benchmark data sets validate the effectiveness of the proposed algorithm in multi-label classification.
arXiv Detail & Related papers (2022-07-09T05:15:31Z)
- Acknowledging the Unknown for Multi-label Learning with Single Positive Labels [65.5889334964149]
Traditionally, all unannotated labels are assumed to be negative labels in single positive multi-label learning (SPML).
We propose entropy-maximization (EM) loss to maximize the entropy of predicted probabilities for all unannotated labels.
Considering the positive-negative label imbalance of unannotated labels, we propose asymmetric pseudo-labeling (APL) with asymmetric-tolerance strategies and a self-paced procedure to provide more precise supervision.
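A minimal sketch of an entropy-maximization loss of this kind, assuming per-label binary probabilities and a boolean annotation mask (both names are illustrative, not the paper's API):

```python
import numpy as np

def em_loss(probs, annotated_mask, eps=1e-7):
    """Entropy-maximization loss sketch for unannotated labels.

    For each unannotated label, the loss is the negative binary
    entropy of the predicted probability, so minimizing it pushes
    those predictions toward maximal uncertainty (p = 0.5) rather
    than treating them as confident negatives.
    """
    p = np.clip(probs, eps, 1 - eps)
    entropy = -(p * np.log(p) + (1 - p) * np.log(1 - p))
    # Only unannotated labels contribute to the loss.
    return -np.mean(entropy[~annotated_mask])
```

The loss reaches its minimum of -log(2) when every unannotated prediction sits at 0.5.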
arXiv Detail & Related papers (2022-03-30T11:43:59Z)
- Enhancing Label Correlation Feedback in Multi-Label Text Classification via Multi-Task Learning [6.1538971100140145]
We introduce a novel approach with multi-task learning to enhance label correlation feedback.
We propose two auxiliary label co-occurrence prediction tasks to enhance label correlation learning.
arXiv Detail & Related papers (2021-06-06T12:26:14Z)
- Extended T: Learning with Mixed Closed-set and Open-set Noisy Labels [86.5943044285146]
The label noise transition matrix $T$ reflects the probabilities that true labels flip into noisy ones.
In this paper, we focus on learning under the mixed closed-set and open-set label noise.
Our method can better model the mixed label noise, as evidenced by its more robust performance compared with prior state-of-the-art label-noise learning methods.
arXiv Detail & Related papers (2020-12-02T02:42:45Z)
- Identifying noisy labels with a transductive semi-supervised leave-one-out filter [2.4366811507669124]
We introduce the LGC_LVOF, a leave-one-out filtering approach based on the Local and Global Consistency (LGC) algorithm.
Our approach is best suited to datasets with a large amount of unlabeled data but not many labels.
arXiv Detail & Related papers (2020-09-24T16:50:06Z)
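The Local and Global Consistency (LGC) algorithm that this filter builds on propagates labels over a similarity graph via the closed form F* = (I - alpha*S)^(-1) Y, where S is the symmetrically normalized affinity matrix. A toy sketch with an assumed 4-node graph:

```python
import numpy as np

# Assumed 4-node affinity graph: a chain 0-1-2-3 with a weak middle edge.
W = np.array([[0.0, 1.0, 0.0, 0.0],
              [1.0, 0.0, 0.2, 0.0],
              [0.0, 0.2, 0.0, 1.0],
              [0.0, 0.0, 1.0, 0.0]])
d = W.sum(axis=1)
S = W / np.sqrt(np.outer(d, d))   # symmetric normalization D^-1/2 W D^-1/2

# Two labeled nodes (0 -> class 0, 3 -> class 1); the rest unlabeled.
Y = np.array([[1.0, 0.0],
              [0.0, 0.0],
              [0.0, 0.0],
              [0.0, 1.0]])
alpha = 0.9                        # propagation strength, an assumed value

# Closed-form LGC solution: F* = (I - alpha*S)^-1 Y
F = np.linalg.solve(np.eye(4) - alpha * S, Y)
pred = F.argmax(axis=1)            # predicted class per node
```

Labels spread along strong edges, so the two unlabeled middle nodes inherit the class of their nearest labeled neighbor.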
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the generated summaries (including all information) and is not responsible for any consequences of their use.