Learning from Multiple Annotators by Incorporating Instance Features
- URL: http://arxiv.org/abs/2106.15146v1
- Date: Tue, 29 Jun 2021 08:07:24 GMT
- Title: Learning from Multiple Annotators by Incorporating Instance Features
- Authors: Jingzheng Li and Hailong Sun and Jiyi Li and Zhijun Chen and Renshuai Tao and Yufei Ge
- Abstract summary: Learning from multiple annotators aims to induce a high-quality classifier from training instances.
Most existing methods adopt class-level confusion matrices of annotators, assuming that observed labels do not depend on the instance features.
We propose a noise transition matrix that incorporates the influence of instance features on annotators' performance, building on confusion matrices.
- Score: 15.643325526074804
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Learning from multiple annotators aims to induce a high-quality
classifier from training instances, each of which is associated with a set of
possibly noisy labels provided by multiple annotators under the influence of
their varying abilities and individual biases. In modeling the probability
transition process from latent true labels to observed labels, most existing
methods adopt class-level confusion matrices of annotators, under which
observed labels do not depend on the instance features and are determined
solely by the true labels. This assumption may limit the performance that the
classifier can achieve. In this work, we propose a noise transition matrix
that incorporates the influence of instance features on annotators'
performance, building on confusion matrices. Furthermore, we propose a simple
yet effective learning framework, which consists of a classifier module and a
noise transition matrix module in a unified neural network architecture.
Experimental results demonstrate the superiority of our method in comparison
with state-of-the-art methods.
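A minimal sketch of the architecture described above, assuming a per-annotator, instance-conditioned transition matrix: the classifier module predicts the latent true-label posterior, and the transition module maps instance features to a row-stochastic matrix per annotator. All module names, layer sizes, and the exact parameterization are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

FEAT_DIM, NUM_CLASSES, NUM_ANNOTATORS = 128, 10, 5  # illustrative sizes

class ClassifierWithNoiseTransition(nn.Module):
    def __init__(self):
        super().__init__()
        # Classifier module: predicts the latent true-label posterior p(y | x).
        self.classifier = nn.Sequential(
            nn.Linear(FEAT_DIM, 64), nn.ReLU(), nn.Linear(64, NUM_CLASSES)
        )
        # Transition module: maps instance features to one K x K matrix per
        # annotator, so label noise depends on the instance rather than only
        # on the true class, as with a fixed class-level confusion matrix.
        self.transition = nn.Linear(
            FEAT_DIM, NUM_ANNOTATORS * NUM_CLASSES * NUM_CLASSES
        )

    def forward(self, x):
        p_true = torch.softmax(self.classifier(x), dim=-1)  # (B, K)
        logits = self.transition(x).view(
            -1, NUM_ANNOTATORS, NUM_CLASSES, NUM_CLASSES
        )
        T = torch.softmax(logits, dim=-1)  # rows sum to 1: valid transitions
        # Observed-label posterior for annotator r:
        # p(z_r = j | x) = sum_i p(y = i | x) * T_r[i, j]
        p_obs = torch.einsum("bi,brij->brj", p_true, T)  # (B, R, K)
        return p_true, p_obs

# Training would minimize cross-entropy between p_obs[:, r] and annotator r's
# observed labels where they exist; only p_true is used at test time.
```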
Related papers
- Estimating Noisy Class Posterior with Part-level Labels for Noisy Label Learning [13.502549812291878]
Existing methods typically learn noisy class posteriors by training a classification model with noisy labels.
This paper proposes to augment the supervised information with part-level labels, encouraging the model to focus on and integrate richer information from various parts.
Our method is theoretically sound, and experiments show that it is empirically effective on synthetic and real-world noisy benchmarks.
arXiv Detail & Related papers (2024-05-08T12:13:40Z)
- Multi-Label Noise Transition Matrix Estimation with Label Correlations: Theory and Algorithm [73.94839250910977]
Noisy multi-label learning has garnered increasing attention due to the challenges posed by collecting large-scale accurate labels.
The introduction of transition matrices can help model multi-label noise and enable the development of statistically consistent algorithms.
We propose a novel estimator that leverages label correlations without the need for anchor points or precise fitting of noisy class posteriors.
arXiv Detail & Related papers (2023-09-22T08:35:38Z)
- Evolving Multi-Label Fuzzy Classifier [5.53329677986653]
Multi-label classification has attracted much attention in the machine learning community to address the problem of assigning single samples to more than one class at the same time.
We propose an evolving multi-label fuzzy classifier (EFC-ML) which is able to self-adapt and self-evolve its structure with new incoming multi-label samples in an incremental, single-pass manner.
arXiv Detail & Related papers (2022-03-29T08:01:03Z)
- A Similarity-based Framework for Classification Task [21.182406977328267]
Similarity-based methods give rise to a new class of approaches for multi-label learning and also achieve promising performance.
We unite similarity-based learning and generalized linear models to achieve the best of both worlds.
arXiv Detail & Related papers (2022-03-05T06:39:50Z)
- Approximating Instance-Dependent Noise via Instance-Confidence Embedding [87.65718705642819]
Label noise in multiclass classification is a major obstacle to the deployment of learning systems.
We investigate the instance-dependent noise (IDN) model and propose an efficient approximation of IDN to capture the instance-specific label corruption.
arXiv Detail & Related papers (2021-03-25T02:33:30Z)
- Learning Noise Transition Matrix from Only Noisy Labels via Total Variation Regularization [88.91872713134342]
We propose a theoretically grounded method that can estimate the noise transition matrix and learn a classifier simultaneously.
We show the effectiveness of the proposed method through experiments on benchmark and real-world datasets.
arXiv Detail & Related papers (2021-02-04T05:09:18Z)
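The joint estimation in the entry above can be sketched as follows. The pairwise total-variation penalty shown here, which rewards scattered clean posteriors so the transition matrix cannot absorb all class information, is an illustrative reading of the title; the paper's exact regularizer and theory may differ.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

NUM_CLASSES = 10  # illustrative

class JointModel(nn.Module):
    def __init__(self, feat_dim=128):
        super().__init__()
        self.classifier = nn.Linear(feat_dim, NUM_CLASSES)
        # Unconstrained parameters; a row-wise softmax keeps T row-stochastic.
        self.T_logits = nn.Parameter(torch.eye(NUM_CLASSES) * 4.0)

    def forward(self, x):
        p_clean = torch.softmax(self.classifier(x), dim=-1)  # (B, K)
        T = torch.softmax(self.T_logits, dim=-1)             # (K, K)
        return p_clean, p_clean @ T                          # clean, noisy

def loss_fn(p_clean, p_noisy, noisy_labels, lam=0.1):
    ce = F.nll_loss(torch.log(p_noisy + 1e-8), noisy_labels)
    # Total variation between clean posteriors of randomly paired instances;
    # subtracting it encourages scattered (distinguishable) predictions.
    perm = torch.randperm(p_clean.size(0))
    tv = 0.5 * (p_clean - p_clean[perm]).abs().sum(dim=-1).mean()
    return ce - lam * tv
```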
- Provably End-to-end Label-Noise Learning without Anchor Points [118.97592870124937]
We propose an end-to-end framework for solving label-noise learning without anchor points.
Our proposed framework can identify the transition matrix if the clean class-posterior probabilities are sufficiently scattered.
arXiv Detail & Related papers (2021-02-04T03:59:37Z)
- Extended T: Learning with Mixed Closed-set and Open-set Noisy Labels [86.5943044285146]
The label noise transition matrix $T$ reflects the probabilities that true labels flip into noisy ones.
In this paper, we focus on learning under the mixed closed-set and open-set label noise.
Our method better models the mixed label noise, as evidenced by more robust performance than prior state-of-the-art label-noise learning methods.
arXiv Detail & Related papers (2020-12-02T02:42:45Z)
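To make the definition of $T$ in the entry above concrete, here is a toy closed-set transition matrix with made-up numbers: row i holds the probabilities that true class i is observed as each noisy class, so the implied noisy-label distribution is a matrix product with the clean posterior.

```python
import numpy as np

# T[i, j] = P(noisy label = j | true label = i); each row sums to 1.
T = np.array([
    [0.8, 0.1, 0.1],  # true class 0 is kept 80% of the time
    [0.2, 0.7, 0.1],  # true class 1 flips to class 0 with probability 0.2
    [0.0, 0.3, 0.7],  # true class 2 flips to class 1 with probability 0.3
])
assert np.allclose(T.sum(axis=1), 1.0)

p_clean = np.array([0.6, 0.3, 0.1])  # clean posterior for one instance
p_noisy = p_clean @ T                # implied noisy-label distribution
print(p_noisy)                       # [0.54 0.3  0.16]
```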
- Multi-Class Classification from Noisy-Similarity-Labeled Data [98.13491369929798]
We propose a method for learning from only noisy-similarity-labeled data.
We use a noise transition matrix to bridge the class-posterior probability between clean and noisy data.
We build a novel learning system which can assign noise-free class labels for instances.
arXiv Detail & Related papers (2020-02-16T05:10:21Z)