Imbalanced Medical Image Segmentation with Pixel-dependent Noisy Labels
- URL: http://arxiv.org/abs/2501.06678v1
- Date: Sun, 12 Jan 2025 00:59:57 GMT
- Title: Imbalanced Medical Image Segmentation with Pixel-dependent Noisy Labels
- Authors: Erjian Guo, Zicheng Wang, Zhen Zhao, Luping Zhou
- Abstract summary: We propose Collaborative Learning with Curriculum Selection (CLCS) to address pixel-dependent noisy labels with class imbalance.
CLCS consists of two modules: Curriculum Noisy Label Sample Selection (CNS) and Noise Balance Loss (NBL).
- Abstract: Accurate medical image segmentation is often hindered by noisy labels in training data, due to the challenges of annotating medical images. Prior work on noisy labels tends to make class-dependent assumptions, overlooking the pixel-dependent nature of most noisy labels. Furthermore, existing methods typically apply fixed thresholds to filter out noisy labels, risking the removal of minority classes and consequently degrading segmentation performance. To bridge these gaps, our proposed framework, Collaborative Learning with Curriculum Selection (CLCS), addresses pixel-dependent noisy labels under class imbalance. CLCS advances existing work by i) treating noisy labels as pixel-dependent and addressing them through a collaborative learning framework, ii) employing a curriculum dynamic thresholding approach that adapts to model learning progress to select clean data samples and mitigate the class imbalance issue, and iii) applying a noise balance loss to noisy data samples to improve data utilization instead of discarding them outright. Specifically, CLCS contains two modules: Curriculum Noisy Label Sample Selection (CNS) and Noise Balance Loss (NBL). In the CNS module, we design a two-branch network with a discrepancy loss for collaborative learning, so that different feature representations of the same instance can be extracted from distinct views and used to vote on the class probabilities of pixels. In addition, a curriculum dynamic threshold is adopted to select clean-label samples through probability voting. In the NBL module, instead of directly dropping suspected noisy labels, we adopt a robust loss that leverages such instances to boost performance.
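As a rough illustration of the abstract's two modules (not the authors' implementation), the sketch below combines a two-branch probability vote, a curriculum threshold that tightens over training, and a bounded MAE-style robust loss standing in for the paper's NBL. The threshold schedule and loss form are assumptions.

```python
import numpy as np

def curriculum_threshold(epoch, total_epochs, t_min=0.3, t_max=0.7):
    """Dynamic selection threshold that tightens as training progresses
    (the linear schedule and endpoints are illustrative assumptions)."""
    return t_min + (t_max - t_min) * min(epoch / total_epochs, 1.0)

def select_clean_pixels(probs_a, probs_b, labels, threshold):
    """Vote class probabilities from the two branches and keep pixels whose
    voted confidence in the annotated label clears the threshold."""
    voted = 0.5 * (probs_a + probs_b)                      # (N, C) probability vote
    conf_in_label = voted[np.arange(len(labels)), labels]  # confidence in given label
    return conf_in_label >= threshold                      # boolean clean mask

def noise_balance_loss(probs, labels, clean_mask):
    """Cross-entropy on clean pixels; a bounded MAE-style robust loss
    (an assumption standing in for the paper's NBL) on suspected-noisy pixels."""
    p = probs[np.arange(len(labels)), labels]
    ce = -np.log(np.clip(p, 1e-8, 1.0))  # standard CE for clean pixels
    mae = 1.0 - p                        # bounded robust loss for noisy pixels
    return np.where(clean_mask, ce, mae).mean()
```

The key design point the abstract emphasizes is that suspected-noisy pixels still contribute a (bounded) gradient rather than being dropped, which preserves minority-class pixels that a fixed hard filter would discard.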
Related papers
- Active Label Refinement for Robust Training of Imbalanced Medical Image Classification Tasks in the Presence of High Label Noise [10.232537737211098]
We propose a two-phase approach that combines Learning with Noisy Labels (LNL) and active learning.
We demonstrate that our proposed technique handles class imbalance better than its predecessors, avoiding the misidentification of clean minority-class samples as noisy.
arXiv Detail & Related papers (2024-07-08T14:16:05Z)
- Extracting Clean and Balanced Subset for Noisy Long-tailed Classification [66.47809135771698]
We develop a novel pseudo labeling method using class prototypes from the perspective of distribution matching.
By setting a manually specified probability measure, we can reduce the side effects of noisy and long-tailed data simultaneously.
Our method can extract this class-balanced subset with clean labels, which brings effective performance gains for long-tailed classification with label noise.
arXiv Detail & Related papers (2024-04-10T07:34:37Z)
- Combating Label Noise With A General Surrogate Model For Sample Selection [77.45468386115306]
We propose to leverage the vision-language surrogate model CLIP to filter noisy samples automatically.
We validate the effectiveness of our proposed method on both real-world and synthetic noisy datasets.
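The surrogate-model filtering idea above can be sketched as a zero-shot disagreement check: flag a sample as noisy when the surrogate's nearest-class prediction contradicts the given label. The embeddings below are placeholder arrays; in the paper they would come from CLIP's image and text encoders.

```python
import numpy as np

def flag_noisy_by_surrogate(image_emb, text_emb, labels):
    """Flag a sample as noisy when a surrogate model's zero-shot prediction
    (argmax cosine similarity to class text embeddings) disagrees with its
    assigned label. Embeddings are placeholders for CLIP features."""
    im = image_emb / np.linalg.norm(image_emb, axis=1, keepdims=True)
    tx = text_emb / np.linalg.norm(text_emb, axis=1, keepdims=True)
    sims = im @ tx.T                 # (N, C) cosine similarities
    zero_shot = sims.argmax(axis=1)  # surrogate's predicted class
    return zero_shot != labels       # True where the label looks noisy
```

In practice a soft criterion (similarity margin or ranking) is more robust than a hard argmax disagreement, but the principle is the same.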
arXiv Detail & Related papers (2023-10-16T14:43:27Z)
- Category-Adaptive Label Discovery and Noise Rejection for Multi-label Image Recognition with Partial Positive Labels [78.88007892742438]
Training multi-label models with partial positive labels (MLR-PPL) attracts increasing attention.
Previous works regard unknown labels as negative and adopt traditional MLR algorithms.
We propose to explore semantic correlation among different images to facilitate the MLR-PPL task.
arXiv Detail & Related papers (2022-11-15T02:11:20Z)
- Joint Class-Affinity Loss Correction for Robust Medical Image Segmentation with Noisy Labels [22.721870430220598]
Noisy labels prevent medical image segmentation algorithms from learning precise semantic correlations.
We present a novel perspective on noise mitigation that incorporates both pixel-wise and pair-wise correction.
We propose a robust Joint Class-Affinity (JCAS) framework to combat label noise issues in medical image segmentation.
arXiv Detail & Related papers (2022-06-16T08:19:33Z)
- Transductive CLIP with Class-Conditional Contrastive Learning [68.51078382124331]
We propose Transductive CLIP, a novel framework for learning a classification network with noisy labels from scratch.
A class-conditional contrastive learning mechanism is proposed to mitigate the reliance on pseudo labels.
An ensemble-label strategy is adopted to update pseudo labels and stabilize the training of deep neural networks with noisy labels.
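One common way to realize an ensemble-label update of the kind mentioned above is an exponential moving average over per-epoch predictions, so a single noisy prediction cannot flip a pseudo label outright; the momentum value here is an assumption, not taken from the paper.

```python
import numpy as np

def update_ensemble_labels(ensemble_probs, current_probs, momentum=0.9):
    """EMA-smooth pseudo-label probabilities across epochs
    (momentum value is illustrative). Returns the smoothed
    distribution and the resulting hard pseudo labels."""
    updated = momentum * ensemble_probs + (1.0 - momentum) * current_probs
    return updated, updated.argmax(axis=1)
```

With a high momentum, the accumulated distribution dominates, which is what stabilizes training when individual-epoch predictions fluctuate under label noise.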
arXiv Detail & Related papers (2022-06-13T14:04:57Z)
- Alleviating Noisy-label Effects in Image Classification via Probability Transition Matrix [30.532481130511137]
Deep-learning-based image classification frameworks often suffer from the noisy label problem caused by the inter-observer variation.
We propose a plugin module, namely noise ignoring block (NIB), to separate the hard samples from the mislabeled ones.
Our NIB module consistently improves the performances of the state-of-the-art robust training methods.
arXiv Detail & Related papers (2021-10-17T17:01:57Z)
- A Second-Order Approach to Learning with Instance-Dependent Label Noise [58.555527517928596]
The presence of label noise often misleads the training of deep neural networks.
We show that the errors in human-annotated labels are more likely to be dependent on the difficulty levels of tasks.
arXiv Detail & Related papers (2020-12-22T06:36:58Z)
- CCML: A Novel Collaborative Learning Model for Classification of Remote Sensing Images with Noisy Multi-Labels [0.9995347522610671]
We propose a novel Consensual Collaborative Multi-Label Learning (CCML) method to alleviate the adverse effects of multi-label noise during the training phase of the CNN model.
CCML identifies, ranks, and corrects noisy multi-labels in RS images based on four main modules.
arXiv Detail & Related papers (2020-12-19T15:42:24Z)
- Attention-Aware Noisy Label Learning for Image Classification [97.26664962498887]
Deep convolutional neural networks (CNNs) learned on large-scale labeled samples have achieved remarkable progress in computer vision.
The cheapest way to obtain a large body of labeled visual data is to crawl from websites with user-supplied labels, such as Flickr.
This paper proposes the attention-aware noisy label learning approach to improve the discriminative capability of the network trained on datasets with potential label noise.
arXiv Detail & Related papers (2020-09-30T15:45:36Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.