Fidelity Estimation Improves Noisy-Image Classification with Pretrained
Networks
- URL: http://arxiv.org/abs/2106.00673v1
- Date: Tue, 1 Jun 2021 17:58:32 GMT
- Title: Fidelity Estimation Improves Noisy-Image Classification with Pretrained
Networks
- Authors: Xiaoyu Lin, Deblina Bhattacharjee, Majed El Helou and Sabine Süsstrunk
- Abstract summary: We propose a method that can be applied to a pretrained classifier.
Our method exploits a fidelity map estimate that is fused into the internal representations of the feature extractor.
We show that when using our oracle fidelity map we even outperform the fully retrained methods, whether trained on noisy or restored images.
- Score: 12.814135905559992
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Image classification has significantly improved using deep learning. This is
mainly due to convolutional neural networks (CNNs) that are capable of learning
rich feature extractors from large datasets. However, most deep learning
classification methods are trained on clean images and are not robust when
handling noisy ones, even if a restoration preprocessing step is applied. While
novel methods address this problem, they rely on modified feature extractors
and thus necessitate retraining. We instead propose a method that can be
applied to a pretrained classifier. Our method exploits a fidelity map estimate
that is fused into the internal representations of the feature extractor,
thereby guiding the attention of the network and making it more robust to noisy
data. We improve the noisy-image classification (NIC) results by significant
margins, especially at high noise levels, and come close to the fully
retrained approaches. Furthermore, as proof of concept, we show that when using
our oracle fidelity map we even outperform the fully retrained methods, whether
trained on noisy or restored images.
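As a rough illustration of the fusion mechanism described above, the sketch below gates the feature maps of a frozen pretrained CNN with a downsampled fidelity map. The fusion point, the 1x1 projection, and the multiplicative sigmoid gate are illustrative assumptions, not the authors' exact architecture; in the paper the fidelity map is either estimated or, for the oracle experiments, taken as given.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
import torchvision.models as models

class FidelityFusion(nn.Module):
    """Gate pretrained CNN features with a per-pixel fidelity map.

    Sketch only: a 1-channel fidelity map (1 = reliable pixel, 0 = heavily
    corrupted) is projected to the feature channel count and applied as a
    multiplicative attention mask.
    """
    def __init__(self, channels: int):
        super().__init__()
        self.proj = nn.Conv2d(1, channels, kernel_size=1)

    def forward(self, feats, fidelity):
        # Match the fidelity map to the spatial size of the feature maps.
        fid = F.interpolate(fidelity, size=feats.shape[-2:],
                            mode="bilinear", align_corners=False)
        # Learned per-channel gate in [0, 1], fused multiplicatively.
        return feats * torch.sigmoid(self.proj(fid))

# Frozen pretrained classifier; only the fusion module would be trained.
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for p in backbone.parameters():
    p.requires_grad = False

fusion = FidelityFusion(channels=64)
x = torch.randn(2, 3, 224, 224)            # noisy input batch
fidelity_map = torch.rand(2, 1, 224, 224)  # estimated fidelity (placeholder)

# Fuse after the first conv stage, then continue through the frozen network.
feats = backbone.relu(backbone.bn1(backbone.conv1(x)))
feats = fusion(feats, fidelity_map)
feats = backbone.maxpool(feats)
```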
Related papers
- Masked Image Training for Generalizable Deep Image Denoising [53.03126421917465]
We present a novel approach to enhance the generalization performance of denoising networks.
Our method involves masking random pixels of the input image and reconstructing the missing information during training.
Our approach exhibits better generalization ability than other deep learning models and is directly applicable to real-world scenarios.
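A minimal sketch of the masking idea above; the masking ratio, zero-filling, and loss placement are assumptions for illustration, not the paper's exact recipe.

```python
import torch

def mask_random_pixels(images, ratio=0.75):
    """Zero a random subset of pixels; returns masked images and keep-mask.

    images: (N, C, H, W); one spatial mask is shared across channels.
    """
    n, _, h, w = images.shape
    keep = (torch.rand(n, 1, h, w, device=images.device) > ratio).float()
    return images * keep, keep

# One training step, schematically (denoiser and targets are placeholders):
# masked, keep = mask_random_pixels(noisy_batch)
# pred = denoiser(masked)
# loss = (((pred - target_batch) ** 2) * (1 - keep)).mean()  # reconstruct masked pixels
```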
arXiv Detail & Related papers (2023-03-23T09:33:44Z)
- Learning advisor networks for noisy image classification [22.77447144331876]
We introduce the novel concept of advisor network to address the problem of noisy labels in image classification.
We train it with a meta-learning strategy so that it adapts throughout the training of the main model.
We tested our method on CIFAR10 and CIFAR100 with synthetic noise, and on Clothing1M which contains real-world noise, reporting state-of-the-art results.
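A highly simplified sketch of the advisor idea, reduced here to per-sample loss reweighting; the paper's actual advisor and its meta-learning update on a clean set are more involved, so the names and structure below are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Advisor(nn.Module):
    """Hypothetical advisor: maps each sample's loss to a weight in [0, 1],
    downweighting samples that are likely mislabeled."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1))

    def forward(self, per_sample_loss):
        return torch.sigmoid(self.net(per_sample_loss.unsqueeze(1))).squeeze(1)

advisor = Advisor()
logits = torch.randn(8, 10, requires_grad=True)  # main-model outputs (placeholder)
labels = torch.randint(0, 10, (8,))              # possibly noisy labels

losses = F.cross_entropy(logits, labels, reduction="none")
weights = advisor(losses.detach())    # advisor sees losses, not gradients
(weights * losses).mean().backward()  # meta-update of the advisor omitted
```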
arXiv Detail & Related papers (2022-11-08T11:44:08Z)
- Pure Noise to the Rescue of Insufficient Data: Improving Imbalanced Classification by Training on Random Noise Images [12.91269560135337]
We present a surprisingly simple yet highly effective method to mitigate the limitation of insufficient training data in imbalanced classification.
Unlike the common use of additive noise or adversarial noise for data augmentation, we propose directly training on pure random noise images.
We present a new Distribution-Aware Routing Batch Normalization layer (DAR-BN), which enables training on pure noise images in addition to natural images within the same network.
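The routing idea can be sketched as a layer holding two sets of BatchNorm statistics, one for natural images and one for pure noise; the flag-based routing below is an assumption based only on the abstract.

```python
import torch
import torch.nn as nn

class DARBatchNorm2d(nn.Module):
    """Sketch of distribution-aware routing: pure-noise batches are
    normalized with their own statistics so they do not corrupt the
    running statistics used for natural images."""
    def __init__(self, num_features):
        super().__init__()
        self.bn_natural = nn.BatchNorm2d(num_features)
        self.bn_noise = nn.BatchNorm2d(num_features)

    def forward(self, x, is_noise=False):
        return self.bn_noise(x) if is_noise else self.bn_natural(x)

layer = DARBatchNorm2d(64)
natural_out = layer(torch.randn(8, 64, 32, 32), is_noise=False)
noise_out = layer(torch.randn(8, 64, 32, 32), is_noise=True)
```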
arXiv Detail & Related papers (2021-12-16T11:51:35Z)
- Learning degraded image classification with restoration data fidelity [0.0]
We investigate the influence of degradation types and levels on four widely used classification networks.
We propose a novel method leveraging a fidelity map to calibrate the image features obtained by pre-trained networks.
Our results reveal that the proposed method is a promising solution to mitigate the effect caused by image degradation.
arXiv Detail & Related papers (2021-01-23T23:47:03Z)
- Mixed-Privacy Forgetting in Deep Networks [114.3840147070712]
We show that the influence of a subset of the training samples can be removed from the weights of a network trained on large-scale image classification tasks.
Inspired by real-world applications of forgetting techniques, we introduce a novel notion of forgetting in mixed-privacy setting.
We show that our method allows forgetting without having to trade off the model accuracy.
arXiv Detail & Related papers (2020-12-24T19:34:56Z)
- Noisy Labels Can Induce Good Representations [53.47668632785373]
We study how architecture affects learning with noisy labels.
We show that training with noisy labels can induce useful hidden representations, even when the model generalizes poorly.
This finding leads to a simple method to improve models trained on noisy labels.
arXiv Detail & Related papers (2020-12-23T18:58:05Z)
- Attention-Aware Noisy Label Learning for Image Classification [97.26664962498887]
Deep convolutional neural networks (CNNs) learned on large-scale labeled samples have achieved remarkable progress in computer vision.
The cheapest way to obtain a large body of labeled visual data is to crawl from websites with user-supplied labels, such as Flickr.
This paper proposes the attention-aware noisy label learning approach to improve the discriminative capability of the network trained on datasets with potential label noise.
arXiv Detail & Related papers (2020-09-30T15:45:36Z)
- Data-driven Meta-set Based Fine-Grained Visual Classification [61.083706396575295]
We propose a data-driven meta-set based approach to deal with noisy web images for fine-grained recognition.
Specifically, guided by a small amount of clean meta-set, we train a selection net in a meta-learning manner to distinguish in- and out-of-distribution noisy images.
arXiv Detail & Related papers (2020-08-06T03:04:16Z)
- SCAN: Learning to Classify Images without Labels [73.69513783788622]
We advocate a two-step approach where feature learning and clustering are decoupled.
A self-supervised task from representation learning is employed to obtain semantically meaningful features.
We obtain promising results on ImageNet, and outperform several semi-supervised learning methods in the low-data regime.
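The decoupling can be illustrated with a two-step sketch in which a pretrained backbone stands in for the self-supervised encoder and plain k-means stands in for SCAN's learnable clustering step; both substitutions are simplifications.

```python
import torch
import torchvision.models as models
from sklearn.cluster import KMeans

# Step 1: semantically meaningful features. A supervised-pretrained ResNet
# is used here as a stand-in for a self-supervised encoder (e.g. SimCLR).
encoder = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
encoder.fc = torch.nn.Identity()  # drop the classification head
encoder.eval()

images = torch.randn(32, 3, 224, 224)  # placeholder batch
with torch.no_grad():
    feats = encoder(images).numpy()

# Step 2: cluster the frozen features into pseudo-classes.
labels = KMeans(n_clusters=10, n_init=10).fit_predict(feats)
```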
arXiv Detail & Related papers (2020-05-25T18:12:33Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.