Establishment of Neural Networks Robust to Label Noise
- URL: http://arxiv.org/abs/2211.15279v3
- Date: Mon, 24 Apr 2023 02:43:38 GMT
- Title: Establishment of Neural Networks Robust to Label Noise
- Authors: Pengwei Yang, Chongyangzi Teng and Jack George Mangos
- Abstract summary: In this paper, we examine the fundamental concepts underlying related label noise approaches.
A transition matrix estimator has been created, and its effectiveness against the actual transition matrix has been demonstrated.
We were unable to demonstrate the influence of transition-matrix noise correction on robustness because we could not adequately tune the complex convolutional neural network model.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Label noise is a significant obstacle in deep learning model training. It can
have a considerable impact on the performance of image classification models,
particularly deep neural networks, which are especially susceptible because
they have a strong propensity to memorise noisy labels. In this paper, we have
examined the fundamental concept underlying related label noise approaches. A
transition matrix estimator has been created, and its effectiveness against the
actual transition matrix has been demonstrated. In addition, we examined the
label noise robustness of two convolutional neural network classifiers with
LeNet and AlexNet designs. Results on the two FashionMNIST datasets revealed
the robustness of both models. We were unable to demonstrate the influence of
transition-matrix noise correction on robustness enhancements because time and
computing resource constraints prevented us from adequately tuning the complex
convolutional neural network model. Future research should fine-tune the
neural network model and explore the precision of the estimated transition
matrix.
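The paper does not include its estimator code, but the forward loss-correction idea behind transition-matrix noise correction can be sketched as follows: given a transition matrix T with T[i, j] = P(noisy label j | clean label i), the model's clean class posterior is mapped to a noisy-label distribution, and the loss is computed against the observed noisy label. The function names, the symmetric-noise example, and the flip rate 0.2 are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def forward_correct_probs(clean_probs, T):
    # Map the model's clean-label posterior to the noisy-label
    # distribution: p_noisy[j] = sum_i p_clean[i] * T[i, j].
    return clean_probs @ T

def forward_corrected_nll(clean_probs, noisy_label, T):
    # Negative log-likelihood of the observed (noisy) label under the
    # forward-corrected predictive distribution.
    noisy_probs = forward_correct_probs(clean_probs, T)
    return -np.log(noisy_probs[noisy_label])

# Illustrative example (assumption): symmetric label noise over 3 classes
# with flip rate 0.2, so each row of T has 0.8 on the diagonal and the
# remaining 0.2 spread evenly over the other classes.
rho, n = 0.2, 3
T = np.full((n, n), rho / (n - 1))
np.fill_diagonal(T, 1.0 - rho)

p_clean = np.array([0.8, 0.15, 0.05])  # hypothetical clean posterior
loss = forward_corrected_nll(p_clean, noisy_label=0, T=T)
```

Training a network with this corrected loss on noisy labels is what makes the quality of the estimated T matter: an inaccurate transition matrix distorts the noisy-label distribution the loss is measured against.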
Related papers
- Ambiguity in solving imaging inverse problems with deep learning based
operators [0.0]
Large convolutional neural networks have been widely used as tools for image deblurring.
Image deblurring is mathematically modeled as an ill-posed inverse problem and its solution is difficult to approximate when noise affects the data.
In this paper, we propose some strategies to improve stability without losing too much accuracy when deblurring images with deep-learning-based methods.
arXiv Detail & Related papers (2023-05-31T12:07:08Z) - An Adversarial Active Sampling-based Data Augmentation Framework for
Manufacturable Chip Design [55.62660894625669]
Lithography modeling is a crucial problem in chip design to ensure a chip design mask is manufacturable.
Recent developments in machine learning have provided alternative solutions in replacing the time-consuming lithography simulations with deep neural networks.
We propose a litho-aware data augmentation framework to resolve the dilemma of limited data and improve the machine learning model performance.
arXiv Detail & Related papers (2022-10-27T20:53:39Z) - A Study of Deep CNN Model with Labeling Noise Based on Granular-ball
Computing [0.0]
Granular ball computing is an efficient, robust and scalable learning method.
In this paper, we pioneered a granular ball neural network algorithm model, which adopts the idea of multi-granular to filter label noise samples during model training.
arXiv Detail & Related papers (2022-07-17T13:58:46Z) - From Environmental Sound Representation to Robustness of 2D CNN Models
Against Adversarial Attacks [82.21746840893658]
This paper investigates the impact of different standard environmental sound representations (spectrograms) on the recognition performance and adversarial attack robustness of a victim residual convolutional neural network.
We show that while the ResNet-18 model trained on DWT spectrograms achieves a high recognition accuracy, attacking this model is relatively more costly for the adversary.
arXiv Detail & Related papers (2022-04-14T15:14:08Z) - Data-driven emergence of convolutional structure in neural networks [83.4920717252233]
We show how fully-connected neural networks solving a discrimination task can learn a convolutional structure directly from their inputs.
By carefully designing data models, we show that the emergence of this pattern is triggered by the non-Gaussian, higher-order local structure of the inputs.
arXiv Detail & Related papers (2022-02-01T17:11:13Z) - Learning to Rectify for Robust Learning with Noisy Labels [25.149277009932423]
We propose warped probabilistic inference (WarPI) to achieve adaptively rectifying the training procedure for the classification network.
We evaluate WarPI on four benchmarks of robust learning with noisy labels and achieve the new state-of-the-art under variant noise types.
arXiv Detail & Related papers (2021-11-08T02:25:50Z) - Robust Learning of Recurrent Neural Networks in Presence of Exogenous
Noise [22.690064709532873]
We propose a tractable robustness analysis for RNN models subject to input noise.
The robustness measure can be estimated efficiently using linearization techniques.
Our proposed methodology significantly improves robustness of recurrent neural networks.
arXiv Detail & Related papers (2021-05-03T16:45:05Z) - Non-Singular Adversarial Robustness of Neural Networks [58.731070632586594]
Adversarial robustness has become an emerging challenge for neural networks owing to their over-sensitivity to small input perturbations.
We formalize the notion of non-singular adversarial robustness for neural networks through the lens of joint perturbations to data inputs as well as model weights.
arXiv Detail & Related papers (2021-02-23T20:59:30Z) - Learning Noise-Aware Encoder-Decoder from Noisy Labels by Alternating
Back-Propagation for Saliency Detection [54.98042023365694]
We propose a noise-aware encoder-decoder framework to disentangle a clean saliency predictor from noisy training examples.
The proposed model consists of two sub-models parameterized by neural networks.
arXiv Detail & Related papers (2020-07-23T18:47:36Z) - Robust Processing-In-Memory Neural Networks via Noise-Aware
Normalization [26.270754571140735]
PIM accelerators often suffer from intrinsic noise in the physical components.
We propose a noise-agnostic method to achieve robust neural network performance against any noise setting.
arXiv Detail & Related papers (2020-07-07T06:51:28Z) - Rectified Meta-Learning from Noisy Labels for Robust Image-based Plant
Disease Diagnosis [64.82680813427054]
Plant diseases serve as one of main threats to food security and crop production.
One popular approach is to transform this problem into a leaf image classification task, which can be addressed by powerful convolutional neural networks (CNNs).
We propose a novel framework that incorporates rectified meta-learning module into common CNN paradigm to train a noise-robust deep network without using extra supervision information.
arXiv Detail & Related papers (2020-03-17T09:51:30Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.