Matrix Smoothing: A Regularization for DNN with Transition Matrix under Noisy Labels
- URL: http://arxiv.org/abs/2003.11904v1
- Date: Thu, 26 Mar 2020 13:49:37 GMT
- Title: Matrix Smoothing: A Regularization for DNN with Transition Matrix under Noisy Labels
- Authors: Xianbin Lv, Dongxian Wu, Shu-Tao Xia
- Abstract summary: Training deep neural networks (DNNs) in the presence of noisy labels is an important and challenging task.
Recent probabilistic methods directly apply the transition matrix to the DNN, neglecting the DNN's susceptibility to overfitting.
We propose a novel method, in which a smoothed transition matrix is used for updating the DNN, to restrict this overfitting.
- Score: 54.585681272543056
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Training deep neural networks (DNNs) in the presence of noisy labels is an
important and challenging task. Probabilistic modeling, which consists of a
classifier and a transition matrix, depicts the transformation from true labels
to noisy labels and is a promising approach. However, recent probabilistic
methods directly apply the transition matrix to the DNN, neglecting the DNN's
susceptibility to overfitting, and achieve unsatisfactory performance,
especially under uniform noise. In this paper, inspired by label smoothing, we
propose a novel method, in which a smoothed transition matrix is used for
updating the DNN, to restrict overfitting of the DNN in probabilistic
modeling. Our method is termed Matrix Smoothing. We also empirically
demonstrate that our method not only significantly improves the robustness of
probabilistic modeling, but also obtains a better estimation of the transition
matrix.
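To make the idea concrete, here is a minimal sketch of forward correction with a smoothed transition matrix, assuming the smoothing interpolates T with the uniform matrix in the spirit of label smoothing; the smoothing rate `eps` and this exact parameterization are illustrative, not necessarily the paper's formulation.

```python
import torch
import torch.nn.functional as F

def matrix_smoothing_loss(logits, noisy_labels, T, eps=0.1):
    """Forward-corrected cross-entropy with a smoothed transition matrix.

    T[i, j] is the (estimated) probability that true class i is observed
    as noisy class j. Interpolating T with the uniform matrix mirrors
    label smoothing; this is one plausible instantiation, and the paper's
    exact smoothing scheme may differ.
    """
    num_classes = T.size(0)
    uniform = torch.full_like(T, 1.0 / num_classes)
    T_smooth = (1.0 - eps) * T + eps * uniform     # stays row-stochastic

    clean_posterior = F.softmax(logits, dim=1)     # p(y_true | x)
    noisy_posterior = clean_posterior @ T_smooth   # p(y_noisy | x)
    return F.nll_loss(torch.log(noisy_posterior + 1e-12), noisy_labels)
```

With `eps=0`, this reduces to the standard forward correction that the paper identifies as prone to overfitting; a larger `eps` flattens the matrix and acts as a regularizer.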
Related papers
- Matrix Completion via Nonsmooth Regularization of Fully Connected Neural Networks [7.349727826230864]
It has been shown that enhanced performance could be attained by using nonlinear estimators such as deep neural networks.
In this paper, we control over-fitting by regularizing the FCNN model in terms of the norm of its intermediate representations.
Our simulations indicate the superiority of the proposed algorithm in comparison with existing linear and nonlinear algorithms.
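A minimal sketch of this regularizer, assuming an L1 (nonsmooth) penalty on the hidden activations of a small fully connected network; the names and the masking scheme are illustrative.

```python
import torch
import torch.nn as nn

class FCNN(nn.Module):
    """Small fully connected net that exposes its intermediate representation."""
    def __init__(self, d_in, d_hidden, d_out):
        super().__init__()
        self.fc1 = nn.Linear(d_in, d_hidden)
        self.fc2 = nn.Linear(d_hidden, d_out)

    def forward(self, x):
        h = torch.relu(self.fc1(x))
        return self.fc2(h), h

def regularized_loss(model, x, target, observed_mask, lam=1e-3):
    # Reconstruction error on observed entries plus a nonsmooth (L1)
    # norm penalty on the intermediate representation to control
    # over-fitting -- an illustrative reading of the paper's regularizer.
    pred, h = model(x)
    recon = ((pred - target)[observed_mask] ** 2).mean()
    return recon + lam * h.abs().sum()
```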
arXiv Detail & Related papers (2024-03-15T12:00:37Z)
- CoRMF: Criticality-Ordered Recurrent Mean Field Ising Solver [4.364088891019632]
We propose an efficient RNN-based Ising model solver, the Criticality-ordered Recurrent Mean Field (CoRMF).
By leveraging the approximated tree structure of the underlying Ising graph, the newly obtained criticality order enables the unification of variational mean-field and RNN methods.
CoRMF solves Ising problems in a self-training fashion without data/evidence, and inference can be executed by directly sampling from the RNN.
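A compact sketch of the self-training idea, using a variational autoregressive RNN that minimizes an estimated free energy over its own samples; the criticality-based variable ordering from the paper is omitted, and all names here are illustrative.

```python
import torch
import torch.nn as nn

class AutoregressiveIsingSampler(nn.Module):
    """Toy RNN that generates Ising spins one variable at a time."""
    def __init__(self, n_spins, hidden=32):
        super().__init__()
        self.n_spins = n_spins
        self.rnn = nn.GRUCell(1, hidden)
        self.head = nn.Linear(hidden, 1)

    def sample(self, batch):
        h = torch.zeros(batch, self.rnn.hidden_size)
        prev = torch.zeros(batch, 1)
        spins, log_q = [], 0.0
        for _ in range(self.n_spins):
            h = self.rnn(prev, h)
            p_up = torch.sigmoid(self.head(h)).squeeze(1)
            s = torch.bernoulli(p_up)                          # 1 = up, 0 = down
            log_q = log_q + torch.log(torch.where(s > 0, p_up, 1 - p_up) + 1e-12)
            spins.append(2 * s - 1)                            # map to +/-1 spins
            prev = (2 * s - 1).unsqueeze(1)
        return torch.stack(spins, dim=1), log_q

def free_energy_loss(sampler, J, beta=1.0, batch=64):
    # REINFORCE-style surrogate for the variational free energy
    # F = E_q[E(s)] + (1/beta) E_q[log q(s)]; no training data needed.
    s, log_q = sampler.sample(batch)
    energy = -0.5 * torch.einsum('bi,ij,bj->b', s, J, s)       # Ising energy
    f = energy + log_q / beta
    return ((f - f.mean()).detach() * log_q).mean()
```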
arXiv Detail & Related papers (2024-03-05T16:55:06Z)
- Dirichlet-based Per-Sample Weighting by Transition Matrix for Noisy Label Learning [18.688802981093644]
We propose a new transition-matrix utilization method based on resampling, coined RENT.
RENT consistently outperforms existing transition-matrix utilization methods, including reweighting, on various benchmark datasets.
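As a rough illustration of resampling-based utilization, one can draw a new batch with transition-matrix-derived importance weights and then train with plain cross-entropy on the drawn indices; the weighting below is a simplified stand-in, not RENT's actual Dirichlet-based scheme.

```python
import torch
import torch.nn.functional as F

def resample_indices(logits, noisy_labels, T):
    """Draw a resampled mini-batch using T-derived importance weights.

    Weight each example by p(y_true = y_obs | x) / p(y_noisy = y_obs | x),
    a simplified stand-in for RENT's Dirichlet-based resampling.
    """
    clean = F.softmax(logits, dim=1)
    noisy = clean @ T                              # forward-corrected posterior
    idx = torch.arange(len(noisy_labels))
    w = clean[idx, noisy_labels] / (noisy[idx, noisy_labels] + 1e-12)
    return torch.multinomial(w, len(w), replacement=True)
```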
arXiv Detail & Related papers (2024-03-05T06:20:49Z)
- Learning Noise Transition Matrix from Only Noisy Labels via Total Variation Regularization [88.91872713134342]
We propose a theoretically grounded method that can estimate the noise transition matrix and learn a classifier simultaneously.
We show the effectiveness of the proposed method through experiments on benchmark and real-world datasets.
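A minimal sketch of the joint objective, assuming the total-variation term is applied pairwise to predicted clean posteriors to keep them scattered; the random pairing and the weight `lam` are illustrative.

```python
import torch
import torch.nn.functional as F

def tv_regularized_loss(logits, noisy_labels, T, lam=0.1):
    """Forward-corrected cross-entropy plus a total-variation promoter.

    T can be a learnable row-stochastic matrix trained jointly with the
    classifier; maximizing pairwise TV between predicted clean posteriors
    is an illustrative reading of the paper's regularizer.
    """
    clean = F.softmax(logits, dim=1)
    noisy = clean @ T
    ce = F.nll_loss(torch.log(noisy + 1e-12), noisy_labels)

    perm = torch.randperm(clean.size(0))
    tv = 0.5 * (clean - clean[perm]).abs().sum(dim=1).mean()
    return ce - lam * tv             # subtract: larger TV = more scattered
```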
arXiv Detail & Related papers (2021-02-04T05:09:18Z)
- Provably End-to-end Label-Noise Learning without Anchor Points [118.97592870124937]
We propose an end-to-end framework for solving label-noise learning without anchor points.
Our proposed framework can identify the transition matrix if the clean class-posterior probabilities are sufficiently scattered.
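One common way to encode the sufficiently-scattered condition end-to-end is to keep T learnable and penalize its volume; the sketch below uses a softmax parameterization and a log-determinant surrogate, which is an assumption rather than the paper's exact objective.

```python
import torch
import torch.nn.functional as F

def anchor_free_loss(logits, noisy_labels, T_logits, lam=1e-4):
    """End-to-end label-noise loss without anchor points (sketch).

    T_logits is a learnable (C, C) parameter; a row-wise softmax keeps
    T row-stochastic, and log|det T| serves as a volume penalty.
    """
    T = F.softmax(T_logits, dim=1)
    clean = F.softmax(logits, dim=1)
    noisy = clean @ T
    ce = F.nll_loss(torch.log(noisy + 1e-12), noisy_labels)
    vol = torch.linalg.slogdet(T).logabsdet    # shrink the simplex volume
    return ce + lam * vol
```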
arXiv Detail & Related papers (2021-02-04T03:59:37Z)
- Extended T: Learning with Mixed Closed-set and Open-set Noisy Labels [86.5943044285146]
The label noise transition matrix $T$ reflects the probabilities that true labels flip into noisy ones.
In this paper, we focus on learning under the mixed closed-set and open-set label noise.
Our method can better model the mixed label noise, as evidenced by its more robust performance compared with prior state-of-the-art label-noise learning methods.
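As a small illustration of what "extending" T can mean, one can append an extra open-set column to a closed-set transition matrix; the construction below is a simplified toy, not the paper's estimator.

```python
import torch

def extend_transition_matrix(T_closed, open_rate=0.1):
    """Append an open-set column to a closed-set transition matrix.

    With probability `open_rate`, a sample is observed outside the C
    known classes; rows remain stochastic. Illustrative only.
    """
    c = T_closed.size(0)
    T = torch.zeros(c, c + 1)
    T[:, :c] = (1.0 - open_rate) * T_closed
    T[:, c] = open_rate
    return T

# Example: symmetric 20% closed-set noise over 3 classes.
T_closed = torch.full((3, 3), 0.1)
T_closed.fill_diagonal_(0.8)
print(extend_transition_matrix(T_closed))   # rows still sum to 1
```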
arXiv Detail & Related papers (2020-12-02T02:42:45Z)
- Delving Deep into Label Smoothing [112.24527926373084]
Label smoothing is an effective regularization tool for deep neural networks (DNNs).
We present an Online Label Smoothing (OLS) strategy, which generates soft labels based on the statistics of the model prediction for the target category.
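A minimal sketch of the OLS idea, keeping one running soft label per class from the model's predictions on correctly classified samples; the exponential-moving-average update below is a simplification of the paper's epoch-wise accumulation.

```python
import torch
import torch.nn.functional as F

class OnlineLabelSmoother:
    """Running per-class soft labels built from the model's own predictions."""

    def __init__(self, num_classes, momentum=0.9):
        self.soft = torch.eye(num_classes)   # start from one-hot targets
        self.m = momentum

    @torch.no_grad()
    def update(self, logits, labels):
        probs = F.softmax(logits, dim=1)
        correct = probs.argmax(dim=1) == labels
        for c in labels[correct].unique():
            mean_p = probs[correct & (labels == c)].mean(dim=0)
            self.soft[c] = self.m * self.soft[c] + (1 - self.m) * mean_p

    def loss(self, logits, labels):
        target = self.soft[labels]                    # per-sample soft label
        return -(target * F.log_softmax(logits, dim=1)).sum(dim=1).mean()
```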
arXiv Detail & Related papers (2020-11-25T08:03:11Z)
- Dual T: Reducing Estimation Error for Transition Matrix in Label-noise Learning [157.2709657207203]
Existing methods for estimating the transition matrix rely heavily on estimating the noisy class posterior.
We introduce an intermediate class to avoid directly estimating the noisy class posterior.
With this intermediate class, the original transition matrix can be factorized into the product of two easy-to-estimate transition matrices.
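A sketch of the factorized estimate, taking the intermediate class to be the model's predicted label: one factor is read from model posteriors at anchor-like examples, the other is a simple count matrix; details are a simplified reading of the paper.

```python
import torch

@torch.no_grad()
def dual_t_estimate(probs, preds, noisy_labels, anchor_idx):
    """Estimate T as a product of two easier-to-estimate matrices.

    probs        : (N, C) model posteriors over the intermediate class.
    preds        : (N,) argmax predictions (the intermediate labels).
    noisy_labels : (N,) observed noisy labels.
    anchor_idx   : length-C indices of one confident example per true
                   class (how anchors are chosen is left open here).
    """
    c = probs.shape[1]
    # T_spade[i, l] ~ P(intermediate = l | true = i): model posterior
    # at an example believed to be of true class i.
    t_spade = probs[anchor_idx]                        # (C, C)
    # T_club[l, j] ~ P(noisy = j | intermediate = l): both quantities
    # are observed, so simple counting suffices.
    t_club = torch.zeros(c, c)
    for l, j in zip(preds.tolist(), noisy_labels.tolist()):
        t_club[l, j] += 1
    t_club = t_club / t_club.sum(dim=1, keepdim=True).clamp(min=1)
    return t_spade @ t_club                            # factorized estimate
```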
arXiv Detail & Related papers (2020-06-14T05:48:20Z)
- Causality-aware counterfactual confounding adjustment for feature representations learned by deep models [14.554818659491644]
Causal modeling has been recognized as a potential solution to many challenging problems in machine learning (ML).
We describe how a recently proposed counterfactual approach can still be used to deconfound the feature representations learned by deep neural network (DNN) models.
arXiv Detail & Related papers (2020-04-20T17:37:36Z)