Combined Cleaning and Resampling Algorithm for Multi-Class Imbalanced
Data with Label Noise
- URL: http://arxiv.org/abs/2004.03406v1
- Date: Tue, 7 Apr 2020 13:59:35 GMT
- Title: Combined Cleaning and Resampling Algorithm for Multi-Class Imbalanced
Data with Label Noise
- Authors: Michał Koziarski, Michał Woźniak, Bartosz Krawczyk
- Abstract summary: In this paper, we propose a novel oversampling technique, the Multi-Class Combined Cleaning and Resampling (MC-CCR) algorithm.
The proposed method uses an energy-based approach to model the regions suitable for oversampling, which is less affected by small disjuncts and outliers than SMOTE.
It combines this with a simultaneous cleaning operation that aims to reduce the effect of overlapping class distributions on the performance of the learning algorithms.
- Score: 11.868507571027626
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Imbalanced data classification is one of the most crucial tasks facing
modern data analysis. Especially when combined with other difficulty factors,
such as the presence of noise, overlapping class distributions, and small
disjuncts, data imbalance can significantly impact the classification
performance. Furthermore, some of the data difficulty factors are known to
affect the performance of the existing oversampling strategies, in particular
SMOTE and its derivatives. This effect is especially pronounced in the
multi-class setting, in which the mutual imbalance relationships between the
classes complicate the problem even further. Despite that, most of the
contemporary research in the area of data imbalance focuses on binary
classification problems, while their more difficult multi-class counterparts
remain relatively unexplored. In this paper, we propose a novel oversampling
technique, the Multi-Class Combined Cleaning and Resampling (MC-CCR) algorithm.
The proposed method uses an energy-based approach to model the regions suitable
for oversampling, which is less affected by small disjuncts and outliers than
SMOTE, and combines it with a simultaneous cleaning operation whose aim is to
reduce the effect of overlapping class distributions on the performance of the
learning algorithms. Finally, by incorporating a dedicated strategy for handling
multi-class problems, MC-CCR is less affected by the loss of information about
the inter-class relationships than the traditional multi-class decomposition
strategies. Experiments carried out on many multi-class imbalanced benchmark
datasets show the high robustness of the proposed approach to noise, as well as
its high quality compared to the state-of-the-art methods.
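To make the energy-based cleaning-and-resampling idea in the abstract more concrete, the following is a minimal, illustrative sketch in NumPy. It is not the authors' reference implementation: the exact energy accounting, the 1/radius sampling weights, and the function name ccr_style_resample are simplifying assumptions, and MC-CCR's dedicated multi-class decomposition strategy is omitted.

```python
import numpy as np

def ccr_style_resample(X_min, X_maj, energy=0.25, rng=None):
    """Illustrative sketch of an energy-based cleaning-and-resampling step.

    A sphere is expanded around every minority example until its energy
    budget is spent, with each enclosed majority example making further
    expansion costlier. Majority examples caught inside a sphere are pushed
    to its boundary (cleaning), and synthetic minority examples are drawn
    inside the spheres, with smaller spheres (harder regions) receiving
    proportionally more of them.
    """
    rng = np.random.default_rng(rng)
    X_maj = X_maj.copy()
    radii = np.zeros(len(X_min))

    # 1) Expand a sphere around each minority example under the energy budget.
    for i, x in enumerate(X_min):
        d = np.sort(np.linalg.norm(X_maj - x, axis=1))
        budget, r, inside = energy, 0.0, 0
        for dist in d:
            cost = (dist - r) * (inside + 1)  # expansion costs more once more
            if cost > budget:                 # majority examples are enclosed
                break
            budget -= cost
            r, inside = dist, inside + 1
        radii[i] = r + budget / (inside + 1)  # spend any leftover energy

    # 2) Cleaning: move majority examples out of every sphere that covers them.
    for x, r in zip(X_min, radii):
        d = np.linalg.norm(X_maj - x, axis=1)
        caught = (d < r) & (d > 0)
        X_maj[caught] += (X_maj[caught] - x) * ((r - d[caught]) / d[caught])[:, None]

    # 3) Oversampling: smaller spheres receive more synthetic examples.
    n_needed = max(len(X_maj) - len(X_min), 0)
    weights = (1.0 / radii) / np.sum(1.0 / radii)
    counts = rng.multinomial(n_needed, weights)
    synthetic = []
    for x, r, k in zip(X_min, radii, counts):
        # uniform random directions, radii scaled to stay inside the sphere
        v = rng.normal(size=(k, X_min.shape[1]))
        v /= np.linalg.norm(v, axis=1, keepdims=True)
        synthetic.append(x + v * rng.uniform(0, r, size=(k, 1)))
    X_syn = np.vstack(synthetic) if synthetic else np.empty((0, X_min.shape[1]))
    return np.vstack([X_min, X_syn]), X_maj
```

For the multi-class case described in the abstract, a step of this kind would presumably be applied per minority class against a combined view of the remaining classes, rather than via one-vs-one or one-vs-rest decomposition; that class-handling logic is intentionally left out of the sketch.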
Related papers
- Tackling Diverse Minorities in Imbalanced Classification [80.78227787608714]
Imbalanced datasets are commonly observed in various real-world applications, presenting significant challenges in training classifiers.
We propose generating synthetic samples iteratively by mixing data samples from both minority and majority classes.
We demonstrate the effectiveness of our proposed framework through extensive experiments conducted on seven publicly available benchmark datasets.
arXiv Detail & Related papers (2023-08-28T18:48:34Z)
- A review of ensemble learning and data augmentation models for class imbalanced problems: combination, implementation and evaluation [0.196629787330046]
Class imbalance (CI) in classification problems arises when the number of observations belonging to one class is lower than that of the other classes.
In this paper, we evaluate data augmentation and ensemble learning methods used to address prominent benchmark CI problems.
arXiv Detail & Related papers (2023-04-06T04:37:10Z)
- Compound Batch Normalization for Long-tailed Image Classification [77.42829178064807]
We propose a compound batch normalization method based on a Gaussian mixture.
It can model the feature space more comprehensively and reduce the dominance of head classes.
The proposed method outperforms existing methods on long-tailed image classification.
arXiv Detail & Related papers (2022-12-02T07:31:39Z)
- A Hybrid Approach for Binary Classification of Imbalanced Data [0.0]
We propose HADR, a hybrid approach with dimension reduction that consists of data block construction, dimensionality reduction, and ensemble learning.
We evaluate the performance on eight imbalanced public datasets in terms of recall, G-mean, and AUC.
arXiv Detail & Related papers (2022-07-06T15:18:41Z)
- Consistency and Diversity induced Human Motion Segmentation [231.36289425663702]
We propose a novel Consistency and Diversity induced human Motion (CDMS) algorithm.
Our model factorizes the source and target data into distinct multi-layer feature spaces.
A multi-mutual learning strategy is carried out to reduce the domain gap between the source and target data.
arXiv Detail & Related papers (2022-02-10T06:23:56Z)
- Envelope Imbalance Learning Algorithm based on Multilayer Fuzzy C-means Clustering and Minimum Interlayer discrepancy [14.339674126923903]
This paper proposes a deep instance envelope network-based imbalanced learning algorithm with the multilayer fuzzy c-means (MlFCM) and a minimum interlayer discrepancy mechanism based on the maximum mean discrepancy (MIDMD).
This algorithm can guarantee high quality balanced instances using a deep instance envelope network in the absence of prior knowledge.
arXiv Detail & Related papers (2021-11-02T04:59:57Z)
- Learning with Multiclass AUC: Theory and Algorithms [141.63211412386283]
Area under the ROC curve (AUC) is a well-known ranking metric for problems such as imbalanced learning and recommender systems.
In this paper, we start an early trial to consider the problem of learning multiclass scoring functions via optimizing multiclass AUC metrics.
arXiv Detail & Related papers (2021-07-28T05:18:10Z)
- Capturing scattered discriminative information using a deep architecture in acoustic scene classification [49.86640645460706]
In this study, we investigate various methods to capture discriminative information and simultaneously mitigate the overfitting problem.
We adopt a max feature map method to replace conventional non-linear activations in a deep neural network.
Two data augmentation methods and two deep architecture modules are further explored to reduce overfitting and sustain the system's discriminative power.
arXiv Detail & Related papers (2020-07-09T08:32:06Z)
- Investigating Class-level Difficulty Factors in Multi-label Classification Problems [23.51529285126783]
This work investigates the use of class-level difficulty factors in multi-label classification problems for the first time.
Four difficulty factors are proposed: frequency, visual variation, semantic abstraction, and class co-occurrence.
These difficulty factors are shown to have several potential applications including the prediction of class-level performance across datasets.
arXiv Detail & Related papers (2020-05-01T15:06:53Z)
- Long-Tailed Recognition Using Class-Balanced Experts [128.73438243408393]
We propose an ensemble of class-balanced experts that combines the strength of diverse classifiers.
Our ensemble of class-balanced experts reaches results close to state-of-the-art and an extended ensemble establishes a new state-of-the-art on two benchmarks for long-tailed recognition.
arXiv Detail & Related papers (2020-04-07T20:57:44Z)