Imbalanced Data Learning by Minority Class Augmentation using Capsule
Adversarial Networks
- URL: http://arxiv.org/abs/2004.02182v3
- Date: Wed, 8 Apr 2020 07:47:38 GMT
- Title: Imbalanced Data Learning by Minority Class Augmentation using Capsule
Adversarial Networks
- Authors: Pourya Shamsolmoali, Masoumeh Zareapoor, Linlin Shen, Abdul Hamid
Sadka, Jie Yang
- Abstract summary: We propose a method to restore the balance in imbalanced images by coalescing two concurrent methods.
In our model, generative and discriminative networks play a novel competitive game.
The capsule-GAN combination is effective at recognizing highly overlapping classes with far fewer parameters than a convolutional GAN.
- Score: 31.073558420480964
- License: http://creativecommons.org/publicdomain/zero/1.0/
- Abstract: The fact that image datasets are often imbalanced poses an intense challenge
for deep learning techniques. In this paper, we propose a method to restore the
balance in imbalanced images by coalescing two concurrent methods, generative
adversarial networks (GANs) and capsule networks. In our model, generative and
discriminative networks play a novel competitive game, in which the generator
generates samples toward specific classes from a multivariate probability
distribution. The discriminator of our model is designed so that, while
distinguishing real from fake samples, it is also required to assign classes
to its inputs. Since GAN approaches require fully observed data during
training, they tend to generate similar samples when the training set is
imbalanced, which leads to overfitting. This problem is addressed by providing
all the available information from both class components jointly during
adversarial training, which improves learning from imbalanced data by
incorporating the structure of the majority distribution into the generation
of new minority samples. Furthermore, the generator is trained with a feature
matching loss to improve training convergence; this also prevents the
generation of outliers and leaves the majority class space unaffected. The
evaluations show the effectiveness of the proposed methodology; in particular,
the capsule-GAN combination is effective at recognizing highly overlapping
classes with far fewer parameters than a convolutional GAN.
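To make the two training signals described in the abstract concrete, here is a minimal sketch (not the authors' released code) of a discriminator loss that jointly scores real/fake and assigns classes, plus a feature-matching loss for the generator. The PyTorch setup and the module names (`D.features`, `D.adv_head`, `D.cls_head`, `G`) are assumptions for illustration only.

```python
# Minimal sketch of the losses described above; module names are illustrative.
import torch
import torch.nn.functional as F

def discriminator_loss(D, real_x, real_y, fake_x, fake_y):
    """Adversarial (real/fake) term plus class-assignment term."""
    real_feat = D.features(real_x)            # shared backbone (capsule layers in the paper)
    fake_feat = D.features(fake_x)
    ones = torch.ones(real_x.size(0), 1, device=real_x.device)
    zeros = torch.zeros(fake_x.size(0), 1, device=fake_x.device)
    adv = F.binary_cross_entropy_with_logits(D.adv_head(real_feat), ones) \
        + F.binary_cross_entropy_with_logits(D.adv_head(fake_feat), zeros)
    cls = F.cross_entropy(D.cls_head(real_feat), real_y) \
        + F.cross_entropy(D.cls_head(fake_feat), fake_y)
    return adv + cls

def generator_loss(D, G, z, target_y, real_x):
    """Feature matching: match the mean discriminator features of generated
    and real batches, which discourages outliers far from the data manifold."""
    fake_x = G(z, target_y)                   # class-conditional generation
    real_stats = D.features(real_x).mean(dim=0)
    fake_stats = D.features(fake_x).mean(dim=0)
    return F.mse_loss(fake_stats, real_stats)
```

Presumably, `target_y` would be drawn mostly from minority classes so that generation replenishes the under-represented part of the dataset, but the exact sampling scheme is not specified in the abstract.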
Related papers
- Probabilistic Contrastive Learning for Long-Tailed Visual Recognition [78.70453964041718]
Long-tailed distributions frequently emerge in real-world data, where a large number of minority categories contain a limited number of samples.
Recent investigations have revealed that supervised contrastive learning exhibits promising potential in alleviating the data imbalance.
We propose a novel probabilistic contrastive (ProCo) learning algorithm that estimates the data distribution of the samples from each class in the feature space.
arXiv Detail & Related papers (2024-03-11T13:44:49Z)
- Tackling Diverse Minorities in Imbalanced Classification [80.78227787608714]
Imbalanced datasets are commonly observed in various real-world applications, presenting significant challenges in training classifiers.
We propose generating synthetic samples iteratively by mixing data samples from both minority and majority classes (a sketch of this mixing idea appears after this list).
We demonstrate the effectiveness of our proposed framework through extensive experiments conducted on seven publicly available benchmark datasets.
arXiv Detail & Related papers (2023-08-28T18:48:34Z)
- Fair GANs through model rebalancing for extremely imbalanced class distributions [5.463417677777276]
We present an approach to construct an unbiased generative adversarial network (GAN) from an existing biased GAN.
We show results for the StyleGAN2 models while training on the Flickr Faces High Quality (FFHQ) dataset for racial fairness.
We further validate our approach by applying it to an imbalanced CIFAR10 dataset which is also twice as large.
arXiv Detail & Related papers (2023-08-16T19:20:06Z)
- Class-Balancing Diffusion Models [57.38599989220613]
Class-Balancing Diffusion Models (CBDM) are trained with a distribution adjustment regularizer as a solution.
We benchmark the generation results on the CIFAR100/CIFAR100LT datasets and show outstanding performance on the downstream recognition task.
arXiv Detail & Related papers (2023-04-30T20:00:14Z)
- Integrating Local Real Data with Global Gradient Prototypes for Classifier Re-Balancing in Federated Long-Tailed Learning [60.41501515192088]
Federated Learning (FL) has become a popular distributed learning paradigm that involves multiple clients training a global model collaboratively.
The data samples usually follow a long-tailed distribution in the real world, and FL on the decentralized and long-tailed data yields a poorly-behaved global model.
In this work, we integrate local real data with global gradient prototypes to form locally balanced datasets.
arXiv Detail & Related papers (2023-01-25T03:18:10Z)
- Learning to Re-weight Examples with Optimal Transport for Imbalanced Classification [74.62203971625173]
Imbalanced data pose challenges for deep learning based classification models.
One of the most widely-used approaches for tackling imbalanced data is re-weighting.
We propose a novel re-weighting method based on optimal transport (OT) from a distributional point of view.
arXiv Detail & Related papers (2022-08-05T01:23:54Z)
- Efficient Augmentation for Imbalanced Deep Learning [8.38844520504124]
We study a convolutional neural network's internal representation of imbalanced image data.
We measure the generalization gap between a model's feature embeddings in the training and test sets, showing that the gap is wider for minority classes (a sketch of one such gap measure appears after this list).
This insight enables us to design an efficient three-phase CNN training framework for imbalanced data.
arXiv Detail & Related papers (2022-07-13T09:43:17Z)
- Class Balancing GAN with a Classifier in the Loop [58.29090045399214]
We introduce a novel theoretically motivated Class Balancing regularizer for training GANs.
Our regularizer makes use of the knowledge from a pre-trained classifier to ensure balanced learning of all the classes in the dataset (see the sketch after this list).
We demonstrate the utility of our regularizer for learning representations of long-tailed distributions, achieving better performance than existing approaches on multiple datasets.
arXiv Detail & Related papers (2021-06-17T11:41:30Z)
- Mitigating Dataset Imbalance via Joint Generation and Classification [17.57577266707809]
Supervised deep learning methods are enjoying enormous success in many practical applications of computer vision.
However, their marked performance degradation under biased and imbalanced data calls their reliability into question.
We introduce a joint dataset repairment strategy that combines a neural network classifier with generative adversarial networks (GANs).
We show that the combined training helps to improve the robustness of both the classifier and the GAN against severe class imbalance.
arXiv Detail & Related papers (2020-08-12T18:40:38Z)
- Oversampling Adversarial Network for Class-Imbalanced Fault Diagnosis [12.526197448825968]
The class-imbalance problem requires a robust learning system that can predict and classify the data in a timely manner.
We propose a new adversarial network for simultaneous classification and fault detection.
arXiv Detail & Related papers (2020-08-07T10:12:07Z)
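As referenced above, here is a simple illustration of the minority/majority mixing idea summarized in "Tackling Diverse Minorities in Imbalanced Classification". The Beta-distributed mixing coefficient and the bias toward the minority sample are assumptions for this sketch, not the paper's exact procedure.

```python
# Illustration only: synthesize a minority-class sample by convexly mixing a
# minority example with another (possibly majority) example.
import numpy as np

def mix_minority_sample(x_minority, x_other, alpha=0.75, rng=None):
    rng = rng or np.random.default_rng()
    lam = rng.beta(alpha, alpha)        # assumed Beta prior on the mixing weight
    lam = max(lam, 1.0 - lam)           # keep the synthetic point closer to the minority example
    return lam * x_minority + (1.0 - lam) * x_other
```

Repeating this until the class counts are equal yields an augmented, roughly balanced training set.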
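The gap measure mentioned in "Efficient Augmentation for Imbalanced Deep Learning" can be illustrated as follows; the specific metric (Euclidean distance between per-class embedding centroids in the training and test sets) is an assumption made for this sketch.

```python
# Illustration: per-class "generalization gap" as the distance between the
# mean feature embedding of a class in the training set and in the test set.
import numpy as np

def per_class_embedding_gap(train_feats, train_labels, test_feats, test_labels, num_classes):
    gaps = []
    for c in range(num_classes):
        mu_train = train_feats[train_labels == c].mean(axis=0)
        mu_test = test_feats[test_labels == c].mean(axis=0)
        gaps.append(float(np.linalg.norm(mu_train - mu_test)))
    return gaps  # expected to be larger for minority classes
```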
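Similarly, the regularizer in "Class Balancing GAN with a Classifier in the Loop" can be read as a balance penalty computed by a frozen, pre-trained classifier. The KL-to-uniform formulation below is a hedged guess at such a term, not the paper's exact objective.

```python
# Hedged sketch: penalize imbalance in the classes a frozen, pre-trained
# classifier assigns to generated images, by pulling the batch-average class
# distribution toward uniform via a KL divergence.
import torch
import torch.nn.functional as F

def class_balance_regularizer(pretrained_classifier, fake_images):
    # The classifier's weights stay frozen; gradients reach the generator
    # only through fake_images.
    probs = F.softmax(pretrained_classifier(fake_images), dim=1)  # (batch, num_classes)
    avg = probs.mean(dim=0).clamp_min(1e-8)                       # batch-average class usage
    uniform = torch.full_like(avg, 1.0 / avg.numel())
    return torch.sum(avg * (avg / uniform).log())                 # KL(avg || uniform)
```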