EC-GAN: Low-Sample Classification using Semi-Supervised Algorithms and
GANs
- URL: http://arxiv.org/abs/2012.15864v2
- Date: Wed, 31 Mar 2021 17:13:04 GMT
- Title: EC-GAN: Low-Sample Classification using Semi-Supervised Algorithms and
GANs
- Authors: Ayaan Haque
- Abstract summary: Semi-supervised learning has been gaining attention as it allows for performing image analysis tasks such as classification with limited labeled data.
Some popular algorithms using Generative Adversarial Networks (GANs) for semi-supervised classification share a single architecture for classification and discrimination.
This may require a model to converge to a separate data distribution for each task, which may reduce overall performance.
We propose a novel GAN model, External Classifier GAN (EC-GAN), that utilizes GANs and semi-supervised algorithms to improve classification in fully-supervised regimes.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Semi-supervised learning has been gaining attention as it allows for
performing image analysis tasks such as classification with limited labeled
data. Some popular algorithms using Generative Adversarial Networks (GANs) for
semi-supervised classification share a single architecture for classification
and discrimination. However, this may require a model to converge to a separate
data distribution for each task, which may reduce overall performance. While
progress in semi-supervised learning has been made, less addressed are
small-scale, fully-supervised tasks where even unlabeled data is unavailable
and unattainable. We therefore propose a novel GAN model, External
Classifier GAN (EC-GAN), that utilizes GANs and semi-supervised algorithms to
improve classification in fully-supervised regimes. Our method leverages a GAN
to generate artificial data used to supplement supervised classification. More
specifically, we attach an external classifier, hence the name EC-GAN, to the
GAN's generator, as opposed to sharing an architecture with the discriminator.
Our experiments demonstrate that EC-GAN's performance is comparable to the
shared architecture method, far superior to the standard data augmentation and
regularization-based approach, and effective on a small, realistic dataset.
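The abstract describes labeling GAN-generated images so they can supplement the supervised classification loss. Below is a minimal NumPy sketch of one plausible labeling step, assuming a confidence-threshold pseudo-labeling scheme; the function name, threshold, and values are illustrative and not taken from the paper's code:

```python
import numpy as np

def select_pseudo_labels(probs, threshold=0.7):
    """Keep generated samples whose top predicted class probability
    exceeds `threshold`; return their indices and hard labels."""
    confidence = probs.max(axis=1)
    labels = probs.argmax(axis=1)
    keep = confidence > threshold
    return np.where(keep)[0], labels[keep]

# Toy external-classifier outputs for 4 generated samples (rows sum to 1).
probs = np.array([
    [0.9, 0.1],   # confident -> kept, labeled class 0
    [0.6, 0.4],   # below threshold -> discarded
    [0.2, 0.8],   # confident -> kept, labeled class 1
    [0.5, 0.5],   # below threshold -> discarded
])
idx, labels = select_pseudo_labels(probs, threshold=0.7)
print(idx.tolist(), labels.tolist())  # -> [0, 2] [0, 1]
```

The retained (sample, pseudo-label) pairs would then be mixed into the real labeled batch for the classifier's supervised loss.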
Related papers
- Divide and Contrast: Source-free Domain Adaptation via Adaptive
Contrastive Learning [122.62311703151215]
Divide and Contrast (DaC) aims to connect the good ends of both worlds while bypassing their limitations.
DaC divides the target data into source-like and target-specific samples, where either group of samples is treated with tailored goals.
We further align the source-like domain with the target-specific samples using a memory bank-based Maximum Mean Discrepancy (MMD) loss to reduce the distribution mismatch.
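The MMD loss mentioned above measures the discrepancy between two sample distributions. A generic RBF-kernel MMD estimate is sketched below for illustration; DaC's actual loss is memory-bank based and class-wise, which this simplified version omits:

```python
import numpy as np

def rbf_mmd2(x, y, gamma=1.0):
    """Biased squared-MMD estimate with an RBF kernel
    k(a, b) = exp(-gamma * ||a - b||^2)."""
    def k(a, b):
        d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2)
    return k(x, x).mean() + k(y, y).mean() - 2.0 * k(x, y).mean()

rng = np.random.default_rng(0)
# Matched distributions should score lower than mismatched ones.
matched = rbf_mmd2(rng.normal(size=(50, 2)), rng.normal(size=(50, 2)))
shifted = rbf_mmd2(rng.normal(size=(50, 2)), rng.normal(loc=3.0, size=(50, 2)))
print(matched < shifted)
```

Minimizing such a term pulls the two feature distributions together, which is the "reduce the distribution mismatch" role it plays in DaC.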
arXiv Detail & Related papers (2022-11-12T09:21:49Z)
- Self-Ensembling GAN for Cross-Domain Semantic Segmentation [107.27377745720243]
This paper proposes a self-ensembling generative adversarial network (SE-GAN) exploiting cross-domain data for semantic segmentation.
In SE-GAN, a teacher network and a student network constitute a self-ensembling model for generating semantic segmentation maps, which together with a discriminator, forms a GAN.
Despite its simplicity, we find SE-GAN can significantly boost the performance of adversarial training and enhance the stability of the model.
arXiv Detail & Related papers (2021-12-15T09:50:25Z)
- Top-Down Deep Clustering with Multi-generator GANs [0.0]
Deep clustering (DC) learns embedding spaces that are optimal for cluster analysis.
We propose HC-MGAN, a new technique based on GANs with multiple generators (MGANs).
Our method is inspired by the observation that each generator of an MGAN tends to generate data that correlates with a sub-region of the real data distribution.
arXiv Detail & Related papers (2021-12-06T22:53:12Z)
- Semi-Supervised Semantic Segmentation of Vessel Images using Leaking Perturbations [1.5791732557395552]
Leaking GAN is a GAN-based semi-supervised architecture for retinal vessel semantic segmentation.
Our key idea is to pollute the discriminator by leaking information from the generator.
This leads to more moderate generations that benefit the training of the GAN.
arXiv Detail & Related papers (2021-10-22T18:25:08Z)
- MineGAN++: Mining Generative Models for Efficient Knowledge Transfer to Limited Data Domains [77.46963293257912]
We propose a novel knowledge transfer method for generative models based on mining the knowledge that is most beneficial to a specific target domain.
This is done using a miner network that identifies which part of the generative distribution of each pretrained GAN outputs samples closest to the target domain.
We show that the proposed method, called MineGAN, effectively transfers knowledge to domains with few target images, outperforming existing methods.
arXiv Detail & Related papers (2021-04-28T13:10:56Z)
- Towards Uncovering the Intrinsic Data Structures for Unsupervised Domain Adaptation using Structurally Regularized Deep Clustering [119.88565565454378]
Unsupervised domain adaptation (UDA) aims to learn classification models that make predictions for unlabeled data on a target domain.
We propose a hybrid model of Structurally Regularized Deep Clustering, which integrates the regularized discriminative clustering of target data with a generative one.
Our proposed H-SRDC outperforms all the existing methods under both the inductive and transductive settings.
arXiv Detail & Related papers (2020-12-08T08:52:00Z)
- Teaching a GAN What Not to Learn [20.03447539784024]
Generative adversarial networks (GANs) were originally envisioned as unsupervised generative models that learn to follow a target distribution.
In this paper, we approach the supervised GAN problem from a different perspective, one motivated by the philosophy of the famous Persian poet Rumi.
In the GAN framework, we not only provide the GAN positive data that it must learn to model, but also present it with so-called negative samples that it must learn to avoid.
This formulation allows the discriminator to represent the underlying target distribution better by learning to penalize generated samples that are undesirable.
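The negative-sample idea above can be illustrated with a toy discriminator loss: real positives are pushed toward 1, while both generated samples and real negative samples are pushed toward 0. The scores and loss form below are a simplified illustration, not the paper's exact objective:

```python
import numpy as np

def bce(p, y):
    """Binary cross-entropy between scores p in (0, 1) and targets y."""
    return float(-(y * np.log(p) + (1 - y) * np.log(1 - p)).mean())

# Toy discriminator scores (probability of "real and desirable").
d_real_pos = np.array([0.9, 0.8])   # positives the GAN should model
d_fake     = np.array([0.3, 0.2])   # generated samples
d_real_neg = np.array([0.25, 0.1])  # real samples the GAN must avoid

# Positives target 1; fakes AND negatives target 0, so the discriminator
# learns to penalize generations that drift toward the negative region.
d_loss = (bce(d_real_pos, np.ones(2))
          + bce(d_fake, np.zeros(2))
          + bce(d_real_neg, np.zeros(2)))
print(d_loss > 0.0)
```

In a full training loop the generator would be updated against this discriminator, steering it away from the negative samples.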
arXiv Detail & Related papers (2020-10-29T14:44:24Z)
- Improving Generative Adversarial Networks with Local Coordinate Coding [150.24880482480455]
Generative adversarial networks (GANs) have shown remarkable success in generating realistic data from some predefined prior distribution.
In practice, semantic information might be represented by some latent distribution learned from data.
We propose an LCCGAN model with local coordinate coding (LCC) to improve the performance of generating data.
arXiv Detail & Related papers (2020-07-28T09:17:50Z)
- Generalized Zero-Shot Learning Via Over-Complete Distribution [79.5140590952889]
We propose to generate an Over-Complete Distribution (OCD) using Conditional Variational Autoencoder (CVAE) of both seen and unseen classes.
The effectiveness of the framework is evaluated using both Zero-Shot Learning and Generalized Zero-Shot Learning protocols.
arXiv Detail & Related papers (2020-04-01T19:05:28Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences.