Continuous Conditional Generative Adversarial Networks: Novel Empirical
Losses and Label Input Mechanisms
- URL: http://arxiv.org/abs/2011.07466v9
- Date: Mon, 30 Oct 2023 11:52:16 GMT
- Authors: Xin Ding and Yongwei Wang and Zuheng Xu and William J. Welch and Z.
Jane Wang
- Abstract summary: This work proposes the continuous conditional generative adversarial network (CcGAN).
CcGAN is the first generative model for image generation conditional on continuous, scalar conditions (termed regression labels).
- Score: 26.331792743036804
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: This work proposes the continuous conditional generative adversarial network
(CcGAN), the first generative model for image generation conditional on
continuous, scalar conditions (termed regression labels). Existing conditional
GANs (cGANs) are mainly designed for categorical conditions (e.g., class
labels); conditioning on regression labels is mathematically distinct and
raises two fundamental problems: (P1) since there may be very few (even zero)
real images for some regression labels, minimizing existing empirical versions
of cGAN losses (a.k.a. empirical cGAN losses) often fails in practice; and
(P2) since regression labels are scalar and infinitely many, conventional
label input methods are not applicable. The proposed CcGAN solves the above problems,
respectively, by (S1) reformulating existing empirical cGAN losses to be
appropriate for the continuous scenario; and (S2) proposing a naive label input
(NLI) method and an improved label input (ILI) method to incorporate regression
labels into the generator and the discriminator. The reformulation in (S1)
leads to two novel empirical discriminator losses, termed the hard vicinal
discriminator loss (HVDL) and the soft vicinal discriminator loss (SVDL),
respectively, and a novel empirical generator loss. Error bounds for a
discriminator trained with HVDL and SVDL are derived under mild assumptions in
this work. Two new benchmark datasets (RC-49 and Cell-200) and a novel
evaluation metric (Sliding Fréchet Inception Distance) are also proposed for
this continuous scenario. Our experiments on the Circular 2-D Gaussians, RC-49,
UTKFace, Cell-200, and Steering Angle datasets show that CcGAN is able to
generate diverse, high-quality samples from the image distribution conditional
on a given regression label. Moreover, in these experiments, CcGAN
substantially outperforms cGAN both visually and quantitatively.
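
The vicinal idea behind (S1) is easiest to picture as a re-weighting of real samples by how close their labels sit to the target label: HVDL keeps only samples inside a hard vicinity, while SVDL down-weights samples smoothly with label distance. The PyTorch sketch below illustrates only this weighting idea under a vanilla-GAN log loss; the helper names, the default `kappa`/`nu` values, and the omission of the paper's Gaussian perturbation of target labels are simplifications, not the authors' reference implementation.

```python
# Sketch of the hard/soft vicinal weighting behind HVDL/SVDL.
# Hypothetical helper names; kappa/nu defaults are illustrative only.
import torch

def vicinal_weights(real_labels, target_label, kind="hard", kappa=0.02, nu=1000.0):
    """Weight each real sample by its label's proximity to the target label."""
    diff = real_labels - target_label
    if kind == "hard":
        w = (diff.abs() <= kappa).float()   # HVDL-style: indicator of a hard vicinity
    else:
        w = torch.exp(-nu * diff ** 2)      # SVDL-style: soft Gaussian-shaped weight
    return w / w.sum().clamp(min=1e-12)     # normalize so the weights sum to 1

def weighted_d_loss_real(d_logits_real, real_labels, target_label, **kw):
    """Vicinally weighted (vanilla-GAN-style) discriminator loss on real samples."""
    w = vicinal_weights(real_labels, target_label, **kw)
    return -(w * torch.log(torch.sigmoid(d_logits_real) + 1e-12)).sum()
```

The key effect is that a target label with few or no exactly matching real images still receives a training signal from nearby-labeled images, which is what addresses (P1).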
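For (S2), the difference between the two label input mechanisms can be sketched as follows: NLI projects the scalar label linearly and adds it to a hidden feature map, while ILI first maps the label into a learned high-dimensional embedding before conditioning. This is a minimal sketch assuming a generic convolutional backbone; in the paper, ILI's label embedding is trained separately and enters the networks through mechanisms such as conditional normalization and projection, which the additive conditioning below only approximates.

```python
# Sketch of the NLI vs. ILI label input mechanisms (illustrative module
# names; additive conditioning stands in for the paper's exact injection).
import torch
import torch.nn as nn

class NaiveLabelInput(nn.Module):
    """NLI-style: linearly project the scalar label and add it to a feature map."""
    def __init__(self, num_channels: int):
        super().__init__()
        self.proj = nn.Linear(1, num_channels)

    def forward(self, h: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
        # h: (B, C, H, W) hidden features; y: (B, 1) regression labels
        return h + self.proj(y)[:, :, None, None]

class ImprovedLabelInput(nn.Module):
    """ILI-style: embed the scalar label in a higher-dimensional space first."""
    def __init__(self, num_channels: int, embed_dim: int = 128):
        super().__init__()
        self.embed = nn.Sequential(
            nn.Linear(1, embed_dim), nn.ReLU(),
            nn.Linear(embed_dim, embed_dim), nn.ReLU(),
        )
        self.proj = nn.Linear(embed_dim, num_channels)

    def forward(self, h: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
        return h + self.proj(self.embed(y))[:, :, None, None]
```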
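The proposed Sliding Fréchet Inception Distance evaluates generation quality locally in label space: FID is computed inside a window that slides over the label range and the per-window scores are averaged. A minimal NumPy/SciPy sketch follows; the window centers, radius, and minimum-sample guard are assumptions for illustration rather than the paper's exact protocol.

```python
# Sketch of a sliding-window FID over the label range (window centers and
# radius are illustrative assumptions, not the paper's exact protocol).
import numpy as np
from scipy.linalg import sqrtm

def fid(feat_a: np.ndarray, feat_b: np.ndarray) -> float:
    """Standard FID between two sets of (e.g., Inception) feature rows."""
    mu_a, mu_b = feat_a.mean(axis=0), feat_b.mean(axis=0)
    cov_a = np.cov(feat_a, rowvar=False)
    cov_b = np.cov(feat_b, rowvar=False)
    covmean = sqrtm(cov_a @ cov_b)
    if np.iscomplexobj(covmean):  # discard tiny imaginary parts from sqrtm
        covmean = covmean.real
    return float(((mu_a - mu_b) ** 2).sum() + np.trace(cov_a + cov_b - 2.0 * covmean))

def sliding_fid(real_feat, real_y, fake_feat, fake_y, centers, radius):
    """Average FID over label windows [c - radius, c + radius]."""
    scores = []
    for c in centers:
        r = np.abs(real_y - c) <= radius
        f = np.abs(fake_y - c) <= radius
        if r.sum() > 1 and f.sum() > 1:  # need >1 sample to estimate covariance
            scores.append(fid(real_feat[r], fake_feat[f]))
    return float(np.mean(scores))
```
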
Related papers
- Generating Unbiased Pseudo-labels via a Theoretically Guaranteed
Chebyshev Constraint to Unify Semi-supervised Classification and Regression [57.17120203327993]
The threshold-to-pseudo-label process (T2L) in classification uses confidence to determine the quality of labels.
By its nature, regression likewise requires unbiased methods to generate high-quality labels.
We propose a theoretically guaranteed constraint for generating unbiased labels based on Chebyshev's inequality.
arXiv Detail & Related papers (2023-11-03T08:39:35Z) - Partial Label Supervision for Agnostic Generative Noisy Label Learning [18.29334728940232]
Noisy label learning has been tackled with both discriminative and generative approaches.
We propose a novel framework for generative noisy label learning that addresses these challenges.
arXiv Detail & Related papers (2023-08-02T14:48:25Z) - Label Distributionally Robust Losses for Multi-class Classification:
Consistency, Robustness and Adaptivity [55.29408396918968]
We study a family of loss functions named label-distributionally robust (LDR) losses for multi-class classification.
Our contributions cover both consistency and robustness, establishing the top-$k$ consistency of LDR losses for multi-class classification.
We propose a new adaptive LDR loss that automatically adapts an individualized temperature parameter to the noise degree of each instance's class label.
arXiv Detail & Related papers (2021-12-30T00:27:30Z) - An Empirical Study on GANs with Margin Cosine Loss and Relativistic
Discriminator [4.899818550820575]
We introduce a new loss function, namely the Relativistic Margin Cosine Loss (RMCosGAN).
We compare RMCosGAN's performance with that of existing loss functions using two metrics: Fréchet Inception Distance and Inception Score.
The experimental results show that RMCosGAN outperforms the existing losses and significantly improves the quality of the generated images.
arXiv Detail & Related papers (2021-10-21T17:25:47Z) - cGANs with Auxiliary Discriminative Classifier [43.78253518292111]
Conditional generative models aim to learn the underlying joint distribution of data and labels.
Auxiliary classifier generative adversarial networks (AC-GANs) have been widely used, but suffer from low intra-class diversity in generated samples.
We propose a novel cGAN with an auxiliary discriminative classifier (ADC-GAN) to address this issue of AC-GAN.
arXiv Detail & Related papers (2021-07-21T13:06:32Z) - Are conditional GANs explicitly conditional? [0.0]
This paper proposes two contributions for conditional Generative Adversarial Networks (cGANs).
The first main contribution is an analysis of cGANs showing that they are not explicitly conditional.
The second contribution is a new method, called a contrario cGAN, that explicitly models conditionality in both parts of the adversarial architecture.
arXiv Detail & Related papers (2021-06-28T22:49:27Z) - Distribution-Balanced Loss for Multi-Label Classification in Long-Tailed
Datasets [98.74153364118898]
We present a new loss function called Distribution-Balanced Loss for the multi-label recognition problems that exhibit long-tailed class distributions.
The Distribution-Balanced Loss tackles these issues through two key modifications to the standard binary cross-entropy loss.
Experiments on both Pascal VOC and COCO show that the models trained with this new loss function achieve significant performance gains.
arXiv Detail & Related papers (2020-07-19T11:50:10Z) - Classify and Generate Reciprocally: Simultaneous Positive-Unlabelled
Learning and Conditional Generation with Extra Data [77.31213472792088]
The scarcity of class-labeled data is a ubiquitous bottleneck in many machine learning problems.
We address this problem by leveraging Positive-Unlabeled (PU) classification and conditional generation with extra unlabeled data.
We present a novel training framework to jointly target both PU classification and conditional generation when exposed to extra data.
arXiv Detail & Related papers (2020-06-14T08:27:40Z) - MatchGAN: A Self-Supervised Semi-Supervised Conditional Generative
Adversarial Network [51.84251358009803]
We present a novel self-supervised learning approach for conditional generative adversarial networks (GANs) under a semi-supervised setting.
We perform augmentation by randomly sampling sensible labels from the label space of the few labelled examples available.
Our method surpasses the baseline with only 20% of the labelled examples used to train the baseline.
arXiv Detail & Related papers (2020-06-11T17:14:55Z) - When Relation Networks meet GANs: Relation GANs with Triplet Loss [110.7572918636599]
Training stability is still a lingering concern for generative adversarial networks (GANs).
In this paper, we explore a relation network architecture for the discriminator and design a triplet loss that yields better generalization and stability.
Experiments on benchmark datasets show that the proposed relation discriminator and new loss provide significant improvements on various vision tasks.
arXiv Detail & Related papers (2020-02-24T11:35:28Z)