Selectively increasing the diversity of GAN-generated samples
- URL: http://arxiv.org/abs/2207.01561v3
- Date: Thu, 22 Jun 2023 19:00:32 GMT
- Title: Selectively increasing the diversity of GAN-generated samples
- Authors: Jan Dubiński, Kamil Deja, Sandro Wenzel, Przemysław Rokita, Tomasz Trzciński
- Abstract summary: We propose a novel method to selectively increase the diversity of GAN-generated samples.
We show the superiority of our method in a synthetic benchmark as well as a real-life scenario simulating data from the Zero Degree Calorimeter of the ALICE experiment at CERN.
- Score: 8.980453507536017
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Generative Adversarial Networks (GANs) are powerful models able to synthesize
data samples closely resembling the distribution of real data, yet the
diversity of those generated samples is limited due to the so-called mode
collapse phenomenon observed in GANs. Especially prone to mode collapse are
conditional GANs, which tend to ignore the input noise vector and focus on the
conditional information. Recent methods proposed to mitigate this limitation
increase the diversity of generated samples, yet they reduce the performance of
the models when similarity of samples is required. To address this shortcoming,
we propose a novel method to selectively increase the diversity of
GAN-generated samples. By adding a simple, yet effective regularization to the
training loss function we encourage the generator to discover new data modes
for inputs related to diverse outputs while generating consistent samples for
the remaining ones. More precisely, we maximise the ratio of distances between
generated images and input latent vectors, scaling the effect according to the
diversity of samples for a given conditional input. We show the superiority of
our method in a synthetic benchmark as well as a real-life scenario of
simulating data from the Zero Degree Calorimeter of the ALICE experiment at
the LHC, CERN.
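A minimal PyTorch sketch of such a regularizer is given below; the generator interface, the precomputed per-condition `diversity_weight`, and the distance choices are illustrative assumptions, not the paper's exact formulation.

```python
import torch

def selective_diversity_loss(G, cond, diversity_weight, z_dim=64, eps=1e-8):
    """Encourage G to map distant latents to distant outputs, but only for
    conditional inputs whose real samples are themselves diverse.

    G                -- conditional generator, G(z, cond) -> samples
    cond             -- batch of conditional inputs, shape (B, ...)
    diversity_weight -- per-condition scale, e.g. a precomputed measure of
                        the variance of real samples for each condition
    """
    B = cond.shape[0]
    z1 = torch.randn(B, z_dim, device=cond.device)
    z2 = torch.randn(B, z_dim, device=cond.device)
    x1, x2 = G(z1, cond), G(z2, cond)

    # Ratio of output distance to latent distance for each pair.
    d_x = (x1 - x2).flatten(1).norm(dim=1)
    d_z = (z1 - z2).norm(dim=1)
    ratio = d_x / (d_z + eps)

    # Maximise the ratio (i.e. minimise its negative), scaled per condition:
    # diverse conditions get pushed apart, consistent ones are left alone.
    return -(diversity_weight * ratio).mean()
```

In training, this term would be added to the standard adversarial generator loss with a small weight.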
Related papers
- CCS: Controllable and Constrained Sampling with Diffusion Models via Initial Noise Perturbation [9.12693573953231]
We first observe an interesting phenomenon: the relationship between the change in generation outputs and the scale of the initial noise perturbation is highly linear throughout diffusion ODE sampling.
We propose a novel Controllable and Constrained Sampling method (CCS) together with a new controller algorithm for diffusion models to sample with desired statistical properties.
Results show that our CCS method achieves more precisely controlled sampling while maintaining superior sample quality and diversity.
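As a toy illustration of the linearity observation (not the paper's experiment), the sketch below perturbs the initial noise of a deterministic ODE sampler and measures the output change; the linear drift used here is a stand-in for a trained diffusion model, so the relationship is linear by construction, whereas CCS observes that it remains approximately linear for real diffusion ODEs.

```python
import torch

def ode_sample(x, steps=50):
    # Euler integration of a toy probability-flow-style ODE; the linear
    # drift stands in for a trained diffusion model's learned drift.
    for _ in range(steps):
        x = x + (1.0 / steps) * (-0.5 * x)
    return x

x_T = torch.randn(1000, 2)          # initial noise
direction = torch.randn_like(x_T)   # fixed perturbation direction
base = ode_sample(x_T)
for scale in (0.05, 0.1, 0.2, 0.4):
    moved = ode_sample(x_T + scale * direction)
    delta = (moved - base).norm(dim=1).mean().item()
    print(f"perturbation scale {scale:.2f} -> mean output change {delta:.4f}")
```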
arXiv Detail & Related papers (2025-02-07T05:30:48Z)
- A New Formulation of Lipschitz Constrained With Functional Gradient Learning for GANs [52.55025869932486]
This paper introduces a promising alternative method for training Generative Adversarial Networks (GANs) on large-scale datasets with clear theoretical guarantees.
We propose a novel Lipschitz-constrained Functional Gradient GANs learning (Li-CFG) method to stabilize the training of GANs.
We demonstrate that the neighborhood size of the latent vector can be reduced by increasing the norm of the discriminator gradient.
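The summary ties discriminator gradient norms to latent neighborhood size. For context, the sketch below shows the standard gradient-penalty way of constraining a discriminator's Lipschitz constant (WGAN-GP style); Li-CFG's own formulation is not reproduced here.

```python
import torch

def gradient_penalty(D, real, fake, target_norm=1.0):
    """Penalize deviation of ||grad D(x)|| from target_norm on random
    interpolates between real and fake batches."""
    alpha = torch.rand(real.size(0), *([1] * (real.dim() - 1)),
                       device=real.device)
    x = (alpha * real + (1 - alpha) * fake).requires_grad_(True)
    grads = torch.autograd.grad(D(x).sum(), x, create_graph=True)[0]
    grad_norm = grads.flatten(1).norm(dim=1)
    return ((grad_norm - target_norm) ** 2).mean()
```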
arXiv Detail & Related papers (2025-01-20T02:48:07Z)
- Diverse Rare Sample Generation with Pretrained GANs [24.227852798611025]
This study proposes a novel approach for generating diverse rare samples from high-resolution image datasets with pretrained GANs.
Our method employs gradient-based optimization of latent vectors within a multi-objective framework and utilizes normalizing flows for density estimation on the feature space.
This enables the generation of diverse rare images, with controllable parameters for rarity, diversity, and similarity to a reference image.
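A hedged sketch of the described pipeline follows; `G`, `feature_extractor`, `flow` (assumed to expose a differentiable `log_prob`, as common normalizing-flow libraries do), and the loss weights are illustrative assumptions.

```python
import torch

def optimize_rare_latents(G, feature_extractor, flow, z, steps=200,
                          lr=0.05, w_rare=1.0, w_div=0.1):
    """Gradient-based search for rare-but-diverse latents of a frozen GAN."""
    z = z.clone().requires_grad_(True)
    opt = torch.optim.Adam([z], lr=lr)
    for _ in range(steps):
        feats = feature_extractor(G(z))
        log_density = flow.log_prob(feats).mean()   # low density = rare
        pdist = torch.cdist(feats, feats)
        # Mean off-diagonal pairwise distance rewards batch diversity.
        diversity = pdist.sum() / (pdist.numel() - pdist.size(0))
        loss = w_rare * log_density - w_div * diversity
        opt.zero_grad(); loss.backward(); opt.step()
    return z.detach()
```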
arXiv Detail & Related papers (2024-12-27T09:10:30Z)
- Score-based Generative Models with Adaptive Momentum [40.84399531998246]
We propose an adaptive momentum sampling method to accelerate the transformation process.
We show that our method can produce more faithful images/graphs with a small number of sampling steps and a 2 to 5 times speedup.
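A minimal sketch of momentum-accelerated score-based sampling is given below; the paper's adaptive rule for the momentum coefficient is not reproduced, and `score_fn`, the step size, and `beta` are illustrative assumptions.

```python
import torch

def momentum_langevin(score_fn, x, steps=100, step_size=1e-3, beta=0.9):
    """Langevin-style sampling with heavy-ball momentum on the score."""
    v = torch.zeros_like(x)
    for _ in range(steps):
        g = score_fn(x)        # estimate of grad log p(x)
        v = beta * v + g       # momentum accumulation over score estimates
        x = x + step_size * v + (2 * step_size) ** 0.5 * torch.randn_like(x)
    return x
```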
arXiv Detail & Related papers (2024-05-22T15:20:27Z)
- Iterated Denoising Energy Matching for Sampling from Boltzmann Densities [109.23137009609519]
We propose Iterated Denoising Energy Matching (iDEM), an iterative method for sampling from Boltzmann densities.
iDEM alternates between (I) sampling regions of high model density from a diffusion-based sampler and (II) using these samples in our matching objective.
We show that the proposed approach achieves state-of-the-art performance on all metrics and trains $2$-$5\times$ faster.
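A skeleton of that alternation might look as follows; `sampler`, `matching_loss`, the replay buffer, and the loop sizes are assumptions about the interface, not iDEM's actual implementation.

```python
import torch

def idem_outer_loop(model, sampler, matching_loss, buffer, optimizer,
                    outer_steps=100, inner_steps=50, batch_size=256):
    """Skeleton of the described bi-level alternation."""
    for _ in range(outer_steps):
        # (I) sampling phase: draw from the current diffusion-based sampler,
        # which proposes points in regions of high model density.
        x = sampler(model, batch_size)
        buffer.extend(x.unbind(0))
        # (II) matching phase: fit the model on buffered samples.
        for _ in range(inner_steps):
            idx = torch.randint(len(buffer), (batch_size,))
            batch = torch.stack([buffer[i] for i in idx.tolist()])
            loss = matching_loss(model, batch)
            optimizer.zero_grad(); loss.backward(); optimizer.step()
```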
arXiv Detail & Related papers (2024-02-09T01:11:23Z)
- Semi-Implicit Denoising Diffusion Models (SIDDMs) [50.30163684539586]
Existing models such as Denoising Diffusion Probabilistic Models (DDPM) deliver high-quality, diverse samples but are slowed by an inherently high number of iterative steps.
We introduce a novel approach that tackles the problem by matching implicit and explicit factors.
We demonstrate that our proposed method obtains comparable generative performance to diffusion-based models and vastly superior results to models with a small number of sampling steps.
arXiv Detail & Related papers (2023-06-21T18:49:22Z)
- Breaking the Spurious Causality of Conditional Generation via Fairness Intervention with Corrective Sampling [77.15766509677348]
Conditional generative models often inherit spurious correlations from the training dataset.
This can result in label-conditional distributions that are imbalanced with respect to another latent attribute.
We propose a general two-step strategy to mitigate this issue.
arXiv Detail & Related papers (2022-12-05T08:09:33Z)
- Saliency Grafting: Innocuous Attribution-Guided Mixup with Calibrated Label Mixing [104.630875328668]
The Mixup scheme suggests mixing a pair of samples to create an augmented training sample.
We present a novel, yet simple Mixup-variant that captures the best of both worlds.
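For reference, a minimal version of the basic Mixup operation is sketched below; Saliency Grafting's saliency-guided mixing and calibrated label mixing are not reproduced here.

```python
import torch

def mixup(x, y, alpha=0.2):
    """Mix a batch with a shuffled copy of itself; labels (assumed one-hot
    or soft) are mixed with the same coefficient."""
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    perm = torch.randperm(x.size(0))
    return lam * x + (1 - lam) * x[perm], lam * y + (1 - lam) * y[perm]
```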
arXiv Detail & Related papers (2021-12-16T11:27:48Z)
- Refining Deep Generative Models via Discriminator Gradient Flow [18.406499703293566]
Discriminator Gradient Flow (DGflow) is a new technique that improves generated samples via the gradient flow of entropy-regularized f-divergences.
We show that DGflow leads to significant improvement in the quality of generated samples for a variety of generative models.
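A sketch in the spirit of DGflow is given below, assuming the KL f-divergence, for which the update direction reduces to the gradient of the discriminator's logit (a density-ratio estimate); the step sizes and noise scale are illustrative.

```python
import torch

def dgflow_refine(D, x, steps=25, eta=0.01, gamma=0.1):
    """Refine samples x by ascending the discriminator's logit, with noise
    injected by the entropy regularization term."""
    x = x.detach().clone()
    for _ in range(steps):
        x.requires_grad_(True)
        grad = torch.autograd.grad(D(x).sum(), x)[0]
        with torch.no_grad():
            x = x + eta * grad + (2 * eta * gamma) ** 0.5 * torch.randn_like(x)
    return x
```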
arXiv Detail & Related papers (2020-12-01T19:10:15Z)
- GANs with Variational Entropy Regularizers: Applications in Mitigating the Mode-Collapse Issue [95.23775347605923]
Building on the success of deep learning, Generative Adversarial Networks (GANs) provide a modern approach to learn a probability distribution from observed samples.
GANs often suffer from the mode collapse issue where the generator fails to capture all existing modes of the input distribution.
We take an information-theoretic approach and maximize a variational lower bound on the entropy of the generated samples to increase their diversity.
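The paper's variational lower bound is not reproduced here; the sketch below uses a simpler nearest-neighbour entropy proxy with the same intent of rewarding batch-level spread in the generator's outputs.

```python
import torch

def batch_entropy_proxy(samples, eps=1e-8):
    """Entropy proxy: average log distance to the nearest neighbour in the
    batch (Kozachenko-Leonenko style, up to constants)."""
    flat = samples.flatten(1)
    dist = torch.cdist(flat, flat)
    mask = torch.eye(flat.size(0), dtype=torch.bool, device=flat.device)
    dist = dist.masked_fill(mask, float("inf"))   # ignore self-distances
    return torch.log(dist.min(dim=1).values + eps).mean()

# Usage sketch: generator_loss = adversarial_loss - lam * batch_entropy_proxy(fake)
```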
arXiv Detail & Related papers (2020-09-24T19:34:37Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.