Boost-and-Skip: A Simple Guidance-Free Diffusion for Minority Generation
- URL: http://arxiv.org/abs/2502.06516v1
- Date: Mon, 10 Feb 2025 14:37:26 GMT
- Title: Boost-and-Skip: A Simple Guidance-Free Diffusion for Minority Generation
- Authors: Soobin Um, Beomsu Kim, Jong Chul Ye
- Abstract summary: We present a simple yet powerful guidance-free approach called Boost-and-Skip for generating minority samples using diffusion models.
We highlight that these seemingly-trivial modifications are supported by solid theoretical and empirical evidence.
Our experiments demonstrate that Boost-and-Skip greatly enhances the capability of generating minority samples, even rivaling guidance-based state-of-the-art approaches.
- Score: 57.19995625893062
- License:
- Abstract: Minority samples are underrepresented instances located in low-density regions of a data manifold, and are valuable in many generative AI applications, such as data augmentation and creative content generation. Unfortunately, existing diffusion-based minority generators often rely on computationally expensive guidance dedicated to minority generation. To address this, here we present a simple yet powerful guidance-free approach called Boost-and-Skip for generating minority samples using diffusion models. The key advantage of our framework is that it requires only two minimal changes to the standard generative process: (i) variance-boosted initialization and (ii) timestep skipping. We highlight that these seemingly trivial modifications are supported by solid theoretical and empirical evidence, thereby effectively promoting the emergence of underrepresented minority features. Our comprehensive experiments demonstrate that Boost-and-Skip greatly enhances the capability of generating minority samples, even rivaling guidance-based state-of-the-art approaches while requiring significantly fewer computations.
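The two modifications named in the abstract are concrete enough to sketch on top of vanilla DDPM ancestral sampling. The sketch below is a minimal illustration under assumptions, not the authors' implementation: the noise predictor eps_model is a placeholder, the linear beta schedule is the common default, and the boost factor gamma and the number of skipped initial reverse steps skip are illustrative values rather than the paper's settings.

```python
# Minimal sketch of Boost-and-Skip on top of standard DDPM ancestral sampling.
# Assumptions: placeholder noise predictor, common linear beta schedule, and
# illustrative values for the boost factor and the number of skipped steps.
import torch

T = 1000
betas = torch.linspace(1e-4, 0.02, T)          # standard linear schedule (assumed)
alphas = 1.0 - betas
alpha_bars = torch.cumprod(alphas, dim=0)

def eps_model(x, t):
    # Placeholder for a trained noise-prediction network eps_theta(x_t, t).
    return torch.zeros_like(x)

@torch.no_grad()
def boost_and_skip_sample(shape, gamma=1.25, skip=100):
    # (i) Variance-boosted initialization: draw x_T from N(0, gamma^2 I), gamma > 1.
    x = gamma * torch.randn(shape)
    # (ii) Timestep skipping: omit the first `skip` (highest-noise) reverse steps
    # and run the remaining DDPM updates as usual.
    for t in reversed(range(T - skip)):
        eps = eps_model(x, t)
        # Standard DDPM posterior mean for x_{t-1} given x_t and predicted noise.
        mean = (x - betas[t] / torch.sqrt(1.0 - alpha_bars[t]) * eps) / torch.sqrt(alphas[t])
        x = mean + torch.sqrt(betas[t]) * torch.randn_like(x) if t > 0 else mean
    return x

sample = boost_and_skip_sample((1, 3, 32, 32))  # e.g., one 3x32x32 image
```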
Related papers
- Minority-Focused Text-to-Image Generation via Prompt Optimization [57.319845580050924]
We investigate the generation of minority samples using pretrained text-to-image (T2I) latent diffusion models.
We develop an online prompt optimization framework that can encourage the emergence of desired properties.
We then tailor this generic prompt optimizer into a specialized solver that promotes the generation of minority features.
arXiv Detail & Related papers (2024-10-10T11:56:09Z) - Self-Guided Generation of Minority Samples Using Diffusion Models [57.319845580050924]
We present a novel approach for generating minority samples that lie in low-density regions of a data manifold.
Our framework is built upon diffusion models, leveraging the principle of guided sampling.
Experiments on benchmark real datasets demonstrate that our approach can greatly improve the capability of creating realistic low-likelihood minority instances.
arXiv Detail & Related papers (2024-07-16T10:03:29Z) - Chameleon: Foundation Models for Fairness-aware Multi-modal Data Augmentation to Enhance Coverage of Minorities [25.215178019059874]
Underrepresentation of minorities in training data is a well-recognized concern.
We propose Chameleon, a system that augments a data set with a minimal addition of synthetically generated data to enhance the coverage of under-represented groups.
Our experiments, in addition to confirming the efficiency of the proposed algorithms, illustrate the effectiveness of our approach.
arXiv Detail & Related papers (2024-02-02T00:16:45Z) - Generative Oversampling for Imbalanced Data via Majority-Guided VAE [15.93867386081279]
We propose a novel over-sampling model, called Majority-Guided VAE (MGVAE), which generates new minority samples under the guidance of a majority-based prior.
In this way, the newly generated minority samples can inherit the diversity and richness of the majority ones, thus mitigating overfitting in downstream tasks.
arXiv Detail & Related papers (2023-02-14T06:35:23Z) - Don't Play Favorites: Minority Guidance for Diffusion Models [59.75996752040651]
We present a novel framework that makes the generation process of diffusion models focus on minority samples.
We develop minority guidance, a sampling technique that can guide the generation process toward regions with desired likelihood levels.
arXiv Detail & Related papers (2023-01-29T03:08:47Z) - Few-shot Forgery Detection via Guided Adversarial Interpolation [56.59499187594308]
Existing forgery detection methods suffer from significant performance drops when applied to unseen novel forgery approaches.
We propose Guided Adversarial Interpolation (GAI) to overcome the few-shot forgery detection problem.
Our method is validated to be robust to choices of majority and minority forgery approaches.
arXiv Detail & Related papers (2022-04-12T16:05:10Z) - Counterfactual-based minority oversampling for imbalanced classification [11.140929092818235]
A key challenge of oversampling in imbalanced classification is that the generation of new minority samples often neglects the usage of majority classes.
We present a new oversampling framework based on the counterfactual theory.
arXiv Detail & Related papers (2020-08-21T14:13:15Z) - Inclusive GAN: Improving Data and Minority Coverage in Generative Models [101.67587566218928]
We formalize the problem of minority inclusion as one of data coverage.
We then propose to improve data coverage by harmonizing adversarial training with reconstructive generation.
We develop an extension that allows explicit control over the minority subgroups that the model should include.
arXiv Detail & Related papers (2020-04-07T13:31:33Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.