GradCheck: Analyzing classifier guidance gradients for conditional diffusion sampling
- URL: http://arxiv.org/abs/2406.17399v1
- Date: Tue, 25 Jun 2024 09:23:25 GMT
- Title: GradCheck: Analyzing classifier guidance gradients for conditional diffusion sampling
- Authors: Philipp Vaeth, Alexander M. Fruehwald, Benjamin Paassen, Magda Gregorova
- Abstract summary: gradients from classifiers, especially those not trained on noisy images, are often unstable.
This study conducts a gradient analysis comparing robust and non-robust classifiers, as well as multiple gradient stabilization techniques.
- Score: 42.50219822975012
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: To sample from an unconditionally trained Denoising Diffusion Probabilistic Model (DDPM), classifier guidance adds conditional information during sampling, but the gradients from classifiers, especially those not trained on noisy images, are often unstable. This study conducts a gradient analysis comparing robust and non-robust classifiers, as well as multiple gradient stabilization techniques. Experimental results demonstrate that these techniques significantly improve the quality of class-conditional samples for non-robust classifiers by providing more stable and informative classifier guidance gradients. The findings highlight the importance of gradient stability in enhancing the performance of classifier guidance, especially on non-robust classifiers.
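The sampling modification the abstract describes can be sketched in a few lines. This is a minimal NumPy illustration of the standard classifier-guidance noise shift, not the authors' implementation; the guidance scale `s` and the toy gradient are placeholders for demonstration.

```python
import numpy as np

def guided_epsilon(eps, classifier_grad, alpha_bar_t, s=1.0):
    """Shift the DDPM's predicted noise by the scaled classifier gradient.

    eps             -- noise predicted by the unconditional DDPM
    classifier_grad -- gradient of log p(y | x_t) with respect to x_t
    alpha_bar_t     -- cumulative noise-schedule coefficient at step t
    s               -- guidance scale; larger values condition more strongly
    """
    return eps - s * np.sqrt(1.0 - alpha_bar_t) * classifier_grad

eps = np.zeros(4)
grad = np.ones(4)  # toy stand-in for a real classifier gradient
shifted = guided_epsilon(eps, grad, alpha_bar_t=0.75, s=2.0)
```

If the classifier gradient is unstable, as the paper reports for classifiers not trained on noisy images, this shift injects noise into every sampling step, which is why stabilizing it matters.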
Related papers
- Diffusion Classifier Guidance for Non-robust Classifiers [0.5999777817331317]
We study the sensitivity of general, non-robust, and robust classifiers to noise of the diffusion process.
Non-robust classifiers exhibit significant accuracy degradation under noisy conditions, leading to unstable guidance gradients.
We propose a method that utilizes one-step denoised image predictions and implements techniques inspired by optimization methods.
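The "one-step denoised image prediction" mentioned above is the standard DDPM estimate of the clean image obtained by inverting the forward noising equation; a minimal NumPy sketch (variable names are illustrative, not from the paper):

```python
import numpy as np

def predict_x0(x_t, eps_hat, alpha_bar_t):
    """One-step estimate of the clean image x0 from a noisy sample x_t.

    Inverts the DDPM forward process x_t = sqrt(a)*x0 + sqrt(1-a)*eps,
    so a classifier can be evaluated on x0_hat instead of the noisy x_t.
    """
    return (x_t - np.sqrt(1.0 - alpha_bar_t) * eps_hat) / np.sqrt(alpha_bar_t)

# Round trip: noise a known x0, then recover it using the true eps.
rng = np.random.default_rng(0)
x0 = rng.normal(size=8)
eps = rng.normal(size=8)
a = 0.6
x_t = np.sqrt(a) * x0 + np.sqrt(1 - a) * eps
x0_hat = predict_x0(x_t, eps, a)
```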
arXiv Detail & Related papers (2025-07-01T11:39:41Z)
- Noise Conditional Variational Score Distillation [60.38982038894823]
Noise Conditional Variational Score Distillation (NCVSD) is a novel method for distilling pretrained diffusion models into generative denoisers.
By integrating this insight into the Variational Score Distillation framework, we enable scalable learning of generative denoisers.
arXiv Detail & Related papers (2025-06-11T06:01:39Z)
- Studying Classifier(-Free) Guidance From a Classifier-Centric Perspective [100.54185280153753]
We find that both classifier guidance and classifier-free guidance achieve conditional generation by pushing the denoising diffusion trajectories away from decision boundaries.
We propose a generic postprocessing step built upon flow-matching to shrink the gap between the learned distribution for a pretrained denoising diffusion model and the real data distribution.
arXiv Detail & Related papers (2025-03-13T17:59:59Z)
- Optimized Gradient Clipping for Noisy Label Learning [26.463965846251938]
We propose a simple yet effective approach called Optimized Gradient Clipping (OGC)
OGC dynamically adjusts the clipping threshold based on the ratio of noise gradients to clean gradients after clipping.
Our experiments across various types of label noise, including symmetric, asymmetric, instance-dependent, and real-world noise, demonstrate the effectiveness of OGC.
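A hedged sketch of the clipping idea: the exact OGC update rule is not given in this summary, so the threshold adaptation below (shrinking or growing the threshold based on an estimated noise-to-clean gradient-norm ratio) is an illustrative assumption, not the paper's formula.

```python
import numpy as np

def clip_gradient(grad, threshold):
    """Standard norm-based gradient clipping."""
    norm = np.linalg.norm(grad)
    return grad if norm <= threshold else grad * (threshold / norm)

def adjust_threshold(threshold, noise_norm, clean_norm, target_ratio=0.1):
    """Illustrative dynamic adjustment (assumed, not OGC's actual rule):
    tighten the threshold when the estimated noise/clean gradient-norm
    ratio exceeds a target, relax it otherwise."""
    ratio = noise_norm / max(clean_norm, 1e-12)
    return threshold * 0.9 if ratio > target_ratio else threshold * 1.1

g = np.array([3.0, 4.0])  # gradient with norm 5
clipped = clip_gradient(g, threshold=1.0)
```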
arXiv Detail & Related papers (2024-12-12T05:08:05Z)
- Stable Neighbor Denoising for Source-free Domain Adaptive Segmentation [91.83820250747935]
Pseudo-label noise is mainly contained in unstable samples in which predictions of most pixels undergo significant variations during self-training.
We introduce the Stable Neighbor Denoising (SND) approach, which effectively discovers highly correlated stable and unstable samples.
SND consistently outperforms state-of-the-art methods in various SFUDA semantic segmentation settings.
arXiv Detail & Related papers (2024-06-10T21:44:52Z)
- Self-Rectifying Diffusion Sampling with Perturbed-Attention Guidance [28.354284737867136]
Perturbed-Attention Guidance (PAG) improves diffusion sample quality across both unconditional and conditional settings.
In both ADM and Stable Diffusion, PAG surprisingly improves sample quality in conditional and even unconditional scenarios.
arXiv Detail & Related papers (2024-03-26T04:49:11Z)
- Label-Noise Robust Diffusion Models [18.82847557713331]
Conditional diffusion models have shown remarkable performance in various generative tasks.
Training them requires large-scale datasets that often contain noise in conditional inputs, a.k.a. noisy labels.
This paper proposes Transition-aware weighted Denoising Score Matching for training conditional diffusion models with noisy labels.
arXiv Detail & Related papers (2024-02-27T14:00:34Z)
- Bridging the Gap: Addressing Discrepancies in Diffusion Model Training for Classifier-Free Guidance [1.6804613362826175]
Diffusion models have emerged as a pivotal advancement in generative models.
In this paper we aim to underscore a discrepancy between conventional training methods and the desired conditional sampling behavior.
We introduce an updated loss function that better aligns training objectives with sampling behaviors.
arXiv Detail & Related papers (2023-11-02T02:03:12Z)
- Benchmarking common uncertainty estimation methods with histopathological images under domain shift and label noise [62.997667081978825]
In high-risk environments, deep learning models need to be able to judge their uncertainty and reject inputs when there is a significant chance of misclassification.
We conduct a rigorous evaluation of the most commonly used uncertainty and robustness methods for the classification of Whole Slide Images.
We observe that ensembles of methods generally lead to better uncertainty estimates as well as an increased robustness towards domain shifts and label noise.
arXiv Detail & Related papers (2023-01-03T11:34:36Z)
- Confidence-aware Training of Smoothed Classifiers for Certified Robustness [75.95332266383417]
We use "accuracy under Gaussian noise" as an easy-to-compute proxy of adversarial robustness for an input.
Our experiments show that the proposed method consistently exhibits improved certified robustness upon state-of-the-art training methods.
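"Accuracy under Gaussian noise" can be estimated by Monte Carlo: perturb an input with Gaussian noise many times and count how often the classifier's prediction stays correct. A toy NumPy sketch; the linear classifier and noise level `sigma` below are assumptions for illustration, not the paper's setup.

```python
import numpy as np

def accuracy_under_noise(predict, x, label, sigma, n_samples=100, seed=0):
    """Fraction of Gaussian perturbations of x that predict() labels correctly."""
    rng = np.random.default_rng(seed)
    hits = sum(predict(x + sigma * rng.normal(size=x.shape)) == label
               for _ in range(n_samples))
    return hits / n_samples

# Toy classifier: class 1 iff the coordinate sum is positive.
predict = lambda x: int(x.sum() > 0)
acc = accuracy_under_noise(predict, np.full(4, 5.0), label=1, sigma=0.1)
```

An input far from the decision boundary (as above) keeps its label under small perturbations, which is exactly what makes this quantity a cheap proxy for robustness.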
arXiv Detail & Related papers (2022-12-18T03:57:12Z)
- Enhancing Diffusion-Based Image Synthesis with Robust Classifier Guidance [17.929524924008962]
In order to obtain class-conditional generation, it was suggested to guide the diffusion process by gradients from a time-dependent classifier.
While the idea is theoretically sound, deep learning-based classifiers are infamously susceptible to gradient-based adversarial attacks.
We utilize this observation by defining and training a time-dependent adversarially robust classifier and use it as guidance for a generative diffusion model.
arXiv Detail & Related papers (2022-08-18T06:51:23Z)
- Centrality and Consistency: Two-Stage Clean Samples Identification for Learning with Instance-Dependent Noisy Labels [87.48541631675889]
We propose a two-stage clean samples identification method.
First, we employ a class-level feature clustering procedure for the early identification of clean samples.
Second, for the remaining clean samples that are close to the ground truth class boundary, we propose a novel consistency-based classification method.
arXiv Detail & Related papers (2022-07-29T04:54:57Z)
- Classifier-Free Diffusion Guidance [17.355749359987648]
Classifier-free guidance is a recently introduced method for trading off mode coverage and sample fidelity in conditional diffusion models.
We show that guidance can be indeed performed by a pure generative model without such a classifier.
We combine the resulting conditional and unconditional score estimates to attain a trade-off between sample quality and diversity.
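The combination of conditional and unconditional score estimates is the now-standard classifier-free guidance extrapolation; a minimal NumPy sketch, where the guidance weight `w` is a free hyperparameter chosen by the user:

```python
import numpy as np

def cfg_epsilon(eps_cond, eps_uncond, w):
    """Classifier-free guidance: extrapolate from the unconditional toward
    the conditional noise estimate. w=0 recovers the conditional model;
    larger w trades sample diversity for fidelity."""
    return (1.0 + w) * eps_cond - w * eps_uncond

e_c = np.array([1.0, 0.0])  # conditional noise estimate
e_u = np.array([0.0, 1.0])  # unconditional noise estimate
guided = cfg_epsilon(e_c, e_u, w=2.0)
```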
arXiv Detail & Related papers (2022-07-26T01:42:07Z)
- Prototypical Classifier for Robust Class-Imbalanced Learning [64.96088324684683]
We propose Prototypical, which does not require fitting additional parameters given the embedding network.
Prototypical produces balanced and comparable predictions for all classes even though the training set is class-imbalanced.
We test our method on the CIFAR-10LT, CIFAR-100LT, and Webvision datasets, observing that Prototypical obtains substantial improvements compared with the state of the art.
arXiv Detail & Related papers (2021-10-22T01:55:01Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.