Diffusion Model Guided Sampling with Pixel-Wise Aleatoric Uncertainty Estimation
- URL: http://arxiv.org/abs/2412.00205v1
- Date: Fri, 29 Nov 2024 19:02:08 GMT
- Title: Diffusion Model Guided Sampling with Pixel-Wise Aleatoric Uncertainty Estimation
- Authors: Michele De Vita, Vasileios Belagiannis
- Abstract summary: We propose to estimate the pixel-wise aleatoric uncertainty during the sampling phase of diffusion models.
The uncertainty is computed as the variance of the denoising scores with a perturbation scheme specifically designed for diffusion models.
We show that our guided approach leads to better sample generation in terms of FID scores.
- Score: 10.269485943949332
- License:
- Abstract: Despite the remarkable progress in generative modelling, current diffusion models lack a quantitative approach to assess image quality. To address this limitation, we propose to estimate the pixel-wise aleatoric uncertainty during the sampling phase of diffusion models and utilise the uncertainty to improve the sample generation quality. The uncertainty is computed as the variance of the denoising scores with a perturbation scheme that is specifically designed for diffusion models. We then show that the aleatoric uncertainty estimates are related to the second-order derivative of the diffusion noise distribution. We evaluate our uncertainty estimation algorithm and the uncertainty-guided sampling on the ImageNet and CIFAR-10 datasets. In our comparisons with the related work, we demonstrate promising results in filtering out low quality samples. Furthermore, we show that our guided approach leads to better sample generation in terms of FID scores.
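The abstract describes the estimator only at a high level. The following minimal PyTorch-style sketch illustrates the general idea under stated assumptions: a generic `score_model(x, t)` callable standing in for the trained denoising network, a plain Gaussian perturbation, and illustrative hyperparameters. The paper's actual perturbation scheme is tied to the diffusion noise schedule and its second-order structure, so treat this as an approximation of the idea rather than the authors' exact method.

```python
import torch

@torch.no_grad()
def pixelwise_aleatoric_uncertainty(score_model, x_t, t, n_perturb=8, noise_scale=0.05):
    """Rough sketch: estimate per-pixel aleatoric uncertainty at one sampling step
    as the variance of denoising scores under small input perturbations.

    Assumptions (not from the paper): `score_model(x, t)` returns the denoising
    score / predicted noise with the same shape as `x`; the perturbation is
    isotropic Gaussian with a fixed `noise_scale`.
    """
    scores = []
    for _ in range(n_perturb):
        x_pert = x_t + noise_scale * torch.randn_like(x_t)   # perturbed copy of the latent
        scores.append(score_model(x_pert, t))
    scores = torch.stack(scores, dim=0)       # (n_perturb, B, C, H, W)
    return scores.var(dim=0, unbiased=True)   # per-pixel variance, shape (B, C, H, W)


# Toy usage with a placeholder score function; a real diffusion network goes here.
if __name__ == "__main__":
    dummy_score = lambda x, t: -x / (1.0 + float(t))
    x_t = torch.randn(2, 3, 32, 32)
    unc_map = pixelwise_aleatoric_uncertainty(dummy_score, x_t, t=10)
    print(unc_map.shape)  # torch.Size([2, 3, 32, 32])
```

A second sketch after the related-papers list shows how such a per-pixel map can be reduced to a sample-level score for filtering out low-quality generations.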
Related papers
- VIPaint: Image Inpainting with Pre-Trained Diffusion Models via Variational Inference [5.852077003870417]
We show that our VIPaint method significantly outperforms previous approaches in both the plausibility and diversity of imputations.
arXiv Detail & Related papers (2024-11-28T05:35:36Z) - One step closer to unbiased aleatoric uncertainty estimation [71.55174353766289]
We propose a new estimation method by actively de-noising the observed data.
By conducting a broad range of experiments, we demonstrate that our proposed approach provides a much closer approximation to the actual data uncertainty than the standard method.
arXiv Detail & Related papers (2023-12-16T14:59:11Z) - Mitigating Exposure Bias in Discriminator Guided Diffusion Models [4.5349436061325425]
We propose SEDM-G++, which incorporates a modified sampling approach, combining Discriminator Guidance and Epsilon Scaling.
Our proposed approach outperforms the current state-of-the-art, by achieving an FID score of 1.73 on the unconditional CIFAR-10 dataset.
arXiv Detail & Related papers (2023-11-18T20:49:50Z) - BayesDiff: Estimating Pixel-wise Uncertainty in Diffusion via Bayesian Inference [29.682407300058394]
BayesDiff is a pixel-wise uncertainty estimator for generations from diffusion models based on Bayesian inference.
The estimated pixel-wise uncertainty can not only be aggregated into a sample-wise metric to filter out low-fidelity images, but also aids in augmenting successful generations and rectifying artifacts in failed generations in text-to-image tasks (see the aggregate-and-filter sketch after this list).
arXiv Detail & Related papers (2023-10-17T10:45:28Z) - Semi-Implicit Denoising Diffusion Models (SIDDMs) [50.30163684539586]
Existing models such as Denoising Diffusion Probabilistic Models (DDPM) deliver high-quality, diverse samples but are slowed by an inherently high number of iterative steps.
We introduce a novel approach that tackles the problem by matching implicit and explicit factors.
We demonstrate that our proposed method obtains comparable generative performance to diffusion-based models and vastly superior results to models with a small number of sampling steps.
arXiv Detail & Related papers (2023-06-21T18:49:22Z) - Solving Diffusion ODEs with Optimal Boundary Conditions for Better Image Super-Resolution [82.50210340928173]
The randomness of diffusion models results in ineffectiveness and instability, making it challenging for users to guarantee the quality of super-resolution (SR) results.
We propose a plug-and-play sampling method that owns the potential to benefit a series of diffusion-based SR methods.
With fewer sampling steps, the proposed method yields higher-quality SR results than current randomness-based sampling from the same pre-trained diffusion-based SR model.
arXiv Detail & Related papers (2023-05-24T17:09:54Z) - Bi-Noising Diffusion: Towards Conditional Diffusion Models with Generative Restoration Priors [64.24948495708337]
We introduce a new method that brings predicted samples to the training data manifold using a pretrained unconditional diffusion model.
We perform comprehensive experiments to demonstrate the effectiveness of our approach on super-resolution, colorization, turbulence removal, and image-deraining tasks.
arXiv Detail & Related papers (2022-12-14T17:26:35Z) - Deblurring via Stochastic Refinement [85.42730934561101]
We present an alternative framework for blind deblurring based on conditional diffusion models.
Our method is competitive in terms of distortion metrics such as PSNR.
arXiv Detail & Related papers (2021-12-05T04:36:09Z) - Uncertainty-aware Generalized Adaptive CycleGAN [44.34422859532988]
Unpaired image-to-image translation refers to learning a mapping between image domains in an unsupervised manner.
Existing methods often learn deterministic mappings without explicitly modelling the robustness to outliers or predictive uncertainty.
We propose a novel probabilistic method called Uncertainty-aware Generalized Adaptive Cycle Consistency (UGAC).
arXiv Detail & Related papers (2021-02-23T15:22:35Z) - The Hidden Uncertainty in a Neural Network's Activations [105.4223982696279]
The distribution of a neural network's latent representations has been successfully used to detect out-of-distribution (OOD) data.
This work investigates whether this distribution correlates with a model's epistemic uncertainty, thus indicating its ability to generalise to novel inputs.
arXiv Detail & Related papers (2020-12-05T17:30:35Z) - A deep-learning based Bayesian approach to seismic imaging and uncertainty quantification [0.4588028371034407]
Uncertainty is essential when dealing with ill-conditioned inverse problems.
It is often not possible to formulate a prior distribution that precisely encodes our prior knowledge about the unknown.
We propose to use the functional form of a randomly initialized convolutional neural network as an implicit structured prior.
arXiv Detail & Related papers (2020-01-13T23:46:18Z)
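Both the paper above and BayesDiff report filtering low-fidelity generations by aggregating pixel-wise uncertainty into a sample-wise metric. The sketch below shows one plausible aggregation, a per-sample mean of the uncertainty map followed by rank-based filtering; the aggregation rules actually used in either paper may differ.

```python
import torch

def filter_by_uncertainty(samples, unc_maps, keep_ratio=0.8):
    """Keep the generated samples with the lowest aggregated uncertainty.

    Hypothetical aggregation (not taken from either paper): average each
    per-pixel uncertainty map into one scalar per sample, then keep the
    `keep_ratio` fraction of samples with the smallest scores.
    """
    scores = unc_maps.flatten(start_dim=1).mean(dim=1)   # (N,) sample-wise scores
    n_keep = max(1, int(keep_ratio * samples.shape[0]))
    keep_idx = torch.argsort(scores)[:n_keep]            # lowest-uncertainty samples first
    return samples[keep_idx], scores[keep_idx]


# Toy usage with random tensors standing in for generated images and uncertainty maps.
samples = torch.randn(16, 3, 32, 32)
unc_maps = torch.rand(16, 3, 32, 32)
kept, kept_scores = filter_by_uncertainty(samples, unc_maps, keep_ratio=0.75)
print(kept.shape)  # torch.Size([12, 3, 32, 32])
```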
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.