Blind Image Deblurring with Unknown Kernel Size and Substantial Noise
- URL: http://arxiv.org/abs/2208.09483v2
- Date: Fri, 15 Sep 2023 04:54:50 GMT
- Title: Blind Image Deblurring with Unknown Kernel Size and Substantial Noise
- Authors: Zhong Zhuang, Taihui Li, Hengkang Wang, Ju Sun
- Abstract summary: Blind image deblurring (BID) has been extensively studied in computer vision and adjacent fields.
We propose a practical BID method that is stable against both unknown kernel size and substantial noise, the first of its kind.
Our method builds on the recent ideas of solving inverse problems by integrating the physical models and structured deep neural networks.
- Score: 1.346207204106034
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Blind image deblurring (BID) has been extensively studied in computer vision
and adjacent fields. Modern methods for BID can be grouped into two categories:
single-instance methods that deal with individual instances using statistical
inference and numerical optimization, and data-driven methods that train
deep-learning models to deblur future instances directly. Data-driven methods
can be free from the difficulty in deriving accurate blur models, but are
fundamentally limited by the diversity and quality of the training data --
collecting sufficiently expressive and realistic training data is a standing
challenge. In this paper, we focus on single-instance methods that remain
competitive and indispensable. However, most such methods do not prescribe how
to deal with unknown kernel size and substantial noise, precluding practical
deployment. Indeed, we show that several state-of-the-art (SOTA)
single-instance methods are unstable when the kernel size is overspecified,
and/or the noise level is high. On the positive side, we propose a practical
BID method that is stable against both, the first of its kind. Our method
builds on the recent ideas of solving inverse problems by integrating the
physical models and structured deep neural networks, without extra training
data. We introduce several crucial modifications to achieve the desired
stability. Extensive empirical tests on standard synthetic datasets, as well as
real-world NTIRE2020 and RealBlur datasets, show the superior effectiveness and
practicality of our BID method compared to SOTA single-instance as well as
data-driven methods. The code of our method is available at:
\url{https://github.com/sun-umn/Blind-Image-Deblurring}.
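The abstract refers to inverting a physical blur model. The standard linear model in BID (assumed here; the paper does not state it in the abstract) is y = k ⊛ x + n, where x is the sharp image, k the blur kernel, and n additive noise. The following minimal NumPy sketch (not the authors' code; the function name and FFT-based circular convolution are illustrative choices) simulates this forward model and shows why "unknown kernel size" is subtle: zero-padding the true kernel to an overspecified support leaves the forward model unchanged, so the instability the paper studies arises entirely on the inverse side.

```python
import numpy as np

def blur_and_noise(x, k, sigma, rng):
    """Simulate the BID forward model y = k (circularly) convolved with x, plus noise.

    x     : 2D sharp image
    k     : 2D blur kernel (any support size)
    sigma : noise standard deviation
    rng   : numpy Generator for reproducible noise
    """
    # Embed the kernel in an image-sized array and shift its center
    # to the origin, so FFT multiplication implements circular convolution.
    K = np.zeros_like(x)
    kh, kw = k.shape
    K[:kh, :kw] = k
    K = np.roll(K, (-(kh // 2), -(kw // 2)), axis=(0, 1))
    # Convolution theorem: conv(x, k) = IFFT(FFT(x) * FFT(K))
    y_clean = np.real(np.fft.ifft2(np.fft.fft2(x) * np.fft.fft2(K)))
    return y_clean + sigma * rng.standard_normal(x.shape)
```

Embedding a 3x3 kernel at the center of a 7x7 zero array and blurring again reproduces the same observation, which is exactly why overspecifying the kernel size does not change the data, only the (much harder) estimation problem.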
Related papers
- Truncated Consistency Models [57.50243901368328]
Training consistency models requires learning to map all intermediate points along PF ODE trajectories to their corresponding endpoints.
We empirically find that this training paradigm limits the one-step generation performance of consistency models.
We propose a new parameterization of the consistency function and a two-stage training procedure that prevents the truncated-time training from collapsing to a trivial solution.
arXiv Detail & Related papers (2024-10-18T22:38:08Z) - Adversarial Robustification via Text-to-Image Diffusion Models [56.37291240867549]
Adversarial robustness has conventionally been considered a challenging property to encode in neural networks.
We develop a scalable and model-agnostic solution to achieve adversarial robustness without using any data.
arXiv Detail & Related papers (2024-07-26T10:49:14Z) - Optimal Parameter and Neuron Pruning for Out-of-Distribution Detection [36.4610463573214]
We propose an textbfOptimal textbfParameter and textbfNeuron textbfPruning (textbfOPNP) approach to detect out-of-distribution (OOD) samples.
Our proposal is training-free, compatible with other post-hoc methods, and exploring the information from all training data.
arXiv Detail & Related papers (2024-02-04T07:31:06Z) - Efficient Transfer Learning in Diffusion Models via Adversarial Noise [21.609168219488982]
Diffusion Probabilistic Models (DPMs) have demonstrated substantial promise in image generation tasks.
Previous works, e.g. on GANs, have tackled the limited-data problem by transferring pre-trained models learned from sufficient data.
We propose a novel DPMs-based transfer learning method, TAN, to address the limited data problem.
arXiv Detail & Related papers (2023-08-23T06:44:44Z) - BOOT: Data-free Distillation of Denoising Diffusion Models with Bootstrapping [64.54271680071373]
Diffusion models have demonstrated excellent potential for generating diverse images.
Knowledge distillation has been recently proposed as a remedy that can reduce the number of inference steps to one or a few.
We present a novel technique called BOOT that overcomes these limitations via an efficient data-free distillation algorithm.
arXiv Detail & Related papers (2023-06-08T20:30:55Z) - Deep Active Learning with Noise Stability [24.54974925491753]
Uncertainty estimation for unlabeled data is crucial to active learning.
We propose a novel algorithm that leverages noise stability to estimate data uncertainty.
Our method is generally applicable in various tasks, including computer vision, natural language processing, and structural data analysis.
arXiv Detail & Related papers (2022-05-26T13:21:01Z) - Unsupervised Noisy Tracklet Person Re-identification [100.85530419892333]
We present a novel selective tracklet learning (STL) approach that can train discriminative person re-id models from unlabelled tracklet data.
This avoids the tedious and costly process of exhaustively labelling person image/tracklet true matching pairs across camera views.
Our method is particularly robust against arbitrary noise in raw tracklets and is therefore scalable to learning discriminative models from unconstrained tracking data.
arXiv Detail & Related papers (2021-01-16T07:31:00Z) - Attentional-Biased Stochastic Gradient Descent [74.49926199036481]
We present a provable method (named ABSGD) for addressing the data imbalance or label noise problem in deep learning.
Our method is a simple modification to momentum SGD where we assign an individual importance weight to each sample in the mini-batch.
ABSGD is flexible enough to combine with other robust losses without any additional cost.
arXiv Detail & Related papers (2020-12-13T03:41:52Z) - An Online Method for A Class of Distributionally Robust Optimization with Non-Convex Objectives [54.29001037565384]
We propose a practical online method for solving a class of online distributionally robust optimization (DRO) problems.
Our studies demonstrate important applications in machine learning for improving the robustness of networks.
arXiv Detail & Related papers (2020-06-17T20:19:25Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this information and is not responsible for any consequences of its use.