The Perception-Robustness Tradeoff in Deterministic Image Restoration
- URL: http://arxiv.org/abs/2311.09253v4
- Date: Sat, 8 Jun 2024 16:54:33 GMT
- Title: The Perception-Robustness Tradeoff in Deterministic Image Restoration
- Authors: Guy Ohayon, Tomer Michaeli, Michael Elad
- Abstract summary: We study the behavior of deterministic methods for solving inverse problems in imaging.
To approach perfect perceptual quality and perfect consistency, the Lipschitz constant of the model must grow to infinity.
We demonstrate our theory on single image super-resolution algorithms, addressing both noisy and noiseless settings.
- Score: 34.50287066865267
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We study the behavior of deterministic methods for solving inverse problems in imaging. These methods are commonly designed to achieve two goals: (1) attaining high perceptual quality, and (2) generating reconstructions that are consistent with the measurements. We provide a rigorous proof that the better a predictor satisfies these two requirements, the larger its Lipschitz constant must be, regardless of the nature of the degradation involved. In particular, to approach perfect perceptual quality and perfect consistency, the Lipschitz constant of the model must grow to infinity. This implies that such methods are necessarily more susceptible to adversarial attacks. We demonstrate our theory on single image super-resolution algorithms, addressing both noisy and noiseless settings. We also show how this undesired behavior can be leveraged to explore the posterior distribution, thereby allowing the deterministic model to imitate stochastic methods.
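The Lipschitz claim can be probed empirically. Below is a minimal sketch, assuming a pretrained deterministic super-resolution model `model` (a PyTorch module mapping a measurement tensor `y` to a reconstruction); the function name, step counts, and step sizes are illustrative, not taken from the paper. It estimates a lower bound on the local Lipschitz constant by gradient ascent on a small input perturbation, the same mechanism an adversarial attack exploits:

```python
import torch

def lipschitz_lower_bound(model, y, eps=1e-3, steps=20, lr=1e-4):
    """Estimate a lower bound on the local Lipschitz constant of a
    deterministic restoration model at measurement y, by searching
    for a small perturbation delta that maximizes the output change:
        L >= ||model(y + delta) - model(y)|| / ||delta||
    """
    model.eval()
    with torch.no_grad():
        x0 = model(y)                       # reference reconstruction
    delta = eps * torch.randn_like(y)
    delta.requires_grad_(True)
    opt = torch.optim.Adam([delta], lr=lr)
    for _ in range(steps):
        ratio = (model(y + delta) - x0).norm() / delta.norm()
        opt.zero_grad()
        (-ratio).backward()                 # ascend on the output/input ratio
        opt.step()
        with torch.no_grad():               # keep the perturbation small
            delta.clamp_(-eps, eps)
    with torch.no_grad():
        return ((model(y + delta) - x0).norm() / delta.norm()).item()
```

Since a model approaching perfect perceptual quality and consistency must have a large Lipschitz constant, different small perturbations `delta` push its output toward different, individually plausible reconstructions; sweeping over such perturbations is how a deterministic model can be made to imitate a stochastic, posterior-exploring method, per the abstract's final claim.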
Related papers
- Intrinsic Image Decomposition via Ordinal Shading [0.0]
Intrinsic decomposition is a fundamental mid-level vision problem that plays a crucial role in inverse rendering and computational photography pipelines.
We present a dense ordinal shading formulation using a shift- and scale-invariant loss to estimate ordinal shading cues.
We then combine low- and high-resolution ordinal estimations using a second network to generate a shading estimate with both global coherency and local details.
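As a point of reference for the shift- and scale-invariant loss mentioned above, the following is a minimal sketch of the generic formulation (align the prediction to the target by a least-squares scale and shift, then measure the error); it is not the paper's exact loss:

```python
import torch

def scale_shift_invariant_mse(pred, target):
    """MSE after optimally aligning pred to target with a scalar
    scale s and shift t (closed-form least-squares solution)."""
    p = pred.flatten()
    g = target.flatten()
    # Solve min_{s,t} ||s*p + t - g||^2 via least squares.
    A = torch.stack([p, torch.ones_like(p)], dim=1)       # (N, 2)
    sol = torch.linalg.lstsq(A, g.unsqueeze(1)).solution  # (2, 1)
    s, t = sol[0, 0], sol[1, 0]
    return torch.mean((s * p + t - g) ** 2)
```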
arXiv Detail & Related papers (2023-11-21T18:58:01Z)
- Regularized Vector Quantization for Tokenized Image Synthesis [126.96880843754066]
Quantizing images into discrete representations has been a fundamental problem in unified generative modeling.
Deterministic quantization suffers from severe codebook collapse and misalignment with the inference stage, while stochastic quantization suffers from low codebook utilization and a perturbed reconstruction objective.
This paper presents a regularized vector quantization framework that mitigates the above issues effectively by applying regularization from two perspectives.
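For context on where deterministic quantization's failure modes arise, a minimal nearest-neighbor vector quantizer with the common straight-through gradient trick is sketched below; this is the generic unregularized baseline, not the framework proposed in the paper:

```python
import torch

def vector_quantize(z, codebook):
    """Deterministically map each latent vector in z (B, D) to its
    nearest codebook entry (K, D). Codebook collapse happens when
    only a few entries are ever selected."""
    d = torch.cdist(z, codebook)          # (B, K) pairwise distances
    idx = d.argmin(dim=1)                 # hard nearest-neighbor assignment
    z_q = codebook[idx]
    # Straight-through estimator: forward uses z_q, gradient flows to z.
    z_q = z + (z_q - z).detach()
    return z_q, idx
```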
arXiv Detail & Related papers (2023-03-11T15:20:54Z)
- Reasons for the Superiority of Stochastic Estimators over Deterministic Ones: Robustness, Consistency and Perceptual Quality [44.47246905244631]
We prove that any restoration algorithm that attains perfect perceptual quality must be a posterior sampler.
We illustrate that while deterministic restoration algorithms may attain high perceptual quality, this can be achieved only by filling up the space of all possible source images with an extremely sensitive mapping.
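One standard way to make the posterior-sampler claim precise, with the consistency requirement written explicitly (a sketch in the usual notation of this literature, not a quotation from the paper):

```latex
% X: source image, Y: measurement, \hat{X}: reconstruction.
% Matching the joint distribution (perceptual quality + consistency):
\[
  p_{\hat{X},Y} = p_{X,Y}
  \iff
  p_{\hat{X}\mid Y}(\cdot \mid y) = p_{X \mid Y}(\cdot \mid y)
  \ \text{for (almost) every } y,
\]
% i.e., the estimator is a posterior sampler. A deterministic map
% \hat{X} = f(Y) yields a degenerate p_{\hat{X} \mid Y}, so it can
% satisfy this only when the posterior itself is degenerate.
```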
arXiv Detail & Related papers (2022-11-16T14:49:10Z)
- Composed Image Retrieval with Text Feedback via Multi-grained Uncertainty Regularization [73.04187954213471]
We introduce a unified learning approach that simultaneously models coarse- and fine-grained retrieval.
The proposed method achieves gains of +4.03%, +3.38%, and +2.40% in Recall@50 over a strong baseline.
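For reference, Recall@50 is the standard retrieval metric: the fraction of queries whose ground-truth item appears among the top 50 ranked candidates. A minimal sketch with illustrative names:

```python
import numpy as np

def recall_at_k(scores, gt_index, k=50):
    """scores: (Q, N) similarity of each query to each gallery item;
    gt_index: (Q,) index of the correct gallery item per query.
    Returns the fraction of queries whose ground truth ranks in the top k."""
    topk = np.argsort(-scores, axis=1)[:, :k]          # highest scores first
    hits = (topk == gt_index[:, None]).any(axis=1)
    return hits.mean()
```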
arXiv Detail & Related papers (2022-11-14T14:25:40Z)
- Towards Bidirectional Arbitrary Image Rescaling: Joint Optimization and Cycle Idempotence [76.93002743194974]
We propose a method to treat arbitrary rescaling, both upscaling and downscaling, as one unified process.
The proposed model is able to learn upscaling and downscaling simultaneously and achieve bidirectional arbitrary image rescaling.
It is shown to be robust in the cycle idempotence test, remaining free of severe degradation in reconstruction accuracy when the downscaling-to-upscaling cycle is applied repeatedly.
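The cycle idempotence test reduces to a short loop: apply the downscale-then-upscale cycle repeatedly and check that accuracy does not collapse. A sketch assuming hypothetical `downscale` and `upscale` callables and images in the [0, 1] range:

```python
import torch

def cycle_idempotence_test(x, downscale, upscale, cycles=5):
    """Apply the downscale->upscale cycle repeatedly and report PSNR
    against the original after each cycle. A cycle-idempotent model
    shows little or no drop across iterations."""
    def psnr(a, b):
        mse = torch.mean((a - b) ** 2)
        return (10 * torch.log10(1.0 / mse)).item()  # assumes [0, 1] range
    out, curve = x, []
    for _ in range(cycles):
        out = upscale(downscale(out))
        curve.append(psnr(out, x))
    return curve
```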
arXiv Detail & Related papers (2022-03-02T07:42:15Z)
- Deblurring via Stochastic Refinement [85.42730934561101]
We present an alternative framework for blind deblurring based on conditional diffusion models.
Our method is competitive in terms of distortion metrics such as PSNR.
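For background, a generic conditional DDPM ancestral sampling loop is sketched below; `eps_model` is a hypothetical noise predictor conditioned on the blurry input `y`, and this is the textbook sampler, not the paper's predict-and-refine variant:

```python
import torch

def ddpm_sample(eps_model, y, betas, shape):
    """Textbook DDPM ancestral sampling, conditioned on measurement y.
    betas: (T,) noise schedule; shape: shape of the sample tensor."""
    alphas = 1.0 - betas
    abar = torch.cumprod(alphas, dim=0)
    x = torch.randn(shape)                # start from pure noise
    for t in reversed(range(len(betas))):
        z = torch.randn(shape) if t > 0 else torch.zeros(shape)
        eps = eps_model(x, y, t)          # predicted noise at step t
        x = (x - betas[t] / torch.sqrt(1 - abar[t]) * eps) / torch.sqrt(alphas[t]) \
            + torch.sqrt(betas[t]) * z    # posterior mean + noise
    return x
```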
arXiv Detail & Related papers (2021-12-05T04:36:09Z)
- High Probability Complexity Bounds for Non-Smooth Stochastic Optimization with Heavy-Tailed Noise [51.31435087414348]
It is essential to theoretically guarantee that algorithms provide a small objective residual with high probability.
Existing methods for non-smooth convex optimization have complexity bounds with an unfavorable dependence on the confidence level.
We propose novel stepsize rules for two methods with gradient clipping.
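As background for the clipping-based methods referenced above, one clipped-SGD step with gradient norm clipping is sketched below; the stepsize rules that constitute the paper's contribution are not reproduced, and the names are illustrative:

```python
import torch

def clipped_sgd_step(params, grads, lr, clip_level):
    """One SGD step with gradient norm clipping: rescale the stochastic
    gradient when its norm exceeds clip_level, which tames heavy-tailed
    noise and enables high-probability convergence guarantees."""
    with torch.no_grad():
        total_norm = torch.sqrt(sum(g.pow(2).sum() for g in grads))
        scale = min(1.0, clip_level / (total_norm.item() + 1e-12))
        for p, g in zip(params, grads):
            p.add_(g, alpha=-lr * scale)
```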
arXiv Detail & Related papers (2021-06-10T17:54:21Z)
- Uncertainty-aware Generalized Adaptive CycleGAN [44.34422859532988]
Unpaired image-to-image translation refers to learning a mapping between image domains without paired supervision.
Existing methods often learn deterministic mappings without explicitly modelling robustness to outliers or predictive uncertainty.
We propose a novel probabilistic method called Uncertainty-aware Generalized Adaptive Cycle Consistency (UGAC).
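To illustrate what explicitly modelling predictive uncertainty typically looks like, here is a generic heteroscedastic (aleatoric) reconstruction loss in the style of Kendall and Gal; it is a common pattern, not UGAC's exact formulation:

```python
import torch

def heteroscedastic_l1_loss(pred, log_b, target):
    """Negative log-likelihood of a Laplace distribution with predicted
    mean `pred` and predicted scale exp(log_b). Pixels the network is
    uncertain about get a larger scale and a down-weighted residual."""
    return torch.mean(torch.abs(pred - target) * torch.exp(-log_b) + log_b)
```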
arXiv Detail & Related papers (2021-02-23T15:22:35Z)
- Robust, Accurate Stochastic Optimization for Variational Inference [68.83746081733464]
We show that common optimization methods lead to poor variational approximations if the problem is moderately large.
Motivated by these findings, we develop a more robust and accurate optimization framework by viewing the underlying algorithm as producing a Markov chain.
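The Markov-chain view motivates, among other things, averaging iterates rather than trusting the last one. A minimal Polyak-Ruppert style sketch under that interpretation (generic, not the paper's full framework; `grad_fn` is a hypothetical stochastic-gradient oracle):

```python
import torch

def averaged_sgd(grad_fn, x0, lr=1e-2, steps=1000, burn_in=500):
    """Run SGD and return the average of the post-burn-in iterates.
    Viewing the iterates as a Markov chain, the average estimates the
    mean of its stationary distribution, which is typically closer to
    the optimum than any single noisy iterate."""
    x = x0.clone()
    avg, n = torch.zeros_like(x0), 0
    for t in range(steps):
        x = x - lr * grad_fn(x)           # stochastic gradient step
        if t >= burn_in:
            avg, n = avg + x, n + 1
    return avg / max(n, 1)
```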
arXiv Detail & Related papers (2020-09-01T19:12:11Z)