Defocus Map Estimation and Deblurring from a Single Dual-Pixel Image
- URL: http://arxiv.org/abs/2110.05655v1
- Date: Tue, 12 Oct 2021 00:09:07 GMT
- Title: Defocus Map Estimation and Deblurring from a Single Dual-Pixel Image
- Authors: Shumian Xin, Neal Wadhwa, Tianfan Xue, Jonathan T. Barron, Pratul P.
Srinivasan, Jiawen Chen, Ioannis Gkioulekas, Rahul Garg
- Abstract summary: We present a method that takes as input a single dual-pixel image, and simultaneously estimates the image's defocus map and recovers an all-in-focus image.
Our approach improves upon prior works for both defocus map estimation and blur removal, despite being entirely unsupervised.
- Score: 54.10957300181677
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We present a method that takes as input a single dual-pixel image, and
simultaneously estimates the image's defocus map -- the amount of defocus blur
at each pixel -- and recovers an all-in-focus image. Our method is inspired
by recent works that leverage the dual-pixel sensors available in many
consumer cameras to assist with autofocus, and use them for recovery of defocus
maps or all-in-focus images. These prior works have solved the two recovery
problems independently of each other, and often require large labeled datasets
for supervised training. By contrast, we show that it is beneficial to treat
these two closely-connected problems simultaneously. To this end, we set up an
optimization problem that, by carefully modeling the optics of dual-pixel
images, jointly solves both problems. We use data captured with a consumer
smartphone camera to demonstrate that, after a one-time calibration step, our
approach improves upon prior works for both defocus map estimation and blur
removal, despite being entirely unsupervised.
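
As a rough illustration of the joint formulation described in the abstract, the sketch below optimizes an all-in-focus image and a per-pixel blur radius so that simulated left/right dual-pixel views match the observations. It is a minimal sketch under simplifying assumptions (horizontal half-box blur kernels, an unsigned blur radius, and a total-variation prior), not the paper's calibrated optical model; the function names `dp_blur` and `estimate` are illustrative.

```python
import torch
import torch.nn.functional as F

def dp_blur(sharp, radius, max_r=8):
    """Render left/right dual-pixel views of `sharp` (1,1,H,W), where each pixel's
    blur is a soft mixture over integer radii 0..max_r weighted by `radius` (1,1,H,W).
    Box-shaped half-aperture kernels are an assumption made for brevity."""
    left_stack, right_stack = [], []
    for r in range(max_r + 1):
        k = torch.zeros(1, 1, 1, 2 * max_r + 1)
        k[..., max_r - r:max_r + 1] = 1.0 / (r + 1)        # left half-aperture box kernel
        left_stack.append(F.conv2d(sharp, k, padding=(0, max_r)))
        k = torch.zeros(1, 1, 1, 2 * max_r + 1)
        k[..., max_r:max_r + r + 1] = 1.0 / (r + 1)        # right half-aperture box kernel
        right_stack.append(F.conv2d(sharp, k, padding=(0, max_r)))
    # Soft per-pixel assignment to a blur radius keeps the forward model differentiable.
    bins = torch.arange(max_r + 1, dtype=sharp.dtype).view(1, -1, 1, 1)
    w = F.softmax(-(radius - bins) ** 2 / 0.1, dim=1)
    left = (torch.cat(left_stack, dim=1) * w).sum(dim=1, keepdim=True)
    right = (torch.cat(right_stack, dim=1) * w).sum(dim=1, keepdim=True)
    return left, right

def estimate(dp_left, dp_right, iters=500, lam=0.01):
    """Jointly optimize an all-in-focus image and a per-pixel blur radius so that the
    simulated dual-pixel views match the observed ones."""
    sharp = dp_left.detach().clone().requires_grad_(True)   # initialize from one observed view
    radius = torch.zeros_like(dp_left).requires_grad_(True)
    opt = torch.optim.Adam([sharp, radius], lr=1e-2)
    for _ in range(iters):
        opt.zero_grad()
        sim_l, sim_r = dp_blur(sharp, radius)
        data = F.mse_loss(sim_l, dp_left) + F.mse_loss(sim_r, dp_right)
        # Total-variation smoothness prior on the defocus map (an assumed regularizer).
        tv = radius.diff(dim=-1).abs().mean() + radius.diff(dim=-2).abs().mean()
        (data + lam * tv).backward()
        opt.step()
    return sharp.detach(), radius.detach()   # all-in-focus estimate and defocus map
```

In the paper, the box kernels assumed here are replaced by blur kernels obtained from the one-time per-camera calibration step, and the sign of the defocus (in front of versus behind the focal plane) is modeled rather than ignored.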
Related papers
- Reblurring-Guided Single Image Defocus Deblurring: A Learning Framework with Misaligned Training Pairs [65.25002116216771]
We introduce a reblurring-guided learning framework for single image defocus deblurring.
Our reblurring module enforces spatial consistency among the deblurred image, the reblurred image, and the input blurry image; a toy sketch of such a consistency term is given after this entry.
We have collected a new dataset specifically for single image defocus deblurring with typical misalignments.
arXiv Detail & Related papers (2024-09-26T12:37:50Z)
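
A toy sketch of the reblurring-consistency idea summarized in the entry above, not the authors' implementation: a hypothetical `reblur_module` re-applies spatially varying defocus blur to the deblurred prediction, which is then compared against the observed blurry input.

```python
import torch.nn.functional as F

def reblurring_consistency_loss(deblurred, blurry, reblur_module):
    """Re-blur the deblurred prediction and penalize deviation from the blurry observation."""
    reblurred = reblur_module(deblurred)   # hypothetical module applying spatially varying blur
    return F.l1_loss(reblurred, blurry)
```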
- $\text{DC}^2$: Dual-Camera Defocus Control by Learning to Refocus [38.24734623691387]
We propose a system for defocus control that synthetically varies camera aperture, focus distance, and arbitrary defocus effects.
Our key insight is to leverage a real-world smartphone camera dataset by using image refocus as a proxy task for learning to control defocus.
We demonstrate creative post-capture defocus control enabled by our method, including tilt-shift and content-based defocus effects.
arXiv Detail & Related papers (2023-04-06T17:59:58Z)
- Learnable Blur Kernel for Single-Image Defocus Deblurring in the Wild [9.246199263116067]
We propose a novel defocus deblurring method that uses the guidance of the defocus map to implement image deblurring.
The proposed method consists of a learnable blur kernel for estimating the defocus map and, for the first time, a single-image defocus deblurring generative adversarial network (DefocusGAN).
arXiv Detail & Related papers (2022-11-25T10:47:19Z)
- Learning Dual-Pixel Alignment for Defocus Deblurring [73.80328094662976]
We propose a Dual-Pixel Alignment Network (DPANet) for defocus deblurring.
It is notably superior to state-of-the-art deblurring methods in reducing defocus blur while recovering visually plausible sharp structures and textures.
arXiv Detail & Related papers (2022-04-26T07:02:58Z)
- Learning to Deblur using Light Field Generated and Real Defocus Images [4.926805108788465]
Defocus deblurring is a challenging task due to the spatially varying nature of defocus blur.
We propose a novel deep defocus deblurring network that leverages the strengths and overcomes the shortcomings of light fields.
arXiv Detail & Related papers (2022-04-01T11:35:51Z)
- Improving Single-Image Defocus Deblurring: How Dual-Pixel Images Help Through Multi-Task Learning [48.063176079878055]
We propose a single-image deblurring network that incorporates the two sub-aperture views into a multi-task framework.
Our experiments show this multi-task strategy achieves +1dB PSNR improvement over state-of-the-art defocus deblurring methods.
These high-quality DP views can be used for other DP-based applications, such as reflection removal.
arXiv Detail & Related papers (2021-08-11T14:45:15Z)
- Single image deep defocus estimation and its applications [82.93345261434943]
We train a deep neural network to classify image patches into one of 20 levels of blurriness.
The trained model is used to estimate patch blurriness, which is then refined by applying an iterative weighted guided filter.
The result is a defocus map that carries the degree of blurriness for each pixel; a schematic sketch of this patch-level inference is given after this entry.
arXiv Detail & Related papers (2021-07-30T06:18:16Z)
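
A schematic sketch of the patch-level inference summarized in the entry above, assuming an already-trained 20-way blurriness classifier; the classifier architecture and the iterative weighted guided-filter refinement are omitted, and `coarse_defocus_map` is an illustrative name.

```python
import torch

def coarse_defocus_map(image, classifier, patch=32, stride=16):
    """image: (1, 3, H, W) tensor; classifier: maps (N, 3, patch, patch) -> (N, 20) logits.
    Averages the predicted blurriness level of every overlapping patch into a per-pixel map."""
    _, _, H, W = image.shape
    votes = torch.zeros(H, W)
    counts = torch.zeros(H, W)
    for y in range(0, H - patch + 1, stride):
        for x in range(0, W - patch + 1, stride):
            crop = image[:, :, y:y + patch, x:x + patch]
            level = classifier(crop).argmax(dim=1).item()   # 0 = sharpest ... 19 = most blurred
            votes[y:y + patch, x:x + patch] += level
            counts[y:y + patch, x:x + patch] += 1.0
    return votes / counts.clamp(min=1.0)   # coarse per-pixel blurriness level
```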
- Geometric Scene Refocusing [9.198471344145092]
We study the fine characteristics of images with a shallow depth-of-field in the context of focal stacks.
We identify in-focus pixels, dual-focus pixels, pixels that exhibit bokeh and spatially-varying blur kernels between focal slices.
We present a comprehensive algorithm for post-capture refocusing in a geometrically correct manner.
arXiv Detail & Related papers (2020-12-20T06:33:55Z)
- Defocus Deblurring Using Dual-Pixel Data [41.201653787083735]
Defocus blur arises in images that are captured with a shallow depth of field due to the use of a wide aperture.
We propose an effective defocus deblurring method that exploits data available on dual-pixel (DP) sensors found on most modern cameras.
arXiv Detail & Related papers (2020-05-01T10:38:00Z)
- Rapid Whole Slide Imaging via Learning-based Two-shot Virtual Autofocusing [57.90239401665367]
Whole slide imaging (WSI) is an emerging technology for digital pathology.
We propose the concept of virtual autofocusing, which does not rely on mechanical adjustment to conduct refocusing.
arXiv Detail & Related papers (2020-03-14T13:40:33Z)