Deep Exposure Fusion with Deghosting via Homography Estimation and
Attention Learning
- URL: http://arxiv.org/abs/2004.09089v1
- Date: Mon, 20 Apr 2020 07:00:14 GMT
- Title: Deep Exposure Fusion with Deghosting via Homography Estimation and
Attention Learning
- Authors: Sheng-Yeh Chen and Yung-Yu Chuang
- Abstract summary: We propose a deep network for exposure fusion to deal with ghosting artifacts and detail loss caused by camera motion or moving objects.
Experiments on real-world photos taken using handheld mobile phones show that the proposed method can generate high-quality images with faithful detail and vivid color rendition in both dark and bright areas.
- Score: 29.036754445277314
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Modern cameras have limited dynamic ranges and often produce images with
saturated or dark regions using a single exposure. Although the problem could
be addressed by taking multiple images with different exposures, exposure
fusion methods need to deal with ghosting artifacts and detail loss caused by
camera motion or moving objects. This paper proposes a deep network for
exposure fusion. To reduce potential ghosting, our network takes only two
images: an underexposed one and an overexposed one. The network integrates
homography estimation to compensate for camera motion, an attention mechanism
to correct remaining misalignment and moving pixels, and adversarial learning
to alleviate other remaining artifacts. Experiments
on real-world photos taken using handheld mobile phones show that the proposed
method can generate high-quality images with faithful detail and vivid color
rendition in both dark and bright areas.
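To make the pipeline concrete, below is a minimal sketch of the two-exposure fusion stages described in the abstract, using classical OpenCV operations as stand-ins for the learned components: ORB matching with RANSAC approximates the homography-estimation stage, and a hand-crafted well-exposedness weight stands in for the learned attention map; the adversarial-learning stage is omitted. All function names and parameters are illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch of a two-exposure fusion pipeline: align the overexposed image
# to the underexposed one with a homography, then fuse with per-pixel weights.
# Classical stand-ins for the paper's learned modules; names are illustrative.
import cv2
import numpy as np

def align_with_homography(under: np.ndarray, over: np.ndarray) -> np.ndarray:
    """Warp the overexposed image onto the underexposed one (camera-motion compensation)."""
    orb = cv2.ORB_create(2000)
    k1, d1 = orb.detectAndCompute(cv2.cvtColor(under, cv2.COLOR_BGR2GRAY), None)
    k2, d2 = orb.detectAndCompute(cv2.cvtColor(over, cv2.COLOR_BGR2GRAY), None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(d2, d1)
    # Needs at least 4 good matches for a homography; a sketch, not production code.
    src = np.float32([k2[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([k1[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    return cv2.warpPerspective(over, H, (under.shape[1], under.shape[0]))

def fuse(under: np.ndarray, over_aligned: np.ndarray) -> np.ndarray:
    """Attention-like fusion: weight each pixel by how well exposed it is."""
    def well_exposedness(img):
        g = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY).astype(np.float32) / 255.0
        return np.exp(-((g - 0.5) ** 2) / (2 * 0.2 ** 2)) + 1e-6
    w1, w2 = well_exposedness(under), well_exposedness(over_aligned)
    w1, w2 = w1 / (w1 + w2), w2 / (w1 + w2)  # normalize weights per pixel
    fused = (w1[..., None] * under.astype(np.float32)
             + w2[..., None] * over_aligned.astype(np.float32))
    return np.clip(fused, 0, 255).astype(np.uint8)

# Usage (paths are placeholders):
# under, over = cv2.imread("under.jpg"), cv2.imread("over.jpg")
# result = fuse(under, align_with_homography(under, over))
```

In the paper each of these stages is a learned module trained jointly; the sketch only illustrates the data flow (homography alignment followed by per-pixel weighted fusion), not the network itself.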
Related papers
- High Dynamic Range Imaging of Dynamic Scenes with Saturation
Compensation but without Explicit Motion Compensation [20.911738532410766]
High dynamic range (HDR) imaging is a highly challenging task since a large amount of information is lost due to the limitations of camera sensors.
For HDR imaging, some methods capture multiple low dynamic range (LDR) images with alternating exposures to aggregate more information.
Most existing methods focus on motion compensation to reduce the ghosting artifacts, but they still produce unsatisfying results.
arXiv Detail & Related papers (2023-08-22T02:44:03Z)
- Exposure Fusion for Hand-held Camera Inputs with Optical Flow and
PatchMatch [53.149395644547226]
We propose a hybrid synthesis method for fusing multi-exposure images taken by hand-held cameras.
Our method can deal with camera and object motion while effectively maintaining the exposure information of each input.
Experiment results demonstrate the effectiveness and robustness of our method.
arXiv Detail & Related papers (2023-04-10T09:06:37Z)
- Multi-Exposure HDR Composition by Gated Swin Transformer [8.619880437958525]
This paper provides a novel multi-exposure fusion model based on the Swin Transformer.
We exploit long-distance contextual dependencies in the exposure-space pyramid via a self-attention mechanism.
Experiments show that our model achieves accuracy on par with current top-performing multi-exposure HDR imaging models.
arXiv Detail & Related papers (2023-03-15T15:38:43Z)
- Self-Supervised Image Restoration with Blurry and Noisy Pairs [66.33313180767428]
Images captured with high ISO usually contain inescapable noise, while long-exposure images may be blurry due to camera shake or object motion.
Existing solutions generally seek a balance between noise and blur, learning denoising or deblurring models under either full or self-supervision.
We propose jointly leveraging the short-exposure noisy image and the long-exposure blurry image for better image restoration.
arXiv Detail & Related papers (2022-11-14T12:57:41Z)
- High Dynamic Range and Super-Resolution from Raw Image Bursts [52.341483902624006]
This paper introduces the first approach to reconstruct high-resolution, high-dynamic range color images from raw photographic bursts captured by a handheld camera with exposure bracketing.
The proposed algorithm is fast, with low memory requirements compared to state-of-the-art learning-based approaches to image restoration.
Experiments demonstrate its excellent performance with super-resolution factors of up to $\times 4$ on real photographs taken in the wild with hand-held cameras.
arXiv Detail & Related papers (2022-07-29T13:31:28Z)
- Robust Scene Inference under Noise-Blur Dual Corruptions [20.0721386176278]
Scene inference under low light is a challenging problem due to severe noise in the captured images.
With the rise of cameras capable of capturing multiple exposures of the same scene simultaneously, it is possible to overcome the noise-blur trade-off.
We propose a method that leverages these multi-exposure captures for robust inference under low light and motion.
arXiv Detail & Related papers (2022-07-24T02:52:00Z)
- Learning Spatially Varying Pixel Exposures for Motion Deblurring [49.07867902677453]
We present a novel approach that leverages spatially varying pixel exposures for motion deblurring.
Our work illustrates the promising role that focal-plane sensor-processors can play in the future of computational imaging.
arXiv Detail & Related papers (2022-04-14T23:41:49Z)
- Neural Radiance Fields for Outdoor Scene Relighting [70.97747511934705]
We present NeRF-OSR, the first approach for outdoor scene relighting based on neural radiance fields.
In contrast to the prior art, our technique allows simultaneous editing of both scene illumination and camera viewpoint.
It also includes a dedicated network for shadow reproduction, which is crucial for high-quality outdoor scene relighting.
arXiv Detail & Related papers (2021-12-09T18:59:56Z)
- Learning Multi-Scale Photo Exposure Correction [51.57836446833474]
Capturing photographs with wrong exposures remains a major source of errors in camera-based imaging.
We propose a coarse-to-fine deep neural network (DNN) model, trainable in an end-to-end manner, that addresses each sub-problem separately.
Our method achieves results on par with existing state-of-the-art methods on underexposed images.
arXiv Detail & Related papers (2020-03-25T19:33:51Z)