Low-light Image Restoration with Short- and Long-exposure Raw Pairs
- URL: http://arxiv.org/abs/2007.00199v2
- Date: Sun, 28 Feb 2021 07:42:04 GMT
- Title: Low-light Image Restoration with Short- and Long-exposure Raw Pairs
- Authors: Meng Chang, Huajun Feng, Zhihai Xu, Qi Li
- Abstract summary: We propose a new low-light image restoration method by using the complementary information of short- and long-exposure images.
We first propose a novel data generation method to synthesize realistic short- and long-exposure raw images.
Then, we design a new long-short-exposure fusion network (LSFNet) to deal with the problems of low-light image fusion.
- Score: 14.643663950015334
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Low-light imaging with handheld mobile devices is a challenging issue.
Limited by existing models and training data, most current methods cannot
be applied effectively in real scenarios. In this paper, we propose a new
low-light image restoration method by using the complementary information of
short- and long-exposure images. We first propose a novel data generation
method to synthesize realistic short- and long-exposure raw images by simulating
the imaging pipeline in a low-light environment. Then, we design a new
long-short-exposure fusion network (LSFNet) to deal with the problems of
low-light image fusion, including high noise, motion blur, color distortion and
misalignment. The proposed LSFNet takes pairs of short- and long-exposure raw
images as input, and outputs a clear RGB image. Using our data generation
method and the proposed LSFNet, we can recover the details and color of the
original scene, and improve the low-light image quality effectively.
Experiments demonstrate that our method can outperform the state-of-the-art
methods.
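The abstract describes two components: a data-generation step that simulates the low-light imaging pipeline to produce realistic short-/long-exposure raw pairs, and the LSFNet fusion network that maps such a pair to a clean RGB image. The sketch below illustrates only the data-generation idea in NumPy; the exposure ratio, the heteroscedastic Gaussian (shot + read) noise model, and the horizontal box blur standing in for handheld motion are illustrative assumptions rather than the authors' exact pipeline, and `synthesize_raw_pair` is a hypothetical helper name.

```python
import numpy as np

def synthesize_raw_pair(clean_raw, exposure_ratio=8.0,
                        shot_noise=0.01, read_noise=0.0005, rng=None):
    """Generate a (short, long) exposure raw pair from a clean raw frame.

    clean_raw: 2-D array in [0, 1] representing a well-exposed raw plane.
    Returns a noisy under-exposed short frame and a blurred long frame.
    """
    rng = np.random.default_rng() if rng is None else rng

    # Short exposure: darken by the exposure ratio, then add
    # signal-dependent shot noise plus signal-independent read noise.
    short = clean_raw / exposure_ratio
    sigma = np.sqrt(shot_noise * short + read_noise)
    short_noisy = np.clip(short + rng.normal(scale=sigma), 0.0, 1.0)

    # Long exposure: full brightness, but smeared by a horizontal box
    # blur as a crude stand-in for handheld motion blur.
    k = np.ones(9) / 9.0
    long_blurred = np.apply_along_axis(
        lambda row: np.convolve(row, k, mode="same"), 1, clean_raw)

    return short_noisy, np.clip(long_blurred, 0.0, 1.0)
```

In the paper's setting, such synthetic pairs would then serve as training input for LSFNet, which fuses the sharp-but-noisy short frame with the clean-but-blurred long frame into a single RGB output.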
Related papers
- Improving Lens Flare Removal with General Purpose Pipeline and Multiple
Light Sources Recovery [69.71080926778413]
Flare artifacts can affect image visual quality and downstream computer vision tasks.
Current methods do not consider automatic exposure and tone mapping in the image signal processing (ISP) pipeline.
We propose a solution that improves lens flare removal by revisiting the ISP and designing a more reliable light source recovery strategy.
arXiv Detail & Related papers (2023-08-31T04:58:17Z) - Lighting up NeRF via Unsupervised Decomposition and Enhancement [40.89359754872889]
We propose a novel approach, called Low-Light NeRF (or LLNeRF), to enhance the scene representation and synthesize normal-light novel views directly from sRGB low-light images.
Our method can produce novel view images with proper lighting and vivid colors and details, given a collection of camera-finished low dynamic range (8 bits/channel) images from a low-light scene.
arXiv Detail & Related papers (2023-07-20T07:46:34Z) - Enhancing Low-Light Images Using Infrared-Encoded Images [81.8710581927427]
Previous works mainly focus on low-light images captured in the visible spectrum using pixel-wise losses.
We propose a novel approach to increase the visibility of images captured under low-light environments by removing the in-camera infrared (IR) cut-off filter.
arXiv Detail & Related papers (2023-07-09T08:29:19Z) - Diffusion in the Dark: A Diffusion Model for Low-Light Text Recognition [78.50328335703914]
Diffusion in the Dark (DiD) is a diffusion model for low-light image reconstruction for text recognition.
We demonstrate that DiD, without any task-specific optimization, can outperform SOTA low-light methods in low-light text recognition on real images.
arXiv Detail & Related papers (2023-03-07T23:52:51Z) - Seeing Through The Noisy Dark: Toward Real-world Low-Light Image
Enhancement and Denoising [125.56062454927755]
Real-world low-light environments usually suffer from low visibility and heavy noise due to insufficient light or hardware limitations.
We propose a novel end-to-end method termed Real-world Low-light Enhancement & Denoising Network (RLED-Net).
arXiv Detail & Related papers (2022-10-02T14:57:23Z) - Enhancing Low-Light Images in Real World via Cross-Image Disentanglement [58.754943762945864]
We propose a new low-light image enhancement dataset consisting of misaligned training images with real-world corruptions.
Our model achieves state-of-the-art performance on both the newly proposed dataset and other popular low-light datasets.
arXiv Detail & Related papers (2022-01-10T03:12:52Z) - Degrade is Upgrade: Learning Degradation for Low-light Image Enhancement [52.49231695707198]
We investigate the intrinsic degradation and relight the low-light image while refining the details and color in two steps.
Inspired by the color image formulation, we first estimate the degradation from low-light inputs to simulate the distortion of environment illumination color, and then refine the content to recover the loss of diffuse illumination color.
Our proposed method surpasses the SOTA by 0.95 dB in PSNR on the LOL1000 dataset and by 3.18% in mAP on the ExDark dataset.
arXiv Detail & Related papers (2021-03-19T04:00:27Z) - Learning an Adaptive Model for Extreme Low-light Raw Image Processing [5.706764509663774]
We propose an adaptive low-light raw image enhancement network to improve image quality.
The proposed method has the lowest Noise Level Estimation (NLE) score compared with the state-of-the-art low-light algorithms.
The potential application in video processing is briefly discussed.
arXiv Detail & Related papers (2020-04-22T09:01:07Z) - Burst Denoising of Dark Images [19.85860245798819]
We propose a deep learning framework for obtaining clean and colorful RGB images from extremely dark raw images.
The backbone of our framework is a novel coarse-to-fine network architecture that generates high-quality outputs in a progressive manner.
Our experiments demonstrate that the proposed approach leads to perceptually more pleasing results than state-of-the-art methods.
arXiv Detail & Related papers (2020-03-17T17:17:36Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information and is not responsible for any consequences of its use.