From Generation to Suppression: Towards Effective Irregular Glow Removal
for Nighttime Visibility Enhancement
- URL: http://arxiv.org/abs/2307.16783v1
- Date: Mon, 31 Jul 2023 15:51:15 GMT
- Authors: Wanyu Wu, Wei Wang, Zheng Wang, Kui Jiang and Xin Xu
- Abstract summary: Existing Low-Light Image Enhancement (LLIE) methods are primarily designed to improve brightness in dark regions, which suffer from severe degradation in nighttime images.
These methods leave another major source of visibility damage largely unexplored: the glow effects in real night scenes.
We propose a new method for learning physical glow generation via multiple scattering estimation according to the Atmospheric Point Spread Function (APSF).
The proposed method is based on zero-shot learning and does not rely on any paired or unpaired training data. Empirical evaluations demonstrate the effectiveness of the proposed method in both glow suppression and low-light enhancement tasks.
- Score: 22.565044107631696
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Most existing Low-Light Image Enhancement (LLIE) methods are primarily
designed to improve brightness in dark regions, which suffer from severe
degradation in nighttime images. However, these methods leave another major
source of visibility damage largely unexplored: the glow effects in real night
scenes. Glow effects are inevitable in the presence of artificial light sources
and cause further diffused blurring when directly enhanced. To address this
issue, we consider the glow suppression task as learning physical
glow generation via multiple scattering estimation according to the Atmospheric
Point Spread Function (APSF). In response to the challenges posed by uneven
glow intensity and varying source shapes, an APSF-based Nighttime Imaging Model
with Near-field Light Sources (NIM-NLS) is specifically derived to design a
scalable Light-aware Blind Deconvolution Network (LBDN). The glow-suppressed
result is then brightened via a Retinex-based Enhancement Module (REM).
Remarkably, the proposed glow suppression method is based on zero-shot learning
and does not rely on any paired or unpaired training data. Empirical
evaluations demonstrate the effectiveness of the proposed method in both glow
suppression and low-light enhancement tasks.
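The abstract's pipeline (glow formed by convolving the clear scene with an APSF kernel, suppressed by deconvolution, then relit with a Retinex step) can be illustrated with classical stand-ins. Everything below is an illustrative assumption, not the paper's method: the exponential kernel profile is a proxy for the true multiple-scattering APSF, the Wiener filter stands in for the learned Light-aware Blind Deconvolution Network (LBDN), and single-scale Retinex stands in for the Retinex-based Enhancement Module (REM).

```python
import numpy as np

def apsf_like_kernel(size=31, beta=1.2):
    # Isotropic heavy-tailed kernel standing in for the APSF. The true
    # APSF is derived from a multiple-scattering series; this exponential
    # falloff is only an illustrative proxy.
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    k = np.exp(-beta * np.hypot(xx, yy) / (size / 4))
    return k / k.sum()

def _pad_kernel(k, shape):
    # Embed the kernel in a zero array and shift its center to (0, 0)
    # so FFT-based convolution lines up with the image grid.
    kpad = np.zeros(shape)
    kh, kw = k.shape
    kpad[:kh, :kw] = k
    return np.roll(kpad, (-(kh // 2), -(kw // 2)), axis=(0, 1))

def convolve_fft(img, k):
    # Circular convolution: glow = clear scene (*) APSF-like kernel.
    K = np.fft.fft2(_pad_kernel(k, img.shape))
    return np.real(np.fft.ifft2(np.fft.fft2(img) * K))

def wiener_deconvolve(obs, k, eps=1e-3):
    # Non-blind Wiener filter: a classical stand-in for the paper's
    # learned blind deconvolution (here the kernel is assumed known).
    K = np.fft.fft2(_pad_kernel(k, obs.shape))
    H = np.conj(K) / (np.abs(K) ** 2 + eps)
    return np.real(np.fft.ifft2(np.fft.fft2(obs) * H))

def retinex_brighten(img, sigma=8.0):
    # Single-scale Retinex: estimate illumination with a Gaussian blur,
    # then divide it out to relight dark regions.
    size = int(6 * sigma) | 1
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    g = np.exp(-(xx ** 2 + yy ** 2) / (2 * sigma ** 2))
    illum = convolve_fft(img, g / g.sum())
    return np.clip(img / (illum + 1e-3), 0.0, 1.0)

# Toy scene: a dim background with one bright near-field light source.
scene = np.full((64, 64), 0.05)
scene[32, 32] = 1.0

kernel = apsf_like_kernel()
glowed = convolve_fft(scene, kernel)          # glow generation
deglowed = wiener_deconvolve(glowed, kernel)  # glow suppression
relit = retinex_brighten(deglowed)            # low-light enhancement
```

On this toy scene the deconvolved image is closer to the clear scene than the glowed observation, mirroring the generation-then-suppression idea; the real method must additionally estimate the kernel blindly from uneven glow and varied source shapes.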
Related papers
- DAP-LED: Learning Degradation-Aware Priors with CLIP for Joint Low-light Enhancement and Deblurring [14.003870853594972]
We propose a novel transformer-based joint learning framework, named DAP-LED.
It can jointly achieve low-light enhancement and deblurring, benefiting downstream tasks, such as depth estimation, segmentation, and detection in the dark.
The key insight is to leverage CLIP to adaptively learn the degradation levels from images at night.
arXiv Detail & Related papers (2024-09-20T13:37:53Z)
- NDELS: A Novel Approach for Nighttime Dehazing, Low-Light Enhancement, and Light Suppression [4.976703689624386]
This paper introduces a pioneering solution named Nighttime Dehazing, Low-Light Enhancement, and Light Suppression (NDELS).
NDELS utilizes a unique network that combines three essential processes to enhance overall visibility, brighten low-light regions, and suppress glare from bright light sources.
The efficacy of NDELS is rigorously validated through extensive comparisons with eight state-of-the-art algorithms across four diverse datasets.
arXiv Detail & Related papers (2023-12-11T21:38:32Z)
- Improving Lens Flare Removal with General Purpose Pipeline and Multiple Light Sources Recovery [69.71080926778413]
Flare artifacts can degrade image visual quality and downstream computer vision tasks.
Current methods do not consider automatic exposure and tone mapping in the image signal processing (ISP) pipeline.
We propose a solution that improves lens flare removal by revisiting the ISP and designing a more reliable light sources recovery strategy.
arXiv Detail & Related papers (2023-08-31T04:58:17Z)
- Enhancing Visibility in Nighttime Haze Images Using Guided APSF and Gradient Adaptive Convolution [28.685126418090338]
Existing nighttime dehazing methods often struggle with handling glow or low-light conditions.
In this paper, we enhance the visibility from a single nighttime haze image by suppressing glow and enhancing low-light regions.
Our method achieves a PSNR of 30.38 dB, outperforming state-of-the-art methods by 13% on the GTA5 nighttime haze dataset.
arXiv Detail & Related papers (2023-08-03T12:58:23Z)
- Flare7K++: Mixing Synthetic and Real Datasets for Nighttime Flare Removal and Beyond [77.72043833102191]
We introduce the first comprehensive nighttime flare removal dataset, consisting of 962 real-captured flare images (Flare-R) and 7,000 synthetic flares (Flare7K).
Compared to Flare7K, Flare7K++ is particularly effective in eliminating complicated degradation around the light source, which is intractable using synthetic flares alone.
We additionally provide annotations of light sources in Flare7K++ and propose a new end-to-end pipeline that preserves the light source while removing lens flares.
arXiv Detail & Related papers (2023-06-07T08:27:44Z)
- Unsupervised Night Image Enhancement: When Layer Decomposition Meets Light-Effects Suppression [67.7508230688415]
We introduce an unsupervised method that integrates a layer decomposition network and a light-effects suppression network.
Our method outperforms state-of-the-art methods in suppressing night light effects and boosting the intensity of dark regions.
arXiv Detail & Related papers (2022-07-21T16:10:24Z)
- Cycle-Interactive Generative Adversarial Network for Robust Unsupervised Low-Light Enhancement [109.335317310485]
Cycle-Interactive Generative Adversarial Network (CIGAN) is capable of not only better transferring illumination distributions between low/normal-light images but also manipulating detailed signals.
In particular, the proposed low-light guided transformation feed-forwards the features of low-light images from the generator of enhancement GAN into the generator of degradation GAN.
arXiv Detail & Related papers (2022-07-03T06:37:46Z)
- Light Pollution Reduction in Nighttime Photography [32.87477623401456]
Nighttime photographers are often troubled by light pollution from unwanted artificial lights.
In this paper we develop a physically-based light pollution reduction (LPR) algorithm that can substantially alleviate the degradation of perceptual quality.
arXiv Detail & Related papers (2021-06-18T10:38:13Z)
- Degrade is Upgrade: Learning Degradation for Low-light Image Enhancement [52.49231695707198]
We investigate the intrinsic degradation and relight the low-light image while refining the details and color in two steps.
Inspired by the color image formulation, we first estimate the degradation from low-light inputs to simulate the distortion of environment illumination color, and then refine the content to recover the loss of diffuse illumination color.
Our proposed method surpasses the state of the art by 0.95 dB in PSNR on the LOL1000 dataset and by 3.18% in mAP on the ExDark dataset.
arXiv Detail & Related papers (2021-03-19T04:00:27Z)
- Low-light Image Enhancement Using the Cell Vibration Model [12.400040803969501]
Low light often degrades an image's quality and can even cause visual task failures.
We propose a new single low-light image lightness enhancement method.
Experimental results show that the proposed algorithm is superior to nine state-of-the-art methods.
arXiv Detail & Related papers (2020-06-03T13:39:10Z)
- Unsupervised Low-light Image Enhancement with Decoupled Networks [103.74355338972123]
We learn a two-stage GAN-based framework to enhance the real-world low-light images in a fully unsupervised fashion.
Our proposed method outperforms the state-of-the-art unsupervised image enhancement methods in terms of both illumination enhancement and noise reduction.
arXiv Detail & Related papers (2020-05-06T13:37:08Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.