Unsupervised Night Image Enhancement: When Layer Decomposition Meets Light-Effects Suppression
- URL: http://arxiv.org/abs/2207.10564v1
- Date: Thu, 21 Jul 2022 16:10:24 GMT
- Title: Unsupervised Night Image Enhancement: When Layer Decomposition Meets Light-Effects Suppression
- Authors: Yeying Jin, Wenhan Yang and Robby T. Tan
- Abstract summary: We introduce an unsupervised method that integrates a layer decomposition network and a light-effects suppression network.
Our method outperforms state-of-the-art methods in suppressing night light effects and boosting the intensity of dark regions.
- Score: 67.7508230688415
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Night images suffer not only from low light, but also from uneven
distributions of light. Most existing night visibility enhancement methods
focus mainly on enhancing low-light regions. This inevitably leads to
over-enhancement and saturation in bright regions, such as those affected by
light effects (glare, floodlight, etc.). To address this problem, we need to
suppress the light effects in bright regions while, at the same time, boosting
the intensity of dark regions. With this idea in mind, we introduce an
unsupervised method that integrates a layer decomposition network and a
light-effects suppression network. Given a single night image as input, our
decomposition network learns to decompose it into shading, reflectance, and
light-effects layers, guided by unsupervised layer-specific prior losses. Our
light-effects suppression network further suppresses the light effects and, at
the same time, enhances the illumination in dark regions. This light-effects
suppression network exploits the estimated light-effects layer as the guidance
to focus on the light-effects regions. To recover the background details and
reduce hallucination/artefacts, we propose structure and high-frequency
consistency losses. Our quantitative and qualitative evaluations on real images
show that our method outperforms state-of-the-art methods in suppressing night
light effects and boosting the intensity of dark regions.
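Below is a minimal PyTorch-style sketch of how such a decomposition and its losses might be wired up. It assumes the three layers recombine as I ≈ R ⊙ S + G (reflectance times shading plus an additive light-effects layer), which is an illustrative image model rather than necessarily the paper's exact formulation; the smoothness prior and structure-consistency term are likewise stand-ins for the paper's layer-specific prior and consistency losses.

```python
import torch
import torch.nn.functional as F

def reconstruction_loss(img, reflectance, shading, light_effects):
    """Assumed image model: I ~= R * S + G (element-wise)."""
    recon = reflectance * shading + light_effects
    return F.l1_loss(recon, img)

def gradients(x):
    """Finite-difference image gradients (horizontal, vertical)."""
    dx = x[..., :, 1:] - x[..., :, :-1]
    dy = x[..., 1:, :] - x[..., :-1, :]
    return dx, dy

def smoothness_prior(shading):
    """Illustrative layer-specific prior: shading should vary smoothly."""
    dx, dy = gradients(shading)
    return dx.abs().mean() + dy.abs().mean()

def structure_consistency_loss(output, img):
    """Illustrative structure consistency: keep the edges (gradients) of the
    enhanced output aligned with those of the input, which helps recover
    background details and limit hallucinated content."""
    ox, oy = gradients(output)
    ix, iy = gradients(img)
    return F.l1_loss(ox, ix) + F.l1_loss(oy, iy)

# Hypothetical usage with stand-in networks `decomp_net` and `suppress_net`:
#   R, S, G = decomp_net(night_img)        # layer decomposition
#   enhanced = suppress_net(night_img, G)  # estimated light-effects layer G guides suppression
#   loss = (reconstruction_loss(night_img, R, S, G)
#           + smoothness_prior(S)
#           + structure_consistency_loss(enhanced, night_img))
```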
Related papers
- Beyond Night Visibility: Adaptive Multi-Scale Fusion of Infrared and Visible Images [49.75771095302775]
We propose an Adaptive Multi-scale Fusion network (AMFusion) that fuses infrared and visible images.
First, we separately fuse spatial and semantic features from the infrared and visible images, where the former are used to adjust the light distribution.
Second, we utilize detection features extracted by a pre-trained backbone to guide the fusion of semantic features.
Third, we propose a new illumination loss to constrain the fused image to have normal light intensity.
arXiv Detail & Related papers (2024-03-02T03:52:07Z)
- NDELS: A Novel Approach for Nighttime Dehazing, Low-Light Enhancement, and Light Suppression [4.976703689624386]
This paper introduces a pioneering solution named Nighttime Dehazing, Low-Light Enhancement, and Light Suppression (NDELS).
NDELS utilizes a unique network that combines three essential processes: enhancing overall visibility, brightening low-light regions, and effectively suppressing glare from bright light sources.
The efficacy of NDELS is rigorously validated through extensive comparisons with eight state-of-the-art algorithms across four diverse datasets.
arXiv Detail & Related papers (2023-12-11T21:38:32Z)
- Enhancing Visibility in Nighttime Haze Images Using Guided APSF and Gradient Adaptive Convolution [28.685126418090338]
Existing nighttime dehazing methods often struggle with handling glow or low-light conditions.
In this paper, we enhance the visibility from a single nighttime haze image by suppressing glow and enhancing low-light regions.
Our method achieves a PSNR of 30.38 dB, outperforming state-of-the-art methods by 13% on the GTA5 nighttime haze dataset.
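For reference, PSNR values such as the 30.38 dB quoted above are the standard 10·log10(MAX²/MSE) metric; a minimal sketch, assuming images normalized to [0, 1]:

```python
import torch

def psnr(pred: torch.Tensor, target: torch.Tensor, max_val: float = 1.0) -> torch.Tensor:
    """Peak signal-to-noise ratio in dB: 10 * log10(MAX^2 / MSE)."""
    mse = torch.mean((pred - target) ** 2)
    return 10.0 * torch.log10(max_val ** 2 / mse)
```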
arXiv Detail & Related papers (2023-08-03T12:58:23Z)
- From Generation to Suppression: Towards Effective Irregular Glow Removal for Nighttime Visibility Enhancement [22.565044107631696]
Existing Low-Light Image Enhancement (LLIE) methods are primarily designed to improve brightness in dark regions, which suffer from severe degradation in nighttime images.
These methods leave another major source of visibility degradation largely unexplored: the glow effects in real night scenes.
We propose a new method for learning physical glow generation via multiple scattering estimation according to the Atmospheric Point Spread Function (APSF).
The proposed method is based on zero-shot learning and does not rely on any paired or unpaired training data. Empirical evaluations demonstrate the effectiveness of the proposed method in both glow suppression and low-light enhancement tasks.
arXiv Detail & Related papers (2023-07-31T15:51:15Z)
- Seeing Through The Noisy Dark: Toward Real-world Low-Light Image Enhancement and Denoising [125.56062454927755]
Real-world low-light environments usually suffer from low visibility and heavy noise due to insufficient light or hardware limitations.
We propose a novel end-to-end method termed Real-world Low-light Enhancement & Denoising Network (RLED-Net).
arXiv Detail & Related papers (2022-10-02T14:57:23Z)
- When the Sun Goes Down: Repairing Photometric Losses for All-Day Depth Estimation [47.617222712429026]
We show how to use a combination of three techniques to allow the existing photometric losses to work for both day and nighttime images.
First, we introduce a per-pixel neural intensity transformation to compensate for the light changes that occur between successive frames.
Second, we predict a per-pixel residual flow map that we use to correct the reprojection correspondences induced by the estimated ego-motion and depth.
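As a rough illustration of those two ingredients, here is a minimal sketch; the affine per-pixel intensity transform and the tensor layout are assumptions for illustration, not the paper's exact parameterization.

```python
import torch
import torch.nn.functional as F

def warp_with_residual_flow(src_img, reproj_grid, residual_flow):
    """Correct the reprojection correspondences with a predicted per-pixel
    residual flow, then sample the source image.
    src_img: (B, C, H, W); reproj_grid, residual_flow: (B, H, W, 2) in
    normalized [-1, 1] coordinates."""
    corrected = reproj_grid + residual_flow
    return F.grid_sample(src_img, corrected, align_corners=True)

def intensity_transform(img, gain, bias):
    """Per-pixel affine intensity change (one plausible form) to compensate
    lighting differences between successive frames."""
    return (gain * img + bias).clamp(0.0, 1.0)

def photometric_loss(target_img, warped_img, gain, bias):
    """L1 photometric loss after compensating the light change."""
    return F.l1_loss(intensity_transform(warped_img, gain, bias), target_img)
```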
arXiv Detail & Related papers (2022-06-28T09:29:55Z)
- Low-light Image Enhancement via Breaking Down the Darkness [8.707025631892202]
This paper presents a novel framework inspired by the divide-and-rule principle.
We propose to convert an image from the RGB space into a luminance-chrominance one.
An adjustable noise suppression network is designed to eliminate noise in the brightened luminance.
The enhanced luminance further serves as guidance for the chrominance mapper to generate realistic colors.
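One standard choice for such a luminance-chrominance split is the BT.601 YCbCr transform, sketched below as an illustration (the paper's actual color space may differ):

```python
import torch

def rgb_to_ycbcr(rgb: torch.Tensor) -> torch.Tensor:
    """Convert an (..., 3, H, W) RGB tensor in [0, 1] to full-range YCbCr (BT.601)."""
    r, g, b = rgb.unbind(dim=-3)
    y  = 0.299 * r + 0.587 * g + 0.114 * b
    cb = -0.168736 * r - 0.331264 * g + 0.5 * b + 0.5
    cr = 0.5 * r - 0.418688 * g - 0.081312 * b + 0.5
    return torch.stack((y, cb, cr), dim=-3)
```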
arXiv Detail & Related papers (2021-11-30T16:50:59Z)
- Light Pollution Reduction in Nighttime Photography [32.87477623401456]
Nighttime photographers are often troubled by light pollution from unwanted artificial lights.
In this paper, we develop a physically-based light pollution reduction (LPR) algorithm that substantially alleviates the degradation of perceptual quality.
arXiv Detail & Related papers (2021-06-18T10:38:13Z)
- Deep Bilateral Retinex for Low-Light Image Enhancement [96.15991198417552]
Low-light images suffer from poor visibility caused by low contrast, color distortion and measurement noise.
This paper proposes a deep learning method for low-light image enhancement with a particular focus on handling the measurement noise.
The proposed method is highly competitive with state-of-the-art methods and has a significant advantage over them when processing images captured in extremely low lighting conditions.
arXiv Detail & Related papers (2020-07-04T06:26:44Z)
- Unsupervised Low-light Image Enhancement with Decoupled Networks [103.74355338972123]
We learn a two-stage GAN-based framework to enhance real-world low-light images in a fully unsupervised fashion.
Our proposed method outperforms the state-of-the-art unsupervised image enhancement methods in terms of both illumination enhancement and noise reduction.
arXiv Detail & Related papers (2020-05-06T13:37:08Z)