The Devil is in the Darkness: Diffusion-Based Nighttime Dehazing Anchored in Brightness Perception
- URL: http://arxiv.org/abs/2506.02395v1
- Date: Tue, 03 Jun 2025 03:21:13 GMT
- Title: The Devil is in the Darkness: Diffusion-Based Nighttime Dehazing Anchored in Brightness Perception
- Authors: Xiaofeng Cong, Yu-Xin Zhang, Haoran Wei, Yeying Jin, Junming Hou, Jie Gui, Jing Zhang, Dacheng Tao
- Abstract summary: We introduce the Diffusion-Based Nighttime Dehazing framework, which excels in both data synthesis and lighting reconstruction. We propose a restoration model that integrates a pre-trained diffusion model guided by a brightness perception network. Experiments validate our dataset's utility and the model's superior performance in joint haze removal and brightness mapping.
- Score: 58.895000127068194
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: While nighttime image dehazing has been extensively studied, converting nighttime hazy images to daytime-equivalent brightness remains largely unaddressed. Existing methods face two critical limitations: (1) datasets overlook the brightness relationship between day and night, resulting in the brightness mapping being inconsistent with the real world during image synthesis; and (2) models do not explicitly incorporate daytime brightness knowledge, limiting their ability to reconstruct realistic lighting. To address these challenges, we introduce the Diffusion-Based Nighttime Dehazing (DiffND) framework, which excels in both data synthesis and lighting reconstruction. Our approach starts with a data synthesis pipeline that simulates severe distortions while enforcing brightness consistency between synthetic and real-world scenes, providing a strong foundation for learning night-to-day brightness mapping. Next, we propose a restoration model that integrates a pre-trained diffusion model guided by a brightness perception network. This design harnesses the diffusion model's generative ability while adapting it to nighttime dehazing through brightness-aware optimization. Experiments validate our dataset's utility and the model's superior performance in joint haze removal and brightness mapping.
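The data synthesis pipeline described above builds on the standard atmospheric scattering model, with the clear daytime image first darkened to match real nighttime brightness. The following is a minimal illustrative sketch of that idea, not the paper's actual pipeline; the function name, the airlight value `A`, and the `brightness_scale` factor are assumptions made for the example.

```python
import numpy as np

def synthesize_nighttime_haze(J, t, A=0.6, brightness_scale=0.35):
    """Render a hazy nighttime image from a clear daytime image J.

    Applies the standard atmospheric scattering model
        I(x) = J(x) * t(x) + A * (1 - t(x)),
    after first scaling J down to mimic nighttime brightness.

    J: clear daytime image, float array in [0, 1], shape (H, W, 3).
    t: transmission map in [0, 1], shape (H, W) or (H, W, 1).
    A: global airlight intensity (assumed value for illustration).
    brightness_scale: day-to-night brightness factor (assumed).
    """
    # Enforce a plausible day-to-night brightness mapping first.
    J_night = np.clip(J * brightness_scale, 0.0, 1.0)
    if t.ndim == 2:
        t = t[..., None]  # broadcast transmission over color channels
    # Standard haze formation: attenuated scene radiance plus airlight.
    I = J_night * t + A * (1.0 - t)
    return np.clip(I, 0.0, 1.0)

# Toy usage: a flat gray "clear" image with uniform medium haze.
J = np.full((4, 4, 3), 0.8)
t = np.full((4, 4), 0.5)
hazy = synthesize_nighttime_haze(J, t)
```

A real pipeline would additionally sample spatially varying transmission and colored, non-uniform light sources; this sketch only shows how brightness scaling composes with the scattering model.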
Related papers
- Seeing Beyond Haze: Generative Nighttime Image Dehazing [16.949777020546264]
BeyondHaze is a generative nighttime dehazing method that infers background information in regions where it may be absent. Our approach is developed on two main ideas: gaining strong background priors by adapting image diffusion models to the nighttime dehazing problem, and enhancing generative ability for haze- and glow-obscured scene areas through guided training. Experiments on real-world images demonstrate that BeyondHaze effectively restores visibility in dense nighttime haze.
arXiv Detail & Related papers (2025-03-11T06:08:25Z)
- Night-to-Day Translation via Illumination Degradation Disentanglement [51.77716565167767]
Night-to-Day translation aims to achieve day-like vision for nighttime scenes.
Processing night images with complex degradations remains a significant challenge under unpaired conditions.
We propose N2D3 to identify different degradation patterns in nighttime images.
arXiv Detail & Related papers (2024-11-21T08:51:32Z)
- Exploring Reliable Matching with Phase Enhancement for Night-time Semantic Segmentation [58.180226179087086]
We propose a novel end-to-end optimized approach, named NightFormer, tailored for night-time semantic segmentation.
Specifically, we design a pixel-level texture enhancement module to acquire texture-aware features hierarchically with phase enhancement and amplified attention.
Our proposed method performs favorably against state-of-the-art night-time semantic segmentation methods.
arXiv Detail & Related papers (2024-08-25T13:59:31Z)
- Sun Off, Lights On: Photorealistic Monocular Nighttime Simulation for Robust Semantic Perception [53.631644875171595]
Nighttime scenes are hard to semantically perceive with learned models and annotate for humans.
Our method, named Sun Off, Lights On (SOLO), is the first to perform nighttime simulation on single images in a photorealistic fashion by operating in 3D.
Not only is the visual quality and photorealism of our nighttime images superior to competing approaches including diffusion models, but our images also prove more beneficial for semantic nighttime segmentation in day-to-night adaptation.
arXiv Detail & Related papers (2024-07-29T18:00:09Z)
- A Semi-supervised Nighttime Dehazing Baseline with Spatial-Frequency Aware and Realistic Brightness Constraint [19.723367790947684]
We propose a semi-supervised model for real-world nighttime dehazing.
First, the spatial attention and frequency spectrum filtering are implemented as a spatial-frequency domain information interaction module.
Second, a pseudo-label-based retraining strategy and a local window-based brightness loss are designed for the semi-supervised training process to suppress haze and glow.
arXiv Detail & Related papers (2024-03-27T13:27:02Z)
- NightHazeFormer: Single Nighttime Haze Removal Using Prior Query Transformer [39.90066556289063]
We propose an end-to-end transformer-based framework for nighttime haze removal, called NightHazeFormer.
Our proposed approach consists of two stages: supervised pre-training and semi-supervised fine-tuning.
Experiments on several synthetic and real-world datasets demonstrate the superiority of our NightHazeFormer over state-of-the-art nighttime haze removal methods.
arXiv Detail & Related papers (2023-05-16T15:26:09Z)
- Regularizing Nighttime Weirdness: Efficient Self-supervised Monocular Depth Estimation in the Dark [20.66405067066299]
We introduce Priors-Based Regularization to learn distribution knowledge from unpaired depth maps.
We also leverage Mapping-Consistent Image Enhancement module to enhance image visibility and contrast.
Our framework achieves remarkable improvements and state-of-the-art results on two nighttime datasets.
arXiv Detail & Related papers (2021-08-09T06:24:35Z)
- Degrade is Upgrade: Learning Degradation for Low-light Image Enhancement [52.49231695707198]
We investigate the intrinsic degradation and relight the low-light image while refining the details and color in two steps.
Inspired by the color image formulation, we first estimate the degradation from low-light inputs to simulate the distortion of environment illumination color, and then refine the content to recover the loss of diffuse illumination color.
Our proposed method surpasses the SOTA by 0.95 dB in PSNR on the LOL1000 dataset and by 3.18% in mAP on the ExDark dataset.
arXiv Detail & Related papers (2021-03-19T04:00:27Z)
- Nighttime Dehazing with a Synthetic Benchmark [147.21955799938115]
We propose a novel synthetic method called 3R to simulate nighttime hazy images from daytime clear images.
We generate realistic nighttime hazy images by sampling real-world light colors from a prior empirical distribution.
Experiment results demonstrate their superiority over state-of-the-art methods in terms of both image quality and runtime.
arXiv Detail & Related papers (2020-08-10T02:16:46Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.