Learning Unpaired Image Dehazing with Physics-based Rehazy Generation
- URL: http://arxiv.org/abs/2506.12824v1
- Date: Sun, 15 Jun 2025 12:12:28 GMT
- Title: Learning Unpaired Image Dehazing with Physics-based Rehazy Generation
- Authors: Haoyou Deng, Zhiqiang Li, Feng Zhang, Qingbo Lu, Zisheng Cao, Yuanjie Shao, Shuhang Gu, Changxin Gao, Nong Sang,
- Abstract summary: Overfitting to synthetic training pairs remains a critical challenge in image dehazing. We propose a novel training strategy for unpaired image dehazing, termed Rehazy, to improve both dehazing performance and training stability.
- Score: 50.37414006427923
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Overfitting to synthetic training pairs remains a critical challenge in image dehazing, leading to poor generalization capability to real-world scenarios. To address this issue, existing approaches utilize unpaired realistic data for training, employing CycleGAN or contrastive learning frameworks. Despite their progress, these methods often suffer from training instability, resulting in limited dehazing performance. In this paper, we propose a novel training strategy for unpaired image dehazing, termed Rehazy, to improve both dehazing performance and training stability. This strategy explores the consistency of the underlying clean images across hazy images and utilizes hazy-rehazy pairs for effective learning of real haze characteristics. To favorably construct hazy-rehazy pairs, we develop a physics-based rehazy generation pipeline, which is theoretically validated to reliably produce high-quality rehazy images. Additionally, leveraging the rehazy strategy, we introduce a dual-branch framework for dehazing network training, where a clean branch provides a basic dehazing capability in a synthetic manner, and a hazy branch enhances the generalization ability with hazy-rehazy pairs. Moreover, we design a new dehazing network within these branches to improve the efficiency, which progressively restores clean scenes from coarse to fine. Extensive experiments on four benchmarks demonstrate the superior performance of our approach, exceeding the previous state-of-the-art methods by 3.58 dB on the SOTS-Indoor dataset and by 1.85 dB on the SOTS-Outdoor dataset in PSNR. Our code will be publicly available.
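The abstract does not spell out the rehazy generation pipeline, but "physics-based" haze synthesis conventionally refers to the atmospheric scattering model, I(x) = J(x)·t(x) + A·(1 − t(x)), where J is the clean scene, t the transmission map, and A the global airlight. The sketch below is a minimal illustration of that standard model, not the paper's actual pipeline; the function name and parameters are hypothetical.

```python
import numpy as np

def rehaze(clean, transmission, airlight):
    """Synthesize a hazy image from a clean one via the standard
    atmospheric scattering model: I(x) = J(x) * t(x) + A * (1 - t(x)).

    clean        -- float array in [0, 1], shape (H, W, C)
    transmission -- per-pixel transmission t in [0, 1], shape (H, W) or (H, W, C)
    airlight     -- global atmospheric light A (scalar or per-channel)
    """
    clean = np.asarray(clean, dtype=np.float64)
    t = np.clip(np.asarray(transmission, dtype=np.float64), 0.0, 1.0)
    if t.ndim == clean.ndim - 1:
        # Broadcast a single-channel transmission map over the color channels.
        t = t[..., None]
    return clean * t + airlight * (1.0 - t)

# Example: a mid-gray image hazed with uniform transmission 0.5 and white airlight.
clean = np.full((4, 4, 3), 0.4)
hazy = rehaze(clean, np.full((4, 4), 0.5), airlight=1.0)
# Each pixel becomes 0.4 * 0.5 + 1.0 * 0.5 = 0.7.
```

In practice the transmission map is depth-dependent (t(x) = exp(−β·d(x)) for scattering coefficient β and depth d), so realistic haze varies spatially rather than being uniform as in this toy example.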
Related papers
- When Schrödinger Bridge Meets Real-World Image Dehazing with Unpaired Training [11.606495142345477]
We propose DehazeSB, a novel unpaired dehazing framework based on the Schrödinger Bridge. By leveraging optimal transport (OT) theory, DehazeSB directly bridges the distributions between hazy and clear images. We introduce detail-preserving regularization, which enforces pixel-level alignment between hazy inputs and dehazed outputs.
arXiv Detail & Related papers (2025-07-13T07:39:44Z) - Learning Hazing to Dehazing: Towards Realistic Haze Generation for Real-World Image Dehazing [59.43187521828543]
We introduce a novel hazing-dehazing pipeline consisting of a Realistic Hazy Image Generation framework (HazeGen) and a Diffusion-based Dehazing framework (DiffDehaze). HazeGen harnesses robust generative diffusion priors of real-world hazy images embedded in a pre-trained text-to-image diffusion model. By employing specialized hybrid training and blended sampling strategies, HazeGen produces realistic and diverse hazy images as high-quality training data for DiffDehaze.
arXiv Detail & Related papers (2025-03-25T01:55:39Z) - DRACO-DehazeNet: An Efficient Image Dehazing Network Combining Detail Recovery and a Novel Contrastive Learning Paradigm [3.649619954898362]
We develop Detail Recovery And Contrastive DehazeNet. It provides efficient and effective dehazing via a dense dilated inverted residual block and an attention-based detail recovery network. A major innovation is its ability to train effectively with limited data, achieved through a novel quadruplet loss-based contrastive dehazing paradigm.
arXiv Detail & Related papers (2024-10-18T16:48:31Z) - HazeCLIP: Towards Language Guided Real-World Image Dehazing [62.4454483961341]
Existing methods have achieved remarkable performance in image dehazing, particularly on synthetic datasets. This paper introduces HazeCLIP, a language-guided adaptation framework designed to enhance the real-world performance of pre-trained dehazing networks.
arXiv Detail & Related papers (2024-07-18T17:18:25Z) - UCL-Dehaze: Towards Real-world Image Dehazing via Unsupervised Contrastive Learning [57.40713083410888]
This paper explores contrastive learning with an adversarial training effort to leverage unpaired real-world hazy and clean images.
We propose an effective unsupervised contrastive learning paradigm for image dehazing, dubbed UCL-Dehaze.
We conduct comprehensive experiments to evaluate our UCL-Dehaze and demonstrate its superiority over state-of-the-art methods.
arXiv Detail & Related papers (2022-05-04T03:25:13Z) - Robust Single Image Dehazing Based on Consistent and Contrast-Assisted Reconstruction [95.5735805072852]
We propose a novel density-variational learning framework to improve the robustness of the image dehazing model.
Specifically, the dehazing network is optimized under the consistency-regularized framework.
Our method significantly surpasses the state-of-the-art approaches.
arXiv Detail & Related papers (2022-03-29T08:11:04Z) - Mutual Learning for Domain Adaptation: Self-distillation Image Dehazing Network with Sample-cycle [7.452382358080454]
We propose a mutual learning dehazing framework for domain adaptation.
Specifically, we first devise two siamese networks: a teacher network in the synthetic domain and a student network in the real domain.
We show that the framework outperforms state-of-the-art dehazing techniques in terms of subjective and objective evaluation.
arXiv Detail & Related papers (2022-03-17T16:32:14Z) - FD-GAN: Generative Adversarial Networks with Fusion-discriminator for Single Image Dehazing [48.65974971543703]
We propose a fully end-to-end Generative Adversarial Network with a Fusion-discriminator (FD-GAN) for image dehazing.
Our model can generate more natural and realistic dehazed images with less color distortion and fewer artifacts.
Experiments have shown that our method reaches state-of-the-art performance on both public synthetic datasets and real-world images.
arXiv Detail & Related papers (2020-01-20T04:36:11Z)
This list is automatically generated from the titles and abstracts of the papers in this site.