A PDE-Based Image Dehazing Method via Atmospheric Scattering Theory
- URL: http://arxiv.org/abs/2506.08793v2
- Date: Sun, 12 Oct 2025 13:26:29 GMT
- Title: A PDE-Based Image Dehazing Method via Atmospheric Scattering Theory
- Authors: Liubing Hu, Pu Wang, Guangwei Gao, Chunyan Wang, Zhuoran Zheng
- Abstract summary: We introduce a novel partial differential equation (PDE) framework for single-image dehazing. A key innovation is an adaptive regularization mechanism guided by the dark channel prior. Experiments confirm our method achieves effective haze removal while preserving high image fidelity.
- Score: 21.305574997938685
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This paper introduces a novel partial differential equation (PDE) framework for single-image dehazing. We embed the atmospheric scattering model into a PDE featuring edge-preserving diffusion and a nonlocal operator to maintain both local details and global structures. A key innovation is an adaptive regularization mechanism guided by the dark channel prior, which adjusts smoothing strength based on haze density. The framework's mathematical well-posedness is rigorously established by proving the existence and uniqueness of its weak solution in $H_0^1(\Omega)$. An efficient, GPU-accelerated fixed-point solver is used for implementation. Experiments confirm our method achieves effective haze removal while preserving high image fidelity, offering a principled alternative to purely data-driven techniques.
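The atmospheric scattering model the abstract embeds into its PDE, $I = J \cdot t + A(1 - t)$, and the dark channel prior that guides its regularization can be illustrated with a minimal NumPy sketch. This follows the classic dark-channel dehazing recipe, not the paper's PDE solver; the patch size and the `omega` and `t0` parameters are illustrative assumptions.

```python
import numpy as np

def dark_channel(img, patch=15):
    """Dark channel prior: per-pixel min over RGB, then min over a local patch.

    `img` is an H x W x 3 float array in [0, 1]. In haze-free regions the
    dark channel is close to zero; bright values indicate dense haze.
    """
    min_rgb = img.min(axis=2)
    h, w = min_rgb.shape
    pad = patch // 2
    padded = np.pad(min_rgb, pad, mode="edge")
    out = np.empty_like(min_rgb)
    for i in range(h):
        for j in range(w):
            out[i, j] = padded[i:i + patch, j:j + patch].min()
    return out

def dehaze(img, omega=0.95, t0=0.1, patch=15):
    """Invert the atmospheric scattering model I = J * t + A * (1 - t)."""
    dc = dark_channel(img, patch)
    # Estimate the airlight A from the brightest dark-channel pixels.
    flat = dc.ravel()
    idx = flat.argsort()[-max(1, flat.size // 1000):]
    A = img.reshape(-1, 3)[idx].max(axis=0)
    # Transmission estimate t(x) = 1 - omega * dark_channel(I / A).
    t = 1.0 - omega * dark_channel(img / A, patch)
    t = np.maximum(t, t0)  # floor t so the division stays stable in dense haze
    J = (img - A) / t[..., None] + A
    return np.clip(J, 0.0, 1.0)
```

Here `omega` deliberately keeps a trace of haze for natural-looking results, and `t0` prevents the recovered radiance from blowing up where the estimated transmission is near zero.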
Related papers
- Diffusion Models for Solving Inverse Problems via Posterior Sampling with Piecewise Guidance [52.705112811734566]
A novel diffusion-based framework is introduced for solving inverse problems using a piecewise guidance scheme. The proposed method is problem-agnostic and readily adaptable to a variety of inverse problems. The framework achieves a reduction in inference time of 25% for inpainting with both random and center masks, and 23% and 24% for $4\times$ and $8\times$ super-resolution tasks.
arXiv Detail & Related papers (2025-07-22T19:35:14Z) - Beyond Blur: A Fluid Perspective on Generative Diffusion Models [0.6999740786886535]
We propose a novel PDE-driven corruption process for generative image synthesis based on advection-diffusion processes. This work bridges fluid dynamics, dimensionless PDE theory, and deep generative modeling, offering a fresh perspective on physically informed image corruption processes.
arXiv Detail & Related papers (2025-06-20T08:31:30Z) - VideoPDE: Unified Generative PDE Solving via Video Inpainting Diffusion Models [8.189440319895823]
We present a unified framework for solving partial differential equations (PDEs) using video-inpainting diffusion transformer models. Our method uses pixel-space video diffusion models for fine-grained, high-fidelity inpainting and conditioning. It offers an accurate and versatile solution across a wide range of PDEs and problem setups, outperforming state-of-the-art baselines.
arXiv Detail & Related papers (2025-06-16T17:58:00Z) - Physics-Informed Distillation of Diffusion Models for PDE-Constrained Generation [19.734778762515468]
Diffusion models have gained increasing attention in the modeling of physical systems, particularly those governed by partial differential equations (PDEs). We propose a simple yet effective post-hoc distillation approach, where PDE constraints are not injected directly into the diffusion process, but instead enforced during a post-hoc distillation stage.
arXiv Detail & Related papers (2025-05-28T14:17:58Z) - Forward-only Diffusion Probabilistic Models [23.247850964456177]
This work presents a forward-only diffusion (FoD) approach for generative modelling. FoD directly learns data generation through a single forward diffusion process, yielding a simple yet efficient generative framework. Our code is available at https://github.com/Algolzw/FoD.
arXiv Detail & Related papers (2025-05-22T14:47:07Z) - Outsourced diffusion sampling: Efficient posterior inference in latent spaces of generative models [65.71506381302815]
We propose to amortize the cost of sampling from a posterior distribution of the form $p(\mathbf{x} \mid \mathbf{y}) \propto p_\theta(\mathbf{x})$. For many models and constraints, the posterior in noise space is smoother than in data space, making it more suitable for amortized inference.
arXiv Detail & Related papers (2025-02-10T19:49:54Z) - Variable Selection in Convex Piecewise Linear Regression [5.366354612549172]
This paper presents Sparse Gradient as a solution for variable selection in convex piecewise linear regression.
A non-asymptotic local convergence analysis is provided for SpGD under subGaussian noise.
arXiv Detail & Related papers (2024-11-04T16:19:09Z) - Conditioning diffusion models by explicit forward-backward bridging [18.358369507787742]
Given an unconditional diffusion model targeting a joint model $\pi(x, y)$, using it to perform conditional simulation $\pi(x \mid y)$ is still largely an open question. We express exact conditional simulation within the approximate diffusion model as an inference problem on an augmented space corresponding to a partial SDE bridge.
arXiv Detail & Related papers (2024-05-22T16:17:03Z) - Convergence analysis of kernel learning FBSDE filter [0.8528368686417979]
Kernel learning forward backward SDE filter is an iterative and adaptive meshfree approach to solve the nonlinear filtering problem.
It builds from the forward backward SDE for the Fokker-Planck equation, which defines the evolving density of the state variable, and employs kernel learning to approximate this density.
arXiv Detail & Related papers (2024-05-22T07:02:35Z) - Distilling Diffusion Models into Conditional GANs [90.76040478677609]
We distill a complex multistep diffusion model into a single-step conditional GAN student model.
For efficient regression loss, we propose E-LatentLPIPS, a perceptual loss operating directly in diffusion model's latent space.
We demonstrate that our one-step generator outperforms cutting-edge one-step diffusion distillation models.
arXiv Detail & Related papers (2024-05-09T17:59:40Z) - BlindDiff: Empowering Degradation Modelling in Diffusion Models for Blind Image Super-Resolution [52.47005445345593]
BlindDiff is a DM-based blind SR method to tackle the blind degradation settings in SISR.
BlindDiff seamlessly integrates the MAP-based optimization into DMs.
Experiments on both synthetic and real-world datasets show that BlindDiff achieves the state-of-the-art performance.
arXiv Detail & Related papers (2024-03-15T11:21:34Z) - Minimax Optimality of Score-based Diffusion Models: Beyond the Density Lower Bound Assumptions [11.222970035173372]
We show that a kernel-based score estimator achieves an optimal mean square error of $\widetilde{O}\left(n^{-1} t^{-\frac{d+2}{2}}(t^{\frac{d}{2}} \vee 1)\right)$, which yields a $\widetilde{O}\left(n^{-1/2} t^{-\frac{d}{4}}\right)$ upper bound for the total variation error of the distribution of the sample generated by the diffusion model under a mere sub-Gaussian assumption.
arXiv Detail & Related papers (2024-02-23T20:51:31Z) - Gaussian Mixture Solvers for Diffusion Models [84.83349474361204]
We introduce a novel class of SDE-based solvers called GMS for diffusion models.
Our solver outperforms numerous SDE-based solvers in terms of sample quality in image generation and stroke-based synthesis.
arXiv Detail & Related papers (2023-11-02T02:05:38Z) - Hierarchical Integration Diffusion Model for Realistic Image Deblurring [71.76410266003917]
Diffusion models (DMs) have been introduced in image deblurring and exhibited promising performance.
We propose the Hierarchical Integration Diffusion Model (HI-Diff), for realistic image deblurring.
Experiments on synthetic and real-world blur datasets demonstrate that our HI-Diff outperforms state-of-the-art methods.
arXiv Detail & Related papers (2023-05-22T12:18:20Z) - A Variational Perspective on Solving Inverse Problems with Diffusion Models [101.831766524264]
Inverse tasks can be formulated as inferring a posterior distribution over data.
This is however challenging in diffusion models since the nonlinear and iterative nature of the diffusion process renders the posterior intractable.
We propose a variational approach that by design seeks to approximate the true posterior distribution.
arXiv Detail & Related papers (2023-05-07T23:00:47Z) - Efficient Sampling of Stochastic Differential Equations with Positive Semi-Definite Models [91.22420505636006]
This paper deals with the problem of efficient sampling from a differential equation, given the drift function and the diffusion matrix.
It is possible to obtain independent and identically distributed (i.i.d.) samples at precision $\varepsilon$ with a cost that is $m^2 d \log(1/\varepsilon)$.
Our results suggest that as the true solution gets smoother, we can circumvent the curse of dimensionality without requiring any sort of convexity.
arXiv Detail & Related papers (2023-03-30T02:50:49Z) - Improved Langevin Monte Carlo for stochastic optimization via landscape modification [0.0]
Given a target function $H$ to minimize or a target Gibbs distribution $\pi_\beta \propto e^{-\beta H}$ to sample from at low temperature, in this paper we propose and analyze Langevin Monte Carlo (LMC) algorithms that run on an alternative landscape.
We show that the energy barrier of the transformed landscape is reduced, which consequently leads to improved dependence on both $\beta$ and $M$ in the modified log-Sobolev constant associated with $\pi^f_{\beta,c,1}$.
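For context, unadjusted Langevin Monte Carlo targeting a Gibbs distribution $\pi_\beta \propto e^{-\beta H}$ can be sketched in a few lines. This is the vanilla algorithm on the original landscape, not the paper's modified-landscape variant; the step size and iteration count are illustrative.

```python
import numpy as np

def lmc(grad_H, x0, beta=1.0, step=0.1, n_steps=20000, seed=None):
    """Unadjusted Langevin Monte Carlo targeting pi_beta ∝ exp(-beta * H).

    Iterates x_{k+1} = x_k - step * beta * grad_H(x_k)
                       + sqrt(2 * step) * xi_k,   with xi_k ~ N(0, I).
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float).copy()
    samples = np.empty((n_steps,) + x.shape)
    for k in range(n_steps):
        noise = rng.standard_normal(x.shape)
        x = x - step * beta * grad_H(x) + np.sqrt(2.0 * step) * noise
        samples[k] = x
    return samples
```

For the quadratic $H(x) = \|x\|^2/2$ the target is a standard Gaussian, so the empirical mean and variance of the chain should approach $0$ and $1/\beta$ (up to the discretization bias of the unadjusted scheme).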
arXiv Detail & Related papers (2023-02-08T10:08:37Z) - Optimal Gradient Sliding and its Application to Distributed Optimization Under Similarity [121.83085611327654]
We study structured convex optimization problems with additive objective $r := p + q$, where $r$ is $\mu$-strongly convex, under a similarity assumption.
We propose a method that solves the problem via master-to-agents communication rounds and local computations.
The proposed method achieves a sharper rate than the $\mathcal{O}(\sqrt{L_q/\mu})$ bound of existing methods.
arXiv Detail & Related papers (2022-05-30T14:28:02Z) - A first-order primal-dual method with adaptivity to local smoothness [64.62056765216386]
We consider the problem of finding a saddle point for the convex-concave objective $\min_x \max_y f(x) + \langle Ax, y\rangle - g^*(y)$, where $f$ is a convex function with locally Lipschitz gradient and $g$ is convex and possibly non-smooth.
We propose an adaptive version of the Condat-Vu algorithm, which alternates between primal gradient steps and dual steps.
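A minimal, non-adaptive Condat-Vu iteration for this saddle problem (fixed step sizes, not the paper's adaptive rule) might look as follows. The instance $f(x) = \tfrac{1}{2}\|x - b\|^2$ with $g = \lambda\|\cdot\|_1$ composed with $A$ is an illustrative choice, not taken from the paper.

```python
import numpy as np

def condat_vu(b, A, lam=1.0, n_iter=2000):
    """Fixed-step Condat-Vu iteration for the saddle problem
    min_x max_y f(x) + <Ax, y> - g*(y), with f(x) = 0.5*||x - b||^2
    and g* the indicator of {||y||_inf <= lam}, i.e. the primal
    problem min_x 0.5*||x - b||^2 + lam*||Ax||_1.
    """
    L = np.linalg.norm(A, 2)    # operator norm ||A||
    # Step sizes with 1/tau - sigma*||A||^2 = 1.5*L, which exceeds
    # L_f/2 = 1/2 whenever L >= 1/3 (f has 1-Lipschitz gradient here).
    tau = sigma = 0.5 / L
    x = np.zeros_like(b)
    y = np.zeros(A.shape[0])
    for _ in range(n_iter):
        # primal step: gradient of f plus the linear coupling term
        x_new = x - tau * ((x - b) + A.T @ y)
        # dual step: prox of g* is projection onto the inf-norm ball
        y = np.clip(y + sigma * (A @ (2 * x_new - x)), -lam, lam)
        x = x_new
    return x
```

With `A = np.eye(n)` the primal problem reduces to soft-thresholding of `b` at level `lam`, which makes for an easy sanity check of the iteration.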
arXiv Detail & Related papers (2021-10-28T14:19:30Z) - On the Self-Penalization Phenomenon in Feature Selection [69.16452769334367]
We describe an implicit sparsity-inducing mechanism based on minimization over a family of kernels. As an application, we use this sparsity-inducing mechanism to build algorithms that are consistent for feature selection.
arXiv Detail & Related papers (2021-10-12T09:36:41Z) - Asymptotic Theory of $\ell_1$-Regularized PDE Identification from a Single Noisy Trajectory [2.0299248281970956]
We prove the support recovery for a general class of linear and nonlinear evolutionary partial differential equation (PDE) identification from a single noisy trajectory.
We provide a set of sufficient conditions which guarantee that, from a single trajectory denoised by a Local-Polynomial filter, the support of $\mathbf{c}(\lambda)$ asymptotically converges to the true signed-support associated with the underlying PDE.
arXiv Detail & Related papers (2021-03-12T02:23:04Z) - Linear Time Sinkhorn Divergences using Positive Features [51.50788603386766]
Solving optimal transport with an entropic regularization requires computing a $n \times n$ kernel matrix that is repeatedly applied to a vector.
We propose to use instead ground costs of the form $c(x,y) = -\log\langle \varphi(x), \varphi(y)\rangle$ where $\varphi$ is a map from the ground space onto the positive orthant $\mathbb{R}^r_+$, with $r \ll n$.
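With such a cost, the Gibbs kernel factors exactly as $K = \Phi_x \Phi_y^\top$, so each Sinkhorn iteration needs only matrix-vector products with the $n \times r$ feature matrices. A minimal NumPy sketch under that assumption (with illustrative feature maps, not the paper's construction):

```python
import numpy as np

def sinkhorn_lowrank(phi_x, phi_y, a, b, n_iter=500):
    """Sinkhorn scaling with a factored Gibbs kernel K = phi_x @ phi_y.T.

    Choosing c(x, y) = -log <phi(x), phi(y)> makes exp(-c) factor exactly,
    so K @ v and K.T @ u each cost O(n * r) instead of O(n^2).
    """
    u = np.ones_like(a)
    v = np.ones_like(b)
    for _ in range(n_iter):
        u = a / (phi_x @ (phi_y.T @ v))   # row scaling: K @ v in O(n r)
        v = b / (phi_y @ (phi_x.T @ u))   # column scaling: K.T @ u in O(n r)
    return u, v
```

The transport plan is `P = u[:, None] * (phi_x @ phi_y.T) * v[None, :]`; after convergence its row and column marginals match `a` and `b`.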
arXiv Detail & Related papers (2020-06-12T10:21:40Z) - Spectral density estimation with the Gaussian Integral Transform [91.3755431537592]
The spectral density operator $\hat{\rho}(\omega) = \delta(\omega - \hat{H})$ plays a central role in linear response theory.
We describe a near optimal quantum algorithm providing an approximation to the spectral density.
arXiv Detail & Related papers (2020-04-10T03:14:38Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.