An Improved Optimal Proximal Gradient Algorithm for Non-Blind Image Deblurring
- URL: http://arxiv.org/abs/2502.07602v1
- Date: Tue, 11 Feb 2025 14:52:11 GMT
- Title: An Improved Optimal Proximal Gradient Algorithm for Non-Blind Image Deblurring
- Authors: Qingsong Wang, Shengze Xu, Xiaojiao Tong, Tieyong Zeng
- Abstract summary: We introduce an improved optimal proximal gradient algorithm (IOptISTA) to efficiently address the non-blind image deblurring problem.
The results indicate that our algorithm yields enhanced PSNR and SSIM values, as well as a reduced tolerance, compared to existing methods.
- Score: 15.645711819668582
- Abstract: Image deblurring remains a central research area within image processing, critical for its role in enhancing image quality and facilitating clearer visual representations across diverse applications. This paper tackles the optimization problem of image deblurring, assuming a known blurring kernel. We introduce an improved optimal proximal gradient algorithm (IOptISTA), which builds upon the optimal gradient method and a weighting matrix, to efficiently address the non-blind image deblurring problem. Based on two regularization cases, namely the $l_1$ norm and total variation norm, we perform numerical experiments to assess the performance of our proposed algorithm. The results indicate that our algorithm yields enhanced PSNR and SSIM values, as well as a reduced tolerance, compared to existing methods.
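IOptISTA builds on the proximal gradient (ISTA) template that the abstract references. The sketch below shows only the baseline ISTA iteration for the $l_1$-regularized case, not the paper's method: the optimal-gradient momentum and weighting matrix of IOptISTA are omitted, and the dense blur matrix and all parameter values are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (element-wise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def ista_deblur(A, b, lam, step, n_iter=200):
    """Baseline proximal gradient (ISTA) for min_x 0.5||Ax - b||^2 + lam*||x||_1.

    A  : known blur operator (a dense matrix here for simplicity)
    b  : observed blurry image, flattened to a vector
    lam: l1 regularization weight; step should be <= 1/L with
         L the largest eigenvalue of A^T A.
    """
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)                      # gradient of the data term
        x = soft_threshold(x - step * grad, step * lam)  # proximal step
    return x
```

For the total variation case in the abstract, the soft-thresholding prox would be replaced by a TV proximal operator, which generally requires an inner solver of its own.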
Related papers
- Comparing Image Segmentation Algorithms [0.0]
We propose an energy function E(x, y) that captures the relationship between the noisy image y and the desired clean image x.
We evaluate the performance of the proposed method against traditional iterative conditional modes.
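The listing does not give the exact form of the energy E(x, y); a common MRF-style choice, shown below as an assumption, combines a quadratic data-fidelity term with a pairwise smoothness penalty over neighbouring pixels.

```python
import numpy as np

def energy(x, y, lam=1.0):
    """Generic MRF-style energy (illustrative; the cited paper's form may differ).

    x: candidate clean image, y: noisy observation (both 2-D arrays).
    E(x, y) = sum (x - y)^2 + lam * sum |x_i - x_j| over 4-neighbours.
    """
    data = np.sum((x - y) ** 2)                       # fidelity to the observation
    smooth = (np.sum(np.abs(np.diff(x, axis=0)))      # vertical neighbour differences
              + np.sum(np.abs(np.diff(x, axis=1))))   # horizontal neighbour differences
    return data + lam * smooth
```

Methods such as iterated conditional modes then minimize this energy over x, one pixel at a time.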
arXiv Detail & Related papers (2025-02-10T06:54:30Z)
- Learning Efficient and Effective Trajectories for Differential Equation-based Image Restoration [59.744840744491945]
We reformulate the trajectory optimization of this kind of method, focusing on enhancing both reconstruction quality and efficiency.
We propose cost-aware trajectory distillation to streamline complex paths into several manageable steps with adaptable sizes.
Experiments showcase the significant superiority of the proposed method, achieving a maximum PSNR improvement of 2.1 dB over state-of-the-art methods.
arXiv Detail & Related papers (2024-10-07T07:46:08Z)
- Optimal Guarantees for Algorithmic Reproducibility and Gradient Complexity in Convex Optimization [55.115992622028685]
Previous work suggests that first-order methods would need to trade off convergence rate (gradient complexity) for better reproducibility.
We demonstrate that both optimal complexity and near-optimal convergence guarantees can be achieved for smooth convex minimization and smooth convex-concave minimax problems.
arXiv Detail & Related papers (2023-10-26T19:56:52Z)
- Poisson-Gaussian Holographic Phase Retrieval with Score-based Image Prior [19.231581775644617]
We propose a new algorithm called "AWFS" that uses the accelerated Wirtinger flow (AWF) with a score function as generative prior.
We calculate the gradient of the log-likelihood function for PR and determine the Lipschitz constant.
We provide theoretical analysis that establishes a critical-point convergence guarantee for the proposed algorithm.
arXiv Detail & Related papers (2023-05-12T18:08:47Z)
- Optimizing CT Scan Geometries With and Without Gradients [7.788823739816626]
We show that gradient-based optimization algorithms are a possible alternative to gradient-free algorithms.
Gradient-based algorithms converge substantially faster while being comparable to gradient-free algorithms in terms of capture range and robustness to the number of free parameters.
arXiv Detail & Related papers (2023-02-13T10:44:41Z)
- Sub-Image Histogram Equalization using Coot Optimization Algorithm for Segmentation and Parameter Selection [0.0]
The mean and variance based sub-image histogram equalization (MVSIHE) algorithm is one of the contrast enhancement methods proposed in the literature.
In this study, we employed one of the most recent optimization algorithms, namely, coot optimization algorithm (COA) for selecting appropriate parameters for the MVSIHE algorithm.
The results show that the proposed method can be used in the field of biomedical image processing.
arXiv Detail & Related papers (2022-05-31T06:51:45Z)
- Fast Multi-grid Methods for Minimizing Curvature Energy [6.882141405929301]
We propose fast multi-grid algorithms for minimizing mean curvature and Gaussian curvature energy functionals.
No artificial parameters are introduced in our formulation, which guarantees the robustness of the proposed algorithm.
Numerical experiments are presented on both image denoising and CT reconstruction problem to demonstrate the ability to recover image texture.
arXiv Detail & Related papers (2022-04-17T04:34:38Z)
- On Measuring and Controlling the Spectral Bias of the Deep Image Prior [63.88575598930554]
The deep image prior has demonstrated the remarkable ability that untrained networks can address inverse imaging problems.
It requires an oracle to determine when to stop the optimization as the performance degrades after reaching a peak.
We study the deep image prior from a spectral bias perspective to address these problems.
arXiv Detail & Related papers (2021-07-02T15:10:42Z)
- Zeroth-Order Hybrid Gradient Descent: Towards A Principled Black-Box Optimization Framework [100.36569795440889]
This work studies zeroth-order (ZO) optimization, which does not require first-order gradient information.
We show that, with a graceful design in coordinate importance sampling, the proposed ZO optimization method is efficient both in terms of complexity and function query cost.
arXiv Detail & Related papers (2020-12-21T17:29:58Z)
- Learned Block Iterative Shrinkage Thresholding Algorithm for Photothermal Super Resolution Imaging [52.42007686600479]
We propose a learned block-sparse optimization approach using an iterative algorithm unfolded into a deep neural network.
We show the benefits of using a learned block iterative shrinkage thresholding algorithm that is able to learn the choice of regularization parameters.
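The idea of unfolding an iterative shrinkage-thresholding algorithm into network layers can be sketched as below. This is a generic LISTA-style forward pass, not the cited paper's block-sparse architecture: the matrices `W_e`, `S` and the per-layer thresholds `thetas` stand in for learned weights, and in the test they are initialized from a known operator so the pass reduces to classical ISTA.

```python
import numpy as np

def soft_threshold(v, theta):
    """Element-wise soft-thresholding with threshold theta."""
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def lista_forward(b, W_e, S, thetas):
    """Forward pass of a LISTA-style unrolled network (illustrative sketch).

    Layer k applies x <- soft_threshold(W_e @ b + S @ x, thetas[k]).
    In a learned setting W_e, S and thetas are trained parameters;
    classical ISTA fixes them from the measurement operator instead.
    """
    x = soft_threshold(W_e @ b, thetas[0])       # first layer starts from x = 0
    for theta in thetas[1:]:
        x = soft_threshold(W_e @ b + S @ x, theta)
    return x
```

Learning the thresholds layer by layer is what replaces the hand-tuned regularization parameter of the classical algorithm.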
arXiv Detail & Related papers (2020-12-07T09:27:16Z)
- Towards Better Understanding of Adaptive Gradient Algorithms in Generative Adversarial Nets [71.05306664267832]
Adaptive algorithms perform gradient updates using the history of gradients and are ubiquitous in training deep neural networks.
In this paper, we analyze a variant of the Optimistic Adagrad (OAdagrad) algorithm for nonconvex min-max problems.
Our experiments show that the faster convergence of adaptive gradient algorithms over non-adaptive ones can be observed empirically in GAN training.
arXiv Detail & Related papers (2019-12-26T22:10:10Z)
This list is automatically generated from the titles and abstracts of the papers in this site.