HPPP: Halpern-type Preconditioned Proximal Point Algorithms and Applications to Image Restoration
- URL: http://arxiv.org/abs/2407.13120v2
- Date: Sun, 21 Jul 2024 12:13:03 GMT
- Title: HPPP: Halpern-type Preconditioned Proximal Point Algorithms and Applications to Image Restoration
- Authors: Shuchang Zhang, Hui Zhang, Hongxia Wang
- Abstract summary: Preconditioned Proximal Point (PPP) algorithms provide a unified framework for splitting methods in image restoration.
Degenerate PPP algorithms typically exhibit only weak convergence in infinite-dimensional Hilbert spaces, leading to uncertain solutions.
We propose the Halpern-type Preconditioned Proximal Point (HPPP) algorithm, which leverages the strong convergence of Halpern iteration to achieve a particular solution.
- Score: 7.614347936574962
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Preconditioned Proximal Point (PPP) algorithms provide a unified framework for splitting methods in image restoration. Recent advancements with RED (Regularization by Denoising) and PnP (Plug-and-Play) priors have achieved state-of-the-art performance in this domain, emphasizing the need for a meaningful particular solution. However, degenerate PPP algorithms typically exhibit only weak convergence in infinite-dimensional Hilbert spaces, leading to uncertain solutions. To address this issue, we propose the Halpern-type Preconditioned Proximal Point (HPPP) algorithm, which leverages the strong convergence of Halpern iteration to achieve a particular solution. Based on the implicit regularization defined by gradient RED, we further introduce the Gradient REgularization by Denoising via HPPP (GraRED-HP3) algorithm. A toy example illustrates the regularity of HPPP, namely its convergence to a particular solution. Experiments in image deblurring and inpainting validate the effectiveness of GraRED-HP3, showing that it surpasses classical methods such as Chambolle-Pock (CP), PPP, RED, and RED-PRO.
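The anchored update at the heart of HPPP is easy to prototype. Below is a minimal sketch on a quadratic toy problem, assuming the simplest preconditioner (a scaled identity) so that the preconditioned resolvent reduces to an ordinary proximal map; it is illustrative only, not the authors' implementation or their image-restoration setup.

```python
import numpy as np

# Minimal sketch of a Halpern-type preconditioned proximal point iteration
#   x_{k+1} = lam_k * u + (1 - lam_k) * T(x_k),
# where T is the proximal map of f(x) = 0.5 * ||A x - b||^2 with step tau
# (equivalently, the resolvent under the preconditioner (1/tau) * I).
# This is an illustrative toy, not the paper's algorithm for image restoration.

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 5))      # rank-deficient: many least-squares solutions
b = rng.standard_normal(3)
tau = 0.5

H = A.T @ A + np.eye(5) / tau
def T(x):
    # prox_{tau f}(x) = argmin_z 0.5*||A z - b||^2 + (1/(2*tau))*||z - x||^2
    return np.linalg.solve(H, A.T @ b + x / tau)

u = np.zeros(5)                      # Halpern anchor (also the starting point)
x = u.copy()
for k in range(20000):
    lam = 1.0 / (k + 2)              # lam_k -> 0 and sum lam_k = inf (classical Halpern schedule)
    x = lam * u + (1.0 - lam) * T(x)

# The fixed points of T are exactly the least-squares solutions; the anchored
# iteration should approach the one closest to u, i.e. the minimum-norm
# least-squares solution, which lstsq returns.
x_star, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.linalg.norm(x - x_star))    # distance shrinks toward zero as k grows
```

Because A is rank-deficient, the fixed-point set of T is an affine subspace of least-squares solutions; a plain (non-anchored) proximal point iteration converges to some point of that set depending on the trajectory, whereas the Halpern anchor u singles out a particular solution, which is the behavior the abstract emphasizes.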
Related papers
- A Unified Plug-and-Play Algorithm with Projected Landweber Operator for Split Convex Feasibility Problems [6.185478918618347]
In recent years, Plug-and-Play methods have achieved state-of-the-art performance in inverse imaging problems by replacing proximal operators with denoisers.
Applying these methods with theoretically guaranteed step sizes is difficult, and existing algorithms are limited in the noise they can handle.
A Plug-and-Play algorithm with the Projected Landweber Operator (PnP-PLO) is proposed to address these issues.
arXiv Detail & Related papers (2024-08-22T03:29:51Z) - Qudit inspired optimization for graph coloring [0.0]
We introduce a quantum-inspired algorithm for Graph Coloring Problems (GCPs)
We use qudits in a product state, with each qudit representing a node in the graph and parameterized by d-dimensional spherical coordinates.
We benchmark two optimization strategies, including qudit gradient descent (QdGD), which initializes qudits in random states and employs gradient descent to minimize a cost function.
arXiv Detail & Related papers (2024-06-02T16:19:55Z) - Flattened one-bit stochastic gradient descent: compressed distributed optimization with controlled variance [55.01966743652196]
We propose a novel algorithm for distributed stochastic gradient descent (SGD) with compressed gradient communication in the parameter-server framework.
Our gradient compression technique, named flattened one-bit stochastic gradient descent (FO-SGD), relies on two simple algorithmic ideas.
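For reference, the following is a generic one-bit compression round trip (one sign per coordinate plus a single scale, after a shared random rotation) in a toy parameter-server setting; it only illustrates the general template, not the exact FO-SGD scheme, and the helper names (`compress`, `decompress`, `Q`) are illustrative.

```python
import numpy as np

# Generic illustration of one-bit gradient compression: each worker sends only
# the signs of its (randomly rotated) stochastic gradient plus one scalar scale,
# and the server averages the decompressed gradients.

rng = np.random.default_rng(1)
d = 8

def compress(g, Q):
    """Rotate, then keep one bit per coordinate plus a single scale."""
    r = Q @ g                              # shared random rotation "flattens" the coordinates
    scale = np.mean(np.abs(r))             # one float of side information
    return np.sign(r), scale

def decompress(signs, scale, Q):
    return Q.T @ (scale * signs)           # approximate reconstruction after undoing the rotation

# Fixed random orthogonal matrix shared by workers and server.
Q, _ = np.linalg.qr(rng.standard_normal((d, d)))

g_workers = [rng.standard_normal(d) for _ in range(4)]      # local stochastic gradients
recovered = [decompress(*compress(g, Q), Q) for g in g_workers]
g_avg = np.mean(recovered, axis=0)                           # server-side aggregate
print(np.linalg.norm(g_avg - np.mean(g_workers, axis=0)))    # compression error
```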
arXiv Detail & Related papers (2024-05-17T21:17:27Z) - An Optimization-based Deep Equilibrium Model for Hyperspectral Image
Deconvolution with Convergence Guarantees [71.57324258813675]
We propose a novel methodology for addressing the hyperspectral image deconvolution problem.
A new optimization problem is formulated, leveraging a learnable regularizer in the form of a neural network.
The derived iterative solver is then expressed as a fixed-point calculation problem within the Deep Equilibrium framework.
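To make the fixed-point view concrete, here is a bare-bones Deep-Equilibrium-style forward pass with a stand-in contractive map in place of the paper's learned regularizer: the update is iterated to (approximate) convergence rather than unrolled for a fixed number of steps.

```python
import numpy as np

# Deep Equilibrium sketch: solve z* = f(z*, x) instead of unrolling f a fixed
# number of times.  The map below is a stand-in, not the paper's learned solver.

def f(z, x, W):
    return np.tanh(W @ z + x)              # a simple contractive update (||W|| < 1)

def deq_forward(x, W, tol=1e-8, max_iter=500):
    z = np.zeros_like(x)
    for _ in range(max_iter):
        z_next = f(z, x, W)
        if np.linalg.norm(z_next - z) < tol:
            break
        z = z_next
    return z                               # the equilibrium point z* ~ f(z*, x)

rng = np.random.default_rng(2)
W = 0.4 * rng.standard_normal((6, 6)) / np.sqrt(6)   # scaled so the map is contractive
x = rng.standard_normal(6)
z_star = deq_forward(x, W)
print(np.linalg.norm(z_star - f(z_star, x, W)))      # fixed-point residual ~ 0
```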
arXiv Detail & Related papers (2023-06-10T08:25:16Z) - Poisson-Gaussian Holographic Phase Retrieval with Score-based Image
Prior [19.231581775644617]
We propose a new algorithm called "AWFS" that uses the accelerated Wirtinger flow (AWF) with a score function as generative prior.
We calculate the gradient of the log-likelihood function for PR and determine the Lipschitz constant.
We provide theoretical analysis that establishes a critical-point convergence guarantee for the proposed algorithm.
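For orientation, the sketch below is a plain Wirtinger-flow gradient step for intensity-only measurements under a simple quadratic loss; it only indicates the kind of update that AWF accelerates and AWFS augments with a score-based prior, and is not the paper's Poisson-Gaussian algorithm. The warm start and step size are ad hoc choices for this toy.

```python
import numpy as np

# Toy Wirtinger-flow gradient descent for y_i = |<a_i, x>|^2 under a quadratic
# loss f(z) = (1/(2m)) * sum_i (|a_i^* z|^2 - y_i)^2 (no prior, no acceleration).

rng = np.random.default_rng(3)
n, m = 16, 128
x_true = rng.standard_normal(n) + 1j * rng.standard_normal(n)
A = (rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))) / np.sqrt(2)
y = np.abs(A @ x_true) ** 2

def wf_gradient(z):
    Az = A @ z
    return A.conj().T @ ((np.abs(Az) ** 2 - y) * Az) / m   # Wirtinger gradient of the loss

# Warm start near the truth (a spectral initialization would be used in practice).
z = x_true + 0.1 * (rng.standard_normal(n) + 1j * rng.standard_normal(n))
step = 0.1 / np.mean(y)                                     # heuristic step size
for _ in range(200):
    z = z - step * wf_gradient(z)
print(np.linalg.norm(np.abs(A @ z) ** 2 - y) / np.linalg.norm(y))   # relative data misfit
```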
arXiv Detail & Related papers (2023-05-12T18:08:47Z) - A relaxed proximal gradient descent algorithm for convergent
plug-and-play with proximal denoiser [6.2484576862659065]
This paper presents a new convergent Plug-and-Play (PnP) algorithm built on a relaxed proximal gradient descent scheme with a proximal denoiser.
The algorithm converges for a wider range of regularization parameters, thus allowing more accurate restoration of an image.
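A schematic relaxed Plug-and-Play proximal gradient iteration is sketched below, with soft-thresholding standing in for the learned proximal denoiser; the exact relaxation, the denoiser, and the convergence assumptions in the cited paper differ.

```python
import numpy as np

# Schematic relaxed Plug-and-Play proximal gradient iteration,
#   x_{k+1} = (1 - alpha) * x_k + alpha * D(x_k - gamma * grad_f(x_k)),
# with soft-thresholding standing in for the learned proximal denoiser.

rng = np.random.default_rng(4)
n = 64
A = rng.standard_normal((32, n)) / np.sqrt(32)         # underdetermined forward model
x_true = np.zeros(n)
x_true[rng.choice(n, 5, replace=False)] = 1.0
b = A @ x_true

grad_f = lambda x: A.T @ (A @ x - b)                   # gradient of the data-fidelity term
D = lambda x, t=0.02: np.sign(x) * np.maximum(np.abs(x) - t, 0.0)   # toy "denoiser" (soft-threshold)

gamma = 1.0 / np.linalg.norm(A, 2) ** 2                # safe step size (1 / Lipschitz constant)
alpha = 0.8                                            # relaxation parameter
x = np.zeros(n)
for _ in range(300):
    x = (1 - alpha) * x + alpha * D(x - gamma * grad_f(x))
print(np.linalg.norm(x - x_true))                      # approximate recovery of the sparse signal
```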
arXiv Detail & Related papers (2023-01-31T16:11:47Z) - An Interpretation of Regularization by Denoising and its Application
with the Back-Projected Fidelity Term [55.34375605313277]
We show that the RED gradient can be seen as a (sub)gradient of a prior function--but taken at a denoised version of the point.
We propose to combine RED with the Back-Projection (BP) fidelity term rather than the common Least Squares (LS) term that is used in previous works.
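The toy sketch below contrasts a RED-style gradient step under the usual least-squares (LS) fidelity with one under a back-projection (BP) fidelity, whose gradient uses the pseudo-inverse in place of the transpose; the denoiser is a toy moving average, so this only shows where the two fidelity gradients differ, not the cited method itself.

```python
import numpy as np

# RED-style gradient step with either a least-squares (LS) or a back-projected
# (BP) fidelity gradient.  The "denoiser" is a toy local average, not a learned one.

rng = np.random.default_rng(5)
n = 32
A = rng.standard_normal((24, n)) / np.sqrt(24)
x_true = np.cumsum(rng.standard_normal(n)) * 0.1            # a smooth-ish signal
b = A @ x_true + 0.01 * rng.standard_normal(24)

def denoise(x):
    return np.convolve(x, np.ones(3) / 3.0, mode="same")     # toy smoothing denoiser

A_pinv = np.linalg.pinv(A)
lam, step = 0.5, 0.2

def red_step(x, fidelity="BP"):
    red_grad = x - denoise(x)                                 # RED gradient: the denoiser residual
    if fidelity == "BP":
        fid_grad = A_pinv @ (A @ x - b)                       # back-projected fidelity gradient
    else:
        fid_grad = A.T @ (A @ x - b)                          # least-squares fidelity gradient
    return x - step * (fid_grad + lam * red_grad)

x = np.zeros(n)
for _ in range(200):
    x = red_step(x, fidelity="BP")
print(np.linalg.norm(A @ x - b))                              # data residual after RED-BP iterations
```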
arXiv Detail & Related papers (2021-01-27T18:45:35Z) - Zeroth-Order Hybrid Gradient Descent: Towards A Principled Black-Box
Optimization Framework [100.36569795440889]
This work is on zeroth-order (ZO) optimization, which does not require first-order gradient information.
We show that, with a careful design of coordinate importance sampling, the proposed ZO optimization method is efficient in terms of both complexity and function query cost.
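As a reminder of the basic building block, here is a random-direction finite-difference gradient estimator driving a simple descent loop; the coordinate importance sampling and hybrid design of the cited work are not reproduced, and the objective is a stand-in.

```python
import numpy as np

# Zeroth-order (ZO) gradient estimation via random-direction finite differences,
# used inside a plain gradient-descent loop on a black-box objective.

rng = np.random.default_rng(6)

def zo_gradient(f, x, num_dirs=20, mu=1e-4):
    """Average of (f(x + mu*u) - f(x)) / mu * u over random unit directions u."""
    d = x.size
    g = np.zeros(d)
    fx = f(x)
    for _ in range(num_dirs):
        u = rng.standard_normal(d)
        u /= np.linalg.norm(u)
        g += (f(x + mu * u) - fx) / mu * u
    return g * d / num_dirs                     # scale so the estimate approximates the true gradient

f = lambda x: np.sum((x - 1.0) ** 2)            # black-box objective with known optimum at 1
x = np.zeros(5)
for _ in range(200):
    x -= 0.05 * zo_gradient(f, x)
print(np.linalg.norm(x - 1.0))                  # should approach 0
```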
arXiv Detail & Related papers (2020-12-21T17:29:58Z) - Regularization by Denoising via Fixed-Point Projection (RED-PRO) [34.89374374708481]
Regularization by Denoising (RED) and Plug-and-Play Prior (PnP) are widely used in image processing.
While both have shown state-of-the-art results in various recovery tasks, their theoretical justification is incomplete.
arXiv Detail & Related papers (2020-08-01T09:35:22Z) - Conditional gradient methods for stochastically constrained convex
minimization [54.53786593679331]
We propose two novel conditional gradient-based methods for solving structured convex optimization problems.
The most important feature of our framework is that only a subset of the constraints is processed at each iteration.
Our algorithms rely on variance reduction and smoothing used in conjunction with conditional gradient steps, and are accompanied by rigorous convergence guarantees.
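For context, the sketch below is a plain (deterministic) Frank-Wolfe / conditional gradient loop over an l1 ball; the constraint sub-sampling, variance reduction, and smoothing described above are not shown.

```python
import numpy as np

# Plain Frank-Wolfe (conditional gradient) for min 0.5*||Ax - b||^2 over ||x||_1 <= radius:
# each step calls a linear minimization oracle instead of a projection.

rng = np.random.default_rng(7)
n = 50
A = rng.standard_normal((30, n))
b = rng.standard_normal(30)
radius = 2.0                                    # feasible set: l1 ball of this radius

def lmo_l1(grad, radius):
    """Linear minimization oracle over the l1 ball: a signed, scaled basis vector."""
    i = np.argmax(np.abs(grad))
    s = np.zeros_like(grad)
    s[i] = -radius * np.sign(grad[i])
    return s

x = np.zeros(n)
for k in range(500):
    grad = A.T @ (A @ x - b)                    # gradient of 0.5*||Ax - b||^2
    s = lmo_l1(grad, radius)
    gamma = 2.0 / (k + 2)                       # classical step-size schedule
    x = (1 - gamma) * x + gamma * s
print(0.5 * np.linalg.norm(A @ x - b) ** 2)     # objective value after 500 FW steps
```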
arXiv Detail & Related papers (2020-07-07T21:26:35Z) - The Power of Triply Complementary Priors for Image Compressive Sensing [89.14144796591685]
We propose a joint low-rank and deep (LRD) image model, which contains a pair of triply complementary priors.
We then propose a novel hybrid plug-and-play framework based on the LRD model for image CS.
To make the optimization tractable, a simple yet effective algorithm is proposed to solve the resulting hybrid plug-and-play image CS problem.
arXiv Detail & Related papers (2020-05-16T08:17:44Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information listed here and is not responsible for any consequences.