Fast Multi-grid Methods for Minimizing Curvature Energy
- URL: http://arxiv.org/abs/2204.07921v1
- Date: Sun, 17 Apr 2022 04:34:38 GMT
- Title: Fast Multi-grid Methods for Minimizing Curvature Energy
- Authors: Zhenwei Zhang and Ke Chen and Yuping Duan
- Abstract summary: We propose fast multi-grid algorithms for minimizing mean curvature and Gaussian curvature energy functionals.
No artificial parameters are introduced in our formulation, which guarantees the robustness of the proposed algorithm.
Numerical experiments are presented on both image denoising and CT reconstruction problems to demonstrate the ability to recover image texture.
- Score: 6.882141405929301
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Geometric high-order regularization methods, such as mean curvature
and Gaussian curvature, have been intensively studied over the last decades due
to their ability to preserve geometric properties including image edges,
corners, and image contrast. However, the dilemma between restoration quality
and computational efficiency is an essential roadblock for high-order methods.
In this paper, we propose fast multi-grid algorithms for minimizing both mean
curvature and Gaussian curvature energy functionals without sacrificing the
accuracy for efficiency. Unlike the existing approaches based on operator
splitting and the Augmented Lagrangian method (ALM), no artificial parameters
are introduced in our formulation, which guarantees the robustness of the
proposed algorithm. Meanwhile, we adopt the domain decomposition method to
promote parallel computing and use the fine-to-coarse structure to accelerate
the convergence. Numerical experiments are presented on both image denoising
and CT reconstruction problems to demonstrate the ability to recover image
texture and the efficiency of the proposed method.
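The authors' multigrid scheme is not spelled out in the abstract, so the sketch below only illustrates the generic smoothing-plus-coarse-grid-correction idea on a much simpler surrogate: a 1D quadratic (Tikhonov-type) denoising energy solved by a two-grid V-cycle with weighted-Jacobi smoothing. The surrogate energy, the grid-transfer operators, and all parameters are illustrative assumptions, not the paper's formulation.

```python
import numpy as np

def laplacian(n):
    # 1D second-difference matrix with Neumann-style ends (illustrative)
    L = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
    L[0, 0] = L[-1, -1] = 1.0
    return L

def jacobi(A, b, u, sweeps=3, omega=0.8):
    # weighted-Jacobi smoothing: damps high-frequency error components
    d = np.diag(A)
    for _ in range(sweeps):
        u = u + omega * (b - A @ u) / d
    return u

def v_cycle(A, b, u, R, P):
    # pre-smooth, coarse-grid correction, post-smooth
    u = jacobi(A, b, u)
    Ac = R @ A @ P                              # Galerkin coarse operator
    u = u + P @ np.linalg.solve(Ac, R @ (b - A @ u))
    return jacobi(A, b, u)

n, lam = 64, 5.0
A = np.eye(n) + lam * laplacian(n)   # minimizing the quadratic energy solves A u = f
rng = np.random.default_rng(0)
f = np.sin(np.linspace(0.0, np.pi, n)) + 0.1 * rng.standard_normal(n)

nc = n // 2
P = np.zeros((n, nc))                # piecewise-constant prolongation
for j in range(nc):
    P[2 * j, j] = P[2 * j + 1, j] = 1.0
R = 0.5 * P.T                        # averaging restriction

u = np.zeros(n)
for _ in range(30):
    u = v_cycle(A, f, u, R, P)
residual = np.linalg.norm(f - A @ u)  # shrinks rapidly over the cycles
```

The actual paper works on 2D curvature energies with domain decomposition for parallelism; the loop above only conveys the fine-to-coarse structure that accelerates convergence.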
Related papers
- A Fast Minimization Algorithm for the Euler Elastica Model Based on a
Bilinear Decomposition [5.649764770305694]
We propose a new, fast, hybrid alternating minimization (HALM) algorithm for the Euler Elastica (EE) model.
The HALM algorithm comprises three sub-minimization problems and each is either solved in the closed form or approximated by fast solvers.
A host of numerical experiments are conducted to show that the new algorithm produces good results with much-improved efficiency.
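The HALM algorithm itself is not detailed in the summary; as a generic stand-in for "each sub-minimization solved in closed form," the sketch below alternates two closed-form updates on a quadratic-penalty splitting of a 1D total-variation-like model: a linear solve for the image variable and soft-thresholding for the auxiliary gradient variable. The model, lam, and rho are illustrative assumptions.

```python
import numpy as np

def soft(x, t):
    # closed-form minimizer of (1/2)(x - w)^2 + t|w| over w
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

n = 100
rng = np.random.default_rng(1)
signal = np.repeat([0.0, 1.0, -0.5, 2.0], 25)   # piecewise-constant truth
f = signal + 0.1 * rng.standard_normal(n)

D = np.diff(np.eye(n), axis=0)       # forward-difference operator, shape (n-1, n)
lam, rho = 0.5, 5.0
A = np.eye(n) + rho * D.T @ D        # normal matrix for the u-subproblem
u, w = f.copy(), D @ f
for _ in range(100):
    u = np.linalg.solve(A, f + rho * D.T @ w)   # closed-form u-update
    w = soft(D @ u, lam / rho)                  # closed-form w-update
```

Exact alternating minimization makes the joint objective monotonically non-increasing, which is the property that lets each block be solved cheaply without a global line search.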
arXiv Detail & Related papers (2023-08-25T16:15:38Z)
- An Optimization-based Deep Equilibrium Model for Hyperspectral Image Deconvolution with Convergence Guarantees [71.57324258813675]
We propose a novel methodology for addressing the hyperspectral image deconvolution problem.
A new optimization problem is formulated, leveraging a learnable regularizer in the form of a neural network.
The derived iterative solver is then expressed as a fixed-point calculation problem within the Deep Equilibrium framework.
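The Deep Equilibrium framework computes a layer's output as a fixed point z* = f(z*, x) of a (typically learned) map. A minimal sketch with a toy contractive layer, where the map, the width, and the weight scaling are all illustrative assumptions:

```python
import numpy as np

def layer(z, x, W):
    # toy equilibrium layer; contractive because ||W||_2 < 1 and tanh is 1-Lipschitz
    return np.tanh(W @ z + x)

rng = np.random.default_rng(2)
W = 0.3 * rng.standard_normal((8, 8)) / np.sqrt(8)   # small spectral norm
x = rng.standard_normal(8)

z = np.zeros(8)
for _ in range(200):                        # plain fixed-point iteration
    z_new = layer(z, x, W)
    if np.linalg.norm(z_new - z) < 1e-10:   # stop at (numerical) equilibrium
        break
    z = z_new
```

In an actual DEQ model the backward pass differentiates through the fixed point via the implicit function theorem rather than through the unrolled loop, which is what makes convergence guarantees for the solver relevant.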
arXiv Detail & Related papers (2023-06-10T08:25:16Z)
- Optimizing CT Scan Geometries With and Without Gradients [7.788823739816626]
We show that gradient-based optimization algorithms are a possible alternative to gradient-free algorithms.
Gradient-based algorithms converge substantially faster while being comparable to gradient-free algorithms in terms of capture range and robustness to the number of free parameters.
arXiv Detail & Related papers (2023-02-13T10:44:41Z)
- An Operator-Splitting Method for the Gaussian Curvature Regularization Model with Applications in Surface Smoothing and Imaging [6.860238280163609]
We propose an operator-splitting method for a general Gaussian curvature model.
The proposed method is not sensitive to the choice of parameters, and its efficiency and performance are demonstrated.
arXiv Detail & Related papers (2021-08-04T08:59:41Z)
- Learned Block Iterative Shrinkage Thresholding Algorithm for Photothermal Super Resolution Imaging [52.42007686600479]
We propose a learned block-sparse optimization approach using an iterative algorithm unfolded into a deep neural network.
We show the benefits of using a learned block iterative shrinkage thresholding algorithm that is able to learn the choice of regularization parameters.
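The learned variant cannot be reproduced from the abstract alone, but it unrolls the classical iterative shrinkage thresholding algorithm (ISTA), whose hand-tuned form is sketched below on a generic sparse-recovery problem; in the learned version the step size and thresholds become trainable per-layer parameters. Problem sizes and lam here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
m, n, k = 50, 100, 5
A = rng.standard_normal((m, n)) / np.sqrt(m)          # measurement matrix
x_true = np.zeros(n)
x_true[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
y = A @ x_true                                        # noiseless measurements

L = np.linalg.norm(A, 2) ** 2    # Lipschitz constant of the data-term gradient
lam = 0.01
x = np.zeros(n)
for _ in range(2000):
    v = x - (A.T @ (A @ x - y)) / L                   # gradient step
    x = np.sign(v) * np.maximum(np.abs(v) - lam / L, 0.0)   # shrink/threshold
```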
arXiv Detail & Related papers (2020-12-07T09:27:16Z)
- Cogradient Descent for Bilinear Optimization [124.45816011848096]
We introduce a Cogradient Descent algorithm (CoGD) to address the bilinear problem.
We solve one variable by considering its coupling relationship with the other, leading to a synchronous gradient descent.
Our algorithm is applied to solve problems with one variable under the sparsity constraint.
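CoGD's exact coupling scheme is not specified in the summary; the sketch below shows only the baseline it refines — synchronous gradient descent on both factors of a rank-one bilinear least-squares fit. Sizes, learning rate, and iteration count are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)
m, n = 20, 30
Y = np.outer(rng.standard_normal(m), rng.standard_normal(n))  # rank-1 target

u = 0.1 * rng.standard_normal(m)
v = 0.1 * rng.standard_normal(n)
lr = 0.01
for _ in range(5000):
    res = np.outer(u, v) - Y
    gu, gv = res @ v, res.T @ u       # gradients of 0.5 * ||u v^T - Y||_F^2
    u, v = u - lr * gu, v - lr * gv   # synchronous update of the coupled pair
```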
arXiv Detail & Related papers (2020-06-16T13:41:54Z)
- IDEAL: Inexact DEcentralized Accelerated Augmented Lagrangian Method [64.15649345392822]
We introduce a framework for designing primal methods under the decentralized optimization setting where local functions are smooth and strongly convex.
Our approach consists of approximately solving a sequence of sub-problems induced by the accelerated augmented Lagrangian method.
When coupled with accelerated gradient descent, our framework yields a novel primal algorithm whose convergence rate is optimal and matched by recently derived lower bounds.
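Setting the decentralized machinery aside, the backbone is the inexact augmented Lagrangian method: approximately minimize the augmented Lagrangian in the primal, then take a dual ascent step. A single-machine toy on an equality-constrained least-norm problem, where the problem instance, step sizes, and inner iteration budget are illustrative assumptions:

```python
import numpy as np

# toy problem: min (1/2)||x||^2  subject to  A x = b
rng = np.random.default_rng(5)
A = rng.standard_normal((3, 8))
b = rng.standard_normal(3)

rho, x, y = 1.0, np.zeros(8), np.zeros(3)
for _ in range(200):
    # inexact primal solve: fixed budget of gradient steps on the augmented Lagrangian
    for _ in range(10):
        g = x + A.T @ (y + rho * (A @ x - b))
        x = x - 0.04 * g
    y = y + rho * (A @ x - b)        # dual ascent on the multipliers
```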
arXiv Detail & Related papers (2020-06-11T18:49:06Z)
- Effective Dimension Adaptive Sketching Methods for Faster Regularized Least-Squares Optimization [56.05635751529922]
We propose a new randomized algorithm for solving L2-regularized least-squares problems based on sketching.
We consider two of the most popular random embeddings, namely Gaussian embeddings and the Subsampled Randomized Hadamard Transform (SRHT).
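Sketch-and-solve with a Gaussian embedding is easy to illustrate: compress the tall least-squares system with a random projection, then solve the small regularized problem. Dimensions, lam, and the fixed sketch size below are illustrative assumptions (the paper's contribution is choosing the sketch size adaptively via the effective dimension).

```python
import numpy as np

rng = np.random.default_rng(6)
n, d, lam = 2000, 50, 1.0
A = rng.standard_normal((n, d))
b = A @ rng.standard_normal(d) + 0.1 * rng.standard_normal(n)

m = 400                                        # sketch size, m << n
S = rng.standard_normal((m, n)) / np.sqrt(m)   # Gaussian embedding
SA, Sb = S @ A, S @ b
x_sk = np.linalg.solve(SA.T @ SA + lam * np.eye(d), SA.T @ Sb)

# exact ridge solution, for comparison
x_exact = np.linalg.solve(A.T @ A + lam * np.eye(d), A.T @ b)
```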
arXiv Detail & Related papers (2020-06-10T15:00:09Z)
- Adaptivity of Stochastic Gradient Methods for Nonconvex Optimization [71.03797261151605]
Adaptivity is an important yet under-studied property in modern optimization theory.
Our algorithm is proved to achieve the best-available convergence for non-PL objectives while simultaneously outperforming existing algorithms for PL objectives.
arXiv Detail & Related papers (2020-02-13T05:42:27Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.