Efficient and Flexible Sublabel-Accurate Energy Minimization
- URL: http://arxiv.org/abs/2206.09596v1
- Date: Mon, 20 Jun 2022 06:58:55 GMT
- Title: Efficient and Flexible Sublabel-Accurate Energy Minimization
- Authors: Zhakshylyk Nurlanov, Daniel Cremers, Florian Bernard
- Abstract summary: We address the problem of minimizing a class of energy functions consisting of data and smoothness terms.
Existing continuous optimization methods can find sublabel-accurate solutions, but they are not efficient for large label spaces.
We propose an efficient sublabel-accurate method that utilizes the best properties of both continuous and discrete models.
- Score: 62.50191141358778
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We address the problem of minimizing a class of energy functions consisting
of data and smoothness terms that commonly occur in machine learning, computer
vision, and pattern recognition. While discrete optimization methods are able
to give theoretical optimality guarantees, they can only handle a finite number
of labels and therefore suffer from label discretization bias. Existing
continuous optimization methods can find sublabel-accurate solutions, but they
are not efficient for large label spaces. In this work, we propose an efficient
sublabel-accurate method that utilizes the best properties of both continuous
and discrete models. We separate the problem into two sequential steps: (i)
global discrete optimization for selecting the label range, and (ii) efficient
continuous sublabel-accurate local refinement of a convex approximation of the
energy function in the chosen range. Doing so allows us to achieve a boost in
time and memory efficiency while practically keeping the accuracy at the same
level as continuous convex relaxation methods, and in addition, providing
theoretical optimality guarantees at the level of discrete methods. Finally, we
show the flexibility of the proposed approach to general pairwise smoothness
terms, so that it is applicable to a wide range of regularizations. Experiments
on the illustrative example of image denoising demonstrate the
properties of the proposed method. The code reproducing experiments is
available at
\url{https://github.com/nurlanov-zh/sublabel-accurate-alpha-expansion}.
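The two sequential steps can be illustrated on a toy 1D denoising problem. The sketch below is a minimal illustration, not the paper's actual algorithm (which uses alpha-expansion and a convex relaxation with a general pairwise term): step (i) is reduced to a per-pixel discrete argmin over a unary cost table, and step (ii) refines each discrete label by fitting a quadratic (convex) approximation through the costs at the neighboring labels and taking its continuous minimizer. All function and variable names are illustrative; the smoothness term is omitted for brevity.

```python
import numpy as np

def sublabel_refine(costs, labels):
    """Refine an integer labeling to sublabel accuracy.

    costs:  (n_pixels, n_labels) unary cost table
    labels: (n_pixels,) discrete argmin labels from step (i)

    For each pixel with an interior label l, fit a parabola through
    the costs at l-1, l, l+1 and return its continuous minimizer.
    """
    n, k = costs.shape
    refined = labels.astype(float).copy()
    for i in range(n):
        l = labels[i]
        if 0 < l < k - 1:  # interior labels only
            c_m, c_0, c_p = costs[i, l - 1], costs[i, l], costs[i, l + 1]
            denom = c_m - 2.0 * c_0 + c_p
            if denom > 0:  # local quadratic fit is convex
                refined[i] = l + 0.5 * (c_m - c_p) / denom
    return refined

# Toy data term: labels are the intensities 0..9, observations are
# fractional, and the unary cost is the squared error.
obs = np.array([2.3, 5.7, 8.1])
grid = np.arange(10.0)
costs = (obs[:, None] - grid[None, :]) ** 2   # shape (3, 10)
labels = costs.argmin(axis=1)                 # step (i): discrete selection
u = sublabel_refine(costs, labels)            # step (ii): sublabel refinement
```

For an exactly quadratic data term the parabola fit is exact, so the refined solution recovers the fractional observations with no label discretization bias, while the discrete step alone would round each value to the nearest integer label.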
Related papers
- Discretize Relaxed Solution of Spectral Clustering via a Non-Heuristic
Algorithm [77.53604156112144]
We develop a first-order term to bridge the original problem and the discretization algorithm.
Since the non-heuristic method is aware of the original graph cut problem, the final discrete solution is more reliable.
arXiv Detail & Related papers (2023-10-19T13:57:38Z) - Scalable Bayesian Meta-Learning through Generalized Implicit Gradients [64.21628447579772]
The implicit Bayesian meta-learning (iBaML) method not only broadens the scope of learnable priors but also quantifies the associated uncertainty.
Analytical error bounds are established to demonstrate the precision and efficiency of the generalized implicit gradient over the explicit one.
arXiv Detail & Related papers (2023-03-31T02:10:30Z) - Accelerated First-Order Optimization under Nonlinear Constraints [73.2273449996098]
We exploit connections between first-order algorithms for constrained optimization and non-smooth dynamical systems to design a new class of accelerated first-order algorithms.
An important property of these algorithms is that constraints are expressed in terms of velocities instead of positions.
arXiv Detail & Related papers (2023-02-01T08:50:48Z) - Optimizing Diffusion Rate and Label Reliability in a Graph-Based
Semi-supervised Classifier [2.4366811507669124]
The Local and Global Consistency (LGC) algorithm is one of the most well-known graph-based semi-supervised learning (GSSL) classifiers.
We discuss how removing the self-influence of a labeled instance may be beneficial, and how it relates to leave-one-out error.
Within this framework, we propose methods to estimate label reliability and diffusion rate.
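For context, the classical LGC classifier that this paper builds on has a well-known closed-form solution, F* = (I - alpha*S)^{-1} Y with the symmetrically normalized affinity S = D^{-1/2} W D^{-1/2} and diffusion rate alpha. The sketch below implements only this standard baseline, not the paper's proposed reliability or rate estimators; the toy graph is illustrative.

```python
import numpy as np

def lgc_propagate(W, Y, alpha=0.9):
    """Local and Global Consistency label propagation (closed form).

    W:     (n, n) symmetric affinity matrix with zero diagonal
    Y:     (n, c) one-hot label matrix; all-zero rows are unlabeled
    alpha: diffusion rate in (0, 1)

    Returns F* = (I - alpha * S)^{-1} Y, S = D^{-1/2} W D^{-1/2}.
    """
    d = W.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(d, 1e-12))  # guard isolated nodes
    S = d_inv_sqrt[:, None] * W * d_inv_sqrt[None, :]
    n = W.shape[0]
    return np.linalg.solve(np.eye(n) - alpha * S, Y)

# Two triangle clusters of three nodes each, one labeled node per cluster.
W = np.zeros((6, 6))
for i, j in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5)]:
    W[i, j] = W[j, i] = 1.0
Y = np.zeros((6, 2))
Y[0, 0] = 1.0   # node 0 labeled class 0
Y[3, 1] = 1.0   # node 3 labeled class 1
F = lgc_propagate(W, Y)
pred = F.argmax(axis=1)   # → [0, 0, 0, 1, 1, 1]
```

Labels diffuse only within each connected component, so the unlabeled nodes inherit the class of the labeled node in their cluster. The paper's point is that the choices left open here (alpha and the self-influence of labeled instances) can be estimated rather than hand-tuned.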
arXiv Detail & Related papers (2022-01-10T16:58:52Z) - Sparse Quadratic Optimisation over the Stiefel Manifold with Application
to Permutation Synchronisation [71.27989298860481]
We address the non-convex optimisation problem of finding a matrix on the Stiefel manifold that maximises a quadratic objective function.
We propose a simple yet effective sparsity-promoting algorithm for finding the dominant eigenspace matrix.
arXiv Detail & Related papers (2021-09-30T19:17:35Z) - Near-Optimal High Probability Complexity Bounds for Non-Smooth
Stochastic Optimization with Heavy-Tailed Noise [63.304196997102494]
It is essential to theoretically guarantee that algorithms provide small objective residual with high probability.
Existing methods for non-smooth convex optimization have complexity bounds whose dependence on the confidence level is either negative-power or logarithmic.
We propose novel stepsize rules for two gradient methods with clipping.
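Gradient clipping, the mechanism behind such high-probability guarantees under heavy-tailed noise, caps the norm of each update so that a single heavy-tailed gradient sample cannot derail the iterate. The sketch below shows plain norm clipping with a fixed stepsize on a deterministic quadratic; it is a generic illustration, not the paper's proposed stepsize rules.

```python
import numpy as np

def clipped_gd(grad, x0, lr=0.1, clip=10.0, steps=200):
    """Gradient descent with norm clipping.

    Each gradient is rescaled onto the ball of radius `clip` before
    the step, bounding the update magnitude by lr * clip.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        g = grad(x)
        norm = np.linalg.norm(g)
        if norm > clip:
            g = g * (clip / norm)   # project onto the clipping ball
        x = x - lr * g
    return x

# Minimize f(x) = ||x||^2 / 2 (gradient is x) from a distant start:
# far from the optimum the steps have constant length lr * clip,
# near it the method behaves like ordinary gradient descent.
x_star = clipped_gd(lambda x: x, np.array([50.0, -30.0]))
```

In the stochastic heavy-tailed setting the same cap keeps rare huge gradients from dominating the trajectory, which is what makes small objective residual achievable with high rather than merely expected probability.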
arXiv Detail & Related papers (2021-06-10T17:54:21Z) - Regularization in neural network optimization via trimmed stochastic
gradient descent with noisy label [2.66512000865131]
Regularization is essential for avoiding over-fitting to training data in neural network optimization.
We propose a first-order optimization method (Label-Noised Trim-SGD) that combines label noise with example trimming.
The proposed algorithm enables us to impose a large label noise and obtain a better regularization effect than the original methods.
arXiv Detail & Related papers (2020-12-21T01:31:53Z) - Approximate optimization of MAXCUT with a local spin algorithm [0.0]
Local tensor methods are a class of optimization algorithms introduced in [Hastings, arXiv:1905.07047v2].
We investigate performance in practice through benchmarking experiments on instances of the MAXCUT problem.
We find that the time to solution achieved by the local tensor method is highly uncorrelated with that achieved by a widely used commercial optimization package.
arXiv Detail & Related papers (2020-08-13T18:00:00Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.