Image comparison and scaling via nonlinear elasticity
- URL: http://arxiv.org/abs/2303.10103v2
- Date: Mon, 20 Mar 2023 16:21:34 GMT
- Title: Image comparison and scaling via nonlinear elasticity
- Authors: John M. Ball and Christopher L. Horner
- Abstract summary: The existence of minimizers in a suitable class of homeomorphisms between image domains is established under natural hypotheses.
We investigate whether for linearly related images the minimization algorithm delivers the linear transformation as the unique minimizer.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: A nonlinear elasticity model for comparing images is formulated and analyzed,
in which optimal transformations between images are sought as minimizers of an
integral functional. The existence of minimizers in a suitable class of
homeomorphisms between image domains is established under natural hypotheses.
We investigate whether for linearly related images the minimization algorithm
delivers the linear transformation as the unique minimizer.
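As a schematic illustration of "minimizers of an integral functional," models of this kind typically combine an elastic regularization of the transformation with an intensity-mismatch term. The following sketch uses illustrative notation ($W$, $c_1$, $c_2$) and is not necessarily the paper's exact energy:

```latex
% Schematic image-comparison functional (illustrative notation, not the
% paper's exact energy): minimize over orientation-preserving
% homeomorphisms y : \Omega_1 \to \Omega_2
\[
  I(y) \;=\; \int_{\Omega_1} \Big[\, W\big(\nabla y(x)\big)
  \;+\; \big|\, c_2\big(y(x)\big) - c_1(x) \,\big|^2 \Big]\, dx ,
\]
% where c_1, c_2 are the intensity maps of the two images and W is an
% elastic stored-energy density whose growth and convexity hypotheses
% yield existence of minimizers.
```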
Related papers
- A nonlinear elasticity model in computer vision [0.0]
The purpose of this paper is to analyze a nonlinear elasticity model previously introduced by the authors for comparing two images.
The existence of minimizing transformations is proved for pairs of vector-valued intensity maps.
The question is whether, for images related by a linear mapping, the unique minimizer is given by that mapping.
arXiv Detail & Related papers (2024-08-30T12:27:22Z) - Verification of Geometric Robustness of Neural Networks via Piecewise Linear Approximation and Lipschitz Optimisation [57.10353686244835]
We address the problem of verifying neural networks against geometric transformations of the input image, including rotation, scaling, shearing, and translation.
The proposed method computes provably sound piecewise linear constraints for the pixel values by using sampling and linear approximations in combination with branch-and-bound Lipschitz optimisation.
We show that our proposed implementation resolves up to 32% more verification cases than existing approaches.
arXiv Detail & Related papers (2024-08-23T15:02:09Z) - Stable Nonconvex-Nonconcave Training via Linear Interpolation [51.668052890249726]
This paper presents a theoretical analysis of linear interpolation (lookahead-style averaging) as a principled method for stabilizing (large-scale) neural network training.
We argue that instabilities in the optimization process are often caused by the nonmonotonicity of the loss landscape and show how linear interpolation can help by leveraging the theory of nonexpansive operators.
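A minimal sketch of the lookahead-style linear interpolation the summary describes: a slow iterate is pulled a fraction `alpha` toward a fast inner iterate every `k` steps. The names `alpha`, `k`, and the plain-gradient inner loop are illustrative assumptions, not the paper's method.

```python
import numpy as np

def lookahead(grad, w0, lr=0.1, alpha=0.5, k=5, n_outer=100):
    """Lookahead-style training loop: linear interpolation between a
    slow iterate and a fast inner-optimizer iterate (illustrative)."""
    slow = np.asarray(w0, dtype=float)
    for _ in range(n_outer):
        fast = slow.copy()
        for _ in range(k):                 # inner optimizer: plain gradient steps
            fast -= lr * grad(fast)
        slow += alpha * (fast - slow)      # interpolate back toward the slow iterate
    return slow
```

On a simple quadratic objective the interpolation acts as an averaged (nonexpansive) update, which is the stabilizing mechanism the abstract alludes to.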
arXiv Detail & Related papers (2023-10-20T12:45:12Z) - Linearization Algorithms for Fully Composite Optimization [61.20539085730636]
This paper studies first-order algorithms for solving fully composite optimization problems over convex compact sets.
We leverage the structure of the objective by handling its differentiable and non-differentiable components separately, linearizing only the smooth parts.
arXiv Detail & Related papers (2023-02-24T18:41:48Z) - Linear Convergence of ISTA and FISTA [8.261388753972234]
We revisit the class of iterative shrinkage-thresholding algorithms (ISTA) for solving the linear inverse problem with sparse representation.
We find that the previous assumption that the smooth part is convex weakens the least-squares model.
We generalize the linear convergence to composite optimization in both the objective value and the squared proximal subgradient norm.
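The ISTA scheme referred to above alternates a gradient step on the smooth least-squares term with soft-thresholding (the proximal operator of the l1 norm). A minimal sketch, with the matrix `A`, data `b`, and the step size `1/L` as illustrative choices rather than the paper's setup:

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1 (componentwise shrinkage)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, b, lam, n_iter=500):
    """ISTA for min_x 0.5*||Ax - b||^2 + lam*||x||_1 (illustrative)."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)           # gradient of the smooth part
        x = soft_threshold(x - grad / L, lam / L)
    return x
```

FISTA adds a momentum extrapolation between iterations; the linear-convergence results summarized above concern both variants.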
arXiv Detail & Related papers (2022-12-13T02:02:50Z) - Learning Graphical Factor Models with Riemannian Optimization [70.13748170371889]
This paper proposes a flexible algorithmic framework for graph learning under low-rank structural constraints.
The problem is expressed as penalized maximum likelihood estimation of an elliptical distribution.
We leverage geometries of positive definite matrices and positive semi-definite matrices of fixed rank that are well suited to elliptical models.
arXiv Detail & Related papers (2022-10-21T13:19:45Z) - Hessian Eigenspectra of More Realistic Nonlinear Models [73.31363313577941]
We give a precise characterization of the Hessian eigenspectra for a broad family of nonlinear models.
Our analysis takes a step forward to identify the origin of many striking features observed in more complex machine learning models.
arXiv Detail & Related papers (2021-03-02T06:59:52Z) - Joint Estimation of Image Representations and their Lie Invariants [57.3768308075675]
Images encode both the state of the world and its content.
The automatic extraction of this information is challenging because of the high-dimensionality and entangled encoding inherent to the image representation.
This article introduces two theoretical approaches aimed at the resolution of these challenges.
arXiv Detail & Related papers (2020-12-05T00:07:41Z) - The role of optimization geometry in single neuron learning [12.891722496444036]
Recent experiments have demonstrated that the choice of optimization geometry can impact generalization performance when learning expressive neural networks.
We show how the interplay between the optimization geometry and the feature geometry determines out-of-sample performance.
arXiv Detail & Related papers (2020-06-15T17:39:44Z) - Weighted Encoding Based Image Interpolation With Nonlocal Linear Regression Model [8.013127492678272]
In image super-resolution, the low-resolution image is directly down-sampled from its high-resolution counterpart without blurring and noise.
To address this problem, we propose a novel image model based on sparse representation.
A new approach learns adaptive sub-dictionaries online instead of via clustering.
arXiv Detail & Related papers (2020-03-04T03:20:21Z) - Halpern Iteration for Near-Optimal and Parameter-Free Monotone Inclusion and Strong Solutions to Variational Inequalities [14.848525762485872]
We leverage the connections between nonexpansive maps, monotone Lipschitz operators, and proximal mappings to obtain near-optimal solutions to monotone inclusion problems.
These results translate into near-optimal guarantees for approximating strong solutions to variational inequality problems, approximating convex-concave min-max optimization problems, and minimizing the norm of the gradient in min-max optimization problems.
arXiv Detail & Related papers (2020-02-20T17:12:49Z)
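The Halpern iteration named in that title anchors each step to the starting point: x_{k+1} = lam_k * x0 + (1 - lam_k) * T(x_k) for a nonexpansive map T. A minimal sketch; the schedule lam_k = 1/(k+2) is the standard choice, and the map `T` below is an illustrative assumption:

```python
import numpy as np

def halpern(T, x0, n_iter=2000):
    """Halpern iteration for a nonexpansive map T: the anchoring term
    lam_k * x0 (with lam_k -> 0) drives x_k to a fixed point of T."""
    x0 = np.asarray(x0, dtype=float)
    x = x0.copy()
    for k in range(n_iter):
        lam = 1.0 / (k + 2)                # standard anchoring schedule
        x = lam * x0 + (1 - lam) * T(x)
    return x
```

Via the map-to-operator correspondences mentioned in the summary, the same scheme yields near-optimal rates for monotone inclusion and min-max problems.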
This list is automatically generated from the titles and abstracts of the papers in this site.