Riemannian Geometry Approach for Minimizing Distortion and its
Applications
- URL: http://arxiv.org/abs/2207.12038v3
- Date: Thu, 28 Jul 2022 06:56:50 GMT
- Title: Riemannian Geometry Approach for Minimizing Distortion and its
Applications
- Authors: Dror Ozeri
- Abstract summary: We find an affine transformation $T$ that minimizes the overall distortion $\sum_{i=1}^{N} Dist_F^{2}(T^{-1}T_{i})$.
The transformation can be useful in some fields -- in particular, we apply it for rendering affine panoramas.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Given an affine transformation $T$, we define its Fisher distortion
$Dist_F(T)$. We show that the Fisher distortion has a Riemannian metric structure
and provide an algorithm for finding the mean distorting transformation: for a
given set $\{T_{i}\}_{i=1}^N$ of affine transformations, find the affine
transformation $T$ that minimizes the overall distortion
$\sum_{i=1}^{N} Dist_F^{2}(T^{-1}T_{i})$. The mean distorting transformation can
be useful in some fields -- in particular, we apply it for rendering affine
panoramas.
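The listing does not reproduce the definition of $Dist_F$ or the paper's algorithm, but the stated minimization is a Karcher-mean computation on a matrix manifold. A minimal sketch, assuming (purely for illustration, not as the paper's definition) that the distortion of $T$ is the Frobenius norm of its matrix logarithm, using `scipy.linalg` for the matrix exponential/logarithm:

```python
import numpy as np
from scipy.linalg import expm, logm

def mean_transform(Ts, iters=50, step=0.5):
    """Riemannian-style averaging of transformations.

    Hypothetical stand-in: distortion of T is ||logm(T)||_F, so we
    minimize sum_i ||logm(T^{-1} T_i)||_F^2 by a Karcher-mean
    iteration (not the paper's exact algorithm).
    """
    T = np.mean(Ts, axis=0)  # initialize at the Euclidean mean
    for _ in range(iters):
        # tangent-space average of the residuals T^{-1} T_i
        V = np.mean([logm(np.linalg.solve(T, Ti)).real for Ti in Ts], axis=0)
        T = T @ expm(step * V)  # move T along the manifold toward the mean
        if np.linalg.norm(V) < 1e-12:
            break
    return T.real
```

For instance, averaging a rotation by $+\theta$ with a rotation by $-\theta$ under this iteration recovers the identity.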
Related papers
- A Bi-variant Variational Model for Diffeomorphic Image Registration with
Relaxed Jacobian Determinant Constraints [17.93018427389816]
We propose a new bi-variant diffeomorphic image registration model.
A soft constraint on the Jacobian equation $\det(\nabla\bm{\varphi}(\bm{x})) = f(\bm{x}) > 0$ allows local deformations to shrink and grow within a flexible range.
A positive constraint is imposed on the optimization of the relaxation function $f(\bm{x})$, and a regularizer is used to ensure the smoothness of $f(\bm{x})$.
arXiv Detail & Related papers (2023-08-04T15:47:06Z)
- Planar Curve Registration using Bayesian Inversion [0.0]
We study parameterisation-independent closed curve matching as a Bayesian inverse problem.
The motion of the curve is modelled via a curve on the diffeomorphism group acting on the ambient space.
We adopt ensemble Kalman inversion using a negative Sobolev mismatch penalty to measure the discrepancy between the target and the ensemble mean shape.
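A generic ensemble Kalman inversion update can be sketched as follows; this is a hypothetical illustration with a plain Euclidean misfit, and the paper's negative Sobolev penalty and diffeomorphism-group dynamics are not modelled:

```python
import numpy as np

def eki_step(U, G, y, gamma=1e-6):
    """One ensemble Kalman inversion update for the inverse problem
    G(u) = y. U is an (ensemble_size, dim) array of particles."""
    Gu = np.array([G(u) for u in U])
    du = U - U.mean(axis=0)                     # parameter deviations
    dg = Gu - Gu.mean(axis=0)                   # output deviations
    Cug = du.T @ dg / len(U)                    # cross-covariance
    Cgg = dg.T @ dg / len(U)                    # output covariance
    K = Cug @ np.linalg.inv(Cgg + gamma * np.eye(len(y)))
    return U + (y - Gu) @ K.T                   # Kalman-style correction
```

For a linear forward map with an invertible matrix, the ensemble collapses onto the true parameter within a few iterations.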
arXiv Detail & Related papers (2023-07-10T21:26:43Z)
- Revisiting Subgradient Method: Complexity and Convergence Beyond Lipschitz Continuity [24.45688490844496]
Subgradient method is one of the most fundamental algorithmic schemes for nonsmooth optimization.
In this work, we first extend the typical iteration complexity results for the subgradient method to cover non-Lipschitz convex and weakly convex minimization.
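The classical scheme that the paper extends can be sketched, under the standard diminishing-step-size rule, as:

```python
import numpy as np

def subgradient_method(f, subgrad, x0, steps=2000):
    """Plain subgradient method with steps t_k = 1/sqrt(k+1), returning
    the best iterate seen (subgradient steps are not descent steps, so
    the running best must be tracked)."""
    x = np.asarray(x0, dtype=float)
    best_x, best_f = x.copy(), f(x)
    for k in range(steps):
        x = x - subgrad(x) / np.sqrt(k + 1)
        if f(x) < best_f:
            best_x, best_f = x.copy(), f(x)
    return best_x
```

For the nonsmooth objective $f(x)=\|x\|_1$, a valid subgradient is $\mathrm{sign}(x)$, and the best iterate approaches the minimizer at the origin.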
arXiv Detail & Related papers (2023-05-23T15:26:36Z)
- Deep neural networks on diffeomorphism groups for optimal shape reparameterization [44.99833362998488]
We propose an algorithm for constructing approximations of orientation-preserving diffeomorphisms by composition of elementary diffeomorphisms.
The algorithm is implemented using PyTorch, and is applicable for both unparametrized curves and surfaces.
arXiv Detail & Related papers (2022-07-22T15:25:59Z)
- 3D Equivariant Graph Implicit Functions [51.5559264447605]
We introduce a novel family of graph implicit functions with equivariant layers that facilitates modeling fine local details.
Our method improves over the existing rotation-equivariant implicit function from 0.69 to 0.89 on the ShapeNet reconstruction task.
arXiv Detail & Related papers (2022-03-31T16:51:25Z)
- Hybrid Model-based / Data-driven Graph Transform for Image Coding [54.31406300524195]
We present a hybrid model-based / data-driven approach to encode an intra-prediction residual block.
The first $K$ eigenvectors of a transform matrix are derived from a statistical model, e.g., the asymmetric discrete sine transform (ADST) for stability.
Using WebP as a baseline, experimental results show that our hybrid graph transform achieves better energy compaction than the default discrete cosine transform (DCT) and better stability than the KLT.
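One way to realize such a hybrid transform, sketched here with generic ingredients (the paper's graph-learning step for the data-driven part is not reproduced): take the first $K$ basis vectors from a model-based matrix and complete them with KLT eigenvectors computed in the orthogonal complement.

```python
import numpy as np

def hybrid_transform(model_basis_K, data_cov):
    """First K basis vectors come from the model; the remaining n-K are
    eigenvectors of the data covariance restricted to the orthogonal
    complement (a sketch of the hybrid model/data idea)."""
    n, K = model_basis_K.shape
    Q, _ = np.linalg.qr(model_basis_K)          # orthonormalize model part
    P = np.eye(n) - Q @ Q.T                     # projector onto complement
    S = P @ data_cov @ P                        # covariance in the complement
    w, V = np.linalg.eigh(S)
    rest = V[:, np.argsort(w)[::-1][: n - K]]   # top n-K complement eigenvectors
    return np.hstack([Q, rest])
```

By construction the result is an orthonormal transform: the model part is orthonormalized by QR, and the data-driven part lives entirely in its orthogonal complement.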
arXiv Detail & Related papers (2022-03-02T15:36:44Z)
- Revisiting Transformation Invariant Geometric Deep Learning: Are Initial Representations All You Need? [80.86819657126041]
We show that transformation-invariant and distance-preserving initial representations are sufficient to achieve transformation invariance.
Specifically, we realize transformation-invariant and distance-preserving initial point representations by modifying multi-dimensional scaling.
We prove that the resulting framework, TinvNN, strictly guarantees transformation invariance and is general and flexible enough to be combined with existing neural networks.
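A distance-preserving, rigid-transformation-invariant initial representation can be obtained from classical multi-dimensional scaling; a minimal sketch (plain classical MDS, not the paper's modified variant):

```python
import numpy as np

def mds_embedding(X, dim=2):
    """Classical multidimensional scaling on pairwise distances.
    Distances are invariant to rotations/translations of X, so the
    embedding (up to its own rigid ambiguity) is too."""
    D2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)  # squared distances
    n = len(X)
    J = np.eye(n) - np.ones((n, n)) / n        # centering matrix
    B = -0.5 * J @ D2 @ J                      # Gram matrix of centered points
    w, V = np.linalg.eigh(B)
    idx = np.argsort(w)[::-1][:dim]            # top eigenpairs
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0))
```

Rotating and translating the input leaves all pairwise distances of the embedding unchanged, which is exactly the invariance property exploited above.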
arXiv Detail & Related papers (2021-12-23T03:52:33Z)
- A first-order primal-dual method with adaptivity to local smoothness [64.62056765216386]
We consider the problem of finding a saddle point for the convex-concave objective $\min_x \max_y f(x) + \langle Ax, y\rangle - g^*(y)$, where $f$ is a convex function with locally Lipschitz gradient and $g$ is convex and possibly non-smooth.
We propose an adaptive version of the Condat-Vu algorithm, which alternates between primal gradient steps and dual steps.
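A fixed-step primal-dual (Chambolle-Pock/PDHG-style) iteration for this saddle problem can be sketched as follows; the paper's contribution, the adaptive step-size rule, is not reproduced here, and `prox_f`, `prox_gstar` are user-supplied proximal operators:

```python
import numpy as np

def primal_dual(prox_f, prox_gstar, A, x0, y0, tau, sigma, iters=500):
    """Fixed-step primal-dual iteration for
    min_x max_y f(x) + <Ax, y> - g*(y), requiring tau*sigma*||A||^2 <= 1."""
    x, y = x0.copy(), y0.copy()
    for _ in range(iters):
        x_new = prox_f(x - tau * A.T @ y, tau)                   # primal step
        y = prox_gstar(y + sigma * A @ (2 * x_new - x), sigma)   # extrapolated dual step
        x = x_new
    return x, y
```

For example, with $f(x)=\tfrac12\|x-b\|^2$, $A=I$, and $g=\|\cdot\|_1$ (so the prox of $g^*$ is projection onto $[-1,1]^n$), the iterates converge to the soft-thresholding of $b$.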
arXiv Detail & Related papers (2021-10-28T14:19:30Z)
- Invariant Deep Compressible Covariance Pooling for Aerial Scene Categorization [80.55951673479237]
We propose a novel invariant deep compressible covariance pooling (IDCCP) to solve nuisance variations in aerial scene categorization.
We conduct extensive experiments on the publicly released aerial scene image data sets and demonstrate the superiority of this method compared with state-of-the-art methods.
arXiv Detail & Related papers (2020-11-11T11:13:07Z)
- A deep network construction that adapts to intrinsic dimensionality beyond the domain [79.23797234241471]
We study the approximation of two-layer compositions $f(x) = g(\phi(x))$ via deep networks with ReLU activation.
We focus on two intuitive and practically relevant choices for $\phi$: the projection onto a low-dimensional embedded submanifold and a distance to a collection of low-dimensional sets.
arXiv Detail & Related papers (2020-08-06T09:50:29Z)
- An Integer Approximation Method for Discrete Sinusoidal Transforms [0.0]
We propose and analyze a class of integer transforms for the discrete Fourier, Hartley, and cosine transforms (DFT, DHT, and DCT).
The introduced method is general, applicable to several block-lengths, whereas existing approaches are usually dedicated to specific transform sizes.
New 8-point square wave approximate transforms for the DFT, DHT, and DCT are also introduced as particular cases of the introduced methodology.
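The rounding-off idea behind such integer approximations can be sketched as follows; this is a generic scaled-rounding construction for the DCT, not the paper's square-wave method:

```python
import numpy as np

def dct_matrix(n):
    """Orthonormal DCT-II matrix."""
    k = np.arange(n)[:, None]
    x = np.arange(n)[None, :]
    C = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * x + 1) * k / (2 * n))
    C[0] /= np.sqrt(2.0)
    return C

def integer_approx(C, scale=8):
    """Integer approximation by rounding a scaled copy of the exact
    transform matrix; entry-wise error is at most 0.5/scale, which keeps
    the approximation close to the exact transform and invertible."""
    return np.round(scale * C).astype(int)
```

The integer matrix can be applied with additions and bit-shifts only; the approximation error per entry is bounded by half the rounding unit, so for the 8-point case the Frobenius-norm perturbation stays below the smallest singular value of the orthonormal DCT and invertibility is preserved.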
arXiv Detail & Related papers (2020-07-05T03:37:35Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences of its use.