Applying Lie Groups Approaches for Rigid Registration of Point Clouds
- URL: http://arxiv.org/abs/2006.13341v1
- Date: Tue, 23 Jun 2020 21:26:57 GMT
- Title: Applying Lie Groups Approaches for Rigid Registration of Point Clouds
- Authors: Liliane Rodrigues de Almeida, Gilson A. Giraldi, Marcelo Bernardes
Vieira
- Abstract summary: We use Lie groups and Lie algebras to find the rigid transformation that best registers two surfaces represented by point clouds.
The so-called pairwise rigid registration can be formulated by comparing intrinsic second-order orientation tensors.
We show promising results when embedding orientation tensor fields in Lie algebras.
- Score: 3.308743964406687
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In recent decades, a body of literature has appeared that uses Lie
group theory to solve problems in computer vision. In parallel, Lie algebraic
representations of the transformations involved were introduced to overcome the
difficulties of the group structure by mapping the transformation groups to
linear spaces. In this paper we focus on the application of Lie groups and Lie
algebras to find the rigid transformation that best registers two surfaces
represented by point clouds. The so-called pairwise rigid registration can be
formulated by comparing intrinsic second-order orientation tensors that encode
local geometry. These tensors can be (locally) represented by symmetric
non-negative definite matrices. In this paper we interpret the resulting tensor
field as a multivariate normal model. We therefore start from the fact that the
space of Gaussians can be equipped with a Lie group structure that is
isomorphic to a subgroup of the upper triangular matrices. Consequently, the
associated Lie algebra structure enables us to handle Gaussians, and hence to
compare orientation tensors, with Euclidean operations. We apply this
methodology to variants of the Iterative Closest Point (ICP), a well-known
technique for pairwise registration. We compare the obtained results with the
original implementations, which apply the comparative tensor shape factor
(CTSF), a similarity notion based on the eigenvalues of the orientation
tensors. We note that the similarity measure in tensor spaces derived directly
from the Lie approach is not invariant under rotations, which is a problem for
rigid registration. Despite this, the computational experiments show promising
results when embedding orientation tensor fields in Lie algebras.
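The Lie-algebraic comparison described in the abstract can be illustrated with a short numerical sketch. Below, each local orientation tensor is treated as the covariance of a Gaussian, the Gaussian N(mu, Sigma) is identified with an upper-triangular affine matrix [[U, mu], [0, 1]] with Sigma = U Uᵀ, and two tensors are compared by the Frobenius distance between the matrix logarithms of these group elements. The neighborhood-covariance construction of the orientation tensor and the eigenvalue-only proxy used in place of the CTSF are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the paper's code) of comparing orientation tensors
# through a Lie-algebra embedding of Gaussians, versus an eigenvalue-only
# proxy for a CTSF-like similarity.
import numpy as np
from scipy.linalg import logm


def orientation_tensor(point, neighbors):
    """Second-order orientation tensor of a point: covariance of the offsets
    to its neighbors. One common construction; the paper's intrinsic tensors
    may differ in detail."""
    d = neighbors - point                       # (k, 3) offsets
    return d.T @ d / len(neighbors)             # 3x3 symmetric PSD matrix


def upper_cholesky(sigma):
    """Upper-triangular U with sigma = U @ U.T, obtained by a reversal trick
    on the standard lower-triangular Cholesky factor."""
    J = np.flip(np.eye(sigma.shape[0]), axis=0)
    L = np.linalg.cholesky(J @ sigma @ J)
    return J @ L @ J


def gaussian_to_lie_algebra(mu, sigma):
    """Map N(mu, sigma) to the Lie algebra of the upper-triangular affine
    group via the matrix logarithm of [[U, mu], [0, 1]]."""
    n = mu.shape[0]
    M = np.eye(n + 1)
    M[:n, :n] = upper_cholesky(sigma)
    M[:n, n] = mu
    return np.real(logm(M))


def lie_distance(mu1, s1, mu2, s2):
    """Euclidean (Frobenius) distance between the Lie-algebra embeddings."""
    return np.linalg.norm(gaussian_to_lie_algebra(mu1, s1)
                          - gaussian_to_lie_algebra(mu2, s2))


def eigenvalue_distance(s1, s2):
    """Eigenvalue-only proxy for a CTSF-like similarity; the actual CTSF
    formula is defined in the original ICP-CTSF implementations."""
    return np.linalg.norm(np.sort(np.linalg.eigvalsh(s1))
                          - np.sort(np.linalg.eigvalsh(s2)))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    p = np.zeros(3)
    nbrs = rng.normal(size=(20, 3)) * np.array([1.0, 0.5, 0.1])  # flat patch
    sigma = orientation_tensor(p, nbrs)

    # Rotating the tensor leaves the eigenvalue-based comparison unchanged,
    # while the Lie-algebra distance changes.
    theta = np.pi / 6
    R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                  [np.sin(theta),  np.cos(theta), 0.0],
                  [0.0,            0.0,           1.0]])
    sigma_rot = R @ sigma @ R.T

    print("eigenvalue distance:", eigenvalue_distance(sigma, sigma_rot))  # ~0
    print("Lie-algebra distance:", lie_distance(p, sigma, p, sigma_rot))  # > 0
```

Running the sketch shows the eigenvalue-based comparison is unaffected by rotating the tensor, whereas the Lie-algebra distance is not, which is exactly the non-invariance under rotations noted in the abstract.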
Related papers
- Symmetry Discovery for Different Data Types [52.2614860099811]
Equivariant neural networks incorporate symmetries into their architecture, achieving higher generalization performance.
We propose LieSD, a method for discovering symmetries via trained neural networks that approximate the input-output mappings of the tasks.
We validate the performance of LieSD on tasks with symmetries such as the two-body problem, the moment of inertia matrix prediction, and top quark tagging.
arXiv Detail & Related papers (2024-10-13T13:39:39Z) - Understanding Matrix Function Normalizations in Covariance Pooling through the Lens of Riemannian Geometry [63.694184882697435]
Global Covariance Pooling (GCP) has been demonstrated to improve the performance of Deep Neural Networks (DNNs) by exploiting second-order statistics of high-level representations.
arXiv Detail & Related papers (2024-07-15T07:11:44Z) - Tensor cumulants for statistical inference on invariant distributions [49.80012009682584]
We show that PCA becomes computationally hard at a critical value of the signal's magnitude.
We define a new set of objects, which provide an explicit, near-orthogonal basis for invariants of a given degree.
It also lets us analyze a new problem of distinguishing between different ensembles.
arXiv Detail & Related papers (2024-04-29T14:33:24Z) - Lie Group Decompositions for Equivariant Neural Networks [12.139222986297261]
We show how convolution kernels can be parametrized to build models equivariant with respect to affine transformations.
We evaluate the robustness and out-of-distribution generalisation capability of our model on the benchmark affine-invariant classification task.
arXiv Detail & Related papers (2023-10-17T16:04:33Z) - G-invariant diffusion maps [11.852406625172216]
We derive diffusion maps that intrinsically account for the group action on the data.
In particular, we construct both equivariant and invariant embeddings, which can be used to cluster and align the data points.
arXiv Detail & Related papers (2023-06-12T18:16:33Z) - Fast computation of permutation equivariant layers with the partition
algebra [0.0]
Linear neural network layers that are either equivariant or invariant to permutations of their inputs form core building blocks of modern deep learning architectures.
Examples include the layers of DeepSets, as well as linear layers occurring in attention blocks of transformers and some graph neural networks.
arXiv Detail & Related papers (2023-03-10T21:13:12Z) - Deep Learning Symmetries and Their Lie Groups, Algebras, and Subalgebras
from First Principles [55.41644538483948]
We design a deep-learning algorithm for the discovery and identification of the continuous group of symmetries present in a labeled dataset.
We use fully connected neural networks to model the symmetry transformations and the corresponding generators.
Our study also opens the door for using a machine learning approach in the mathematical study of Lie groups and their properties.
arXiv Detail & Related papers (2023-01-13T16:25:25Z) - Unified Fourier-based Kernel and Nonlinearity Design for Equivariant
Networks on Homogeneous Spaces [52.424621227687894]
We introduce a unified framework for group equivariant networks on homogeneous spaces.
We take advantage of the sparsity of Fourier coefficients of the lifted feature fields.
We show that other methods treating features as the Fourier coefficients in the stabilizer subgroup are special cases of our activation.
arXiv Detail & Related papers (2022-06-16T17:59:01Z) - LieTransformer: Equivariant self-attention for Lie Groups [49.9625160479096]
Group equivariant neural networks are used as building blocks of group invariant neural networks.
We extend the scope of the literature to self-attention, which is emerging as a prominent building block of deep learning models.
We propose the LieTransformer, an architecture composed of LieSelfAttention layers that are equivariant to arbitrary Lie groups and their discrete subgroups.
arXiv Detail & Related papers (2020-12-20T11:02:49Z) - Learning Irreducible Representations of Noncommutative Lie Groups [3.1727619150610837]
Recent work has constructed neural networks that are equivariant to continuous symmetry groups such as 2D and 3D rotations.
We present two contributions motivated by frontier applications of equivariance beyond rotations and translations.
arXiv Detail & Related papers (2020-06-01T05:14:29Z)