Robust Ellipsoid Fitting Using Axial Distance and Combination
- URL: http://arxiv.org/abs/2304.00517v2
- Date: Fri, 22 Sep 2023 12:23:30 GMT
- Title: Robust Ellipsoid Fitting Using Axial Distance and Combination
- Authors: Min Han, Jiangming Kan, Gongping Yang, and Xinghui Li
- Abstract summary: In random sample consensus (RANSAC), the problem of ellipsoid fitting can be formulated as a problem of minimization of point-to-model distance.
We propose a novel distance metric called the axial distance, which is converted from the algebraic distance.
A novel sample-consensus-based ellipsoid fitting method is proposed using the combination of the axial distance and the Sampson distance.
- Score: 15.39157287924673
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In random sample consensus (RANSAC), the problem of ellipsoid fitting can be
formulated as a problem of minimization of point-to-model distance, which is
realized by maximizing the model score. Hence, the performance of ellipsoid fitting
is affected by the chosen distance metric. In this paper, we propose a novel distance
metric called the axial distance, which is converted from the algebraic
distance by introducing a scaling factor that resolves the nongeometric problems of
the algebraic distance. The axial distance and the Sampson distance are
complementary: their combination is a stricter metric when
calculating the model score of sample consensus and the weight of the weighted
least squares (WLS) fitting. Subsequently, a novel sample-consensus-based
ellipsoid fitting method is proposed using the combination of the axial
distance and the Sampson distance (CAS). We compare the proposed method with
several representative fitting methods through experiments on synthetic and
real datasets. The results show that the proposed method is more robust
against outliers, achieves consistently high accuracy, and runs at a speed
close to that of sample-consensus-based methods.
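As a rough illustration of the scoring idea (not the paper's exact CAS formulation: the abstract does not define the axial distance, so a scaled algebraic distance stands in for it, and the "stricter" combination is taken here as the pointwise maximum; `tau` and `alpha` are illustrative parameters), a sample-consensus score over a quadric might look like:

```python
import numpy as np

def quadric_value_and_grad(theta, pts):
    # General quadric Q(x) = x^T A x + b^T x + c, with A symmetric;
    # theta = (A (3x3), b (3,), c (scalar)), pts is (n, 3).
    A, b, c = theta
    q = np.einsum('ni,ij,nj->n', pts, A, pts) + pts @ b + c
    grad = 2.0 * pts @ A + b  # gradient of Q at each point (A symmetric)
    return q, grad

def sampson_distance(theta, pts):
    # First-order geometric approximation: |Q(x)| / ||grad Q(x)||.
    q, grad = quadric_value_and_grad(theta, pts)
    return np.abs(q) / np.linalg.norm(grad, axis=1)

def combined_score(theta, pts, tau=0.05, alpha=1.0):
    # Hypothetical combination: take the stricter (larger) of the
    # Sampson distance and a scaled algebraic distance, which stands
    # in for the paper's axial distance; the score counts inliers.
    q, _ = quadric_value_and_grad(theta, pts)
    d = np.maximum(sampson_distance(theta, pts), alpha * np.abs(q))
    return int(np.count_nonzero(d < tau))
```

In a RANSAC loop, candidate ellipsoids would be fit to minimal samples and the one maximizing this score kept.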
Related papers
- Multivariate root-n-consistent smoothing parameter free matching estimators and estimators of inverse density weighted expectations [51.000851088730684]
We develop novel modifications of nearest-neighbor and matching estimators which converge at the parametric $\sqrt{n}$-rate.
We stress that our estimators do not involve nonparametric function estimators and, in particular, do not rely on sample-size-dependent smoothing parameters.
arXiv Detail & Related papers (2024-07-11T13:28:34Z)
- SPARE: Symmetrized Point-to-Plane Distance for Robust Non-Rigid Registration [76.40993825836222]
We propose SPARE, a novel formulation that utilizes a symmetrized point-to-plane distance for robust non-rigid registration.
The proposed method greatly improves the accuracy of non-rigid registration problems and maintains relatively high solution efficiency.
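As a hedged sketch of the metric family the paper builds on (a Rusinkiewicz-style symmetrization, not necessarily SPARE's exact formulation), the symmetrized point-to-plane residual projects the displacement onto the normals of both surfaces instead of only the target's:

```python
import numpy as np

def sym_point_to_plane(p, q, n_p, n_q):
    # Symmetrized point-to-plane residual between corresponding points
    # p and q with unit normals n_p and n_q: project the displacement
    # onto the sum of the two normals, so both surfaces constrain it.
    return float(np.dot(p - q, n_p + n_q))
```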
arXiv Detail & Related papers (2023-12-15T17:28:09Z)
- Nearest Neighbor Sampling for Covariate Shift Adaptation [7.940293148084844]
We propose a new covariate shift adaptation method which avoids estimating the weights.
The basic idea is to directly work on unlabeled target data, labeled according to the $k$-nearest neighbors in the source dataset.
Our experiments show that it achieves drastic reduction in the running time with remarkable accuracy.
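The basic idea above can be sketched directly, assuming plain Euclidean $k$-NN with majority voting (the function name and the tie-breaking rule are illustrative choices, not the paper's):

```python
import numpy as np
from collections import Counter

def knn_label_target(src_X, src_y, tgt_X, k=3):
    # Label each unlabeled target point by a majority vote of its
    # k nearest neighbors in the labeled source dataset; ties break
    # toward the smaller label for determinism.
    labels = []
    for x in tgt_X:
        idx = np.argsort(np.linalg.norm(src_X - x, axis=1))[:k]
        votes = Counter(src_y[i] for i in idx)
        labels.append(min(votes, key=lambda lab: (-votes[lab], lab)))
    return np.array(labels)
```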
arXiv Detail & Related papers (2023-05-27T19:10:19Z)
- Diffeomorphic Mesh Deformation via Efficient Optimal Transport for Cortical Surface Reconstruction [40.73187749820041]
Mesh deformation plays a pivotal role in many 3D vision tasks including dynamic simulations, rendering, and reconstruction.
A prevalent approach in current deep learning is the set-based approach, which measures the discrepancy between two surfaces by comparing point clouds randomly sampled from the two meshes under the Chamfer pseudo-distance.
We propose a novel metric for learning mesh deformation, defined by the sliced Wasserstein distance on meshes represented as probability measures, which generalizes the set-based approach.
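The sliced Wasserstein distance underlying such a metric can be sketched for two equal-size point sets (the number of random projection directions is an arbitrary choice):

```python
import numpy as np

def sliced_wasserstein(X, Y, n_proj=50, seed=0):
    # Monte Carlo sliced Wasserstein-1 between two equal-size point
    # sets: project onto random unit directions and match the sorted
    # 1-D projections direction by direction.
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    total = 0.0
    for _ in range(n_proj):
        v = rng.standard_normal(d)
        v /= np.linalg.norm(v)
        total += np.mean(np.abs(np.sort(X @ v) - np.sort(Y @ v)))
    return total / n_proj
```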
arXiv Detail & Related papers (2023-05-27T19:10:19Z)
- Robust Ellipsoid-specific Fitting via Expectation Maximization [0.0]
Ellipsoid fitting is of general interest in machine vision tasks such as object detection and shape approximation.
We propose a novel and robust method for ellipsoid fitting in a noisy, outlier-contaminated 3D environment.
Our method is ellipsoid-specific, parameter free, and more robust against noise, outliers, and large axis ratios.
arXiv Detail & Related papers (2021-10-26T00:43:02Z)
- A Stochastic Newton Algorithm for Distributed Convex Optimization [62.20732134991661]
We analyze a stochastic Newton algorithm for homogeneous distributed convex optimization, where each machine can calculate stochastic gradients of the same population objective.
We show that our method can reduce the number, and frequency, of required communication rounds compared to existing methods without hurting performance.
arXiv Detail & Related papers (2021-10-07T17:51:10Z)
- Kernel distance measures for time series, random fields and other structured data [71.61147615789537]
kdiff is a novel kernel-based measure for estimating distances between instances of structured data.
It accounts for both self and cross similarities across the instances and is defined using a lower quantile of the distance distribution.
Some theoretical results are provided for separability conditions using kdiff as a distance measure for clustering and classification problems.
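A loose sketch of the quantile idea (the published kdiff is kernel-based and differs in detail; the particular combination of self and cross terms below is an illustrative assumption, not the paper's definition):

```python
import numpy as np

def kdiff_sketch(X, Y, q=0.1):
    # Illustrative distance between two instances of structured data:
    # a lower quantile of the cross-distance distribution, offset by
    # the corresponding self-distance quantiles of each instance.
    def lq(A, B):
        d = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=2)
        return np.quantile(d, q)
    return lq(X, Y) - 0.5 * (lq(X, X) + lq(Y, Y))
```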
arXiv Detail & Related papers (2021-09-29T22:54:17Z)
- Instance-Optimal Compressed Sensing via Posterior Sampling [101.43899352984774]
We show that, for Gaussian measurements and any prior distribution on the signal, the posterior sampling estimator achieves near-optimal recovery guarantees.
We implement the posterior sampling estimator for deep generative priors using Langevin dynamics, and empirically find that it produces accurate estimates with more diversity than MAP.
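Langevin dynamics itself can be sketched with a toy log-density in place of a deep generative prior (the step size and the standard-normal target below are illustrative choices):

```python
import numpy as np

def langevin_samples(grad_logp, x0, eps=0.1, n_steps=10_000, seed=0):
    # Unadjusted Langevin dynamics in 1-D:
    #   x <- x + (eps / 2) * grad log p(x) + sqrt(eps) * N(0, 1).
    # Iterating this drifts toward high-density regions while the
    # noise keeps the chain exploring, approximating samples from p.
    rng = np.random.default_rng(seed)
    x = float(x0)
    out = []
    for _ in range(n_steps):
        x = x + 0.5 * eps * grad_logp(x) + np.sqrt(eps) * rng.standard_normal()
        out.append(x)
    return np.array(out)
```

For a standard normal target, `grad_logp` is simply `lambda x: -x`.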
arXiv Detail & Related papers (2021-06-21T22:51:56Z)
- MongeNet: Efficient Sampler for Geometric Deep Learning [17.369783838267942]
MongeNet is a fast and optimal transport based sampler that allows for an accurate discretization of a mesh with better approximation properties.
We compare our method to the ubiquitous random uniform sampling and show that the approximation error is almost half with a very small computational overhead.
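The random uniform baseline the comparison refers to can be sketched for a single triangle using the standard square-root barycentric trick:

```python
import numpy as np

def sample_triangle_uniform(v0, v1, v2, n, seed=0):
    # Uniform sampling on a triangle: drawing r1, r2 ~ U[0, 1) and
    # taking sqrt(r1) makes the barycentric weights uniform over the
    # triangle's area (plain r1 would cluster points near v0).
    rng = np.random.default_rng(seed)
    r1 = np.sqrt(rng.random(n))
    r2 = rng.random(n)
    return ((1 - r1)[:, None] * v0
            + (r1 * (1 - r2))[:, None] * v1
            + (r1 * r2)[:, None] * v2)
```

Sampling a whole mesh would additionally pick triangles with probability proportional to their area.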
arXiv Detail & Related papers (2021-04-29T17:59:01Z)
- The Unbalanced Gromov Wasserstein Distance: Conic Formulation and Relaxation [0.0]
Comparing metric measure spaces (i.e. a metric space endowed with a probability distribution) is at the heart of many machine learning problems.
The most popular distance between such metric measure spaces is the Gromov-Wasserstein (GW) distance, which is a quadratic optimization problem.
The unbalanced formulation allows the comparison of metric spaces equipped with arbitrary positive measures up to isometries.
arXiv Detail & Related papers (2020-09-09T12:38:14Z)
- Minimax Optimal Estimation of KL Divergence for Continuous Distributions [56.29748742084386]
Estimating Kullback-Leibler divergence from independent and identically distributed samples is an important problem in various domains.
One simple and effective estimator is based on the k nearest neighbor distances between these samples.
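A sketch of the classical k-NN estimator this refers to (in the style of the Wang-Kulkarni-Verdu estimator; the constants follow that estimator and assume continuous densities in R^d):

```python
import numpy as np

def kl_knn_estimate(X, Y, k=1):
    # k-NN estimate of D(p || q) from X ~ p (n samples) and Y ~ q
    # (m samples) in d dimensions:
    #   (d / n) * sum_i log(nu_k(i) / rho_k(i)) + log(m / (n - 1)),
    # where rho_k is the k-th NN distance within X (self excluded)
    # and nu_k is the k-th NN distance from X_i to Y.
    n, d = X.shape
    m = Y.shape[0]
    dx = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    np.fill_diagonal(dx, np.inf)
    rho = np.sort(dx, axis=1)[:, k - 1]
    dy = np.linalg.norm(X[:, None, :] - Y[None, :, :], axis=2)
    nu = np.sort(dy, axis=1)[:, k - 1]
    return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1))
```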
arXiv Detail & Related papers (2020-02-26T16:37:37Z)