Revisiting Robust Model Fitting Using Truncated Loss
- URL: http://arxiv.org/abs/2008.01574v2
- Date: Sun, 25 Jun 2023 13:10:25 GMT
- Title: Revisiting Robust Model Fitting Using Truncated Loss
- Authors: Fei Wen, Hewen Wei, Yipeng Liu, and Peilin Liu
- Abstract summary: New algorithms are applied to various 2D/3D registration problems.
They outperform RANSAC and deterministic approximate MC methods at high outlier ratios.
New algorithms also compare favorably with state-of-the-art registration methods, especially under high noise and outlier ratios.
- Score: 19.137291311347788
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Robust fitting is a fundamental problem in low-level vision, which is
typically achieved by maximum consensus (MC) estimators to identify inliers
first or by M-estimators directly. While the two approaches are preferred in
different applications, truncated-loss-based M-estimators are
similar to MC as they can also identify inliers. This work revisits a
formulation that achieves simultaneous inlier identification and model
estimation (SIME) using truncated loss. Its generalized form adapts to
both linear and nonlinear residual models. We show that as SIME takes fitting
residual into account in finding inliers, its lowest achievable residual in
model fitting is lower than that of MC robust fitting. Then, an alternating
minimization (AM) algorithm is employed to solve the SIME formulation.
Meanwhile, a semidefinite relaxation (SDR) embedded AM algorithm is developed
in order to ease the high nonconvexity of the SIME formulation. Furthermore,
the new algorithms are applied to various 2D/3D registration problems.
Experimental results show that the new algorithms significantly outperform
RANSAC and deterministic approximate MC methods at high outlier ratios.
Besides, in rotation and Euclidean registration problems, the new algorithms
also compare favorably with state-of-the-art registration methods, especially
under high noise and outlier ratios. Code is available at
https://github.com/FWen/mcme.git.
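The simultaneous inlier identification and model estimation idea can be illustrated for a linear residual model: alternate between selecting points whose residual falls below the truncation level and refitting by least squares on the selected set. The helper name, the annealing of the threshold, and the toy data below are illustrative assumptions, not the paper's released implementation.

```python
import numpy as np

def sime_am_fit(A, b, tau):
    """Sketch of truncated-loss fitting via alternating minimization (AM):
    alternately identify inliers (residual below the threshold) and refit by
    least squares on them, annealing the threshold from the largest initial
    residual down to tau. Hypothetical helper, not the paper's code."""
    x = np.linalg.lstsq(A, b, rcond=None)[0]      # ordinary LS initialization
    t = max(np.abs(A @ x - b).max(), tau)
    while True:
        inliers = np.abs(A @ x - b) <= t          # inlier identification step
        if inliers.sum() >= A.shape[1]:
            x = np.linalg.lstsq(A[inliers], b[inliers], rcond=None)[0]
        if t <= tau:                              # reached the target threshold
            return x, np.abs(A @ x - b) <= tau
        t = max(tau, 0.5 * t)                     # anneal the truncation level

# Toy example: fit y = 2 t + 1 with 40% gross outliers.
rng = np.random.default_rng(0)
t = rng.uniform(-1.0, 1.0, 100)
A = np.column_stack([t, np.ones_like(t)])
b = 2.0 * t + 1.0 + 0.01 * rng.standard_normal(100)
b[:20] += 8.0                                     # gross outliers above the line
b[20:40] -= 8.0                                   # gross outliers below the line
x_hat, inliers = sime_am_fit(A, b, tau=0.1)
```

Because the threshold is annealed, the first iterations behave like ordinary least squares and later ones like strict inlier selection, which avoids discarding everything when the initial fit is biased by outliers.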
Related papers
- On the Performance of Empirical Risk Minimization with Smoothed Data [59.3428024282545]
We show that Empirical Risk Minimization (ERM) achieves sublinear error whenever a class is learnable with iid data.
arXiv Detail & Related papers (2024-02-22T21:55:41Z) - Fast Semisupervised Unmixing Using Nonconvex Optimization [80.11512905623417]
We introduce a novel model for semisupervised/library-based unmixing and demonstrate its efficacy for sparse unmixing.
arXiv Detail & Related papers (2024-01-23T10:07:41Z) - Outlier-Insensitive Kalman Filtering Using NUV Priors [24.413595920205907]
In practice, observations are corrupted by outliers, severely impairing the Kalman filter's (KF) performance.
In this work, an outlier-insensitive KF is proposed, which is achieved by modeling each potential outlier as a normally distributed random variable with unknown variance (NUV).
The NUV variances are estimated online, using both expectation-maximization (EM) and alternating maximization (AM).
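A minimal sketch of the NUV idea, assuming a scalar random-walk model and a single EM-style fixed-point step per measurement; both simplifications are assumptions for illustration and do not reproduce the paper's filter.

```python
import numpy as np

def nuv_robust_kf(y, q=1e-3, r=1e-2):
    """Sketch of an outlier-insensitive scalar Kalman filter with NUV priors:
    each measurement noise variance is inflated by an unknown-variance term
    s_k = max(0, e_k^2 - (P_pred + r)) estimated from the innovation e_k.
    A large innovation yields a large s_k and thus a tiny gain, so the
    outlier is nearly ignored. Simplified scalar random-walk illustration."""
    x, P = float(y[0]), 1.0
    estimates = []
    for yk in y:
        P = P + q                           # predict (random-walk model)
        e = yk - x                          # innovation
        s = max(0.0, e * e - (P + r))       # NUV variance estimate
        K = P / (P + r + s)                 # gain with inflated variance
        x = x + K * e
        P = (1.0 - K) * P
        estimates.append(x)
    return np.array(estimates)

# Constant signal with one gross outlier: the outlier is absorbed by s_k.
rng = np.random.default_rng(1)
y = 1.0 + 0.05 * rng.standard_normal(200)
y[100] += 50.0
est = nuv_robust_kf(y)
```

A standard KF with gain P/(P + r) would jump toward the corrupted measurement; here the same update with variance r + s leaves the estimate essentially unchanged at the outlier.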
arXiv Detail & Related papers (2022-10-12T11:00:13Z) - On Learning Mixture of Linear Regressions in the Non-Realizable Setting [44.307245411703704]
We show that a mixture of linear regressions (MLR) can be used for prediction where, instead of predicting a single label, the model predicts a list of values.
In this paper we show that a version of the popular alternating minimization (AM) algorithm finds the best-fit lines in a dataset even when a realizable model is not assumed.
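The AM scheme for MLR can be sketched as a k-means-like loop: assign each point to the line with the smallest residual, then refit each line by least squares. The restart wrapper, objective, and toy data below are assumptions for illustration, not the paper's algorithm or guarantees.

```python
import numpy as np

def am_mlr(X, y, k=2, n_iters=50, seed=0):
    """One run of alternating minimization (AM) for a mixture of k linear
    regressions: assign each point to the line with the smallest residual,
    then refit each line by least squares on its assigned points."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W = rng.standard_normal((k, d))              # random initial lines
    for _ in range(n_iters):
        resid = (X @ W.T - y[:, None]) ** 2      # (n, k) residual matrix
        labels = resid.argmin(axis=1)            # assignment step
        for j in range(k):                       # refit step, per component
            mask = labels == j
            if mask.sum() >= d:
                W[j] = np.linalg.lstsq(X[mask], y[mask], rcond=None)[0]
    return W

def am_mlr_best(X, y, k=2, restarts=8):
    """Run AM from several random initializations and keep the solution with
    the lowest objective (mean per-point minimum squared residual)."""
    best_W, best_obj = None, np.inf
    for s in range(restarts):
        W = am_mlr(X, y, k=k, seed=s)
        obj = ((X @ W.T - y[:, None]) ** 2).min(axis=1).mean()
        if obj < best_obj:
            best_W, best_obj = W, obj
    return best_W, best_obj

# Toy mixture: y = 3 t or y = -3 t, plus small noise.
rng = np.random.default_rng(2)
t = rng.uniform(-1.0, 1.0, 400)
X = np.column_stack([t, np.ones_like(t)])
y = np.where(rng.random(400) < 0.5, 3.0 * t, -3.0 * t)
y += 0.05 * rng.standard_normal(400)
W, obj = am_mlr_best(X, y)
```

Random restarts are used because a single AM run from a random initialization can stall in a poor local minimum; the best-objective run typically recovers both lines on this well-separated toy mixture.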
arXiv Detail & Related papers (2022-05-26T05:34:57Z) - Effective multi-view registration of point sets based on student's t
mixture model [15.441928157356477]
This paper proposes an effective registration method based on Student's t Mixture Model (StMM)
Multi-view registration is achieved more efficiently since all t-distribution centroids can be obtained by nearest-neighbor (NN) search.
Experimental results illustrate its superior performance and accuracy over state-of-the-art methods.
arXiv Detail & Related papers (2020-12-13T08:27:29Z) - DeepGMR: Learning Latent Gaussian Mixture Models for Registration [113.74060941036664]
Point cloud registration is a fundamental problem in 3D computer vision, graphics and robotics.
In this paper, we introduce Deep Gaussian Mixture Registration (DeepGMR), the first learning-based registration method that explicitly leverages a probabilistic registration paradigm.
Our proposed method shows favorable performance when compared with state-of-the-art geometry-based and learning-based registration methods.
arXiv Detail & Related papers (2020-08-20T17:25:16Z) - Outlier-Robust Estimation: Hardness, Minimally Tuned Algorithms, and
Applications [25.222024234900445]
This paper introduces two unifying formulations for outlier-robust estimation, Generalized Maximum Consensus (G-MC) and Generalized Truncated Least Squares (G-TLS).
Our first contribution is a proof that outlier-robust estimation is inapproximable: in the worst case, it is impossible to (even approximately) find the set of outliers.
We propose the first minimally tuned algorithms for outlier rejection, that dynamically decide how to separate inliers from outliers.
arXiv Detail & Related papers (2020-07-29T21:06:13Z) - Making Affine Correspondences Work in Camera Geometry Computation [62.7633180470428]
Local features provide region-to-region rather than point-to-point correspondences.
We propose guidelines for effective use of region-to-region matches in the course of a full model estimation pipeline.
Experiments show that affine solvers can achieve accuracy comparable to point-based solvers at faster run-times.
arXiv Detail & Related papers (2020-07-20T12:07:48Z) - Robust Compressed Sensing using Generative Models [98.64228459705859]
In this paper we propose an algorithm inspired by the Median-of-Means (MOM)
Our algorithm guarantees recovery for heavy-tailed data, even in the presence of outliers.
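The Median-of-Means building block mentioned above is easy to sketch on its own: split the sample into blocks, average each block, and take the median of the block means. This is an illustration of the MoM estimator only, not the paper's compressed-sensing recovery algorithm, which embeds MoM inside a reconstruction objective.

```python
import numpy as np

def median_of_means(x, n_blocks=15, seed=0):
    """Median-of-Means (MoM): shuffle the sample, split it into blocks,
    average each block, and return the median of the block means. With k
    gross outliers, at most k blocks are contaminated, so the median stays
    close to the true mean as long as k is well below n_blocks / 2."""
    rng = np.random.default_rng(seed)
    x = rng.permutation(np.asarray(x, dtype=float))
    blocks = np.array_split(x, n_blocks)
    return float(np.median([b.mean() for b in blocks]))

# Contamination test: 3 gross outliers corrupt at most 3 of 15 blocks.
rng = np.random.default_rng(3)
data = 5.0 + rng.standard_normal(990)
data[:3] = 1e6                          # gross outliers
mom = median_of_means(data)
naive = float(data.mean())              # the plain mean is destroyed
```

The plain sample mean is shifted by thousands by the three corrupted values, while the MoM estimate remains near the true mean of 5.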
arXiv Detail & Related papers (2020-06-16T19:07:41Z) - Least Squares Regression with Markovian Data: Fundamental Limits and
Algorithms [69.45237691598774]
We study the problem of least squares linear regression where the data-points are dependent and are sampled from a Markov chain.
We establish sharp information-theoretic minimax lower bounds for this problem in terms of the mixing time $\tau_{\mathsf{mix}}$.
We propose an algorithm based on experience replay (a popular reinforcement learning technique) that achieves a significantly better error rate.
arXiv Detail & Related papers (2020-06-16T04:26:50Z) - Quasi-Newton Solver for Robust Non-Rigid Registration [35.66014845211251]
We propose a formulation for robust non-rigid registration based on a globally smooth robust estimator for data fitting and regularization.
We apply the majorization-minimization algorithm to the problem, which reduces each iteration to solving a simple least-squares problem with L-BFGS.
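The majorization-minimization (MM) step can be sketched for a linear model with a smooth robust estimator: majorizing the Welsch loss by a quadratic at the current iterate turns each MM step into a weighted least-squares solve (IRLS). The annealed scale and the toy line-fitting data are assumptions for illustration; this is not the paper's non-rigid registration solver or its L-BFGS step.

```python
import numpy as np

def irls_welsch(A, b, nu=0.1, n_iters=60):
    """MM/IRLS for the Welsch loss rho(r) = 1 - exp(-r^2 / (2 nu^2)):
    each step solves a weighted least-squares problem with weights
    w_i = exp(-r_i^2 / (2 nu_t^2)), annealing the scale nu_t from the
    largest initial residual down to nu (a common graduated scheme)."""
    x = np.linalg.lstsq(A, b, rcond=None)[0]
    nu_t = np.abs(A @ x - b).max() + 1e-9
    for _ in range(n_iters):
        r = A @ x - b
        w = np.exp(-r ** 2 / (2.0 * nu_t ** 2))   # Welsch weights in (0, 1]
        Aw = A * w[:, None]
        x = np.linalg.solve(A.T @ Aw, Aw.T @ b)   # weighted LS normal equations
        nu_t = max(nu, 0.9 * nu_t)                # anneal toward target scale
    return x

# Line fitting with 25% of points replaced by random junk.
rng = np.random.default_rng(4)
t = rng.uniform(-1.0, 1.0, 200)
A = np.column_stack([t, np.ones_like(t)])
b = 2.0 * t + 1.0 + 0.05 * rng.standard_normal(200)
b[:50] = rng.uniform(-10.0, 10.0, 50)             # gross outliers
x_hat = irls_welsch(A, b)
```

Starting with a wide scale keeps the problem close to ordinary least squares, and shrinking it gradually drives outlier weights toward zero without the solve becoming singular.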
arXiv Detail & Related papers (2020-04-09T01:45:05Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.