VSAC: Efficient and Accurate Estimator for H and F
- URL: http://arxiv.org/abs/2106.10240v1
- Date: Fri, 18 Jun 2021 17:04:57 GMT
- Title: VSAC: Efficient and Accurate Estimator for H and F
- Authors: Maksym Ivashechkin, Daniel Barath, Jiri Matas
- Abstract summary: VSAC is a RANSAC-type robust estimator with a number of novelties.
It is significantly faster than all its predecessors, running on average in 1-2 ms on a CPU.
It is two orders of magnitude faster than, and yet as precise as, MAGSAC++, currently the most accurate estimator of two-view geometry.
- Score: 68.65610177368617
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We present VSAC, a RANSAC-type robust estimator with a number of novelties.
It benefits from the introduced concept of independent inliers, which
significantly improves the efficacy of dominant-plane handling and also
allows near error-free rejection of incorrect models without false positives.
The local optimization process and its application are improved so that it
runs on average only once. Further technical improvements include adaptive
sequential hypothesis verification and efficient model estimation via Gaussian
elimination. Experiments on four standard datasets show that VSAC is
significantly faster than all its predecessors, running on average in 1-2 ms
on a CPU. It is two orders of magnitude faster than, and yet as precise as,
MAGSAC++, currently the most accurate estimator of two-view geometry. In
repeated runs on the EVD, HPatches, PhotoTourism, and Kusvod2 datasets, it never failed.
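
The components the abstract names (minimal-sample hypotheses, model estimation, inlier verification with early stopping, and a final refit) all sit inside the classical hypothesize-and-verify loop that VSAC accelerates. The sketch below is a minimal, generic RANSAC homography estimator in plain NumPy, included only as a reference point; it is not the authors' implementation, and the comments mark where VSAC's contributions (independent inliers for dominant-plane handling, a local optimization run on average once, SPRT-style adaptive sequential verification, and Gaussian-elimination model solving) would plug in.

```python
# A minimal, generic RANSAC loop for homography estimation in plain NumPy.
# Reference sketch only, NOT the authors' VSAC implementation; the comments
# mark where the improvements described in the abstract would plug in.
import numpy as np

def homography_dlt(src, dst):
    """Direct Linear Transform: 3x3 homography from >= 4 correspondences."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # VSAC estimates minimal models via Gaussian elimination rather than SVD.
    _, _, Vt = np.linalg.svd(np.asarray(rows, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def transfer_errors(H, src, dst):
    """One-way transfer error of every correspondence under H."""
    pts = np.hstack([src, np.ones((len(src), 1))]) @ H.T
    return np.linalg.norm(pts[:, :2] / pts[:, 2:3] - dst, axis=1)

def ransac_homography(src, dst, thresh=2.0, conf=0.999, max_iters=10_000, seed=0):
    """src, dst: (N, 2) arrays of matched points. Returns (H, inlier mask)."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    rng = np.random.default_rng(seed)
    n = len(src)
    best_inliers = np.zeros(n, dtype=bool)
    iters, it = max_iters, 0
    while it < iters:
        sample = rng.choice(n, size=4, replace=False)
        H = homography_dlt(src[sample], dst[sample])
        # VSAC verifies hypotheses with adaptive sequential (SPRT-style) tests
        # instead of scoring every correspondence against every model.
        inliers = transfer_errors(H, src, dst) < thresh
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
            # VSAC runs its improved local optimization here, on average once,
            # and uses independent inliers to handle a dominant plane.
            # Standard adaptive stopping: enough iterations so that, with
            # probability `conf`, at least one all-inlier 4-tuple was sampled.
            w = inliers.mean()
            iters = min(max_iters, int(np.ceil(
                np.log(1 - conf) / np.log(1 - w ** 4 + 1e-12))))
        it += 1
    # Final least-squares refit on all inliers of the best model.
    return homography_dlt(src[best_inliers], dst[best_inliers]), best_inliers
```

For practical use, OpenCV's USAC backends expose production-grade estimators from this family, e.g. `H, mask = cv2.findHomography(src, dst, cv2.USAC_MAGSAC, 2.0)` in OpenCV 4.5+, which runs MAGSAC++, the baseline the abstract compares against.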
Related papers
- SureMap: Simultaneous Mean Estimation for Single-Task and Multi-Task Disaggregated Evaluation [75.56845750400116]
Disaggregated evaluation -- estimation of performance of a machine learning model on different subpopulations -- is a core task when assessing performance and group-fairness of AI systems.
We develop SureMap that has high estimation accuracy for both multi-task and single-task disaggregated evaluations of blackbox models.
Our method combines maximum a posteriori (MAP) estimation using a well-chosen prior with cross-validation-free tuning via Stein's unbiased risk estimate (SURE).
arXiv Detail & Related papers (2024-11-14T17:53:35Z)
- Consensus-Adaptive RANSAC [104.87576373187426]
We propose a new RANSAC framework that learns to explore the parameter space by considering the residuals seen so far via a novel attention layer.
The attention mechanism operates on a batch of point-to-model residuals, and updates a per-point estimation state to take into account the consensus found through a lightweight one-step transformer.
arXiv Detail & Related papers (2023-07-26T08:25:46Z)
- Winner-Take-All Column Row Sampling for Memory Efficient Adaptation of Language Model [89.8764435351222]
We propose a new family of unbiased estimators, called WTA-CRS, for matrix multiplication with reduced variance; a generic column-row sampling sketch of the underlying idea is given after this list.
Our work provides both theoretical and experimental evidence that, in the context of tuning transformers, our proposed estimators exhibit lower variance compared to existing ones.
arXiv Detail & Related papers (2023-05-24T15:52:08Z)
- A Computational Exploration of Emerging Methods of Variable Importance Estimation [0.0]
Estimating the importance of variables is an essential task in modern machine learning.
We present a computational and theoretical exploration of these emerging methods of variable importance estimation.
Our experiments show that PERF performs best in the case of highly correlated data.
arXiv Detail & Related papers (2022-08-05T20:00:56Z)
- Enhanced Doubly Robust Learning for Debiasing Post-click Conversion Rate Estimation [29.27760413892272]
Post-click conversion, as a strong signal of user preference, is valuable for building recommender systems.
Currently, most existing methods utilize counterfactual learning to debias recommender systems.
We propose a novel double learning approach for the MRDR estimator, which can convert the error imputation into the general CVR estimation.
arXiv Detail & Related papers (2021-05-28T06:59:49Z)
- Approximate Cross-Validation with Low-Rank Data in High Dimensions [35.74302895575951]
Cross-validation is an important tool for model assessment.
Approximate cross-validation (ACV) methods can lose both speed and accuracy in high dimensions unless sparsity structure is present in the data.
We develop a new algorithm for ACV that is fast and accurate in the presence of approximately low-rank (ALR) data.
arXiv Detail & Related papers (2020-08-24T16:34:05Z)
- AdamP: Slowing Down the Slowdown for Momentum Optimizers on Scale-invariant Weights [53.8489656709356]
Normalization techniques are a boon for modern deep learning.
It is often overlooked, however, that the additional introduction of momentum results in a rapid reduction in effective step sizes for scale-invariant weights.
In this paper, we verify that the widely adopted combination of the two ingredients leads to the premature decay of effective step sizes and sub-optimal model performance.
arXiv Detail & Related papers (2020-06-15T08:35:15Z)
- DADA: Differentiable Automatic Data Augmentation [58.560309490774976]
We propose Differentiable Automatic Data Augmentation (DADA) which dramatically reduces the cost.
We conduct extensive experiments on CIFAR-10, CIFAR-100, SVHN, and ImageNet datasets.
Results show our DADA is at least one order of magnitude faster than the state-of-the-art while achieving very comparable accuracy.
arXiv Detail & Related papers (2020-03-08T13:23:14Z)
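
The WTA-CRS entry above builds on unbiased sampling estimators of a matrix product. The following is a toy column-row (CR) sampling baseline of that idea, not the paper's WTA-CRS estimator (whose winner-take-all column selection is its contribution); the function name `cr_sample_product` and the test sizes are purely illustrative.

```python
# Toy column-row (CR) sampling estimator of a matrix product A @ B.
# Generic baseline for the family of unbiased estimators that WTA-CRS refines;
# NOT the WTA-CRS estimator itself.
import numpy as np

def cr_sample_product(A, B, k, rng=None):
    """Unbiased estimate of A @ B from k sampled column-row outer products.

    Columns of A (and matching rows of B) are drawn with probability
    proportional to ||A[:, i]|| * ||B[i, :]||, the classical
    variance-minimizing choice for this estimator.
    """
    rng = np.random.default_rng() if rng is None else rng
    scores = np.linalg.norm(A, axis=0) * np.linalg.norm(B, axis=1)
    p = scores / scores.sum()
    idx = rng.choice(A.shape[1], size=k, replace=True, p=p)
    # Importance-weighted sum of outer products: E[estimate] == A @ B.
    est = np.zeros((A.shape[0], B.shape[1]))
    for i in idx:
        est += np.outer(A[:, i], B[i, :]) / p[i]
    return est / k

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A, B = rng.normal(size=(64, 512)), rng.normal(size=(512, 32))
    exact = A @ B
    approx = cr_sample_product(A, B, k=128, rng=rng)
    print(np.linalg.norm(approx - exact) / np.linalg.norm(exact))
```

Increasing `k` trades compute for a smaller relative error; WTA-CRS-style estimators aim to shrink that error further at a fixed budget by reducing the variance of the sampler.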