Neural Non-Rigid Tracking
- URL: http://arxiv.org/abs/2006.13240v2
- Date: Tue, 12 Jan 2021 18:15:37 GMT
- Title: Neural Non-Rigid Tracking
- Authors: Aljaž Božič, Pablo Palafox, Michael Zollhöfer, Angela Dai, Justus Thies, Matthias Nießner
- Abstract summary: We introduce a novel, end-to-end learnable, differentiable non-rigid tracker.
We employ a convolutional neural network to predict dense correspondences and their confidences.
Compared to state-of-the-art approaches, our algorithm shows improved reconstruction performance.
- Score: 26.41847163649205
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We introduce a novel, end-to-end learnable, differentiable non-rigid tracker
that enables state-of-the-art non-rigid reconstruction by a learned robust
optimization. Given two input RGB-D frames of a non-rigidly moving object, we
employ a convolutional neural network to predict dense correspondences and
their confidences. These correspondences are used as constraints in an
as-rigid-as-possible (ARAP) optimization problem. By enabling gradient
back-propagation through the weighted non-linear least squares solver, we are
able to learn correspondences and confidences in an end-to-end manner such that
they are optimal for the task of non-rigid tracking. Under this formulation,
correspondence confidences can be learned via self-supervision, informing a
learned robust optimization, where outliers and wrong correspondences are
automatically down-weighted to enable effective tracking. Compared to
state-of-the-art approaches, our algorithm shows improved reconstruction
performance, while simultaneously achieving 85 times faster correspondence
prediction than comparable deep-learning based methods. We make our code
available.
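The core technical idea, backpropagating through a weighted non-linear least-squares solve so that correspondence confidences can be learned end to end, can be illustrated with a minimal sketch. The snippet below is an assumption-laden illustration in PyTorch, not the authors' released code: the function name weighted_gauss_newton_step and the dense tensors J, r, w are hypothetical stand-ins for the paper's ARAP Gauss-Newton system over a deformation graph. The point it demonstrates is that a standard differentiable linear solve lets gradients of a downstream loss flow back into the per-correspondence confidence weights.

```python
import torch

def weighted_gauss_newton_step(J, r, w):
    """One step of weighted non-linear least squares (hypothetical sketch).

    Solves the normal equations (J^T W J) dx = -J^T W r with W = diag(w),
    where w holds per-correspondence confidences. Every operation here is
    differentiable, so gradients flow from dx back into both r and w.
    """
    W = torch.diag(w)
    H = J.T @ W @ J                    # Gauss-Newton approximation of the Hessian
    g = J.T @ W @ r                    # weighted gradient of the residual energy
    return torch.linalg.solve(H, -g)   # differentiable linear solve

# Toy usage: confidences receive gradients through the solver itself.
J = torch.randn(100, 6)                  # Jacobian of 100 residuals w.r.t. 6 parameters
r = torch.randn(100)                     # residuals (e.g. correspondence errors)
w = torch.rand(100, requires_grad=True)  # learnable per-correspondence confidences
dx = weighted_gauss_newton_step(J, r, w)
loss = dx.pow(2).sum()                   # placeholder loss on the solver output
loss.backward()
assert w.grad is not None                # confidences are trained end to end
```

In this formulation, a confidence near zero effectively removes a correspondence from the normal equations, which is how outliers and wrong correspondences can be down-weighted automatically during training.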