SiamRCR: Reciprocal Classification and Regression for Visual Object
Tracking
- URL: http://arxiv.org/abs/2105.11237v1
- Date: Mon, 24 May 2021 12:21:25 GMT
- Title: SiamRCR: Reciprocal Classification and Regression for Visual Object
Tracking
- Authors: Jinlong Peng, Zhengkai Jiang, Yueyang Gu, Yang Wu, Yabiao Wang, Ying
Tai, Chengjie Wang, Weiyao Lin
- Abstract summary: We propose a novel siamese tracking algorithm called SiamRCR, which addresses the accuracy misalignment between classification and regression with a simple, light and effective solution.
It builds reciprocal links between the classification and regression branches, which dynamically re-weight their losses for each positive sample.
In addition, we add a localization branch to predict the localization accuracy, so that it can replace the regression assistance link during inference.
- Score: 47.647615772027606
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recently, most siamese network based trackers locate targets via object
classification and bounding-box regression. Generally, they select the
bounding-box with maximum classification confidence as the final prediction.
This strategy may miss the right result due to the accuracy misalignment
between classification and regression. In this paper, we propose a novel
siamese tracking algorithm called SiamRCR, addressing this problem with a
simple, light and effective solution. It builds reciprocal links between
classification and regression branches, which can dynamically re-weight their
losses for each positive sample. In addition, we add a localization branch to
predict the localization accuracy, so that it can replace the regression
assistance link during inference. This branch makes training and inference more
consistent. Extensive experimental results demonstrate the
effectiveness of SiamRCR and its superiority over the state-of-the-art
competitors on GOT-10k, LaSOT, TrackingNet, OTB-2015, VOT-2018 and VOT-2019.
Moreover, our SiamRCR runs at 65 FPS, far above the real-time requirement.
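To make the idea concrete, here is a minimal PyTorch sketch of how such reciprocal re-weighting could look. It is not the authors' implementation: the function names (`reciprocal_losses`, `fused_inference_score`), the choice of binary cross-entropy and 1 - IoU as the base losses, and the multiplicative score fusion at inference are illustrative assumptions; only the overall idea (classification loss weighted by regression quality, regression loss weighted by classification confidence, and a predicted localization score standing in for the regression link at test time) follows the abstract.

```python
import torch
import torch.nn.functional as F

def box_iou(pred, gt):
    # IoU between matched (x1, y1, x2, y2) boxes; both tensors have shape (N, 4).
    lt = torch.max(pred[:, :2], gt[:, :2])
    rb = torch.min(pred[:, 2:], gt[:, 2:])
    wh = (rb - lt).clamp(min=0)
    inter = wh[:, 0] * wh[:, 1]
    area_p = (pred[:, 2] - pred[:, 0]) * (pred[:, 3] - pred[:, 1])
    area_g = (gt[:, 2] - gt[:, 0]) * (gt[:, 3] - gt[:, 1])
    return inter / (area_p + area_g - inter + 1e-7)

def reciprocal_losses(cls_logits, pred_boxes, gt_boxes):
    # cls_logits: (N,) classification logits of positive samples.
    # pred_boxes, gt_boxes: (N, 4) regressed boxes and their ground truths.
    cls_prob = cls_logits.sigmoid()
    iou = box_iou(pred_boxes, gt_boxes)

    # Regression -> classification link: a sample whose box is well localized
    # (high IoU) contributes more to the classification loss.
    cls_loss = F.binary_cross_entropy_with_logits(
        cls_logits, torch.ones_like(cls_logits), reduction="none")
    cls_loss = (iou.detach() * cls_loss).mean()

    # Classification -> regression link: a confidently classified sample
    # contributes more to the regression loss (a simple 1 - IoU stand-in here).
    reg_loss = (cls_prob.detach() * (1.0 - iou)).mean()
    return cls_loss, reg_loss

def fused_inference_score(cls_prob, loc_pred):
    # At test time, the localization branch's predicted accuracy replaces the
    # regression-side link: fuse it with the classification score so the
    # selected box reflects both confidence and expected localization quality.
    return cls_prob * loc_pred
```

Detaching the weights keeps each branch's gradient flowing only through its own loss, which is one simple way to realize a "link" between the two heads without coupling their gradients.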
Related papers
- Multi-attention Associate Prediction Network for Visual Tracking [3.9628431811908533]
Classification-regression prediction networks have achieved impressive success in several modern deep trackers.
There is an inherent difference between the classification and regression tasks, so they have different, even opposite, demands for feature matching.
We propose a multi-attention associate prediction network (MAPNet) to tackle the above problems.
arXiv Detail & Related papers (2024-03-25T03:18:58Z)
- SiamTHN: Siamese Target Highlight Network for Visual Tracking [11.111738354621595]
Siamese network based trackers treat each channel in the feature maps generated by the backbone network equally.
There are no structural links between the classification and regression branches in these trackers, and the two branches are optimized separately during training.
A Target Highlight Module is proposed to help the generated similarity response maps to be more focused on the target region.
arXiv Detail & Related papers (2023-03-22T04:33:02Z)
- Open-Set Recognition: A Good Closed-Set Classifier is All You Need [146.6814176602689]
We show that the ability of a classifier to make the 'none-of-above' decision is highly correlated with its accuracy on the closed-set classes.
We use this correlation to boost the performance of the cross-entropy OSR 'baseline' by improving its closed-set accuracy.
We also construct new benchmarks which better respect the task of detecting semantic novelty.
arXiv Detail & Related papers (2021-10-12T17:58:59Z)
- Target Transformed Regression for Accurate Tracking [30.516462193231888]
This paper repurposes a Transformer-like regression branch, termed Target Transformed Regression (TREG), for accurate anchor-free tracking.
The core of TREG is to model the pair-wise relation between elements in the target template and the search region, and to use the resulting target-enhanced visual representation for accurate bounding-box regression.
In addition, we devise a simple online template update mechanism to select reliable templates, increasing robustness to appearance variations and geometric deformations of the target over time.
arXiv Detail & Related papers (2021-04-01T11:25:23Z)
- Higher Performance Visual Tracking with Dual-Modal Localization [106.91097443275035]
Visual Object Tracking (VOT) has synchronous needs for both robustness and accuracy.
We propose a dual-modal framework for target localization, consisting of robust localization that suppresses distractors via ONR and accurate localization that attends precisely to the target center via OFC.
arXiv Detail & Related papers (2021-03-18T08:47:56Z)
- CRACT: Cascaded Regression-Align-Classification for Robust Visual Tracking [97.84109669027225]
We introduce an improved proposal refinement module, Cascaded Regression-Align-Classification (CRAC).
CRAC yields new state-of-the-art performance on many benchmarks.
In experiments on seven benchmarks including OTB-2015, UAV123, NfS, VOT-2018, TrackingNet, GOT-10k and LaSOT, our CRACT exhibits very promising results in comparison with state-of-the-art competitors.
arXiv Detail & Related papers (2020-11-25T02:18:33Z)
- Cascaded Regression Tracking: Towards Online Hard Distractor Discrimination [202.2562153608092]
We propose a cascaded regression tracker with two sequential stages.
In the first stage, we filter out abundant easily-identified negative candidates.
In the second stage, a discrete sampling based ridge regression is designed to double-check the remaining ambiguous hard samples.
arXiv Detail & Related papers (2020-06-18T07:48:01Z)
- Fully Convolutional Online Tracking [47.78513247048846]
We present a fully convolutional online tracking framework, coined as FCOT, for both classification and regression branches.
Our key contribution is to introduce an online regression model generator (RMG) for initializing weights of the target filter with online samples.
Thanks to the unique design of RMG, our FCOT is more effective in handling target variation along the temporal dimension, thus generating more precise results.
arXiv Detail & Related papers (2020-04-15T14:21:57Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information (including all content) and is not responsible for any consequences arising from its use.