Visual Object Tracking with Discriminative Filters and Siamese Networks:
  A Survey and Outlook
        - URL: http://arxiv.org/abs/2112.02838v1
 - Date: Mon, 6 Dec 2021 07:57:10 GMT
 - Title: Visual Object Tracking with Discriminative Filters and Siamese Networks:
  A Survey and Outlook
 - Authors: Sajid Javed, Martin Danelljan, Fahad Shahbaz Khan, Muhammad Haris
  Khan, Michael Felsberg, and Jiri Matas
 - Abstract summary: Discriminative Correlation Filters (DCFs) and deep Siamese Networks (SNs) have emerged as dominating tracking paradigms.
This survey presents a systematic and thorough review of more than 90 DCFs and Siamese trackers, based on results in nine tracking benchmarks.
 - License: http://creativecommons.org/licenses/by-sa/4.0/
 - Abstract: Accurate and robust visual object tracking is one of the most challenging and
fundamental computer vision problems. It entails estimating the trajectory of
the target in an image sequence, given only its initial location and
segmentation, or a rough approximation of it in the form of a bounding box.
Discriminative Correlation Filters (DCFs) and deep Siamese Networks (SNs) have
emerged as dominating tracking paradigms, which have led to significant
progress. Following the rapid evolution of visual object tracking in the last
decade, this survey presents a systematic and thorough review of more than 90
DCFs and Siamese trackers, based on results in nine tracking benchmarks. First,
we present the background theory of both the DCF and Siamese tracking core
formulations. Then, we distinguish and comprehensively review the shared as
well as specific open research challenges in both these tracking paradigms.
Furthermore, we thoroughly analyze the performance of DCF and Siamese trackers
on nine benchmarks, covering different experimental aspects of visual tracking:
datasets, evaluation metrics, performance, and speed comparisons. We finish the
survey by presenting recommendations and suggestions for distinguished open
challenges based on our analysis.
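The DCF core formulation reviewed here admits a closed-form, per-element solution in the Fourier domain. As a minimal illustrative sketch only (a MOSSE-style single-channel filter, not any specific tracker from the survey; feature extraction and cosine windowing omitted):

```python
import numpy as np

def gaussian_peak(h, w, sigma=2.0):
    # Desired correlation output: a sharp Gaussian peak on the target center
    ys, xs = np.mgrid[0:h, 0:w]
    return np.exp(-((ys - h // 2) ** 2 + (xs - w // 2) ** 2) / (2 * sigma ** 2))

def train_dcf(patch, response, lam=1e-2):
    # Ridge-regression solution, computed element-wise in the Fourier domain
    F = np.fft.fft2(patch)
    G = np.fft.fft2(response)
    return (G * np.conj(F)) / (F * np.conj(F) + lam)

def detect(H, patch):
    # Correlate the filter with a new patch; the response peak locates the target
    return np.real(np.fft.ifft2(np.fft.fft2(patch) * H))

# Toy usage: train on a synthetic blob, detect it after a circular shift
target = gaussian_peak(32, 32, sigma=3.0)
H = train_dcf(target, gaussian_peak(32, 32))
resp = detect(H, np.roll(target, (5, 3), axis=(0, 1)))
dy, dx = np.unravel_index(np.argmax(resp), resp.shape)
# The peak follows the target: (dy, dx) lands near (16 + 5, 16 + 3)
```

Real DCF trackers extend this per-element recipe with multi-channel features, spatial regularization, and online model updates, which is where much of the surveyed work differs.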

        Related papers
        - OmniTracker: Unifying Object Tracking by Tracking-with-Detection [119.51012668709502]
OmniTracker is presented to solve all tracking tasks with a fully shared network architecture, model weights, and inference pipeline.
Experiments on 7 tracking datasets, including LaSOT, TrackingNet, DAVIS16-17, MOT17, MOTS20, and YTVIS19, demonstrate that OmniTracker achieves on-par or even better results than both task-specific and unified tracking models.
arXiv  Detail & Related papers  (2023-03-21T17:59:57Z)
        - AVisT: A Benchmark for Visual Object Tracking in Adverse Visibility [125.77396380698639]
AVisT is a benchmark for visual tracking in diverse scenarios with adverse visibility.
AVisT comprises 120 challenging sequences with 80k annotated frames, spanning 18 diverse scenarios.
We benchmark 17 popular and recent trackers on AVisT with detailed analysis of their tracking performance across attributes.
arXiv  Detail & Related papers  (2022-08-14T17:49:37Z)
        - A Bayesian Detect to Track System for Robust Visual Object Tracking and Semi-Supervised Model Learning [1.7268829007643391]
We address these problems in a Bayesian tracking and detection framework parameterized by neural network outputs.
We propose a particle filter-based approximate sampling algorithm for object state estimation in tracking.
Based on our particle filter inference algorithm, a semi-supervised learning algorithm is used to learn the tracking network from intermittently labeled frames.
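The summary does not spell out the paper's exact sampling algorithm; as a generic, hedged illustration of particle-filter state estimation for tracking, a bootstrap filter over 2-D object position might look like:

```python
import numpy as np

rng = np.random.default_rng(0)

def pf_step(particles, weights, obs, motion_std=1.0, obs_std=2.0):
    # Predict: propagate each particle with a random-walk motion model
    particles = particles + rng.normal(0.0, motion_std, particles.shape)
    # Update: reweight by a Gaussian observation likelihood
    # (a stand-in for a detector/network score, which the paper would supply)
    d2 = np.sum((particles - obs) ** 2, axis=1)
    weights = weights * np.exp(-d2 / (2 * obs_std ** 2))
    weights = weights / weights.sum()
    # Resample when the effective sample size collapses
    if 1.0 / np.sum(weights ** 2) < 0.5 * len(particles):
        idx = rng.choice(len(particles), size=len(particles), p=weights)
        particles = particles[idx]
        weights = np.full(len(particles), 1.0 / len(particles))
    return particles, weights

# Toy usage: particles converge toward a stationary target at (10, 10)
particles = rng.uniform(0.0, 20.0, size=(500, 2))
weights = np.full(500, 1.0 / 500)
for _ in range(10):
    particles, weights = pf_step(particles, weights, np.array([10.0, 10.0]))
estimate = weights @ particles  # weighted-mean state estimate
```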
arXiv  Detail & Related papers  (2022-05-05T00:18:57Z)
        - Single Object Tracking Research: A Survey [44.24280758718638]
This paper presents the rationale behind, and representative works of, the two most popular tracking frameworks of the past ten years.
We present some deep learning based tracking methods categorized by different network structures.
We also introduce some classical strategies for handling the challenges in the tracking problem.
arXiv  Detail & Related papers  (2022-04-25T02:59:15Z)
        - Coarse-to-Fine Object Tracking Using Deep Features and Correlation Filters [2.3526458707956643]
This paper presents a novel deep learning tracking algorithm.
We exploit the generalization ability of deep features to coarsely estimate target translation.
Then, we capitalize on the discriminative power of correlation filters to precisely localize the tracked object.
arXiv  Detail & Related papers  (2020-12-23T16:43:21Z)
        - Multi-modal Visual Tracking: Review and Experimental Comparison [85.20414397784937]
We summarize the multi-modal tracking algorithms, especially visible-depth (RGB-D) tracking and visible-thermal (RGB-T) tracking.
We conduct experiments to analyze the effectiveness of trackers on five datasets.
arXiv  Detail & Related papers  (2020-12-08T02:39:38Z)
        - Self-supervised Object Tracking with Cycle-consistent Siamese Networks [55.040249900677225]
We exploit an end-to-end Siamese network in a cycle-consistent self-supervised framework for object tracking.
We propose to integrate a Siamese region proposal and mask regression network in our tracking framework so that a fast and more accurate tracker can be learned without the annotation of each frame.
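Cycle-consistent self-supervision can be stated compactly: track forward through a clip, track back, and penalize drift from the starting box. A minimal, tracker-agnostic sketch (the `track_fn` and the shift-by-difference "ideal tracker" below are hypothetical stand-ins, not the paper's Siamese network):

```python
import numpy as np

def cycle_consistency_loss(track_fn, frames, start_box):
    # Forward pass: propagate the box frame-to-frame through the clip
    box = np.asarray(start_box, dtype=float)
    for prev, cur in zip(frames[:-1], frames[1:]):
        box = track_fn(prev, cur, box)
    # Backward pass: track back to the first frame
    rev = frames[::-1]
    for prev, cur in zip(rev[:-1], rev[1:]):
        box = track_fn(prev, cur, box)
    # Self-supervised loss: round-trip drift; no per-frame annotation needed
    return float(np.linalg.norm(box - np.asarray(start_box, dtype=float)))

# Toy usage: an ideal tracker that shifts the box by the (scalar) frame
# difference is perfectly invertible, so its cycle loss is zero
ideal = lambda a, b, box: box + (b - a)
loss = cycle_consistency_loss(ideal, [0.0, 1.0, 3.0], np.array([5.0, 5.0]))
```

In the self-supervised setting, this round-trip drift becomes the training signal that replaces ground-truth boxes on unlabeled frames.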
arXiv  Detail & Related papers  (2020-08-03T04:10:38Z)
        - TAO: A Large-Scale Benchmark for Tracking Any Object [95.87310116010185]
The Tracking Any Object (TAO) dataset consists of 2,907 high-resolution videos, captured in diverse environments, which average half a minute in length.
We ask annotators to label objects that move at any point in the video, and give names to them post factum.
Our vocabulary is both significantly larger and qualitatively different from existing tracking datasets.
arXiv  Detail & Related papers  (2020-05-20T21:07:28Z) 
        This list is automatically generated from the titles and abstracts of the papers on this site.