Occlusion Aware Kernel Correlation Filter Tracker using RGB-D
- URL: http://arxiv.org/abs/2105.12161v1
- Date: Tue, 25 May 2021 18:37:39 GMT
- Title: Occlusion Aware Kernel Correlation Filter Tracker using RGB-D
- Authors: Srishti Yadav
- Abstract summary: This thesis first details the working prototype of the Kernelized Correlation Filter (KCF) tracker.
We investigate its effectiveness in real-time applications with supporting visualizations.
We also study the use of particle filters to improve the tracker's accuracy.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Unlike deep learning, which requires large training datasets, correlation
filter-based trackers such as the Kernelized Correlation Filter (KCF) exploit implicit
properties of tracked images (circulant matrices) to train in real time. Despite
their practical use in tracking, the fundamentals of KCF still need a better
theoretical, mathematical, and experimental understanding. This thesis first
details the working prototype of the tracker and investigates its
effectiveness in real-time applications with supporting visualizations. We
further address some of the tracker's drawbacks in cases of occlusion,
scale change, object rotation, out-of-view targets, and model drift with our novel
RGB-D Kernel Correlation tracker. We also study the use of particle filters to
improve the tracker's accuracy. Our results are evaluated experimentally a) on a
standard dataset and b) in real time using the Microsoft Kinect V2 sensor. We
believe this work will lay the basis for a better understanding of the
effectiveness of kernel-based correlation filter trackers and help define
some of their possible advantages in tracking.
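The circulant-matrix property the abstract refers to lets KCF solve ridge regression over all cyclic shifts of a patch in the Fourier domain, which is what makes training real-time. A minimal sketch of this idea for the linear-kernel case (function names and the delta-peak label below are illustrative assumptions, not the thesis's implementation):

```python
import numpy as np

def train_kcf(x, y, lam=1e-4):
    """Ridge regression over all cyclic shifts of patch x against label y.

    The circulant structure of the shifted-sample matrix diagonalizes in the
    Fourier domain, so the solution is a cheap element-wise division.
    """
    x_hat = np.fft.fft2(x)
    y_hat = np.fft.fft2(y)
    k_hat = x_hat * np.conj(x_hat)        # linear-kernel auto-correlation
    return y_hat / (k_hat + lam)          # dual coefficients alpha_hat

def detect_kcf(alpha_hat, x, z):
    """Response map for candidate patch z; the peak gives the translation."""
    kz_hat = np.fft.fft2(z) * np.conj(np.fft.fft2(x))
    return np.real(np.fft.ifft2(alpha_hat * kz_hat))

rng = np.random.default_rng(0)
x = rng.standard_normal((32, 32))         # training patch
y = np.zeros((32, 32)); y[5, 7] = 1.0     # desired response peak
alpha_hat = train_kcf(x, y)
response = detect_kcf(alpha_hat, x, x)    # detect on the same patch
```

Running detection on the training patch itself reproduces the label's peak location, confirming the Fourier-domain solve behaves like correlation with every cyclic shift. The full KCF additionally uses a Gaussian kernel and multi-channel features, omitted here for brevity.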
Related papers
- Temporal Correlation Meets Embedding: Towards a 2nd Generation of JDE-based Real-Time Multi-Object Tracking [52.04679257903805]
Joint Detection and Embedding (JDE) trackers have demonstrated excellent performance in Multi-Object Tracking (MOT) tasks.
Our tracker, named TCBTrack, achieves state-of-the-art performance on multiple public benchmarks.
arXiv Detail & Related papers (2024-07-19T07:48:45Z) - Exploring Dynamic Transformer for Efficient Object Tracking [58.120191254379854]
We propose DyTrack, a dynamic transformer framework for efficient tracking.
DyTrack automatically learns to configure proper reasoning routes for various inputs, gaining better utilization of the available computational budget.
Experiments on multiple benchmarks demonstrate that DyTrack achieves promising speed-precision trade-offs with only a single model.
arXiv Detail & Related papers (2024-03-26T12:31:58Z) - Beyond Kalman Filters: Deep Learning-Based Filters for Improved Object
Tracking [3.5693768338940304]
We propose two innovative data-driven filtering methods for tracking-by-detection systems.
The first method employs a Bayesian filter with a trainable motion model to predict an object's future location.
The second method, an end-to-end trainable filter, goes a step further by learning to correct detector errors.
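For context, the classical baseline these learned filters aim to replace is a constant-velocity Kalman filter, as used in many tracking-by-detection pipelines. A minimal 1-D sketch (the matrices and noise values are illustrative assumptions):

```python
import numpy as np

def kalman_step(x, P, z, dt=1.0, q=1e-2, r=1.0):
    """One predict/update cycle of a constant-velocity Kalman filter.

    State x = [position, velocity]; z is a scalar position measurement.
    """
    F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity transition
    H = np.array([[1.0, 0.0]])              # we observe position only
    Q = q * np.eye(2)                       # process noise covariance
    R = np.array([[r]])                     # measurement noise covariance
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update
    y = z - H @ x                           # innovation
    S = H @ P @ H.T + R                     # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Track an object moving at constant velocity 1.
x, P = np.zeros(2), np.eye(2)
for t in range(1, 20):
    x, P = kalman_step(x, P, np.array([float(t)]))
```

The learned approaches in the paper above replace the hand-tuned motion model F and noise covariances Q, R with trainable components.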
arXiv Detail & Related papers (2024-02-15T10:47:44Z) - iKUN: Speak to Trackers without Retraining [21.555469501789577]
We propose an insertable Knowledge Unification Network, termed iKUN, to enable communication with off-the-shelf trackers.
To improve localization accuracy, we present a neural version of the Kalman filter (NKF) to dynamically adjust process noise.
We also contribute a more challenging dataset, Refer-Dance, by extending the public DanceTrack dataset with motion and dressing descriptions.
arXiv Detail & Related papers (2023-12-25T11:48:55Z) - UncLe-SLAM: Uncertainty Learning for Dense Neural SLAM [60.575435353047304]
We present an uncertainty learning framework for dense neural simultaneous localization and mapping (SLAM).
We propose an online framework for sensor uncertainty estimation that can be trained in a self-supervised manner from only 2D input data.
arXiv Detail & Related papers (2023-06-19T16:26:25Z) - Learning Dual-Fused Modality-Aware Representations for RGBD Tracking [67.14537242378988]
Compared with traditional RGB object tracking, adding the depth modality can effectively mitigate interference between the target and the background.
Some existing RGBD trackers use the two modalities separately, so particularly useful information shared between them is ignored.
We propose a novel Dual-fused Modality-aware Tracker (termed DMTracker) which aims to learn informative and discriminative representations of the target objects for robust RGBD tracking.
arXiv Detail & Related papers (2022-11-06T07:59:07Z) - Learning Dynamic Compact Memory Embedding for Deformable Visual Object
Tracking [82.34356879078955]
We propose a compact memory embedding to enhance the discrimination of the segmentation-based deformable visual tracking method.
Our method outperforms strong segmentation-based trackers, i.e., D3S and SiamMask, on the DAVIS 2017 benchmark.
arXiv Detail & Related papers (2021-11-23T03:07:12Z) - MFGNet: Dynamic Modality-Aware Filter Generation for RGB-T Tracking [72.65494220685525]
We propose a new dynamic modality-aware filter generation module (named MFGNet) to boost the message communication between visible and thermal data.
We generate dynamic modality-aware filters with two independent networks. The visible and thermal filters are then used to perform a dynamic convolutional operation on their respective input feature maps.
To address issues caused by heavy occlusion, fast motion, and out-of-view, we propose to conduct a joint local and global search by exploiting a new direction-aware target-driven attention mechanism.
arXiv Detail & Related papers (2021-07-22T03:10:51Z) - Learning Residue-Aware Correlation Filters and Refining Scale Estimates
with the GrabCut for Real-Time UAV Tracking [12.718396980204961]
Unmanned aerial vehicle (UAV)-based tracking is attracting increasing attention and developing rapidly in applications such as agriculture, aviation, navigation, transportation and public security.
Recently, discriminative correlation filter (DCF)-based trackers have stood out in the UAV tracking community for their high efficiency and robustness on a single CPU.
In this paper, we explore using segmentation with GrabCut to improve the widely adopted discriminative scale estimation in DCF-based trackers.
arXiv Detail & Related papers (2021-04-07T13:35:01Z) - Coarse-to-Fine Object Tracking Using Deep Features and Correlation
Filters [2.3526458707956643]
This paper presents a novel deep learning tracking algorithm.
We exploit the generalization ability of deep features to coarsely estimate target translation.
Then, we capitalize on the discriminative power of correlation filters to precisely localize the tracked object.
arXiv Detail & Related papers (2020-12-23T16:43:21Z) - Learning Consistency Pursued Correlation Filters for Real-Time UAV
Tracking [12.292672531693794]
This work proposes a novel approach with dynamic consistency pursued correlation filters, i.e., the CPCF tracker.
By minimizing the difference between the practical and the scheduled ideal consistency map, the consistency level is constrained to maintain temporal smoothness.
The proposed tracker favorably surpasses the other 25 state-of-the-art trackers with real-time running speed (~43 FPS) on a single CPU.
arXiv Detail & Related papers (2020-08-09T10:22:52Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.