Opening up Open-World Tracking
- URL: http://arxiv.org/abs/2104.11221v1
- Date: Thu, 22 Apr 2021 17:58:15 GMT
- Title: Opening up Open-World Tracking
- Authors: Yang Liu and Idil Esen Zulfikar and Jonathon Luiten and Achal Dave and
Aljoša Ošep and Deva Ramanan and Bastian Leibe and Laura Leal-Taixé
- Abstract summary: We propose and study Open-World Tracking (OWT).
This paper formalizes the OWT task and introduces an evaluation protocol and metric, Open-World Tracking Accuracy (OWTA).
We show that our Open-World Tracking Baseline, while performing well in the OWT setting, also achieves near state-of-the-art results on traditional closed-world benchmarks.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we propose and study Open-World Tracking (OWT). Open-world
tracking goes beyond current multi-object tracking benchmarks and methods which
focus on tracking object classes that belong to a predefined closed-set of
frequently observed object classes. In OWT, we relax this assumption: we may
encounter objects at inference time that were not labeled for training. The
main contribution of this paper is the formalization of the OWT task, along
with an evaluation protocol and metric (Open-World Tracking Accuracy, OWTA),
which decomposes into two intuitive terms, one for measuring recall, and
another for measuring track association accuracy. This allows us to perform a
rigorous evaluation of several different baselines that follow design patterns
proposed in the multi-object tracking community. Further, we show that our
Open-World Tracking Baseline, while performing well in the OWT setting, also
achieves near state-of-the-art results on traditional closed-world benchmarks,
without any adjustments or tuning. We believe that this paper is an initial
step towards studying multi-object tracking in the open world, a task of
crucial importance for future intelligent agents that will need to understand,
react to, and learn from, an infinite variety of objects that can appear in an
open world.
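The abstract notes that OWTA decomposes into a recall term and a track-association term. A minimal sketch of that decomposition, assuming a HOTA-style geometric-mean combination (the function name and exact definition here are illustrative; the precise formulation is given in the paper):

```python
import math

def owta(det_recall: float, assoc_accuracy: float) -> float:
    """Sketch of Open-World Tracking Accuracy as the geometric mean
    of detection recall (did we find the objects, regardless of class?)
    and association accuracy (did we link them consistently over time?)."""
    return math.sqrt(det_recall * assoc_accuracy)

# A tracker that recalls 80% of objects and associates 90% of them
# correctly would score sqrt(0.8 * 0.9) ≈ 0.849 under this sketch.
print(round(owta(0.8, 0.9), 3))
```

Using recall rather than full detection accuracy matters in the open-world setting: predictions on unannotated novel objects cannot fairly be counted as false positives.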
Related papers
- Open-World Object Detection with Instance Representation Learning [1.8749305679160366]
We propose a method to train an object detector that can both detect novel objects and extract semantically rich features in open-world conditions.
Our method learns a robust and generalizable feature space, outperforming other OWOD-based feature extraction methods.
arXiv Detail & Related papers (2024-09-24T13:13:34Z)
- LOSS-SLAM: Lightweight Open-Set Semantic Simultaneous Localization and Mapping [9.289001828243512]
We show that a system for identifying, localizing, and encoding objects is tightly coupled with probabilistic graphical models for performing open-set semantic simultaneous localization and mapping (SLAM).
Results are presented demonstrating that the proposed lightweight object encoding can be used to perform more accurate object-based SLAM than existing open-set methods.
arXiv Detail & Related papers (2024-04-05T19:42:55Z)
- OVTrack: Open-Vocabulary Multiple Object Tracking [64.73379741435255]
OVTrack is an open-vocabulary tracker capable of tracking arbitrary object classes.
It sets a new state-of-the-art on the large-scale, large-vocabulary TAO benchmark.
arXiv Detail & Related papers (2023-04-17T16:20:05Z)
- Open World DETR: Transformer based Open World Object Detection [60.64535309016623]
We propose a two-stage training approach named Open World DETR for open world object detection based on Deformable DETR.
We fine-tune the class-specific components of the model with a multi-view self-labeling strategy and a consistency constraint.
Our proposed method outperforms other state-of-the-art open world object detection methods by a large margin.
arXiv Detail & Related papers (2022-12-06T13:39:30Z)
- End-to-end Tracking with a Multi-query Transformer [96.13468602635082]
Multiple-object tracking (MOT) is a challenging task that requires simultaneous reasoning about location, appearance, and identity of the objects in the scene over time.
Our aim in this paper is to move beyond tracking-by-detection approaches, to class-agnostic tracking that performs well also for unknown object classes.
arXiv Detail & Related papers (2022-10-26T10:19:37Z)
- Learning Open-World Object Proposals without Learning to Classify [110.30191531975804]
We propose a classification-free Object Localization Network (OLN) which estimates the objectness of each region purely by how well the location and shape of a region overlaps with any ground-truth object.
This simple strategy learns generalizable objectness and outperforms existing proposals on cross-category generalization.
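OLN scores each region purely by localization quality against any annotated object, with no classification head. A hypothetical sketch of such a class-agnostic objectness target, using box IoU as the localization measure (the paper's actual targets also include measures such as centerness; the names below are illustrative):

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def objectness(proposal, gt_boxes):
    """Class-free objectness target: how well does this region overlap
    with *any* ground-truth object, ignoring category labels entirely?"""
    return max((iou(proposal, g) for g in gt_boxes), default=0.0)
```

Because the target ignores category labels, a region tightly covering an unannotated-class object still receives a high score at inference time, which is what makes the strategy generalize across categories.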
arXiv Detail & Related papers (2021-08-15T14:36:02Z)
- TAO: A Large-Scale Benchmark for Tracking Any Object [95.87310116010185]
The Tracking Any Object (TAO) dataset consists of 2,907 high-resolution videos, captured in diverse environments, which are half a minute long on average.
We ask annotators to label objects that move at any point in the video, and to name them post factum.
Our vocabulary is both significantly larger and qualitatively different from existing tracking datasets.
arXiv Detail & Related papers (2020-05-20T21:07:28Z)