Observation-Centric SORT: Rethinking SORT for Robust Multi-Object
Tracking
- URL: http://arxiv.org/abs/2203.14360v1
- Date: Sun, 27 Mar 2022 17:57:08 GMT
- Title: Observation-Centric SORT: Rethinking SORT for Robust Multi-Object
Tracking
- Authors: Jinkun Cao, Xinshuo Weng, Rawal Khirodkar, Jiangmiao Pang, Kris Kitani
- Abstract summary: We show that a simple motion model can obtain state-of-the-art tracking performance without other cues like appearance.
We thus name the proposed method Observation-Centric SORT, OC-SORT for short.
- Score: 32.32109475782992
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Multi-Object Tracking (MOT) has rapidly progressed with the development of
object detection and re-identification. However, motion modeling, which
facilitates object association by forecasting short-term trajectories with past
observations, has been relatively under-explored in recent years. Current
motion models in MOT typically assume that object motion is linear within a
small time window and rely on continuous observations, so these methods are
sensitive to occlusions and non-linear motion and require high frame-rate
videos. In this work, we show that a simple motion model can obtain
state-of-the-art tracking performance without other cues like appearance. We
emphasize the role of "observation" in recovering lost tracks and in reducing
the error that linear motion models accumulate while a track is lost.
We thus name the proposed method Observation-Centric SORT, OC-SORT for
short. It remains simple, online, and real-time, yet is more robust to
occlusion and non-linear motion. It achieves 63.2 and 62.1 HOTA on MOT17 and
MOT20, respectively, surpassing all published methods. It also sets a new
state of the art on KITTI Pedestrian Tracking and DanceTrack, where object motion
is highly non-linear. The code and model are available at
https://github.com/noahcao/OC_SORT.
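To make the abstract's point concrete, the sketch below (a minimal illustration, not the authors' OC-SORT code) shows the constant-velocity Kalman filter that SORT-style trackers use as their linear motion model. The state layout, noise values, and function names are illustrative assumptions; the example demonstrates how prediction-only steps during an occlusion simply extrapolate the last velocity, which is exactly where error accumulates when motion is non-linear.

```python
import numpy as np

# State: [x, y, vx, vy]. Constant-velocity model: linear motion is
# assumed between frames (the assumption the abstract says breaks
# under occlusion and non-linear motion).

def predict(state, P, q=1e-2):
    """One constant-velocity Kalman prediction step."""
    F = np.array([[1, 0, 1, 0],
                  [0, 1, 0, 1],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1]], dtype=float)
    Q = q * np.eye(4)                      # process noise (assumed value)
    return F @ state, F @ P @ F.T + Q

def update(state, P, z, r=1e-1):
    """Correct the prediction with an observed position z = [x, y]."""
    H = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0]], dtype=float)
    R = r * np.eye(2)                      # measurement noise (assumed value)
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    state = state + K @ (z - H @ state)
    P = (np.eye(4) - K @ H) @ P
    return state, P

# A target moving +1 px/frame in x, observed for three frames, then
# occluded for two: prediction-only steps keep extrapolating the last
# velocity, which is correct only if the motion stayed linear.
state, P = np.array([0., 0., 1., 0.]), np.eye(4)
for t in range(1, 4):                      # three observed frames
    state, P = predict(state, P)
    state, P = update(state, P, np.array([float(t), 0.]))
for _ in range(2):                         # two occluded frames: predict only
    state, P = predict(state, P)
print(np.round(state[:2], 1))              # extrapolated position after occlusion
```

If the target turned during those two occluded frames, the extrapolated position would drift from the truth; OC-SORT's contribution is to use the observations around the occlusion, rather than the filter's accumulated state, to recover the track.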
Related papers
- MambaTrack: A Simple Baseline for Multiple Object Tracking with State Space Model [18.607106274732885]
We introduce a Mamba-based motion model named Mamba moTion Predictor (MTP).
MTP takes the spatial-temporal location dynamics of objects as input, captures the motion pattern using a bi-Mamba encoding layer, and predicts the next motion.
Our proposed tracker, MambaTrack, demonstrates advanced performance on benchmarks such as DanceTrack and SportsMOT.
arXiv Detail & Related papers (2024-08-17T11:58:47Z)
- ETTrack: Enhanced Temporal Motion Predictor for Multi-Object Tracking [4.250337979548885]
We propose a motion-based MOT approach with an enhanced temporal motion predictor, ETTrack.
Specifically, the motion predictor integrates a transformer model and a Temporal Convolutional Network (TCN) to capture short-term and long-term motion patterns.
We show that ETTrack achieves competitive performance compared with state-of-the-art trackers on DanceTrack and SportsMOT.
arXiv Detail & Related papers (2024-05-24T17:51:33Z)
- Delving into Motion-Aware Matching for Monocular 3D Object Tracking [81.68608983602581]
We find that the motion cue of objects along different time frames is critical in 3D multi-object tracking.
We propose MoMA-M3T, a framework that mainly consists of three motion-aware components.
We conduct extensive experiments on the nuScenes and KITTI datasets to demonstrate that our MoMA-M3T achieves competitive performance against state-of-the-art methods.
arXiv Detail & Related papers (2023-08-22T17:53:58Z)
- TrajectoryFormer: 3D Object Tracking Transformer with Predictive Trajectory Hypotheses [51.60422927416087]
3D multi-object tracking (MOT) is vital for many applications including autonomous driving vehicles and service robots.
We present TrajectoryFormer, a novel point-cloud-based 3D MOT framework.
arXiv Detail & Related papers (2023-06-09T13:31:50Z)
- MotionTrack: Learning Motion Predictor for Multiple Object Tracking [68.68339102749358]
We introduce a novel motion-based tracker, MotionTrack, centered around a learnable motion predictor.
Our experimental results demonstrate that MotionTrack yields state-of-the-art performance on datasets such as DanceTrack and SportsMOT.
arXiv Detail & Related papers (2023-06-05T04:24:11Z)
- MotionTrack: Learning Robust Short-term and Long-term Motions for Multi-Object Tracking [56.92165669843006]
We propose MotionTrack, which learns robust short-term and long-term motions in a unified framework to associate trajectories from a short to long range.
For dense crowds, we design a novel Interaction Module to learn interaction-aware motions from short-term trajectories, which can estimate the complex movement of each target.
For extreme occlusions, we build a novel Refind Module to learn reliable long-term motions from the target's history trajectory, which can link the interrupted trajectory with its corresponding detection.
arXiv Detail & Related papers (2023-03-18T12:38:33Z)
- Probabilistic Tracklet Scoring and Inpainting for Multiple Object Tracking [83.75789829291475]
We introduce a probabilistic autoregressive motion model to score tracklet proposals.
This is achieved by training our model to learn the underlying distribution of natural tracklets.
Our experiments demonstrate the superiority of our approach at tracking objects in challenging sequences.
arXiv Detail & Related papers (2020-12-03T23:59:27Z)
- MAT: Motion-Aware Multi-Object Tracking [9.098793914779161]
In this paper, we propose Motion-Aware Tracker (MAT), focusing more on various motion patterns of different objects.
Experiments on the challenging MOT16 and MOT17 benchmarks demonstrate that our MAT approach achieves superior performance by a large margin.
arXiv Detail & Related papers (2020-09-10T11:51:33Z)
- ArTIST: Autoregressive Trajectory Inpainting and Scoring for Tracking [80.02322563402758]
One of the core components in online multiple object tracking (MOT) frameworks is associating new detections with existing tracklets.
We introduce a probabilistic autoregressive generative model to score tracklet proposals by directly measuring the likelihood that a tracklet represents natural motion.
arXiv Detail & Related papers (2020-04-16T06:43:11Z)
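The ArTIST entry above notes that associating new detections with existing tracklets is a core component of online MOT. The sketch below illustrates that step in its simplest common form, IoU-based greedy matching between predicted and detected boxes. This is a generic illustration, not ArTIST's probabilistic scoring; the box format, threshold, and function names are assumptions.

```python
import numpy as np

# Boxes are [x1, y1, x2, y2]. Each (tracklet, detection) pair is scored
# by IoU, then pairs are matched greedily, highest score first, with
# each tracklet and detection used at most once.

def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    return inter / (area(a) + area(b) - inter + 1e-9)

def associate(tracks, dets, thresh=0.3):
    """Greedy one-to-one matching; returns a list of (track_idx, det_idx)."""
    scores = np.array([[iou(t, d) for d in dets] for t in tracks])
    matches = []
    while scores.size and scores.max() > thresh:
        ti, di = np.unravel_index(scores.argmax(), scores.shape)
        matches.append((int(ti), int(di)))
        scores[ti, :] = -1.0               # mark tracklet as used
        scores[:, di] = -1.0               # mark detection as used
    return matches

tracks = [[0, 0, 10, 10], [20, 20, 30, 30]]   # predicted boxes
dets = [[21, 19, 31, 29], [1, 0, 11, 10]]     # detections this frame
print(associate(tracks, dets))                # → [(0, 1), (1, 0)]
```

Production trackers typically replace the greedy loop with the Hungarian algorithm (e.g. `scipy.optimize.linear_sum_assignment`) for globally optimal matching, and ArTIST replaces the IoU score itself with a learned tracklet likelihood.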
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information and is not responsible for any consequences of its use.