FastTrack: an open-source software for tracking varying numbers of
deformable objects
- URL: http://arxiv.org/abs/2011.06837v1
- Date: Fri, 13 Nov 2020 09:52:58 GMT
- Authors: Benjamin Gallois and Raphaël Candelier
- Abstract summary: We compiled a database of two-dimensional movies for different biological and physical systems.
We developed a general-purpose, optimized, open-source, cross-platform, easy to install and use, self-updating software called FastTrack.
A benchmark shows that FastTrack is orders of magnitude faster than state-of-the-art tracking algorithms.
- Score: 0.6445605125467573
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Analyzing the dynamical properties of mobile objects requires
extracting trajectories from recordings, which is often done by tracking movies. We
compiled a database of two-dimensional movies for very different biological and
physical systems spanning a wide range of length scales and developed a
general-purpose, optimized, open-source, cross-platform, easy to install and
use, self-updating software called FastTrack. It can handle a changing number
of deformable objects in a region of interest, and is particularly suitable for
animal and cell tracking in two dimensions. Furthermore, we introduce the
probability of incursions as a new measure of a movie's trackability that
does not require knowledge of ground-truth trajectories, since it is
resilient to small amounts of errors and can be computed on the basis of an ad
hoc tracking. We also leveraged the versatility and speed of FastTrack to
implement an iterative algorithm determining a set of nearly-optimized tracking
parameters -- yet further reducing the amount of human intervention -- and
demonstrate that FastTrack can be used to explore the space of tracking
parameters to optimize the number of swaps for a batch of similar movies. A
benchmark shows that FastTrack is orders of magnitude faster than
state-of-the-art tracking algorithms, with comparable tracking accuracy. The
source code is available under the GNU GPLv3 at
https://github.com/FastTrackOrg/FastTrack and pre-compiled binaries for
Windows, Mac and Linux are available at http://www.fasttrack.sh.
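The frame-to-frame matching that underlies this kind of multi-object tracking can be sketched as follows. The function below is an illustrative greedy nearest-neighbor assignment written for this summary, not FastTrack's actual implementation; the `max_dist` threshold is a hypothetical parameter standing in for whatever matching cost the software uses.

```python
# Illustrative sketch of frame-to-frame assignment tracking, the general
# approach behind trackers like FastTrack (not FastTrack's actual code).
import math

def match_frames(prev, curr, max_dist=50.0):
    """Greedily match object centroids between consecutive frames.

    prev, curr: lists of (x, y) centroids. Returns a list of
    (prev_index, curr_index) pairs. Unmatched objects are treated as
    having left (prev) or entered (curr) the region of interest, so
    the number of tracked objects can vary from frame to frame.
    """
    # Enumerate all candidate pairings with their distances.
    pairs = []
    for i, p in enumerate(prev):
        for j, c in enumerate(curr):
            pairs.append((math.dist(p, c), i, j))
    pairs.sort()  # cheapest assignments first

    matches, used_prev, used_curr = [], set(), set()
    for d, i, j in pairs:
        if d > max_dist:
            break  # too far: likely a disappearance or an appearance
        if i not in used_prev and j not in used_curr:
            matches.append((i, j))
            used_prev.add(i)
            used_curr.add(j)
    return matches
```

A globally optimal assignment (e.g. the Hungarian algorithm) could replace the greedy loop; the greedy version is kept here for brevity.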
Related papers
- Exploring Dynamic Transformer for Efficient Object Tracking [58.120191254379854]
We propose DyTrack, a dynamic transformer framework for efficient tracking.
DyTrack automatically learns to configure proper reasoning routes for various inputs, gaining better utilization of the available computational budget.
Experiments on multiple benchmarks demonstrate that DyTrack achieves promising speed-precision trade-offs with only a single model.
arXiv Detail & Related papers (2024-03-26T12:31:58Z)
- Dense Optical Tracking: Connecting the Dots [82.79642869586587]
DOT is a novel, simple and efficient method for solving the problem of point tracking in a video.
We show that DOT is significantly more accurate than current optical flow techniques, outperforms sophisticated "universal trackers" like OmniMotion, and is on par with, or better than, the best point tracking algorithms like CoTracker.
arXiv Detail & Related papers (2023-12-01T18:59:59Z)
- CoTracker: It is Better to Track Together [74.84109704301127]
CoTracker tracks dense points in a frame jointly across a video sequence.
We show that joint tracking results in a significantly higher tracking accuracy and robustness.
CoTracker operates causally on short windows, but is trained by unrolling the windows across longer video sequences.
arXiv Detail & Related papers (2023-07-14T21:13:04Z)
- Real-time Online Multi-Object Tracking in Compressed Domain [66.40326768209]
Recent online Multi-Object Tracking (MOT) methods have achieved desirable tracking performance.
Inspired by the fact that the adjacent frames are highly relevant and redundant, we divide the frames into key and non-key frames.
Our tracker is about 6x faster while maintaining a comparable tracking performance.
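The key/non-key frame split described above can be sketched generically: run an expensive detector only on key frames and update tracks cheaply in between. `detect` and `propagate` below are hypothetical placeholders, the latter standing in for the paper's cheap compressed-domain update, not its actual API.

```python
# Hedged sketch of key/non-key frame tracking: full detection on every
# key_interval-th frame, cheap propagation otherwise. Names are
# illustrative, not taken from the paper's implementation.

def track_video(frames, detect, propagate, key_interval=5):
    """detect(frame) -> detections (expensive, key frames only);
    propagate(prev_detections, frame) -> detections (cheap update)."""
    tracks = []
    prev = None
    for t, frame in enumerate(frames):
        if prev is None or t % key_interval == 0:
            prev = detect(frame)           # full detection on key frames
        else:
            prev = propagate(prev, frame)  # cheap update on non-key frames
        tracks.append(prev)
    return tracks
```

With a detector that is far slower than the propagation step, the average cost per frame drops by roughly the key-frame ratio, which is the source of the reported speedup.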
arXiv Detail & Related papers (2022-04-05T09:47:24Z)
- Context-aware Visual Tracking with Joint Meta-updating [11.226947525556813]
We propose a context-aware tracking model that optimizes the tracker over the representation space, jointly meta-updating both branches by exploiting information along the whole sequence.
The proposed tracking method achieves an EAO score of 0.514 on VOT2018 at 40 FPS, demonstrating its capability of improving the accuracy and robustness of the underlying tracker with little speed drop.
arXiv Detail & Related papers (2022-04-04T14:16:00Z)
- Efficient Visual Tracking with Exemplar Transformers [98.62550635320514]
We introduce the Exemplar Transformer, an efficient transformer for real-time visual object tracking.
E.T.Track, our visual tracker that incorporates Exemplar Transformer layers, runs at 47 fps on a CPU.
This is up to 8 times faster than other transformer-based models.
arXiv Detail & Related papers (2021-12-17T18:57:54Z)
- DeepScale: An Online Frame Size Adaptation Framework to Accelerate Visual Multi-object Tracking [8.878656943106934]
DeepScale is a model agnostic frame size selection approach to accelerate tracking throughput.
It can find a suitable trade-off between tracking accuracy and speed by adapting frame sizes at run time.
Compared to a state-of-the-art tracker, DeepScale++, a variant of DeepScale, achieves a 1.57x speedup with only moderate accuracy degradation.
arXiv Detail & Related papers (2021-07-22T00:12:58Z)
- LightTrack: Finding Lightweight Neural Networks for Object Tracking via One-Shot Architecture Search [104.84999119090887]
We present LightTrack, which uses neural architecture search (NAS) to design more lightweight and efficient object trackers.
Comprehensive experiments show that our LightTrack is effective.
It can find trackers that achieve superior performance compared to handcrafted SOTA trackers, such as SiamRPN++ and Ocean.
arXiv Detail & Related papers (2021-04-29T17:55:24Z)
- STMTrack: Template-free Visual Tracking with Space-time Memory Networks [42.06375415765325]
Existing trackers with template updating mechanisms rely on time-consuming numerical optimization and complex hand-designed strategies to achieve competitive performance.
We propose a novel tracking framework built on top of a space-time memory network that is competent to make full use of historical information related to the target.
Specifically, a novel memory mechanism is introduced, which stores the historical information of the target to guide the tracker to focus on the most informative regions in the current frame.
arXiv Detail & Related papers (2021-04-01T08:10:56Z)
- DroTrack: High-speed Drone-based Object Tracking Under Uncertainty [0.23204178451683263]
DroTrack is a high-speed visual single-object tracking framework for drone-captured video sequences.
We implement an effective object segmentation based on Fuzzy C-Means clustering.
We also leverage the geometrical angular motion to estimate a reliable object scale.
arXiv Detail & Related papers (2020-05-02T13:16:16Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it contains and is not responsible for any consequences of its use.