BlinkTrack: Feature Tracking over 100 FPS via Events and Images
- URL: http://arxiv.org/abs/2409.17981v1
- Date: Thu, 26 Sep 2024 15:54:18 GMT
- Title: BlinkTrack: Feature Tracking over 100 FPS via Events and Images
- Authors: Yichen Shen, Yijin Li, Shuo Chen, Guanglin Li, Zhaoyang Huang, Hujun Bao, Zhaopeng Cui, Guofeng Zhang
- Abstract summary: We propose a novel framework, BlinkTrack, which integrates event data with RGB images for high-frequency feature tracking.
Our method extends the traditional Kalman filter into a learning-based framework, utilizing differentiable Kalman filters in both event and image branches.
Experimental results indicate that BlinkTrack significantly outperforms existing event-based methods.
- Score: 50.98675227695814
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Feature tracking is crucial for structure from motion (SfM), simultaneous localization and mapping (SLAM), object tracking, and various other computer vision tasks. Event cameras, known for their high temporal resolution and ability to capture asynchronous changes, have gained significant attention for their potential in feature tracking, especially in challenging conditions. However, event cameras lack the fine-grained texture information that conventional cameras provide, leading to error accumulation in tracking. To address this, we propose a novel framework, BlinkTrack, which integrates event data with RGB images for high-frequency feature tracking. Our method extends the traditional Kalman filter into a learning-based framework, utilizing differentiable Kalman filters in both event and image branches. This approach improves single-modality tracking, resolves ambiguities, and supports asynchronous data fusion. We also introduce new synthetic and augmented datasets to better evaluate our model. Experimental results indicate that BlinkTrack significantly outperforms existing event-based methods, exceeding 100 FPS with preprocessed event data and 80 FPS with multi-modality data.
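As a rough illustration of the per-feature filtering the abstract describes, below is a minimal sketch of one differentiable Kalman predict/update cycle in PyTorch, assuming a constant-velocity state model; the function and variable names are illustrative and not taken from the BlinkTrack paper.

```python
# Minimal differentiable Kalman filter sketch (hypothetical, not the
# paper's implementation): a constant-velocity model for a 2-D feature,
# where the measurement z and its noise R could come from a learned
# event- or image-branch network.
import torch

def kalman_step(x, P, z, R, F, Q, H):
    """One predict + update cycle; every op is differentiable."""
    # Predict: propagate state and covariance with the motion model F.
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update: fuse the measurement z under its (possibly learned) noise R.
    S = H @ P_pred @ H.T + R                 # innovation covariance
    K = P_pred @ H.T @ torch.linalg.inv(S)   # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (torch.eye(P.shape[0]) - K @ H) @ P_pred
    return x_new, P_new

# Constant-velocity model: state = [u, v, du, dv] for one tracked feature.
dt = 1.0 / 100.0                             # e.g. 100 FPS event updates
F = torch.tensor([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1]], dtype=torch.float32)
H = torch.tensor([[1, 0, 0, 0],
                  [0, 1, 0, 0]], dtype=torch.float32)  # observe position only
Q = 1e-4 * torch.eye(4)                      # process noise

x = torch.zeros(4, 1)                        # initial state
P = torch.eye(4)                             # initial covariance
z = torch.tensor([[0.8], [0.3]])             # measurement from either branch
R = 1e-2 * torch.eye(2)                      # could be predicted by a network
x, P = kalman_step(x, P, z, R, F, Q, H)
```

Because each branch can run this update whenever its own measurement arrives, a shared per-feature state of this kind naturally accommodates the asynchronous event/image fusion the abstract mentions.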
Related papers
- DATAP-SfM: Dynamic-Aware Tracking Any Point for Robust Structure from Motion in the Wild [85.03973683867797]
This paper proposes a concise, elegant, and robust pipeline to estimate smooth camera trajectories and obtain dense point clouds for casual videos in the wild.
We show that the proposed method achieves state-of-the-art performance in terms of camera pose estimation even in complex dynamic challenge scenes.
arXiv Detail & Related papers (2024-11-20T13:01:16Z)
- Tracking Any Point with Frame-Event Fusion Network at High Frame Rate [16.749590397918574]
We propose an image-event fusion point tracker, FE-TAP.
It combines the contextual information from image frames with the high temporal resolution of events.
FE-TAP achieves high frame rate and robust point tracking under various challenging conditions.
arXiv Detail & Related papers (2024-09-18T13:07:19Z)
- SpikeMOT: Event-based Multi-Object Tracking with Sparse Motion Features [52.213656737672935]
SpikeMOT is an event-based multi-object tracker.
It uses spiking neural networks to extract sparse spatiotemporal features from event streams associated with objects.
arXiv Detail & Related papers (2023-09-29T05:13:43Z)
- Frame-Event Alignment and Fusion Network for High Frame Rate Tracking [37.35823883499189]
Most existing RGB-based trackers target low frame rate benchmarks of around 30 frames per second.
We propose an end-to-end network consisting of multi-modality alignment and fusion modules.
With the FE240hz dataset, our approach achieves high frame rate tracking up to 240 Hz.
arXiv Detail & Related papers (2023-05-25T03:34:24Z)
- Data-driven Feature Tracking for Event Cameras [48.04815194265117]
We introduce the first data-driven feature tracker for event cameras, which leverages low-latency events to track features detected in a grayscale frame.
By directly transferring zero-shot from synthetic to real data, our data-driven tracker outperforms existing approaches in relative feature age by up to 120%.
This performance gap is further increased to 130% by adapting our tracker to real data with a novel self-supervision strategy.
arXiv Detail & Related papers (2022-11-23T10:20:11Z)
- Asynchronous Optimisation for Event-based Visual Odometry [53.59879499700895]
Event cameras open up new possibilities for robotic perception due to their low latency and high dynamic range.
We focus on event-based visual odometry (VO).
We propose an asynchronous structure-from-motion optimisation back-end.
arXiv Detail & Related papers (2022-03-02T11:28:47Z)
- VisEvent: Reliable Object Tracking via Collaboration of Frame and Event Flows [93.54888104118822]
We propose a large-scale Visible-Event benchmark (termed VisEvent) due to the lack of a realistic and scaled dataset for this task.
Our dataset consists of 820 video pairs captured under low illumination, high speed, and background clutter scenarios.
Based on VisEvent, we transform the event flows into event images and construct more than 30 baseline methods; a simple version of this event-to-image conversion is sketched after this list.
arXiv Detail & Related papers (2021-08-11T03:55:12Z)
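The event-flow-to-event-image conversion referenced in the VisEvent entry is commonly done by polarity-signed accumulation over a time window. Below is a minimal NumPy sketch of that standard construction; it is an assumption-labeled baseline, not the VisEvent authors' code, and all names are illustrative.

```python
# Hypothetical sketch: accumulate an event stream into a signed-count
# "event image", a common baseline representation for frame-style trackers.
import numpy as np

def events_to_image(xs, ys, ps, height, width):
    """Accumulate events (pixel coords xs, ys; polarities ps in {-1, +1})
    into a single signed count image."""
    img = np.zeros((height, width), dtype=np.float32)
    np.add.at(img, (ys, xs), ps)   # scatter-add handles repeated pixels
    return img

# Tiny synthetic stream: three events, two landing on the same pixel.
xs = np.array([5, 5, 9])
ys = np.array([2, 2, 7])
ps = np.array([+1.0, -1.0, +1.0], dtype=np.float32)
frame = events_to_image(xs, ys, ps, height=16, width=16)
```

In practice the stream is sliced into fixed-duration windows so that one such image is produced per window, trading temporal resolution for compatibility with image-based trackers.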
This list is automatically generated from the titles and abstracts of the papers on this site.