Event-based Visual Tracking in Dynamic Environments
- URL: http://arxiv.org/abs/2212.07754v1
- Date: Thu, 15 Dec 2022 12:18:13 GMT
- Title: Event-based Visual Tracking in Dynamic Environments
- Authors: Irene Perez-Salesa, Rodrigo Aldana-Lopez, Carlos Sagues
- Abstract summary: We propose a framework to take advantage of both event cameras and off-the-shelf deep learning for object tracking.
We show that reconstructing event data into intensity frames improves the tracking performance in conditions under which conventional cameras fail to provide acceptable results.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Visual object tracking under challenging conditions of motion and light can
be hindered by the capabilities of conventional cameras, prone to producing
images with motion blur. Event cameras are novel sensors suited to robustly
perform vision tasks under these conditions. However, due to the nature of
their output, applying them to object detection and tracking is non-trivial. In
this work, we propose a framework to take advantage of both event cameras and
off-the-shelf deep learning for object tracking. We show that reconstructing
event data into intensity frames improves the tracking performance in
conditions under which conventional cameras fail to provide acceptable results.
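The event-to-frame reconstruction step can be illustrated with a minimal sketch. The paper reconstructs intensity frames with a learned network; the naive polarity accumulation below (function name and event tuple layout are our own, not from the paper) only conveys the idea of turning an asynchronous event stream into a frame that an off-the-shelf tracker can consume.

```python
import numpy as np

def events_to_frame(events, height, width):
    """Naively accumulate event polarities into an intensity-like frame.

    events: iterable of (x, y, t, p) tuples with polarity p in {-1, +1}.
    An illustrative stand-in for the learned reconstruction in the paper.
    """
    frame = np.zeros((height, width), dtype=np.float32)
    for x, y, t, p in events:
        frame[y, x] += p                      # signed accumulation per pixel
    # Normalize to [0, 1] so a conventional frame-based tracker can consume it.
    if np.ptp(frame) > 0:
        frame = (frame - frame.min()) / np.ptp(frame)
    return frame
```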
Related papers
- BlinkTrack: Feature Tracking over 100 FPS via Events and Images [50.98675227695814]
We propose a novel framework, BlinkTrack, which integrates event data with RGB images for high-frequency feature tracking.
Our method extends the traditional Kalman filter into a learning-based framework, utilizing differentiable Kalman filters in both event and image branches.
Experimental results indicate that BlinkTrack significantly outperforms existing event-based methods.
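The filtering core behind such a learning-based Kalman framework can be sketched with the classic predict/update cycle for a single feature coordinate. BlinkTrack's differentiable variant would let networks predict quantities like the fixed noise terms q and r below; the function and parameter names here are illustrative, not from the paper.

```python
import numpy as np

def kalman_step(x, P, z, dt=0.01, q=1e-3, r=1e-2):
    """One predict/update cycle of a constant-velocity Kalman filter
    for a 1-D feature coordinate.
    x: state [position, velocity], P: 2x2 covariance, z: scalar measurement.
    """
    F = np.array([[1.0, dt], [0.0, 1.0]])     # constant-velocity motion model
    H = np.array([[1.0, 0.0]])                # we observe position only
    Q = q * np.eye(2)                         # process noise
    R = np.array([[r]])                       # measurement noise
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update
    innovation = np.array([z]) - H @ x
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)            # Kalman gain
    x = x + K @ innovation
    P = (np.eye(2) - K @ H) @ P
    return x, P
```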
arXiv Detail & Related papers (2024-09-26T15:54:18Z)
- Distractor-aware Event-based Tracking [45.07711356111249]
We propose a distractor-aware event-based tracker that introduces transformer modules into a Siamese network architecture (named DANet).
Our model is mainly composed of a motion-aware network and a target-aware network, which simultaneously exploit both motion cues and object contours from event data.
Our DANet can be trained in an end-to-end manner without any post-processing and can run at over 80 FPS on a single V100.
arXiv Detail & Related papers (2023-10-22T05:50:20Z)
- SpikeMOT: Event-based Multi-Object Tracking with Sparse Motion Features [52.213656737672935]
SpikeMOT is an event-based multi-object tracker.
SpikeMOT uses spiking neural networks to extract sparse spatiotemporal features from event streams associated with objects.
arXiv Detail & Related papers (2023-09-29T05:13:43Z)
- On the Generation of a Synthetic Event-Based Vision Dataset for Navigation and Landing [69.34740063574921]
This paper presents a methodology for generating event-based vision datasets from optimal landing trajectories.
We construct sequences of photorealistic images of the lunar surface with the Planet and Asteroid Natural Scene Generation Utility.
We demonstrate that the pipeline can generate realistic event-based representations of surface features by constructing a dataset of 500 trajectories.
arXiv Detail & Related papers (2023-08-01T09:14:20Z)
- Data-driven Feature Tracking for Event Cameras [48.04815194265117]
We introduce the first data-driven feature tracker for event cameras, which leverages low-latency events to track features detected in a grayscale frame.
By directly transferring zero-shot from synthetic to real data, our data-driven tracker outperforms existing approaches in relative feature age by up to 120%.
This performance gap is further increased to 130% by adapting our tracker to real data with a novel self-supervision strategy.
arXiv Detail & Related papers (2022-11-23T10:20:11Z)
- PL-EVIO: Robust Monocular Event-based Visual Inertial Odometry with Point and Line Features [3.6355269783970394]
Event cameras are motion-activated sensors that capture pixel-level illumination changes instead of intensity images at a fixed frame rate.
We propose a robust, highly accurate, and real-time optimization-based monocular event-based visual-inertial odometry (VIO) method.
arXiv Detail & Related papers (2022-09-25T06:14:12Z)
- Are High-Resolution Event Cameras Really Needed? [62.70541164894224]
In low-illumination conditions and at high speeds, low-resolution cameras can outperform high-resolution ones, while requiring a significantly lower bandwidth.
We provide both empirical and theoretical evidence for this claim, which indicates that high-resolution event cameras exhibit higher per-pixel event rates.
In most cases, high-resolution event cameras show lower task performance than lower-resolution sensors under these conditions.
arXiv Detail & Related papers (2022-03-28T12:06:20Z)
- Moving Object Detection for Event-based Vision using k-means Clustering [0.0]
Moving object detection is a crucial task in computer vision.
Event-based cameras are bio-inspired cameras that work by mimicking the working of the human eye.
In this paper, we investigate the application of the k-means clustering technique in detecting moving objects in event-based data.
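As a rough sketch of the idea, plain k-means can be run directly on raw event coordinates (x, y, t); the paper's actual features and preprocessing may differ, and all names below are illustrative.

```python
import numpy as np

def cluster_events(events, k=2, iters=20):
    """Plain k-means over raw event coordinates (x, y, t), a simple proxy
    for separating moving objects in an event stream.
    events: sequence of (x, y, t). Returns per-event cluster labels.
    """
    pts = np.asarray(events, dtype=np.float64)
    centroids = pts[:k].copy()                # naive init: first k events
    labels = np.zeros(len(pts), dtype=int)
    for _ in range(iters):
        # Assign each event to its nearest centroid.
        d = np.linalg.norm(pts[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Recompute centroids; keep the old one if a cluster empties.
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = pts[labels == j].mean(axis=0)
    return labels
```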
arXiv Detail & Related papers (2021-09-04T14:43:14Z)
- Tracking 6-DoF Object Motion from Events and Frames [0.0]
We propose a novel approach for 6 degree-of-freedom (6-DoF) object motion tracking that combines measurements of event- and frame-based cameras.
arXiv Detail & Related papers (2021-03-29T12:39:38Z)
- End-to-end Learning of Object Motion Estimation from Retinal Events for Event-based Object Tracking [35.95703377642108]
We propose a novel deep neural network to learn and regress a parametric object-level motion/transform model for event-based object tracking.
To achieve this goal, we propose a synchronous Time-Surface with Linear Time Decay (TSLTD) representation.
We feed the sequence of TSLTD frames to a novel Retinal Motion Regression Network (RMRNet) to perform end-to-end 5-DoF object motion regression.
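A minimal sketch of a TSLTD-style frame, assuming events as (x, y, t) tuples: each pixel keeps its most recent event timestamp, decayed linearly so newer events are brighter. The exact normalization and polarity handling in the paper may differ; the function name is our own.

```python
import numpy as np

def tsltd_frame(events, height, width, window):
    """Build a Time-Surface with Linear Time Decay (TSLTD)-style frame.

    events: sequence of (x, y, t) within the last `window` seconds.
    Pixels with recent events are bright; older events fade linearly.
    """
    events = list(events)
    ts = np.full((height, width), -np.inf)
    for x, y, t in events:
        ts[y, x] = max(ts[y, x], t)           # keep latest event per pixel
    t_end = max(t for _, _, t in events)
    frame = 1.0 + (ts - t_end) / window       # linear decay with event age
    frame[~np.isfinite(frame)] = 0.0          # pixels with no events -> 0
    return np.clip(frame, 0.0, 1.0)
```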
arXiv Detail & Related papers (2020-02-14T08:19:50Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.