TUM-VIE: The TUM Stereo Visual-Inertial Event Dataset
- URL: http://arxiv.org/abs/2108.07329v1
- Date: Mon, 16 Aug 2021 19:53:56 GMT
- Title: TUM-VIE: The TUM Stereo Visual-Inertial Event Dataset
- Authors: Simon Klenk, Jason Chui, Nikolaus Demmel, Daniel Cremers
- Abstract summary: Event cameras are bio-inspired vision sensors which measure per-pixel brightness changes.
They offer numerous benefits over traditional, frame-based cameras, including low latency, high dynamic range, high temporal resolution and low power consumption.
To foster the development of 3D perception and navigation algorithms with event cameras, we present the TUM-VIE dataset.
- Score: 50.8779574716494
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Event cameras are bio-inspired vision sensors which measure per-pixel
brightness changes. They offer numerous benefits over traditional, frame-based
cameras, including low latency, high dynamic range, high temporal resolution
and low power consumption. Thus, these sensors are suited for robotics and
virtual reality applications. To foster the development of 3D perception and
navigation algorithms with event cameras, we present the TUM-VIE dataset. It
consists of a large variety of handheld and head-mounted sequences in indoor
and outdoor environments, including rapid motion during sports and high dynamic
range scenarios. The dataset contains stereo event data, stereo grayscale
frames at 20Hz as well as IMU data at 200Hz. Timestamps between all sensors are
synchronized in hardware. The event cameras have a sensor resolution of
1280x720 pixels, which is larger than the sensors used in existing stereo
event datasets by at least a factor of ten. We provide ground truth poses
from a motion capture system at 120Hz during the beginning and end of each
sequence, which can be used for trajectory evaluation. TUM-VIE includes
challenging sequences where state-of-the-art visual SLAM algorithms either fail
or result in large drift. Hence, our dataset can help to push the boundary of
future research on event-based visual-inertial perception algorithms.
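
Because all sensor streams share hardware-synchronized timestamps while arriving at different rates (events asynchronously, frames at 20Hz, IMU at 200Hz, mocap at 120Hz), aligning them reduces to timestamp lookups, and the mocap poses support a standard absolute trajectory error (ATE) evaluation. Below is a minimal Python sketch of both, under explicit assumptions: the HDF5 layout (`events/x`, `events/y`, `events/t`, `events/p`), microsecond timestamps, and the helper names are hypothetical for illustration, not the dataset's documented format.

```python
# Sketch: timestamp-based alignment of TUM-VIE-style sensor streams and a
# translation-only ATE. The HDF5 layout ("events/t" etc.) and microsecond
# units are ASSUMPTIONS for illustration, not the documented file format.
import h5py
import numpy as np

def load_events(path, t0_us, t1_us):
    """Return events (x, y, t, p) inside the time window [t0_us, t1_us)."""
    with h5py.File(path, "r") as f:
        t = f["events/t"][:]  # assumed: sorted event timestamps in microseconds
        lo, hi = np.searchsorted(t, [t0_us, t1_us])
        return {k: f["events/" + k][lo:hi] for k in ("x", "y", "t", "p")}

def pose_at(mocap_t_us, mocap_xyz, query_t_us):
    """Look up the first 120Hz mocap sample at or after a query time."""
    i = np.clip(np.searchsorted(mocap_t_us, query_t_us), 0, len(mocap_t_us) - 1)
    return mocap_xyz[i]

def ate_rmse(gt_xyz, est_xyz):
    """Translation RMSE after rigid (Umeyama, no-scale) alignment.

    Both inputs are (N, 3) position arrays sampled at matching timestamps.
    """
    mu_g, mu_e = gt_xyz.mean(axis=0), est_xyz.mean(axis=0)
    # Cross-covariance between centered estimated and ground-truth positions.
    H = (est_xyz - mu_e).T @ (gt_xyz - mu_g)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection in the least-squares rotation.
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ S @ U.T
    aligned = (est_xyz - mu_e) @ R.T + mu_g
    return float(np.sqrt(np.mean(np.sum((aligned - gt_xyz) ** 2, axis=1))))
```

Since mocap ground truth is only available during the beginning and end of each sequence, an evaluation in the dataset's spirit would restrict `ate_rmse` to those segments.
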
Related papers
- BlinkTrack: Feature Tracking over 100 FPS via Events and Images [50.98675227695814]
We propose a novel framework, BlinkTrack, which integrates event data with RGB images for high-frequency feature tracking.
Our method extends the traditional Kalman filter into a learning-based framework, utilizing differentiable Kalman filters in both event and image branches.
Experimental results indicate that BlinkTrack significantly outperforms existing event-based methods.
arXiv Detail & Related papers (2024-09-26T15:54:18Z)
- ES-PTAM: Event-based Stereo Parallel Tracking and Mapping [11.801511288805225]
Event cameras offer advantages to overcome the limitations of standard cameras.
We propose a novel event-based stereo VO system by combining two ideas.
We evaluate the system on five real-world datasets.
arXiv Detail & Related papers (2024-08-28T07:56:28Z)
- VECtor: A Versatile Event-Centric Benchmark for Multi-Sensor SLAM [31.779462222706346]
Event cameras hold strong potential to complement regular cameras in situations of high dynamics or challenging illumination.
Our contribution is the first complete set of benchmark datasets captured with a multi-sensor setup.
Individual sequences include both small and large-scale environments, and cover the specific challenges targeted by dynamic vision sensors.
arXiv Detail & Related papers (2022-07-04T13:37:26Z)
- ESL: Event-based Structured Light [62.77144631509817]
Event cameras are bio-inspired sensors providing significant advantages over standard cameras.
We propose a novel structured-light system using an event camera to tackle the problem of accurate and high-speed depth sensing.
arXiv Detail & Related papers (2021-11-30T15:47:39Z)
- Event Guided Depth Sensing [50.997474285910734]
We present an efficient bio-inspired event-camera-driven depth estimation algorithm.
In our approach, we illuminate areas of interest densely, depending on the scene activity detected by the event camera.
We show the feasibility of our approach in simulated autonomous driving sequences and real indoor environments.
arXiv Detail & Related papers (2021-10-20T11:41:11Z)
- Moving Object Detection for Event-based vision using Graph Spectral Clustering [6.354824287948164]
Moving object detection has been a central topic of discussion in computer vision for its wide range of applications.
We present an unsupervised Graph Spectral Clustering technique for Moving Object Detection in Event-based data.
We additionally show how the optimum number of moving objects can be automatically determined.
arXiv Detail & Related papers (2021-09-30T10:19:22Z)
- VisEvent: Reliable Object Tracking via Collaboration of Frame and Event Flows [93.54888104118822]
Due to the lack of a realistic, large-scale dataset for this task, we propose a Visible-Event benchmark (termed VisEvent).
Our dataset consists of 820 video pairs captured under low illumination, high speed, and background clutter scenarios.
Based on VisEvent, we transform the event flows into event images and construct more than 30 baseline methods.
arXiv Detail & Related papers (2021-08-11T03:55:12Z)
- DSEC: A Stereo Event Camera Dataset for Driving Scenarios [55.79329250951028]
This work presents the first high-resolution, large-scale stereo dataset with event cameras.
The dataset contains 53 sequences collected by driving in a variety of illumination conditions.
It provides ground truth disparity for the development and evaluation of event-based stereo algorithms.
arXiv Detail & Related papers (2021-03-10T12:10:33Z)
- Asynchronous Corner Tracking Algorithm based on Lifetime of Events for DAVIS Cameras [0.9988653233188148]
Event cameras, such as the Dynamic and Active-pixel Vision Sensor (DAVIS), capture intensity changes in the scene and generate a stream of events in an asynchronous fashion.
The output rate of such cameras can reach up to 10 million events per second in highly dynamic environments.
A novel asynchronous corner tracking method is proposed that uses both events and intensity images captured by a DAVIS camera.
arXiv Detail & Related papers (2020-10-29T12:02:40Z)
- A Multi-spectral Dataset for Evaluating Motion Estimation Systems [7.953825491774407]
This paper presents a novel dataset for evaluating the performance of multi-spectral motion estimation systems.
All the sequences are recorded from a handheld multi-spectral device.
The depth images are captured by a Microsoft Kinect2 and can benefit learning-based cross-modality stereo matching.
arXiv Detail & Related papers (2020-07-01T17:11:02Z)
This list is automatically generated from the titles and abstracts of the papers on this site.