DSEC: A Stereo Event Camera Dataset for Driving Scenarios
- URL: http://arxiv.org/abs/2103.06011v1
- Date: Wed, 10 Mar 2021 12:10:33 GMT
- Title: DSEC: A Stereo Event Camera Dataset for Driving Scenarios
- Authors: Mathias Gehrig, Willem Aarents, Daniel Gehrig, Davide Scaramuzza
- Abstract summary: This work presents the first high-resolution, large-scale stereo dataset with event cameras.
The dataset contains 53 sequences collected by driving in a variety of illumination conditions.
It provides ground truth disparity for the development and evaluation of event-based stereo algorithms.
- Score: 55.79329250951028
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Once an academic venture, autonomous driving has received unparalleled
corporate funding in the last decade. Still, the operating conditions of
current autonomous cars are mostly restricted to ideal scenarios. This means
that driving in challenging illumination conditions such as night, sunrise, and
sunset remains an open problem. In these cases, standard cameras are being
pushed to their limits in terms of low light and high dynamic range
performance. To address these challenges, we propose, DSEC, a new dataset that
contains such demanding illumination conditions and provides a rich set of
sensory data. DSEC offers data from a wide-baseline stereo setup of two color
frame cameras and two high-resolution monochrome event cameras. In addition, we
collect lidar data and RTK GPS measurements, both hardware synchronized with
all camera data. One of the distinctive features of this dataset is the
inclusion of high-resolution event cameras. Event cameras have received
increasing attention for their high temporal resolution and high dynamic range
performance. However, due to their novelty, event camera datasets in driving
scenarios are rare. This work presents the first high-resolution, large-scale
stereo dataset with event cameras. The dataset contains 53 sequences collected
by driving in a variety of illumination conditions and provides ground truth
disparity for the development and evaluation of event-based stereo algorithms.
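The ground-truth disparity can be converted to metric depth with the standard pinhole stereo relation Z = f·B/d. Below is a minimal sketch; the focal length and baseline are placeholder values (DSEC distributes its actual calibration with the dataset):

```python
import numpy as np

# Hypothetical stereo parameters; the real values come from the
# dataset's calibration files and will differ from these placeholders.
FOCAL_LENGTH_PX = 560.0   # focal length in pixels (assumed)
BASELINE_M = 0.6          # stereo baseline in meters (assumed)

def disparity_to_depth(disparity: np.ndarray) -> np.ndarray:
    """Convert a disparity map (pixels) to metric depth via Z = f * B / d.

    Invalid (zero or negative) disparities map to infinity.
    """
    depth = np.full(disparity.shape, np.inf, dtype=np.float64)
    valid = disparity > 0
    depth[valid] = FOCAL_LENGTH_PX * BASELINE_M / disparity[valid]
    return depth

disparity = np.array([[0.0, 8.0], [16.0, 32.0]])
depth = disparity_to_depth(disparity)  # 560 * 0.6 / 8 = 42.0 m, etc.
```

Larger disparities correspond to closer objects; a zero disparity (no measurable offset between views) is treated as "at infinity" rather than a division-by-zero error.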
Related papers
- SID: Stereo Image Dataset for Autonomous Driving in Adverse Conditions [1.0805335573008565]
We introduce the Stereo Image dataset (SID), a large-scale stereo-image dataset that captures a wide spectrum of challenging real-world environmental scenarios.
The dataset includes sequence-level annotations for weather conditions, time of day, location, and road conditions, along with instances of camera lens soiling.
SID supports the development of algorithms that operate consistently and reliably across variable weather and lighting conditions, even in challenging situations such as lens soiling.
arXiv Detail & Related papers (2024-07-06T00:58:31Z)
- Dataset and Benchmark: Novel Sensors for Autonomous Vehicle Perception [7.474695739346621]
This paper introduces the Novel Sensors for Autonomous Vehicle Perception dataset to facilitate future research on this topic.
The data was collected by repeatedly driving two 8 km routes and includes varied lighting conditions and opposing viewpoint perspectives.
To our knowledge, the NSAVP dataset is the first to include stereo thermal cameras together with stereo event and monochrome cameras.
arXiv Detail & Related papers (2024-01-24T23:25:23Z)
- MUSES: The Multi-Sensor Semantic Perception Dataset for Driving under Uncertainty [46.369657697892634]
We introduce MUSES, the MUlti-SEnsor Semantic perception dataset for driving in adverse conditions under increased uncertainty.
The dataset integrates a frame camera, a lidar, a radar, an event camera, and an IMU/GNSS sensor.
MUSES proves both effective for training and challenging for evaluating models under diverse visual conditions.
arXiv Detail & Related papers (2024-01-23T13:43:17Z)
- EventAid: Benchmarking Event-aided Image/Video Enhancement Algorithms with Real-captured Hybrid Dataset [55.12137324648253]
Event cameras are an emerging imaging technology that offers advantages over conventional frame-based imaging sensors in dynamic range and sensing speed.
This paper focuses on five event-aided image and video enhancement tasks.
arXiv Detail & Related papers (2023-12-13T15:42:04Z)
- VECtor: A Versatile Event-Centric Benchmark for Multi-Sensor SLAM [31.779462222706346]
Event cameras hold strong potential to complement regular cameras in situations of high dynamics or challenging illumination.
Our contribution is the first complete set of benchmark datasets captured with a multi-sensor setup.
Individual sequences include both small and large-scale environments, and cover the specific challenges targeted by dynamic vision sensors.
arXiv Detail & Related papers (2022-07-04T13:37:26Z)
- Lasers to Events: Automatic Extrinsic Calibration of Lidars and Event Cameras [67.84498757689776]
This paper presents the first direct calibration method between event cameras and lidars.
It removes dependencies on frame-based camera intermediaries and/or highly-accurate hand measurements.
arXiv Detail & Related papers (2022-07-03T11:05:45Z)
- Are High-Resolution Event Cameras Really Needed? [62.70541164894224]
In low-illumination conditions and at high speeds, low-resolution cameras can outperform high-resolution ones, while requiring a significantly lower bandwidth.
We provide both empirical and theoretical evidence for this claim, showing that high-resolution event cameras exhibit higher per-pixel event rates.
In most cases, high-resolution event cameras show lower task performance than lower-resolution sensors under these conditions.
arXiv Detail & Related papers (2022-03-28T12:06:20Z)
- E$^2$(GO)MOTION: Motion Augmented Event Stream for Egocentric Action Recognition [21.199869051111367]
Event cameras capture pixel-level intensity changes in the form of "events".
N-EPIC-Kitchens is the first event-based camera extension of the large-scale EPIC-Kitchens dataset.
We show that event data provides a comparable performance to RGB and optical flow, yet without any additional flow computation at deploy time.
arXiv Detail & Related papers (2021-12-07T09:43:08Z)
- TUM-VIE: The TUM Stereo Visual-Inertial Event Dataset [50.8779574716494]
Event cameras are bio-inspired vision sensors which measure per pixel brightness changes.
They offer numerous benefits over traditional, frame-based cameras, including low latency, high dynamic range, high temporal resolution and low power consumption.
To foster the development of 3D perception and navigation algorithms with event cameras, we present the TUM-VIE dataset.
arXiv Detail & Related papers (2021-08-16T19:53:56Z)
- Learning to Detect Objects with a 1 Megapixel Event Camera [14.949946376335305]
Event cameras encode visual information with high temporal precision, low data-rate, and high-dynamic range.
Due to the novelty of the field, the performance of event-based systems on many vision tasks is still lower compared to conventional frame-based solutions.
arXiv Detail & Related papers (2020-09-28T16:03:59Z)
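Several of the papers above describe event cameras as measuring per-pixel brightness changes. A minimal sketch of the standard event-generation model follows; the contrast threshold is an illustrative value, not tied to any specific sensor:

```python
import numpy as np

# A pixel fires an event whenever its log-intensity changes by more
# than a contrast threshold C since the last event at that pixel.
C = 0.2  # contrast threshold (assumed, illustrative)

def generate_events(log_frames, timestamps):
    """Simulate events from a sequence of log-intensity frames.

    Returns a list of (t, y, x, polarity) tuples, polarity in {+1, -1}.
    """
    ref = log_frames[0].copy()          # per-pixel reference level
    events = []
    for t, frame in zip(timestamps[1:], log_frames[1:]):
        diff = frame - ref
        ys, xs = np.nonzero(np.abs(diff) >= C)
        for y, x in zip(ys, xs):
            polarity = 1 if diff[y, x] > 0 else -1
            events.append((t, y, x, polarity))
            ref[y, x] = frame[y, x]     # reset reference after firing
    return events

frames = [np.zeros((2, 2)), np.array([[0.3, 0.0], [0.0, -0.25]])]
events = generate_events(frames, [0.0, 1.0])
# Two events fire: a positive one at pixel (0, 0), a negative at (1, 1).
```

Because only pixels whose brightness changes produce output, the sensor's data rate scales with scene dynamics rather than with a fixed frame rate, which is the property the papers above exploit for high temporal resolution and high dynamic range.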
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it contains and is not responsible for any consequences arising from its use.