Dataset and Benchmark: Novel Sensors for Autonomous Vehicle Perception
- URL: http://arxiv.org/abs/2401.13853v1
- Date: Wed, 24 Jan 2024 23:25:23 GMT
- Title: Dataset and Benchmark: Novel Sensors for Autonomous Vehicle Perception
- Authors: Spencer Carmichael, Austin Buchan, Mani Ramanagopal, Radhika Ravi, Ram
Vasudevan, Katherine A. Skinner
- Abstract summary: This paper introduces the Novel Sensors for Autonomous Vehicle Perception dataset to facilitate future research on this topic.
The data was collected by repeatedly driving two 8 km routes and includes varied lighting conditions and opposing viewpoint perspectives.
To our knowledge, the NSAVP dataset is the first to include stereo thermal cameras together with stereo event and monochrome cameras.
- Score: 7.474695739346621
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Conventional cameras employed in autonomous vehicle (AV) systems support many
perception tasks, but are challenged by low-light or high dynamic range scenes,
adverse weather, and fast motion. Novel sensors, such as event and thermal
cameras, offer capabilities with the potential to address these scenarios, but
they remain to be fully exploited. This paper introduces the Novel Sensors for
Autonomous Vehicle Perception (NSAVP) dataset to facilitate future research on
this topic. The dataset was captured with a platform including stereo event,
thermal, monochrome, and RGB cameras as well as a high precision navigation
system providing ground truth poses. The data was collected by repeatedly
driving two ~8 km routes and includes varied lighting conditions and opposing
viewpoint perspectives. We provide benchmarking experiments on the task of
place recognition to demonstrate challenges and opportunities for novel sensors
to enhance critical AV perception tasks. To our knowledge, the NSAVP dataset is
the first to include stereo thermal cameras together with stereo event and
monochrome cameras. The dataset and supporting software suite are available at:
https://umautobots.github.io/nsavp
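As a rough illustration of the place-recognition benchmarking described above, the sketch below computes recall@1: query descriptors from one traversal are matched to reference descriptors from another, and a match counts as correct if the retrieved frame's ground-truth position lies within a fixed radius. The descriptor arrays and the 10 m threshold are assumptions for illustration, not the NSAVP software suite's actual interface.

```python
# Minimal recall@1 sketch for place recognition across two traversals of a route.
# Descriptors and poses are hypothetical placeholders, not NSAVP loader outputs.
import numpy as np

def recall_at_1(query_desc, ref_desc, query_pos, ref_pos, dist_thresh_m=10.0):
    """query_desc/ref_desc: (Nq, D), (Nr, D) global image descriptors;
    query_pos/ref_pos: (Nq, 3), (Nr, 3) ground-truth positions in metres.
    Returns the fraction of queries whose top-1 retrieval lies within dist_thresh_m."""
    # Cosine-similarity retrieval over L2-normalised descriptors.
    q = query_desc / np.linalg.norm(query_desc, axis=1, keepdims=True)
    r = ref_desc / np.linalg.norm(ref_desc, axis=1, keepdims=True)
    top1 = np.argmax(q @ r.T, axis=1)                        # best reference per query
    err = np.linalg.norm(query_pos - ref_pos[top1], axis=1)  # metric error of each match
    return float(np.mean(err < dist_thresh_m))

# Example with random stand-ins for two traversals of an ~8 km route.
rng = np.random.default_rng(0)
print(recall_at_1(rng.normal(size=(100, 256)), rng.normal(size=(500, 256)),
                  rng.uniform(0, 8000, (100, 3)), rng.uniform(0, 8000, (500, 3))))
```

Cosine similarity over normalised global descriptors is a common retrieval choice; any image-level descriptor (computed from monochrome, thermal, or event-based imagery) could be substituted.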
Related papers
- ES-PTAM: Event-based Stereo Parallel Tracking and Mapping [11.801511288805225]
Event cameras offer advantages to overcome the limitations of standard cameras.
We propose a novel event-based stereo VO system by combining two ideas.
We evaluate the system on five real-world datasets.
arXiv Detail & Related papers (2024-08-28T07:56:28Z)
- Camera-Radar Perception for Autonomous Vehicles and ADAS: Concepts, Datasets and Metrics [77.34726150561087]
This work aims to carry out a study on the current scenario of camera and radar-based perception for ADAS and autonomous vehicles.
Concepts and characteristics related to both sensors, as well as to their fusion, are presented.
We give an overview of the Deep Learning-based detection and segmentation tasks, and the main datasets, metrics, challenges, and open questions in vehicle perception.
arXiv Detail & Related papers (2023-03-08T00:48:32Z)
- Drone Detection and Tracking in Real-Time by Fusion of Different Sensing Modalities [66.4525391417921]
We design and evaluate a multi-sensor drone detection system.
Our solution integrates a fish-eye camera as well to monitor a wider part of the sky and steer the other cameras towards objects of interest.
The thermal camera is shown to be a feasible solution that performs as well as the video camera, even though the thermal camera employed here has a lower resolution.
arXiv Detail & Related papers (2022-07-05T10:00:58Z)
- Emergent Visual Sensors for Autonomous Vehicles [3.3227094421785344]
We review the principles of four novel image sensors: infrared cameras, range-gated cameras, polarization cameras, and event cameras.
Their comparative advantages, existing or potential applications, and corresponding data processing algorithms are presented.
arXiv Detail & Related papers (2022-05-19T08:29:30Z)
- ViViD++: Vision for Visibility Dataset [14.839450468199457]
We present a dataset capturing diverse visual data formats that target varying luminance conditions.
Despite the potential of alternative sensors, there are still few datasets with alternative vision sensors.
We provide these measurements along with inertial sensors and ground-truth for developing robust visual SLAM under poor illumination.
arXiv Detail & Related papers (2022-04-13T06:01:27Z)
- Rope3D: The Roadside Perception Dataset for Autonomous Driving and Monocular 3D Object Detection Task [48.555440807415664]
We present Rope3D, the first high-diversity, challenging roadside perception 3D dataset, captured from a novel view.
The dataset consists of 50k images and over 1.5M 3D objects in various scenes.
We propose to leverage the geometry constraint to resolve the inherent ambiguities caused by various sensors and viewpoints.
arXiv Detail & Related papers (2022-03-25T12:13:23Z)
- TUM-VIE: The TUM Stereo Visual-Inertial Event Dataset [50.8779574716494]
Event cameras are bio-inspired vision sensors which measure per-pixel brightness changes (a minimal event-accumulation sketch appears after this list).
They offer numerous benefits over traditional, frame-based cameras, including low latency, high dynamic range, high temporal resolution and low power consumption.
To foster the development of 3D perception and navigation algorithms with event cameras, we present the TUM-VIE dataset.
arXiv Detail & Related papers (2021-08-16T19:53:56Z)
- DSEC: A Stereo Event Camera Dataset for Driving Scenarios [55.79329250951028]
This work presents the first high-resolution, large-scale stereo dataset with event cameras.
The dataset contains 53 sequences collected by driving in a variety of illumination conditions.
It provides ground truth disparity for the development and evaluation of event-based stereo algorithms.
arXiv Detail & Related papers (2021-03-10T12:10:33Z)
- Real-Time Drone Detection and Tracking With Visible, Thermal and Acoustic Sensors [66.4525391417921]
A thermal infrared camera is shown to be a feasible solution to the drone detection task.
The detector performance as a function of the sensor-to-target distance is also investigated.
A novel video dataset containing 650 annotated infrared and visible videos of drones, birds, airplanes and helicopters is also presented.
arXiv Detail & Related papers (2020-07-14T23:06:42Z)
- SurfelGAN: Synthesizing Realistic Sensor Data for Autonomous Driving [27.948417322786575]
We present a simple yet effective approach to generate realistic scenario sensor data.
Our approach uses texture-mapped surfels to efficiently reconstruct the scene from an initial vehicle pass or set of passes.
We then leverage a SurfelGAN network to reconstruct realistic camera images for novel positions and orientations of the self-driving vehicle.
arXiv Detail & Related papers (2020-05-08T04:01:14Z)
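Several entries above (TUM-VIE, DSEC, ES-PTAM) concern event cameras, which output an asynchronous stream of per-pixel brightness-change events rather than frames. As a minimal, dataset-agnostic sketch, the function below accumulates such a stream into a signed event frame; the array names and the 640x480 resolution are illustrative assumptions, not tied to any of these datasets' loaders.

```python
# Minimal sketch: accumulate an asynchronous event stream (x, y, polarity) into a
# signed 2D "event frame". Names and resolution are illustrative assumptions only.
import numpy as np

def events_to_frame(xs, ys, ps, height=480, width=640):
    """xs, ys: pixel coordinates; ps: polarity in {-1, +1} (brightness down/up).
    Returns an (height, width) array where each pixel sums the polarities of the
    events that landed on it over a fixed time window."""
    frame = np.zeros((height, width), dtype=np.int32)
    np.add.at(frame, (ys, xs), ps)   # unbuffered add handles repeated pixel hits
    return frame

# Example with a synthetic burst of 10,000 events.
rng = np.random.default_rng(0)
n = 10_000
frame = events_to_frame(rng.integers(0, 640, n), rng.integers(0, 480, n),
                        rng.choice([-1, 1], n))
print(frame.shape, frame.min(), frame.max())
```

Fixed-window accumulation is only one of several common event representations (time surfaces and voxel grids are others), but it is often sufficient for quick visualisation and sanity checks.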