Emergent Visual Sensors for Autonomous Vehicles
- URL: http://arxiv.org/abs/2205.09383v2
- Date: Sun, 18 Jun 2023 08:29:37 GMT
- Title: Emergent Visual Sensors for Autonomous Vehicles
- Authors: You Li, Julien Moreau, Javier Ibanez-Guzman
- Abstract summary: We review the principles of four novel image sensors: infrared cameras, range-gated cameras, polarization cameras, and event cameras.
Their comparative advantages, existing or potential applications, and corresponding data processing algorithms are presented.
- Score: 3.3227094421785344
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: Autonomous vehicles rely on perception systems to understand their
surroundings for further navigation missions. Cameras are essential for
perception systems due to the advantages of object detection and recognition
provided by modern computer vision algorithms, compared to other sensors such
as LiDARs and radars. However, limited by its inherent imaging principle, a
standard RGB camera may perform poorly in a variety of adverse scenarios,
including but not limited to: low illumination, high contrast, bad weather such
as fog/rain/snow, etc. Meanwhile, estimating the 3D information from the 2D
image detection is generally more difficult when compared to LiDARs or radars.
Several new sensing technologies have emerged in recent years to address the
limitations of conventional RGB cameras. In this paper, we review the
principles of four novel image sensors: infrared cameras, range-gated cameras,
polarization cameras, and event cameras. Their comparative advantages, existing
or potential applications, and corresponding data processing algorithms are all
presented in a systematic manner. We expect that this study will assist
practitioners in the autonomous driving community with new perspectives and
insights.
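Of the four sensors surveyed, event cameras depart furthest from the frame-based model: they report asynchronous per-pixel brightness changes rather than full images. A minimal sketch of how such events are commonly accumulated into a frame that conventional vision algorithms can consume (the event tuple format `(x, y, t, polarity)` and the function name are illustrative assumptions, not an API from the paper):

```python
import numpy as np

def accumulate_events(events, height, width):
    """Accumulate (x, y, t, polarity) events into a signed frame.

    Summing event polarities over a time window yields a frame-like
    image, a common preprocessing step before applying standard
    frame-based detection or recognition networks.
    """
    frame = np.zeros((height, width), dtype=np.int32)
    for x, y, t, polarity in events:
        frame[y, x] += 1 if polarity > 0 else -1
    return frame

# Toy example: three events on a 4x4 sensor.
events = [(1, 2, 0.001, +1), (1, 2, 0.002, +1), (3, 0, 0.003, -1)]
frame = accumulate_events(events, height=4, width=4)
print(frame[2, 1])  # 2
print(frame[0, 3])  # -1
```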
Related papers
- Prototipo de un Contador Bidireccional Automático de Personas basado en sensores de visión 3D (Prototype of an Automatic Bidirectional People Counter based on 3D vision sensors) [39.58317527488534]
3D sensors, also known as RGB-D sensors, utilize depth images where each pixel measures the distance from the camera to objects.
The described prototype uses RGB-D sensors for bidirectional people counting in venues, aiding security and surveillance in spaces like stadiums or airports.
The system includes a RealSense D415 depth camera and a mini-computer running object detection algorithms to count people and a 2D camera for identity verification.
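The core of such a counter is not the detector but the crossing logic. A minimal sketch of bidirectional counting against a virtual line, assuming a tracker already supplies per-person centroid tracks (the `count_crossings` helper and the track format are hypothetical, not the prototype's actual code):

```python
def count_crossings(tracks, line_y):
    """Count bidirectional crossings of a horizontal virtual line.

    tracks: dict mapping a track id to a chronological list of
    (x, y) centroid positions from the detector/tracker.
    Returns (entries, exits): an entry crosses the line downward
    (y increasing), an exit crosses it upward.
    """
    entries = exits = 0
    for positions in tracks.values():
        for (x0, y0), (x1, y1) in zip(positions, positions[1:]):
            if y0 < line_y <= y1:
                entries += 1
            elif y1 < line_y <= y0:
                exits += 1
    return entries, exits

# Toy tracks: person 1 walks in (downward), person 2 walks out (upward).
tracks = {1: [(50, 10), (52, 30), (53, 60)],
          2: [(80, 70), (78, 40), (77, 15)]}
print(count_crossings(tracks, line_y=50))  # (1, 1)
```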
arXiv Detail & Related papers (2024-03-18T23:18:40Z)
- Dataset and Benchmark: Novel Sensors for Autonomous Vehicle Perception [7.474695739346621]
This paper introduces the Novel Sensors for Autonomous Vehicle Perception dataset to facilitate future research on this topic.
The data was collected by repeatedly driving two 8 km routes and includes varied lighting conditions and opposing viewpoint perspectives.
To our knowledge, the NSAVP dataset is the first to include stereo thermal cameras together with stereo event and monochrome cameras.
arXiv Detail & Related papers (2024-01-24T23:25:23Z)
- Polarimetric Imaging for Perception [3.093890460224435]
We analyze the potential for improvement in perception tasks when using an RGB-polarimetric camera.
We show that a quantifiable improvement can be achieved on these tasks using state-of-the-art deep neural networks.
arXiv Detail & Related papers (2023-05-24T06:42:27Z)
- Camera-Radar Perception for Autonomous Vehicles and ADAS: Concepts, Datasets and Metrics [77.34726150561087]
This work aims to carry out a study on the current scenario of camera and radar-based perception for ADAS and autonomous vehicles.
Concepts and characteristics related to both sensors, as well as to their fusion, are presented.
We give an overview of the Deep Learning-based detection and segmentation tasks, and the main datasets, metrics, challenges, and open questions in vehicle perception.
arXiv Detail & Related papers (2023-03-08T00:48:32Z)
- DensePose From WiFi [86.61881052177228]
We develop a deep neural network that maps the phase and amplitude of WiFi signals to UV coordinates within 24 human regions.
Our model can estimate the dense pose of multiple subjects, with comparable performance to image-based approaches.
arXiv Detail & Related papers (2022-12-31T16:48:43Z)
- Drone Detection and Tracking in Real-Time by Fusion of Different Sensing Modalities [66.4525391417921]
We design and evaluate a multi-sensor drone detection system.
Our solution also integrates a fish-eye camera to monitor a wider part of the sky and steer the other cameras towards objects of interest.
The thermal camera proves to be a feasible solution, performing as well as the video camera even though the model employed here has a lower resolution.
arXiv Detail & Related papers (2022-07-05T10:00:58Z)
- Rethinking of Radar's Role: A Camera-Radar Dataset and Systematic Annotator via Coordinate Alignment [38.24705460170415]
We propose a new dataset, named CRUW, with a systematic annotator and performance evaluation system.
CRUW aims to classify and localize the objects in 3D purely from radar's radio frequency (RF) images.
To the best of our knowledge, CRUW is the first public large-scale dataset with a systematic annotation and evaluation system.
arXiv Detail & Related papers (2021-05-11T17:13:45Z)
- Complex-valued Convolutional Neural Networks for Enhanced Radar Signal Denoising and Interference Mitigation [73.0103413636673]
We propose the use of Complex-Valued Convolutional Neural Networks (CVCNNs) to address the issue of mutual interference between radar sensors.
CVCNNs increase data efficiency, speed up network training, and substantially improve the conservation of phase information during interference removal.
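The key algebraic idea behind complex-valued layers can be illustrated without a deep-learning framework: a complex convolution decomposes into four real ones, which is what preserves phase. A NumPy-only sketch (the CVCNNs in the paper learn their kernels; the fixed-kernel function here is just an illustration):

```python
import numpy as np

def complex_conv1d(signal, kernel):
    """1-D complex convolution built from four real convolutions.

    For complex inputs a + ib and complex weights c + id, each product
    expands as (a*c - b*d) + i(a*d + b*c), so a complex layer can be
    assembled from real-valued convolutions while retaining phase.
    """
    a, b = signal.real, signal.imag
    c, d = kernel.real, kernel.imag
    real = np.convolve(a, c, mode="valid") - np.convolve(b, d, mode="valid")
    imag = np.convolve(a, d, mode="valid") + np.convolve(b, c, mode="valid")
    return real + 1j * imag

rng = np.random.default_rng(0)
x = rng.normal(size=16) + 1j * rng.normal(size=16)  # toy complex radar signal
w = rng.normal(size=3) + 1j * rng.normal(size=3)    # toy complex kernel
# Matches NumPy's native complex-valued convolution.
assert np.allclose(complex_conv1d(x, w), np.convolve(x, w, mode="valid"))
```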
arXiv Detail & Related papers (2021-04-29T10:06:29Z)
- All-Weather Object Recognition Using Radar and Infrared Sensing [1.7513645771137178]
This thesis explores new sensing developments based on long wave polarised infrared (IR) imagery and imaging radar to recognise objects.
First, we developed a methodology based on Stokes parameters using polarised infrared data to recognise vehicles using deep neural networks.
Second, we explored the potential of using only the power spectrum captured by low-THz radar sensors to perform object recognition in a controlled scenario.
Last, we created a new large-scale dataset in the "wild" with many different weather scenarios showing radar robustness to detect vehicles in adverse weather.
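The Stokes-parameter computation behind the first contribution is standard and can be sketched directly. This is the textbook formulation for four polarizer angles (0°, 45°, 90°, 135°), not the thesis's exact pipeline:

```python
import numpy as np

def linear_stokes(i0, i45, i90, i135):
    """Linear Stokes parameters from four polarizer-angle intensity images.

    S0 is total intensity; S1 and S2 are the linear polarization
    components. DoLP (degree of linear polarization) and AoLP (angle
    of linear polarization) are the descriptors commonly fed to
    downstream recognition networks.
    """
    s0 = 0.5 * (i0 + i45 + i90 + i135)
    s1 = i0 - i90
    s2 = i45 - i135
    dolp = np.sqrt(s1**2 + s2**2) / np.maximum(s0, 1e-9)
    aolp = 0.5 * np.arctan2(s2, s1)
    return s0, s1, s2, dolp, aolp

# Fully horizontally polarized light: i0 = 1, i90 = 0, i45 = i135 = 0.5.
s0, s1, s2, dolp, aolp = linear_stokes(np.array([1.0]), np.array([0.5]),
                                       np.array([0.0]), np.array([0.5]))
print(dolp)  # [1.]
```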
arXiv Detail & Related papers (2020-10-30T14:16:39Z)
- Depth Sensing Beyond LiDAR Range [84.19507822574568]
We propose a novel three-camera system that utilizes small field of view cameras.
Our system, along with our novel algorithm for computing metric depth, does not require full pre-calibration.
It can output dense depth maps with practically acceptable accuracy for scenes and objects at long distances.
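For context on why long-range depth is hard, the classical two-view triangulation relation (a generic illustration, not the paper's three-camera algorithm) shows how sensitive depth becomes to matching error at distance:

```python
def stereo_depth(disparity_px, focal_px, baseline_m):
    """Classical two-view triangulation: depth = f * B / d."""
    return focal_px * baseline_m / disparity_px

# With a 1000 px focal length and a 0.5 m baseline, an object at 250 m
# subtends only a 2 px disparity; a half-pixel matching error then shifts
# the estimate by tens of metres. This is why long-range systems favor
# wide baselines or narrow fields of view (which raise the effective
# focal length in pixels).
print(stereo_depth(2.0, 1000.0, 0.5))             # 250.0
print(round(stereo_depth(1.5, 1000.0, 0.5), 1))   # 333.3
```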
arXiv Detail & Related papers (2020-03-13T06:17:39Z)
- LIBRE: The Multiple 3D LiDAR Dataset [54.25307983677663]
We present LIBRE: LiDAR Benchmarking and Reference, a first-of-its-kind dataset featuring 10 different LiDAR sensors.
LIBRE will contribute to the research community to provide a means for a fair comparison of currently available LiDARs.
It will also facilitate the improvement of existing self-driving vehicles and robotics-related software.
arXiv Detail & Related papers (2020-03-13T06:17:39Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.