All-Weather Object Recognition Using Radar and Infrared Sensing
- URL: http://arxiv.org/abs/2010.16285v1
- Date: Fri, 30 Oct 2020 14:16:39 GMT
- Title: All-Weather Object Recognition Using Radar and Infrared Sensing
- Authors: Marcel Sheeny
- Abstract summary: This thesis explores new sensing developments based on long wave polarised infrared (IR) imagery and imaging radar to recognise objects.
First, we developed a methodology based on Stokes parameters using polarised infrared data to recognise vehicles using deep neural networks.
Second, we explored the potential of using only the power spectrum captured by low-THz radar sensors to perform object recognition in a controlled scenario.
Last, we created a new large-scale dataset in the "wild" with many different weather scenarios showing radar robustness to detect vehicles in adverse weather.
- Score: 1.7513645771137178
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Autonomous cars are an emerging technology with the capacity to change
human lives. The sensor systems currently most capable of perception
are based on optical sensors. For example, deep neural networks show
outstanding results in recognising objects when used to process data from
cameras and Light Detection And Ranging (LiDAR) sensors. However, these
sensors perform poorly under adverse weather conditions such as rain, fog, and
snow, owing to the wavelengths at which they operate. This thesis explores new sensing developments based
on long wave polarised infrared (IR) imagery and imaging radar to recognise
objects. First, we developed a methodology based on Stokes parameters using
polarised infrared data to recognise vehicles using deep neural networks.
Second, we explored the potential of using only the power spectrum captured by
low-THz radar sensors to perform object recognition in a controlled scenario.
This latter work is based on a data-driven approach together with the
development of a data augmentation method based on attenuation, range and
speckle noise. Last, we created a new large-scale dataset in the "wild" with
many different weather scenarios (sunny, overcast, night, fog, rain and snow)
showing radar robustness to detect vehicles in adverse weather. High resolution
radar and polarised IR imagery, combined with a deep learning approach, are
shown as a potential alternative to current automotive sensing systems based on
visible spectrum optical technology as they are more robust in severe weather
and adverse light conditions.
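The first contribution, recognising vehicles from Stokes parameters computed on polarised IR data, can be sketched with the standard four-angle polarimetric formulation (polariser intensities at 0°, 45°, 90° and 135°). This is a minimal illustration of the general technique, not the thesis's exact pre-processing:

```python
import numpy as np

def stokes_from_polarised_ir(i0, i45, i90, i135):
    """Compute the linear Stokes parameters from four polariser-angle
    intensity images (standard formulation)."""
    s0 = i0 + i90                   # total intensity
    s1 = i0 - i90                   # horizontal vs. vertical component
    s2 = i45 - i135                 # +45 deg vs. -45 deg component
    dolp = np.sqrt(s1**2 + s2**2) / np.maximum(s0, 1e-8)  # degree of linear polarisation
    aop = 0.5 * np.arctan2(s2, s1)  # angle of polarisation
    return s0, s1, s2, dolp, aop
```

Channels such as (S0, DoLP, AoP) stacked into a multi-channel image are the kind of input a deep neural network could then classify.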
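The attenuation, range and speckle-noise augmentation can likewise be sketched. The models below are generic radar conventions chosen for illustration (received power falling off as 1/R^4 per the radar equation, two-way attenuation in dB/m, multiplicative gamma-distributed speckle); the thesis's actual augmentation parameters are not specified here:

```python
import numpy as np

rng = np.random.default_rng(0)

def augment_radar_power(power, r_orig, r_new, atten_db_per_m=0.0, speckle_looks=4):
    """Simulate moving a target from range r_orig to r_new:
    - free-space received power falls off as 1/R^4 (radar equation),
    - optional two-way atmospheric attenuation in dB per metre,
    - multiplicative speckle drawn from a gamma distribution with
      mean 1 and shape equal to the number of looks."""
    range_gain = (r_orig / r_new) ** 4
    atten = 10 ** (-atten_db_per_m * 2 * (r_new - r_orig) / 10)
    speckle = rng.gamma(shape=speckle_looks, scale=1.0 / speckle_looks,
                        size=np.shape(power))
    return power * range_gain * atten * speckle
```

Applying such transforms to measured power spectra yields extra training samples at ranges and noise levels the sensor never recorded.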
Related papers
- Radar Fields: Frequency-Space Neural Scene Representations for FMCW Radar [62.51065633674272]
We introduce Radar Fields - a neural scene reconstruction method designed for active radar imagers.
Our approach unites an explicit, physics-informed sensor model with an implicit neural geometry and reflectance model to directly synthesize raw radar measurements.
We validate the effectiveness of the method across diverse outdoor scenarios, including urban scenes with dense vehicles and infrastructure.
arXiv Detail & Related papers (2024-05-07T20:44:48Z)
- The Radar Ghost Dataset -- An Evaluation of Ghost Objects in Automotive Radar Data [12.653873936535149]
Many surfaces in a typical traffic scenario appear flat relative to the radar's emitted signal.
This results in multi-path reflections, so-called ghost detections, in the radar signal.
We present a dataset with detailed manual annotations for different kinds of ghost detections.
arXiv Detail & Related papers (2024-04-01T19:20:32Z)
- Exploring Radar Data Representations in Autonomous Driving: A Comprehensive Review [9.68427762815025]
This review focuses on the different radar data representations utilized in autonomous driving systems.
We introduce the capabilities and limitations of the radar sensor.
For each radar representation, we examine the related datasets, methods, advantages and limitations.
arXiv Detail & Related papers (2023-12-08T06:31:19Z)
- Echoes Beyond Points: Unleashing the Power of Raw Radar Data in Multi-modality Fusion [74.84019379368807]
We propose a novel method named EchoFusion to skip the existing radar signal processing pipeline.
Specifically, we first generate the Bird's Eye View (BEV) queries and then take corresponding spectrum features from radar to fuse with other sensors.
arXiv Detail & Related papers (2023-07-31T09:53:50Z)
- Semantic Segmentation of Radar Detections using Convolutions on Point Clouds [59.45414406974091]
We introduce a deep-learning based method to convolve radar detections into point clouds.
We adapt this algorithm to radar-specific properties through distance-dependent clustering and pre-processing of input point clouds.
Our network outperforms state-of-the-art approaches that are based on PointNet++ on the task of semantic segmentation of radar point clouds.
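The distance-dependent clustering mentioned above might look like the following sketch: the neighbourhood radius grows with target range, since radar point density drops with distance. The growth rate and the single-linkage joining rule are illustrative assumptions, not the paper's exact algorithm:

```python
import numpy as np

def distance_dependent_eps(ranges, eps0=0.5, k=0.02):
    """Neighbourhood radius that grows with range; eps0 (metres) and
    k (metres per metre of range) are illustrative values."""
    return eps0 + k * ranges

def cluster_radar_points(points, eps0=0.5, k=0.02):
    """Naive single-linkage clustering via union-find: two detections
    join the same cluster when their separation is below the smaller
    of their two distance-dependent radii. Returns a cluster label
    (root index) per point."""
    n = len(points)
    ranges = np.linalg.norm(points, axis=1)
    eps = distance_dependent_eps(ranges, eps0, k)
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path compression
            i = parent[i]
        return i

    for i in range(n):
        for j in range(i + 1, n):
            if np.linalg.norm(points[i] - points[j]) <= min(eps[i], eps[j]):
                pi, pj = find(i), find(j)
                if pi != pj:
                    parent[pi] = pj
    return [find(i) for i in range(n)]
```

The O(n^2) pair loop is fine for the few hundred detections a radar frame typically yields; a spatial index would be the obvious refinement.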
arXiv Detail & Related papers (2023-05-22T07:09:35Z)
- RadarFormer: Lightweight and Accurate Real-Time Radar Object Detection Model [13.214257841152033]
Radar-centric datasets have received comparatively little attention in the development of deep learning techniques for radar perception.
We propose a transformer-based model, named RadarFormer, that utilizes state-of-the-art developments in vision deep learning.
Our model also introduces a channel-chirp-time merging module that reduces the size and complexity of our models by more than 10 times without compromising accuracy.
arXiv Detail & Related papers (2023-04-17T17:07:35Z)
- DensePose From WiFi [86.61881052177228]
We develop a deep neural network that maps the phase and amplitude of WiFi signals to UV coordinates within 24 human regions.
Our model can estimate the dense pose of multiple subjects, with comparable performance to image-based approaches.
arXiv Detail & Related papers (2022-12-31T16:48:43Z)
- Rethinking of Radar's Role: A Camera-Radar Dataset and Systematic Annotator via Coordinate Alignment [38.24705460170415]
We propose a new dataset, named CRUW, with a systematic annotator and performance evaluation system.
CRUW aims to classify and localize the objects in 3D purely from radar's radio frequency (RF) images.
To the best of our knowledge, CRUW is the first public large-scale dataset with a systematic annotation and evaluation system.
arXiv Detail & Related papers (2021-05-11T17:13:45Z)
- Complex-valued Convolutional Neural Networks for Enhanced Radar Signal Denoising and Interference Mitigation [73.0103413636673]
We propose the use of Complex-Valued Convolutional Neural Networks (CVCNNs) to address the issue of mutual interference between radar sensors.
CVCNNs increase data efficiency, speed up network training, and substantially improve the preservation of phase information during interference removal.
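The building block of a CVCNN, a complex-valued convolution, can be written as four real convolutions, which is what preserves the coupling between the real and imaginary parts (and hence the phase) of the radar signal. The 1-D NumPy sketch below illustrates the algebra only, not the paper's architecture:

```python
import numpy as np

def complex_conv1d(x, w):
    """Complex-valued convolution decomposed into four real convolutions,
    following (a + ib) * (c + id) = (a*c - b*d) + i(a*d + b*c)."""
    a, b = x.real, x.imag
    c, d = w.real, w.imag
    real = np.convolve(a, c, mode="valid") - np.convolve(b, d, mode="valid")
    imag = np.convolve(a, d, mode="valid") + np.convolve(b, c, mode="valid")
    return real + 1j * imag
```

In a CVCNN layer the same decomposition is applied per filter, with the real-valued convolutions implemented by the usual deep-learning primitives.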
arXiv Detail & Related papers (2021-04-29T10:06:29Z)
- RadarNet: Exploiting Radar for Robust Perception of Dynamic Objects [73.80316195652493]
We tackle the problem of exploiting Radar for perception in the context of self-driving cars.
We propose a new solution that exploits both LiDAR and Radar sensors for perception.
Our approach, dubbed RadarNet, features a voxel-based early fusion and an attention-based late fusion.
arXiv Detail & Related papers (2020-07-28T17:15:02Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.