RODNet: Radar Object Detection Using Cross-Modal Supervision
- URL: http://arxiv.org/abs/2003.01816v2
- Date: Mon, 8 Feb 2021 07:00:42 GMT
- Title: RODNet: Radar Object Detection Using Cross-Modal Supervision
- Authors: Yizhou Wang, Zhongyu Jiang, Xiangyu Gao, Jenq-Neng Hwang, Guanbin Xing, Hui Liu
- Abstract summary: Radar is usually more robust than the camera in severe driving scenarios.
Unlike the RGB images captured by a camera, semantic information is noticeably harder to extract from radar signals.
We propose a deep radar object detection network (RODNet) to effectively detect objects purely from the radar frequency data.
- Score: 34.33920572597379
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Radar is usually more robust than the camera in severe driving scenarios,
e.g., weak/strong lighting and bad weather. However, unlike RGB images captured
by a camera, the semantic information from the radar signals is noticeably
difficult to extract. In this paper, we propose a deep radar object detection
network (RODNet), to effectively detect objects purely from the carefully
processed radar frequency data in the format of range-azimuth frequency
heatmaps (RAMaps). Three different 3D autoencoder-based architectures are
introduced to predict object confidence distribution from each snippet of the
input RAMaps. The final detection results are then calculated using our
post-processing method, called location-based non-maximum suppression (L-NMS).
Instead of using burdensome human-labeled ground truth, we train the RODNet
using the annotations generated automatically by a novel 3D localization method
using a camera-radar fusion (CRF) strategy. To train and evaluate our method,
we build a new dataset -- CRUW, containing synchronized videos and RAMaps in
various driving scenarios. Extensive experiments show that RODNet achieves
favorable object detection performance without requiring a camera.
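
To make the described pipeline concrete, below is a minimal PyTorch sketch, not the authors' implementation: a toy 3D-convolutional encoder-decoder that maps a RAMap snippet to per-class confidence maps, followed by a simplified stand-in for L-NMS that greedily suppresses nearby same-class peaks. The tensor shapes, class count, the names ToyRODNet, peaks_from_confmap, and location_based_nms, and the plain distance-based suppression rule are all illustrative assumptions; the paper's L-NMS is built on an object location similarity measure, and its three architectures are more elaborate than this single encoder-decoder.

```python
# Hedged sketch of RAMap-snippet -> confidence-map prediction plus a simplified L-NMS.
# Shapes and hyperparameters are made up for illustration only.
import torch
import torch.nn as nn


class ToyRODNet(nn.Module):
    """Tiny 3D autoencoder: (B, 2, T, R, A) RAMap snippet -> (B, C, T, R, A) confidences."""

    def __init__(self, in_ch=2, num_classes=3):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv3d(in_ch, 16, kernel_size=3, stride=(1, 2, 2), padding=1),
            nn.ReLU(inplace=True),
            nn.Conv3d(16, 32, kernel_size=3, stride=(1, 2, 2), padding=1),
            nn.ReLU(inplace=True),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose3d(32, 16, kernel_size=(3, 4, 4), stride=(1, 2, 2), padding=1),
            nn.ReLU(inplace=True),
            nn.ConvTranspose3d(16, num_classes, kernel_size=(3, 4, 4), stride=(1, 2, 2), padding=1),
        )

    def forward(self, x):
        # Sigmoid turns decoder outputs into per-class object confidence maps.
        return torch.sigmoid(self.decoder(self.encoder(x)))


def peaks_from_confmap(confmap, threshold=0.5):
    """Collect local peaks (class, frame, range_bin, azimuth_bin, score) above threshold.

    confmap: (C, T, R, A) tensor for one snippet.
    """
    C, T, R, A = confmap.shape
    # A 3x3 max-pool per frame marks local maxima in the range-azimuth plane.
    pooled = nn.functional.max_pool2d(
        confmap.reshape(C * T, 1, R, A), kernel_size=3, stride=1, padding=1
    ).reshape(C, T, R, A)
    is_peak = (confmap == pooled) & (confmap > threshold)
    return [
        (c, t, r, a, confmap[c, t, r, a].item())
        for c, t, r, a in is_peak.nonzero(as_tuple=False).tolist()
    ]


def location_based_nms(dets, min_dist=3.0):
    """Greedy stand-in for L-NMS: keep the strongest detection and drop same-class
    detections in the same frame whose range-azimuth bins lie within min_dist."""
    dets = sorted(dets, key=lambda d: d[4], reverse=True)
    kept = []
    for c, t, r, a, s in dets:
        too_close = any(
            kc == c and kt == t and ((kr - r) ** 2 + (ka - a) ** 2) ** 0.5 < min_dist
            for kc, kt, kr, ka, _ in kept
        )
        if not too_close:
            kept.append((c, t, r, a, s))
    return kept


if __name__ == "__main__":
    model = ToyRODNet()
    snippet = torch.randn(1, 2, 4, 128, 128)  # (B, channels, T, R, A), made-up sizes
    with torch.no_grad():
        confmaps = model(snippet)
    print(confmaps.shape)  # torch.Size([1, 3, 4, 128, 128])
    dets = location_based_nms(peaks_from_confmap(confmaps[0]))
    print(len(dets), "detections after simplified L-NMS")
```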