Radar Occupancy Prediction with Lidar Supervision while Preserving
Long-Range Sensing and Penetrating Capabilities
- URL: http://arxiv.org/abs/2112.04282v1
- Date: Wed, 8 Dec 2021 13:38:58 GMT
- Authors: Pou-Chun Kung, Chieh-Chih Wang, Wen-Chieh Lin
- Abstract summary: Recent works have made enormous progress in classifying free and occupied spaces in radar images by leveraging lidar label supervision.
The sensing distance of the results is limited by the sensing range of lidar.
Some objects visible to lidar are invisible to radar, and some objects occluded in lidar scans are visible in radar images because of the radar's penetrating capability.
- Score: 10.133923613929575
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Radar shows great potential for autonomous driving by accomplishing
long-range sensing under diverse weather conditions. However, radar is also a
particularly challenging sensing modality due to radar noise. Recent works
have made enormous progress in classifying free and occupied spaces in radar
images by leveraging lidar label supervision. However, there are still several
unsolved issues. Firstly, the sensing distance of the results is limited by the
sensing range of lidar. Secondly, the performance of the results is degraded
by lidar supervision due to the physical sensing discrepancies between the two
sensors. For example, some objects visible to lidar are invisible to radar, and
some objects occluded in lidar scans are visible in radar images because of the
radar's penetrating capability. These sensing differences cause false positives
and degeneration of the penetrating capability, respectively.
In this paper, we propose training data preprocessing and polar sliding
window inference to solve the issues. The data preprocessing aims to reduce the
effect caused by radar-invisible measurements in lidar scans. The polar sliding
window inference aims to solve the limited sensing range issue by applying a
near-range trained network to the long-range region. Instead of using common
Cartesian representation, we propose to use polar representation to reduce the
shape dissimilarity between long-range and near-range data. We find that
extending a near-range trained network to long-range region inference in the
polar space achieves 4.2 times better IoU than in Cartesian space. In addition,
the polar sliding window inference preserves the radar's penetrating capability
by changing the viewpoint of the inference region, which makes some occluded
measurements appear non-occluded to a pretrained network.
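To make the polar idea concrete, here is a minimal, hypothetical sketch (not the authors' code; all grid sizes, function names, and the nearest-neighbor resampling choice are illustrative assumptions). A Cartesian bird's-eye-view radar image is resampled onto a polar (range, azimuth) grid, and fixed-size windows are then slid along the range axis:

```python
import numpy as np

def cartesian_to_polar(bev, num_ranges=128, num_angles=128, max_range=None):
    """Resample a Cartesian BEV grid (sensor at the center) onto a polar
    (range, azimuth) grid via nearest-neighbor lookup."""
    h, w = bev.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    if max_range is None:
        max_range = min(cy, cx)
    ranges = np.linspace(0.0, max_range, num_ranges)
    angles = np.linspace(-np.pi, np.pi, num_angles, endpoint=False)
    rr, aa = np.meshgrid(ranges, angles, indexing="ij")
    # Convert each (range, azimuth) cell back to Cartesian pixel indices.
    ys = np.clip(np.round(cy + rr * np.sin(aa)).astype(int), 0, h - 1)
    xs = np.clip(np.round(cx + rr * np.cos(aa)).astype(int), 0, w - 1)
    return bev[ys, xs]  # shape: (num_ranges, num_angles)

def polar_range_windows(polar_img, window=32, stride=16):
    """Yield fixed-size windows sliding along the range axis.
    Every window has the same (window, num_angles) shape, so a
    near-range-trained network sees far ranges as familiar input."""
    n = polar_img.shape[0]
    for start in range(0, n - window + 1, stride):
        yield start, polar_img[start:start + window]
```

The point of the sketch is the last comment: in polar space a far-range window has exactly the same shape as a near-range window, which is what lets a network trained only on near-range (lidar-supervised) data be slid outward to long-range regions.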
Related papers
- Radar Fields: Frequency-Space Neural Scene Representations for FMCW Radar [62.51065633674272]
We introduce Radar Fields - a neural scene reconstruction method designed for active radar imagers.
Our approach unites an explicit, physics-informed sensor model with an implicit neural geometry and reflectance model to directly synthesize raw radar measurements.
We validate the effectiveness of the method across diverse outdoor scenarios, including urban scenes with dense vehicles and infrastructure.
arXiv Detail & Related papers (2024-05-07T20:44:48Z)
- Echoes Beyond Points: Unleashing the Power of Raw Radar Data in Multi-modality Fusion [74.84019379368807]
We propose a novel method named EchoFusion to skip the existing radar signal processing pipeline.
Specifically, we first generate the Bird's Eye View (BEV) queries and then take corresponding spectrum features from radar to fuse with other sensors.
arXiv Detail & Related papers (2023-07-31T09:53:50Z)
- Semantic Segmentation of Radar Detections using Convolutions on Point Clouds [59.45414406974091]
We introduce a deep-learning based method to convolve radar detections into point clouds.
We adapt this algorithm to radar-specific properties through distance-dependent clustering and pre-processing of input point clouds.
Our network outperforms state-of-the-art approaches that are based on PointNet++ on the task of semantic segmentation of radar point clouds.
arXiv Detail & Related papers (2023-05-22T07:09:35Z)
- RadarFormer: Lightweight and Accurate Real-Time Radar Object Detection Model [13.214257841152033]
Radar-centric datasets receive little attention in the development of deep learning techniques for radar perception.
We propose a transformers-based model, named RadarFormer, that utilizes state-of-the-art developments in vision deep learning.
Our model also introduces a channel-chirp-time merging module that reduces the size and complexity of our models by more than 10 times without compromising accuracy.
arXiv Detail & Related papers (2023-04-17T17:07:35Z)
- Pointillism: Accurate 3D bounding box estimation with multi-radars [6.59119432945925]
We introduce Pointillism, a system that combines data from multiple spatially separated radars with an optimal separation to mitigate these problems.
We present the design of RP-net, a novel deep learning architecture, designed explicitly for radar's sparse data distribution, to enable accurate 3D bounding box estimation.
arXiv Detail & Related papers (2022-03-08T23:09:58Z)
- Anomaly Detection in Radar Data Using PointNets [7.3600716208089825]
We present an approach based on PointNets to detect anomalous radar targets.
Our method is evaluated on a real-world dataset in urban scenarios.
arXiv Detail & Related papers (2021-09-20T10:02:24Z)
- Complex-valued Convolutional Neural Networks for Enhanced Radar Signal Denoising and Interference Mitigation [73.0103413636673]
We propose the use of Complex-Valued Convolutional Neural Networks (CVCNNs) to address the issue of mutual interference between radar sensors.
CVCNNs increase data efficiency, speed up network training, and substantially improve the conservation of phase information during interference removal.
arXiv Detail & Related papers (2021-04-29T10:06:29Z)
- Radar Artifact Labeling Framework (RALF): Method for Plausible Radar Detections in Datasets [2.5899040911480187]
We propose a cross sensor Radar Artifact Labeling Framework (RALF) for labeling sparse radar point clouds.
RALF provides plausibility labels for radar raw detections, distinguishing between artifacts and targets.
We validate the results by evaluating error metrics on a semi-manually labeled ground-truth dataset of $3.28 \cdot 10^6$ points.
arXiv Detail & Related papers (2020-12-03T15:11:31Z)
- LiRaNet: End-to-End Trajectory Prediction using Spatio-Temporal Radar Fusion [52.59664614744447]
We present LiRaNet, a novel end-to-end trajectory prediction method which utilizes radar sensor information along with widely used lidar and high definition (HD) maps.
Automotive radar provides rich, complementary information, allowing for longer range vehicle detection as well as instantaneous velocity measurements.
arXiv Detail & Related papers (2020-10-02T00:13:00Z)
- Estimating the Magnitude and Phase of Automotive Radar Signals under Multiple Interference Sources with Fully Convolutional Networks [22.081568892330996]
Radar sensors are gradually becoming widespread equipment for road vehicles, playing a crucial role in autonomous driving and road safety.
The broad adoption of radar sensors increases the chance of interference among sensors from different vehicles, generating corrupted range profiles and range-Doppler maps.
In this paper, we propose a fully convolutional neural network for automotive radar interference mitigation.
arXiv Detail & Related papers (2020-08-11T18:50:38Z)
- RadarNet: Exploiting Radar for Robust Perception of Dynamic Objects [73.80316195652493]
We tackle the problem of exploiting Radar for perception in the context of self-driving cars.
We propose a new solution that exploits both LiDAR and Radar sensors for perception.
Our approach, dubbed RadarNet, features a voxel-based early fusion and an attention-based late fusion.
arXiv Detail & Related papers (2020-07-28T17:15:02Z)
This list is automatically generated from the titles and abstracts of the papers in this site.