Probabilistic Oriented Object Detection in Automotive Radar
- URL: http://arxiv.org/abs/2004.05310v2
- Date: Sat, 18 Apr 2020 03:49:39 GMT
- Title: Probabilistic Oriented Object Detection in Automotive Radar
- Authors: Xu Dong, Pengluo Wang, Pengyue Zhang, Langechuan Liu
- Abstract summary: We propose a deep-learning-based algorithm for radar object detection.
We created a new multimodal dataset with 102544 frames of raw radar and synchronized LiDAR data.
Our best performing radar detection model achieves 77.28% AP under oriented IoU of 0.3.
- Score: 8.281391209717103
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Autonomous radar has been an integral part of advanced driver assistance
systems due to its robustness to adverse weather and various lighting
conditions. Conventional automotive radars use digital signal processing (DSP)
algorithms to process raw data into sparse radar pins that do not provide
information regarding the size and orientation of the objects. In this paper,
we propose a deep-learning-based algorithm for radar object detection. The
algorithm takes in radar data in its raw tensor representation and places
probabilistic oriented bounding boxes around the detected objects in
bird's-eye-view space. We created a new multimodal dataset with 102544 frames
of raw radar and synchronized LiDAR data. To reduce human annotation effort we
developed a scalable pipeline to automatically annotate ground truth using
LiDAR as reference. Based on this dataset we developed a vehicle detection
pipeline using raw radar data as the only input. Our best performing radar
detection model achieves 77.28% AP under oriented IoU of 0.3. To the best of
our knowledge, this is the first attempt to investigate object detection with
raw radar data for conventional corner automotive radars.
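The 0.3 oriented-IoU threshold above is evaluated between rotated boxes in bird's-eye-view space. As an illustration only (not the paper's code), here is a minimal pure-Python sketch of oriented IoU via Sutherland-Hodgman polygon clipping, assuming boxes are parameterized as (cx, cy, w, l, θ) with length l along the heading:

```python
import math

def box_corners(cx, cy, w, l, theta):
    """Corners of an oriented BEV box, counter-clockwise (length along heading)."""
    c, s = math.cos(theta), math.sin(theta)
    local = [(-l / 2, -w / 2), (l / 2, -w / 2), (l / 2, w / 2), (-l / 2, w / 2)]
    return [(cx + c * x - s * y, cy + s * x + c * y) for x, y in local]

def poly_area(poly):
    """Shoelace formula for the area of a simple polygon."""
    n = len(poly)
    return abs(sum(poly[i][0] * poly[(i + 1) % n][1]
                   - poly[(i + 1) % n][0] * poly[i][1] for i in range(n))) / 2.0

def clip_halfplane(subject, a, b):
    """Sutherland-Hodgman step: keep the part of `subject` left of edge a->b."""
    def inside(p):
        return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0]) >= 0.0

    def intersect(p, q):
        # Intersection of the infinite line a->b with segment p->q.
        dx, dy = b[0] - a[0], b[1] - a[1]
        ex, ey = q[0] - p[0], q[1] - p[1]
        t = ((p[0] - a[0]) * ey - (p[1] - a[1]) * ex) / (dx * ey - dy * ex)
        return (a[0] + t * dx, a[1] + t * dy)

    out = []
    for i in range(len(subject)):
        p, q = subject[i], subject[(i + 1) % len(subject)]
        if inside(q):
            if not inside(p):
                out.append(intersect(p, q))
            out.append(q)
        elif inside(p):
            out.append(intersect(p, q))
    return out

def oriented_iou(box1, box2):
    """IoU of two oriented boxes given as (cx, cy, w, l, theta)."""
    p1, p2 = box_corners(*box1), box_corners(*box2)
    inter = p1
    for i in range(4):
        if not inter:
            break
        inter = clip_halfplane(inter, p2[i], p2[(i + 1) % 4])
    ia = poly_area(inter) if len(inter) >= 3 else 0.0
    union = poly_area(p1) + poly_area(p2) - ia
    return ia / union if union > 0 else 0.0
```

For example, two identical boxes yield an IoU of 1.0, and a box shifted by half its length yields 1/3; a detection is counted as a true positive here when this value exceeds the 0.3 threshold.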
Related papers
- Radar Fields: Frequency-Space Neural Scene Representations for FMCW Radar [62.51065633674272]
We introduce Radar Fields - a neural scene reconstruction method designed for active radar imagers.
Our approach unites an explicit, physics-informed sensor model with an implicit neural geometry and reflectance model to directly synthesize raw radar measurements.
We validate the effectiveness of the method across diverse outdoor scenarios, including urban scenes with dense vehicles and infrastructure.
arXiv Detail & Related papers (2024-05-07T20:44:48Z) - Echoes Beyond Points: Unleashing the Power of Raw Radar Data in Multi-modality Fusion [74.84019379368807]
We propose a novel method named EchoFusion to skip the existing radar signal processing pipeline.
Specifically, we first generate the Bird's Eye View (BEV) queries and then take corresponding spectrum features from radar to fuse with other sensors.
arXiv Detail & Related papers (2023-07-31T09:53:50Z) - Bi-LRFusion: Bi-Directional LiDAR-Radar Fusion for 3D Dynamic Object Detection [78.59426158981108]
We introduce a bi-directional LiDAR-Radar fusion framework, termed Bi-LRFusion, to tackle the challenges and improve 3D detection for dynamic objects.
We conduct extensive experiments on nuScenes and ORR datasets, and show that our Bi-LRFusion achieves state-of-the-art performance for detecting dynamic objects.
arXiv Detail & Related papers (2023-06-02T10:57:41Z) - RadarFormer: Lightweight and Accurate Real-Time Radar Object Detection Model [13.214257841152033]
Radar-centric datasets receive relatively little attention in the development of deep learning techniques for radar perception.
We propose a transformer-based model, named RadarFormer, that utilizes state-of-the-art developments in vision deep learning.
Our model also introduces a channel-chirp-time merging module that reduces the size and complexity of our models by more than 10 times without compromising accuracy.
arXiv Detail & Related papers (2023-04-17T17:07:35Z) - R4Dyn: Exploring Radar for Self-Supervised Monocular Depth Estimation of Dynamic Scenes [69.6715406227469]
Self-supervised monocular depth estimation in driving scenarios has achieved comparable performance to supervised approaches.
We present R4Dyn, a novel set of techniques to use cost-efficient radar data on top of a self-supervised depth estimation framework.
arXiv Detail & Related papers (2021-08-10T17:57:03Z) - Radar Artifact Labeling Framework (RALF): Method for Plausible Radar Detections in Datasets [2.5899040911480187]
We propose a cross-sensor Radar Artifact Labeling Framework (RALF) for labeling sparse radar point clouds.
RALF provides plausibility labels for raw radar detections, distinguishing between artifacts and targets.
We validate the results by evaluating error metrics on a semi-manually labeled ground-truth dataset of $3.28 \cdot 10^6$ points.
arXiv Detail & Related papers (2020-12-03T15:11:31Z) - LiRaNet: End-to-End Trajectory Prediction using Spatio-Temporal Radar Fusion [52.59664614744447]
We present LiRaNet, a novel end-to-end trajectory prediction method which utilizes radar sensor information along with widely used lidar and high definition (HD) maps.
Automotive radar provides rich, complementary information, allowing for longer-range vehicle detection as well as instantaneous velocity measurements.
arXiv Detail & Related papers (2020-10-02T00:13:00Z) - RadarNet: Exploiting Radar for Robust Perception of Dynamic Objects [73.80316195652493]
We tackle the problem of exploiting Radar for perception in the context of self-driving cars.
We propose a new solution that exploits both LiDAR and Radar sensors for perception.
Our approach, dubbed RadarNet, features a voxel-based early fusion and an attention-based late fusion.
arXiv Detail & Related papers (2020-07-28T17:15:02Z) - Deep Learning on Radar Centric 3D Object Detection [4.822598110892847]
We introduce a deep learning approach to 3D object detection with radar only.
To overcome the lack of radar labeled data, we propose a novel way of making use of abundant LiDAR data.
arXiv Detail & Related papers (2020-02-27T10:16:46Z) - Experiments with mmWave Automotive Radar Test-bed [10.006245521984697]
Millimeter-wave (mmW) radars are being increasingly integrated into commercial vehicles to support new Advanced Driver Assistance Systems (ADAS).
We have assembled a lab-scale frequency modulated continuous wave (FMCW) radar test-bed based on Texas Instrument's (TI) automotive chipset family.
arXiv Detail & Related papers (2019-12-29T02:14:12Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the generated summaries (including all information) and is not responsible for any consequences of their use.