A Systematic Study on Object Recognition Using Millimeter-wave Radar
- URL: http://arxiv.org/abs/2305.02085v1
- Date: Wed, 3 May 2023 12:42:44 GMT
- Title: A Systematic Study on Object Recognition Using Millimeter-wave Radar
- Authors: Maloy Kumar Devnath, Avijoy Chakma, Mohammad Saeid Anwar, Emon Dey,
Zahid Hasan, Marc Conn, Biplab Pal, Nirmalya Roy
- Abstract summary: Millimeter-wave (MMW) radars are essential in smart environments.
Industry-grade MMW radars are expensive and hard to obtain for community-purpose smart environment applications.
Commercially available MMW radars carry hidden challenges that need to be investigated for tasks like recognizing objects and activities.
- Score: 1.3192560874022086
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Due to its lighting- and weather-independent sensing, millimeter-wave
(MMW) radar is essential in smart environments. Intelligent vehicle systems and
industry-grade MMW radars have already integrated such sensing capabilities.
However, industry-grade MMW radars are expensive and hard to obtain for
community-purpose smart environment applications, and commercially available
MMW radars carry hidden underpinning challenges that need to be investigated
for tasks like recognizing objects and activities, real-time person tracking,
object localization, etc. Image and video data are straightforward to gather,
understand, and annotate for such tasks, but they are lighting- and
weather-dependent, susceptible to occlusion, and raise privacy concerns. To
eliminate these dependencies and preserve privacy, commercial MMW radars should
be evaluated, and their practicality and performance in varied operating
settings must be established before they can be promoted. To address these
problems, we collected a dataset using Texas Instruments' automotive mmWave
radar (AWR2944) and report the experimental settings that yield the best object
recognition performance with different deep learning algorithms. Our extensive
data-gathering procedure allows us to systematically explore and identify the
challenges of the object recognition task under cross-ambience conditions. We
investigate several solutions and publish detailed experimental data.
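The abstract does not specify the paper's model architectures or radar data format, so the following is only a rough, hedged sketch: it trains a small PyTorch CNN on hypothetical single-channel range-Doppler heatmaps of the kind an AWR2944 pipeline can export. All shapes, class counts, and names (e.g. SmallRadarCNN) are placeholders, not the authors' published pipeline.

```python
# Hypothetical sketch: a small CNN classifier for radar heatmaps.
# Assumes each sample is a single-channel 2-D range-Doppler map; this is NOT
# the paper's actual pipeline, only a minimal stand-in for illustration.
import torch
import torch.nn as nn

class SmallRadarCNN(nn.Module):
    def __init__(self, num_classes: int = 5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 1, range_bins, doppler_bins)
        return self.classifier(self.features(x).flatten(1))

model = SmallRadarCNN(num_classes=5)
dummy = torch.randn(8, 1, 64, 64)                     # placeholder batch of heatmaps
logits = model(dummy)                                  # (8, num_classes)
loss = nn.CrossEntropyLoss()(logits, torch.randint(0, 5, (8,)))
loss.backward()
```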
Related papers
- Radar Fields: Frequency-Space Neural Scene Representations for FMCW Radar [62.51065633674272]
We introduce Radar Fields - a neural scene reconstruction method designed for active radar imagers.
Our approach unites an explicit, physics-informed sensor model with an implicit neural geometry and reflectance model to directly synthesize raw radar measurements.
We validate the effectiveness of the method across diverse outdoor scenarios, including urban scenes with dense vehicles and infrastructure.
arXiv Detail & Related papers (2024-05-07T20:44:48Z)
- Leveraging Self-Supervised Instance Contrastive Learning for Radar Object Detection [7.728838099011661]
This paper presents RiCL, an instance contrastive learning framework to pre-train radar object detectors.
We aim to pre-train an object detector's backbone, head, and neck so it can learn from less labeled data.
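RiCL's exact objective is not given here; the snippet below is only a generic InfoNCE-style instance contrastive loss of the kind such pre-training typically relies on. The embedding size and temperature are placeholders.

```python
# Generic InfoNCE-style instance contrastive loss (not RiCL's exact objective).
# z1, z2 are embeddings of two augmented views of the same radar instances.
import torch
import torch.nn.functional as F

def info_nce(z1: torch.Tensor, z2: torch.Tensor, temperature: float = 0.1) -> torch.Tensor:
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature                 # (N, N) cosine similarities
    targets = torch.arange(z1.size(0), device=z1.device)
    # Matching rows/columns are positives; all other pairs act as negatives.
    return 0.5 * (F.cross_entropy(logits, targets) + F.cross_entropy(logits.t(), targets))

loss = info_nce(torch.randn(16, 128), torch.randn(16, 128))
```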
arXiv Detail & Related papers (2024-02-13T12:53:33Z)
- Echoes Beyond Points: Unleashing the Power of Raw Radar Data in Multi-modality Fusion [74.84019379368807]
We propose a novel method named EchoFusion to skip the existing radar signal processing pipeline.
Specifically, we first generate the Bird's Eye View (BEV) queries and then take corresponding spectrum features from radar to fuse with other sensors.
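The fusion step is described only at a high level; as a loose illustration under assumed shapes, the sketch below lets learned BEV queries attend to flattened radar spectrum features with standard multi-head cross-attention. It is not EchoFusion's implementation.

```python
# Hedged sketch: BEV queries attending to radar spectrum features via standard
# cross-attention. EchoFusion's actual fusion design may differ substantially.
import torch
import torch.nn as nn

bev_h, bev_w, dim = 32, 32, 128
bev_queries = nn.Parameter(torch.randn(bev_h * bev_w, dim))   # learned BEV grid queries
radar_spectrum = torch.randn(1, 256, dim)                     # flattened spectrum features (placeholder)

cross_attn = nn.MultiheadAttention(embed_dim=dim, num_heads=8, batch_first=True)
fused, _ = cross_attn(
    query=bev_queries.unsqueeze(0),                           # (1, bev_h*bev_w, dim)
    key=radar_spectrum,
    value=radar_spectrum,
)
bev_features = fused.reshape(1, bev_h, bev_w, dim)            # BEV features for fusion with other sensors
```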
arXiv Detail & Related papers (2023-07-31T09:53:50Z)
- Radars for Autonomous Driving: A Review of Deep Learning Methods and Challenges [0.021665899581403605]
Radar is a key component of the suite of perception sensors used for autonomous vehicles.
It is characterized by low resolution, sparsity, clutter, high uncertainty, and lack of good datasets.
Current radar models are often influenced by lidar and vision models, which are focused on optical features that are relatively weak in radar data.
arXiv Detail & Related papers (2023-06-15T17:37:52Z)
- RadarFormer: Lightweight and Accurate Real-Time Radar Object Detection Model [13.214257841152033]
Radar-centric datasets receive little attention in the development of deep learning techniques for radar perception.
We propose a transformers-based model, named RadarFormer, that utilizes state-of-the-art developments in vision deep learning.
Our model also introduces a channel-chirp-time merging module that reduces the size and complexity of our models by more than 10 times without compromising accuracy.
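The channel-chirp-time merging module is described only at a high level; the sketch below shows one plausible realization (folding the antenna-channel and chirp axes into the feature dimension and compressing them with a 1x1 convolution) purely for illustration, not RadarFormer's code.

```python
# Illustrative only: fold antenna-channel and chirp axes into the feature
# dimension, then compress with a 1x1 convolution. Names and shapes are assumed.
import torch
import torch.nn as nn

class ChannelChirpTimeMerge(nn.Module):
    def __init__(self, channels: int, chirps: int, out_dim: int = 32):
        super().__init__()
        self.compress = nn.Conv2d(channels * chirps, out_dim, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, chirps, time, range)
        b, c, k, t, r = x.shape
        x = x.reshape(b, c * k, t, r)        # merge channel and chirp axes
        return self.compress(x)              # (batch, out_dim, time, range)

merge = ChannelChirpTimeMerge(channels=4, chirps=16, out_dim=32)
out = merge(torch.randn(2, 4, 16, 64, 128))
```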
arXiv Detail & Related papers (2023-04-17T17:07:35Z)
- Camera-Radar Perception for Autonomous Vehicles and ADAS: Concepts, Datasets and Metrics [77.34726150561087]
This work aims to carry out a study on the current scenario of camera and radar-based perception for ADAS and autonomous vehicles.
Concepts and characteristics related to both sensors, as well as to their fusion, are presented.
We give an overview of the Deep Learning-based detection and segmentation tasks, and the main datasets, metrics, challenges, and open questions in vehicle perception.
arXiv Detail & Related papers (2023-03-08T00:48:32Z)
- Complex-valued Convolutional Neural Networks for Enhanced Radar Signal Denoising and Interference Mitigation [73.0103413636673]
We propose the use of Complex-Valued Convolutional Neural Networks (CVCNNs) to address the issue of mutual interference between radar sensors.
CVCNNs increase data efficiency, speed up network training, and substantially improve the preservation of phase information during interference removal.
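A complex-valued convolution can be built from two real-valued convolutions applied to the real and imaginary parts; the sketch below shows that textbook construction, which is not necessarily the exact layer used in the paper.

```python
# Standard complex convolution built from two real convolutions:
# (Wr + jWi) * (xr + jxi) = (Wr*xr - Wi*xi) + j(Wr*xi + Wi*xr)
import torch
import torch.nn as nn

class ComplexConv1d(nn.Module):
    def __init__(self, in_ch: int, out_ch: int, kernel_size: int):
        super().__init__()
        self.conv_r = nn.Conv1d(in_ch, out_ch, kernel_size, padding=kernel_size // 2)
        self.conv_i = nn.Conv1d(in_ch, out_ch, kernel_size, padding=kernel_size // 2)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        xr, xi = x.real, x.imag
        real = self.conv_r(xr) - self.conv_i(xi)
        imag = self.conv_r(xi) + self.conv_i(xr)
        return torch.complex(real, imag)

layer = ComplexConv1d(1, 8, kernel_size=5)
signal = torch.randn(2, 1, 256, dtype=torch.cfloat)   # complex baseband samples
out = layer(signal)                                   # complex-valued feature maps
```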
arXiv Detail & Related papers (2021-04-29T10:06:29Z)
- LiRaNet: End-to-End Trajectory Prediction using Spatio-Temporal Radar Fusion [52.59664614744447]
We present LiRaNet, a novel end-to-end trajectory prediction method which utilizes radar sensor information along with widely used lidar and high definition (HD) maps.
Automotive radar provides rich, complementary information, allowing for longer-range vehicle detection as well as instantaneous velocity measurements.
arXiv Detail & Related papers (2020-10-02T00:13:00Z)
- RadarNet: Exploiting Radar for Robust Perception of Dynamic Objects [73.80316195652493]
We tackle the problem of exploiting Radar for perception in the context of self-driving cars.
We propose a new solution that exploits both LiDAR and Radar sensors for perception.
Our approach, dubbed RadarNet, features a voxel-based early fusion and an attention-based late fusion.
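Voxel-based early fusion is summarized only at a high level; the sketch below shows one common realization (concatenating voxelized LiDAR and radar BEV grids along the channel axis before a shared backbone). The grid sizes and channel meanings are assumptions, not RadarNet's implementation.

```python
# One common form of voxel-based early fusion (not RadarNet's code):
# rasterize both sensors into BEV grids and concatenate along channels.
import torch
import torch.nn as nn

lidar_bev = torch.rand(1, 10, 256, 256)    # e.g. 10 height slices of LiDAR occupancy (placeholder)
radar_bev = torch.rand(1, 2, 256, 256)     # e.g. radar occupancy + radial velocity (placeholder)

early_fused = torch.cat([lidar_bev, radar_bev], dim=1)   # (1, 12, 256, 256)
backbone = nn.Sequential(
    nn.Conv2d(12, 64, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(64, 64, kernel_size=3, padding=1), nn.ReLU(),
)
features = backbone(early_fused)           # shared BEV features for downstream detection heads
```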
arXiv Detail & Related papers (2020-07-28T17:15:02Z)
- Experiments with mmWave Automotive Radar Test-bed [10.006245521984697]
Millimeter-wave (mmW) radars are increasingly being integrated into commercial vehicles to support new Advanced Driver Assistance Systems (ADAS).
We have assembled a lab-scale frequency modulated continuous wave (FMCW) radar test-bed based on Texas Instruments' (TI) automotive chipset family.
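FMCW processing is standard: an FFT over the fast-time samples within a chirp resolves range, and an FFT across chirps resolves Doppler. The sketch below applies that standard range-Doppler processing to a simulated raw ADC cube; the frame dimensions are placeholders and no TI mmWave SDK calls are involved.

```python
# Standard FMCW range-Doppler processing on a raw ADC cube (illustrative only;
# not tied to the TI mmWave SDK or the test-bed's exact configuration).
import numpy as np

chirps, samples = 128, 256
adc = np.random.randn(chirps, samples) + 1j * np.random.randn(chirps, samples)

range_fft = np.fft.fft(adc * np.hanning(samples), axis=1)            # fast time -> range bins
doppler_fft = np.fft.fftshift(                                       # slow time -> Doppler bins
    np.fft.fft(range_fft * np.hanning(chirps)[:, None], axis=0), axes=0)
range_doppler_db = 20 * np.log10(np.abs(doppler_fft) + 1e-6)         # magnitude map in dB
```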
arXiv Detail & Related papers (2019-12-29T02:14:12Z)