IR-UWB Radar-based Situational Awareness System for
Smartphone-Distracted Pedestrians
- URL: http://arxiv.org/abs/2311.00991v1
- Date: Thu, 2 Nov 2023 04:45:04 GMT
- Title: IR-UWB Radar-based Situational Awareness System for
Smartphone-Distracted Pedestrians
- Authors: Jamsheed Manja Ppallan, Ruchi Pandey, Yellappa Damam, Vijay Narayan
Tiwari, Karthikeyan Arunachalam and Antariksha Ray
- Abstract summary: This paper proposes a novel real-time assistance system called UWB-assisted Safe Walk (UASW) that detects obstacles and warns users about real-time situations.
We implemented UASW specifically for Android smartphones with IR-UWB connectivity.
The proposed system achieves an obstacle detection accuracy of up to 97% and obstacle classification accuracy of up to 95% with an inference delay of 26.8 ms.
- Score: 1.4074017875514788
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: With the widespread adoption of smartphones, ensuring pedestrian safety on
roads has become a critical concern due to smartphone distraction. This paper
proposes a novel real-time assistance system called UWB-assisted Safe Walk
(UASW), which detects obstacles and warns users about real-time situations. The
proposed method leverages Impulse Radio Ultra-Wideband (IR-UWB) radar embedded
in the smartphone, which provides excellent range resolution and high noise
resilience using short pulses. We implemented UASW specifically for Android
smartphones with IR-UWB connectivity. The framework uses complex Channel
Impulse Response (CIR) data to integrate rule-based obstacle detection with
artificial neural network (ANN) based obstacle classification. The performance
of the proposed UASW system is analyzed using real-time collected data. The
results show that the proposed system achieves an obstacle detection accuracy
of up to 97% and obstacle classification accuracy of up to 95% with an
inference delay of 26.8 ms. The results highlight the effectiveness of UASW in
assisting smartphone-distracted pedestrians and improving their situational
awareness.
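To make the described pipeline concrete, the sketch below pairs a rule-based detector on the CIR magnitude with a small ANN classifier. It is a minimal illustration only: the tap count, threshold factor, obstacle classes, and network sizes are hypothetical placeholders, not the parameters reported in the paper.

```python
# Minimal sketch of a UASW-style pipeline: a rule-based detector on the
# CIR magnitude followed by a small ANN classifier. Thresholds, sizes,
# and class labels are illustrative assumptions, not values from the paper.
import numpy as np
import torch
import torch.nn as nn

NUM_TAPS = 64                          # hypothetical number of CIR taps per frame
CLASSES = ["pole", "wall", "human"]    # hypothetical obstacle classes

def detect_obstacle(cir_frame: np.ndarray, k: float = 4.0) -> bool:
    """Rule-based detection: flag a frame whose peak magnitude exceeds
    a noise-floor estimate by a factor k (simple CFAR-like rule)."""
    mag = np.abs(cir_frame)            # complex CIR -> magnitude profile
    noise_floor = np.median(mag)       # robust noise estimate
    return bool(mag.max() > k * noise_floor)

class ObstacleClassifier(nn.Module):
    """Small fully connected ANN over the CIR magnitude profile."""
    def __init__(self, num_taps: int = NUM_TAPS, num_classes: int = len(CLASSES)):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(num_taps, 32), nn.ReLU(),
            nn.Linear(32, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

# Example: one synthetic complex CIR frame pushed through both stages.
rng = np.random.default_rng(0)
frame = rng.normal(size=NUM_TAPS) + 1j * rng.normal(size=NUM_TAPS)
frame[20] += 12.0                      # inject a strong synthetic reflection
if detect_obstacle(frame):
    model = ObstacleClassifier()
    logits = model(torch.tensor(np.abs(frame), dtype=torch.float32))
    print("warn user:", CLASSES[int(logits.argmax())])
```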
Related papers
- Enhancing Reliability in Federated mmWave Networks: A Practical and
Scalable Solution using Radar-Aided Dynamic Blockage Recognition [14.18507067281377]
This article introduces a new method to improve the dependability of millimeter-wave (mmWave) and terahertz (THz) network services in dynamic outdoor environments.
In these settings, line-of-sight (LoS) connections are easily interrupted by moving obstacles like humans and vehicles.
The proposed approach, coined as Radar-aided blockage Dynamic Recognition (RaDaR), leverages radar measurements and federated learning (FL) to train a dual-output neural network (NN) model (see the sketch below).
arXiv Detail & Related papers (2023-06-22T10:10:25Z)
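As a rough illustration of the RaDaR idea, the sketch below trains a dual-output network with plain federated averaging (FedAvg). The input features, the meaning of the two output heads, and the client setup are assumptions for illustration only; they are not taken from the RaDaR paper.

```python
# Hedged sketch: a dual-output NN trained with simple federated averaging.
# Feature size, head semantics, and client data are illustrative assumptions.
import copy
import torch
import torch.nn as nn

class DualOutputNet(nn.Module):
    """Shared trunk with two heads, e.g. current- and future-blockage
    probabilities (the head semantics are assumed here)."""
    def __init__(self, in_dim: int = 16):
        super().__init__()
        self.trunk = nn.Sequential(nn.Linear(in_dim, 32), nn.ReLU())
        self.head_now = nn.Linear(32, 1)      # output 1 (assumed)
        self.head_future = nn.Linear(32, 1)   # output 2 (assumed)

    def forward(self, x):
        h = self.trunk(x)
        return torch.sigmoid(self.head_now(h)), torch.sigmoid(self.head_future(h))

def local_update(global_model, x, y_now, y_future, epochs=1, lr=1e-2):
    """One client's local training pass starting from the global weights."""
    model = copy.deepcopy(global_model)
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    loss_fn = nn.BCELoss()
    for _ in range(epochs):
        p_now, p_future = model(x)
        loss = loss_fn(p_now, y_now) + loss_fn(p_future, y_future)
        opt.zero_grad(); loss.backward(); opt.step()
    return model.state_dict()

def fed_avg(states):
    """Federated averaging: element-wise mean of the client weights."""
    avg = copy.deepcopy(states[0])
    for key in avg:
        avg[key] = torch.stack([s[key] for s in states]).mean(dim=0)
    return avg

# Toy round with two synthetic clients.
torch.manual_seed(0)
global_model = DualOutputNet()
clients = [(torch.randn(8, 16), torch.rand(8, 1).round(), torch.rand(8, 1).round())
           for _ in range(2)]
states = [local_update(global_model, x, y1, y2) for x, y1, y2 in clients]
global_model.load_state_dict(fed_avg(states))
```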
- A Systematic Study on Object Recognition Using Millimeter-wave Radar [1.3192560874022086]
Millimeter-wave (MMW) radars are essential in smart environments.
MMW radars are expensive and hard to obtain for community-purpose smart environment applications.
These challenges need to be investigated for tasks such as object and activity recognition.
arXiv Detail & Related papers (2023-05-03T12:42:44Z)
- Camera-Radar Perception for Autonomous Vehicles and ADAS: Concepts, Datasets and Metrics [77.34726150561087]
This work studies the current landscape of camera- and radar-based perception for ADAS and autonomous vehicles.
Concepts and characteristics related to both sensors, as well as to their fusion, are presented.
We give an overview of the Deep Learning-based detection and segmentation tasks, and the main datasets, metrics, challenges, and open questions in vehicle perception.
arXiv Detail & Related papers (2023-03-08T00:48:32Z)
- NVRadarNet: Real-Time Radar Obstacle and Free Space Detection for Autonomous Driving [57.03126447713602]
We present a deep neural network (DNN) that detects dynamic obstacles and drivable free space using automotive RADAR sensors.
The network runs faster than real time on an embedded GPU and shows good generalization across geographic regions.
arXiv Detail & Related papers (2022-09-29T01:30:34Z)
- Large Scale Passenger Detection with Smartphone/Bus Implicit Interaction and Multisensory Unsupervised Cause-effect Learning [5.449283796175882]
We focus on the concept of implicit Be-in/Be-out (BIBO) smartphone sensing and classification.
To enable the training of a model based on GPS features against the BLE pseudo-label, we propose the Cause-Effect Multitask Wasserstein Autoencoder (CEMWA).
arXiv Detail & Related papers (2022-02-24T08:50:32Z)
- Smartphone-based Hard-braking Event Detection at Scale for Road Safety Services [6.451490979743455]
Road crashes are the sixth leading cause of lost disability-adjusted life-years (DALYs) worldwide.
This paper presents a scalable approach for detecting hard-braking events using the kinematics data collected from smartphone sensors.
We train a Transformer-based machine learning model for hard-braking event detection using concurrent sensor readings from smartphones and vehicle sensors, collected from drivers who connect their phone to the vehicle while navigating in Google Maps (see the sketch below).
arXiv Detail & Related papers (2022-02-04T01:30:32Z)
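As a hedged sketch of such a Transformer-based detector, the snippet below classifies short windows of smartphone kinematics (e.g., 3-axis accelerometer plus speed) as hard-braking or normal. The window length, sampling rate, feature set, and model sizes are assumptions, not the paper's configuration.

```python
# Hedged sketch: Transformer encoder over a window of kinematics samples,
# classified as hard-braking vs. normal. Sizes and features are assumed.
import torch
import torch.nn as nn

class HardBrakeDetector(nn.Module):
    def __init__(self, n_features: int = 4, d_model: int = 32,
                 n_heads: int = 4, n_layers: int = 2):
        super().__init__()
        self.embed = nn.Linear(n_features, d_model)   # per-sample embedding
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=n_heads,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.classifier = nn.Linear(d_model, 2)       # {normal, hard-brake}

    def forward(self, x):                             # x: (batch, time, features)
        h = self.encoder(self.embed(x))
        return self.classifier(h.mean(dim=1))         # pool over time

# Toy usage: a batch of 2-second windows at an assumed 50 Hz (100 samples).
model = HardBrakeDetector()
window = torch.randn(8, 100, 4)    # accel x/y/z + speed (assumed feature set)
logits = model(window)
print(logits.argmax(dim=1))        # 1 would indicate a hard-braking event
```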
- Efficient and Robust LiDAR-Based End-to-End Navigation [132.52661670308606]
We present an efficient and robust LiDAR-based end-to-end navigation framework.
We propose Fast-LiDARNet that is based on sparse convolution kernel optimization and hardware-aware model design.
We then propose Hybrid Evidential Fusion, which directly estimates the uncertainty of the prediction from only a single forward pass (a generic sketch of the evidential idea follows below).
arXiv Detail & Related papers (2021-05-20T17:52:37Z)
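The single-pass uncertainty idea can be illustrated with an evidential regression head in the style of deep evidential regression: the network outputs the parameters of a Normal-Inverse-Gamma distribution, from which aleatoric and epistemic uncertainty follow in closed form. This is a generic sketch of that technique, not the exact Hybrid Evidential Fusion used in the paper.

```python
# Hedged sketch of an evidential regression head: one forward pass yields a
# prediction plus closed-form aleatoric and epistemic uncertainty estimates.
import torch
import torch.nn as nn
import torch.nn.functional as F

class EvidentialHead(nn.Module):
    """Maps features to Normal-Inverse-Gamma parameters (gamma, nu, alpha, beta)."""
    def __init__(self, in_dim: int = 64):
        super().__init__()
        self.out = nn.Linear(in_dim, 4)

    def forward(self, h):
        gamma, log_nu, log_alpha, log_beta = self.out(h).chunk(4, dim=-1)
        nu = F.softplus(log_nu)               # > 0
        alpha = F.softplus(log_alpha) + 1.0   # > 1 so the variances are finite
        beta = F.softplus(log_beta)           # > 0
        return gamma, nu, alpha, beta

def uncertainties(gamma, nu, alpha, beta):
    """Closed-form uncertainty from the NIG parameters."""
    aleatoric = beta / (alpha - 1.0)          # expected data noise
    epistemic = beta / (nu * (alpha - 1.0))   # model uncertainty
    return gamma, aleatoric, epistemic

# Toy usage on random features (standing in for a LiDAR backbone's output).
head = EvidentialHead()
features = torch.randn(5, 64)
pred, alea, epis = uncertainties(*head(features))
```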
- Complex-valued Convolutional Neural Networks for Enhanced Radar Signal Denoising and Interference Mitigation [73.0103413636673]
We propose the use of Complex-Valued Convolutional Neural Networks (CVCNNs) to address the issue of mutual interference between radar sensors.
CVCNNs increase data efficiency, speed up network training, and substantially improve the conservation of phase information during interference removal (see the sketch below).
arXiv Detail & Related papers (2021-04-29T10:06:29Z)
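A complex-valued convolution can be written with two real-valued convolutions applied to the real and imaginary parts, following (W_r + iW_i)(x_r + ix_i) = (W_r x_r - W_i x_i) + i(W_r x_i + W_i x_r). The layer below is a generic sketch of that building block, with arbitrary channel counts and kernel size, not the CVCNN architecture from the paper.

```python
# Hedged sketch of a complex-valued 2D convolution built from two real convs.
# Channel counts and kernel size are arbitrary illustrative choices.
import torch
import torch.nn as nn

class ComplexConv2d(nn.Module):
    def __init__(self, in_ch: int, out_ch: int, kernel_size: int = 3, padding: int = 1):
        super().__init__()
        self.conv_r = nn.Conv2d(in_ch, out_ch, kernel_size, padding=padding)  # real-part weights
        self.conv_i = nn.Conv2d(in_ch, out_ch, kernel_size, padding=padding)  # imaginary-part weights

    def forward(self, x_r, x_i):
        # (W_r + iW_i)(x_r + ix_i) = (W_r x_r - W_i x_i) + i(W_r x_i + W_i x_r)
        out_r = self.conv_r(x_r) - self.conv_i(x_i)
        out_i = self.conv_r(x_i) + self.conv_i(x_r)
        return out_r, out_i

# Toy usage on a complex range-Doppler map split into real/imaginary planes.
layer = ComplexConv2d(1, 8)
rd_map = torch.randn(2, 1, 64, 64, dtype=torch.cfloat)   # synthetic complex input
out_r, out_i = layer(rd_map.real, rd_map.imag)
```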
- RadarNet: Exploiting Radar for Robust Perception of Dynamic Objects [73.80316195652493]
We tackle the problem of exploiting Radar for perception in the context of self-driving cars.
We propose a new solution that exploits both LiDAR and Radar sensors for perception.
Our approach, dubbed RadarNet, features a voxel-based early fusion and an attention-based late fusion.
arXiv Detail & Related papers (2020-07-28T17:15:02Z)
- Drone-based RGB-Infrared Cross-Modality Vehicle Detection via Uncertainty-Aware Learning [59.19469551774703]
Drone-based vehicle detection aims at finding the vehicle locations and categories in an aerial image.
We construct a large-scale drone-based RGB-Infrared vehicle detection dataset, termed DroneVehicle.
Our DroneVehicle dataset contains 28,439 RGB-Infrared image pairs, covering urban roads, residential areas, parking lots, and other scenarios from day to night.
arXiv Detail & Related papers (2020-03-05T05:29:44Z)