Demo Abstract: Indoor Positioning System in Visually-Degraded
Environments with Millimetre-Wave Radar and Inertial Sensors
- URL: http://arxiv.org/abs/2010.13750v1
- Date: Mon, 26 Oct 2020 17:41:25 GMT
- Title: Demo Abstract: Indoor Positioning System in Visually-Degraded
Environments with Millimetre-Wave Radar and Inertial Sensors
- Authors: Zhuangzhuang Dai, Muhamad Risqi U. Saputra, Chris Xiaoxuan Lu, Niki
Trigoni, Andrew Markham
- Abstract summary: We present a real-time indoor positioning system which fuses millimetre-wave (mmWave) radar and Inertial Measurement Units (IMU) data via deep sensor fusion.
Good accuracy and resilience were exhibited even in poorly illuminated scenes.
- Score: 44.58134907168034
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Positional estimation is of great importance in the public safety sector.
Emergency responders such as fire fighters, medical rescue teams, and the
police will all benefit from a resilient positioning system to deliver safe and
effective emergency services. Unfortunately, satellite navigation (e.g., GPS)
offers limited coverage in indoor environments. Nor is it always possible to
rely on infrastructure-based solutions. To this end, wearable sensor-aided
navigation techniques, such as those based on cameras and Inertial Measurement
Units (IMU), have recently emerged as an accurate, infrastructure-free
solution. Together with an increase in the computational capabilities of mobile
devices, motion estimation can be performed in real-time. In this
demonstration, we present a real-time indoor positioning system which fuses
millimetre-wave (mmWave) radar and IMU data via deep sensor fusion. We employ
mmWave radar rather than an RGB camera as it provides better robustness to
visual degradation (e.g., smoke, darkness) while at the same time requiring
lower computational resources, which enables real-time computation. We
implemented the sensor system on a handheld device and a mobile computer
running at 10 FPS to track a user inside an apartment. Good accuracy and
resilience were exhibited even in poorly illuminated scenes.
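The abstract does not spell out the fusion network itself, but the described deep fusion of a mmWave radar frame with a window of IMU samples into a relative pose can be sketched as below. This is a minimal sketch under stated assumptions: the class name, layer sizes, input shapes, and the gated feature re-weighting step are illustrative choices, not the authors' implementation.

```python
# Minimal sketch of deep radar-IMU fusion for relative-pose estimation.
# Assumptions (not from the paper): radar frames rasterised to 64x64 single-channel
# images, 6-channel IMU windows of 10 samples, and a simple gated fusion step.
import torch
import torch.nn as nn

class RadarInertialOdometry(nn.Module):
    def __init__(self, imu_channels=6):
        super().__init__()
        # Radar branch: small CNN over a rasterised mmWave frame.
        self.radar_encoder = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4), nn.Flatten(),
            nn.Linear(32 * 4 * 4, 128),
        )
        # Inertial branch: recurrent encoder over the IMU window between frames.
        self.imu_encoder = nn.LSTM(imu_channels, 64, batch_first=True)
        # Soft gating over the concatenated features (one common deep-fusion choice).
        fused = 128 + 64
        self.gate = nn.Sequential(nn.Linear(fused, fused), nn.Sigmoid())
        # Regress a 6-DoF relative pose: (dx, dy, dz, roll, pitch, yaw).
        self.pose_head = nn.Sequential(nn.Linear(fused, 64), nn.ReLU(), nn.Linear(64, 6))

    def forward(self, radar_frame, imu_window):
        r = self.radar_encoder(radar_frame)        # (B, 128)
        _, (h, _) = self.imu_encoder(imu_window)   # h: (1, B, 64)
        f = torch.cat([r, h[-1]], dim=1)           # (B, 192)
        f = self.gate(f) * f                       # feature re-weighting
        return self.pose_head(f)                   # (B, 6) relative pose

if __name__ == "__main__":
    model = RadarInertialOdometry()
    radar = torch.randn(1, 1, 64, 64)   # one rasterised radar frame
    imu = torch.randn(1, 10, 6)         # 10 IMU samples between frames
    print(model(radar, imu).shape)      # torch.Size([1, 6])
```

At inference time, such a model would be run once per radar frame (10 FPS in the demonstration), and the predicted relative poses chained to recover the user's trajectory.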
Related papers
- Radar Fields: Frequency-Space Neural Scene Representations for FMCW Radar [62.51065633674272]
We introduce Radar Fields - a neural scene reconstruction method designed for active radar imagers.
Our approach unites an explicit, physics-informed sensor model with an implicit neural geometry and reflectance model to directly synthesize raw radar measurements.
We validate the effectiveness of the method across diverse outdoor scenarios, including urban scenes with dense vehicles and infrastructure.
arXiv Detail & Related papers (2024-05-07T20:44:48Z) - Radar-Lidar Fusion for Object Detection by Designing Effective
Convolution Networks [18.17057711053028]
We propose a dual-branch framework to integrate radar and Lidar data for enhanced object detection.
The results show that it surpasses state-of-the-art methods by 1.89% and 2.61% in favorable and adverse weather conditions, respectively.
arXiv Detail & Related papers (2023-10-30T10:18:40Z) - MSight: An Edge-Cloud Infrastructure-based Perception System for
Connected Automated Vehicles [58.461077944514564]
This paper presents MSight, a cutting-edge roadside perception system specifically designed for automated vehicles.
MSight offers real-time vehicle detection, localization, tracking, and short-term trajectory prediction.
Evaluations underscore the system's capability to uphold lane-level accuracy with minimal latency.
arXiv Detail & Related papers (2023-10-08T21:32:30Z) - Efficient Real-time Smoke Filtration with 3D LiDAR for Search and Rescue
with Autonomous Heterogeneous Robotic Systems [56.838297900091426]
Smoke and dust affect the performance of any mobile robotic platform due to their reliance on onboard perception systems.
This paper proposes a novel modular computation filtration pipeline based on intensity and spatial information.
arXiv Detail & Related papers (2023-08-14T16:48:57Z) - Camera-Radar Perception for Autonomous Vehicles and ADAS: Concepts,
Datasets and Metrics [77.34726150561087]
This work aims to carry out a study on the current scenario of camera and radar-based perception for ADAS and autonomous vehicles.
Concepts and characteristics related to both sensors, as well as to their fusion, are presented.
We give an overview of the Deep Learning-based detection and segmentation tasks, and the main datasets, metrics, challenges, and open questions in vehicle perception.
arXiv Detail & Related papers (2023-03-08T00:48:32Z) - mmSense: Detecting Concealed Weapons with a Miniature Radar Sensor [2.963928676363629]
mmSense is an end-to-end portable miniaturised real-time system that can accurately detect the presence of concealed metallic objects on persons.
mmSense features millimeter wave radar technology, provided by Google's Soli sensor for its data acquisition, and TransDope, our real-time neural network, capable of processing a single radar data frame in 19 ms.
arXiv Detail & Related papers (2023-02-28T15:06:03Z) - Fusion of Radio and Camera Sensor Data for Accurate Indoor Positioning [45.926983284834954]
We propose a novel positioning system, RAVEL, which fuses anonymous visual detections captured by widely available camera infrastructure, with radio readings.
Our experiments show that although the WiFi measurements are not by themselves sufficiently accurate, when they are fused with camera data, they become a catalyst for pulling together ambiguous, fragmented, and anonymous visual tracklets.
arXiv Detail & Related papers (2023-02-01T11:37:41Z) - Drone Detection and Tracking in Real-Time by Fusion of Different Sensing
Modalities [66.4525391417921]
We design and evaluate a multi-sensor drone detection system.
Our solution integrates a fish-eye camera as well to monitor a wider part of the sky and steer the other cameras towards objects of interest.
The thermal camera is shown to be a feasible solution as good as the video camera, even if the camera employed here has a lower resolution.
arXiv Detail & Related papers (2022-07-05T10:00:58Z) - A Novel Indoor Positioning System for unprepared firefighting scenarios [2.446948464551684]
This research implements novel optical-flow-based video compass orientation estimation and fused IMU-data-based activity recognition for Indoor Positioning Systems (IPS); a minimal sketch of the optical-flow compass idea follows this list.
This technique helps first responders enter unprepared, unknown environments while maintaining situational awareness, such as the orientation and position of victim firefighters.
arXiv Detail & Related papers (2020-08-04T05:46:03Z)
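As a rough illustration of the optical-flow compass idea in the last entry above, the sketch below integrates the mean horizontal Farneback flow between consecutive frames into a heading estimate. The function name, the field-of-view mapping, and the webcam demo are assumptions for illustration, not that paper's method.

```python
# Minimal sketch of an optical-flow "compass": accumulate yaw from frame-to-frame flow.
# Assumptions (not from the paper): forward-facing camera with known horizontal FOV;
# mean horizontal flow is treated as pure rotation, ignoring translation.
import cv2
import numpy as np

def heading_change(prev_gray, curr_gray, hfov_deg=60.0):
    """Estimate the yaw change (degrees) between two grayscale frames."""
    flow = cv2.calcOpticalFlowFarneback(
        prev_gray, curr_gray, None,
        pyr_scale=0.5, levels=3, winsize=15,
        iterations=3, poly_n=5, poly_sigma=1.2, flags=0)
    mean_dx = float(np.mean(flow[..., 0]))          # mean horizontal pixel shift
    deg_per_pixel = hfov_deg / prev_gray.shape[1]   # pixels -> degrees via FOV
    return -mean_dx * deg_per_pixel                 # panning right -> scene flows left

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)
    ok, frame = cap.read()
    if not ok:
        raise RuntimeError("no camera frame available")
    prev = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    heading = 0.0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        curr = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        heading = (heading + heading_change(prev, curr)) % 360.0
        prev = curr
        print(f"heading ~ {heading:.1f} deg")
```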
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences arising from its use.