Fusion of Radio and Camera Sensor Data for Accurate Indoor Positioning
- URL: http://arxiv.org/abs/2302.02952v1
- Date: Wed, 1 Feb 2023 11:37:41 GMT
- Title: Fusion of Radio and Camera Sensor Data for Accurate Indoor Positioning
- Authors: Savvas Papaioannou, Hongkai Wen, Andrew Markham and Niki Trigoni
- Abstract summary: We propose a novel positioning system, RAVEL, which fuses anonymous visual detections captured by widely available camera infrastructure, with radio readings.
Our experiments show that although the WiFi measurements are not by themselves sufficiently accurate, when they are fused with camera data, they become a catalyst for pulling together ambiguous, fragmented, and anonymous visual tracklets.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Indoor positioning systems have received a lot of attention recently due to
their importance for many location-based services, e.g. indoor navigation and
smart buildings. Lightweight solutions based on WiFi and inertial sensing have
gained popularity, but are not fit for demanding applications, such as expert
museum guides and industrial settings, which typically require sub-meter
location information. In this paper, we propose a novel positioning system,
RAVEL (Radio And Vision Enhanced Localization), which fuses anonymous visual
detections captured by widely available camera infrastructure, with radio
readings (e.g. WiFi radio data). Although visual trackers can provide excellent
positioning accuracy, they are plagued by issues such as occlusions and people
entering/exiting the scene, preventing their use as a robust tracking solution.
By incorporating radio measurements, visually ambiguous or missing data can be
resolved through multi-hypothesis tracking. We evaluate our system in a complex
museum environment with dim lighting and multiple people moving around in a
space cluttered with exhibit stands. Our experiments show that although the
WiFi measurements are not by themselves sufficiently accurate, when they are
fused with camera data, they become a catalyst for pulling together ambiguous,
fragmented, and anonymous visual tracklets into accurate and continuous paths,
yielding typical errors below 1 meter.
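The core fusion idea described above, using coarse WiFi position estimates to disambiguate fragmented, anonymous visual tracklets, can be illustrated with a minimal sketch. This is not the paper's actual implementation (RAVEL uses multi-hypothesis tracking); the function names, data structures, and the simple nearest-candidate gating below are all illustrative assumptions.

```python
# Hedged sketch of WiFi-guided tracklet selection (illustrative only, not
# the RAVEL algorithm). At each timestep, the noisy WiFi fix acts as a
# prior that picks the most plausible anonymous visual detection.
import math


def dist(a, b):
    """Euclidean distance between two 2D points."""
    return math.hypot(a[0] - b[0], a[1] - b[1])


def link_tracklets(tracklets, wifi_fixes, gate=3.0):
    """Assemble a continuous path from ambiguous visual detections.

    tracklets: {t: [(x, y), ...]} anonymous candidate detections per timestep
    wifi_fixes: {t: (x, y)} coarse radio position estimate per timestep
    gate: radius (meters) beyond which a WiFi fix cannot claim a detection
    """
    path = {}
    for t, candidates in sorted(tracklets.items()):
        wifi = wifi_fixes.get(t)
        if wifi is None or not candidates:
            continue  # no radio evidence or no detections at this timestep
        best = min(candidates, key=lambda p: dist(p, wifi))
        if dist(best, wifi) <= gate:
            path[t] = best  # WiFi resolves which tracklet belongs to the user
    return path
```

Even though each individual WiFi fix may be meters off, it is enough to choose between visually identical candidates that are far apart, which is the "catalyst" role the abstract describes:

```python
tracklets = {0: [(0.0, 0.0), (5.0, 5.0)], 1: [(0.5, 0.1), (5.1, 4.9)]}
wifi = {0: (1.0, 0.5), 1: (1.2, 0.4)}
link_tracklets(tracklets, wifi)  # selects the near-origin tracklet at both steps
```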
Related papers
- DensePose From WiFi [86.61881052177228]
We develop a deep neural network that maps the phase and amplitude of WiFi signals to UV coordinates within 24 human regions.
Our model can estimate the dense pose of multiple subjects, with performance comparable to that of image-based approaches.
arXiv Detail & Related papers (2022-12-31T16:48:43Z)
- ViFi-Loc: Multi-modal Pedestrian Localization using GAN with Camera-Phone Correspondences [7.953401800573514]
We propose a Generative Adversarial Network architecture to produce more accurate location estimations for pedestrians.
During training, it learns the underlying linkage between pedestrians' camera-phone data correspondences.
We show that our GAN produces 3D coordinates with 1 to 2 meters of localization error across 5 different outdoor scenes.
arXiv Detail & Related papers (2022-11-22T05:27:38Z)
- Drone Detection and Tracking in Real-Time by Fusion of Different Sensing Modalities [66.4525391417921]
We design and evaluate a multi-sensor drone detection system.
Our solution also integrates a fish-eye camera to monitor a wider part of the sky and steer the other cameras toward objects of interest.
The thermal camera is shown to perform as well as the video camera, even though the thermal camera employed here has a lower resolution.
arXiv Detail & Related papers (2022-07-05T10:00:58Z)
- Radar Voxel Fusion for 3D Object Detection [0.0]
This paper develops a low-level sensor fusion network for 3D object detection.
The radar sensor fusion proves especially beneficial in inclement conditions such as rain and night scenes.
arXiv Detail & Related papers (2021-06-26T20:34:12Z)
- Infrared Beacons for Robust Localization [58.720142291102135]
This paper presents a localization system that uses infrared beacons and a camera equipped with an optical band-pass filter.
Our system can reliably detect and identify individual beacons at 100m distance regardless of lighting conditions.
arXiv Detail & Related papers (2021-04-19T14:23:20Z)
- Demo Abstract: Indoor Positioning System in Visually-Degraded Environments with Millimetre-Wave Radar and Inertial Sensors [44.58134907168034]
We present a real-time indoor positioning system which fuses millimetre-wave (mmWave) radar and Inertial Measurement Units (IMU) data via deep sensor fusion.
Good accuracy and resilience were exhibited even in poorly illuminated scenes.
arXiv Detail & Related papers (2020-10-26T17:41:25Z)
- Vision Meets Wireless Positioning: Effective Person Re-identification with Recurrent Context Propagation [120.18969251405485]
Existing person re-identification methods rely on the visual sensor to capture the pedestrians.
A mobile phone can be sensed by WiFi and cellular networks in the form of a wireless positioning signal.
We propose a novel recurrent context propagation module that enables information to propagate between visual data and wireless positioning data.
arXiv Detail & Related papers (2020-08-10T14:19:15Z)
- Real-Time Drone Detection and Tracking With Visible, Thermal and Acoustic Sensors [66.4525391417921]
A thermal infrared camera is shown to be a feasible solution to the drone detection task.
The detector performance as a function of the sensor-to-target distance is also investigated.
A novel video dataset containing 650 annotated infrared and visible videos of drones, birds, airplanes and helicopters is also presented.
arXiv Detail & Related papers (2020-07-14T23:06:42Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.