Near Real-Time Position Tracking for Robot-Guided Evacuation
- URL: http://arxiv.org/abs/2309.15054v1
- Date: Tue, 26 Sep 2023 16:34:18 GMT
- Title: Near Real-Time Position Tracking for Robot-Guided Evacuation
- Authors: Mollik Nayyar, Alan Wagner
- Abstract summary: This paper introduces a near real-time human position tracking solution tailored for evacuation robots.
We show that the system can achieve an accuracy of 0.55 meters when compared to ground truth.
The potential of our approach extends beyond mere tracking, paving the way for evacuee motion prediction.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: During the evacuation of a building, the rapid and accurate tracking of human
evacuees can be used by a guide robot to increase the effectiveness of the
evacuation [1],[2]. This paper introduces a near real-time human position
tracking solution tailored for evacuation robots. Using a pose detector, our
system first identifies human joints in the camera frame in near real-time and
then translates the position of these pixels into real-world coordinates via a
simple calibration process. We run multiple trials of the system in action in
an indoor lab environment and show that the system can achieve an accuracy of
0.55 meters when compared to ground truth. The system can also achieve an
average of 3 frames per second (FPS) which was sufficient for our study on
robot-guided human evacuation. The potential of our approach extends beyond
mere tracking, paving the way for evacuee motion prediction, allowing the robot
to proactively respond to human movements during an evacuation.
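The pixel-to-world step described in the abstract (mapping detected joint pixels to floor coordinates via a simple calibration) can be sketched with a ground-plane homography. The sketch below is an illustrative assumption, not the authors' released code: it fits a 3x3 homography from at least four pixel/world correspondences using the standard DLT formulation, then maps any pixel (e.g. a detected ankle joint) to floor coordinates.

```python
import numpy as np

def fit_homography(pixels, world):
    """Estimate a 3x3 homography mapping image pixels (u, v) to
    ground-plane world coordinates (x, y) from >= 4 correspondences,
    via the direct linear transform (DLT)."""
    A = []
    for (u, v), (x, y) in zip(pixels, world):
        A.append([u, v, 1, 0, 0, 0, -x * u, -x * v, -x])
        A.append([0, 0, 0, u, v, 1, -y * u, -y * v, -y])
    # The homography is the right null vector of A (smallest singular value).
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 3)

def pixel_to_world(H, u, v):
    """Project a pixel to world coordinates and dehomogenize."""
    p = H @ np.array([u, v, 1.0])
    return p[0] / p[2], p[1] / p[2]

# Hypothetical calibration: four floor markers seen at known pixels,
# with measured world positions in meters.
H = fit_homography(
    pixels=[(0, 0), (640, 0), (640, 480), (0, 480)],
    world=[(0.0, 0.0), (6.4, 0.0), (6.4, 4.8), (0.0, 4.8)],
)
x, y = pixel_to_world(H, 320, 240)  # a detected joint pixel
```

This assumes evacuees' feet lie on a planar floor, which is what makes a single homography sufficient; the paper's reported 0.55 m accuracy would bound the error of such a mapping plus pose-detector noise.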
Related papers
- Language-guided Robust Navigation for Mobile Robots in Dynamically-changing Environments [26.209402619114353]
We develop an embodied AI system for human-in-the-loop navigation with a wheeled mobile robot.
We propose a method of monitoring the robot's current plan to detect changes in the environment that impact the intended trajectory of the robot.
This work can support applications like precision agriculture and construction, where persistent monitoring of the environment provides a human with information about the environment state.
arXiv Detail & Related papers (2024-09-28T21:30:23Z) - Exploring 3D Human Pose Estimation and Forecasting from the Robot's Perspective: The HARPER Dataset [52.22758311559]
We introduce HARPER, a novel dataset for 3D body pose estimation and forecasting in dyadic interactions between users and Spot.
The key-novelty is the focus on the robot's perspective, i.e., on the data captured by the robot's sensors.
The scenario underlying HARPER includes 15 actions, of which 10 involve physical contact between the robot and users.
arXiv Detail & Related papers (2024-03-21T14:53:50Z) - AZTR: Aerial Video Action Recognition with Auto Zoom and Temporal Reasoning [63.628195002143734]
We propose a novel approach for aerial video action recognition.
Our method is designed for videos captured using UAVs and can run on edge or mobile devices.
We present a learning-based approach that uses customized auto zoom to automatically identify the human target and scale it appropriately.
arXiv Detail & Related papers (2023-03-02T21:24:19Z) - Learning Semantics-Aware Locomotion Skills from Human Demonstration [35.996425893483796]
We present a framework that learns semantics-aware locomotion skills from perception for quadrupedal robots.
Our framework learns to adjust the speed and gait of the robot based on perceived terrain semantics, and enables the robot to walk over 6km without failure.
arXiv Detail & Related papers (2022-06-27T21:08:03Z) - Neural Scene Representation for Locomotion on Structured Terrain [56.48607865960868]
We propose a learning-based method to reconstruct the local terrain for a mobile robot traversing urban environments.
Using a stream of depth measurements from the onboard cameras and the robot's trajectory, our method estimates the topography in the robot's vicinity.
We propose a 3D reconstruction model that faithfully reconstructs the scene, despite the noisy measurements and large amounts of missing data coming from the blind spots of the camera arrangement.
arXiv Detail & Related papers (2022-06-16T10:45:17Z) - Learning Time-optimized Path Tracking with or without Sensory Feedback [5.254093731341154]
We present a learning-based approach that allows a robot to quickly follow a reference path defined in joint space.
The robot is controlled by a neural network that is trained via reinforcement learning using data generated by a physics simulator.
arXiv Detail & Related papers (2022-03-03T19:13:31Z) - AuraSense: Robot Collision Avoidance by Full Surface Proximity Detection [3.9770080498150224]
AuraSense is the first system to realize no-dead-spot proximity sensing for robot arms.
It requires only a single pair of piezoelectric transducers, and can easily be applied to off-the-shelf robots.
arXiv Detail & Related papers (2021-08-10T18:37:54Z) - SABER: Data-Driven Motion Planner for Autonomously Navigating Heterogeneous Robots [112.2491765424719]
We present an end-to-end online motion planning framework that uses a data-driven approach to navigate a heterogeneous robot team towards a global goal.
We use stochastic model predictive control (SMPC) to calculate control inputs that satisfy robot dynamics, and consider uncertainty during obstacle avoidance with chance constraints.
Recurrent neural networks are used to provide a quick estimate of future state uncertainty considered in the SMPC finite-time horizon solution.
A Deep Q-learning agent is employed to serve as a high-level path planner, providing the SMPC with target positions that move the robots towards a desired global goal.
arXiv Detail & Related papers (2021-08-03T02:56:21Z) - Human POSEitioning System (HPS): 3D Human Pose Estimation and Self-localization in Large Scenes from Body-Mounted Sensors [71.29186299435423]
We introduce (HPS) Human POSEitioning System, a method to recover the full 3D pose of a human registered with a 3D scan of the surrounding environment.
We show that our optimization-based integration exploits the benefits of the two, resulting in pose accuracy free of drift.
HPS could be used for VR/AR applications where humans interact with the scene without requiring direct line of sight with an external camera.
arXiv Detail & Related papers (2021-03-31T17:58:31Z) - Show Me What You Can Do: Capability Calibration on Reachable Workspace for Human-Robot Collaboration [83.4081612443128]
We show that a short calibration using REMP can effectively bridge the gap between what a non-expert user thinks a robot can reach and the ground-truth.
We show that this calibration procedure not only results in better user perception, but also promotes more efficient human-robot collaborations.
arXiv Detail & Related papers (2021-03-06T09:14:30Z) - Wearable camera-based human absolute localization in large warehouses [0.0]
This paper introduces a wearable human localization system for large warehouses.
A monocular down-looking camera detects ground nodes, identifies them, and computes the absolute position of the human.
A virtual safety area around the human operator is set up and any AGV in this area is immediately stopped.
arXiv Detail & Related papers (2020-07-20T12:57:37Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.