Event-based tracking of human hands
- URL: http://arxiv.org/abs/2304.06534v1
- Date: Thu, 13 Apr 2023 13:43:45 GMT
- Title: Event-based tracking of human hands
- Authors: Laura Duarte, Mohammad Safeea, Pedro Neto
- Abstract summary: The event camera detects changes in brightness, capturing motion with low latency, no motion blur, low power consumption, and high dynamic range.
Captured frames are analysed with lightweight algorithms that report 3D hand position data.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This paper proposes a novel method for tracking human hands using data from
an event camera. The event camera detects changes in brightness, capturing
motion with low latency, no motion blur, low power consumption, and high
dynamic range. Captured frames are analysed using lightweight algorithms that
report 3D hand position data. The chosen pick-and-place scenario serves as an
example input for collaborative human-robot interaction and for obstacle
avoidance in human-robot safety applications. Event data are pre-processed
into intensity frames. Regions of interest (ROI) are defined through
object-edge event activity, reducing noise. ROI features are then extracted
for use in depth perception. Event-based tracking of human hands is shown to
be feasible in real time and at low computational cost. The proposed
ROI-finding method reduces noise from the intensity images, achieving up to
89% data reduction relative to the original while preserving the features.
The depth estimation error relative to ground truth (measured with wearables),
computed using dynamic time warping and a single event camera, ranges from 15
to 30 millimetres depending on the plane in which it is measured. In summary,
human hands are tracked in 3D space using data from a single event camera and
lightweight algorithms that define ROI features.
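The pipeline described in the abstract (events accumulated into intensity frames, then a noise-suppressing region of interest found from event activity) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the event layout `(t, x, y, polarity)`, the per-pixel counting scheme, and the `activity_threshold` parameter are all assumptions made for the example.

```python
import numpy as np

def events_to_intensity_frame(events, height, width):
    """Accumulate events into a 2D intensity frame.

    `events` is an (N, 4) array of (t, x, y, polarity) rows; the frame
    counts event activity per pixel over the accumulation window.
    """
    frame = np.zeros((height, width), dtype=np.float32)
    for _, x, y, _ in events:
        frame[int(y), int(x)] += 1.0
    return frame

def find_roi(frame, activity_threshold=2.0):
    """Return a bounding box (x0, y0, x1, y1) around high-activity pixels.

    Pixels below the threshold are treated as noise and ignored, which is
    how restricting processing to an ROI suppresses background events.
    """
    ys, xs = np.nonzero(frame >= activity_threshold)
    if len(xs) == 0:
        return None  # no sufficiently active region in this window
    return int(xs.min()), int(ys.min()), int(xs.max()), int(ys.max())
```

Discarding pixels below the threshold is what yields the large data reduction reported in the abstract: isolated noise events rarely accumulate enough activity to enter the ROI, while edges of a moving hand do.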
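The depth error is reported against wearable ground truth using dynamic time warping, which aligns two trajectories that may be sampled at different rates before comparing them. A textbook DTW distance for 1D sequences (not the authors' code) looks like this:

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic-time-warping distance between two 1D sequences.

    cost[i, j] holds the minimal cumulative alignment cost of the first
    i samples of `a` against the first j samples of `b`.
    """
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])  # local distance between samples
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]
```

Because DTW tolerates temporal misalignment, it is a reasonable way to compare an event-camera depth trajectory against a wearable-sensor trajectory recorded on a different clock.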
Related papers
- Line-based 6-DoF Object Pose Estimation and Tracking With an Event Camera [19.204896246140155]
Event cameras possess remarkable attributes such as high dynamic range, low latency, and resilience against motion blur.
We propose a line-based robust pose estimation and tracking method for planar or non-planar objects using an event camera.
arXiv Detail & Related papers (2024-08-06T14:36:43Z)
- EventEgo3D: 3D Human Motion Capture from Egocentric Event Streams [59.77837807004765]
This paper introduces a new problem, i.e., 3D human motion capture from an egocentric monocular event camera with a fisheye lens.
Event streams have high temporal resolution and provide reliable cues for 3D human motion capture under high-speed human motions and rapidly changing illumination.
Our EE3D demonstrates robustness and superior 3D accuracy compared to existing solutions while supporting real-time 3D pose update rates of 140Hz.
arXiv Detail & Related papers (2024-04-12T17:59:47Z)
- 3D Human Scan With A Moving Event Camera [7.734104968315144]
Event cameras have the advantages of high temporal resolution and high dynamic range.
This paper proposes a novel event-based method for 3D pose estimation and human mesh recovery.
arXiv Detail & Related papers (2024-04-12T14:34:24Z)
- Tracking Fast by Learning Slow: An Event-based Speed Adaptive Hand Tracker Leveraging Knowledge in RGB Domain [4.530678016396477]
3D hand tracking methods based on monocular RGB videos are easily affected by motion blur, while the event camera, a sensor with high temporal resolution and high dynamic range, is naturally suited to this task thanks to its sparse output and low power consumption.
We developed an event-based speed adaptive hand tracker (ESAHT) to solve the hand tracking problem based on an event camera.
Our solution outperformed RGB-based as well as previous event-based solutions in fast hand tracking tasks, and our code and dataset will be made publicly available.
arXiv Detail & Related papers (2023-02-28T09:18:48Z)
- Data-driven Feature Tracking for Event Cameras [48.04815194265117]
We introduce the first data-driven feature tracker for event cameras, which leverages low-latency events to track features detected in a grayscale frame.
By directly transferring zero-shot from synthetic to real data, our data-driven tracker outperforms existing approaches in relative feature age by up to 120%.
This performance gap is further increased to 130% by adapting our tracker to real data with a novel self-supervision strategy.
arXiv Detail & Related papers (2022-11-23T10:20:11Z)
- PUCK: Parallel Surface and Convolution-kernel Tracking for Event-Based Cameras [4.110120522045467]
Event cameras can guarantee fast visual sensing in dynamic environments, but require a tracking algorithm that can keep up with the high data rate induced by the robot ego-motion.
We introduce a novel tracking method that leverages the Exponential Reduced Ordinal Surface (EROS) data representation to decouple event-by-event processing and tracking.
We propose the task of tracking the air hockey puck sliding on a surface, with the future aim of controlling the iCub robot to reach the target precisely and on time.
arXiv Detail & Related papers (2022-05-16T13:23:52Z)
- Event-aided Direct Sparse Odometry [54.602311491827805]
We introduce EDS, a direct monocular visual odometry method using events and frames.
Our algorithm leverages the event generation model to track the camera motion in the blind time between frames.
EDS is the first method to perform 6-DOF VO using events and frames with a direct approach.
arXiv Detail & Related papers (2022-04-15T20:40:29Z)
- Asynchronous Optimisation for Event-based Visual Odometry [53.59879499700895]
Event cameras open up new possibilities for robotic perception due to their low latency and high dynamic range.
We focus on event-based visual odometry (VO).
We propose an asynchronous structure-from-motion optimisation back-end.
arXiv Detail & Related papers (2022-03-02T11:28:47Z)
- ESL: Event-based Structured Light [62.77144631509817]
Event cameras are bio-inspired sensors providing significant advantages over standard cameras.
We propose a novel structured-light system using an event camera to tackle the problem of accurate and high-speed depth sensing.
arXiv Detail & Related papers (2021-11-30T15:47:39Z)
- Moving Object Detection for Event-based Vision using Graph Spectral Clustering [6.354824287948164]
Moving object detection has been a central topic of discussion in computer vision for its wide range of applications.
We present an unsupervised Graph Spectral Clustering technique for Moving Object Detection in Event-based data.
We additionally show how the optimum number of moving objects can be automatically determined.
arXiv Detail & Related papers (2021-09-30T10:19:22Z)
- EventHands: Real-Time Neural 3D Hand Reconstruction from an Event Stream [80.15360180192175]
3D hand pose estimation from monocular videos is a long-standing and challenging problem.
We address it for the first time using a single event camera, i.e., an asynchronous vision sensor reacting to brightness changes.
Our approach has characteristics previously not demonstrated with a single RGB or depth camera.
arXiv Detail & Related papers (2020-12-11T16:45:34Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information and is not responsible for any consequences arising from its use.