EventEgoHands: Event-based Egocentric 3D Hand Mesh Reconstruction
- URL: http://arxiv.org/abs/2505.19169v3
- Date: Wed, 28 May 2025 06:32:42 GMT
- Title: EventEgoHands: Event-based Egocentric 3D Hand Mesh Reconstruction
- Authors: Ryosei Hara, Wataru Ikeda, Masashi Hatano, Mariko Isogawa
- Abstract summary: Reconstructing 3D hand meshes is a challenging but important task for human-computer interaction and AR/VR applications. We propose EventEgoHands, a novel method for event-based 3D hand mesh reconstruction in an egocentric view. Our approach introduces a Hand Segmentation Module that extracts hand regions, effectively mitigating the influence of dynamic background events.
- Score: 2.3695551082138864
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: Reconstructing 3D hand meshes is a challenging but important task for human-computer interaction and AR/VR applications. RGB and/or depth cameras, in particular, have been widely used for this task. However, methods relying on these conventional cameras struggle in low-light environments and under motion blur. To address these limitations, event cameras have attracted attention in recent years for their high dynamic range and high temporal resolution. Despite these advantages, event cameras are sensitive to background noise and camera motion, which has limited existing studies to static backgrounds and fixed cameras. In this study, we propose EventEgoHands, a novel method for event-based 3D hand mesh reconstruction in an egocentric view. Our approach introduces a Hand Segmentation Module that extracts hand regions, effectively mitigating the influence of dynamic background events. We evaluated our approach on the N-HOT3D dataset and demonstrated its effectiveness: MPJPE improves by more than 4.5 cm (approximately 43%).
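To make the approach concrete, here is a minimal, illustrative Python sketch (not the authors' implementation) of the two operations the abstract describes: suppressing background events with a predicted hand-region mask and rasterizing the surviving events for a downstream mesh regressor. The (x, y, t, polarity) event layout, the sensor resolution, and the hand_mask stub are all assumptions made for illustration.

```python
import numpy as np

def mask_background_events(events: np.ndarray, hand_mask: np.ndarray) -> np.ndarray:
    """Keep only events that fall inside the predicted hand region,
    suppressing events triggered by camera motion and dynamic background.
    `events` is assumed to be an (N, 4) array of (x, y, t, polarity) rows."""
    x = events[:, 0].astype(int)
    y = events[:, 1].astype(int)
    keep = hand_mask[y, x]          # boolean lookup per event location
    return events[keep]

def events_to_frame(events: np.ndarray, height: int, width: int) -> np.ndarray:
    """Accumulate events into a 2-channel (negative/positive polarity) count image."""
    frame = np.zeros((2, height, width), dtype=np.float32)
    x = events[:, 0].astype(int)
    y = events[:, 1].astype(int)
    p = events[:, 3].astype(int)    # 0 = negative, 1 = positive polarity
    np.add.at(frame, (p, y, x), 1.0)
    return frame

# Usage: filter a window of events, then rasterize the result.
H, W = 260, 346                     # DAVIS346-like resolution, chosen only for illustration
events = np.array([[10, 20, 0.001, 1],
                   [100, 200, 0.002, 0]], dtype=np.float32)
hand_mask = np.zeros((H, W), dtype=bool)
hand_mask[0:50, 0:50] = True        # stand-in for a predicted hand-region mask
hand_events = mask_background_events(events, hand_mask)
frame = events_to_frame(hand_events, H, W)
```

In the paper's pipeline the mask would come from the proposed Hand Segmentation Module; the fixed rectangle above is a stand-in that merely keeps the snippet self-contained.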
Related papers
- Event-based Egocentric Human Pose Estimation in Dynamic Environment [2.3695551082138864]
Estimating human pose using a front-facing egocentric camera is essential for applications such as sports motion analysis, VR/AR, and AI for wearable devices. In this work, we introduce a novel task of human pose estimation using a front-facing event-based camera mounted on the head.
arXiv Detail & Related papers (2025-05-28T06:13:01Z)
- EventEgo3D++: 3D Human Motion Capture from a Head-Mounted Event Camera [64.58147600753382]
EventEgo3D++ captures 3D human motion from a monocular event camera with a fisheye lens. Event cameras excel in high-speed scenarios and varying illumination due to their high temporal resolution. Our method supports real-time 3D pose updates at a rate of 140 Hz.
arXiv Detail & Related papers (2025-02-11T18:57:05Z)
- Dyn-HaMR: Recovering 4D Interacting Hand Motion from a Dynamic Camera [49.82535393220003]
Dyn-HaMR is the first approach to reconstruct 4D global hand motion from monocular videos recorded by dynamic cameras in the wild. We show that our approach significantly outperforms state-of-the-art methods in terms of 4D global mesh recovery. This establishes a new benchmark for hand motion reconstruction from monocular video with moving cameras.
arXiv Detail & Related papers (2024-12-17T12:43:10Z)
- E-3DGS: Gaussian Splatting with Exposure and Motion Events [29.042018288378447]
E-3DGS sets a new benchmark for event-based 3D reconstruction with robust performance in challenging conditions. We introduce EME-3D, a real-world 3D dataset with exposure events, motion events, camera calibration parameters, and sparse point clouds.
arXiv Detail & Related papers (2024-10-22T13:17:20Z)
- EF-3DGS: Event-Aided Free-Trajectory 3D Gaussian Splatting [72.60992807941885]
Event cameras, inspired by biological vision, record pixel-wise intensity changes asynchronously with high temporal resolution (a toy sketch of this event generation model follows the list). We propose Event-Aided Free-Trajectory 3DGS, which seamlessly integrates the advantages of event cameras into 3DGS. We evaluate our method on the public Tanks and Temples benchmark and a newly collected real-world dataset, RealEv-DAVIS.
arXiv Detail & Related papers (2024-10-20T13:44:24Z)
- EventEgo3D: 3D Human Motion Capture from Egocentric Event Streams [59.77837807004765]
This paper introduces a new problem, i.e., 3D human motion capture from an egocentric monocular event camera with a fisheye lens.
Event streams have high temporal resolution and provide reliable cues for 3D human motion capture under high-speed human motions and rapidly changing illumination.
Our EE3D demonstrates robustness and superior 3D accuracy compared to existing solutions while supporting real-time 3D pose update rates of 140 Hz.
arXiv Detail & Related papers (2024-04-12T17:59:47Z)
- Complementing Event Streams and RGB Frames for Hand Mesh Reconstruction [51.87279764576998]
We propose EvRGBHand -- the first approach for 3D hand mesh reconstruction with an event camera and an RGB camera compensating for each other.
EvRGBHand can tackle overexposure and motion blur issues in RGB-based HMR and foreground scarcity and background overflow issues in event-based HMR.
arXiv Detail & Related papers (2024-03-12T06:04:50Z)
- EventHands: Real-Time Neural 3D Hand Reconstruction from an Event Stream [80.15360180192175]
3D hand pose estimation from monocular videos is a long-standing and challenging problem.
We address it for the first time using a single event camera, i.e., an asynchronous vision sensor reacting to brightness changes.
Our approach has characteristics previously not demonstrated with a single RGB or depth camera.
arXiv Detail & Related papers (2020-12-11T16:45:34Z)
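Several of the papers above (EF-3DGS, EventEgo3D, EventHands) rest on the same sensing principle: a pixel emits an event whenever its log-intensity has changed by more than a contrast threshold since the last event at that pixel. Below is a toy simulator of this standard event generation model; the threshold value and all variable names are illustrative and are not taken from any of the papers.

```python
import numpy as np

def simulate_events(frames, timestamps, C=0.2):
    """Simulate events from a sequence of intensity frames (each (H, W), values in [0, 1]).
    Returns a list of (x, y, t, polarity) tuples. C = 0.2 is an illustrative
    contrast threshold, not a value from any paper above."""
    eps = 1e-6
    ref = np.log(frames[0] + eps)           # per-pixel log-intensity reference
    events = []
    for frame, t in zip(frames[1:], timestamps[1:]):
        log_i = np.log(frame + eps)
        diff = log_i - ref
        ys, xs = np.nonzero(np.abs(diff) >= C)
        for y, x in zip(ys, xs):
            polarity = 1 if diff[y, x] > 0 else 0
            events.append((x, y, t, polarity))
            ref[y, x] = log_i[y, x]         # reset reference once an event fires
    return events
```

Real sensors add per-pixel noise, refractory periods, and threshold mismatch on top of this idealized model, which is one reason methods such as EventEgoHands must be robust to spurious background events.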