A Survey on 3D Egocentric Human Pose Estimation
- URL: http://arxiv.org/abs/2403.17893v2
- Date: Thu, 18 Apr 2024 05:09:04 GMT
- Title: A Survey on 3D Egocentric Human Pose Estimation
- Authors: Md Mushfiqur Azam, Kevin Desai
- Abstract summary: Egocentric human pose estimation aims to estimate human body poses and develop body representations from a first-person camera perspective.
It has gained vast popularity in recent years because of its wide range of applications in sectors like XR-technologies, human-computer interaction, and fitness tracking.
However, there is no systematic literature review of the solutions proposed for egocentric 3D human pose estimation.
- Score: 3.5845457075304363
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Egocentric human pose estimation aims to estimate human body poses and develop body representations from a first-person camera perspective. It has gained vast popularity in recent years because of its wide range of applications in sectors such as XR technologies, human-computer interaction, and fitness tracking. However, to the best of our knowledge, there is no systematic literature review of the solutions proposed for egocentric 3D human pose estimation. To that end, this survey aims to provide an extensive overview of the current state of egocentric pose estimation research. In this paper, we categorize and discuss the popular datasets and the different pose estimation models, highlighting the strengths and weaknesses of the various methods through comparative analysis. This survey can serve as a valuable resource for both researchers and practitioners in the field, offering insights into key concepts and cutting-edge solutions in egocentric pose estimation, its wide-ranging applications, and the open problems and directions for future work.
Related papers
- Deep Learning-Based Object Pose Estimation: A Comprehensive Survey [73.74933379151419]
We discuss the recent advances in deep learning-based object pose estimation.
Our survey also covers multiple input data modalities, degrees-of-freedom of output poses, object properties, and downstream tasks.
arXiv Detail & Related papers (2024-05-13T14:44:22Z) - Data Augmentation in Human-Centric Vision [54.97327269866757]
This survey presents a comprehensive analysis of data augmentation techniques in human-centric vision tasks.
It delves into a wide range of research areas including person ReID, human parsing, human pose estimation, and pedestrian detection.
Our work categorizes data augmentation methods into two main types: data generation and data perturbation.
arXiv Detail & Related papers (2024-03-13T16:05:18Z) - Deep learning for 3D human pose estimation and mesh recovery: A survey [6.535833206786788]
We present a review of recent progress over the past five years in deep learning methods for 3D human pose estimation.
To the best of our knowledge, this survey is arguably the first to comprehensively cover deep learning methods for 3D human pose estimation.
arXiv Detail & Related papers (2024-02-29T04:30:39Z) - Understanding Pose and Appearance Disentanglement in 3D Human Pose Estimation [72.50214227616728]
Several methods have been proposed to learn image representations in a self-supervised fashion so as to disentangle appearance information from pose information.
We study disentanglement from the perspective of the self-supervised network, via diverse image synthesis experiments.
We design an adversarial strategy that generates natural appearance changes of the subject, to which a disentangled network should be robust.
arXiv Detail & Related papers (2023-09-20T22:22:21Z) - A survey of top-down approaches for human pose estimation [0.0]
State-of-the-art methods implemented with Deep Learning have brought remarkable results in the field of human pose estimation.
This paper aims to provide newcomers with an extensive review of deep learning methods for recognizing human pose from 2D images.
arXiv Detail & Related papers (2022-02-05T23:27:46Z) - Estimating Egocentric 3D Human Pose in the Wild with External Weak Supervision [72.36132924512299]
We present a new egocentric pose estimation method, which can be trained on a large-scale in-the-wild egocentric dataset.
We propose a novel learning strategy to supervise the egocentric features with the high-quality features extracted by a pretrained external-view pose estimation model.
Experiments show that our method predicts accurate 3D poses from a single in-the-wild egocentric image and outperforms the state-of-the-art methods both quantitatively and qualitatively.
arXiv Detail & Related papers (2022-01-20T00:45:13Z) - Recent Advances in Monocular 2D and 3D Human Pose Estimation: A Deep Learning Perspective [69.44384540002358]
We provide a comprehensive and holistic 2D-to-3D perspective to tackle this problem.
We categorize the mainstream and milestone approaches since the year 2014 under unified frameworks.
We also summarize the pose representation styles, benchmarks, evaluation metrics, and the quantitative performance of popular approaches.
arXiv Detail & Related papers (2021-04-23T11:07:07Z) - Deep Learning-Based Human Pose Estimation: A Survey [66.01917727294163]
Human pose estimation has drawn increasing attention during the past decade.
It has been utilized in a wide range of applications including human-computer interaction, motion analysis, augmented reality, and virtual reality.
Recent deep learning-based solutions have achieved high performance in human pose estimation.
arXiv Detail & Related papers (2020-12-24T18:49:06Z) - Monocular Human Pose Estimation: A Survey of Deep Learning-based Methods [25.3614052943568]
Vision-based monocular human pose estimation is one of the most fundamental and challenging problems in computer vision.
Recent developments in deep learning techniques have brought significant progress and remarkable breakthroughs to the field of human pose estimation.
arXiv Detail & Related papers (2020-06-02T07:07:45Z)