Motion Capture Benchmark of Real Industrial Tasks and Traditional Crafts
for Human Movement Analysis
- URL: http://arxiv.org/abs/2304.03771v1
- Date: Mon, 3 Apr 2023 10:29:24 GMT
- Title: Motion Capture Benchmark of Real Industrial Tasks and Traditional Crafts
for Human Movement Analysis
- Authors: Brenda Elizabeth Olivas-Padilla, Alina Glushkova and Sotiris
Manitsaris
- Abstract summary: This paper presents seven datasets recorded using inertial-based motion capture.
The datasets contain professional gestures performed by industrial operators and skilled craftsmen in real, in-situ conditions.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Human movement analysis is a key area of research in robotics, biomechanics,
and data science. It encompasses tracking, posture estimation, and movement
synthesis. While numerous methodologies have evolved over time, a systematic
and quantitative evaluation of these approaches using verifiable ground truth
data of three-dimensional human movement is still required to define the
current state of the art. This paper presents seven datasets recorded using
inertial-based motion capture. The datasets contain professional gestures
performed by industrial operators and skilled craftsmen in real, in-situ
conditions. The datasets were created with the intention of being used
for research in human motion modeling, analysis, and generation. The protocols
for data collection are described in detail, and a preliminary analysis of the
collected data is provided as a benchmark. The Gesture Operational Model, a
hybrid stochastic-biomechanical approach based on kinematic descriptors, is
utilized to model the dynamics of the experts' movements and create
mathematical representations of their motion trajectories for analyzing and
quantifying their body dexterity. The models allowed the accurate generation of
professional human poses and an intuitive description of how body joints
cooperate and change over time throughout the performance of the task.
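The abstract describes modeling joint dynamics from kinematic descriptors and using the fitted models to generate poses. The paper's actual Gesture Operational Model is a hybrid stochastic-biomechanical formulation whose equations are not reproduced on this page, so the following is only a minimal illustrative stand-in: a per-joint second-order autoregressive model fit by least squares and rolled out to synthesize a joint-angle trajectory. All names and the toy data are hypothetical.

```python
# Minimal sketch (NOT the paper's Gesture Operational Model): fit, for one
# joint angle, a second-order linear autoregression
#     x_t = a1 * x_{t-1} + a2 * x_{t-2} + b
# and roll it out to generate a synthetic trajectory.
import numpy as np


def fit_ar2(series):
    """Least-squares fit of x_t = a1*x_{t-1} + a2*x_{t-2} + b."""
    X = np.column_stack([series[1:-1], series[:-2], np.ones(len(series) - 2)])
    y = series[2:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef  # (a1, a2, b)


def rollout(coef, x0, x1, steps):
    """Generate a trajectory from two seed values using the fitted model."""
    a1, a2, b = coef
    traj = [x0, x1]
    for _ in range(steps):
        traj.append(a1 * traj[-1] + a2 * traj[-2] + b)
    return np.array(traj)


# Toy joint-angle series (e.g. an elbow flexion angle in degrees).
t = np.linspace(0, 4 * np.pi, 200)
angle = 30 + 20 * np.sin(t)

coef = fit_ar2(angle)
synth = rollout(coef, angle[0], angle[1], 198)
```

A sinusoid is exactly representable by an AR(2) recurrence, so on this noiseless toy signal the rollout closely reproduces the original trajectory; real motion-capture data would of course require the richer stochastic-biomechanical structure the paper proposes.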
Related papers
- Scaling Up Dynamic Human-Scene Interaction Modeling [58.032368564071895]
TRUMANS is the most comprehensive motion-captured HSI dataset currently available.
It intricately captures whole-body human motions and part-level object dynamics.
We devise a diffusion-based autoregressive model that efficiently generates HSI sequences of any length.
arXiv Detail & Related papers (2024-03-13T15:45:04Z)
- 3D Kinematics Estimation from Video with a Biomechanical Model and Synthetic Training Data [4.130944152992895]
We propose a novel biomechanics-aware network that directly outputs 3D kinematics from two input views.
Our experiments demonstrate that the proposed approach, only trained on synthetic data, outperforms previous state-of-the-art methods.
arXiv Detail & Related papers (2024-02-20T17:33:40Z)
- Deep state-space modeling for explainable representation, analysis, and generation of professional human poses [0.0]
This paper introduces three novel methods for creating explainable representations of human movement.
The trained models are used for the full-body dexterity analysis of expert professionals.
arXiv Detail & Related papers (2023-04-13T08:13:10Z)
- Task-Oriented Human-Object Interactions Generation with Implicit Neural Representations [61.659439423703155]
TOHO: Task-Oriented Human-Object Interactions Generation with Implicit Neural Representations.
Our method generates continuous motions that are parameterized only by the temporal coordinate.
This work takes a step further toward general human-scene interaction simulation.
arXiv Detail & Related papers (2023-03-23T09:31:56Z)
- Learn to Predict How Humans Manipulate Large-sized Objects from Interactive Motions [82.90906153293585]
We propose a graph neural network, HO-GCN, to fuse motion data and dynamic descriptors for the prediction task.
We show the proposed network that consumes dynamic descriptors can achieve state-of-the-art prediction results and help the network better generalize to unseen objects.
arXiv Detail & Related papers (2022-06-25T09:55:39Z)
- Transformer Inertial Poser: Attention-based Real-time Human Motion Reconstruction from Sparse IMUs [79.72586714047199]
We propose an attention-based deep learning method to reconstruct full-body motion from six IMU sensors in real-time.
Our method achieves new state-of-the-art results both quantitatively and qualitatively, while being simple to implement and smaller in size.
arXiv Detail & Related papers (2022-03-29T16:24:52Z)
- Dynamic Modeling of Hand-Object Interactions via Tactile Sensing [133.52375730875696]
In this work, we employ a high-resolution tactile glove to perform four different interactive activities on a diversified set of objects.
We build our model on a cross-modal learning framework and generate the labels using a visual processing pipeline to supervise the tactile model.
This work takes a step on dynamics modeling in hand-object interactions from dense tactile sensing.
arXiv Detail & Related papers (2021-09-09T16:04:14Z)
- TRiPOD: Human Trajectory and Pose Dynamics Forecasting in the Wild [77.59069361196404]
TRiPOD is a novel method for predicting body dynamics based on graph attentional networks.
To incorporate a real-world challenge, we learn an indicator representing whether an estimated body joint is visible/invisible at each frame.
Our evaluation shows that TRiPOD outperforms all prior work and state-of-the-art specifically designed for each of the trajectory and pose forecasting tasks.
arXiv Detail & Related papers (2021-04-08T20:01:00Z)
- Graph-based Normalizing Flow for Human Motion Generation and Reconstruction [20.454140530081183]
We propose a probabilistic generative model to synthesize and reconstruct long horizon motion sequences conditioned on past information and control signals.
We evaluate the models on a mixture of motion capture datasets of human locomotion with foot-step and bone-length analysis.
arXiv Detail & Related papers (2021-04-07T09:51:15Z)
- Methodology for Mining, Discovering and Analyzing Semantic Human Mobility Behaviors [0.3499870393443268]
We propose a novel methodological pipeline called simba for mining and analyzing semantic mobility sequences.
A framework for semantic mobility-sequence analysis and cluster explainability is implemented.
arXiv Detail & Related papers (2020-12-08T22:24:19Z)
- Data Science for Motion and Time Analysis with Modern Motion Sensor Data [14.105132549564873]
Motion-and-time analysis has been a popular research topic in operations research.
It is regaining attention as a continuous-improvement tool for lean manufacturing and smart factories.
This paper develops a framework for data-driven analysis of work motions and studies their correlations to work speeds or execution rates.
arXiv Detail & Related papers (2020-08-25T02:33:33Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.