BioMoDiffuse: Physics-Guided Biomechanical Diffusion for Controllable and Authentic Human Motion Synthesis
- URL: http://arxiv.org/abs/2503.06151v1
- Date: Sat, 08 Mar 2025 10:22:36 GMT
- Title: BioMoDiffuse: Physics-Guided Biomechanical Diffusion for Controllable and Authentic Human Motion Synthesis
- Authors: Zixi Kang, Xinghan Wang, Yadong Mu
- Abstract summary: This paper introduces BioMoDiffuse, a novel biomechanics-aware diffusion framework. It features three key innovations: (1) A lightweight biodynamic network that integrates muscle electromyography (EMG) signals with acceleration constraints; (2) A physics-guided diffusion process that incorporates real-time biomechanical verification via modified Euler-Lagrange equations; and (3) A decoupled control mechanism that allows independent regulation of motion speed and semantic context.
- Score: 21.750804738752105
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Human motion generation holds significant promise in fields such as animation, film production, and robotics. However, existing methods often fail to produce physically plausible movements that adhere to biomechanical principles. While recent autoregressive and diffusion models have improved visual quality, they frequently overlook essential biodynamic features, such as muscle activation patterns and joint coordination, leading to motions that either violate physical laws or lack controllability. This paper introduces BioMoDiffuse, a novel biomechanics-aware diffusion framework that addresses these limitations. It features three key innovations: (1) A lightweight biodynamic network that integrates muscle electromyography (EMG) signals and kinematic features with acceleration constraints, (2) A physics-guided diffusion process that incorporates real-time biomechanical verification via modified Euler-Lagrange equations, and (3) A decoupled control mechanism that allows independent regulation of motion speed and semantic context. We also propose a set of comprehensive evaluation protocols that combines traditional metrics (FID, R-precision, etc.) with new biomechanical criteria (smoothness, foot sliding, floating, etc.). Our approach bridges the gap between data-driven motion synthesis and biomechanical authenticity, establishing new benchmarks for physically accurate motion generation.
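The biomechanical evaluation criteria the abstract lists (smoothness, foot sliding, floating) can be approximated by simple kinematic checks on the generated joint trajectories. The sketch below is an illustrative assumption, not the paper's exact protocol: the function names, the contact-height threshold, and the use of mean squared jerk for smoothness are all choices made here for clarity.

```python
def foot_sliding(foot_positions, foot_heights, contact_height=0.05):
    """Mean horizontal foot displacement between consecutive frames in
    which the foot is near the ground (i.e. should be planted).
    foot_positions holds (x, z) ground-plane coordinates per frame."""
    slide, contacts = 0.0, 0
    for i in range(1, len(foot_positions)):
        if foot_heights[i] < contact_height and foot_heights[i - 1] < contact_height:
            dx = foot_positions[i][0] - foot_positions[i - 1][0]
            dz = foot_positions[i][1] - foot_positions[i - 1][1]
            slide += (dx * dx + dz * dz) ** 0.5
            contacts += 1
    return slide / contacts if contacts else 0.0


def smoothness(joint_trajectory, dt):
    """Mean squared jerk (third time derivative of position) of a single
    joint coordinate; lower values indicate smoother motion."""
    jerk_sq, n = 0.0, 0
    for i in range(3, len(joint_trajectory)):
        # third-order backward finite difference:
        # x''' ~ (x_t - 3*x_{t-1} + 3*x_{t-2} - x_{t-3}) / dt^3
        j = (joint_trajectory[i] - 3 * joint_trajectory[i - 1]
             + 3 * joint_trajectory[i - 2] - joint_trajectory[i - 3]) / dt ** 3
        jerk_sq += j * j
        n += 1
    return jerk_sq / n if n else 0.0
```

A planted foot that does not move scores zero sliding, while any horizontal drift during ground contact is penalized; a constant-velocity trajectory has zero jerk and thus a perfect smoothness score.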
Related papers
- Spatial-Temporal Graph Diffusion Policy with Kinematic Modeling for Bimanual Robotic Manipulation [88.83749146867665]
Existing approaches learn a policy to predict a distant next-best end-effector pose.
They then compute the corresponding joint rotation angles for motion using inverse kinematics.
We propose Kinematics enhanced Spatial-TemporAl gRaph diffuser.
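The inverse-kinematics step mentioned above (joint angles computed from a target end-effector pose) has a well-known closed form in the simplest case. The following is a minimal sketch for a planar two-link arm, not the paper's method; robotic arms generally require numerical IK over many joints.

```python
import math

def two_link_ik(x, y, l1, l2):
    """Closed-form inverse kinematics for a planar two-link arm:
    returns (shoulder, elbow) angles that place the end effector at (x, y).
    Raises ValueError when the target is out of reach."""
    d2 = x * x + y * y
    # law of cosines for the elbow angle
    c2 = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= c2 <= 1.0:
        raise ValueError("target out of reach")
    theta2 = math.acos(c2)  # elbow-down solution
    k1, k2 = l1 + l2 * c2, l2 * math.sin(theta2)
    theta1 = math.atan2(y, x) - math.atan2(k2, k1)
    return theta1, theta2

def forward(theta1, theta2, l1, l2):
    """Forward kinematics, used here to verify the IK solution."""
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y
```

Running the forward model on the recovered angles reproduces the target position, which is the standard sanity check for any IK solver.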
arXiv Detail & Related papers (2025-03-13T17:48:35Z)
- Learning Speed-Adaptive Walking Agent Using Imitation Learning with Physics-Informed Simulation [0.0]
We create a skeletal humanoid agent capable of adapting to varying walking speeds while maintaining biomechanically realistic motions. The framework combines a synthetic data generator, which produces biomechanically plausible gait kinematics from open-source biomechanics data, and a training system that uses adversarial imitation learning to train the agent's walking policy.
arXiv Detail & Related papers (2024-12-05T07:55:58Z)
- MS-MANO: Enabling Hand Pose Tracking with Biomechanical Constraints [50.61346764110482]
We integrate a musculoskeletal system with a learnable parametric hand model, MANO, to create MS-MANO.
This model emulates the dynamics of muscles and tendons to drive the skeletal system, imposing physiologically realistic constraints on the resulting torque trajectories.
We also propose a simulation-in-the-loop pose refinement framework, BioPR, that refines the initial estimated pose through a multi-layer perceptron network.
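A pose-refinement MLP of the kind described above can be reduced to a residual predictor: the network maps the initial pose estimate to a correction that is added back to it. The one-hidden-layer sketch below is hypothetical (BioPR's actual layer sizes, inputs, and training are not reproduced here); it is written in plain Python to stay self-contained.

```python
def mlp_refine(pose, weights1, bias1, weights2, bias2):
    """One-hidden-layer MLP (ReLU activation) that predicts a residual
    correction added to the initial pose estimate. All parameters are
    plain lists of floats; weights1/weights2 are row-major matrices."""
    hidden = [max(0.0, sum(w * x for w, x in zip(row, pose)) + b)
              for row, b in zip(weights1, bias1)]
    delta = [sum(w * h for w, h in zip(row, hidden)) + b
             for row, b in zip(weights2, bias2)]
    # residual connection: refined pose = initial pose + predicted correction
    return [p + d for p, d in zip(pose, delta)]
```

The residual formulation has a useful property: a zero-initialized output layer leaves the initial estimate untouched, so refinement can only improve on it during training rather than having to relearn the pose from scratch.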
arXiv Detail & Related papers (2024-04-16T02:18:18Z)
- 3D Kinematics Estimation from Video with a Biomechanical Model and Synthetic Training Data [4.130944152992895]
We propose a novel biomechanics-aware network that directly outputs 3D kinematics from two input views.
Our experiments demonstrate that the proposed approach, only trained on synthetic data, outperforms previous state-of-the-art methods.
arXiv Detail & Related papers (2024-02-20T17:33:40Z)
- DiffuseBot: Breeding Soft Robots With Physics-Augmented Generative Diffusion Models [102.13968267347553]
We present DiffuseBot, a physics-augmented diffusion model that generates soft robot morphologies capable of excelling in a wide spectrum of tasks.
We showcase a range of simulated and fabricated robots along with their capabilities.
arXiv Detail & Related papers (2023-11-28T18:58:48Z)
- Priority-Centric Human Motion Generation in Discrete Latent Space [59.401128190423535]
We introduce a Priority-Centric Motion Discrete Diffusion Model (M2DM) for text-to-motion generation.
M2DM incorporates a global self-attention mechanism and a regularization term to counteract code collapse.
We also present a motion discrete diffusion model that employs an innovative noise schedule, determined by the significance of each motion token.
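M2DM's exact significance-weighted schedule is not given in this summary; the following hypothetical sketch shows one way a per-token noise schedule can depend on token significance in an absorbing-state discrete diffusion, where each token is independently masked with a step-dependent probability.

```python
def mask_probability(t, T, significance):
    """Probability that a motion token is corrupted (masked) by forward
    step t out of T. The exponent grows with the token's significance
    in [0, 1], so important tokens are noised later in the forward
    process and therefore denoised first in the reverse process.
    (Hypothetical schedule; M2DM's actual formulation may differ.)"""
    exponent = 1.0 + significance
    return (t / T) ** exponent
```

Any valid schedule of this shape starts at probability 0, reaches 1 at the final step for every token (so the forward process always ends fully masked), and at intermediate steps assigns lower corruption probability to more significant tokens.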
arXiv Detail & Related papers (2023-08-28T10:40:16Z)
- A Physics-Informed Low-Shot Learning For sEMG-Based Estimation of Muscle Force and Joint Kinematics [4.878073267556235]
Muscle force and joint kinematics estimation from surface electromyography (sEMG) are essential for real-time biomechanical analysis.
Recent advances in deep neural networks (DNNs) have shown the potential to improve biomechanical analysis in a fully automated and reproducible manner.
This paper presents a novel physics-informed low-shot learning method for sEMG-based estimation of muscle force and joint kinematics.
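Physics-informed training of this kind typically augments the data-fitting loss with a penalty on the residual of a governing equation. The combination below is the generic pattern only; the weighting and the particular muscle-dynamics residual used by the paper are not reproduced here.

```python
def physics_informed_loss(predictions, targets, physics_residuals, lam=0.1):
    """Mean-squared data error plus a weighted mean-squared physics
    residual (e.g. how far predicted forces violate a dynamics
    equation). lam trades off data fit against physical consistency."""
    n = len(predictions)
    data_term = sum((p - t) ** 2 for p, t in zip(predictions, targets)) / n
    physics_term = sum(r * r for r in physics_residuals) / len(physics_residuals)
    return data_term + lam * physics_term
```

The physics term acts as a regularizer that remains informative even with few labeled examples, which is what makes the low-shot setting viable.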
arXiv Detail & Related papers (2023-07-08T23:01:12Z)
- Skeleton2Humanoid: Animating Simulated Characters for Physically-plausible Motion In-betweening [59.88594294676711]
Modern deep learning based motion synthesis approaches barely consider the physical plausibility of synthesized motions.
We propose a system, "Skeleton2Humanoid", which performs physics-oriented motion correction at test time.
Experiments on the challenging LaFAN1 dataset show our system can outperform prior methods significantly in terms of both physical plausibility and accuracy.
arXiv Detail & Related papers (2022-10-09T16:15:34Z)
- Ultrafast viscosity measurement with ballistic optical tweezers [55.41644538483948]
Noninvasive viscosity measurements require integration times of seconds.
We demonstrate a four orders-of-magnitude improvement in speed, down to twenty microseconds.
We achieve this using the instantaneous velocity of a trapped particle in an optical tweezer.
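The link from a trapped particle's dynamics to viscosity rests on Stokes' law. The sketch below recovers viscosity from a measured drag coefficient and relates that drag to the velocity relaxation time probed in a ballistic measurement; it is a textbook-level illustration, not the authors' full estimator.

```python
import math

def viscosity_from_drag(gamma, radius):
    """Stokes' law for a sphere: gamma = 6*pi*eta*r, so
    eta = gamma / (6*pi*r). SI units throughout."""
    return gamma / (6 * math.pi * radius)

def drag_from_velocity_relaxation(mass, tau):
    """In the ballistic regime the particle's velocity decorrelates over
    the momentum relaxation time tau = m / gamma, so gamma = m / tau."""
    return mass / tau
```

For a 1 um-radius bead, a drag coefficient of 6*pi*1e-3*1e-6 N*s/m round-trips to the viscosity of water (about 1e-3 Pa*s), which is the consistency check used here.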
arXiv Detail & Related papers (2020-06-29T00:09:40Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed papers or information and is not responsible for any consequences of their use.