BioMoDiffuse: Physics-Guided Biomechanical Diffusion for Controllable and Authentic Human Motion Synthesis
- URL: http://arxiv.org/abs/2503.06151v1
- Date: Sat, 08 Mar 2025 10:22:36 GMT
- Title: BioMoDiffuse: Physics-Guided Biomechanical Diffusion for Controllable and Authentic Human Motion Synthesis
- Authors: Zixi Kang, Xinghan Wang, Yadong Mu
- Abstract summary: This paper introduces BioMoDiffuse, a novel biomechanics-aware diffusion framework. It features three key innovations: (1) A lightweight biodynamic network that integrates muscle electromyography (EMG) signals with acceleration constraints; (2) A physics-guided diffusion process that incorporates real-time biomechanical verification via modified Euler-Lagrange equations; and (3) A decoupled control mechanism that allows independent regulation of motion speed and semantic context.
- Score: 21.750804738752105
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Human motion generation holds significant promise in fields such as animation, film production, and robotics. However, existing methods often fail to produce physically plausible movements that adhere to biomechanical principles. While recent autoregressive and diffusion models have improved visual quality, they frequently overlook essential biodynamic features, such as muscle activation patterns and joint coordination, leading to motions that either violate physical laws or lack controllability. This paper introduces BioMoDiffuse, a novel biomechanics-aware diffusion framework that addresses these limitations. It features three key innovations: (1) A lightweight biodynamic network that integrates muscle electromyography (EMG) signals and kinematic features with acceleration constraints, (2) A physics-guided diffusion process that incorporates real-time biomechanical verification via modified Euler-Lagrange equations, and (3) A decoupled control mechanism that allows independent regulation of motion speed and semantic context. We also propose a set of comprehensive evaluation protocols that combines traditional metrics (FID, R-precision, etc.) with new biomechanical criteria (smoothness, foot sliding, floating, etc.). Our approach bridges the gap between data-driven motion synthesis and biomechanical authenticity, establishing new benchmarks for physically accurate motion generation.
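To make the physics-guided step concrete, here is a minimal sketch of how an Euler-Lagrange-style residual could steer a diffusion sampler. Everything below is hypothetical scaffolding assumed for illustration (the `denoiser` callable, the diagonal-mass toy dynamics, the guidance weight); it is not the authors' released implementation.

```python
import torch

def finite_diff(q, dt):
    """Central-difference velocity and acceleration of a (T, D) joint trajectory."""
    qd = (q[2:] - q[:-2]) / (2 * dt)
    qdd = (q[2:] - 2 * q[1:-1] + q[:-2]) / dt ** 2
    return qd, qdd

def el_residual(q, dt, mass=1.0, damping=0.1):
    """Toy Euler-Lagrange residual ||M qdd + C qd||^2 under a diagonal mass
    matrix, linear damping, and zero applied torque -- a stand-in for the
    paper's modified Euler-Lagrange verification, not its actual dynamics."""
    qd, qdd = finite_diff(q, dt)
    return (mass * qdd + damping * qd).pow(2).mean()

def guided_step(denoiser, x_t, t, dt, weight=0.1):
    """One denoising step with physics guidance: differentiate the residual of
    the predicted clean motion w.r.t. the noisy sample and nudge the sample
    downhill before the usual posterior sampling (omitted for brevity)."""
    x_t = x_t.detach().requires_grad_(True)
    x0_pred = denoiser(x_t, t)  # assumed to predict the clean motion x0
    grad = torch.autograd.grad(el_residual(x0_pred, dt), x_t)[0]
    return (x_t - weight * grad).detach()
```

The guidance weight trades the learned motion prior against biomechanical plausibility; set too high, it will suppress the semantic content that the text conditioning asks for.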
Related papers
- Towards Immersive Human-X Interaction: A Real-Time Framework for Physically Plausible Motion Synthesis [51.95817740348585]
Human-X is a novel framework designed to enable immersive and physically plausible human interactions across diverse entities. Our method jointly predicts actions and reactions in real-time using an auto-regressive reaction diffusion planner. Our framework is validated in real-world applications, including a virtual reality interface for human-robot interaction.
arXiv Detail & Related papers (2025-08-04T06:35:48Z)
- Half-Physics: Enabling Kinematic 3D Human Model with Physical Interactions [88.01918532202716]
We introduce a novel approach that embeds SMPL-X into a tangible entity capable of dynamic physical interactions with its surroundings. Our approach maintains kinematic control over inherent SMPL-X poses while ensuring physically plausible interactions with scenes and objects. Unlike reinforcement learning-based methods, which demand extensive and complex training, our half-physics method is learning-free and generalizes to any body shape and motion.
arXiv Detail & Related papers (2025-07-31T17:58:33Z)
- KinTwin: Imitation Learning with Torque and Muscle Driven Biomechanical Models Enables Precise Replication of Able-Bodied and Impaired Movement from Markerless Motion Capture [2.44755919161855]
High-quality movement analysis could greatly benefit movement science and rehabilitation. We show the potential for using imitation learning to enable high-quality movement analysis in clinical practice.
arXiv Detail & Related papers (2025-05-19T17:58:03Z)
- GENMO: A GENeralist Model for Human MOtion [64.16188966024542]
We present GENMO, a unified Generalist Model for Human Motion that bridges motion estimation and generation in a single framework. Our key insight is to reformulate motion estimation as constrained motion generation, where the output motion must precisely satisfy observed conditioning signals. Our novel architecture handles variable-length motions and mixed multimodal conditions (text, audio, video) at different time intervals, offering flexible control.
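A generic way to realize "estimation as constrained generation" is the inpainting-style replacement trick sketched below; the names are hypothetical, and GENMO's actual conditioning architecture is richer than this.

```python
import torch

def constrained_step(x_t, x_obs, obs_mask, noise_level):
    """Keep a diffusion sample consistent with observations by overwriting the
    observed entries (obs_mask is a boolean tensor marking them) with a
    correspondingly noised copy of the observation. A generic inpainting
    trick shown for illustration, not GENMO's actual mechanism."""
    noised_obs = x_obs + noise_level * torch.randn_like(x_obs)
    return torch.where(obs_mask, noised_obs, x_t)
```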
arXiv Detail & Related papers (2025-05-02T17:59:55Z)
- Reinforcement learning-based motion imitation for physiologically plausible musculoskeletal motor control [47.423243831156285]
We present a model-free motion imitation framework (KINESIS) to advance the understanding of muscle-based motor control. We demonstrate that KINESIS achieves strong imitation performance on 1.9 hours of motion capture data. KINESIS generates muscle activity patterns that correlate well with human EMG activity.
arXiv Detail & Related papers (2025-03-18T18:37:49Z)
- Spatial-Temporal Graph Diffusion Policy with Kinematic Modeling for Bimanual Robotic Manipulation [88.83749146867665]
Existing approaches learn a policy to predict a distant next-best end-effector pose.
They then compute the corresponding joint rotation angles for motion using inverse kinematics.
We propose Kinematics enhanced Spatial-TemporAl gRaph diffuser.
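For readers unfamiliar with the inverse-kinematics step this pipeline moves away from, a textbook planar two-link example is sketched below; it is illustrative only, with far fewer degrees of freedom than a bimanual robot.

```python
import math

def two_link_ik(x, y, l1=1.0, l2=1.0):
    """Analytic inverse kinematics for a planar two-link arm: given a target
    end-effector position (x, y), return the elbow-down joint angles (rad)."""
    c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if abs(c2) > 1:
        raise ValueError("target out of reach")
    theta2 = math.acos(c2)
    theta1 = math.atan2(y, x) - math.atan2(l2 * math.sin(theta2),
                                           l1 + l2 * math.cos(theta2))
    return theta1, theta2
```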
arXiv Detail & Related papers (2025-03-13T17:48:35Z)
- Learning Speed-Adaptive Walking Agent Using Imitation Learning with Physics-Informed Simulation [0.0]
We create a skeletal humanoid agent capable of adapting to varying walking speeds while maintaining biomechanically realistic motions. The framework combines a synthetic data generator, which produces biomechanically plausible gait kinematics from open-source biomechanics data, and a training system that uses adversarial imitation learning to train the agent's walking policy.
arXiv Detail & Related papers (2024-12-05T07:55:58Z)
- I-CTRL: Imitation to Control Humanoid Robots Through Constrained Reinforcement Learning [8.97654258232601]
We develop a framework to control humanoid robots through bounded residual reinforcement learning (I-CTRL). I-CTRL excels in motion imitation with simple and unique rewards that generalize across five robots. Our framework introduces an automatic priority scheduler to manage large-scale motion datasets.
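The "bounded residual" idea fits in a few lines; this is a generic rendering with made-up names, not I-CTRL's interface.

```python
import torch

def bounded_residual_action(policy, state, a_ref, bound=0.1):
    """Add a learned residual, clipped to a small bound, on top of a reference
    action (e.g., retargeted human motion). The small bound preserves the
    reference style while letting RL restore physical feasibility."""
    return a_ref + torch.clamp(policy(state), -bound, bound)
```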
arXiv Detail & Related papers (2024-05-14T16:12:27Z)
- MS-MANO: Enabling Hand Pose Tracking with Biomechanical Constraints [50.61346764110482]
We integrate a musculoskeletal system with a learnable parametric hand model, MANO, to create MS-MANO.
This model emulates the dynamics of muscles and tendons to drive the skeletal system, imposing physiologically realistic constraints on the resulting torque trajectories.
We also propose a simulation-in-the-loop pose refinement framework, BioPR, that refines the initial estimated pose through a multi-layer perceptron network.
arXiv Detail & Related papers (2024-04-16T02:18:18Z)
- 3D Kinematics Estimation from Video with a Biomechanical Model and Synthetic Training Data [4.130944152992895]
We propose a novel biomechanics-aware network that directly outputs 3D kinematics from two input views.
Our experiments demonstrate that the proposed approach, trained only on synthetic data, outperforms previous state-of-the-art methods.
arXiv Detail & Related papers (2024-02-20T17:33:40Z)
- DiffuseBot: Breeding Soft Robots With Physics-Augmented Generative Diffusion Models [102.13968267347553]
We present DiffuseBot, a physics-augmented diffusion model that generates soft robot morphologies capable of excelling in a wide spectrum of tasks.
We showcase a range of simulated and fabricated robots along with their capabilities.
arXiv Detail & Related papers (2023-11-28T18:58:48Z)
- DROP: Dynamics Responses from Human Motion Prior and Projective Dynamics [21.00283279991885]
We introduce DROP, a novel framework for modeling Dynamics Responses of humans using generative mOtion prior and Projective dynamics.
We conduct extensive evaluations of our model across different motion tasks and various physical perturbations, demonstrating the scalability and diversity of responses.
arXiv Detail & Related papers (2023-09-24T20:25:59Z)
- Priority-Centric Human Motion Generation in Discrete Latent Space [59.401128190423535]
We introduce a Priority-Centric Motion Discrete Diffusion Model (M2DM) for text-to-motion generation.
M2DM incorporates a global self-attention mechanism and a regularization term to counteract code collapse.
We also present a motion discrete diffusion model that employs an innovative noise schedule, determined by the significance of each motion token.
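As a toy illustration of a significance-driven corruption schedule for discrete motion tokens (the real M2DM schedule is defined in the paper), one can mask the least important tokens first:

```python
import torch

def priority_corrupt(tokens, priority, t, T, mask_id=0):
    """Corrupt a 1-D token sequence at diffusion step t of T by masking the
    lowest-priority tokens first, so the most salient tokens survive longest."""
    k = int(round(t / T * tokens.numel()))
    order = torch.argsort(priority)  # indices from least to most significant
    corrupted = tokens.clone()
    corrupted[order[:k]] = mask_id
    return corrupted
```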
arXiv Detail & Related papers (2023-08-28T10:40:16Z)
- A Physics-Informed Low-Shot Learning For sEMG-Based Estimation of Muscle Force and Joint Kinematics [4.878073267556235]
Muscle force and joint kinematics estimation from surface electromyography (sEMG) are essential for real-time biomechanical analysis.
Recent advances in deep neural networks (DNNs) have shown the potential to improve biomechanical analysis in a fully automated and reproducible manner.
This paper presents a novel physics-informed low-shot learning method for sEMG-based estimation of muscle force and joint kinematics.
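Such objectives typically take the shape of a supervised term on the few labeled samples plus a physics residual that regularizes the rest; a hedged sketch, with `residual_fn` as a placeholder standing in for the paper's embedded muscle-dynamics model:

```python
import torch
import torch.nn.functional as F

def physics_informed_loss(pred, label, residual_fn, weight=1.0):
    """Low-shot objective: data fit on scarce labels plus a differentiable
    physics residual penalizing predictions that violate the assumed
    muscle/joint dynamics."""
    return F.mse_loss(pred, label) + weight * residual_fn(pred).pow(2).mean()
```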
arXiv Detail & Related papers (2023-07-08T23:01:12Z)
- Skeleton2Humanoid: Animating Simulated Characters for Physically-plausible Motion In-betweening [59.88594294676711]
Modern deep learning based motion synthesis approaches barely consider the physical plausibility of synthesized motions.
We propose a system "Skeleton2Humanoid" which performs physics-oriented motion correction at test time.
Experiments on the challenging LaFAN1 dataset show our system can outperform prior methods significantly in terms of both physical plausibility and accuracy.
arXiv Detail & Related papers (2022-10-09T16:15:34Z)
- Ultrafast viscosity measurement with ballistic optical tweezers [55.41644538483948]
Noninvasive viscosity measurements require integration times of seconds.
We demonstrate a four orders-of-magnitude improvement in speed, down to twenty microseconds.
We achieve this using the instantaneous velocity of a trapped particle in an optical tweezer.
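The link from instantaneous velocity to viscosity is classical: at short (ballistic) times, and neglecting the trap stiffness on those timescales, the velocity autocorrelation of a Brownian particle decays with the momentum relaxation time set by Stokes drag, so measuring that decay gives the viscosity directly.

```latex
\langle v(0)\,v(t)\rangle = \frac{k_B T}{m}\, e^{-t/\tau_p},
\qquad
\tau_p = \frac{m}{6\pi \eta a}
\;\;\Longrightarrow\;\;
\eta = \frac{m}{6\pi a\, \tau_p}
```

Here m is the particle mass, a its radius, and η the fluid viscosity; for a micron-scale bead in water, τ_p is well under a microsecond, which is what makes measurement on microsecond timescales possible.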
arXiv Detail & Related papers (2020-06-29T00:09:40Z)
This list is automatically generated from the titles and abstracts of the papers on this site. The site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.