One-shot Humanoid Whole-body Motion Learning
- URL: http://arxiv.org/abs/2510.25241v1
- Date: Wed, 29 Oct 2025 07:48:10 GMT
- Title: One-shot Humanoid Whole-body Motion Learning
- Authors: Hao Huang, Geeta Chandra Raju Bethala, Shuaihang Yuan, Congcong Wen, Anthony Tzes, Yi Fang
- Abstract summary: Existing methods typically require multiple training samples per motion category. We propose a novel approach that trains effective humanoid motion policies using only a single non-walking target motion sample.
- Score: 18.375746497945023
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Whole-body humanoid motion represents a cornerstone challenge in robotics, integrating balance, coordination, and adaptability to enable human-like behaviors. However, existing methods typically require multiple training samples per motion category, rendering the collection of high-quality human motion datasets both labor-intensive and costly. To address this, we propose a novel approach that trains effective humanoid motion policies using only a single non-walking target motion sample alongside readily available walking motions. The core idea is to leverage order-preserving optimal transport to compute distances between walking and non-walking sequences, then interpolate along geodesics to generate new intermediate pose skeletons. These skeletons are optimized for collision-free configurations and retargeted to the humanoid before integration into a simulated environment for policy training via reinforcement learning. Experimental evaluations on the CMU MoCap dataset demonstrate that our method consistently outperforms baselines, achieving superior performance across metrics. Code will be released upon acceptance.
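The core pipeline step — an order-preserving transport between two motion sequences, followed by interpolation toward the matched frames — can be sketched as below. This is a minimal illustration in the spirit of temporally regularized Sinkhorn transport, not the paper's released code: the hyperparameters (`lam1`, `lam2`, `delta`) are illustrative, and plain linear interpolation stands in for the paper's geodesic interpolation on the pose manifold.

```python
import numpy as np

def opw_distance(X, Y, lam1=1.0, lam2=0.5, delta=1.0, n_iters=200):
    """Simplified order-preserving Wasserstein distance between two pose
    sequences X (N x D) and Y (M x D). Hyperparameters are illustrative."""
    N, M = len(X), len(Y)
    # Pairwise frame-to-frame squared cost, normalized to [0, 1] for stability.
    C = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    C = C / max(C.max(), 1e-9)
    # Normalized temporal position of each frame in its sequence.
    i = (np.arange(N)[:, None] + 1.0) / N
    j = (np.arange(M)[None, :] + 1.0) / M
    # Order-preserving priors: a Gaussian band around the diagonal plus an
    # inverse-difference bonus for temporally aligned matches.
    P = np.exp(-((i - j) ** 2) / (2.0 * delta ** 2))
    S = lam1 / ((i - j) ** 2 + 1.0)
    K = P * np.exp((S - C) / lam2)
    # Sinkhorn iterations with uniform marginals.
    a = np.full(N, 1.0 / N)
    b = np.full(M, 1.0 / M)
    u = np.ones(N) / N
    for _ in range(n_iters):
        v = b / (K.T @ u)
        u = a / (K @ v)
    T = u[:, None] * K * v[None, :]  # transport plan; entries sum to ~1
    return float((T * C).sum()), T

def interpolate_poses(X, Y, T, alpha=0.5):
    """Move each frame of X a fraction alpha toward its barycentric match in
    Y under transport plan T. Linear blending here is a stand-in for true
    geodesic interpolation in rotation space."""
    Y_match = (T @ Y) / T.sum(axis=1, keepdims=True)
    return (1.0 - alpha) * X + alpha * Y_match
```

Sweeping `alpha` from 0 to 1 yields a family of intermediate sequences between the walking source and the single non-walking target, which the paper then makes collision-free and retargets before RL training.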
Related papers
- MeshMimic: Geometry-Aware Humanoid Motion Learning through 3D Scene Reconstruction [54.36564144414704]
MeshMimic is an innovative framework that bridges 3D scene reconstruction and embodied intelligence to enable humanoid robots to learn coupled "motion-terrain" interactions directly from video. By leveraging state-of-the-art 3D vision models, the framework precisely segments and reconstructs both human trajectories and the underlying 3D geometry of terrains and objects.
arXiv Detail & Related papers (2026-02-17T17:09:45Z) - ResMimic: From General Motion Tracking to Humanoid Whole-body Loco-Manipulation via Residual Learning [59.64325421657381]
Humanoid whole-body loco-manipulation promises transformative capabilities for daily service and warehouse tasks. We introduce ResMimic, a two-stage residual learning framework for precise and expressive humanoid control from human motion data. Results show substantial gains in task success, training efficiency, and robustness over strong baselines.
arXiv Detail & Related papers (2025-10-06T17:47:02Z) - OmniRetarget: Interaction-Preserving Data Generation for Humanoid Whole-Body Loco-Manipulation and Scene Interaction [76.44108003274955]
A dominant paradigm for teaching humanoid robots complex skills is to retarget human motions as kinematic references to train reinforcement learning policies. We introduce OmniRetarget, an interaction-preserving data generation engine based on an interaction mesh. By minimizing the Laplacian deformation between the human and robot meshes, OmniRetarget generates kinematically feasible trajectories.
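The Laplacian-deformation idea in the summary above can be sketched with a toy graph Laplacian: solve for new vertex positions that preserve the differential coordinates of the original mesh while softly pinning a few anchor vertices to targets. This least-squares formulation and the weight `w` are an illustrative simplification for intuition, not OmniRetarget's actual interaction-mesh objective.

```python
import numpy as np

def laplacian_retarget(V, edges, anchors, anchor_pos, w=10.0):
    """Toy Laplacian deformation. V: (n, d) vertex positions; edges: list of
    (i, j) pairs; anchors: vertex indices pinned (softly, with weight w) to
    the target positions anchor_pos. Returns deformed positions."""
    n = len(V)
    # Build the unweighted graph Laplacian of the mesh.
    L = np.zeros((n, n))
    for i, j in edges:
        L[i, i] += 1.0
        L[j, j] += 1.0
        L[i, j] -= 1.0
        L[j, i] -= 1.0
    delta = L @ V  # differential coordinates to preserve
    # Stack the Laplacian-preservation rows and weighted anchor rows.
    rows, rhs = [L], [delta]
    for k, p in zip(anchors, anchor_pos):
        row = np.zeros((1, n))
        row[0, k] = w
        rows.append(row)
        rhs.append(w * np.asarray(p, dtype=float)[None, :])
    A = np.vstack(rows)
    b = np.vstack(rhs)
    # Least-squares solve: keep local shape while meeting the anchors.
    V_new, *_ = np.linalg.lstsq(A, b, rcond=None)
    return V_new
```

Stretching the anchors of a 3-vertex chain, for example, drags the free middle vertex along so that its Laplacian (local-shape) coordinates change as little as possible, which is the intuition behind interaction-preserving retargeting.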
arXiv Detail & Related papers (2025-09-30T17:59:02Z) - CEDex: Cross-Embodiment Dexterous Grasp Generation at Scale from Human-like Contact Representations [53.37721117405022]
Cross-embodiment dexterous grasp synthesis refers to adaptively generating and optimizing grasps for various robotic hands. We propose CEDex, a novel cross-embodiment dexterous grasp synthesis method at scale. We construct the largest cross-embodiment grasp dataset to date, comprising 500K objects across four hand types with 20M total grasps.
arXiv Detail & Related papers (2025-09-29T12:08:04Z) - Aligning Human Motion Generation with Human Perceptions [51.831338643012444]
We propose a data-driven approach to bridge the gap by introducing a large-scale human perceptual evaluation dataset, MotionPercept, and a human motion critic model, MotionCritic. Our critic model offers a more accurate metric for assessing motion quality and can be readily integrated into the motion generation pipeline.
arXiv Detail & Related papers (2024-07-02T14:01:59Z) - InterControl: Zero-shot Human Interaction Generation by Controlling Every Joint [67.6297384588837]
We introduce a novel controllable motion generation method, InterControl, to encourage synthesized motions to maintain the desired distance between joint pairs.
We demonstrate that the distance between joint pairs for human-human interactions can be generated using an off-the-shelf Large Language Model.
arXiv Detail & Related papers (2023-11-27T14:32:33Z) - Enhanced Human-Robot Collaboration using Constrained Probabilistic Human-Motion Prediction [5.501477817904299]
We propose a novel human motion prediction framework that incorporates human joint constraints and scene constraints.
It is tested on a human arm kinematic model and implemented on a human-robot collaborative setup with a UR5 robot arm.
arXiv Detail & Related papers (2023-10-05T05:12:14Z) - Learning Human Motion Prediction via Stochastic Differential Equations [19.30774202476477]
We propose a novel approach in modeling the motion prediction problem based on differential equations and path integrals.
It achieves a 12.48% accuracy improvement over current state-of-the-art methods on average.
arXiv Detail & Related papers (2021-12-21T11:55:13Z) - Improving Human Motion Prediction Through Continual Learning [2.720960618356385]
Human motion prediction is an essential component for enabling closer human-robot collaboration.
The task is compounded by the variability of human motion, both at a skeletal level due to varying human body sizes and at a motion level due to individual movement idiosyncrasies.
We propose a modular sequence learning approach that allows end-to-end training while also having the flexibility of being fine-tuned.
arXiv Detail & Related papers (2021-07-01T15:34:41Z)
This list is automatically generated from the titles and abstracts of the papers in this site.