UniFucGrasp: Human-Hand-Inspired Unified Functional Grasp Annotation Strategy and Dataset for Diverse Dexterous Hands
- URL: http://arxiv.org/abs/2508.03339v1
- Date: Tue, 05 Aug 2025 11:37:38 GMT
- Title: UniFucGrasp: Human-Hand-Inspired Unified Functional Grasp Annotation Strategy and Dataset for Diverse Dexterous Hands
- Authors: Haoran Lin, Wenrui Chen, Xianchi Chen, Fan Yang, Qiang Diao, Wenxin Xie, Sijie Wu, Kailun Yang, Maojun Li, Yaonan Wang
- Abstract summary: Dexterous grasp datasets are vital for embodied intelligence, but mostly ignore functional grasps needed for tasks like opening bottle caps or holding cup handles. We establish UniFucGrasp, a universal functional grasp annotation strategy and dataset for multiple dexterous hand types. Based on biomimicry, it maps natural human motions to diverse hand structures and uses geometry-based force closure to ensure functional, stable, human-like grasps.
- Score: 21.591446861018238
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Dexterous grasp datasets are vital for embodied intelligence, but mostly emphasize grasp stability, ignoring functional grasps needed for tasks like opening bottle caps or holding cup handles. Most rely on bulky, costly, and hard-to-control high-DOF Shadow Hands. Inspired by the human hand's underactuated mechanism, we establish UniFucGrasp, a universal functional grasp annotation strategy and dataset for multiple dexterous hand types. Based on biomimicry, it maps natural human motions to diverse hand structures and uses geometry-based force closure to ensure functional, stable, human-like grasps. This method supports low-cost, efficient collection of diverse, high-quality functional grasps. Finally, we establish the first multi-hand functional grasp dataset and provide a synthesis model to validate its effectiveness. Experiments on the UFG dataset, IsaacSim, and complex robotic tasks show that our method improves functional manipulation accuracy and grasp stability, enables efficient generalization across diverse robotic hands, and overcomes annotation cost and generalization challenges in dexterous grasping. The project page is at https://haochen611.github.io/UFG.
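The abstract's "geometry-based force closure" criterion can be illustrated with a minimal sketch. The following is a hypothetical 2D antipodal-grasp check (two contacts with Coulomb friction), a common textbook simplification of force closure; the function name, the two-contact restriction, and the planar setting are illustrative assumptions, not the paper's actual method:

```python
import math

def antipodal_force_closure(p1, n1, p2, n2, mu):
    """2D antipodal force-closure test: the grasp is force-closed when
    the line connecting the two contact points lies inside both friction
    cones, whose half-angle is atan(mu) around each inward normal."""
    half_angle = math.atan(mu)

    def in_cone(p_from, p_to, normal):
        # Direction from one contact toward the other.
        d = (p_to[0] - p_from[0], p_to[1] - p_from[1])
        norm_d = math.hypot(d[0], d[1])
        norm_n = math.hypot(normal[0], normal[1])
        # Angle between the connecting line and the contact normal.
        cos_t = (d[0] * normal[0] + d[1] * normal[1]) / (norm_d * norm_n)
        cos_t = max(-1.0, min(1.0, cos_t))
        return math.acos(cos_t) <= half_angle

    return in_cone(p1, p2, n1) and in_cone(p2, p1, n2)

# Pinching a block from opposite sides: closure holds.
print(antipodal_force_closure((-1, 0), (1, 0), (1, 0), (-1, 0), mu=0.3))
# A contact normal perpendicular to the pinch line: closure fails.
print(antipodal_force_closure((-1, 0), (0, 1), (1, 0), (-1, 0), mu=0.3))
```

Multi-finger variants generalize this idea by testing whether the origin lies strictly inside the convex hull of the contact wrenches, which is presumably closer to what a full annotation pipeline would use.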
Related papers
- Is Diversity All You Need for Scalable Robotic Manipulation? [50.747150672933316]
We investigate the nuanced role of data diversity in robot learning by examining three critical dimensions, task (what to do), embodiment (which robot to use), and expert (who demonstrates), challenging the conventional intuition of "more diverse is better". We show that task diversity proves more critical than per-task demonstration quantity, benefiting transfer from diverse pre-training tasks to novel downstream scenarios. We propose a distribution debiasing method to mitigate velocity ambiguity; the resulting GO-1-Pro achieves substantial performance gains of 15%, equivalent to using 2.5 times the pre-training data.
arXiv Detail & Related papers (2025-07-08T17:52:44Z) - DexGarmentLab: Dexterous Garment Manipulation Environment with Generalizable Policy [74.9519138296936]
Garment manipulation is a critical challenge due to the diversity in garment categories, geometries, and deformations. We propose DexGarmentLab, the first environment specifically designed for dexterous (especially bimanual) garment manipulation. It features large-scale high-quality 3D assets for 15 task scenarios, and refines simulation techniques tailored for garment modeling to reduce the sim-to-real gap.
arXiv Detail & Related papers (2025-05-16T09:26:59Z) - DexGrasp Anything: Towards Universal Robotic Dexterous Grasping with Physics Awareness [38.310226324389596]
A dexterous hand capable of grasping any object is essential for the development of general-purpose embodied robots. We introduce DexGrasp Anything, a method that integrates physical constraints into the training and sampling phases of a diffusion-based generative model. We present a new dexterous grasping dataset containing over 3.4 million diverse grasping poses for more than 15k different objects.
arXiv Detail & Related papers (2025-03-11T10:21:50Z) - 3HANDS Dataset: Learning from Humans for Generating Naturalistic Handovers with Supernumerary Robotic Limbs [64.99122701615151]
Supernumerary robotic limbs (SRLs) are robotic structures integrated closely with the user's body. We present 3HANDS, a novel dataset of object handover interactions between a participant performing a daily activity and another participant enacting a hip-mounted SRL in a naturalistic manner. We present three models: one that generates naturalistic handover trajectories, one that determines the appropriate handover endpoints, and a third that predicts the moment to initiate a handover.
arXiv Detail & Related papers (2025-03-06T17:23:55Z) - FunHOI: Annotation-Free 3D Hand-Object Interaction Generation via Functional Text Guidance [9.630837159704004]
Hand-object interaction (HOI) is the fundamental link between humans and their environment. Despite advances in AI and robotics, capturing the semantics of functional grasping tasks remains a considerable challenge. We propose an innovative two-stage framework, Functional Grasp Synthesis Net (FGS-Net), for generating 3D HOI driven by functional text.
arXiv Detail & Related papers (2025-02-28T07:42:54Z) - DexterityGen: Foundation Controller for Unprecedented Dexterity [67.15251368211361]
Teaching robots dexterous manipulation skills, such as tool use, presents a significant challenge. Current approaches can be broadly categorized into two strategies: human teleoperation (for imitation learning) and sim-to-real reinforcement learning. We introduce DexterityGen, which uses RL to pretrain large-scale dexterous motion primitives, such as in-hand rotation or translation. In the real world, we use human teleoperation as a prompt to the controller to produce highly dexterous behavior.
arXiv Detail & Related papers (2025-02-06T18:49:35Z) - FunGrasp: Functional Grasping for Diverse Dexterous Hands [8.316017819784603]
We introduce FunGrasp, a system that enables functional dexterous grasping across various robot hands.
To achieve robust sim-to-real transfer, we employ several techniques including privileged learning, system identification, domain randomization, and gravity compensation.
arXiv Detail & Related papers (2024-11-24T07:30:54Z) - Learning Granularity-Aware Affordances from Human-Object Interaction for Tool-Based Functional Dexterous Grasping [27.124273762587848]
Affordance features of objects serve as a bridge in the functional interaction between agents and objects. We propose a granularity-aware affordance feature extraction method for locating functional affordance areas. We use highly activated coarse-grained affordance features in hand-object interaction regions to predict grasp gestures. This forms GAAF-Dex, a complete framework that learns Granularity-Aware Affordances from human-object interaction.
arXiv Detail & Related papers (2024-06-30T07:42:57Z) - RealDex: Towards Human-like Grasping for Robotic Dexterous Hand [64.33746404551343]
We introduce RealDex, a pioneering dataset capturing authentic dexterous hand grasping motions infused with human behavioral patterns. RealDex holds immense promise for advancing humanoid robots toward automated perception, cognition, and manipulation in real-world scenarios.
arXiv Detail & Related papers (2024-02-21T14:59:46Z) - DexGraspNet: A Large-Scale Robotic Dexterous Grasp Dataset for General Objects Based on Simulation [10.783992625475081]
We present a large-scale simulated dataset, DexGraspNet, for robotic dexterous grasping.
We use ShadowHand, a dexterous gripper commonly seen in robotics, to generate 1.32 million grasps for 5355 objects.
Compared to the previous dataset generated by GraspIt!, our dataset has not only more objects and grasps, but also higher diversity and quality.
arXiv Detail & Related papers (2022-10-06T06:09:16Z) - Model Predictive Control for Fluid Human-to-Robot Handovers [50.72520769938633]
Existing human-to-robot handover pipelines do not plan motions that take human comfort into account.
We propose to generate smooth motions via an efficient model-predictive control framework.
We conduct human-to-robot handover experiments on a diverse set of objects with several users.
arXiv Detail & Related papers (2022-03-31T23:08:20Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.