Design and Control of Roller Grasper V2 for In-Hand Manipulation
- URL: http://arxiv.org/abs/2004.08499v2
- Date: Tue, 17 Nov 2020 23:16:45 GMT
- Title: Design and Control of Roller Grasper V2 for In-Hand Manipulation
- Authors: Shenli Yuan, Lin Shao, Connor L. Yako, Alex Gruebele, and J. Kenneth Salisbury
- Abstract summary: We present a novel non-anthropomorphic robot grasper with the ability to manipulate objects by means of active surfaces at the fingertips.
Active surfaces are achieved by spherical rolling fingertips with two degrees of freedom (DoF).
A further DoF is in the base of each finger, allowing the fingers to grasp objects over a range of sizes and shapes.
- Score: 6.064252790182275
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The ability to perform in-hand manipulation still remains an unsolved
problem; having this capability would allow robots to perform sophisticated
tasks requiring repositioning and reorienting of grasped objects. In this work,
we present a novel non-anthropomorphic robot grasper with the ability to
manipulate objects by means of active surfaces at the fingertips. Active
surfaces are achieved by spherical rolling fingertips with two degrees of
freedom (DoF): a pivoting motion for surface reorientation and a continuous
rolling motion for moving the object. A further DoF is in the base of each
finger, allowing the fingers to grasp objects over a range of sizes and
shapes. The instantaneous kinematics were derived, and objects were
successfully manipulated both with a custom handcrafted control scheme and
with one learned through imitation learning, in simulation and
experimentally on the hardware.
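To make the fingertip DoFs concrete, here is a minimal Python sketch of the instantaneous-kinematics idea: the pivot DoF reorients the rolling axis about the contact normal, and the roll DoF imparts a tangential surface velocity of magnitude (roll rate x ball radius) at the contact. The function name, frame conventions, and parameters are illustrative assumptions, not the paper's notation.

```python
import numpy as np

def contact_velocity(pivot_angle, roll_rate, ball_radius, contact_normal):
    """Linear velocity imparted to a grasped object at one contact point.

    A minimal sketch (not the paper's derivation): the roller spins at
    `roll_rate` (rad/s) about a tangential axis that the pivot DoF has
    rotated about the contact normal by `pivot_angle` (rad).
    """
    n = np.asarray(contact_normal, dtype=float)
    n /= np.linalg.norm(n)                      # outward normal at contact

    # Reference rolling axis: any unit vector tangent to the contact.
    t = np.cross(n, [0.0, 0.0, 1.0])
    if np.linalg.norm(t) < 1e-9:                # normal is parallel to z
        t = np.cross(n, [0.0, 1.0, 0.0])
    t /= np.linalg.norm(t)

    # Pivot DoF: rotate the rolling axis about the normal (Rodrigues).
    axis = t * np.cos(pivot_angle) + np.cross(n, t) * np.sin(pivot_angle)

    omega = roll_rate * axis                    # roller angular velocity
    r_vec = ball_radius * n                     # center-to-contact vector
    return np.cross(omega, r_vec)               # v = omega x r
```

For example, with a 20 mm roller, a roll rate of 2 rad/s, and the pivot at 45 degrees, `contact_velocity(np.pi / 4, 2.0, 0.02, np.array([0.0, 1.0, 0.0]))` yields a 40 mm/s tangential velocity whose direction depends on the pivot angle; stacking such contact velocities across the three fingers is the intuition behind the object-velocity mapping the paper derives.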
Related papers
- FunGrasp: Functional Grasping for Diverse Dexterous Hands [8.316017819784603]
We introduce FunGrasp, a system that enables functional dexterous grasping across various robot hands.
To achieve robust sim-to-real transfer, we employ several techniques including privileged learning, system identification, domain randomization, and gravity compensation.
arXiv Detail & Related papers (2024-11-24T07:30:54Z)
- AnyRotate: Gravity-Invariant In-Hand Object Rotation with Sim-to-Real Touch [9.606323817785114]
We present AnyRotate, a system for gravity-invariant multi-axis in-hand object rotation using dense featured sim-to-real touch.
Our formulation allows the training of a unified policy to rotate unseen objects about arbitrary rotation axes in any hand direction.
Rich multi-fingered tactile sensing can detect unstable grasps and provide a reactive behavior that improves the robustness of the policy.
arXiv Detail & Related papers (2024-05-12T22:51:35Z)
- Twisting Lids Off with Two Hands [82.21668778600414]
We show how policies trained in simulation can be effectively and efficiently transferred to the real world.
Specifically, we consider the problem of twisting lids of various bottle-like objects with two hands.
This is the first sim-to-real RL system that enables such capabilities on bimanual multi-fingered hands.
arXiv Detail & Related papers (2024-03-04T18:59:30Z)
- Grasp Multiple Objects with One Hand [44.18611368961791]
MultiGrasp is a novel two-stage approach for multi-object grasping using a dexterous multi-fingered robotic hand on a tabletop.
Our experiments focus primarily on dual-object grasping, achieving a success rate of 44.13%.
The framework demonstrates the potential for grasping more than two objects at the cost of inference speed.
arXiv Detail & Related papers (2023-10-24T08:01:12Z)
- GRIP: Generating Interaction Poses Using Spatial Cues and Latent Consistency [57.9920824261925]
Hands are dexterous and highly versatile manipulators that are central to how humans interact with objects and their environment.
Modeling realistic hand-object interactions is critical for applications in computer graphics, computer vision, and mixed reality.
GRIP is a learning-based method that takes as input the 3D motion of the body and the object, and synthesizes realistic motion for both hands before, during, and after object interaction.
arXiv Detail & Related papers (2023-08-22T17:59:51Z)
- Learning to Transfer In-Hand Manipulations Using a Greedy Shape Curriculum [79.6027464700869]
We show that natural and robust in-hand manipulation of simple objects in a dynamic simulation can be learned from a high quality motion capture example.
We propose a simple greedy curriculum search algorithm that can be successfully applied to a range of objects such as a teapot, bunny, bottle, train, and elephant.
arXiv Detail & Related papers (2023-03-14T17:08:19Z)
- Reconfigurable Data Glove for Reconstructing Physical and Virtual Grasps [100.72245315180433]
We present a reconfigurable data glove design to capture different modes of human hand-object interactions.
The glove operates in three modes for various downstream tasks with distinct features.
We evaluate the system's three modes by (i) recording hand gestures and associated forces, (ii) improving manipulation fluency in VR, and (iii) producing realistic simulation effects of various tool uses.
arXiv Detail & Related papers (2023-01-14T05:35:50Z)
- In-Hand Object Rotation via Rapid Motor Adaptation [59.59946962428837]
We show how to design and learn a simple adaptive controller to achieve in-hand object rotation using only fingertips.
The controller is trained entirely in simulation on only cylindrical objects.
It can be directly deployed to a real robot hand to rotate dozens of objects with diverse sizes, shapes, and weights about the z-axis.
arXiv Detail & Related papers (2022-10-10T17:58:45Z)
- Physics-Based Dexterous Manipulations with Estimated Hand Poses and Residual Reinforcement Learning [52.37106940303246]
We learn a model that maps noisy input hand poses to target virtual poses.
The agent is trained in a residual setting using a model-free hybrid RL+IL approach (sketched below).
We test our framework in two applications that use hand pose estimates for dexterous manipulations: hand-object interactions in VR and hand-object motion reconstruction in-the-wild.
arXiv Detail & Related papers (2020-08-07T17:34:28Z)
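As a rough illustration of the residual setting described in the last entry, a learned policy outputs only a small correction on top of a base action computed from the (noisy) hand-pose estimate. This is a minimal sketch of the general residual-RL pattern under that assumption, not the cited paper's implementation; all names here are hypothetical.

```python
import numpy as np

def residual_action(base_action, observation, residual_policy, scale=0.1):
    """Compose a coarse base action with a learned residual correction.

    `residual_policy` is any callable mapping an observation to a
    correction with the same shape as `base_action`; the 0.1 scale that
    bounds the residual is an illustrative assumption.
    """
    correction = np.asarray(residual_policy(observation))
    return np.asarray(base_action) + scale * correction
```

Here `base_action` would come from a simple tracking controller driven by the estimated hand pose, and the learned residual compensates for estimation noise; keeping the residual small preserves the base controller's stability while the policy learns only the correction.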