On the Feasibility of Learning Finger-gaiting In-hand Manipulation with
Intrinsic Sensing
- URL: http://arxiv.org/abs/2109.12720v1
- Date: Sun, 26 Sep 2021 23:22:29 GMT
- Title: On the Feasibility of Learning Finger-gaiting In-hand Manipulation with
Intrinsic Sensing
- Authors: Gagan Khandate, Maximilian Haas-Heger, Matei Ciocarlie
- Abstract summary: We use model-free reinforcement learning to learn finger-gaiting only via precision grasps.
To tackle the inherent instability of precision grasping, we propose the use of initial state distributions.
Our method learns finger-gaiting with significantly better sample complexity than the state of the art.
- Score: 0.7373617024876725
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Finger-gaiting manipulation is an important skill to achieve large-angle
in-hand re-orientation of objects. However, achieving these gaits with
arbitrary orientations of the hand is challenging due to the unstable nature of
the task. In this work, we use model-free reinforcement learning (RL) to learn
finger-gaiting only via precision grasps and demonstrate finger-gaiting for
rotation about an axis purely using on-board proprioceptive and tactile
feedback. To tackle the inherent instability of precision grasping, we propose
the use of initial state distributions that enable effective exploration of the
state space. Our method learns finger-gaiting with significantly better
sample complexity than the state of the art. The policies we obtain are robust
and also transfer to novel objects.
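
The key idea in the abstract, resetting training episodes from a broad distribution of stable precision-grasp states so that model-free RL can explore the otherwise unstable grasp manifold, can be illustrated with a minimal sketch. Everything below (the GraspState alias, the environment methods set_state and get_observation, and uniform sampling) is an assumption made for illustration, not the paper's actual implementation.

```python
import random
from typing import List, Sequence

# Hypothetical container for a stable precision-grasp state:
# joint positions, joint velocities, and object pose. The state
# representation used in the paper may differ.
GraspState = Sequence[float]

class InitialStateDistribution:
    """Samples episode start states from a pre-collected set of stable
    precision grasps instead of a single canonical pose, so that
    exploration covers more of the (unstable) grasp manifold."""

    def __init__(self, stable_grasps: List[GraspState]):
        if not stable_grasps:
            raise ValueError("need at least one stable grasp state")
        self._states = stable_grasps

    def sample(self) -> GraspState:
        # Uniform sampling is an assumption; any distribution over the
        # collected stable grasps would fit the same idea.
        return random.choice(self._states)

def reset_env(env, init_dist: InitialStateDistribution):
    """Reset a (hypothetical) in-hand manipulation environment to a
    sampled stable grasp before each RL episode."""
    state = init_dist.sample()
    env.set_state(state)          # assumed environment API
    return env.get_observation()  # on-board proprioceptive + tactile features
```

In simulation, such a set of stable grasps could, for instance, be collected by perturbing a nominal precision grasp and keeping only states in which all fingertip contacts persist; the paper does not prescribe this exact procedure.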
Related papers
- Curriculum Is More Influential Than Haptic Information During Reinforcement Learning of Object Manipulation Against Gravity [0.0]
Learning to lift and rotate objects with the fingertips is necessary for autonomous in-hand dexterous manipulation.
We investigate the role of curriculum learning and haptic feedback in enabling the learning of dexterous manipulation.
arXiv Detail & Related papers (2024-07-13T19:23:11Z)
- AnyRotate: Gravity-Invariant In-Hand Object Rotation with Sim-to-Real Touch [9.606323817785114]
We present AnyRotate, a system for gravity-invariant multi-axis in-hand object rotation using dense featured sim-to-real touch.
Our formulation allows the training of a unified policy to rotate unseen objects about arbitrary rotation axes in any hand direction.
Rich multi-fingered tactile sensing can detect unstable grasps and provide a reactive behavior that improves the robustness of the policy.
arXiv Detail & Related papers (2024-05-12T22:51:35Z)
- A model-free approach to fingertip slip and disturbance detection for grasp stability inference [0.0]
We propose a method for assessing grasp stability using tactile sensing.
We use highly sensitive uSkin tactile sensors mounted on an Allegro hand to test and validate our method.
arXiv Detail & Related papers (2023-11-22T09:04:26Z)
- ArtiGrasp: Physically Plausible Synthesis of Bi-Manual Dexterous Grasping and Articulation [29.999224233718927]
ArtiGrasp is a method to synthesize bi-manual hand-object interactions that include grasping and articulation.
Our framework unifies grasping and articulation within a single policy guided by a single hand pose reference.
We show that our method can generate motions with noisy hand-object pose estimates from an off-the-shelf image-based regressor.
arXiv Detail & Related papers (2023-09-07T17:53:20Z)
- Learning to Transfer In-Hand Manipulations Using a Greedy Shape Curriculum [79.6027464700869]
We show that natural and robust in-hand manipulation of simple objects in a dynamic simulation can be learned from a high-quality motion capture example.
We propose a simple greedy curriculum search algorithm that can be successfully applied to a range of objects such as a teapot, bunny, bottle, train, and elephant.
arXiv Detail & Related papers (2023-03-14T17:08:19Z)
- Dexterous Manipulation from Images: Autonomous Real-World RL via Substep Guidance [71.36749876465618]
We describe a system for vision-based dexterous manipulation that provides a "programming-free" approach for users to define new tasks.
Our system includes a framework for users to define a final task and intermediate sub-tasks with image examples.
We present experimental results with a four-finger robotic hand learning multi-stage object manipulation tasks directly in the real world.
arXiv Detail & Related papers (2022-12-19T22:50:40Z)
- The Gesture Authoring Space: Authoring Customised Hand Gestures for Grasping Virtual Objects in Immersive Virtual Environments [81.5101473684021]
This work proposes a hand gesture authoring tool for object-specific grab gestures, allowing virtual objects to be grabbed as in the real world.
The presented solution uses template matching for gesture recognition and requires no technical knowledge to design and create custom-tailored hand gestures.
The study showed that gestures created with the proposed approach are perceived by users as a more natural input modality than the others.
arXiv Detail & Related papers (2022-07-03T18:33:33Z)
- Generalization Through Hand-Eye Coordination: An Action Space for Learning Spatially-Invariant Visuomotor Control [67.23580984118479]
Imitation Learning (IL) is an effective framework to learn visuomotor skills from offline demonstration data.
Hand-eye Action Networks (HAN) can approximate human hand-eye coordination behaviors by learning from human-teleoperated demonstrations.
arXiv Detail & Related papers (2021-02-28T01:49:13Z)
- Learning Dexterous Grasping with Object-Centric Visual Affordances [86.49357517864937]
Dexterous robotic hands are appealing for their agility and human-like morphology.
We introduce an approach for learning dexterous grasping.
Our key idea is to embed an object-centric visual affordance model within a deep reinforcement learning loop.
arXiv Detail & Related papers (2020-09-03T04:00:40Z)
- Physics-Based Dexterous Manipulations with Estimated Hand Poses and Residual Reinforcement Learning [52.37106940303246]
We learn a model that maps noisy input hand poses to target virtual poses.
The agent is trained in a residual setting using a model-free hybrid RL+IL approach (a minimal sketch of this residual-control pattern appears after this list).
We test our framework in two applications that use hand pose estimates for dexterous manipulations: hand-object interactions in VR and hand-object motion reconstruction in-the-wild.
arXiv Detail & Related papers (2020-08-07T17:34:28Z)
- Design and Control of Roller Grasper V2 for In-Hand Manipulation [6.064252790182275]
We present a novel non-anthropomorphic robot grasper with the ability to manipulate objects by means of active surfaces at the fingertips.
Active surfaces are achieved by spherical rolling fingertips with two degrees of freedom (DoF).
A further DoF is in the base of each finger, allowing the fingers to grasp objects over a range of sizes and shapes.
arXiv Detail & Related papers (2020-04-18T00:54:09Z)
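
For the residual reinforcement learning entry above (Physics-Based Dexterous Manipulations with Estimated Hand Poses and Residual Reinforcement Learning), one common reading of "trained in a residual setting" is that the learned policy outputs a bounded correction added to a base action derived from the estimated hand pose. The sketch below illustrates that generic pattern only; the function names, the identity retargeting, and the scaling factor are hypothetical and do not reproduce that paper's method.

```python
import numpy as np

def base_action_from_pose(noisy_hand_pose: np.ndarray) -> np.ndarray:
    """Hypothetical hand-crafted mapping from an estimated (noisy) hand
    pose to joint targets for the simulated hand."""
    return noisy_hand_pose  # placeholder: identity retargeting

def residual_control_step(policy, obs: np.ndarray,
                          noisy_hand_pose: np.ndarray,
                          scale: float = 0.1) -> np.ndarray:
    """Residual setting: the learned policy contributes only a scaled
    correction on top of the base action."""
    base = base_action_from_pose(noisy_hand_pose)
    residual = policy(obs)           # learned correction (RL+IL policy)
    return base + scale * residual   # final command sent to the hand
```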
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences.