Zero-shot sim-to-real transfer of tactile control policies for
aggressive swing-up manipulation
- URL: http://arxiv.org/abs/2101.02680v2
- Date: Thu, 8 Apr 2021 13:33:16 GMT
- Title: Zero-shot sim-to-real transfer of tactile control policies for
aggressive swing-up manipulation
- Authors: Thomas Bi, Carmelo Sferrazza and Raffaello D'Andrea
- Abstract summary: This paper shows that robots equipped with a vision-based tactile sensor can perform dynamic manipulation tasks without prior knowledge of all the physical attributes of the objects to be manipulated.
A robotic system is presented that is able to swing up poles of different masses, radii and lengths, to an angle of 180 degrees.
This is the first work where a feedback policy from high-dimensional tactile observations is used to control the swing-up manipulation of poles in closed-loop.
- Score: 5.027571997864706
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper aims to show that robots equipped with a vision-based tactile
sensor can perform dynamic manipulation tasks without prior knowledge of all
the physical attributes of the objects to be manipulated. For this purpose, a
robotic system is presented that is able to swing up poles of different masses,
radii and lengths, to an angle of 180 degrees, while relying solely on the
feedback provided by the tactile sensor. This is achieved by developing a novel
simulator that accurately models the interaction of a pole with the soft
sensor. A feedback policy that is conditioned on a sensory observation history,
and which has no prior knowledge of the physical features of the pole, is then
learned in the aforementioned simulation. When evaluated on the physical
system, the policy is able to swing up a wide range of poles that differ
significantly in their physical attributes without further adaptation. To the
authors' knowledge, this is the first work where a feedback policy from
high-dimensional tactile observations is used to control the swing-up
manipulation of poles in closed-loop.
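The paper itself does not include code, but the two ingredients named in the abstract lend themselves to a short illustration: pole parameters (mass, radius, length) randomized in simulation, and a feedback policy that sees only a window of past tactile observations and therefore has to infer the pole's physical attributes implicitly. The following is a minimal sketch under those assumptions; the library (PyTorch), the network architecture, the observation and action dimensions, and the randomization ranges are all illustrative choices, not details taken from the paper.

```python
# Hypothetical sketch, not the authors' released code: a feedback policy that is
# conditioned on a rolling window of tactile observations and trained over
# randomized pole parameters, then deployed without adaptation.
import numpy as np
import torch
import torch.nn as nn


def sample_pole_params(rng: np.random.Generator) -> dict:
    """Domain randomization of the pole; the ranges are illustrative placeholders."""
    return {
        "mass_kg": rng.uniform(0.05, 0.5),
        "radius_m": rng.uniform(0.005, 0.02),
        "length_m": rng.uniform(0.3, 1.0),
    }


class HistoryPolicy(nn.Module):
    """Maps the last `history_len` tactile feature vectors to an actuator command.

    The policy never sees the pole parameters directly, so it must infer them
    implicitly from how the tactile signal evolves over the observation window.
    """

    def __init__(self, obs_dim: int, history_len: int, act_dim: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(obs_dim * history_len, 256), nn.ReLU(),
            nn.Linear(256, 256), nn.ReLU(),
            nn.Linear(256, act_dim), nn.Tanh(),  # bounded actuator command
        )

    def forward(self, obs_history: torch.Tensor) -> torch.Tensor:
        # obs_history: (batch, history_len, obs_dim) -> flatten the window
        return self.net(obs_history.flatten(start_dim=1))


# Usage: sample a randomized pole for a simulation episode, then at every control
# step push the newest tactile features into the buffer and query the policy.
rng = np.random.default_rng(0)
pole = sample_pole_params(rng)                # would parametrize the simulator
policy = HistoryPolicy(obs_dim=8, history_len=20, act_dim=2)  # dims are assumptions
buffer = torch.zeros(1, 20, 8)                # rolling observation history
new_obs = torch.randn(1, 1, 8)                # stand-in for one tactile feature vector
buffer = torch.cat([buffer[:, 1:], new_obs], dim=1)
action = policy(buffer)                       # e.g., end-effector acceleration command
```

Conditioning on a history rather than a single observation is what would let one policy handle poles with very different physical attributes: the window carries enough information about how the pole has responded to past commands to stand in for the unknown parameters.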
Related papers
- Lessons from Learning to Spin "Pens" [51.9182692233916]
In this work, we push the boundaries of learning-based in-hand manipulation systems by demonstrating the capability to spin pen-like objects.
We first use reinforcement learning to train an oracle policy with privileged information and generate a high-fidelity trajectory dataset in simulation.
We then fine-tune the sensorimotor policy on real-world trajectories to adapt it to real-world dynamics.
arXiv Detail & Related papers (2024-07-26T17:56:01Z)
- Towards Transferring Tactile-based Continuous Force Control Policies from Simulation to Robot [19.789369416528604]
Grasp force control aims to manipulate objects safely by limiting the force exerted on the object.
Prior works have either hand-modeled their force controllers, employed model-based approaches, or have not shown sim-to-real transfer.
We propose a model-free deep reinforcement learning approach trained in simulation and then transferred to the robot without further fine-tuning.
arXiv Detail & Related papers (2023-11-13T11:29:06Z)
- General In-Hand Object Rotation with Vision and Touch [46.871539289388615]
We introduce RotateIt, a system that enables fingertip-based object rotation along multiple axes.
We distill it to operate on realistic yet noisy simulated visuotactile and proprioceptive sensory inputs.
arXiv Detail & Related papers (2023-09-18T17:59:25Z)
- Tactile-Filter: Interactive Tactile Perception for Part Mating [54.46221808805662]
Humans rely on touch and tactile sensing for many dexterous manipulation tasks.
Vision-based tactile sensors are being widely used for various robotic perception and control tasks.
We present a method for interactive perception using vision-based tactile sensors for a part mating task.
arXiv Detail & Related papers (2023-03-10T16:27:37Z)
- Visual-Tactile Multimodality for Following Deformable Linear Objects Using Reinforcement Learning [15.758583731036007]
We study the problem of using vision and tactile inputs together to complete the task of following deformable linear objects.
We create a reinforcement learning agent using different sensing modalities and investigate how combining them improves its behaviour.
Our experiments show that the use of both vision and tactile inputs, together with proprioception, allows the agent to complete the task in up to 92% of cases.
arXiv Detail & Related papers (2022-03-31T21:59:08Z)
- Nonprehensile Riemannian Motion Predictive Control [57.295751294224765]
We introduce a novel Real-to-Sim reward analysis technique to reliably imagine and predict the outcome of taking possible actions for a real robotic platform.
We produce a closed-loop controller to reactively push objects in a continuous action space.
We observe that RMPC is robust in cluttered as well as occluded environments and outperforms the baselines.
arXiv Detail & Related papers (2021-11-15T18:50:04Z)
- Leveraging distributed contact force measurements for slip detection: a physics-based approach enabled by a data-driven tactile sensor [5.027571997864706]
This paper describes a novel model-based slip detection pipeline that can predict possibly failing grasps in real-time.
A vision-based tactile sensor that accurately estimates distributed forces was integrated into a grasping setup composed of a six degrees-of-freedom cobot and a two-finger gripper.
Results show that the system can reliably predict slip while manipulating objects of different shapes, materials, and weights.
arXiv Detail & Related papers (2021-09-23T17:12:46Z)
- Elastic Tactile Simulation Towards Tactile-Visual Perception [58.44106915440858]
We propose Elastic Interaction of Particles (EIP) for tactile simulation.
EIP models the tactile sensor as a group of coordinated particles, and the elastic property is applied to regulate the deformation of particles during contact.
We further propose a tactile-visual perception network that enables information fusion between tactile data and visual images.
arXiv Detail & Related papers (2021-08-11T03:49:59Z)
- Learning Intuitive Physics with Multimodal Generative Models [24.342994226226786]
This paper presents a perception framework that fuses visual and tactile feedback to make predictions about the expected motion of objects in dynamic scenes.
We use a novel See-Through-your-Skin (STS) sensor that provides high resolution multimodal sensing of contact surfaces.
We validate through simulated and real-world experiments in which the resting state of an object is predicted from given initial conditions.
arXiv Detail & Related papers (2021-01-12T12:55:53Z)
- OmniTact: A Multi-Directional High Resolution Touch Sensor [109.28703530853542]
Existing tactile sensors are either flat, have small sensitive fields or only provide low-resolution signals.
We introduce OmniTact, a multi-directional high-resolution tactile sensor.
We evaluate the capabilities of OmniTact on a challenging robotic control task.
arXiv Detail & Related papers (2020-03-16T01:31:29Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.