Tactile Memory with Soft Robot: Robust Object Insertion via Masked Encoding and Soft Wrist
- URL: http://arxiv.org/abs/2601.19275v1
- Date: Tue, 27 Jan 2026 07:04:01 GMT
- Title: Tactile Memory with Soft Robot: Robust Object Insertion via Masked Encoding and Soft Wrist
- Authors: Tatsuya Kamijo, Mai Nishimura, Cristian C. Beltran-Hernandez, Nodoka Shibasaki, Masashi Hamaya,
- Abstract summary: We introduce Tactile Memory with Soft Robot (TaMeSo-bot), a system that integrates a soft wrist with retrieval-based control to enable safe and robust manipulation. The core of this system is the Masked Tactile Trajectory Transformer (MAT$^\text{3}$), which jointly models interactions between robot actions, tactile feedback, force-torque measurements, and proprioceptive signals. MAT$^\text{3}$ achieves higher success rates than the baselines over all conditions and shows remarkable capability to adapt to unseen pegs and conditions.
- Score: 10.982180941605256
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Tactile memory, the ability to store and retrieve touch-based experience, is critical for contact-rich tasks such as key insertion under uncertainty. To replicate this capability, we introduce Tactile Memory with Soft Robot (TaMeSo-bot), a system that integrates a soft wrist with tactile retrieval-based control to enable safe and robust manipulation. The soft wrist allows safe contact exploration during data collection, while tactile memory reuses past demonstrations via retrieval for flexible adaptation to unseen scenarios. The core of this system is the Masked Tactile Trajectory Transformer (MAT$^\text{3}$), which jointly models spatiotemporal interactions between robot actions, distributed tactile feedback, force-torque measurements, and proprioceptive signals. Through masked-token prediction, MAT$^\text{3}$ learns rich spatiotemporal representations by inferring missing sensory information from context, autonomously extracting task-relevant features without explicit subtask segmentation. We validate our approach on peg-in-hole tasks with diverse pegs and conditions in real-robot experiments. Our extensive evaluation demonstrates that MAT$^\text{3}$ achieves higher success rates than the baselines over all conditions and shows remarkable capability to adapt to unseen pegs and conditions.
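The abstract's masked-token objective can be illustrated with a minimal sketch: a trajectory is flattened into per-timestep tokens from each modality (action, tactile, force-torque, proprioception), a random subset of tokens is masked, and a model would be trained to reconstruct the masked tokens from context. All names, dimensions, and the masking ratio below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

T = 5  # timesteps in the trajectory (illustrative)
# Assumed per-modality dimensions; the paper does not specify these.
DIMS = {"action": 7, "tactile": 16, "ft": 6, "proprio": 7}


def tokenize(traj):
    """Flatten a dict of (T, dim) modality arrays into a (T*M, max_dim) token grid,
    zero-padding each modality token to a common width."""
    max_dim = max(DIMS.values())
    tokens = []
    for t in range(T):
        for name, d in DIMS.items():
            tok = np.zeros(max_dim)
            tok[:d] = traj[name][t]
            tokens.append(tok)
    return np.stack(tokens)


def mask_tokens(tokens, ratio=0.25):
    """Zero out a random subset of tokens; return the masked copy and mask indices."""
    n = len(tokens)
    idx = rng.choice(n, size=int(n * ratio), replace=False)
    masked = tokens.copy()
    masked[idx] = 0.0
    return masked, idx


traj = {name: rng.standard_normal((T, d)) for name, d in DIMS.items()}
tokens = tokenize(traj)
masked, idx = mask_tokens(tokens)

# A reconstruction loss would compare the model's predictions at the masked
# positions against the original tokens:
targets = tokens[idx]
print(tokens.shape, len(idx))
```

In a full system the masked sequence would be fed to a transformer encoder whose outputs at the masked positions are regressed against `targets`; this sketch only shows the data preparation implied by the abstract.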
Related papers
- OPENTOUCH: Bringing Full-Hand Touch to Real-World Interaction [93.88239833545623]
We present OpenTouch, the first in-the-wild egocentric full-hand tactile dataset. We show that tactile signals provide a compact yet powerful cue for grasp understanding. We aim to advance multimodal egocentric perception, embodied learning, and contact-rich robotic manipulation.
arXiv Detail & Related papers (2025-12-18T18:18:17Z) - Simultaneous Tactile-Visual Perception for Learning Multimodal Robot Manipulation [21.78866976181311]
See-through-skin (STS) sensors combine tactile and visual perception. Existing STS designs lack simultaneous multimodal perception and suffer from unreliable tactile tracking. We introduce TacThru, an STS sensor enabling simultaneous visual perception and robust tactile signal extraction.
arXiv Detail & Related papers (2025-12-10T17:35:13Z) - Touch in the Wild: Learning Fine-Grained Manipulation with a Portable Visuo-Tactile Gripper [7.618517580705364]
We present a portable, lightweight gripper with integrated tactile sensors. We propose a cross-modal representation learning framework that integrates visual and tactile signals. We validate our approach on fine-grained tasks such as test tube insertion and pipette-based fluid transfer.
arXiv Detail & Related papers (2025-07-20T17:53:59Z) - Learning Visuotactile Skills with Two Multifingered Hands [80.99370364907278]
We explore learning from human demonstrations using a bimanual system with multifingered hands and visuotactile data.
Our results mark a promising step forward in bimanual multifingered manipulation from visuotactile data.
arXiv Detail & Related papers (2024-04-25T17:59:41Z) - Robot Synesthesia: In-Hand Manipulation with Visuotactile Sensing [15.970078821894758]
We introduce a system that leverages visual and tactile sensory inputs to enable dexterous in-hand manipulation.
Robot Synesthesia is a novel point cloud-based tactile representation inspired by human tactile-visual synesthesia.
arXiv Detail & Related papers (2023-12-04T12:35:43Z) - Multimodal and Force-Matched Imitation Learning with a See-Through Visuotactile Sensor [14.492202828369127]
We leverage a multimodal visuotactile sensor within the framework of imitation learning (IL) to perform contact-rich tasks. We introduce two algorithmic contributions, tactile force matching and learned mode switching, as complementary methods for improving IL. Our results show that the inclusion of force matching raises average policy success rates by 62.5%, visuotactile mode switching by 30.3%, and visuotactile data as a policy input by 42.5%.
arXiv Detail & Related papers (2023-11-02T14:02:42Z) - Tactile-Filter: Interactive Tactile Perception for Part Mating [54.46221808805662]
Humans rely on touch and tactile sensing for a lot of dexterous manipulation tasks.
vision-based tactile sensors are being widely used for various robotic perception and control tasks.
We present a method for interactive perception using vision-based tactile sensors for a part mating task.
arXiv Detail & Related papers (2023-03-10T16:27:37Z) - Mobile Behavioral Biometrics for Passive Authentication [65.94403066225384]
This work carries out a comparative analysis of unimodal and multimodal behavioral biometric traits.
Experiments are performed over HuMIdb, one of the largest and most comprehensive freely available mobile user interaction databases.
In our experiments, the most discriminative background sensor is the magnetometer, whereas among touch tasks the best results are achieved with keystroke.
arXiv Detail & Related papers (2022-03-14T17:05:59Z) - Learning to Detect Slip with Barometric Tactile Sensors and a Temporal Convolutional Neural Network [7.346580429118843]
We present a learning-based method to detect slip using barometric tactile sensors.
We train a temporal convolution neural network to detect slip, achieving high detection accuracies.
We argue that barometric tactile sensing technology, combined with data-driven learning, is suitable for many manipulation tasks such as slip compensation.
arXiv Detail & Related papers (2022-02-19T08:21:56Z) - Elastic Tactile Simulation Towards Tactile-Visual Perception [58.44106915440858]
We propose Elastic Interaction of Particles (EIP) for tactile simulation.
EIP models the tactile sensor as a group of coordinated particles, and the elastic property is applied to regulate the deformation of particles during contact.
We further propose a tactile-visual perception network that enables information fusion between tactile data and visual images.
arXiv Detail & Related papers (2021-08-11T03:49:59Z) - Under Pressure: Learning to Detect Slip with Barometric Tactile Sensors [7.35805050004643]
We present a learning-based method to detect slip using barometric tactile sensors.
We are able to achieve slip detection accuracies of greater than 91%.
We show that barometric tactile sensing technology, combined with data-driven learning, is potentially suitable for many complex manipulation tasks.
arXiv Detail & Related papers (2021-03-24T19:29:03Z) - OmniTact: A Multi-Directional High Resolution Touch Sensor [109.28703530853542]
Existing tactile sensors are either flat, have small sensitive fields or only provide low-resolution signals.
We introduce OmniTact, a multi-directional high-resolution tactile sensor.
We evaluate the capabilities of OmniTact on a challenging robotic control task.
arXiv Detail & Related papers (2020-03-16T01:31:29Z) - Gaining a Sense of Touch. Physical Parameters Estimation using a Soft Gripper and Neural Networks [3.0892724364965005]
Physical parameter estimation with deep learning algorithms from direct robotic-gripper interaction with objects remains under-explored.
We propose a trainable system for the regression of a stiffness coefficient and provide extensive experiments using a physics simulator environment.
Our system can reliably estimate the stiffness of an object using the Yale OpenHand soft gripper based on readings from Inertial Measurement Units (IMUs) attached to its fingers.
arXiv Detail & Related papers (2020-03-02T11:56:14Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed papers (including all information) and is not responsible for any consequences.