RETRO: REthinking Tactile Representation Learning with Material PriOrs
- URL: http://arxiv.org/abs/2505.14319v1
- Date: Tue, 20 May 2025 13:06:19 GMT
- Title: RETRO: REthinking Tactile Representation Learning with Material PriOrs
- Authors: Weihao Xia, Chenliang Zhou, Cengiz Oztireli
- Abstract summary: We introduce material-aware priors into the tactile representation learning process. These priors represent pre-learned characteristics specific to different materials, allowing models to better capture and generalize the nuances of surface texture. Our method enables more accurate, contextually rich tactile feedback across diverse materials and textures, improving performance in real-world applications such as robotics, haptic feedback systems, and material editing.
- Score: 4.938177645099319
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Tactile perception is profoundly influenced by the surface properties of objects in contact. However, despite their crucial role in shaping tactile experiences, these material characteristics have been largely neglected in existing tactile representation learning methods. Most approaches primarily focus on aligning tactile data with visual or textual information, overlooking the richness of tactile feedback that comes from understanding the materials' inherent properties. In this work, we address this gap by revisiting the tactile representation learning framework and incorporating material-aware priors into the learning process. These priors, which represent pre-learned characteristics specific to different materials, allow tactile models to better capture and generalize the nuances of surface texture. Our method enables more accurate, contextually rich tactile feedback across diverse materials and textures, improving performance in real-world applications such as robotics, haptic feedback systems, and material editing.
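The abstract does not specify how the material priors are fused with the tactile features, so the following is only a minimal illustrative sketch of the general idea of conditioning a tactile embedding on a pre-learned per-material vector. All names here (`MATERIAL_PRIORS`, `embed_tactile`, `material_aware_embedding`) are assumptions for illustration, not the paper's actual API.

```python
import numpy as np

# Hypothetical per-material characteristic vectors, assumed pre-learned
# offline; the real priors would come from a trained model, not a table.
MATERIAL_PRIORS = {
    "wood":  np.array([0.8, 0.1, 0.3]),
    "metal": np.array([0.2, 0.9, 0.7]),
}

def embed_tactile(signal: np.ndarray) -> np.ndarray:
    """Toy tactile encoder: a fixed random linear projection of the raw signal."""
    rng = np.random.default_rng(0)               # fixed weights for reproducibility
    w = rng.standard_normal((signal.size, 3))
    return signal @ w

def material_aware_embedding(signal: np.ndarray, material: str) -> np.ndarray:
    """Fuse the tactile embedding with its material prior (here, by concatenation)."""
    prior = MATERIAL_PRIORS[material]
    return np.concatenate([embed_tactile(signal), prior])

emb = material_aware_embedding(np.ones(4), "wood")
print(emb.shape)  # (6,) -- 3 learned dims + 3 prior dims
```

Concatenation is just one simple fusion choice; additive conditioning or cross-attention would fit the same interface.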
Related papers
- RA-Touch: Retrieval-Augmented Touch Understanding with Enriched Visual Data [10.059624183053499]
Visuo-tactile perception aims to understand an object's tactile properties, such as texture, softness, and rigidity. We introduce RA-Touch, a retrieval-augmented framework that improves visuo-tactile perception by leveraging visual data enriched with tactile semantics.
arXiv Detail & Related papers (2025-05-20T12:23:21Z) - Temporal Binding Foundation Model for Material Property Recognition via Tactile Sequence Perception [2.3724852180691025]
This letter presents a novel approach leveraging a temporal binding foundation model for tactile sequence understanding. The proposed system captures the sequential nature of tactile interactions, similar to human fingertip perception.
arXiv Detail & Related papers (2025-01-24T21:47:38Z) - On the Importance of Accurate Geometry Data for Dense 3D Vision Tasks [61.74608497496841]
Training on inaccurate or corrupt data induces model bias and hampers generalisation capabilities.
This paper investigates the effect of sensor errors for the dense 3D vision tasks of depth estimation and reconstruction.
arXiv Detail & Related papers (2023-03-26T22:32:44Z) - Tactile-Filter: Interactive Tactile Perception for Part Mating [54.46221808805662]
Humans rely on touch and tactile sensing for many dexterous manipulation tasks.
Vision-based tactile sensors are being widely used for various robotic perception and control tasks.
We present a method for interactive perception using vision-based tactile sensors for a part mating task.
arXiv Detail & Related papers (2023-03-10T16:27:37Z) - Self-Supervised Material and Texture Representation Learning for Remote Sensing Tasks [5.5531367234797555]
We present MATTER (MATerial and TExture Representation Learning), our material- and texture-based self-supervision method.
MATTER is inspired by classical material and texture methods.
We show that our self-supervised pre-training method allows for performance increases of up to 24.22% and 6.33% in unsupervised and fine-tuned setups, respectively.
arXiv Detail & Related papers (2021-12-03T04:59:13Z) - Dynamic Modeling of Hand-Object Interactions via Tactile Sensing [133.52375730875696]
In this work, we employ a high-resolution tactile glove to perform four different interactive activities on a diversified set of objects.
We build our model on a cross-modal learning framework and generate the labels using a visual processing pipeline to supervise the tactile model.
This work takes a step toward dynamics modeling of hand-object interactions from dense tactile sensing.
arXiv Detail & Related papers (2021-09-09T16:04:14Z) - Elastic Tactile Simulation Towards Tactile-Visual Perception [58.44106915440858]
We propose Elastic Interaction of Particles (EIP) for tactile simulation.
EIP models the tactile sensor as a group of coordinated particles, and the elastic property is applied to regulate the deformation of particles during contact.
We further propose a tactile-visual perception network that enables information fusion between tactile data and visual images.
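The EIP summary above describes the sensor as coordinated particles whose deformation under contact is regulated by an elastic property. A minimal sketch of that idea, not the authors' implementation, is a particle grid with a Hookean restoring force pulling each particle back toward its rest position; all names and constants here are assumptions.

```python
import numpy as np

def elastic_step(pos, rest, contact_force, stiffness=5.0, dt=0.01):
    """One explicit-Euler update of particle heights under contact + elasticity."""
    restoring = -stiffness * (pos - rest)   # Hookean pull toward the rest shape
    return pos + dt * (restoring + contact_force)

rest = np.zeros((4, 4))                     # flat 4x4 particle grid (heights)
pos = rest.copy()
force = np.zeros((4, 4))
force[1:3, 1:3] = -1.0                      # an indenter presses the centre down

for _ in range(100):                        # simulate a brief contact
    pos = elastic_step(pos, rest, force)

print(pos[1, 1] < pos[0, 0])  # True: pressed particles sit below the free rim
```

The real EIP couples neighbouring particles so deformation propagates; this toy version treats each particle independently purely to show the elastic regulation term.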
arXiv Detail & Related papers (2021-08-11T03:49:59Z) - Constellation: Learning relational abstractions over objects for compositional imagination [64.99658940906917]
We introduce Constellation, a network that learns relational abstractions of static visual scenes.
This work is a first step toward the explicit representation of visual relationships and their use in complex cognitive procedures.
arXiv Detail & Related papers (2021-07-23T11:59:40Z) - Active 3D Shape Reconstruction from Vision and Touch [66.08432412497443]
Humans build 3D understandings of the world through active object exploration, using jointly their senses of vision and touch.
In 3D shape reconstruction, most recent progress has relied on static datasets of limited sensory data such as RGB images, depth maps or haptic readings.
We introduce a system composed of: 1) a haptic simulator leveraging high spatial resolution vision-based tactile sensors for active touching of 3D objects; 2) a mesh-based 3D shape reconstruction model that relies on tactile or visuotactile priors to guide the shape exploration; and 3) a set of data-driven solutions with either tactile or visuo
arXiv Detail & Related papers (2021-07-20T15:56:52Z) - Spatio-temporal Attention Model for Tactile Texture Recognition [25.06942319117782]
We propose a novel Spatio-Temporal Attention Model (STAM) for tactile texture recognition.
The proposed STAM pays attention to both spatial focus of each single tactile texture and the temporal correlation of a tactile sequence.
In experiments discriminating 100 different fabric textures, the spatially and temporally selective attention resulted in a significant improvement in recognition accuracy.
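The STAM summary describes two attention stages: a spatial one over locations within each tactile frame and a temporal one over frames in the sequence. A minimal softmax-pooling sketch of that two-stage idea follows; the shapes and function names are assumptions, not the STAM architecture.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def spatio_temporal_pool(seq):
    """seq: (T, H, W) tactile frames -> one scalar descriptor via two attentions."""
    T, H, W = seq.shape
    flat = seq.reshape(T, H * W)
    spatial_w = softmax(flat, axis=1)             # where to look within each frame
    frame_feats = (flat * spatial_w).sum(axis=1)  # (T,) attended per-frame features
    temporal_w = softmax(frame_feats, axis=0)     # which frames matter most
    return float((frame_feats * temporal_w).sum())

seq = np.random.default_rng(1).random((5, 8, 8))  # 5 frames of 8x8 taxel readings
desc = spatio_temporal_pool(seq)
print(0.0 < desc < 1.0)  # scalar stays inside the input range [0, 1)
```

In a learned model the attention logits would come from trained projections of the features rather than the raw values, but the factorisation into spatial-then-temporal weighting is the same.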
arXiv Detail & Related papers (2020-08-10T22:32:34Z) - Teaching Cameras to Feel: Estimating Tactile Physical Properties of Surfaces From Images [4.666400601228301]
We introduce the challenging task of estimating a set of tactile physical properties from visual information.
We construct a first-of-its-kind image-tactile dataset with over 400 multiview image sequences and the corresponding tactile properties.
We develop a cross-modal framework comprising an adversarial objective and a novel visuo-tactile joint classification loss.
arXiv Detail & Related papers (2020-04-29T21:27:26Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this information and is not responsible for any consequences arising from its use.