Visibility-Inspired Models of Touch Sensors for Navigation
- URL: http://arxiv.org/abs/2203.04751v1
- Date: Fri, 4 Mar 2022 08:23:01 GMT
- Title: Visibility-Inspired Models of Touch Sensors for Navigation
- Authors: Kshitij Tiwari, Basak Sakcak, Prasanna Routray, Manivannan M., and
Steven M. LaValle
- Abstract summary: This paper introduces mathematical models of touch sensors for mobile robotics based on visibility.
The introduced models are expected to provide a useful, idealized characterization of task-relevant information.
- Score: 4.730233684561005
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This paper introduces mathematical models of touch sensors for mobile
robotics based on visibility. Serving a purpose similar to the pinhole camera
model for computer vision, the introduced models are expected to provide a
useful, idealized characterization of task-relevant information that can be
inferred from their outputs or observations. This allows direct comparisons to
be made between traditional depth sensors, highlighting cases in which touch
sensing may be interchangeable with time-of-flight or vision sensors, and
characterizing unique advantages provided by touch sensing. The models include
contact detection, compression, load bearing, and deflection. The results could
serve as a basic building block for innovative touch sensor designs for mobile
robot sensor fusion systems.
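To make these models concrete, the sketch below illustrates in Python how two of the four idealized readings, contact detection and compression, might be computed for a disc-shaped sensor in a 2D polygonal world. This is a minimal illustration under assumed geometry, not the paper's formal definitions; the names (`contact_sensor`, `compression_sensor`, `WALLS`) and the disc-shaped sensor footprint are hypothetical.

```python
import math

# Hypothetical 2D environment: a list of wall segments ((x1, y1), (x2, y2)).
WALLS = [((0.0, 0.0), (4.0, 0.0)),
         ((4.0, 0.0), (4.0, 3.0))]

def point_segment_distance(p, a, b):
    """Euclidean distance from point p to segment ab."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    abx, aby = bx - ax, by - ay
    apx, apy = px - ax, py - ay
    denom = abx * abx + aby * aby
    # Clamp the projection parameter to stay on the segment.
    t = 0.0 if denom == 0 else max(0.0, min(1.0, (apx * abx + apy * aby) / denom))
    cx, cy = ax + t * abx, ay + t * aby
    return math.hypot(px - cx, py - cy)

def contact_sensor(p, radius, walls=WALLS):
    """Idealized binary contact detector: True iff some obstacle
    intersects the sensor disc of the given radius around p."""
    return any(point_segment_distance(p, a, b) <= radius for a, b in walls)

def compression_sensor(p, radius, walls=WALLS):
    """Idealized compression reading: penetration depth of the nearest
    obstacle into the sensor disc (0.0 when there is no contact)."""
    d = min(point_segment_distance(p, a, b) for a, b in walls)
    return max(0.0, radius - d)

if __name__ == "__main__":
    print(contact_sensor((3.9, 1.0), 0.2))      # True: 0.1 from the right wall
    print(compression_sensor((3.9, 1.0), 0.2))  # ~0.1 penetration depth
```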
Related papers
- Fluidically Innervated Lattices Make Versatile and Durable Tactile Sensors [41.98879562938879]
We introduce a passive soft robotic fingertip with integrated tactile sensing, fabricated using a 3D-printed elastomer lattice with embedded air channels.
This sensorization approach, termed fluidic innervation, transforms the lattice into a tactile sensor by detecting pressure changes within sealed air channels.
arXiv Detail & Related papers (2025-07-28T18:00:04Z) - Advances in Compliance Detection: Novel Models Using Vision-Based Tactile Sensors [0.7199733380797579]
Compliance is a critical parameter for describing objects in engineering, agriculture, and biomedical applications.
Traditional compliance detection methods are limited by their lack of portability and scalability, rely on specialized, often expensive equipment, and are unsuitable for robotic applications.
We propose two models based on Long-term Recurrent Convolutional Networks (LRCNs) and Transformer architectures that leverage RGB tactile images and other information captured by the vision-based sensor GelSight to predict compliance metrics accurately.
arXiv Detail & Related papers (2025-06-17T21:10:05Z) - Sensor-Invariant Tactile Representation [11.153753622913843]
High-resolution tactile sensors have become critical for embodied perception and robotic manipulation.
A key challenge in the field is the lack of transferability between sensors due to design and manufacturing variations.
We introduce a novel method for extracting Sensor-Invariant Tactile Representations (SITR), enabling zero-shot transfer across optical tactile sensors.
arXiv Detail & Related papers (2025-02-27T00:12:50Z) - AnyTouch: Learning Unified Static-Dynamic Representation across Multiple Visuo-tactile Sensors [11.506370451126378]
Visuo-tactile sensors aim to emulate human tactile perception, enabling robots to understand and manipulate objects.
We introduce TacQuad, an aligned multi-modal tactile multi-sensor dataset from four different visuo-tactile sensors.
We propose AnyTouch, a unified static-dynamic multi-sensor representation learning framework with a multi-level structure.
arXiv Detail & Related papers (2025-02-15T08:33:25Z) - MSSIDD: A Benchmark for Multi-Sensor Denoising [55.41612200877861]
We introduce a new benchmark, the Multi-Sensor SIDD dataset, which is the first raw-domain dataset designed to evaluate the sensor transferability of denoising models.
We propose a sensor consistency training framework that enables denoising models to learn the sensor-invariant features.
arXiv Detail & Related papers (2024-11-18T13:32:59Z) - Controllable Visual-Tactile Synthesis [28.03469909285511]
We develop a conditional generative model that synthesizes both visual and tactile outputs from a single sketch.
We then introduce a pipeline to render high-quality visual and tactile outputs on an electroadhesion-based haptic device.
arXiv Detail & Related papers (2023-05-04T17:59:51Z) - Recognizing Complex Gestures on Minimalistic Knitted Sensors: Toward Real-World Interactive Systems [0.13048920509133805]
Our digitally-knitted capacitive active sensors can be manufactured at scale with little human intervention.
This work advances the capabilities of such sensors by creating the foundation for an interactive gesture recognition system.
arXiv Detail & Related papers (2023-03-18T04:57:46Z) - Tactile-Filter: Interactive Tactile Perception for Part Mating [54.46221808805662]
Humans rely on touch and tactile sensing for many dexterous manipulation tasks.
Vision-based tactile sensors are being widely used for various robotic perception and control tasks.
We present a method for interactive perception using vision-based tactile sensors for a part mating task.
arXiv Detail & Related papers (2023-03-10T16:27:37Z) - DenseTact: Optical Tactile Sensor for Dense Shape Reconstruction [0.0]
Vision-based tactile sensors have been widely used, as rich tactile feedback has been correlated with increased performance in manipulation tasks.
Existing tactile sensor solutions with high resolution have limitations that include low accuracy, expensive components, or lack of scalability.
This paper proposes an inexpensive, scalable, and compact tactile sensor with high-resolution deformation modeling for 3D reconstruction of the sensor surface.
arXiv Detail & Related papers (2022-01-04T22:26:14Z) - Dynamic Modeling of Hand-Object Interactions via Tactile Sensing [133.52375730875696]
In this work, we employ a high-resolution tactile glove to perform four different interactive activities on a diversified set of objects.
We build our model on a cross-modal learning framework and generate the labels using a visual processing pipeline to supervise the tactile model.
This work takes a step on dynamics modeling in hand-object interactions from dense tactile sensing.
arXiv Detail & Related papers (2021-09-09T16:04:14Z) - Elastic Tactile Simulation Towards Tactile-Visual Perception [58.44106915440858]
We propose Elastic Interaction of Particles (EIP) for tactile simulation.
EIP models the tactile sensor as a group of coordinated particles, and the elastic property is applied to regulate the deformation of particles during contact.
We further propose a tactile-visual perception network that enables information fusion between tactile data and visual images.
arXiv Detail & Related papers (2021-08-11T03:49:59Z) - Investigating the Effect of Sensor Modalities in Multi-Sensor Detection-Prediction Models [8.354898936252516]
We focus on the contribution of sensor modalities to model performance.
In addition, we investigate the use of sensor dropout to mitigate the above-mentioned issues.
arXiv Detail & Related papers (2021-01-09T03:21:36Z) - Semantics-aware Adaptive Knowledge Distillation for Sensor-to-Vision Action Recognition [131.6328804788164]
We propose a framework, named Semantics-aware Adaptive Knowledge Distillation Networks (SAKDN), to enhance action recognition in the vision-sensor modality (videos).
The SAKDN uses multiple wearable-sensors as teacher modalities and uses RGB videos as student modality.
arXiv Detail & Related papers (2020-09-01T03:38:31Z) - OmniTact: A Multi-Directional High Resolution Touch Sensor [109.28703530853542]
Existing tactile sensors are either flat, have small sensitive fields, or provide only low-resolution signals.
We introduce OmniTact, a multi-directional high-resolution tactile sensor.
We evaluate the capabilities of OmniTact on a challenging robotic control task.
arXiv Detail & Related papers (2020-03-16T01:31:29Z) - The Feeling of Success: Does Touch Sensing Help Predict Grasp Outcomes? [57.366931129764815]
We collect more than 9,000 grasping trials using a two-finger gripper equipped with GelSight high-resolution tactile sensors on each finger.
Our experimental results indicate that incorporating tactile readings substantially improves grasping performance.
arXiv Detail & Related papers (2017-10-16T05:32:38Z)
This list is automatically generated from the titles and abstracts of the papers on this site.