Controllable Visual-Tactile Synthesis
- URL: http://arxiv.org/abs/2305.03051v1
- Date: Thu, 4 May 2023 17:59:51 GMT
- Title: Controllable Visual-Tactile Synthesis
- Authors: Ruihan Gao, Wenzhen Yuan, Jun-Yan Zhu
- Abstract summary: We develop a conditional generative model that synthesizes both visual and tactile outputs from a single sketch.
We then introduce a pipeline to render high-quality visual and tactile outputs on an electroadhesion-based haptic device.
- Score: 28.03469909285511
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Deep generative models have various content creation applications such as
graphic design, e-commerce, and virtual try-on. However, current works mainly
focus on synthesizing realistic visual outputs, often ignoring other sensory
modalities, such as touch, which limits physical interaction with users. In
this work, we leverage deep generative models to create a multi-sensory
experience where users can touch and see the synthesized object when sliding
their fingers on a haptic surface. The main challenges lie in the significant
scale discrepancy between vision and touch sensing and the lack of explicit
mapping from touch sensing data to a haptic rendering device. To bridge this
gap, we collect high-resolution tactile data with a GelSight sensor and create
a new visuotactile clothing dataset. We then develop a conditional generative
model that synthesizes both visual and tactile outputs from a single sketch. We
evaluate our method regarding image quality and tactile rendering accuracy.
Finally, we introduce a pipeline to render high-quality visual and tactile
outputs on an electroadhesion-based haptic device for an immersive experience,
allowing for challenging materials and editable sketch inputs.
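A minimal illustration of the sketch-to-visual-and-tactile idea above: a conditional generator with a shared encoder and separate visual and tactile decoder heads. The layers, shapes, and output parameterization are assumptions made for this sketch, not the authors' actual architecture or training setup.

```python
# Minimal sketch (not the paper's model): a sketch-conditioned generator with a
# shared encoder and separate visual / tactile decoder heads. All shapes and
# layers are illustrative assumptions.
import torch
import torch.nn as nn

class VisuoTactileGenerator(nn.Module):
    def __init__(self, base=64):
        super().__init__()
        # Shared encoder over the 1-channel input sketch.
        self.encoder = nn.Sequential(
            nn.Conv2d(1, base, 4, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(base, base * 2, 4, stride=2, padding=1), nn.ReLU(inplace=True),
        )
        # Visual head: 3-channel RGB output in [-1, 1].
        self.visual_head = nn.Sequential(
            nn.ConvTranspose2d(base * 2, base, 4, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.ConvTranspose2d(base, 3, 4, stride=2, padding=1), nn.Tanh(),
        )
        # Tactile head: 1-channel map (e.g., a height- or texture-like signal).
        self.tactile_head = nn.Sequential(
            nn.ConvTranspose2d(base * 2, base, 4, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.ConvTranspose2d(base, 1, 4, stride=2, padding=1), nn.Tanh(),
        )

    def forward(self, sketch):
        feat = self.encoder(sketch)
        return self.visual_head(feat), self.tactile_head(feat)

if __name__ == "__main__":
    sketch = torch.randn(1, 1, 256, 256)           # placeholder sketch input
    rgb, tactile = VisuoTactileGenerator()(sketch)
    print(rgb.shape, tactile.shape)                # (1, 3, 256, 256) and (1, 1, 256, 256)
```

In practice such a model would be trained on paired sketch/photo/GelSight data with reconstruction and adversarial objectives, and the tactile output would then be mapped to haptic-device commands in a separate rendering step, as the abstract outlines.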
Related papers
- Digitizing Touch with an Artificial Multimodal Fingertip [51.7029315337739]
Humans and robots both benefit from using touch to perceive and interact with the surrounding environment.
Here, we describe several conceptual and technological innovations to improve the digitization of touch.
These advances are embodied in an artificial finger-shaped sensor with advanced sensing capabilities.
arXiv Detail & Related papers (2024-11-04T18:38:50Z) - Neural feels with neural fields: Visuo-tactile perception for in-hand manipulation [57.60490773016364]
We combine vision and touch sensing on a multi-fingered hand to estimate an object's pose and shape during in-hand manipulation.
Our method, NeuralFeels, encodes object geometry by learning a neural field online and jointly tracks it by optimizing a pose graph problem.
Our results demonstrate that touch, at the very least, refines and, at the very best, disambiguates visual estimates during in-hand manipulation.
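A toy sketch of the underlying idea of jointly fitting a neural geometry field and an object pose from fused visual-tactile surface points, assuming a plain MLP signed-distance field and a translation-only pose; NeuralFeels' actual pose-graph formulation over full poses and sensor models is considerably richer.

```python
# Toy sketch: fit an MLP signed-distance field online while refining a
# (translation-only) object pose so observed surface points lie on the zero
# level set. Illustrative only; not the paper's formulation.
import torch
import torch.nn as nn

sdf = nn.Sequential(nn.Linear(3, 128), nn.ReLU(),
                    nn.Linear(128, 128), nn.ReLU(),
                    nn.Linear(128, 1))
translation = torch.zeros(3, requires_grad=True)      # simplified pose parameter

surface_pts = torch.rand(256, 3)                      # placeholder fused surface points
opt = torch.optim.Adam(list(sdf.parameters()) + [translation], lr=1e-3)

for step in range(100):
    opt.zero_grad()
    # Points on the true surface should have zero signed distance after the
    # pose transform, so we penalize |SDF| at the transformed observations.
    loss = sdf(surface_pts + translation).abs().mean()
    loss.backward()
    opt.step()
```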
arXiv Detail & Related papers (2023-12-20T22:36:37Z) - Attention for Robot Touch: Tactile Saliency Prediction for Robust Sim-to-Real Tactile Control [12.302685367517718]
High-resolution tactile sensing can provide accurate information about local contact in contact-rich robotic tasks.
We study a new concept: tactile saliency for robot touch, inspired by the human touch attention mechanism from neuroscience.
arXiv Detail & Related papers (2023-07-26T21:19:45Z) - Tactile-Filter: Interactive Tactile Perception for Part Mating [54.46221808805662]
Humans rely on touch and tactile sensing for many dexterous manipulation tasks.
Vision-based tactile sensors are widely used for various robotic perception and control tasks.
We present a method for interactive perception using vision-based tactile sensors for a part mating task.
arXiv Detail & Related papers (2023-03-10T16:27:37Z) - Touch and Go: Learning from Human-Collected Vision and Touch [16.139106833276]
We propose a dataset with paired visual and tactile data called Touch and Go.
Human data collectors probe objects in natural environments using tactile sensors.
Our dataset spans a large number of "in the wild" objects and scenes.
arXiv Detail & Related papers (2022-11-22T18:59:32Z) - Visual-Tactile Multimodality for Following Deformable Linear Objects Using Reinforcement Learning [15.758583731036007]
We study the problem of using vision and tactile inputs together to complete the task of following deformable linear objects.
We create a Reinforcement Learning agent using different sensing modalities and investigate how its behaviour can be boosted.
Our experiments show that the use of both vision and tactile inputs, together with proprioception, allows the agent to complete the task in up to 92% of cases.
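A minimal sketch of the kind of multimodal observation fusion such an agent relies on: separate encoders for vision and touch whose features are concatenated with a proprioceptive vector before the policy head. Encoder sizes, the proprioception dimension, and the action dimension are assumptions; the RL algorithm itself is not shown.

```python
# Minimal sketch of multimodal observation fusion for a policy network.
# Shapes and dimensions are illustrative assumptions.
import torch
import torch.nn as nn

class MultimodalPolicy(nn.Module):
    def __init__(self, proprio_dim=7, action_dim=6):
        super().__init__()
        self.vision_enc = nn.Sequential(                 # RGB camera encoder
            nn.Conv2d(3, 16, 5, stride=2), nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool2d(4), nn.Flatten(), nn.Linear(16 * 4 * 4, 64),
        )
        self.tactile_enc = nn.Sequential(                # tactile image encoder
            nn.Conv2d(1, 16, 5, stride=2), nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool2d(4), nn.Flatten(), nn.Linear(16 * 4 * 4, 64),
        )
        self.head = nn.Sequential(                       # fused policy head
            nn.Linear(64 + 64 + proprio_dim, 128), nn.ReLU(inplace=True),
            nn.Linear(128, action_dim),
        )

    def forward(self, rgb, tactile, proprio):
        z = torch.cat([self.vision_enc(rgb), self.tactile_enc(tactile), proprio], dim=-1)
        return self.head(z)

policy = MultimodalPolicy()
action = policy(torch.randn(1, 3, 64, 64), torch.randn(1, 1, 64, 64), torch.randn(1, 7))
```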
arXiv Detail & Related papers (2022-03-31T21:59:08Z) - Dynamic Modeling of Hand-Object Interactions via Tactile Sensing [133.52375730875696]
In this work, we employ a high-resolution tactile glove to perform four different interactive activities on a diversified set of objects.
We build our model on a cross-modal learning framework and generate the labels using a visual processing pipeline to supervise the tactile model.
This work takes a step on dynamics modeling in hand-object interactions from dense tactile sensing.
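The cross-modal supervision pattern described here, where a visual pipeline produces labels and a tactile model consumes them as training targets, can be sketched roughly as follows; the label function, shapes, and loss are placeholders rather than the paper's actual pipeline.

```python
# Rough sketch of cross-modal supervision: labels from a visual pipeline
# (stand-in function below) supervise a model that sees only tactile input.
import torch
import torch.nn as nn

tactile_model = nn.Sequential(nn.Flatten(), nn.Linear(32 * 32, 128),
                              nn.ReLU(inplace=True), nn.Linear(128, 10))
opt = torch.optim.Adam(tactile_model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

def visual_pseudo_labels(frames):
    # Placeholder for the visual processing pipeline that produces labels.
    return torch.randint(0, 10, (frames.shape[0],))

for _ in range(10):                                   # toy training loop
    tactile = torch.rand(8, 1, 32, 32)                # fake tactile-glove frames
    frames = torch.rand(8, 3, 64, 64)                 # synchronized camera frames (fake)
    labels = visual_pseudo_labels(frames)             # vision supervises touch
    opt.zero_grad()
    loss_fn(tactile_model(tactile), labels).backward()
    opt.step()
```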
arXiv Detail & Related papers (2021-09-09T16:04:14Z) - Elastic Tactile Simulation Towards Tactile-Visual Perception [58.44106915440858]
We propose Elastic Interaction of Particles (EIP) for tactile simulation.
EIP models the tactile sensor as a group of coordinated particles, and the elastic property is applied to regulate the deformation of particles during contact.
We further propose a tactile-visual perception network that enables information fusion between tactile data and visual images.
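A very rough sketch of the elastic-particle intuition: sensor particles are pulled back toward their rest positions by a linear elastic force while a spherical indenter pushes overlapping particles outward. This toy example only illustrates the concept, not EIP's actual elasticity model or its coupling to visual perception.

```python
# Toy elastic-particle deformation: linear restoring force toward rest
# positions plus a penalty force from a spherical contact. Illustrative only.
import numpy as np

rest = np.random.rand(100, 3)                         # rest positions of sensor particles
pos = rest.copy()
k_elastic, dt = 5.0, 0.01
contact_center, contact_radius = np.array([0.5, 0.5, 0.5]), 0.2

for _ in range(200):
    force = -k_elastic * (pos - rest)                 # elastic restoring force
    d = pos - contact_center
    dist = np.linalg.norm(d, axis=1, keepdims=True)
    inside = dist < contact_radius                    # particles overlapped by the indenter
    force += np.where(inside, 50.0 * (contact_radius - dist) * d / np.maximum(dist, 1e-8), 0.0)
    pos += dt * force                                 # over-damped explicit integration step

deformation = pos - rest                              # per-particle deformation field
```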
arXiv Detail & Related papers (2021-08-11T03:49:59Z) - Learning Intuitive Physics with Multimodal Generative Models [24.342994226226786]
This paper presents a perception framework that fuses visual and tactile feedback to make predictions about the expected motion of objects in dynamic scenes.
We use a novel See-Through-your-Skin (STS) sensor that provides high-resolution multimodal sensing of contact surfaces.
We validate through simulated and real-world experiments in which the resting state of an object is predicted from given initial conditions.
arXiv Detail & Related papers (2021-01-12T12:55:53Z) - OmniTact: A Multi-Directional High Resolution Touch Sensor [109.28703530853542]
Existing tactile sensors are either flat, have small sensitive fields, or only provide low-resolution signals.
We introduce OmniTact, a multi-directional high-resolution tactile sensor.
We evaluate the capabilities of OmniTact on a challenging robotic control task.
arXiv Detail & Related papers (2020-03-16T01:31:29Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.