Touch and Go: Learning from Human-Collected Vision and Touch
- URL: http://arxiv.org/abs/2211.12498v1
- Date: Tue, 22 Nov 2022 18:59:32 GMT
- Title: Touch and Go: Learning from Human-Collected Vision and Touch
- Authors: Fengyu Yang, Chenyang Ma, Jiacheng Zhang, Jing Zhu, Wenzhen Yuan,
Andrew Owens
- Abstract summary: We propose a dataset with paired visual and tactile data called Touch and Go.
Human data collectors probe objects in natural environments using tactile sensors.
Our dataset spans a large number of "in the wild" objects and scenes.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The ability to associate touch with sight is essential for tasks that require
physically interacting with objects in the world. We propose a dataset with
paired visual and tactile data called Touch and Go, in which human data
collectors probe objects in natural environments using tactile sensors, while
simultaneously recording egocentric video. In contrast to previous efforts,
which have largely been confined to lab settings or simulated environments, our
dataset spans a large number of "in the wild" objects and scenes. To
demonstrate our dataset's effectiveness, we successfully apply it to a variety
of tasks: 1) self-supervised visuo-tactile feature learning, 2) tactile-driven
image stylization, i.e., making the visual appearance of an object more
consistent with a given tactile signal, and 3) predicting future frames of a
tactile signal from visuo-tactile inputs.
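To make the first task concrete, below is a minimal sketch of paired visuo-tactile contrastive pretraining: an image crop and the tactile reading captured at the same probe are embedded and aligned with a symmetric InfoNCE loss. The encoder choices, class name, and loss formulation here are illustrative assumptions for exposition, not the authors' exact method; consult the paper for the actual objective and architectures.

```python
# Hypothetical sketch: align paired (image, touch) embeddings with a
# CLIP-style symmetric InfoNCE loss. Not the paper's exact method.
import torch
import torch.nn as nn
import torch.nn.functional as F


class VisuoTactileContrast(nn.Module):
    """Contrastive alignment of visual and tactile embeddings."""

    def __init__(self, image_encoder: nn.Module, touch_encoder: nn.Module,
                 temperature: float = 0.07):
        super().__init__()
        # Both encoders are assumed to map their input to a fixed-size
        # feature vector of the same dimensionality (e.g. ResNet backbones).
        self.image_encoder = image_encoder
        self.touch_encoder = touch_encoder
        self.temperature = temperature

    def forward(self, images: torch.Tensor, touches: torch.Tensor) -> torch.Tensor:
        # Encode each modality and L2-normalize the embeddings.
        z_img = F.normalize(self.image_encoder(images), dim=-1)
        z_tac = F.normalize(self.touch_encoder(touches), dim=-1)

        # Cosine-similarity logits; matching (image, touch) pairs lie on the diagonal.
        logits = z_img @ z_tac.t() / self.temperature
        targets = torch.arange(logits.size(0), device=logits.device)

        # Symmetric cross-entropy: image-to-touch and touch-to-image retrieval.
        return 0.5 * (F.cross_entropy(logits, targets) +
                      F.cross_entropy(logits.t(), targets))
```

Under this (assumed) setup, a training step would simply sample a batch of co-occurring video frames and tactile-sensor readings from the dataset and minimize the returned loss; the learned image features could then be reused for the downstream visuo-tactile tasks listed above.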