Learning Tactile Models for Factor Graph-based Estimation
- URL: http://arxiv.org/abs/2012.03768v2
- Date: Sun, 28 Mar 2021 19:34:52 GMT
- Title: Learning Tactile Models for Factor Graph-based Estimation
- Authors: Paloma Sodhi, Michael Kaess, Mustafa Mukadam, Stuart Anderson
- Abstract summary: Vision-based tactile sensors provide rich, local image measurements at the point of contact.
A single measurement contains limited information and multiple measurements are needed to infer latent object state.
We propose a two-stage approach: first we learn local tactile observation models supervised with ground truth data, and then integrate these models along with physics and geometric factors within a factor graph.
- Score: 24.958055047646628
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We're interested in the problem of estimating object states from touch during
manipulation under occlusions. In this work, we address the problem of
estimating object poses from touch during planar pushing. Vision-based tactile
sensors provide rich, local image measurements at the point of contact. A
single such measurement, however, contains limited information and multiple
measurements are needed to infer latent object state. We solve this inference
problem using a factor graph. In order to incorporate tactile measurements in
the graph, we need local observation models that can map high-dimensional
tactile images onto a low-dimensional state space. Prior work has used
low-dimensional force measurements or engineered functions to interpret tactile
measurements. These methods, however, can be brittle and difficult to scale
across objects and sensors. Our key insight is to directly learn tactile
observation models that predict the relative pose of the sensor given a pair of
tactile images. These relative poses can then be incorporated as factors within
a factor graph. We propose a two-stage approach: first we learn local tactile
observation models supervised with ground truth data, and then integrate these
models along with physics and geometric factors within a factor graph
optimizer. We demonstrate reliable object tracking using only tactile feedback
for 150 real-world planar pushing sequences with varying trajectories across
three object shapes. Supplementary video: https://youtu.be/y1kBfSmi8w0
Related papers
- GEARS: Local Geometry-aware Hand-object Interaction Synthesis [38.75942505771009]
We introduce a novel joint-centered sensor designed to reason about local object geometry near potential interaction regions.
As an important step towards mitigating the learning complexity, we transform the points from global frame to template hand frame and use a shared module to process sensor features of each individual joint.
This is followed by a perceptual-temporal transformer network aimed at capturing correlation among the joints in different dimensions.
arXiv Detail & Related papers (2024-04-02T09:18:52Z) - Neural feels with neural fields: Visuo-tactile perception for in-hand manipulation [57.60490773016364]
We combine vision and touch sensing on a multi-fingered hand to estimate an object's pose and shape during in-hand manipulation.
Our method, NeuralFeels, encodes object geometry by learning a neural field online and jointly tracks it by optimizing a pose graph problem.
Our results demonstrate that touch, at the very least, refines and, at the very best, disambiguates visual estimates during in-hand manipulation.
arXiv Detail & Related papers (2023-12-20T22:36:37Z) - Tactile-Filter: Interactive Tactile Perception for Part Mating [54.46221808805662]
Humans rely on touch and tactile sensing for many dexterous manipulation tasks.
Vision-based tactile sensors are now widely used for various robotic perception and control tasks.
We present a method for interactive perception using vision-based tactile sensors for a part mating task.
arXiv Detail & Related papers (2023-03-10T16:27:37Z) - Tac2Pose: Tactile Object Pose Estimation from the First Touch [6.321662423735226]
We present Tac2Pose, an object-specific approach to tactile pose estimation from the first touch for known objects.
We simulate the contact shapes that a dense set of object poses would produce on the sensor.
We obtain contact shapes from the sensor with an object-agnostic calibration step that maps RGB tactile observations to binary contact shapes.
arXiv Detail & Related papers (2022-04-25T14:43:48Z) - Dynamic Modeling of Hand-Object Interactions via Tactile Sensing [133.52375730875696]
In this work, we employ a high-resolution tactile glove to perform four different interactive activities on a diversified set of objects.
We build our model on a cross-modal learning framework and generate the labels using a visual processing pipeline to supervise the tactile model.
This work takes a step toward dynamics modeling of hand-object interactions from dense tactile sensing.
arXiv Detail & Related papers (2021-09-09T16:04:14Z) - Elastic Tactile Simulation Towards Tactile-Visual Perception [58.44106915440858]
We propose Elastic Interaction of Particles (EIP) for tactile simulation.
EIP models the tactile sensor as a group of coordinated particles, and the elastic property is applied to regulate the deformation of particles during contact.
We further propose a tactile-visual perception network that enables information fusion between tactile data and visual images.
arXiv Detail & Related papers (2021-08-11T03:49:59Z) - Probabilistic and Geometric Depth: Detecting Objects in Perspective [78.00922683083776]
3D object detection is an important capability needed in various practical applications such as driver assistance systems.
Monocular 3D detection, as an economical solution compared to conventional settings relying on binocular vision or LiDAR, has drawn increasing attention recently but still yields unsatisfactory results.
This paper first presents a systematic study on this problem and observes that the current monocular 3D detection problem can be simplified as an instance depth estimation problem.
arXiv Detail & Related papers (2021-07-29T16:30:33Z) - Active 3D Shape Reconstruction from Vision and Touch [66.08432412497443]
Humans build 3D understandings of the world through active object exploration, using jointly their senses of vision and touch.
In 3D shape reconstruction, most recent progress has relied on static datasets of limited sensory data such as RGB images, depth maps or haptic readings.
We introduce a system composed of: 1) a haptic simulator leveraging high spatial resolution vision-based tactile sensors for active touching of 3D objects; 2) a mesh-based 3D shape reconstruction model that relies on tactile or visuotactile priors to guide the shape exploration; and 3) a set of data-driven solutions with either tactile or visuotactile priors.
arXiv Detail & Related papers (2021-07-20T15:56:52Z) - Tactile Object Pose Estimation from the First Touch with Geometric Contact Rendering [19.69677059281393]
We present an approach to tactile pose estimation from the first touch for known objects.
We create an object-agnostic map from real tactile observations to contact shapes.
For a new object with known geometry, we learn a tailored perception model completely in simulation.
arXiv Detail & Related papers (2020-12-09T18:00:35Z) - Teaching Cameras to Feel: Estimating Tactile Physical Properties of Surfaces From Images [4.666400601228301]
We introduce the challenging task of estimating a set of tactile physical properties from visual information.
We construct a first of its kind image-tactile dataset with over 400 multiview image sequences and the corresponding tactile properties.
We develop a cross-modal framework comprised of an adversarial objective and a novel visuo-tactile joint classification loss.
arXiv Detail & Related papers (2020-04-29T21:27:26Z)