TactileSGNet: A Spiking Graph Neural Network for Event-based Tactile
Object Recognition
- URL: http://arxiv.org/abs/2008.08046v1
- Date: Sat, 1 Aug 2020 03:35:15 GMT
- Title: TactileSGNet: A Spiking Graph Neural Network for Event-based Tactile
Object Recognition
- Authors: Fuqiang Gu, Weicong Sng, Tasbolat Taunyazov and Harold Soh
- Abstract summary: New advances in flexible, event-driven, electronic skins may soon endow robots with touch perception capabilities similar to humans.
These unique features may render current deep learning approaches such as convolutional feature extractors unsuitable for tactile learning.
We propose a novel spiking graph neural network for event-based tactile object recognition.
- Score: 17.37142241982902
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Tactile perception is crucial for a variety of robot tasks including grasping
and in-hand manipulation. New advances in flexible, event-driven, electronic
skins may soon endow robots with touch perception capabilities similar to
humans. These electronic skins respond asynchronously to changes (e.g., in
pressure, temperature), and can be laid out irregularly on the robot's body or
end-effector. However, these unique features may render current deep learning
approaches such as convolutional feature extractors unsuitable for tactile
learning. In this paper, we propose a novel spiking graph neural network for
event-based tactile object recognition. To make use of local connectivity of
taxels, we present several methods for organizing the tactile data in a graph
structure. Based on the constructed graphs, we develop a spiking graph
convolutional network. The event-driven nature of spiking neural networks makes
them arguably more suitable for processing event-based data. Experimental
results on two tactile datasets show that the proposed method outperforms other
state-of-the-art spiking methods, achieving high accuracies of approximately
90% when classifying a variety of different household objects.
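Below is a minimal sketch, assuming PyTorch, of the two ingredients the abstract describes: organizing taxels into a graph (here via a simple k-nearest-neighbour rule over 2-D taxel coordinates, one of several possible graph constructions) and a graph convolution whose output drives leaky integrate-and-fire spiking dynamics. The taxel count, k, threshold, and decay values are illustrative assumptions; this is not the authors' TactileSGNet implementation.

```python
# Illustrative sketch (not the authors' code): builds a k-nearest-neighbour graph
# over 2-D taxel coordinates and applies one spiking graph-convolution layer.
# The LIF parameters (v_threshold, v_decay), k, and taxel count are assumed values.
import torch

def knn_adjacency(coords: torch.Tensor, k: int = 3) -> torch.Tensor:
    """Symmetric, row-normalised adjacency built from taxel coordinates (N, 2)."""
    dist = torch.cdist(coords, coords)                    # pairwise distances
    idx = dist.topk(k + 1, largest=False).indices[:, 1:]  # k nearest, skip self
    n = coords.size(0)
    adj = torch.zeros(n, n)
    adj.scatter_(1, idx, 1.0)
    adj = ((adj + adj.t()) > 0).float() + torch.eye(n)    # symmetrise + self loops
    deg = adj.sum(dim=1, keepdim=True)
    return adj / deg                                       # row-normalise

class SpikingGraphConv(torch.nn.Module):
    """One graph convolution whose output feeds leaky integrate-and-fire neurons."""
    def __init__(self, in_feats, out_feats, v_threshold=1.0, v_decay=0.9):
        super().__init__()
        self.lin = torch.nn.Linear(in_feats, out_feats)
        self.v_threshold, self.v_decay = v_threshold, v_decay

    def forward(self, spikes_seq, adj):
        # spikes_seq: (T, N, in_feats) binary spike events per time step
        v = torch.zeros(spikes_seq.size(1), self.lin.out_features)
        out = []
        for x_t in spikes_seq:
            current = adj @ self.lin(x_t)        # aggregate neighbouring taxels
            v = self.v_decay * v + current       # leaky membrane integration
            spk = (v >= self.v_threshold).float()
            v = v * (1.0 - spk)                  # hard reset after a spike
            out.append(spk)
        return torch.stack(out)                  # (T, N, out_feats)

# Toy usage: 39 taxels, 50 time steps, 1 input channel (assumed dimensions)
coords = torch.rand(39, 2)
adj = knn_adjacency(coords, k=3)
events = (torch.rand(50, 39, 1) < 0.1).float()
layer = SpikingGraphConv(1, 16)
print(layer(events, adj).shape)  # torch.Size([50, 39, 16])
```

Training such a layer end-to-end would additionally require a surrogate gradient for the non-differentiable spike threshold, which this forward-only sketch omits.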
Related papers
- RoboPack: Learning Tactile-Informed Dynamics Models for Dense Packing [38.97168020979433]
We introduce an approach that combines visual and tactile sensing for robotic manipulation by learning a neural, tactile-informed dynamics model.
Our proposed framework, RoboPack, employs a recurrent graph neural network to estimate object states.
We demonstrate our approach on a real robot equipped with a compliant Soft-Bubble tactile sensor on non-prehensile manipulation and dense packing tasks.
arXiv Detail & Related papers (2024-07-01T16:08:37Z)
- Physics-Encoded Graph Neural Networks for Deformation Prediction under Contact [87.69278096528156]
In robotics, it's crucial to understand object deformation during tactile interactions.
We introduce a method using Physics-Encoded Graph Neural Networks (GNNs) for such predictions.
We've made our code and dataset public to advance research in robotic simulation and grasping.
arXiv Detail & Related papers (2024-02-05T19:21:52Z)
- Tactile-Filter: Interactive Tactile Perception for Part Mating [54.46221808805662]
Humans rely on touch and tactile sensing for many dexterous manipulation tasks.
Vision-based tactile sensors are widely used for various robotic perception and control tasks.
We present a method for interactive perception using vision-based tactile sensors for a part mating task.
arXiv Detail & Related papers (2023-03-10T16:27:37Z)
- Graph Neural Networks with Trainable Adjacency Matrices for Fault Diagnosis on Multivariate Sensor Data [69.25738064847175]
The signal from each sensor must be considered separately, taking into account correlations and hidden relationships between sensors.
The graph nodes can be represented as data from the different sensors, and the edges can display the influence of these data on each other.
The authors propose constructing the graph during training of the graph neural network, which allows models to be trained on data where the dependencies between the sensors are not known in advance (a minimal sketch of this idea follows below).
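As a rough illustration of the trainable-adjacency idea above (a sketch under assumed details, not the cited paper's implementation): assuming PyTorch, a dense sigmoid-parameterised adjacency matrix can be made a learnable parameter of a graph layer, so that sensor-to-sensor dependencies are learned jointly with the task.

```python
# Illustrative sketch (not the cited paper's code): a graph layer whose adjacency
# matrix is a trainable parameter, so sensor-to-sensor influence is learned from
# data instead of being specified in advance. Shapes and the sigmoid
# parameterisation are assumptions made for the example.
import torch

class TrainableAdjacencyGraphLayer(torch.nn.Module):
    def __init__(self, num_sensors: int, in_feats: int, out_feats: int):
        super().__init__()
        # One learnable logit per directed sensor pair.
        self.adj_logits = torch.nn.Parameter(torch.zeros(num_sensors, num_sensors))
        self.lin = torch.nn.Linear(in_feats, out_feats)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, num_sensors, in_feats) windows of multivariate sensor data
        adj = torch.sigmoid(self.adj_logits)       # soft adjacency in [0, 1]
        adj = adj / adj.sum(dim=-1, keepdim=True)  # row-normalise
        return torch.relu(adj @ self.lin(x))       # aggregate over learned neighbours

# Toy usage: 12 sensors, 8 input features per sensor (assumed dimensions)
layer = TrainableAdjacencyGraphLayer(num_sensors=12, in_feats=8, out_feats=16)
out = layer(torch.randn(4, 12, 8))
print(out.shape)  # torch.Size([4, 12, 16])
```

Because the adjacency logits receive gradients through the task loss, no prior knowledge of the sensor dependencies is needed; a sparsity penalty on the adjacency could be added if a sparser graph is desired.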
arXiv Detail & Related papers (2022-10-20T11:03:21Z)
- Tactile-ViewGCN: Learning Shape Descriptor from Tactile Data using Graph Convolutional Network [0.4189643331553922]
This work focuses on improving previous work on object classification using tactile data.
We propose a novel method, dubbed Tactile-ViewGCN, that hierarchically aggregates tactile features.
Our model outperforms previous methods on the STAG dataset with an accuracy of 81.82%.
arXiv Detail & Related papers (2022-03-12T05:58:21Z)
- Self-Supervised Graph Representation Learning for Neuronal Morphologies [75.38832711445421]
We present GraphDINO, a data-driven approach to learn low-dimensional representations of 3D neuronal morphologies from unlabeled datasets.
We show, in two different species and across multiple brain areas, that this method yields morphological cell type clusterings on par with manual feature-based classification by experts.
Our method could potentially enable data-driven discovery of novel morphological features and cell types in large-scale datasets.
arXiv Detail & Related papers (2021-12-23T12:17:47Z)
- A Variational Graph Autoencoder for Manipulation Action Recognition and Prediction [1.1816942730023883]
We introduce a deep graph autoencoder to jointly learn recognition and prediction of manipulation tasks from symbolic scene graphs.
Our network has a variational autoencoder structure with two branches: one for identifying the input graph type and one for predicting the future graphs.
We benchmark our new model against different state-of-the-art methods on two different datasets, MANIAC and MSRC-9, and show that our proposed model can achieve better performance.
arXiv Detail & Related papers (2021-10-25T21:40:42Z)
- Dynamic Modeling of Hand-Object Interactions via Tactile Sensing [133.52375730875696]
In this work, we employ a high-resolution tactile glove to perform four different interactive activities on a diversified set of objects.
We build our model on a cross-modal learning framework and generate the labels using a visual processing pipeline to supervise the tactile model.
This work takes a step toward dynamics modeling of hand-object interactions from dense tactile sensing.
arXiv Detail & Related papers (2021-09-09T16:04:14Z)
- Elastic Tactile Simulation Towards Tactile-Visual Perception [58.44106915440858]
We propose Elastic Interaction of Particles (EIP) for tactile simulation.
EIP models the tactile sensor as a group of coordinated particles, and the elastic property is applied to regulate the deformation of particles during contact.
We further propose a tactile-visual perception network that enables information fusion between tactile data and visual images.
arXiv Detail & Related papers (2021-08-11T03:49:59Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this information and is not responsible for any consequences of its use.