ST-MNIST -- The Spiking Tactile MNIST Neuromorphic Dataset
- URL: http://arxiv.org/abs/2005.04319v1
- Date: Fri, 8 May 2020 23:44:14 GMT
- Title: ST-MNIST -- The Spiking Tactile MNIST Neuromorphic Dataset
- Authors: Hian Hian See, Brian Lim, Si Li, Haicheng Yao, Wen Cheng, Harold Soh,
and Benjamin C.K. Tee
- Abstract summary: We debut a novel neuromorphic Spiking Tactile MNIST dataset, which comprises handwritten digits obtained by human participants writing on a tactile neuromorphic sensor array.
We also describe an initial effort to evaluate our ST-MNIST dataset using existing artificial and spiking neural network models.
- Score: 13.270250399169104
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Tactile sensing is an essential modality for smart robots as it enables them
to interact flexibly with physical objects in their environment. Recent
advancements in electronic skins have led to the development of data-driven
machine learning methods that exploit this important sensory modality. However,
current datasets used to train such algorithms are limited to standard
synchronous tactile sensors. There is a dearth of neuromorphic event-based
tactile datasets, principally due to the scarcity of large-scale event-based
tactile sensors. Having such datasets is crucial for the development and
evaluation of new algorithms that process spatio-temporal event-based data. For
example, evaluating spiking neural networks on conventional frame-based
datasets is considered sub-optimal. Here, we debut a novel neuromorphic Spiking
Tactile MNIST (ST-MNIST) dataset, which comprises handwritten digits obtained
by human participants writing on a neuromorphic tactile sensor array. We also
describe an initial effort to evaluate our ST-MNIST dataset using existing
artificial and spiking neural network models. The classification accuracies
provided herein can serve as performance benchmarks for future work. We
anticipate that our ST-MNIST dataset will be of interest and useful to the
neuromorphic and robotics research communities.
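The abstract above mentions evaluating spiking neural network models on event-based data. As a minimal illustration of the kind of neuron such models are built from (not the authors' actual models — all names and constants here are illustrative assumptions), a discrete-time leaky integrate-and-fire (LIF) neuron can be simulated as follows:

```python
def lif_simulate(input_current, v_thresh=1.0, v_reset=0.0, leak=0.9):
    """Simulate a leaky integrate-and-fire neuron over discrete time steps.

    input_current: list of input values, one per time step.
    Returns a list of emitted spikes (0 or 1 per step).
    """
    v = 0.0  # membrane potential
    spikes = []
    for i in input_current:
        v = leak * v + i      # leaky integration of the input
        if v >= v_thresh:     # threshold crossing -> emit a spike
            spikes.append(1)
            v = v_reset       # reset the membrane potential
        else:
            spikes.append(0)
    return spikes
```

Feeding a constant sub-threshold current, the neuron charges up over a few steps and then fires periodically, which is the basic temporal behavior that event-based datasets such as ST-MNIST are designed to exercise.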
Related papers
- Deep Learning for real-time neural decoding of grasp [0.0]
We present a Deep Learning-based approach to the decoding of neural signals for grasp type classification.
The main goal of the presented approach is to improve over state-of-the-art decoding accuracy without relying on any prior neuroscience knowledge.
arXiv Detail & Related papers (2023-11-02T08:26:29Z)
- WaLiN-GUI: a graphical and auditory tool for neuron-based encoding [73.88751967207419]
Neuromorphic computing relies on spike-based, energy-efficient communication.
We develop a tool to identify suitable configurations for neuron-based encoding of sample-based data into spike trains.
The WaLiN-GUI is provided open source and with documentation.
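The WaLiN-GUI entry above concerns encoding sample-based data into spike trains. As a much simpler baseline than the tool's neuron-model-based encodings (this is standard rate coding, not the tool's method), a normalized value can be turned into a deterministic spike train like this:

```python
def rate_encode(sample, n_steps):
    """Rate-code a normalized sample value in [0, 1] into a spike train
    of length n_steps: larger values yield more spikes, spread evenly
    across the encoding window."""
    assert 0.0 <= sample <= 1.0
    n_spikes = round(sample * n_steps)
    train = [0] * n_steps
    # place the spikes at evenly spaced time steps
    for k in range(n_spikes):
        t = int(k * n_steps / n_spikes)
        train[t] = 1
    return train
```

For example, a sample of 0.5 over 10 steps produces 5 evenly spaced spikes; finding encodings better suited to a given dataset is exactly the configuration problem the GUI is meant to help with.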
arXiv Detail & Related papers (2023-10-25T20:34:08Z)
- Graph Neural Networks with Trainable Adjacency Matrices for Fault Diagnosis on Multivariate Sensor Data [69.25738064847175]
It is necessary to consider the behavior of the signals in each sensor separately, to take into account their correlation and hidden relationships with each other.
The graph nodes can be represented as data from the different sensors, and the edges can display the influence of these data on each other.
The authors propose constructing the graph during training of the graph neural network, which makes it possible to train models on data where the dependencies between the sensors are not known in advance.
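The core idea — treating the adjacency matrix itself as a trainable parameter — can be sketched in a few lines. This is a toy illustration under assumed names, not the paper's architecture: one dense message-passing step, plus one gradient-descent update of the adjacency entries for a squared-error loss.

```python
def graph_layer(adjacency, features):
    """One message-passing step with a dense, trainable adjacency matrix:
    each node's new value is a weighted sum of all node features, with
    the weights given by the (learned) adjacency entries."""
    n = len(features)
    return [sum(adjacency[i][j] * features[j] for j in range(n))
            for i in range(n)]

def adjacency_grad_step(adjacency, features, target, lr=0.1):
    """One gradient-descent update of the adjacency entries for a
    squared-error loss; since out_i = sum_j A[i][j] * x[j], the gradient
    of (out_i - t_i)^2 w.r.t. A[i][j] is 2 * (out_i - t_i) * x[j]."""
    out = graph_layer(adjacency, features)
    n = len(features)
    for i in range(n):
        err = out[i] - target[i]
        for j in range(n):
            adjacency[i][j] -= lr * 2 * err * features[j]
    return adjacency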
arXiv Detail & Related papers (2022-10-20T11:03:21Z)
- Event-Driven Tactile Learning with Various Location Spiking Neurons [5.822511654546528]
Event-driven learning is still in its infancy due to the limited representation abilities of existing spiking neurons.
We propose a novel "location spiking neuron" model, which enables us to extract features of event-based data in a novel way.
By exploiting the novel location spiking neurons, we propose several models to capture complex tactile-temporal dependencies in the event-driven data.
arXiv Detail & Related papers (2022-10-09T14:49:27Z)
- Braille Letter Reading: A Benchmark for Spatio-Temporal Pattern Recognition on Neuromorphic Hardware [50.380319968947035]
Recent deep learning approaches have reached high accuracy on such tasks, but their implementation on conventional embedded solutions remains computationally and energy expensive.
We propose a new benchmark for computing tactile pattern recognition at the edge through letters reading.
We trained and compared feed-forward and recurrent spiking neural networks (SNNs) offline using back-propagation through time with surrogate gradients, then deployed them on the Intel Loihi neuromorphic chip for efficient inference.
Our results show that the LSTM outperforms the recurrent SNN in terms of accuracy by 14%; however, the recurrent SNN on Loihi is 237 times more energy efficient.
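Training SNNs with back-propagation through time, as in the Braille entry above, relies on the surrogate-gradient trick: the spike is a hard threshold in the forward pass, but the backward pass substitutes a smooth surrogate derivative. A common choice is the derivative of a fast sigmoid (this is the generic technique, not necessarily the paper's exact surrogate; `beta` is an assumed steepness parameter):

```python
def spike_forward(v, thresh=1.0):
    """Forward pass: hard threshold on the membrane potential.
    Its true derivative is zero almost everywhere, so it cannot
    be back-propagated through directly."""
    return 1.0 if v >= thresh else 0.0

def spike_surrogate_grad(v, thresh=1.0, beta=10.0):
    """Backward pass: derivative of a fast sigmoid, used as a smooth
    surrogate for the threshold's gradient. It peaks at the threshold
    and decays as the potential moves away from it."""
    x = beta * (v - thresh)
    return beta / (1.0 + abs(x)) ** 2
```

During back-propagation through time, every occurrence of the threshold's derivative in the chain rule is replaced by `spike_surrogate_grad`, which lets gradient descent tunnel through the otherwise non-differentiable spiking nonlinearity.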
arXiv Detail & Related papers (2022-05-30T14:30:45Z)
- Data-driven emergence of convolutional structure in neural networks [83.4920717252233]
We show how fully-connected neural networks solving a discrimination task can learn a convolutional structure directly from their inputs.
By carefully designing data models, we show that the emergence of this pattern is triggered by the non-Gaussian, higher-order local structure of the inputs.
arXiv Detail & Related papers (2022-02-01T17:11:13Z)
- Object recognition for robotics from tactile time series data utilising different neural network architectures [0.0]
This paper investigates the use of Convolutional Neural Network (CNN) and Long Short-Term Memory (LSTM) neural network architectures for object classification on tactile data.
We compare these methods using data from two different fingertip sensors (namely the BioTac SP and WTS-FT) in the same physical setup.
The results show that the proposed method improves the maximum accuracy from 82.4% (BioTac SP fingertips) and 90.7% (WTS-FT fingertips) with complete time-series data to about 94% for both sensor types.
arXiv Detail & Related papers (2021-09-09T22:05:45Z)
- Controllable reset behavior in domain wall-magnetic tunnel junction artificial neurons for task-adaptable computation [1.4505273244528207]
Domain wall-magnetic tunnel junction (DW-MTJ) devices have been shown to be able to intrinsically capture biological neuron behavior.
We show that edgy-relaxed behavior can be implemented in DW-MTJ artificial neurons via three alternative mechanisms.
arXiv Detail & Related papers (2021-01-08T16:50:29Z)
- TactileSGNet: A Spiking Graph Neural Network for Event-based Tactile Object Recognition [17.37142241982902]
New advances in flexible, event-driven, electronic skins may soon endow robots with touch perception capabilities similar to humans.
These unique features may render current deep learning approaches such as convolutional feature extractors unsuitable for tactile learning.
We propose a novel spiking graph neural network for event-based tactile object recognition.
arXiv Detail & Related papers (2020-08-01T03:35:15Z)
- One-step regression and classification with crosspoint resistive memory arrays [62.997667081978825]
High speed, low energy computing machines are in demand to enable real-time artificial intelligence at the edge.
One-step learning is supported by simulations of the prediction of the cost of a house in Boston and the training of a 2-layer neural network for MNIST digit recognition.
Results are all obtained in one computational step, thanks to the physical, parallel, and analog computing within the crosspoint array.
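The crosspoint-array entry above obtains regression results in a single analog computational step. A digital stand-in for that one-shot solve (purely illustrative — the hardware computes the solution physically, not via this code) is the closed-form least-squares fit from the normal equations:

```python
def one_step_linear_fit(xs, ys):
    """Closed-form least-squares fit y ~ a*x + b, computed in a single
    step from the normal equations rather than by iterative training --
    a digital analogue of the one-shot solve inside a crosspoint array."""
    n = len(xs)
    sx = sum(xs)
    sy = sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)  # slope
    b = (sy - a * sx) / n                          # intercept
    return a, b
```

The contrast with gradient-based training is the point: the answer falls out of one closed-form evaluation, just as the array's physical, parallel, analog computation yields it in one step.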
arXiv Detail & Related papers (2020-05-05T08:00:07Z)
- Deep Learning based Pedestrian Inertial Navigation: Methods, Dataset and On-Device Inference [49.88536971774444]
Inertial measurements units (IMUs) are small, cheap, energy efficient, and widely employed in smart devices and mobile robots.
Exploiting inertial data for accurate and reliable pedestrian navigation support is a key component for emerging Internet-of-Things applications and services.
We present and release the Oxford Inertial Odometry dataset (OxIOD), a first-of-its-kind public dataset for deep learning based inertial navigation research.
arXiv Detail & Related papers (2020-01-13T04:41:54Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.