STEMNIST: Spiking Tactile Extended MNIST Neuromorphic Dataset
- URL: http://arxiv.org/abs/2601.01658v1
- Date: Sun, 04 Jan 2026 20:26:55 GMT
- Title: STEMNIST: Spiking Tactile Extended MNIST Neuromorphic Dataset
- Authors: Anubhab Tripathi, Li Gaishan, Zhengnan Fu, Chiara Bartolozzi, Bert E. Shi, Arindam Basu
- Abstract summary: STEMNIST is a large-scale neuromorphic tactile dataset extending ST-MNIST from 10 digits to 35 alphanumeric classes. The dataset comprises 7,700 samples collected from 34 participants using a custom tactile sensor array operating at 120 Hz.
- Score: 6.8584962606447535
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Tactile sensing is essential for robotic manipulation, prosthetics and assistive technologies, yet neuromorphic tactile datasets remain limited compared to their visual counterparts. We introduce STEMNIST, a large-scale neuromorphic tactile dataset extending ST-MNIST from 10 digits to 35 alphanumeric classes (uppercase letters A--Z and digits 1--9), providing a challenging benchmark for event-based haptic recognition. The dataset comprises 7,700 samples collected from 34 participants using a custom \(16\times 16\) tactile sensor array operating at 120 Hz, encoded as 1,005,592 spike events through adaptive temporal differentiation. Following EMNIST's visual character recognition protocol, STEMNIST addresses the critical gap between simplified digit classification and real-world tactile interaction scenarios requiring alphanumeric discrimination. Baseline experiments using conventional CNNs (90.91% test accuracy) and spiking neural networks (89.16%) establish performance benchmarks. The dataset's event-based format, unrestricted spatial variability and rich temporal structure make it suitable for testing neuromorphic hardware and bio-inspired learning algorithms. STEMNIST enables reproducible evaluation of tactile recognition systems and provides a foundation for advancing energy-efficient neuromorphic perception in robotics, biomedical engineering and human-machine interfaces. The dataset, documentation and code are publicly available to accelerate research in neuromorphic tactile computing.
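The abstract describes encoding 120 Hz frames from a 16x16 tactile array into spike events through adaptive temporal differentiation. The paper's exact scheme is not detailed here; the following is a minimal sketch of one plausible event encoder, where each taxel emits an ON/OFF event when its pressure change exceeds a threshold and then updates its reference level (the `threshold` value and the reset-on-spike adaptation rule are assumptions, not taken from the paper):

```python
import numpy as np

def encode_spikes(frames, threshold=0.05, fs=120.0):
    """Convert a stream of 16x16 pressure frames into (t, row, col, polarity)
    spike events via temporal differencing. Hypothetical sketch: the threshold
    and adaptation rule are illustrative assumptions."""
    events = []
    ref = frames[0].astype(float)  # per-taxel adaptive reference level
    for k, frame in enumerate(frames[1:], start=1):
        diff = frame.astype(float) - ref
        on = diff > threshold          # pressure rose past the threshold
        off = diff < -threshold        # pressure fell past the threshold
        t = k / fs                     # timestamp in seconds at 120 Hz
        for r, c in zip(*np.nonzero(on)):
            events.append((t, int(r), int(c), +1))
        for r, c in zip(*np.nonzero(off)):
            events.append((t, int(r), int(c), -1))
        # adaptive step: taxels that fired reset their reference level,
        # so a sustained press produces one event, not one per frame
        ref = np.where(on | off, frame.astype(float), ref)
    return events
```

Under this scheme a steady press emits a single ON event at onset rather than a spike on every frame, which is the kind of sparsification that keeps the event count (about one million events over 7,700 samples) far below the raw frame-rate data volume.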
Related papers
- SensorLM: Learning the Language of Wearable Sensors [50.95988682423808]
We present SensorLM, a family of sensor-language foundation models that enable wearable sensor data understanding with natural language. We introduce a hierarchical caption generation pipeline designed to capture statistical, structural, and semantic information from sensor data. This approach enabled the curation of the largest sensor-language dataset to date, comprising over 59.7 million hours of data from more than 103,000 people.
arXiv Detail & Related papers (2025-06-10T17:13:09Z)
- MT-NAM: An Efficient and Adaptive Model for Epileptic Seizure Detection [51.87482627771981]
Micro Tree-based NAM (MT-NAM) is a distilled model based on the recently proposed Neural Additive Models (NAM). MT-NAM achieves a remarkable 100$\times$ improvement in inference speed compared to standard NAM, without compromising accuracy. We evaluate our approach on the CHB-MIT scalp EEG dataset, which includes recordings from 24 patients with varying numbers of sessions and seizures.
arXiv Detail & Related papers (2025-03-11T10:14:53Z)
- Reservoir Network with Structural Plasticity for Human Activity Recognition [2.355460994057843]
Echo state network (ESN) is a class of recurrent neural networks that can be used to identify unique patterns in time-series data and predict future events. In this work, a custom-designed neuromorphic chip based on ESN targeting edge devices is proposed. The proposed system supports various learning mechanisms, including structural plasticity and synaptic plasticity, locally on-chip.
arXiv Detail & Related papers (2025-03-01T07:57:22Z)
- emg2qwerty: A Large Dataset with Baselines for Touch Typing using Surface Electromyography [47.160223334501126]
emg2qwerty is a large-scale dataset of non-invasive electromyographic signals recorded at the wrists while touch typing on a QWERTY keyboard. With 1,135 sessions spanning 108 users and 346 hours of recording, this is the largest such public dataset to date. We show strong baseline performance on predicting key-presses using sEMG signals alone.
arXiv Detail & Related papers (2024-10-26T05:18:48Z)
- BeCAPTCHA-Type: Biometric Keystroke Data Generation for Improved Bot Detection [63.447493500066045]
This work proposes a data-driven learning model for the synthesis of keystroke biometric data.
The proposed method is compared with two statistical approaches based on Universal and User-dependent models.
Our experimental framework considers a dataset with 136 million keystroke events from 168 thousand subjects.
arXiv Detail & Related papers (2022-07-27T09:26:15Z)
- Braille Letter Reading: A Benchmark for Spatio-Temporal Pattern Recognition on Neuromorphic Hardware [50.380319968947035]
Recent deep learning approaches have reached high accuracy in such tasks, but their implementation on conventional embedded solutions remains computationally and energy expensive.
We propose a new benchmark for computing tactile pattern recognition at the edge through letters reading.
We trained and compared feed-forward and recurrent spiking neural networks (SNNs) offline using back-propagation through time with surrogate gradients, then deployed them on the Intel Loihi neuromorphic chip for efficient inference.
Our results show that the LSTM outperforms the recurrent SNN in accuracy by 14%; however, the recurrent SNN on Loihi is 237 times more energy efficient.
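Back-propagation through time with surrogate gradients, as used above, handles the non-differentiable spike threshold by simulating a hard threshold in the forward pass while substituting a smooth derivative in the backward pass. A minimal sketch of the two ingredients (the fast-sigmoid surrogate shape, `beta`, and the leaky integrate-and-fire parameters are common illustrative choices, not taken from the paper):

```python
import numpy as np

def surrogate_grad(v, v_th=1.0, beta=10.0):
    """Fast-sigmoid surrogate for the derivative of the spike step function:
    d(spike)/dv ~ 1 / (1 + beta*|v - v_th|)^2. One common choice; the paper's
    exact surrogate shape is an assumption here."""
    return 1.0 / (1.0 + beta * np.abs(v - v_th)) ** 2

def lif_forward(inputs, tau=0.9, v_th=1.0):
    """Simulate a single leaky integrate-and-fire neuron over `inputs`
    (1-D array of input currents, one per timestep)."""
    v, spikes, trace = 0.0, [], []
    for i in inputs:
        v = tau * v + i                  # leaky integration of input current
        s = 1.0 if v >= v_th else 0.0    # hard threshold in the forward pass
        v = v * (1.0 - s)                # reset the membrane after a spike
        spikes.append(s)
        trace.append(v)
    return np.array(spikes), np.array(trace)
```

During training, the backward pass would use `surrogate_grad(v)` wherever the chain rule needs the derivative of the threshold, which is zero almost everywhere and therefore useless for gradient descent without this substitution.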
arXiv Detail & Related papers (2022-05-30T14:30:45Z)
- Mobile Behavioral Biometrics for Passive Authentication [65.94403066225384]
This work carries out a comparative analysis of unimodal and multimodal behavioral biometric traits.
Experiments are performed over HuMIdb, one of the largest and most comprehensive freely available mobile user interaction databases.
In our experiments, the most discriminative background sensor is the magnetometer, whereas among touch tasks the best results are achieved with keystroke.
arXiv Detail & Related papers (2022-03-14T17:05:59Z)
- Object recognition for robotics from tactile time series data utilising different neural network architectures [0.0]
This paper investigates the use of Convolutional Neural Networks (CNN) and Long-Short Term Memory (LSTM) neural network architectures for object classification on tactile data.
We compare these methods using data from two different fingertip sensors (namely the BioTac SP and WTS-FT) in the same physical setup.
The results show that the proposed method improves the maximum accuracy from 82.4% (BioTac SP fingertips) and 90.7% (WTS-FT fingertips) with complete time-series data to about 94% for both sensor types.
arXiv Detail & Related papers (2021-09-09T22:05:45Z)
- Real-Time Activity Recognition and Intention Recognition Using a Vision-based Embedded System [4.060731229044571]
We introduce a real-time activity recognition system that recognizes people's intentions to pass or not pass a door.
If applied in elevators and automatic doors, this system will save energy and increase efficiency.
Our embedded system achieves an accuracy of 98.78% on our Intention Recognition dataset.
arXiv Detail & Related papers (2021-07-27T11:38:44Z)
- ST-MNIST -- The Spiking Tactile MNIST Neuromorphic Dataset [13.270250399169104]
We debut a novel neuromorphic Spiking Tactile MNIST dataset, which comprises handwritten digits obtained by human participants writing on a tactile neuromorphic sensor array.
We also describe an initial effort to evaluate our ST-MNIST dataset using existing artificial spiking and neural network models.
arXiv Detail & Related papers (2020-05-08T23:44:14Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences of its use.