Bioformers: Embedding Transformers for Ultra-Low Power sEMG-based
Gesture Recognition
- URL: http://arxiv.org/abs/2203.12932v2
- Date: Fri, 25 Mar 2022 15:54:59 GMT
- Title: Bioformers: Embedding Transformers for Ultra-Low Power sEMG-based
Gesture Recognition
- Authors: Alessio Burrello, Francesco Bianco Morghet, Moritz Scherer, Simone
Benatti, Luca Benini, Enrico Macii, Massimo Poncino, Daniele Jahier Pagliari
- Abstract summary: Human-machine interaction is gaining traction in rehabilitation tasks, such as controlling prosthetic hands or robotic arms.
Gesture recognition exploiting surface electromyographic (sEMG) signals is one of the most promising approaches.
However, the analysis of these signals still presents many challenges since similar gestures result in similar muscle contractions.
- Score: 21.486555297061717
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: Human-machine interaction is gaining traction in rehabilitation tasks, such
as controlling prosthetic hands or robotic arms. Gesture recognition exploiting
surface electromyographic (sEMG) signals is one of the most promising
approaches, given that sEMG signal acquisition is non-invasive and is directly
related to muscle contraction. However, the analysis of these signals still
presents many challenges since similar gestures result in similar muscle
contractions. Thus the resulting signal shapes are almost identical, leading to
low classification accuracy. To tackle this challenge, complex neural networks
are employed, which require large memory footprints, consume relatively high
energy and limit the maximum battery life of devices used for classification.
This work addresses this problem with the introduction of the Bioformers. This
new family of ultra-small attention-based architectures approaches
state-of-the-art performance while reducing the number of parameters and
operations by 4.9X. Additionally, by introducing a new inter-subject
pre-training scheme, we improve the accuracy of our best Bioformer by 3.39%, matching
state-of-the-art accuracy without any additional inference cost. Deploying our
best performing Bioformer on a Parallel, Ultra-Low Power (PULP) microcontroller
unit (MCU), the GreenWaves GAP8, we achieve an inference latency and energy of
2.72 ms and 0.14 mJ, respectively, 8.0X lower than the previous
state-of-the-art neural network, while occupying just 94.2 kB of memory.
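The Bioformer architecture itself is not reproduced in this listing, but the attention primitive such ultra-small transformers build on can be sketched briefly. The following dependency-free Python sketch is illustrative only: the shapes (four time patches of a 2-channel sEMG window) and the identity projections standing in for learned weight matrices are hypothetical, not taken from the paper.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of floats."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def self_attention(tokens, wq, wk, wv):
    """Single-head scaled dot-product self-attention.

    tokens: list of d-dim vectors (e.g. one per sEMG time patch)
    wq, wk, wv: d x d projection matrices (lists of rows)
    Returns one attended d-dim vector per input token.
    """
    def matvec(w, x):
        return [sum(wi * xi for wi, xi in zip(row, x)) for row in w]

    d = len(tokens[0])
    q = [matvec(wq, t) for t in tokens]
    k = [matvec(wk, t) for t in tokens]
    v = [matvec(wv, t) for t in tokens]
    out = []
    for qi in q:
        # Attention weights: scaled dot products of this query with all keys.
        scores = [sum(a * b for a, b in zip(qi, kj)) / math.sqrt(d) for kj in k]
        weights = softmax(scores)
        # Output is the weights-convex combination of the value vectors.
        out.append([sum(w * vj[j] for w, vj in zip(weights, v)) for j in range(d)])
    return out

# Toy example: 4 "patches" of a 2-channel sEMG window, identity projections.
patches = [[0.1, 0.0], [0.9, 0.2], [0.8, 0.1], [0.0, 0.0]]
eye = [[1.0, 0.0], [0.0, 1.0]]
attended = self_attention(patches, eye, eye, eye)
print(len(attended), len(attended[0]))  # 4 tokens, 2 dims each
```

Because each output is a convex combination of the value vectors, every attended coordinate stays within the range spanned by the inputs; the small parameter counts reported above come from shrinking d, the number of heads, and the number of layers of exactly this building block.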
Related papers
- Biological Processing Units: Leveraging an Insect Connectome to Pioneer Biofidelic Neural Architectures [5.459524614513537]
The complete connectome of the Drosophila larva brain offers a unique opportunity to investigate whether biologically evolved circuits can support artificial intelligence. We convert this wiring diagram into a Biological Processing Unit (BPU), a fixed network derived directly from synaptic connectivity. Despite its modest size of 3,000 neurons and 65,000 weights between them, the unmodified BPU achieves 98% accuracy on MNIST and 58% on CIFAR-10, surpassing size-matched baselines. These results demonstrate the potential of biofidelic neural architectures to support complex cognitive tasks and motivate scaling to larger and more intelligent connectomes in future work.
arXiv Detail & Related papers (2025-07-15T03:31:57Z) - WaveFormer: A Lightweight Transformer Model for sEMG-based Gesture Recognition [18.978031999678507]
WaveFormer is a lightweight transformer-based architecture tailored for sEMG gesture recognition. Our model integrates time-domain and frequency-domain features through a novel learnable wavelet transform, enhancing feature extraction. With just 3.1 million parameters, WaveFormer achieves 95% classification accuracy on the EPN612 dataset, outperforming larger models.
arXiv Detail & Related papers (2025-06-12T04:07:11Z) - BRAVE: Brain-Controlled Prosthetic Arm with Voice Integration and Embodied Learning for Enhanced Mobility [5.528262076322921]
BRAVE is a hybrid EEG and voice-controlled prosthetic system. It aims to interpret EEG-driven motor intent, enabling movement control without reliance on residual muscle activity. The system operates in real time, with a response latency of 150 ms.
arXiv Detail & Related papers (2025-05-23T11:44:33Z) - BrainOmni: A Brain Foundation Model for Unified EEG and MEG Signals [50.76802709706976]
This paper proposes Brain Omni, the first brain foundation model that generalises across heterogeneous EEG and MEG recordings. To unify diverse data sources, we introduce BrainTokenizer, the first tokenizer that quantises neural brain activity into discrete representations. A total of 1,997 hours of EEG and 656 hours of MEG data are curated and standardised from publicly available sources for pretraining.
arXiv Detail & Related papers (2025-05-18T14:07:14Z) - CEReBrO: Compact Encoder for Representations of Brain Oscillations Using Efficient Alternating Attention [53.539020807256904]
We introduce a Compact Encoder for Representations of Brain Oscillations using alternating attention (CEReBrO).
Our tokenization scheme represents EEG signals at a per-channel patch granularity.
We propose an alternating attention mechanism that jointly models intra-channel temporal dynamics and inter-channel spatial correlations, achieving 2x speed improvement with 6x less memory required compared to standard self-attention.
arXiv Detail & Related papers (2025-01-18T21:44:38Z) - emg2qwerty: A Large Dataset with Baselines for Touch Typing using Surface Electromyography [47.160223334501126]
emg2qwerty is a large-scale dataset of non-invasive electromyographic signals recorded at the wrists while touch typing on a QWERTY keyboard. With 1,135 sessions spanning 108 users and 346 hours of recording, this is the largest such public dataset to date. We show strong baseline performance on predicting key-presses using sEMG signals alone.
arXiv Detail & Related papers (2024-10-26T05:18:48Z) - Spatial Adaptation Layer: Interpretable Domain Adaptation For Biosignal Sensor Array Applications [0.7499722271664147]
Biosignal acquisition is key for healthcare applications and wearable devices.
Existing solutions often require large and expensive datasets and/or lack robustness and interpretability.
We propose the Spatial Adaptation Layer (SAL), which can be prepended to any biosignal array model.
We also introduce learnable baseline normalization (LBN) to reduce baseline fluctuations.
arXiv Detail & Related papers (2024-09-12T14:06:12Z) - Hybrid Spiking Neural Networks for Low-Power Intra-Cortical Brain-Machine Interfaces [42.72938925647165]
Intra-cortical brain-machine interfaces (iBMIs) have the potential to dramatically improve the lives of people with paraplegia.
Current iBMIs suffer from scalability and mobility limitations due to bulky hardware and wiring.
We are investigating hybrid spiking neural networks for embedded neural decoding in wireless iBMIs.
arXiv Detail & Related papers (2024-09-06T17:48:44Z) - MS-MANO: Enabling Hand Pose Tracking with Biomechanical Constraints [50.61346764110482]
We integrate a musculoskeletal system with a learnable parametric hand model, MANO, to create MS-MANO.
This model emulates the dynamics of muscles and tendons to drive the skeletal system, imposing physiologically realistic constraints on the resulting torque trajectories.
We also propose a simulation-in-the-loop pose refinement framework, BioPR, that refines the initial estimated pose through a multi-layer perceptron network.
arXiv Detail & Related papers (2024-04-16T02:18:18Z) - Single Neuromorphic Memristor closely Emulates Multiple Synaptic
Mechanisms for Energy Efficient Neural Networks [71.79257685917058]
We demonstrate memristive nano-devices based on SrTiO3 that inherently emulate all these synaptic functions.
These memristors operate in a non-filamentary, low conductance regime, which enables stable and energy efficient operation.
arXiv Detail & Related papers (2024-02-26T15:01:54Z) - A Tiny Transformer for Low-Power Arrhythmia Classification on Microcontrollers [10.203375838335935]
The transformer machine learning model is a promising approach for real-time analysis of the electrocardiographic (ECG) signal and the detection of heart conditions, such as arrhythmia.
We present a tiny transformer model for the analysis of the ECG signal, requiring only 6k parameters and reaching 98.97% accuracy in the recognition of the 5 most common arrhythmia classes from the MIT-BIH Arrhythmia database.
arXiv Detail & Related papers (2024-02-16T15:14:16Z) - Online Transformers with Spiking Neurons for Fast Prosthetic Hand
Control [1.6114012813668934]
In this paper, instead of the self-attention mechanism, we use a sliding window attention mechanism.
We show that this mechanism is more efficient for continuous signals with finite-range dependencies between input and target.
Our results hold great promises for accurate and fast online processing of sEMG signals for smooth prosthetic hand control.
arXiv Detail & Related papers (2023-03-21T13:59:35Z) - The Lazy Neuron Phenomenon: On Emergence of Activation Sparsity in
Transformers [59.87030906486969]
This paper studies the curious phenomenon for machine learning models with Transformer architectures that their activation maps are sparse.
We show that sparsity is a prevalent phenomenon that occurs for both natural language processing and vision tasks.
We discuss how sparsity immediately implies a way to significantly reduce the FLOP count and improve efficiency for Transformers.
arXiv Detail & Related papers (2022-10-12T15:25:19Z) - Multiple Time Series Fusion Based on LSTM An Application to CAP A Phase
Classification Using EEG [56.155331323304]
Deep learning based electroencephalogram channels' feature level fusion is carried out in this work.
Channel selection, fusion, and classification procedures were optimized by two optimization algorithms.
arXiv Detail & Related papers (2021-12-18T14:17:49Z) - SOUL: An Energy-Efficient Unsupervised Online Learning Seizure Detection
Classifier [68.8204255655161]
Implantable devices that record neural activity and detect seizures have been adopted to issue warnings or trigger neurostimulation to suppress seizures.
For an implantable seizure detection system, a low power, at-the-edge, online learning algorithm can be employed to dynamically adapt to neural signal drifts.
SOUL was fabricated in TSMC's 28 nm process occupying 0.1 mm² and achieves 1.5 nJ/classification energy efficiency, which is at least 24x more efficient than the state of the art.
arXiv Detail & Related papers (2021-10-01T23:01:20Z) - Video-based Remote Physiological Measurement via Cross-verified Feature
Disentangling [121.50704279659253]
We propose a cross-verified feature disentangling strategy to disentangle the physiological features with non-physiological representations.
We then use the distilled physiological features for robust multi-task physiological measurements.
The disentangled features are finally used for the joint prediction of multiple physiological signals like average HR values and rPPG signals.
arXiv Detail & Related papers (2020-07-16T09:39:17Z) - sEMG Gesture Recognition with a Simple Model of Attention [0.0]
We present our research in surface electromyography (sEMG) signal classification.
Our novel attention-based model achieves benchmark leading results on multiple industry-standard datasets.
Our results indicate that sEMG represents a promising avenue for future machine learning research.
arXiv Detail & Related papers (2020-06-05T19:28:16Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.