Bioformers: Embedding Transformers for Ultra-Low Power sEMG-based
Gesture Recognition
- URL: http://arxiv.org/abs/2203.12932v2
- Date: Fri, 25 Mar 2022 15:54:59 GMT
- Title: Bioformers: Embedding Transformers for Ultra-Low Power sEMG-based
Gesture Recognition
- Authors: Alessio Burrello, Francesco Bianco Morghet, Moritz Scherer, Simone
Benatti, Luca Benini, Enrico Macii, Massimo Poncino, Daniele Jahier Pagliari
- Abstract summary: Human-machine interaction is gaining traction in rehabilitation tasks, such as controlling prosthetic hands or robotic arms.
Gesture recognition exploiting surface electromyographic (sEMG) signals is one of the most promising approaches.
However, the analysis of these signals still presents many challenges since similar gestures result in similar muscle contractions.
- Score: 21.486555297061717
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: Human-machine interaction is gaining traction in rehabilitation tasks, such
as controlling prosthetic hands or robotic arms. Gesture recognition exploiting
surface electromyographic (sEMG) signals is one of the most promising
approaches, given that sEMG signal acquisition is non-invasive and is directly
related to muscle contraction. However, the analysis of these signals still
presents many challenges since similar gestures result in similar muscle
contractions. Thus the resulting signal shapes are almost identical, leading to
low classification accuracy. To tackle this challenge, complex neural networks
are employed, which require large memory footprints, consume relatively high
energy and limit the maximum battery life of devices used for classification.
This work addresses this problem with the introduction of the Bioformers. This
new family of ultra-small attention-based architectures approaches
state-of-the-art performance while reducing the number of parameters and
operations by 4.9x. Additionally, by introducing a new inter-subject
pre-training scheme, we improve the accuracy of our best Bioformer by 3.39%, matching
state-of-the-art accuracy without any additional inference cost. Deploying our
best performing Bioformer on a Parallel, Ultra-Low Power (PULP) microcontroller
unit (MCU), the GreenWaves GAP8, we achieve an inference latency and energy of
2.72 ms and 0.14 mJ, respectively, 8.0X lower than the previous
state-of-the-art neural network, while occupying just 94.2 kB of memory.
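To illustrate why an attention-based classifier can fit in such a small parameter and memory budget, here is a minimal single-head self-attention gesture classifier in NumPy. All hyper-parameters below (8 electrode channels, 16-dimensional embedding, 8 gesture classes) are illustrative assumptions for the sketch, not the actual Bioformer configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

class TinyAttentionClassifier:
    """Minimal single-head self-attention classifier (illustrative sketch)."""
    def __init__(self, n_channels=8, d_model=16, n_classes=8):
        s = 1.0 / np.sqrt(d_model)
        self.W_in = rng.normal(0, s, (n_channels, d_model))  # per-step embedding
        self.W_q = rng.normal(0, s, (d_model, d_model))
        self.W_k = rng.normal(0, s, (d_model, d_model))
        self.W_v = rng.normal(0, s, (d_model, d_model))
        self.W_out = rng.normal(0, s, (d_model, n_classes))  # classification head
        self.d_model = d_model

    def n_params(self):
        return sum(w.size for w in (self.W_in, self.W_q, self.W_k, self.W_v, self.W_out))

    def forward(self, x):                     # x: (T, n_channels) sEMG window
        h = x @ self.W_in                     # (T, d_model)
        q, k, v = h @ self.W_q, h @ self.W_k, h @ self.W_v
        att = softmax(q @ k.T / np.sqrt(self.d_model))   # (T, T) attention map
        h = att @ v                           # context-mixed features
        return softmax(h.mean(axis=0) @ self.W_out)      # pooled class probabilities

model = TinyAttentionClassifier()
probs = model.forward(rng.normal(size=(30, 8)))  # 30-sample window, 8 electrodes
print(model.n_params(), probs.shape)             # ~1k weights for this toy config
```

Even this toy configuration stays around a thousand weights, which is why attention models of this kind can fit comfortably in an MCU's on-chip memory.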
Related papers
- Spatial Adaptation Layer: Interpretable Domain Adaptation For Biosignal Sensor Array Applications [0.7499722271664147]
Biosignal acquisition is key for healthcare applications and wearable devices.
Existing solutions often require large and expensive datasets and/or lack robustness and interpretability.
We propose the Spatial Adaptation Layer (SAL), which can be prepended to any biosignal array model.
We also introduce learnable baseline normalization (LBN) to reduce baseline fluctuations.
arXiv Detail & Related papers (2024-09-12T14:06:12Z)
- Hybrid Spiking Neural Networks for Low-Power Intra-Cortical Brain-Machine Interfaces [42.72938925647165]
Intra-cortical brain-machine interfaces (iBMIs) have the potential to dramatically improve the lives of people with paraplegia.
Current iBMIs suffer from scalability and mobility limitations due to bulky hardware and wiring.
We are investigating hybrid spiking neural networks for embedded neural decoding in wireless iBMIs.
arXiv Detail & Related papers (2024-09-06T17:48:44Z) - MS-MANO: Enabling Hand Pose Tracking with Biomechanical Constraints [50.61346764110482]
We integrate a musculoskeletal system with a learnable parametric hand model, MANO, to create MS-MANO.
This model emulates the dynamics of muscles and tendons to drive the skeletal system, imposing physiologically realistic constraints on the resulting torque trajectories.
We also propose a simulation-in-the-loop pose refinement framework, BioPR, that refines the initial estimated pose through a multi-layer perceptron network.
arXiv Detail & Related papers (2024-04-16T02:18:18Z) - Single Neuromorphic Memristor closely Emulates Multiple Synaptic
Mechanisms for Energy Efficient Neural Networks [71.79257685917058]
We demonstrate memristive nano-devices based on SrTiO3 that inherently emulate all these synaptic functions.
These memristors operate in a non-filamentary, low conductance regime, which enables stable and energy efficient operation.
arXiv Detail & Related papers (2024-02-26T15:01:54Z) - A Tiny Transformer for Low-Power Arrhythmia Classification on Microcontrollers [10.203375838335935]
The transformer machine learning model is a promising approach for real-time analysis of the electrocardiographic (ECG) signal and the detection of heart conditions such as arrhythmia.
We present a tiny transformer model for the analysis of the ECG signal, requiring only 6k parameters and reaching 98.97% accuracy in the recognition of the 5 most common arrhythmia classes from the MIT-BIH Arrhythmia database.
arXiv Detail & Related papers (2024-02-16T15:14:16Z) - Online Transformers with Spiking Neurons for Fast Prosthetic Hand
Control [1.6114012813668934]
In this paper, we replace the standard self-attention mechanism with a sliding-window attention mechanism.
We show that this mechanism is more efficient for continuous signals with finite-range dependencies between input and target.
Our results hold great promise for accurate and fast online processing of sEMG signals for smooth prosthetic hand control.
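The sliding-window idea above can be sketched in a few lines: each time step attends only to a fixed number of past samples, so compute and memory stay bounded for streaming signals. The window size and dimensions below are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def sliding_window_attention(q, k, v, window):
    """Causal attention where position t attends only to steps [t-window+1, t]."""
    T, d = q.shape
    scores = q @ k.T / np.sqrt(d)                 # (T, T) raw similarity scores
    idx = np.arange(T)
    # keep only causal positions inside the sliding window
    allowed = (idx[None, :] <= idx[:, None]) & (idx[None, :] > idx[:, None] - window)
    scores = np.where(allowed, scores, -np.inf)   # mask out-of-window scores
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)            # row-wise softmax
    return w @ v, allowed

rng = np.random.default_rng(1)
T, d = 6, 4
q, k, v = (rng.normal(size=(T, d)) for _ in range(3))
out, mask = sliding_window_attention(q, k, v, window=3)
print(out.shape, mask.sum(axis=1))  # each row attends to at most 3 positions
```

Because the per-step cost depends on the window size rather than the full sequence length, this is the property that makes such attention attractive for continuous sEMG streams.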
arXiv Detail & Related papers (2023-03-21T13:59:35Z) - The Lazy Neuron Phenomenon: On Emergence of Activation Sparsity in
Transformers [59.87030906486969]
This paper studies the curious phenomenon for machine learning models with Transformer architectures that their activation maps are sparse.
We show that sparsity is a prevalent phenomenon that occurs for both natural language processing and vision tasks.
We discuss how sparsity immediately implies a way to significantly reduce the FLOP count and improve efficiency for Transformers.
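The FLOP-saving argument can be made concrete with a toy ReLU layer: once an activation is exactly zero, every multiply-add it would feed into the next layer can be skipped without changing the output. The layer sizes here are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(2)
d_in, d_hidden, d_out = 64, 256, 64

x = rng.normal(size=d_in)
W1 = rng.normal(size=(d_in, d_hidden))
W2 = rng.normal(size=(d_hidden, d_out))

h = np.maximum(x @ W1, 0.0)          # ReLU hidden activations
sparsity = float(np.mean(h == 0.0))  # fraction of exactly-zero activations

# Dense second layer costs d_hidden * d_out multiply-adds.
dense_flops = d_hidden * d_out
# Sparsity-aware: only rows of W2 paired with nonzero activations contribute.
sparse_flops = int((h != 0.0).sum()) * d_out

# Skipping zero activations is exact, not approximate:
assert np.allclose(h @ W2, h[h != 0.0] @ W2[h != 0.0])
print(f"sparsity={sparsity:.2f}, flops {dense_flops} -> {sparse_flops}")
```

With roughly half the ReLU outputs at zero (as expected for zero-mean pre-activations), the second matrix product needs only about half its nominal FLOPs, and higher observed sparsity shrinks the cost further.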
arXiv Detail & Related papers (2022-10-12T15:25:19Z) - Multiple Time Series Fusion Based on LSTM An Application to CAP A Phase
Classification Using EEG [56.155331323304]
In this work, deep-learning-based feature-level fusion of electroencephalogram (EEG) channels is carried out.
Channel selection, fusion, and classification procedures were optimized by two optimization algorithms.
arXiv Detail & Related papers (2021-12-18T14:17:49Z) - SOUL: An Energy-Efficient Unsupervised Online Learning Seizure Detection
Classifier [68.8204255655161]
Implantable devices that record neural activity and detect seizures have been adopted to issue warnings or trigger neurostimulation to suppress seizures.
For an implantable seizure detection system, a low power, at-the-edge, online learning algorithm can be employed to dynamically adapt to neural signal drifts.
SOUL was fabricated in TSMC's 28 nm process occupying 0.1 mm2 and achieves 1.5 nJ/classification energy efficiency, which is at least 24x more efficient than state-of-the-art.
arXiv Detail & Related papers (2021-10-01T23:01:20Z) - Video-based Remote Physiological Measurement via Cross-verified Feature
Disentangling [121.50704279659253]
We propose a cross-verified feature disentangling strategy to disentangle the physiological features with non-physiological representations.
We then use the distilled physiological features for robust multi-task physiological measurements.
The disentangled features are finally used for the joint prediction of multiple physiological signals, such as average HR values and rPPG signals.
arXiv Detail & Related papers (2020-07-16T09:39:17Z) - sEMG Gesture Recognition with a Simple Model of Attention [0.0]
We present our research in surface electromyography (sEMG) signal classification.
Our novel attention-based model achieves benchmark leading results on multiple industry-standard datasets.
Our results indicate that sEMG represents a promising avenue for future machine learning research.
arXiv Detail & Related papers (2020-06-05T19:28:16Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.