Closed-Loop Neural Interfaces with Embedded Machine Learning
- URL: http://arxiv.org/abs/2010.09457v2
- Date: Wed, 21 Oct 2020 11:56:43 GMT
- Title: Closed-Loop Neural Interfaces with Embedded Machine Learning
- Authors: Bingzhao Zhu, Uisub Shin, Mahsa Shoaran
- Abstract summary: We review the recent developments in embedding machine learning in neural interfaces.
We present our optimized tree-based model for low-power and memory-efficient classification of neural signals in brain implants.
Using energy-aware learning and model compression, we show that the proposed oblique trees can outperform conventional machine learning models in applications such as seizure or tremor detection and motor decoding.
- Score: 12.977151652608047
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neural interfaces capable of multi-site electrical recording, on-site signal
classification, and closed-loop therapy are critical for the diagnosis and
treatment of neurological disorders. However, deploying machine learning
algorithms on low-power neural devices is challenging, given the tight
constraints on computational and memory resources for such devices. In this
paper, we review the recent developments in embedding machine learning in
neural interfaces, with a focus on design trade-offs and hardware efficiency.
We also present our optimized tree-based model for low-power and
memory-efficient classification of neural signals in brain implants. Using
energy-aware learning and model compression, we show that the proposed oblique
trees can outperform conventional machine learning models in applications such
as seizure or tremor detection and motor decoding.
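The oblique-tree approach can be illustrated with a minimal sketch (not the authors' implementation; the node layout, sparse weights, and per-feature extraction costs below are assumed for illustration). In an oblique tree, each internal node thresholds a weighted combination of features rather than a single feature, and sparsifying those weights limits both model memory and the set of features that must be extracted on the implant:

```python
import numpy as np

# Minimal sketch of oblique-tree inference with a feature-energy proxy
# (illustrative only; not the implementation from the paper).

class ObliqueNode:
    """Internal node: splits on a sparse linear combination of features.
    Leaf node: carries a class label and no weights."""
    def __init__(self, weights=None, bias=0.0, left=None, right=None, label=None):
        self.weights = weights   # sparse weight vector over features
        self.bias = bias
        self.left = left         # child taken when w.x + b <= 0
        self.right = right       # child taken when w.x + b > 0
        self.label = label       # class label if this is a leaf

def predict(node, x):
    """Route a feature vector down the tree to a leaf label."""
    while node.label is None:
        score = float(np.dot(node.weights, x)) + node.bias
        node = node.right if score > 0 else node.left
    return node.label

def used_features(node):
    """Indices of features with a nonzero weight anywhere in the tree.
    Only these features need to be extracted on the implant."""
    if node is None or node.label is not None:
        return set()
    used = set(np.flatnonzero(node.weights))
    return used | used_features(node.left) | used_features(node.right)

# Toy example: 4 features (e.g., band powers) with assumed extraction costs.
feature_costs = np.array([1.0, 1.0, 4.0, 8.0])   # arbitrary energy units
leaf0 = ObliqueNode(label=0)                      # e.g., "no seizure"
leaf1 = ObliqueNode(label=1)                      # e.g., "seizure"
root = ObliqueNode(weights=np.array([0.7, 0.0, 0.3, 0.0]), bias=-0.5,
                   left=leaf0, right=leaf1)

x = np.array([0.9, 0.1, 0.4, 0.2])                # one feature window
print(predict(root, x))                           # -> 1
energy = sum(feature_costs[i] for i in used_features(root))
print(energy)                                     # -> 5.0 (features 0 and 2 only)
```

In this spirit, energy-aware learning would penalize candidate splits whose features are expensive to compute, and model compression would prune or quantize the node weights; the cost values and node structure here are purely hypothetical.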
Related papers
- Contrastive Learning in Memristor-based Neuromorphic Systems [55.11642177631929]
Spiking neural networks have become an important family of neuron-based models that sidestep many of the key limitations facing modern-day backpropagation-trained deep networks.
In this work, we design and investigate a proof-of-concept instantiation of contrastive-signal-dependent plasticity (CSDP), a neuromorphic form of forward-forward-based, backpropagation-free learning.
arXiv Detail & Related papers (2024-09-17T04:48:45Z) - Brain-Inspired Machine Intelligence: A Survey of
Neurobiologically-Plausible Credit Assignment [65.268245109828]
We examine algorithms for conducting credit assignment in artificial neural networks that are inspired or motivated by neurobiology.
We organize the ever-growing set of brain-inspired learning schemes into six general families and consider these in the context of backpropagation of errors.
The results of this review are meant to encourage future developments in neuro-mimetic systems and their constituent learning processes.
arXiv Detail & Related papers (2023-12-01T05:20:57Z) - Neuromorphic Auditory Perception by Neural Spiketrum [27.871072042280712]
We introduce a neural spike coding model, termed spiketrum, to transform time-varying analog signals into efficient spike patterns.
The model provides a sparse and efficient coding scheme with precisely controllable spike rate that facilitates training of spiking neural networks in various auditory perception tasks.
arXiv Detail & Related papers (2023-09-11T13:06:19Z) - A Convolutional Spiking Network for Gesture Recognition in
Brain-Computer Interfaces [0.8122270502556371]
We propose a simple yet efficient machine learning-based approach for the exemplary problem of hand gesture classification based on brain signals.
We demonstrate that this approach generalizes to different subjects with both EEG and ECoG data and achieves superior accuracy in the range of 92.74-97.07%.
arXiv Detail & Related papers (2023-04-21T16:23:40Z) - Contrastive-Signal-Dependent Plasticity: Self-Supervised Learning in Spiking Neural Circuits [61.94533459151743]
This work addresses the challenge of designing neurobiologically-motivated schemes for adjusting the synapses of spiking networks.
Our experimental simulations demonstrate a consistent advantage over other biologically-plausible approaches when training recurrent spiking networks.
arXiv Detail & Related papers (2023-03-30T02:40:28Z) - An embedding for EEG signals learned using a triplet loss [0.0]
In a brain-computer interface (BCI), decoded brain state information can be used with minimal time delay.
A challenge in such decoding tasks is posed by the small dataset sizes.
We propose novel domain-specific embeddings for neurophysiological data.
arXiv Detail & Related papers (2023-03-23T09:05:20Z) - Neuromorphic Artificial Intelligence Systems [58.1806704582023]
Modern AI systems, based on von Neumann architecture and classical neural networks, have a number of fundamental limitations in comparison with the brain.
This article discusses such limitations and the ways they can be mitigated.
It presents an overview of currently available neuromorphic AI projects in which these limitations are overcome.
arXiv Detail & Related papers (2022-05-25T20:16:05Z) - Neuro-BERT: Rethinking Masked Autoencoding for Self-supervised Neurological Pretraining [24.641328814546842]
We present Neuro-BERT, a self-supervised pre-training framework of neurological signals based on masked autoencoding in the Fourier domain.
We propose a novel pre-training task dubbed Fourier Inversion Prediction (FIP), which randomly masks out a portion of the input signal and then predicts the missing information.
By evaluating our method on several benchmark datasets, we show that Neuro-BERT improves downstream neurological-related tasks by a large margin.
arXiv Detail & Related papers (2022-04-20T16:48:18Z) - Mapping and Validating a Point Neuron Model on Intel's Neuromorphic
Hardware Loihi [77.34726150561087]
We investigate the potential of Intel's fifth-generation neuromorphic chip, Loihi.
Loihi is based on the novel idea of Spiking Neural Networks (SNNs) emulating the neurons in the brain.
We find that Loihi replicates classical simulations very efficiently and scales notably well in terms of both time and energy performance as the networks get larger.
arXiv Detail & Related papers (2021-09-22T16:52:51Z) - Spiking Neural Networks Hardware Implementations and Challenges: a
Survey [53.429871539789445]
Spiking Neural Networks are cognitive algorithms mimicking neuron and synapse operational principles.
We present the state of the art of hardware implementations of spiking neural networks.
We discuss the strategies employed to leverage the characteristics of these event-driven algorithms at the hardware level.
arXiv Detail & Related papers (2020-05-04T13:24:00Z) - Structural plasticity on an accelerated analog neuromorphic hardware
system [0.46180371154032884]
We present a strategy to achieve structural plasticity by constantly rewiring the pre- and postsynaptic partners.
We implemented this algorithm on the analog neuromorphic system BrainScaleS-2.
We evaluated our implementation in a simple supervised learning scenario, showing its ability to optimize the network topology.
arXiv Detail & Related papers (2019-12-27T10:15:58Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.