Towards efficient end-to-end speech recognition with
biologically-inspired neural networks
- URL: http://arxiv.org/abs/2110.02743v1
- Date: Mon, 4 Oct 2021 21:24:10 GMT
- Title: Towards efficient end-to-end speech recognition with
biologically-inspired neural networks
- Authors: Thomas Bohnstingl, Ayush Garg, Stanisław Woźniak, George Saon,
  Evangelos Eleftheriou and Angeliki Pantazi
- Abstract summary: We introduce neural connectivity concepts emulating the axo-somatic and the axo-axonic synapses.
We demonstrate, for the first time, that a biologically realistic implementation of a large-scale ASR model can yield competitive performance levels.
- Score: 10.457580011403289
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Automatic speech recognition (ASR) is a capability that enables a program to
convert human speech into written form. Recent developments in artificial
intelligence (AI) have led to high-accuracy ASR systems based on deep neural
networks, such as the recurrent neural network transducer (RNN-T). However, the
core components and operations of these approaches depart from their powerful
biological counterpart, the human brain. On the other hand, current developments
in biologically-inspired ASR models, based on spiking neural networks (SNNs),
lag behind in terms of accuracy and focus primarily on small-scale applications.
In this work, we revisit the incorporation of biologically-plausible models into
deep learning and substantially enhance their capabilities by taking inspiration
from the diverse neural and synaptic dynamics found in the brain. In particular,
we introduce neural connectivity concepts emulating the axo-somatic and the
axo-axonic synapses. Based on these, we propose novel deep learning units with
enriched neuro-synaptic dynamics and integrate them into the RNN-T architecture.
We demonstrate, for the first time, that a biologically realistic implementation
of a large-scale ASR model can yield competitive performance levels compared to
existing deep learning models. Specifically, we show that such an implementation
offers several advantages, such as reduced computational cost and lower latency,
which are critical for speech recognition applications.
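To make the flavor of such units concrete, below is a minimal, hypothetical PyTorch sketch of a spiking-neural-unit-style recurrent cell with an adaptive firing threshold, standing in for the richer axo-somatic and axo-axonic dynamics the paper describes. It is not the authors' implementation; the class name, constants, and the threshold-adaptation rule are illustrative assumptions, and end-to-end training would additionally require a surrogate gradient for the hard threshold (one such trick is sketched among the related papers below).

```python
import torch
import torch.nn as nn

class AdaptiveSNUCell(nn.Module):
    """Illustrative spiking-style recurrent cell with an adaptive threshold."""
    def __init__(self, input_size: int, hidden_size: int,
                 decay: float = 0.9, beta: float = 1.8, rho: float = 0.98):
        super().__init__()
        self.in_proj = nn.Linear(input_size, hidden_size)
        self.rec_proj = nn.Linear(hidden_size, hidden_size, bias=False)
        self.decay, self.beta, self.rho = decay, beta, rho

    def forward(self, x, state):
        v, b, s = state  # membrane potential, threshold adaptation, previous spikes
        # Leaky integration of feed-forward and recurrent drive; reset where spiked.
        v = self.decay * v * (1.0 - s) + self.in_proj(x) + self.rec_proj(s)
        theta = 1.0 + self.beta * b               # adaptive threshold (axo-somatic-like)
        s = (v > theta).float()                   # emit a spike when v crosses theta
        b = self.rho * b + (1.0 - self.rho) * s   # slow adaptation tracks recent firing
        return s, (v, b, s)

# Usage: unroll over an utterance's feature frames, e.g., inside an encoder layer.
batch, steps, feat, hidden = 8, 100, 80, 256
cell = AdaptiveSNUCell(feat, hidden)
state = tuple(torch.zeros(batch, hidden) for _ in range(3))
frames = torch.randn(batch, steps, feat)
outputs = []
for t in range(steps):
    out, state = cell(frames[:, t], state)
    outputs.append(out)
```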
Related papers
- Contrastive Learning in Memristor-based Neuromorphic Systems [55.11642177631929]
Spiking neural networks have become an important family of neuron-based models that sidestep many of the key limitations facing modern-day backpropagation-trained deep networks.
In this work, we design and investigate a proof-of-concept instantiation of contrastive-signal-dependent plasticity (CSDP), a neuromorphic form of forward-forward-based, backpropagation-free learning.
arXiv Detail & Related papers (2024-09-17T04:48:45Z)
- Research Advances and New Paradigms for Biology-inspired Spiking Neural Networks [8.315801422499861]
Spiking neural networks (SNNs) are gaining popularity in the computational simulation and artificial intelligence fields.
This paper explores the historical development of SNNs and concludes that these two fields are intersecting and merging rapidly.
arXiv Detail & Related papers (2024-08-26T03:37:48Z)
- Enhancing learning in spiking neural networks through neuronal heterogeneity and neuromodulatory signaling [52.06722364186432]
We propose a biologically-informed framework for enhancing artificial neural networks (ANNs).
Our proposed dual-framework approach highlights the potential of spiking neural networks (SNNs) for emulating diverse spiking behaviors.
We outline how the proposed approach integrates brain-inspired compartmental models and task-driven SNNs, balancing bioinspiration and complexity.
arXiv Detail & Related papers (2024-07-05T14:11:28Z)
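As a hedged illustration of the neuronal-heterogeneity idea summarized in the entry above: the sketch below gives each neuron its own learnable membrane decay, one simple way such heterogeneity is often realized. The class name and the sigmoid parameterization are assumptions, not the paper's implementation.

```python
import torch
import torch.nn as nn

class HeterogeneousLeakyLayer(nn.Module):
    """Toy layer where every neuron learns an individual membrane time constant."""
    def __init__(self, input_size: int, hidden_size: int):
        super().__init__()
        self.proj = nn.Linear(input_size, hidden_size)
        # Unconstrained parameter, mapped through a sigmoid to a per-neuron
        # decay in (0, 1), so each unit integrates on its own timescale.
        self.raw_decay = nn.Parameter(torch.randn(hidden_size))

    def forward(self, x, v):
        decay = torch.sigmoid(self.raw_decay)  # heterogeneous, learnable leak
        v = decay * v + self.proj(x)           # leaky integration per neuron
        return torch.relu(v), v                # activation and updated potential
```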
- Exploring neural oscillations during speech perception via surrogate gradient spiking neural networks [59.38765771221084]
We present a physiologically inspired speech recognition architecture that is compatible with, and scalable within, deep learning frameworks.
We show end-to-end gradient descent training leads to the emergence of neural oscillations in the central spiking neural network.
Our findings highlight the crucial inhibitory role of feedback mechanisms, such as spike frequency adaptation and recurrent connections, in regulating and synchronising neural activity to improve recognition performance.
arXiv Detail & Related papers (2024-04-22T09:40:07Z)
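The surrogate-gradient technique named in the entry above is commonly implemented as a custom autograd function: the forward pass emits a hard spike, while the backward pass substitutes a smooth derivative. Below is a minimal sketch of one standard variant (a fast-sigmoid surrogate); the scale constant is an illustrative assumption, not the paper's setting.

```python
import torch

class SurrogateSpike(torch.autograd.Function):
    @staticmethod
    def forward(ctx, v_minus_theta):
        ctx.save_for_backward(v_minus_theta)
        return (v_minus_theta > 0).float()        # hard threshold in the forward pass

    @staticmethod
    def backward(ctx, grad_output):
        (v_minus_theta,) = ctx.saved_tensors
        scale = 10.0                              # sharpness of the surrogate (assumed)
        surrogate = 1.0 / (1.0 + scale * v_minus_theta.abs()) ** 2
        return grad_output * surrogate            # smooth gradient in the backward pass

spike = SurrogateSpike.apply  # drop-in replacement for a hard threshold like the one above
```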
- Astrocyte-Enabled Advancements in Spiking Neural Networks for Large Language Modeling [7.863029550014263]
Astrocyte-Modulated Spiking Neural Network (AstroSNN) exhibits exceptional performance in tasks involving memory retention and natural language generation.
AstroSNN shows low latency, high throughput, and reduced memory usage in practical applications.
arXiv Detail & Related papers (2023-12-12T06:56:31Z)
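Purely as a toy illustration of astrocyte-like modulation (the AstroSNN mechanism summarized above is more elaborate and is not reproduced here): a slow state integrates presynaptic activity and multiplicatively scales synaptic efficacy. Every name and constant below is an assumption.

```python
import torch
import torch.nn as nn

class AstrocyteGatedLinear(nn.Module):
    """Toy synapse layer gated by a slow, astrocyte-like activity trace."""
    def __init__(self, input_size: int, output_size: int, tau: float = 0.99):
        super().__init__()
        self.linear = nn.Linear(input_size, output_size)
        self.tau = tau
        self.register_buffer("astro", torch.zeros(input_size))  # slow astrocyte state

    def forward(self, x):
        # The astrocyte state integrates presynaptic activity on a slow timescale...
        self.astro = self.tau * self.astro + (1 - self.tau) * x.detach().mean(dim=0)
        # ...and multiplicatively scales the effective input (gliotransmission-like).
        return self.linear(x * (1.0 + self.astro))
```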
- A Hybrid Neural Coding Approach for Pattern Recognition with Spiking Neural Networks [53.31941519245432]
Brain-inspired spiking neural networks (SNNs) have demonstrated promising capabilities in solving pattern recognition tasks.
These SNNs are built on homogeneous neurons that use a uniform neural coding for information representation.
In this study, we argue that SNN architectures should be holistically designed to incorporate heterogeneous coding schemes.
arXiv Detail & Related papers (2023-05-26T02:52:12Z)
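To illustrate what heterogeneous coding schemes can look like in practice, here are two standard input encoders that a hybrid design might mix across layers or populations: rate coding (spike probability tracks intensity) and latency coding (stronger inputs spike earlier). These are textbook encoders, not the paper's exact formulation.

```python
import torch

def rate_encode(x: torch.Tensor, steps: int) -> torch.Tensor:
    # x in [0, 1]; each time step spikes with probability x (Bernoulli rate code).
    return torch.bernoulli(x.clamp(0, 1).expand(steps, *x.shape))

def latency_encode(x: torch.Tensor, steps: int) -> torch.Tensor:
    # x in [0, 1]; one spike per input, at a time inversely related to intensity.
    t_fire = ((1.0 - x.clamp(0, 1)) * (steps - 1)).round().long()  # strong -> early
    out = torch.zeros(steps, *x.shape)
    out.scatter_(0, t_fire.unsqueeze(0), 1.0)
    return out

x = torch.rand(16)                            # normalized feature intensities
spikes_rate = rate_encode(x, steps=50)        # (50, 16) stochastic spike trains
spikes_latency = latency_encode(x, steps=50)  # (50, 16), one spike per feature
```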
- Contrastive-Signal-Dependent Plasticity: Self-Supervised Learning in Spiking Neural Circuits [61.94533459151743]
This work addresses the challenge of designing neurobiologically-motivated schemes for adjusting the synapses of spiking networks.
Our experimental simulations demonstrate a consistent advantage over other biologically-plausible approaches when training recurrent spiking networks.
arXiv Detail & Related papers (2023-03-30T02:40:28Z)
- A brain basis of dynamical intelligence for AI and computational neuroscience [0.0]
More brain-like capacities may demand new theories, models, and methods for designing artificial learning systems.
This article was inspired by our symposium on dynamical neuroscience and machine learning at the 6th Annual US/NIH BRAIN Initiative Investigators Meeting.
arXiv Detail & Related papers (2021-05-15T19:49:32Z)
- Recurrent Neural Network Learning of Performance and Intrinsic Population Dynamics from Sparse Neural Data [77.92736596690297]
We introduce a novel training strategy that allows learning not only the input-output behavior of an RNN but also its internal network dynamics.
We test the proposed method by training an RNN to simultaneously reproduce internal dynamics and output signals of a physiologically-inspired neural model.
Remarkably, we show that the reproduction of the internal dynamics is successful even when the training algorithm relies on the activities of a small subset of neurons.
arXiv Detail & Related papers (2020-05-05T14:16:54Z)
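A hedged sketch of the training idea summarized in the entry above: fit an RNN to target outputs while also penalizing mismatch with recorded activities of a small subset of its units, so that internal dynamics are shaped by sparse neural data. Shapes, names, and the loss weighting below are illustrative assumptions, not the paper's exact recipe.

```python
import torch
import torch.nn as nn

rnn = nn.RNN(input_size=10, hidden_size=128, batch_first=True)
readout = nn.Linear(128, 2)
observed = torch.randperm(128)[:12]           # indices of the "recorded" units

def joint_loss(inputs, target_out, target_act, lam: float = 0.5):
    hidden, _ = rnn(inputs)                   # (batch, time, hidden) unit activities
    out_loss = nn.functional.mse_loss(readout(hidden), target_out)
    act_loss = nn.functional.mse_loss(hidden[..., observed], target_act)
    return out_loss + lam * act_loss          # output fit + internal-dynamics fit

opt = torch.optim.Adam([*rnn.parameters(), *readout.parameters()], lr=1e-3)
x = torch.randn(4, 50, 10)                    # dummy inputs: batch of 4, 50 steps
loss = joint_loss(x, torch.randn(4, 50, 2), torch.randn(4, 50, 12))
loss.backward()
opt.step()
```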