Text Classification in Memristor-based Spiking Neural Networks
- URL: http://arxiv.org/abs/2207.13729v1
- Date: Wed, 27 Jul 2022 18:08:31 GMT
- Title: Text Classification in Memristor-based Spiking Neural Networks
- Authors: Jinqi Huang, Alex Serb, Spyros Stathopoulos, Themis Prodromakis
- Abstract summary: We develop a simulation framework with a virtual memristor array to demonstrate a sentiment analysis task on the IMDB movie reviews dataset.
We achieve a classification accuracy of 85.88% by converting a pre-trained ANN to a memristor-based SNN and 84.86% by training the memristor-based SNN directly.
We also investigate how global parameters such as spike train length, read noise, and weight-update stop conditions affect the neural networks in both approaches.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Memristors, emerging non-volatile memory devices, have shown promising
potential in neuromorphic hardware designs, especially in spiking neural
network (SNN) hardware implementation. Memristor-based SNNs have been
successfully applied in a wide range of applications, including image
classification and pattern recognition. However, implementing memristor-based
SNNs in text classification is still under exploration. One of the main reasons
is that training memristor-based SNNs for text classification is costly due to
the lack of efficient learning rules and memristor non-idealities. To address
these issues and accelerate the research of exploring memristor-based spiking
neural networks in text classification applications, we develop a simulation
framework with a virtual memristor array using an empirical memristor model. We
use this framework to demonstrate a sentiment analysis task on the IMDB movie
reviews dataset. We take two approaches to obtain trained spiking neural
networks with memristor models: 1) by converting a pre-trained artificial
neural network (ANN) to a memristor-based SNN, or 2) by training a
memristor-based SNN directly. These two approaches can be applied in two
scenarios: offline classification and online training. We achieve a
classification accuracy of 85.88% by converting a pre-trained ANN to a
memristor-based SNN and 84.86% by training the memristor-based SNN directly,
given that the baseline training accuracy of the equivalent ANN is 86.02%. We
conclude that it is possible to achieve similar classification accuracy in
simulation from ANNs to SNNs and from non-memristive synapses to data-driven
memristive synapses. We also investigate how global parameters such as spike
train length, read noise, and weight-update stop conditions affect
the neural networks in both approaches.
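For intuition, the following is a minimal, hypothetical sketch of the two ingredients the abstract combines: mapping pre-trained ANN weights onto differential pairs of memristor conductances, and running rate-coded spiking inference whose output quality depends on spike train length and read noise. The conductance range, multiplicative-Gaussian noise model, Bernoulli rate coding, and soft-reset integrate-and-fire neurons below are illustrative assumptions, not the paper's empirical memristor model or simulation framework.

```python
# Hypothetical sketch: ANN->SNN conversion on a noisy virtual memristor array.
# All device parameters and the noise model are illustrative assumptions.
import numpy as np

G_MIN, G_MAX = 1e-6, 1e-4   # assumed programmable conductance range (siemens)
READ_NOISE_STD = 0.02       # assumed relative read noise per array access

def weights_to_conductances(w):
    """Map signed ANN weights onto a differential pair of conductances."""
    scale = (G_MAX - G_MIN) / np.abs(w).max()
    g_pos = G_MIN + scale * np.clip(w, 0.0, None)   # positive weight part
    g_neg = G_MIN + scale * np.clip(-w, 0.0, None)  # negative weight part
    return g_pos, g_neg, scale

def noisy_read(g):
    """One read of the array with multiplicative Gaussian read noise."""
    return g * (1.0 + READ_NOISE_STD * np.random.randn(*g.shape))

def snn_forward(x, w, T=100):
    """Rate-code inputs x in [0, 1] for T steps through IF neurons."""
    g_pos, g_neg, scale = weights_to_conductances(w)
    v = np.zeros(w.shape[0])       # membrane potentials
    counts = np.zeros(w.shape[0])  # output spike counts
    for _ in range(T):
        spikes = (np.random.rand(*x.shape) < x).astype(float)  # Bernoulli coding
        w_eff = (noisy_read(g_pos) - noisy_read(g_neg)) / scale
        v += w_eff @ spikes
        fired = v >= 1.0
        counts += fired
        v[fired] -= 1.0            # soft reset
    return counts / T              # output firing rates

# Longer spike trains average out read noise at the cost of latency.
w = np.random.randn(2, 8) * 0.5
x = np.random.rand(8)
for T in (10, 100, 1000):
    print(T, snn_forward(x, w, T))
```

Under these assumptions, lengthening the spike train averages out both the Bernoulli coding noise and the per-read conductance noise at the cost of inference latency, mirroring the spike-train-length trade-off the abstract investigates.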
Related papers
- High-performance deep spiking neural networks with 0.3 spikes per neuron [9.01407445068455]
It is harder to train biologically inspired spiking neural networks (SNNs) than artificial neural networks (ANNs).
We show that training deep SNN models achieves the exact same performance as that of ANNs.
Our SNN accomplishes high-performance classification with less than 0.3 spikes per neuron, lending itself to an energy-efficient implementation.
arXiv Detail & Related papers (2023-06-14T21:01:35Z)
- SPIDE: A Purely Spike-based Method for Training Feedback Spiking Neural Networks [56.35403810762512]
Spiking neural networks (SNNs) with event-based computation are promising brain-inspired models for energy-efficient applications on neuromorphic hardware.
We study spike-based implicit differentiation on the equilibrium state (SPIDE), which extends a recently proposed training method.
arXiv Detail & Related papers (2023-02-01T04:22:59Z)
- PC-SNN: Supervised Learning with Local Hebbian Synaptic Plasticity based on Predictive Coding in Spiking Neural Networks [1.6172800007896282]
We propose a novel learning algorithm inspired by predictive coding theory.
We show that it can perform supervised learning fully autonomously and as successfully as backpropagation.
This method achieves favorable performance compared to state-of-the-art multi-layer SNNs.
arXiv Detail & Related papers (2022-11-24T09:56:02Z)
- Intelligence Processing Units Accelerate Neuromorphic Learning [52.952192990802345]
Spiking neural networks (SNNs) have achieved orders-of-magnitude improvements in energy consumption and latency.
We present an IPU-optimized release of our custom SNN Python package, snnTorch.
arXiv Detail & Related papers (2022-11-19T15:44:08Z)
- Training High-Performance Low-Latency Spiking Neural Networks by Differentiation on Spike Representation [70.75043144299168]
Spiking Neural Network (SNN) is a promising energy-efficient AI model when implemented on neuromorphic hardware.
It is a challenge to efficiently train SNNs due to their non-differentiability.
We propose the Differentiation on Spike Representation (DSR) method, which could achieve high performance.
arXiv Detail & Related papers (2022-05-01T12:44:49Z)
- Rethinking Nearest Neighbors for Visual Classification [56.00783095670361]
k-NN is a lazy learning method that aggregates the distance between the test image and top-k neighbors in a training set.
We adopt k-NN with pre-trained visual representations produced by either supervised or self-supervised methods in two steps.
Via extensive experiments on a wide range of classification tasks, our study reveals the generality and flexibility of k-NN integration.
arXiv Detail & Related papers (2021-12-15T20:15:01Z)
- SAR Image Classification Based on Spiking Neural Network through Spike-Time Dependent Plasticity and Gradient Descent [7.106664778883502]
Spiking neural network (SNN) is one of the core components of brain-like intelligence.
This article constructs a complete SAR image classification system based on unsupervised and supervised SNN learning.
arXiv Detail & Related papers (2021-06-15T09:36:04Z)
- A Temporal Neural Network Architecture for Online Learning [0.6091702876917281]
Temporal neural networks (TNNs) communicate and process information encoded as relative spike times.
A TNN architecture is proposed and, as a proof-of-concept, TNN operation is demonstrated within the larger context of online supervised classification.
arXiv Detail & Related papers (2020-11-27T17:15:29Z)
- Skip-Connected Self-Recurrent Spiking Neural Networks with Joint Intrinsic Parameter and Synaptic Weight Training [14.992756670960008]
We propose a new type of recurrent SNN (RSNN) called Skip-Connected Self-Recurrent SNNs (ScSr-SNNs).
ScSr-SNNs can boost performance by up to 2.55% compared with other types of RSNNs trained by state-of-the-art BP methods.
arXiv Detail & Related papers (2020-10-23T22:27:13Z)
- Modeling from Features: a Mean-field Framework for Over-parameterized Deep Neural Networks [54.27962244835622]
This paper proposes a new mean-field framework for over-parameterized deep neural networks (DNNs).
In this framework, a DNN is represented by probability measures and functions over its features in the continuous limit.
We illustrate the framework via the standard DNN and the Residual Network (Res-Net) architectures.
arXiv Detail & Related papers (2020-07-03T01:37:16Z)
- Progressive Tandem Learning for Pattern Recognition with Deep Spiking Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z)