Strategy and Benchmark for Converting Deep Q-Networks to Event-Driven
Spiking Neural Networks
- URL: http://arxiv.org/abs/2009.14456v2
- Date: Wed, 23 Dec 2020 01:21:48 GMT
- Title: Strategy and Benchmark for Converting Deep Q-Networks to Event-Driven
Spiking Neural Networks
- Authors: Weihao Tan, Devdhar Patel, Robert Kozma
- Abstract summary: Spiking neural networks (SNNs) have great potential for energy-efficient implementation of Deep Neural Networks (DNNs) on dedicated neuromorphic hardware.
Recent studies demonstrated competitive performance of SNNs compared with DNNs on image classification tasks.
- Score: 5.8010446129208155
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Spiking neural networks (SNNs) have great potential for energy-efficient
implementation of Deep Neural Networks (DNNs) on dedicated neuromorphic
hardware. Recent studies demonstrated competitive performance of SNNs compared
with DNNs on image classification tasks, including CIFAR-10 and ImageNet data.
The present work focuses on using SNNs in combination with deep reinforcement
learning in Atari games, which involves additional complexity compared to
image classification. We review the theory of converting DNNs to SNNs and
extend the conversion to Deep Q-Networks (DQNs). We propose a robust
representation of the firing rate to reduce the error during the conversion
process. In addition, we introduce a new metric to evaluate the conversion
process by comparing the decisions made by the DQN and SNN, respectively. We
also analyze how the simulation time and parameter normalization influence the
performance of converted SNNs. We achieve competitive scores on 17
top-performing Atari games. To the best of our knowledge, our work is the first
to achieve state-of-the-art performance on multiple Atari games with SNNs. Our
work serves as a benchmark for the conversion of DQNs to SNNs and paves the way
for further research on solving reinforcement learning tasks with SNNs.
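The pipeline sketched in the abstract (rate-coded conversion, parameter normalization, and a decision-comparison metric) can be illustrated compactly. Below is a minimal sketch, not the authors' implementation: the soft-reset integrate-and-fire dynamics, the data-based max-activation normalization, and the action-agreement check are all illustrative assumptions, as are the function names.

```python
import numpy as np

def normalize_weights(weights, layer_max_acts):
    """Illustrative data-based parameter normalization: rescale each layer by
    the maximum activation observed on sample inputs, so ANN activations map
    into the SNN's bounded firing-rate range."""
    normed, prev = [], 1.0
    for W, m in zip(weights, layer_max_acts):
        m = max(m, 1e-9)
        normed.append(W * prev / m)
        prev = m
    return normed

def snn_forward(weights, x, T=500, v_thresh=1.0):
    """Rate-coded forward pass: each layer is a population of integrate-and-fire
    neurons with soft reset; output spike counts over T steps stand in for the
    ANN's Q-values."""
    V = [np.zeros(W.shape[0]) for W in weights]   # membrane potentials
    counts = np.zeros(weights[-1].shape[0])       # output spike counts
    for _ in range(T):
        s = x                                     # inject analog input as current
        for l, W in enumerate(weights):
            V[l] += W @ s                         # integrate synaptic input
            s = (V[l] >= v_thresh).astype(float)  # spike where threshold crossed
            V[l] -= s * v_thresh                  # soft reset: subtract threshold
        counts += s
    return counts                                 # firing rate ~= counts / T

def decision_match(q_values, spike_counts):
    """Metric in the spirit of the paper: 1 if DQN and SNN select the same
    action, else 0 (average over many states for a match rate)."""
    return int(np.argmax(q_values) == np.argmax(spike_counts))

# Toy usage: a two-layer "DQN" with ReLU hidden units vs. its converted SNN.
rng = np.random.default_rng(0)
weights = [rng.normal(0, 0.5, (8, 4)), rng.normal(0, 0.5, (3, 8))]
states = rng.random((100, 4))                     # sample states for normalization
h = np.maximum(states @ weights[0].T, 0)          # hidden activations on samples
q_all = h @ weights[1].T                          # Q-values on the samples
weights_n = normalize_weights(weights, [h.max(), np.maximum(q_all, 0).max()])
x = states[0]
q = weights[1] @ np.maximum(weights[0] @ x, 0)    # original DQN forward pass
print(decision_match(q, snn_forward(weights_n, x)))
```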
Related papers
- NAS-BNN: Neural Architecture Search for Binary Neural Networks [55.058512316210056]
We propose a novel neural architecture search scheme for binary neural networks, named NAS-BNN.
Our discovered binary model family outperforms previous BNNs for a wide range of operations (OPs) from 20M to 200M.
In addition, we validate the transferability of these searched BNNs on the object detection task, and our binary detectors with the searched BNNs achieve a new state-of-the-art result, e.g., 31.6% mAP with 370M OPs, on the MS COCO dataset.
arXiv Detail & Related papers (2024-08-28T02:17:58Z)
- LaSNN: Layer-wise ANN-to-SNN Distillation for Effective and Efficient
Training in Deep Spiking Neural Networks [7.0691139514420005]
Spiking Neural Networks (SNNs) are biologically realistic and, thanks to their event-driven mechanism, practically promising for low-power applications.
A conversion scheme is proposed to obtain competitive accuracy by mapping trained ANN parameters to SNNs with the same structure.
A novel SNN training framework is proposed, namely layer-wise ANN-to-SNN knowledge distillation (LaSNN).
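A minimal sketch of what a layer-wise distillation objective could look like, assuming a per-layer mean-squared error between teacher ANN activations and student SNN firing rates (the actual LaSNN losses may differ):

```python
import numpy as np

def layerwise_distillation_loss(ann_acts, snn_rates, alphas=None):
    """Hypothetical layer-wise ANN-to-SNN distillation objective: a sum of
    per-layer MSE terms between teacher (ANN) activations and student (SNN)
    firing rates, optionally weighted per layer."""
    if alphas is None:
        alphas = [1.0] * len(ann_acts)
    return sum(a * np.mean((t - s) ** 2)
               for a, t, s in zip(alphas, ann_acts, snn_rates))

# Toy usage: three layers of teacher activations vs. noisy student rates.
rng = np.random.default_rng(1)
teacher = [rng.random(16), rng.random(8), rng.random(4)]
student = [t + rng.normal(0, 0.05, t.shape) for t in teacher]
print(layerwise_distillation_loss(teacher, student))
```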
arXiv Detail & Related papers (2023-04-17T03:49:35Z)
- Optimal ANN-SNN Conversion for High-accuracy and Ultra-low-latency
Spiking Neural Networks [22.532709609646066]
Spiking Neural Networks (SNNs) have attracted great attention due to their distinctive properties of low power consumption and fast inference on neuromorphic hardware.
As the most effective method for obtaining deep SNNs, ANN-SNN conversion has achieved performance comparable to ANNs on large-scale datasets.
In this paper, we theoretically analyze ANN-SNN conversion error and derive the estimated activation function of SNNs.
We prove that the expected conversion error between SNNs and ANNs is zero, enabling us to achieve high-accuracy and ultra-low-latency SNNs.
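A minimal sketch of such an estimated activation, assuming the common clip-floor-shift form that quantizes a ReLU into the T + 1 output levels a soft-reset IF neuron can emit over T time steps (the threshold lam and the half-step shift are illustrative):

```python
import numpy as np

def snn_estimated_activation(x, T=4, lam=1.0, shift=0.5):
    """Clip-floor-shift style activation: approximately what a soft-reset IF
    neuron outputs (as rate * threshold) over T steps. Training the ANN with
    this in place of ReLU is one way the expected conversion error between
    the ANN and the converted SNN can be driven to zero."""
    return lam / T * np.clip(np.floor(x * T / lam + shift), 0, T)

# The function quantizes a ReLU into a staircase with T + 1 levels on [0, lam]:
xs = np.linspace(-0.5, 1.5, 9)
print(snn_estimated_activation(xs))   # [0. 0. 0. 0.25 0.5 0.75 1. 1. 1.]
```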
arXiv Detail & Related papers (2023-03-08T03:04:53Z)
- Spiking Neural Network Decision Feedback Equalization [70.3497683558609]
We propose an SNN-based equalizer with a feedback structure akin to the decision feedback equalizer (DFE).
We show that our approach clearly outperforms conventional linear equalizers for three different exemplary channels.
The proposed SNN with a decision feedback structure opens a path toward competitive, energy-efficient transceivers.
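For readers unfamiliar with the referenced feedback structure, a minimal non-spiking sketch of a classic decision feedback equalizer (the cited work replaces the filters with an SNN; taps and channel here are illustrative):

```python
import numpy as np

def dfe_equalize(rx, ff, fb):
    """Classic decision feedback equalizer (non-spiking reference): a
    feed-forward filter over received samples minus a feedback filter over
    past hard decisions, which cancels post-cursor intersymbol interference."""
    decisions = []
    past = np.zeros(len(fb))                 # previously decided symbols
    padded = np.concatenate([np.zeros(len(ff) - 1), rx])
    for n in range(len(rx)):
        window = padded[n:n + len(ff)][::-1] # most recent sample first
        z = ff @ window - fb @ past          # filter minus decision feedback
        d = 1.0 if z >= 0 else -1.0          # hard BPSK decision
        decisions.append(d)
        past = np.roll(past, 1)              # shift decision into feedback line
        past[0] = d
    return np.array(decisions)

# Toy usage: BPSK symbols through a two-tap channel with post-cursor ISI.
rng = np.random.default_rng(2)
sym = rng.choice([-1.0, 1.0], size=20)
rx = sym + 0.4 * np.concatenate([[0.0], sym[:-1]])  # y[n] = x[n] + 0.4 x[n-1]
out = dfe_equalize(rx, ff=np.array([1.0]), fb=np.array([0.4]))
print(np.mean(out == sym))                           # 1.0 in this noiseless toy
```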
arXiv Detail & Related papers (2022-11-09T09:19:15Z)
- Training High-Performance Low-Latency Spiking Neural Networks by
Differentiation on Spike Representation [70.75043144299168]
Spiking Neural Network (SNN) is a promising energy-efficient AI model when implemented on neuromorphic hardware.
It is a challenge to train SNNs efficiently due to their non-differentiability.
We propose the Differentiation on Spike Representation (DSR) method, which achieves high performance at low latency.
arXiv Detail & Related papers (2022-05-01T12:44:49Z)
- Optimal Conversion of Conventional Artificial Neural Networks to Spiking
Neural Networks [0.0]
Spiking neural networks (SNNs) are biologically inspired variants of artificial neural networks (ANNs).
We propose a novel strategic pipeline that transfers the weights to the target SNN by combining threshold balance and soft-reset mechanisms.
Our method is promising for deployment on embedded platforms that support SNNs but offer only limited energy and memory.
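A minimal sketch of the two named mechanisms in isolation, assuming a percentile-based threshold balance and a single integrate-and-fire neuron (both illustrative, not the paper's exact pipeline):

```python
import numpy as np

def threshold_balance(W, sample_inputs, percentile=99.9):
    """Threshold balancing (illustrative): set the layer's firing threshold to
    a high percentile of its pre-activations so neurons rarely saturate."""
    pre = sample_inputs @ W.T
    return np.percentile(np.maximum(pre, 0), percentile)

def if_neuron(currents, v_thresh, soft_reset=True):
    """Integrate-and-fire over time. A soft reset subtracts the threshold and
    preserves the residual charge that a hard reset (V -> 0) discards; that
    discarded charge is a major source of conversion error."""
    v, spikes = 0.0, 0
    for I in currents:
        v += I
        if v >= v_thresh:
            spikes += 1
            v = v - v_thresh if soft_reset else 0.0
    return spikes

# Constant input current 0.3 against threshold 1.0 over 10 steps:
I = [0.3] * 10
print(if_neuron(I, 1.0, soft_reset=True))    # 3 spikes: rate tracks 0.3
print(if_neuron(I, 1.0, soft_reset=False))   # 2 spikes: residual charge lost

rng = np.random.default_rng(3)
W = rng.normal(0, 1.0, (8, 4))
print(threshold_balance(W, rng.random((1000, 4))))  # balanced layer threshold
```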
arXiv Detail & Related papers (2021-02-28T12:04:22Z)
- Progressive Tandem Learning for Pattern Recognition with Deep Spiking
Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) in terms of low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z)
- Boosting Deep Neural Networks with Geometrical Prior Knowledge: A Survey [77.99182201815763]
Deep Neural Networks (DNNs) achieve state-of-the-art results in many different problem settings.
DNNs are often treated as black box systems, which complicates their evaluation and validation.
One promising field, inspired by the success of convolutional neural networks (CNNs) in computer vision tasks, is to incorporate knowledge about symmetric geometrical transformations.
arXiv Detail & Related papers (2020-06-30T14:56:05Z)
- Training Deep Spiking Neural Networks [0.0]
Brain-inspired spiking neural networks (SNNs) with neuromorphic hardware may offer orders of magnitude higher energy efficiency.
We show that it is possible to train an SNN with a ResNet50 architecture on the CIFAR100 and Imagenette object recognition datasets.
The trained SNN falls behind the analogous ANN in accuracy but requires several orders of magnitude fewer inference time steps.
arXiv Detail & Related papers (2020-06-08T09:47:05Z)
- You Only Spike Once: Improving Energy-Efficient Neuromorphic Inference
to ANN-Level Accuracy [51.861168222799186]
Spiking Neural Networks (SNNs) are a type of neuromorphic, or brain-inspired, network.
SNNs are sparse, accessing very few weights, and typically only use addition operations instead of the more power-intensive multiply-and-accumulate operations.
In this work, we aim to overcome the limitations of TTFS-encoded neuromorphic systems.
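TTFS (time-to-first-spike) coding is only named here; a minimal sketch under a common linear-latency assumption, where larger inputs spike earlier and each neuron spikes at most once:

```python
import numpy as np

def ttfs_encode(x, T=100):
    """Time-to-first-spike encoding: map each normalized input in [0, 1] to a
    single spike time in [0, T); stronger inputs fire earlier. Zero inputs
    never spike (time T serves as 'no spike')."""
    x = np.clip(x, 0.0, 1.0)
    return np.where(x > 0, np.round((1.0 - x) * (T - 1)).astype(int), T)

def ttfs_decode(t, T=100):
    """Inverse map for inspection: earlier spike -> larger value."""
    return np.where(t < T, 1.0 - t / (T - 1), 0.0)

x = np.array([0.0, 0.25, 0.5, 1.0])
t = ttfs_encode(x)
print(t)               # [100  74  50   0]; time T means 'no spike'
print(ttfs_decode(t))  # recovers ~x from the spike times
```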
arXiv Detail & Related papers (2020-06-03T15:55:53Z)
- An Efficient Spiking Neural Network for Recognizing Gestures with a DVS
Camera on the Loihi Neuromorphic Processor [12.118084418840152]
Spiking Neural Networks (SNNs) have come under the spotlight for machine learning-based applications.
We present our methodology for designing an SNN that achieves nearly the same accuracy as its corresponding Deep Neural Network (DNN).
Our SNN achieves 89.64% classification accuracy and occupies only 37 Loihi cores.
arXiv Detail & Related papers (2020-05-16T17:00:10Z)