Effects of VLSI Circuit Constraints on Temporal-Coding Multilayer
Spiking Neural Networks
- URL: http://arxiv.org/abs/2106.10382v2
- Date: Fri, 25 Jun 2021 01:27:25 GMT
- Title: Effects of VLSI Circuit Constraints on Temporal-Coding Multilayer
Spiking Neural Networks
- Authors: Yusuke Sakemi, Takashi Morie, Takeo Hosomi, Kazuyuki Aihara
- Abstract summary: The spiking neural network (SNN) has been attracting considerable attention not only as a mathematical model for the brain, but also as an energy-efficient information processing model for real-world applications.
In this study, we investigated the effects of time discretization and/or weight quantization on the performance of SNNs.
- Score: 2.5234156040689237
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The spiking neural network (SNN) has been attracting considerable attention
not only as a mathematical model for the brain, but also as an energy-efficient
information processing model for real-world applications. In particular, SNNs
based on temporal coding are expected to be much more efficient than those
based on rate coding, because the former requires substantially fewer spikes to
carry out tasks. As SNNs are continuous-state and continuous-time models, it is
favorable to implement them with analog VLSI circuits. However, the
construction of the entire system with continuous-time analog circuits would be
infeasible when the system size is very large. Therefore, mixed-signal circuits
must be employed, and the time discretization and quantization of the synaptic
weights are necessary. Moreover, the analog VLSI implementation of SNNs
exhibits non-idealities, such as the effects of noise and device mismatches, as
well as other constraints arising from the analog circuit operation. In this
study, we investigated the effects of the time discretization and/or weight
quantization on the performance of SNNs. Furthermore, we elucidated the effects of
the lower bound of the membrane potentials and of the temporal fluctuation of the
firing threshold. Finally, we propose an optimal approach for the mapping of
mathematical SNN models to analog circuits with discretized time.
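To make the circuit constraints discussed in the abstract concrete, the following is a minimal, illustrative sketch (not the authors' circuit model): a time-to-first-spike integrate-and-fire layer simulated on a discrete time grid, with uniformly quantized weights, a lower bound on the membrane potential, and a fluctuating firing threshold. All function names, parameter values, and the neuron dynamics are assumptions chosen for illustration only.

```python
import numpy as np

def quantize(w, n_bits=4, w_max=1.0):
    """Uniformly quantize weights to 2**n_bits - 1 levels in [-w_max, w_max]."""
    levels = 2 ** n_bits - 1
    step = 2 * w_max / levels
    return np.clip(np.round(w / step) * step, -w_max, w_max)

def ttfs_layer(in_spike_times, weights, *, dt=0.1, t_max=10.0,
               v_th=1.0, v_min=-0.5, th_jitter=0.0, rng=None):
    """Return the first-spike time of each output neuron (t_max if it never fires).

    in_spike_times : (n_in,) spike time of each input neuron
    weights        : (n_out, n_in) synaptic weights (pre-quantized if desired)
    dt             : time-discretization step
    v_min          : lower bound of the membrane potential (circuit constraint)
    th_jitter      : std. dev. of the per-step threshold fluctuation
    """
    rng = np.random.default_rng() if rng is None else rng
    n_out = weights.shape[0]
    v = np.zeros(n_out)
    out_times = np.full(n_out, t_max)
    for t in np.arange(0.0, t_max, dt):              # time discretization
        active = (in_spike_times <= t).astype(float)  # inputs that have fired
        v += dt * (weights @ active)                  # integrate constant input currents
        v = np.maximum(v, v_min)                      # clamp: membrane lower bound
        th = v_th + th_jitter * rng.standard_normal(n_out)  # noisy threshold
        newly = (v >= th) & (out_times == t_max)
        out_times[newly] = t                          # record the first spike only
    return out_times

# Example: compare ideal (float) weights vs. quantized, noisy operation.
rng = np.random.default_rng(0)
w = rng.normal(0, 0.3, size=(5, 20))
x = rng.uniform(0, 5, size=20)                        # input spike times
t_ideal = ttfs_layer(x, w, rng=rng)
t_circ = ttfs_layer(x, quantize(w, n_bits=4), th_jitter=0.05, rng=rng)
print("first-spike times (ideal):    ", np.round(t_ideal, 2))
print("first-spike times (quantized):", np.round(t_circ, 2))
```

Comparing the two printed spike-time vectors gives a rough feel for how coarse time steps, low-bit weights, and threshold noise shift the output spike times that carry the information in a temporal-coding SNN.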
Related papers
- How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key for the performance of LKCNN models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z)
- Input-Aware Dynamic Timestep Spiking Neural Networks for Efficient In-Memory Computing [7.738130109655604]
Spiking Neural Networks (SNNs) have attracted widespread research interest because of their capability to process sparse and binary spike information.
We show that the energy cost and latency of SNNs scale linearly with the number of timesteps used on IMC hardware.
We propose input-aware Dynamic Timestep SNN (DT-SNN) to maximize the efficiency of SNNs.
arXiv Detail & Related papers (2023-05-27T03:01:27Z)
- Intelligence Processing Units Accelerate Neuromorphic Learning [52.952192990802345]
Spiking neural networks (SNNs) have achieved orders of magnitude improvement in terms of energy consumption and latency.
We present an IPU-optimized release of our custom SNN Python package, snnTorch.
arXiv Detail & Related papers (2022-11-19T15:44:08Z)
- Adaptive-SpikeNet: Event-based Optical Flow Estimation using Spiking Neural Networks with Learnable Neuronal Dynamics [6.309365332210523]
Spiking Neural Networks (SNNs) with their neuro-inspired event-driven processing can efficiently handle asynchronous data.
We propose an adaptive fully-spiking framework with learnable neuronal dynamics to alleviate the spike vanishing problem.
Our experiments on event-based optical flow datasets show an average reduction of 13% in average endpoint error (AEE) compared to state-of-the-art ANNs.
arXiv Detail & Related papers (2022-09-21T21:17:56Z)
- Space-Time Graph Neural Networks [104.55175325870195]
We introduce space-time graph neural network (ST-GNN) to jointly process the underlying space-time topology of time-varying network data.
Our analysis shows that small variations in the network topology and time evolution of a system do not significantly affect the performance of ST-GNNs.
arXiv Detail & Related papers (2021-10-06T16:08:44Z)
- Coupled Oscillatory Recurrent Neural Network (coRNN): An accurate and (gradient) stable architecture for learning long time dependencies [15.2292571922932]
We propose a novel architecture for recurrent neural networks.
Our proposed RNN is based on a time-discretization of a system of second-order ordinary differential equations.
Experiments show that the proposed RNN is comparable in performance to the state of the art on a variety of benchmarks.
arXiv Detail & Related papers (2020-10-02T12:35:04Z)
- Multi-Tones' Phase Coding (MTPC) of Interaural Time Difference by Spiking Neural Network [68.43026108936029]
We propose a pure spiking neural network (SNN) based computational model for precise sound localization in the noisy real-world environment.
We implement this algorithm in a real-time robotic system with a microphone array.
The experimental results show a mean azimuth error of 13 degrees, which surpasses the accuracy of other biologically plausible neuromorphic approaches to sound source localization.
arXiv Detail & Related papers (2020-07-07T08:22:56Z)
- Progressive Tandem Learning for Pattern Recognition with Deep Spiking Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z)
- Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
arXiv Detail & Related papers (2020-06-08T09:53:35Z)
- Exploiting Neuron and Synapse Filter Dynamics in Spatial Temporal Learning of Deep Spiking Neural Network [7.503685643036081]
A bio-plausible SNN model with spatial-temporal property is a complex dynamic system.
We formulate SNN as a network of infinite impulse response (IIR) filters with neuron nonlinearity.
We propose a training algorithm that is capable of learning spatial-temporal patterns by searching for the optimal synapse filter kernels and weights.
arXiv Detail & Related papers (2020-02-19T01:27:39Z)
- A Supervised Learning Algorithm for Multilayer Spiking Neural Networks Based on Temporal Coding Toward Energy-Efficient VLSI Processor Design [2.6872737601772956]
Spiking neural networks (SNNs) are brain-inspired mathematical models with the ability to process information in the form of spikes.
We propose a novel supervised learning algorithm for SNNs based on temporal coding.
arXiv Detail & Related papers (2020-01-08T03:37:08Z)