LIAF-Net: Leaky Integrate and Analog Fire Network for Lightweight and
Efficient Spatiotemporal Information Processing
- URL: http://arxiv.org/abs/2011.06176v1
- Date: Thu, 12 Nov 2020 03:04:21 GMT
- Title: LIAF-Net: Leaky Integrate and Analog Fire Network for Lightweight and
Efficient Spatiotemporal Information Processing
- Authors: Zhenzhi Wu, Hehui Zhang, Yihan Lin, Guoqi Li, Meng Wang, Ye Tang
- Abstract summary: The deep network LIAF-Net is built on the LIAF neuron model for efficient spatiotemporal processing.
As a spatiotemporal layer, LIAF can also be used jointly with traditional artificial neural network (ANN) layers.
- Score: 16.446511505488633
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Spiking neural networks (SNNs) based on Leaky Integrate and Fire (LIF) model
have been applied to energy-efficient temporal and spatiotemporal processing
tasks. Thanks to its bio-plausible neuronal dynamics and simplicity, a LIF-SNN
benefits from event-driven processing; however, it usually suffers from reduced
performance, likely because its neurons transmit information only via binary
spikes. To address this issue, in this work we propose a Leaky Integrate and
Analog Fire (LIAF) neuron model, so that analog values can be transmitted among
neurons, and a deep network termed LIAF-Net is built on it for efficient
spatiotemporal processing. In the temporal domain, LIAF follows
the traditional LIF dynamics to maintain its temporal processing capability. In
the spatial domain, LIAF is able to integrate spatial information through
convolutional integration or fully-connected integration. As a spatiotemporal
layer, LIAF can also be used with traditional artificial neural network (ANN)
layers jointly. Experimental results indicate that LIAF-Net achieves comparable
performance to Gated Recurrent Unit (GRU) and Long short-term memory (LSTM) on
bAbI Question Answering (QA) tasks, and achieves state-of-the-art performance
on spatiotemporal Dynamic Vision Sensor (DVS) datasets, including MNIST-DVS,
CIFAR10-DVS and DVS128 Gesture, with far fewer synaptic weights and much lower
computational overhead than traditional networks built with LSTM, GRU,
Convolutional LSTM (ConvLSTM) or 3D convolution (Conv3D) layers. Compared with
a traditional LIF-SNN, LIAF-Net also shows a dramatic accuracy gain on all these
experiments. In conclusion, LIAF-Net provides a framework combining the
advantages of both ANNs and SNNs for lightweight and efficient spatiotemporal
information processing.
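To make the neuron model concrete, below is a minimal PyTorch sketch of a convolutional LIAF layer following the description above: the membrane potential obeys leaky LIF dynamics and the threshold still triggers a reset, but the layer transmits an analog activation of the potential rather than the binary spike. The class name LIAFConv2d, the ReLU readout, the constant leak factor tau, and the hard-reset scheme are illustrative assumptions, not the authors' reference implementation.

```python
import torch
import torch.nn as nn

class LIAFConv2d(nn.Module):
    """Sketch of a convolutional LIAF layer (illustrative, not the authors'
    reference implementation). At each time step the layer:
      1. integrates the input spatially with a convolution,
      2. updates the membrane potential with leaky (LIF-style) dynamics,
      3. fires and resets where the potential crosses the threshold,
      4. transmits an analog activation of the potential instead of the
         binary spike -- the key LIAF change over plain LIF.
    """

    def __init__(self, in_ch, out_ch, kernel_size=3, tau=0.5, v_th=1.0):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, out_ch, kernel_size,
                              padding=kernel_size // 2)
        self.tau = tau    # assumed constant leak factor
        self.v_th = v_th  # assumed firing threshold

    def forward(self, x_seq):
        # x_seq: (T, B, C, H, W) sequence of event frames
        v, outputs = None, []
        for x_t in x_seq:
            i_t = self.conv(x_t)                          # spatial integration
            v = i_t if v is None else self.tau * v + i_t  # leaky integration
            spike = (v >= self.v_th).float()              # LIF fire condition
            outputs.append(torch.relu(v))                 # analog fire (ReLU readout)
            v = v * (1.0 - spike)                         # hard reset where fired
        return torch.stack(outputs)                       # (T, B, out_ch, H, W)

# Usage: a 10-step DVS-like clip with 2 polarity channels
layer = LIAFConv2d(in_ch=2, out_ch=16)
out = layer(torch.randn(10, 4, 2, 32, 32))
print(out.shape)  # torch.Size([10, 4, 16, 32, 32])
```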
Related papers
- CLIF: Complementary Leaky Integrate-and-Fire Neuron for Spiking Neural Networks [5.587069105667678]
Spiking neural networks (SNNs) are promising brain-inspired energy-efficient models.
It remains a challenge to train SNNs due to their non-differentiable spiking mechanism.
We propose the Complementary Leaky Integrate-and-Fire (CLIF) neuron for training Leaky Integrate-and-Fire-based SNNs.
arXiv Detail & Related papers (2024-02-07T08:51:57Z) - LC-TTFS: Towards Lossless Network Conversion for Spiking Neural Networks
with TTFS Coding [55.64533786293656]
We show that our algorithm can achieve a near-perfect mapping between the activation values of an ANN and the spike times of an SNN on a number of challenging AI tasks.
The study paves the way for deploying ultra-low-power TTFS-based SNNs on power-constrained edge computing platforms.
arXiv Detail & Related papers (2023-10-23T14:26:16Z) - Accelerating SNN Training with Stochastic Parallelizable Spiking Neurons [1.7056768055368383]
Spiking neural networks (SNNs) are able to learn features while using less energy, especially on neuromorphic hardware.
The most widely used spiking neuron in deep learning is the Leaky Integrate-and-Fire (LIF) neuron.
arXiv Detail & Related papers (2023-06-22T04:25:27Z) - Properties and Potential Applications of Random Functional-Linked Types
of Neural Networks [81.56822938033119]
Random functional-linked neural networks (RFLNNs) offer an alternative way of learning in deep structure.
This paper gives some insights into the properties of RFLNNs from the viewpoint of the frequency domain.
We propose a method to generate a BLS network with better performance, and design an efficient algorithm for solving Poisson's equation.
arXiv Detail & Related papers (2023-04-03T13:25:22Z) - Towards Energy-Efficient, Low-Latency and Accurate Spiking LSTMs [1.7969777786551424]
Spiking Neural Networks (SNNs) have emerged as an attractive spatio-temporal computing paradigm for complex vision tasks.
We propose an optimized spiking long short-term memory (LSTM) training framework that involves a novel
ANN-to-SNN conversion framework, followed by SNN training.
We evaluate our framework on sequential learning tasks including the temporal MNIST, Google Speech Commands (GSC), and UCI Smartphone datasets on different LSTM architectures.
arXiv Detail & Related papers (2022-10-23T04:10:27Z) - Adaptive-SpikeNet: Event-based Optical Flow Estimation using Spiking
Neural Networks with Learnable Neuronal Dynamics [6.309365332210523]
Spiking Neural Networks (SNNs) with their neuro-inspired event-driven processing can efficiently handle asynchronous data.
We propose an adaptive fully-spiking framework with learnable neuronal dynamics to alleviate the spike vanishing problem.
Our experiments show an average reduction of 13% in average endpoint error (AEE) compared to state-of-the-art ANNs.
arXiv Detail & Related papers (2022-09-21T21:17:56Z) - Event-based Video Reconstruction via Potential-assisted Spiking Neural
Network [48.88510552931186]
Bio-inspired neural networks can potentially lead to greater computational efficiency on event-driven hardware.
We propose a novel Event-based Video reconstruction framework based on a fully Spiking Neural Network (EVSNN).
We find that the spiking neurons have the potential to store useful temporal information (memory) to complete such time-dependent tasks.
arXiv Detail & Related papers (2022-01-25T02:05:20Z) - Hybrid SNN-ANN: Energy-Efficient Classification and Object Detection for
Event-Based Vision [64.71260357476602]
Event-based vision sensors encode local pixel-wise brightness changes in streams of events rather than image frames.
Recent progress in object recognition from event-based sensors has come from conversions of deep neural networks.
We propose a hybrid architecture for end-to-end training of deep neural networks for event-based pattern recognition and object detection.
arXiv Detail & Related papers (2021-12-06T23:45:58Z) - A journey in ESN and LSTM visualisations on a language task [77.34726150561087]
We trained ESNs and LSTMs on a Cross-Situational Learning (CSL) task.
The results are of three kinds: performance comparison, internal dynamics analyses and visualization of latent space.
arXiv Detail & Related papers (2020-12-03T08:32:01Z) - Automatic Remaining Useful Life Estimation Framework with Embedded
Convolutional LSTM as the Backbone [5.927250637620123]
We propose a new LSTM variant called Embedded Convolutional LSTM (ECLSTM).
In ECLSTM a group of different 1D convolutions is embedded into the LSTM structure; through this, the temporal information is preserved between and within windows.
We show the superiority of our proposed ETM approach over the state-of-the-art approaches on several widely used benchmark data sets for RUL Estimation.
arXiv Detail & Related papers (2020-08-10T08:34:20Z) - Progressive Tandem Learning for Pattern Recognition with Deep Spiking
Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z)
This list is automatically generated from the titles and abstracts of the papers on this site.