Event-based Video Reconstruction via Potential-assisted Spiking Neural
Network
- URL: http://arxiv.org/abs/2201.10943v1
- Date: Tue, 25 Jan 2022 02:05:20 GMT
- Title: Event-based Video Reconstruction via Potential-assisted Spiking Neural
Network
- Authors: Lin Zhu, Xiao Wang, Yi Chang, Jianing Li, Tiejun Huang, Yonghong Tian
- Abstract summary: Bio-inspired neural networks can potentially lead to greater computational efficiency on event-driven hardware.
We propose a novel Event-based Video reconstruction framework based on a fully Spiking Neural Network (EVSNN)
We find that the spiking neurons have the potential to store useful temporal information (memory) to complete such time-dependent tasks.
- Score: 48.88510552931186
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The neuromorphic vision sensor is a new bio-inspired imaging paradigm that
reports asynchronous, continuous per-pixel brightness changes, called `events',
with high temporal resolution and high dynamic range. So far, event-based
image reconstruction methods have been based on artificial neural networks (ANNs)
or hand-crafted spatiotemporal smoothing techniques. In this paper, we are the
first to implement image reconstruction with a fully spiking neural network (SNN)
architecture. As bio-inspired neural networks that operate on asynchronous binary
spikes distributed over time, SNNs can potentially lead to greater computational
efficiency on event-driven hardware. We propose a novel Event-based Video
reconstruction framework based on a fully Spiking Neural Network (EVSNN), which
utilizes Leaky Integrate-and-Fire (LIF) neurons and Membrane Potential (MP)
neurons. We find that spiking neurons have the potential to store useful temporal
information (memory) needed to complete such time-dependent tasks. Furthermore,
to better exploit this temporal information, we propose a hybrid
potential-assisted framework (PA-EVSNN) that uses the membrane potential of the
spiking neuron. The proposed neuron, referred to as the Adaptive Membrane
Potential (AMP) neuron, adaptively updates its membrane potential according to
the input spikes. The experimental results demonstrate that our models achieve
performance comparable to ANN-based models on the IJRR, MVSEC, and HQF datasets.
EVSNN and PA-EVSNN are 19.36$\times$ and 7.75$\times$ more computationally
efficient than their ANN counterparts, respectively.
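The LIF and adaptive-membrane-potential dynamics summarized above can be sketched in a few lines. This is a minimal NumPy illustration of the neuron models' general behavior, not the authors' EVSNN code; the `amp_step` gating form in particular is a hypothetical reading of an "adaptive membrane potential update", labeled as such below.

```python
import numpy as np

def lif_step(v, spikes_in, w, tau=2.0, v_th=1.0):
    """One Leaky Integrate-and-Fire step: leak, integrate input spikes,
    fire when the membrane potential crosses the threshold, then reset."""
    v = v / tau + w @ spikes_in        # leak toward zero, then integrate
    out = (v >= v_th).astype(float)    # binary output spikes
    v = v * (1.0 - out)                # hard reset where the neuron fired
    return v, out

def amp_step(v, spikes_in, w, g):
    """Hypothetical AMP-style update: an input-dependent gate g in [0, 1]
    adaptively blends the old potential with the new spike-driven input."""
    return (1.0 - g) * v + g * (w @ spikes_in)

rng = np.random.default_rng(0)
w = rng.standard_normal((8, 8)) * 0.5
v = np.zeros(8)
for _ in range(5):
    spikes_in = (rng.random(8) < 0.3).astype(float)
    v, out = lif_step(v, spikes_in, w)
print(v.shape, out.shape)
```

Because the membrane potential `v` persists between steps, it acts as the per-neuron memory the abstract refers to: information from earlier event slices survives in `v` even when no output spike is emitted.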
Related papers
- When Spiking neural networks meet temporal attention image decoding and adaptive spiking neuron [7.478056407323783]
Spiking Neural Networks (SNNs) are capable of encoding and processing temporal information in a biologically plausible way.
We propose a novel method for image decoding based on temporal attention (TAID) and an adaptive Leaky-Integrate-and-Fire neuron model.
arXiv Detail & Related papers (2024-06-05T08:21:55Z)
- Fully Spiking Actor Network with Intra-layer Connections for Reinforcement Learning [51.386945803485084]
We focus on tasks where the agent needs to learn multi-dimensional deterministic policies for control.
Most existing spike-based RL methods take the firing rate as the output of SNNs, and convert it to represent continuous action space (i.e., the deterministic policy) through a fully-connected layer.
To develop a fully spiking actor network without any floating-point matrix operations, we draw inspiration from the non-spiking interneurons found in insects.
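The firing-rate readout that most spike-based RL methods use, as described above, can be sketched as follows. This is a hypothetical NumPy illustration of the general scheme (rate decoding plus a floating-point fully-connected layer), not that paper's implementation; all shapes and names are assumptions.

```python
import numpy as np

# Decode a continuous action from binary spike trains by taking the
# firing rate over a simulation window of T steps, then applying a
# fully-connected (floating-point) readout layer.
rng = np.random.default_rng(0)
T, n_neurons, action_dim = 100, 32, 4
spikes = rng.random((T, n_neurons)) < 0.2     # binary spike trains
firing_rate = spikes.mean(axis=0)             # per-neuron rate in [0, 1]
W = rng.standard_normal((action_dim, n_neurons)) * 0.1
action = np.tanh(W @ firing_rate)             # deterministic policy output
print(action.shape)
```

The final matrix multiply is exactly the floating-point operation that paper seeks to eliminate by drawing on non-spiking interneurons.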
arXiv Detail & Related papers (2024-01-09T07:31:34Z)
- Deep Pulse-Coupled Neural Networks [31.65350290424234]
Spiking Neural Networks (SNNs) capture the information processing mechanism of the brain by taking advantage of spiking neurons.
In this work, we leverage a more biologically plausible neural model with complex dynamics, i.e., a pulse-coupled neural network (PCNN)
We construct deep pulse-coupled neural networks (DPCNNs) by replacing commonly used LIF neurons in SNNs with PCNN neurons.
arXiv Detail & Related papers (2023-12-24T08:26:00Z)
- A Hybrid Neural Coding Approach for Pattern Recognition with Spiking Neural Networks [53.31941519245432]
Brain-inspired spiking neural networks (SNNs) have demonstrated promising capabilities in solving pattern recognition tasks.
These SNNs are grounded on homogeneous neurons that utilize a uniform neural coding for information representation.
In this study, we argue that SNN architectures should be holistically designed to incorporate heterogeneous coding schemes.
arXiv Detail & Related papers (2023-05-26T02:52:12Z)
- Exploiting High Performance Spiking Neural Networks with Efficient Spiking Patterns [4.8416725611508244]
Spiking Neural Networks (SNNs) use discrete spike sequences to transmit information, which significantly mimics the information transmission of the brain.
This paper introduces the dynamic Burst pattern and designs the Leaky Integrate and Fire or Burst (LIFB) neuron that can make a trade-off between short-time performance and dynamic temporal performance.
arXiv Detail & Related papers (2023-01-29T04:22:07Z)
- Hybrid SNN-ANN: Energy-Efficient Classification and Object Detection for Event-Based Vision [64.71260357476602]
Event-based vision sensors encode local pixel-wise brightness changes in streams of events rather than image frames.
Recent progress in object recognition from event-based sensors has come from conversions of deep neural networks.
We propose a hybrid architecture for end-to-end training of deep neural networks for event-based pattern recognition and object detection.
arXiv Detail & Related papers (2021-12-06T23:45:58Z)
- A Spiking Neural Network for Image Segmentation [3.4998703934432682]
We convert the deep Artificial Neural Network (ANN) architecture U-Net to a Spiking Neural Network (SNN) architecture using the Nengo framework.
Both rate-based and spike-based models are trained and optimized for benchmarking performance and power.
The neuromorphic implementation on the Intel Loihi neuromorphic chip is over 2x more energy-efficient than conventional hardware.
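The rate-based ANN-to-SNN conversion that entry describes can be illustrated with a toy example. The paper uses the Nengo framework; this plain-NumPy sketch only shows the underlying idea, namely that the firing rate of an integrate-and-fire neuron driven by a constant input approximates a ReLU activation.

```python
import numpy as np

def if_rate(x, T=200, v_th=1.0):
    """Simulate integrate-and-fire neurons driven by constant input x
    for T steps and return their firing rates, which approximate
    relu(x) for inputs in [0, 1]."""
    v = np.zeros_like(x)
    count = np.zeros_like(x)
    for _ in range(T):
        v = v + x                          # integrate constant input
        fired = v >= v_th
        count += fired                     # accumulate output spikes
        v = np.where(fired, v - v_th, v)   # soft reset (subtract threshold)
    return count / T

x = np.array([-0.5, 0.1, 0.4, 0.9])
rates = if_rate(x)
print(rates)  # close to relu(x) for x in [0, 1]
```

A rate-converted network swaps each ReLU unit for such a neuron, trading latency (the T simulation steps) for sparse, event-driven computation, which is where the Loihi energy savings come from.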
arXiv Detail & Related papers (2021-06-16T16:23:18Z)
- Effective and Efficient Computation with Multiple-timescale Spiking Recurrent Neural Networks [0.9790524827475205]
We show how a novel type of adaptive spiking recurrent neural network (SRNN) is able to achieve state-of-the-art performance.
We calculate a $>$100x energy improvement for our SRNNs over classical RNNs on the harder tasks.
arXiv Detail & Related papers (2020-05-24T01:04:53Z)
- Flexible Transmitter Network [84.90891046882213]
Current neural networks are mostly built upon the McCulloch-Pitts (MP) model, which usually formulates the neuron as executing an activation function on the real-valued weighted aggregation of signals received from other neurons.
We propose the Flexible Transmitter (FT) model, a novel bio-plausible neuron model with flexible synaptic plasticity.
We present the Flexible Transmitter Network (FTNet), which is built on the most common fully-connected feed-forward architecture.
arXiv Detail & Related papers (2020-04-08T06:55:12Z)
- Non-linear Neurons with Human-like Apical Dendrite Activations [81.18416067005538]
We show that a standard neuron followed by our novel apical dendrite activation (ADA) can learn the XOR logical function with 100% accuracy.
We conduct experiments on six benchmark data sets from computer vision, signal processing and natural language processing.
arXiv Detail & Related papers (2020-02-02T21:09:39Z)
This list is automatically generated from the titles and abstracts of the papers in this site.