SpikeSEE: An Energy-Efficient Dynamic Scenes Processing Framework for
Retinal Prostheses
- URL: http://arxiv.org/abs/2209.07898v1
- Date: Fri, 16 Sep 2022 12:46:10 GMT
- Title: SpikeSEE: An Energy-Efficient Dynamic Scenes Processing Framework for
Retinal Prostheses
- Authors: Chuanqing Wang, Chaoming Fang, Yong Zou, Jie Yang, and Mohamad Sawan
- Abstract summary: We propose an energy-efficient dynamic scenes processing framework (SpikeSEE) that combines a spike representation encoding technique and a bio-inspired spiking recurrent neural network (SRNN) model.
Our proposed SpikeSEE predicts the response of ganglion cells more accurately with lower energy consumption.
- Score: 3.794154439461156
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Intelligent, low-power retinal prostheses are in high demand in an era
when wearable and implantable devices serve numerous healthcare applications. In
this paper, we propose an energy-efficient dynamic scenes processing framework
(SpikeSEE) that combines a spike representation encoding technique with a
bio-inspired spiking recurrent neural network (SRNN) model to achieve intelligent
processing and extremely low-power computation for retinal prostheses. The spike
representation encoding technique interprets dynamic scenes as sparse spike
trains, reducing the data volume. The SRNN model, inspired by the specialized
structure and spike-based processing of the human retina, is adopted to predict
the response of ganglion cells to dynamic scenes. Experimental results show that
the proposed SRNN model achieves a Pearson correlation coefficient of 0.93,
outperforming the state-of-the-art processing framework for retinal prostheses.
Thanks to the spike representation and SRNN processing, the model can extract
visual features in a multiplication-free fashion. The framework achieves a
12-fold power reduction compared with a convolutional recurrent neural network
(CRNN) based processing framework. Our proposed SpikeSEE predicts the response of
ganglion cells more accurately with lower energy consumption, which alleviates
the precision and power issues of retinal prostheses and offers a potential
solution for wearable or implantable prostheses.
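The abstract does not disclose implementation details, so the following is a minimal, illustrative sketch in Python/NumPy of the two ideas it describes: converting a dynamic scene into sparse spike trains and processing them with a spiking recurrent layer whose synaptic update is multiplication-free because inputs are binary. The threshold-based delta encoding, the `delta_spike_encode` and `SpikingRecurrentLayer` names, all parameter values, and the random placeholder data are assumptions for illustration only, not the authors' method or data; only the Pearson-correlation evaluation is taken from the abstract.

```python
import numpy as np

def delta_spike_encode(frames, threshold=0.1):
    """Assumed threshold-based delta encoding: emit ON/OFF spikes where the
    per-pixel change between consecutive frames exceeds `threshold`.
    Returns two sparse binary tensors of shape (T-1, H, W)."""
    diffs = np.diff(frames.astype(np.float32), axis=0)
    on_spikes = (diffs > threshold).astype(np.int8)
    off_spikes = (diffs < -threshold).astype(np.int8)
    return on_spikes, off_spikes

class SpikingRecurrentLayer:
    """Toy leaky integrate-and-fire recurrent layer (stand-in for the SRNN).
    Because input and recurrent spikes are binary, the synaptic update only
    needs additions of the weight rows of spiking units (no multiplications);
    the leak of 0.875 = 1 - 2**-3 could be a subtract-and-shift in hardware."""

    def __init__(self, n_in, n_out, leak=0.875, v_th=0.5, seed=0):
        rng = np.random.default_rng(seed)
        self.w_in = rng.normal(0.0, 0.1, size=(n_in, n_out))
        self.w_rec = rng.normal(0.0, 0.1, size=(n_out, n_out))
        self.leak, self.v_th = leak, v_th

    def run(self, spike_train):
        T, n_out = spike_train.shape[0], self.w_in.shape[1]
        v = np.zeros(n_out)
        prev_out = np.zeros(n_out, dtype=bool)
        rates = np.zeros((T, n_out))
        for t in range(T):
            active_in = np.flatnonzero(spike_train[t])    # sparse input spikes
            active_rec = np.flatnonzero(prev_out)         # recurrent spikes
            # accumulate-only update: sum weight rows indexed by spiking units
            v = self.leak * v + self.w_in[active_in].sum(axis=0) \
                              + self.w_rec[active_rec].sum(axis=0)
            out = v >= self.v_th
            v[out] = 0.0                                  # reset after firing
            prev_out = out
            rates[t] = out
        return rates

# Toy usage: a random "dynamic scene" stands in for real stimuli.
frames = np.random.rand(20, 8, 8)
on, off = delta_spike_encode(frames)
spikes = np.concatenate([on.reshape(on.shape[0], -1),
                         off.reshape(off.shape[0], -1)], axis=1)
layer = SpikingRecurrentLayer(n_in=spikes.shape[1], n_out=4)
pred = layer.run(spikes).mean(axis=0)     # predicted mean firing rates
recorded = np.random.rand(4)              # placeholder for recorded ganglion responses
r = np.corrcoef(pred, recorded)[0, 1]     # Pearson correlation, as reported in the abstract
```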
Related papers
- Stepwise Weighted Spike Coding for Deep Spiking Neural Networks [7.524721345903027]
Spiking Neural Networks (SNNs) seek to mimic the spiking behavior of biological neurons.
We propose a novel Stepwise Weighted Spike (SWS) coding scheme to enhance the encoding of information in spikes.
This approach compresses spikes by weighting the significance of each spike at every step of neural computation, achieving high performance and low energy consumption (a generic weighted-coding sketch appears after this list).
arXiv Detail & Related papers (2024-08-30T12:39:25Z)
- Fully Spiking Denoising Diffusion Implicit Models [61.32076130121347]
Spiking neural networks (SNNs) have garnered considerable attention owing to their ability to run on neuromorphic devices at very high speeds.
We propose the fully spiking denoising diffusion implicit model (FSDDIM), a novel approach to constructing a diffusion model entirely within SNNs.
We demonstrate that the proposed method outperforms the state-of-the-art fully spiking generative model.
arXiv Detail & Related papers (2023-12-04T09:07:09Z)
- On the Trade-off Between Efficiency and Precision of Neural Abstraction [62.046646433536104]
Neural abstractions have been recently introduced as formal approximations of complex, nonlinear dynamical models.
We employ formal inductive synthesis procedures to generate neural abstractions that result in dynamical models with these semantics.
arXiv Detail & Related papers (2023-07-28T13:22:32Z)
- Evolving Connectivity for Recurrent Spiking Neural Networks [8.80300633999542]
Recurrent spiking neural networks (RSNNs) hold great potential for advancing artificial general intelligence.
We propose the evolving connectivity (EC) framework, an inference-only method for training RSNNs.
arXiv Detail & Related papers (2023-05-28T07:08:25Z)
- SPIDE: A Purely Spike-based Method for Training Feedback Spiking Neural Networks [56.35403810762512]
Spiking neural networks (SNNs) with event-based computation are promising brain-inspired models for energy-efficient applications on neuromorphic hardware.
We study spike-based implicit differentiation on the equilibrium state (SPIDE), which extends the recently proposed training method.
arXiv Detail & Related papers (2023-02-01T04:22:59Z)
- Exploiting High Performance Spiking Neural Networks with Efficient Spiking Patterns [4.8416725611508244]
Spiking Neural Networks (SNNs) use discrete spike sequences to transmit information, closely mimicking information transmission in the brain.
This paper introduces the dynamic Burst pattern and designs the Leaky Integrate and Fire or Burst (LIFB) neuron, which can trade off short-time performance against dynamic temporal performance.
arXiv Detail & Related papers (2023-01-29T04:22:07Z)
- An Adversarial Active Sampling-based Data Augmentation Framework for Manufacturable Chip Design [55.62660894625669]
Lithography modeling is a crucial problem in chip design to ensure a chip design mask is manufacturable.
Recent developments in machine learning have provided alternative solutions, replacing time-consuming lithography simulations with deep neural networks.
We propose a litho-aware data augmentation framework to resolve the dilemma of limited data and improve machine learning model performance.
arXiv Detail & Related papers (2022-10-27T20:53:39Z)
- Low-Light Image Restoration Based on Retina Model using Neural Networks [0.0]
The proposed neural network model reduces computational overhead compared with traditional signal-processing models and, from a subjective perspective, generates results comparable to those of complicated deep learning models.
This work shows that directly simulating the functionalities of retinal neurons with neural networks not only avoids manually searching for optimal parameters but also paves the way toward building artificial counterparts of certain neurobiological organizations.
arXiv Detail & Related papers (2022-10-04T08:14:49Z)
- Training High-Performance Low-Latency Spiking Neural Networks by Differentiation on Spike Representation [70.75043144299168]
A Spiking Neural Network (SNN) is a promising energy-efficient AI model when implemented on neuromorphic hardware.
It is a challenge to efficiently train SNNs due to their non-differentiability.
We propose the Differentiation on Spike Representation (DSR) method, which could achieve high performance.
arXiv Detail & Related papers (2022-05-01T12:44:49Z)
- Event-based Video Reconstruction via Potential-assisted Spiking Neural Network [48.88510552931186]
Bio-inspired neural networks can potentially lead to greater computational efficiency on event-driven hardware.
We propose a novel Event-based Video reconstruction framework based on a fully Spiking Neural Network (EVSNN).
We find that spiking neurons have the potential to store useful temporal information (memory) to complete such time-dependent tasks.
arXiv Detail & Related papers (2022-01-25T02:05:20Z)
- Towards Lightweight Controllable Audio Synthesis with Conditional Implicit Neural Representations [10.484851004093919]
Implicit neural representations (INRs) are neural networks used to approximate low-dimensional functions.
In this work we shed light on the potential of Conditional Implicit Neural Representations (CINRs) as lightweight backbones in generative frameworks for audio synthesis.
arXiv Detail & Related papers (2021-11-14T13:36:18Z)
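The Stepwise Weighted Spike coding entry above describes weighting each spike's significance per computation step, but the blurb does not specify the scheme. The sketch below is a hedged, generic illustration of weighted spike coding, assuming later time steps carry geometrically smaller weights (a common convention, not necessarily the SWS scheme); the `weighted_spike_encode` and `weighted_spike_decode` names are hypothetical.

```python
import numpy as np

def weighted_spike_encode(x, n_steps=8):
    """Greedy weighted spike coding of a value x in [0, 1):
    step t carries weight 2**-(t+1); emit a spike whenever the remaining
    residual is at least that weight (i.e., the binary expansion of x)."""
    spikes = np.zeros(n_steps, dtype=np.int8)
    residual = float(x)
    for t in range(n_steps):
        w = 2.0 ** -(t + 1)
        if residual >= w:
            spikes[t] = 1
            residual -= w
    return spikes

def weighted_spike_decode(spikes):
    """Reconstruct the value as the weighted sum of the spike train."""
    weights = 2.0 ** -(np.arange(len(spikes)) + 1)
    return float(np.dot(spikes, weights))

# A few spikes suffice to represent a value with small error:
s = weighted_spike_encode(0.73, n_steps=8)   # [1 0 1 1 1 0 1 0]
print(s, weighted_spike_decode(s))           # ~0.7266, error < 2**-8
```

The design intuition is that weighting spikes by step lets a short, sparse spike train carry a graded value, which is one way such schemes trade spike count against precision.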
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.