Hybrid Spiking Neural Network Fine-tuning for Hippocampus Segmentation
- URL: http://arxiv.org/abs/2302.07328v1
- Date: Tue, 14 Feb 2023 20:18:57 GMT
- Title: Hybrid Spiking Neural Network Fine-tuning for Hippocampus Segmentation
- Authors: Ye Yue, Marc Baltes, Nidal Abujahar, Tao Sun, Charles D. Smith, Trevor
Bihl, Jundong Liu
- Abstract summary: Spiking neural networks (SNNs) have emerged as a low-power alternative to artificial neural networks (ANNs).
In this work, we propose a hybrid SNN training scheme and apply it to segment human hippocampi from magnetic resonance images.
- Score: 3.1247096708403914
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Over the past decade, artificial neural networks (ANNs) have made tremendous
advances, in part due to the increased availability of annotated data. However,
ANNs typically require significant power and memory consumption to reach their
full potential. Spiking neural networks (SNNs) have recently emerged as a
low-power alternative to ANNs due to their sparse nature. SNNs, however, are
not as easy to train as ANNs. In this work, we propose a hybrid SNN training
scheme and apply it to segment human hippocampi from magnetic resonance images.
Our approach takes ANN-SNN conversion as an initialization step and relies on
spike-based backpropagation to fine-tune the network. Compared with the
conversion and direct training solutions, our method has advantages in both
segmentation accuracy and training efficiency. Experiments demonstrate the
effectiveness of our model in achieving the design goals.
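The method sketched in the abstract has two stages: a trained ANN's weights initialize an SNN with the same architecture, and spike-based backpropagation then fine-tunes those weights. Below is a minimal PyTorch sketch of this two-stage idea, assuming leaky integrate-and-fire (LIF) neurons, a rectangular surrogate gradient, and a rate-coded loss; it is an illustration, not the authors' implementation, and all layer sizes and hyperparameters are made up.

```python
# Minimal sketch of conversion-then-fine-tuning (an assumption-laden
# illustration, not the paper's code). Stage 1 copies trained ANN weights
# into a spiking layer; stage 2 fine-tunes with spike-based backprop
# through a surrogate gradient.
import torch
import torch.nn as nn

class SpikeFn(torch.autograd.Function):
    """Heaviside spike in the forward pass, rectangular surrogate
    gradient in the backward pass (the non-differentiability workaround)."""
    @staticmethod
    def forward(ctx, v, threshold):
        ctx.save_for_backward(v)
        ctx.threshold = threshold
        return (v >= threshold).float()

    @staticmethod
    def backward(ctx, grad_out):
        (v,) = ctx.saved_tensors
        # Pass gradients only for membrane potentials near the threshold.
        surrogate = (torch.abs(v - ctx.threshold) < 0.5).float()
        return grad_out * surrogate, None

class LIFLayer(nn.Module):
    """Leaky integrate-and-fire layer wrapping an ANN linear layer."""
    def __init__(self, linear, threshold=1.0, decay=0.9):
        super().__init__()
        self.linear = linear        # weights taken from the trained ANN
        self.threshold = threshold
        self.decay = decay

    def forward(self, x_seq):       # x_seq: (T, batch, in_features)
        v = torch.zeros_like(self.linear(x_seq[0]))
        spikes = []
        for x in x_seq:             # unroll over T time steps
            v = self.decay * v + self.linear(x)
            s = SpikeFn.apply(v, self.threshold)
            v = v - s * self.threshold   # soft reset after a spike
            spikes.append(s)
        return torch.stack(spikes)

# Stage 1 -- conversion as initialization: reuse trained ANN weights.
ann_fc = nn.Linear(256, 128)        # stands in for a trained ANN layer
snn_layer = LIFLayer(ann_fc)

# Stage 2 -- fine-tune with spike-based backpropagation through time.
optimizer = torch.optim.Adam(snn_layer.parameters(), lr=1e-4)
x_seq = (torch.rand(8, 4, 256) < 0.3).float()  # toy binary spike input
target = torch.rand(4, 128)
rate = snn_layer(x_seq).mean(dim=0)            # firing rate over T steps
loss = nn.functional.mse_loss(rate, target)
loss.backward()
optimizer.step()
```

The surrogate gradient is what makes the fine-tuning stage possible: the Heaviside spike has zero derivative almost everywhere, so a boxcar function stands in for it during the backward pass.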
Related papers
- LC-TTFS: Towards Lossless Network Conversion for Spiking Neural Networks with TTFS Coding [55.64533786293656]
We show that our algorithm can achieve a near-perfect mapping between the activation values of an ANN and the spike times of an SNN on a number of challenging AI tasks.
The study paves the way for deploying ultra-low-power TTFS-based SNNs on power-constrained edge computing platforms; a minimal sketch of TTFS coding appears after this list.
arXiv Detail & Related papers (2023-10-23T14:26:16Z)
- SPENSER: Towards a NeuroEvolutionary Approach for Convolutional Spiking Neural Networks [0.0]
Spiking Neural Networks (SNNs) have attracted recent interest due to their energy efficiency and biological plausibility.
There is no consensus on the best learning algorithm for SNNs.
In this paper, we propose SPENSER, a framework for SNN generation based on DENSER.
arXiv Detail & Related papers (2023-05-18T14:06:37Z)
- Joint A-SNN: Joint Training of Artificial and Spiking Neural Networks via Self-Distillation and Weight Factorization [12.1610509770913]
Spiking Neural Networks (SNNs) mimic the spiking nature of brain neurons.
We propose a joint training framework of ANN and SNN, in which the ANN can guide the SNN's optimization (see the distillation sketch after this list).
Our method consistently outperforms many other state-of-the-art training methods.
arXiv Detail & Related papers (2023-05-03T13:12:17Z)
- LaSNN: Layer-wise ANN-to-SNN Distillation for Effective and Efficient Training in Deep Spiking Neural Networks [7.0691139514420005]
Spiking Neural Networks (SNNs) are biologically realistic and practically promising for low-power applications because of their event-driven mechanism.
A conversion scheme is proposed to obtain competitive accuracy by mapping trained ANNs' parameters to SNNs with the same structures.
A novel SNN training framework is proposed, namely layer-wise ANN-to-SNN knowledge distillation (LaSNN).
arXiv Detail & Related papers (2023-04-17T03:49:35Z)
- Joint ANN-SNN Co-training for Object Localization and Image Segmentation [0.0]
Spiking neural networks (SNNs) have emerged as a low-power alternative to deep artificial neural networks (ANNs).
We propose a novel hybrid ANN-SNN co-training framework to improve the performance of converted SNNs.
arXiv Detail & Related papers (2023-03-10T14:45:02Z)
- SNN2ANN: A Fast and Memory-Efficient Training Framework for Spiking Neural Networks [117.56823277328803]
Spiking neural networks are efficient computation models for low-power environments.
We propose a SNN-to-ANN (SNN2ANN) framework to train the SNN in a fast and memory-efficient way.
Experiment results show that our SNN2ANN-based models perform well on the benchmark datasets.
arXiv Detail & Related papers (2022-06-19T16:52:56Z)
- Training High-Performance Low-Latency Spiking Neural Networks by Differentiation on Spike Representation [70.75043144299168]
Spiking Neural Network (SNN) is a promising energy-efficient AI model when implemented on neuromorphic hardware.
It is a challenge to efficiently train SNNs due to their non-differentiability.
We propose the Differentiation on Spike Representation (DSR) method, which achieves high performance.
arXiv Detail & Related papers (2022-05-01T12:44:49Z)
- Kernel Based Progressive Distillation for Adder Neural Networks [71.731127378807]
Adder Neural Networks (ANNs), which only contain additions, bring us a new way of developing deep neural networks with low energy consumption.
There is an accuracy drop when replacing all convolution filters with adder filters.
We present a novel method for further improving the performance of ANNs without increasing the trainable parameters.
arXiv Detail & Related papers (2020-09-28T03:29:19Z)
- Progressive Tandem Learning for Pattern Recognition with Deep Spiking Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z)
- Training Deep Spiking Neural Networks [0.0]
Brain-inspired spiking neural networks (SNNs) with neuromorphic hardware may offer orders of magnitude higher energy efficiency.
We show that it is possible to train an SNN with the ResNet50 architecture on the CIFAR100 and Imagenette object recognition datasets.
The trained SNN falls behind the analogous ANN in accuracy but requires several orders of magnitude fewer inference time steps.
arXiv Detail & Related papers (2020-06-08T09:47:05Z)
- You Only Spike Once: Improving Energy-Efficient Neuromorphic Inference to ANN-Level Accuracy [51.861168222799186]
Spiking Neural Networks (SNNs) are a type of neuromorphic, or brain-inspired, network.
SNNs are sparse, accessing very few weights, and typically only use addition operations instead of the more power-intensive multiply-and-accumulate operations.
In this work, we aim to overcome the limitations of TTFS-encoded neuromorphic systems.
arXiv Detail & Related papers (2020-06-03T15:55:53Z)
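As noted in the LC-TTFS entry above, TTFS (time-to-first-spike) coding maps an ANN activation value to the time of a single spike: the larger the activation, the earlier the neuron fires. A minimal sketch of that encoding follows, assuming activations in [0, 1] and a linear mapping onto a 32-step window; the paper's actual mapping and lossless-conversion conditions are more involved.

```python
# Hypothetical TTFS encoder: at most one spike per neuron, earlier spikes
# for larger activations. The linear mapping and window length are
# illustrative assumptions, not the LC-TTFS algorithm.
import torch

def ttfs_encode(activations, t_max=32):
    """Map activations in [0, 1] to a (t_max, *shape) binary spike train."""
    a = activations.clamp(0.0, 1.0)
    # Larger activation -> earlier spike time; zero -> no spike at all.
    times = ((1.0 - a) * t_max).round().long()
    train = torch.zeros((t_max,) + a.shape)
    for t in range(t_max):
        train[t] = (times == t).float()
    return train

acts = torch.tensor([0.9, 0.5, 0.0])   # example ANN activations
train = ttfs_encode(acts)
print(train.sum(dim=0))                # tensor([1., 1., 0.]): zero never fires
```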
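For the Joint A-SNN entry, the ANN guiding the SNN's optimization can be pictured as a standard knowledge-distillation objective. The sketch below is an assumption for illustration, not the paper's exact loss (which also involves self-distillation and weight factorization); alpha and tau are made-up hyperparameters.

```python
# Hypothetical ANN-guided loss: the SNN fits hard labels while matching
# the trained ANN teacher's softened outputs (standard distillation form).
import torch
import torch.nn.functional as F

def guided_loss(snn_logits, ann_logits, labels, alpha=0.5, tau=2.0):
    hard = F.cross_entropy(snn_logits, labels)            # ground truth
    soft = F.kl_div(                                      # teacher guidance
        F.log_softmax(snn_logits / tau, dim=1),
        F.softmax(ann_logits / tau, dim=1),
        reduction="batchmean",
    ) * (tau * tau)
    return alpha * hard + (1.0 - alpha) * soft

snn_logits = torch.randn(4, 10, requires_grad=True)       # SNN rate outputs
ann_logits = torch.randn(4, 10)                           # frozen ANN teacher
labels = torch.randint(0, 10, (4,))
guided_loss(snn_logits, ann_logits, labels).backward()
```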