Spiking Synaptic Penalty: Appropriate Penalty Term for Energy-Efficient
Spiking Neural Networks
- URL: http://arxiv.org/abs/2302.01500v1
- Date: Fri, 3 Feb 2023 02:30:00 GMT
- Title: Spiking Synaptic Penalty: Appropriate Penalty Term for Energy-Efficient
Spiking Neural Networks
- Authors: Kazuma Suetake, Takuya Ushimaru, Ryuji Saiin, Yoshihide Sawada
- Abstract summary: Spiking neural networks (SNNs) are energy-efficient neural networks because of their spiking nature.
Here, we tackle the resulting rise in energy consumption by introducing a novel penalty term for spiking activity into the objective function in the training phase.
Our method is designed to optimize the energy consumption metric directly, without modifying the network architecture.
- Score: 0.40145248246551063
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Spiking neural networks (SNNs) are energy-efficient neural networks because
of their spiking nature. However, as the spike firing rate of SNNs increases,
the energy consumption does as well, and thus, the advantage of SNNs
diminishes. Here, we tackle this problem by introducing a novel penalty term
for spiking activity into the objective function during the training phase. Our
method is designed to optimize the energy consumption metric directly, without
modifying the network architecture, and can therefore reduce energy consumption
more than other methods while maintaining accuracy. We conducted experiments on
image classification tasks, and the results indicate the effectiveness of the
proposed method, which mitigates the energy-accuracy trade-off.
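The abstract describes adding an activity penalty to the training objective but does not give its exact form. As an illustrative sketch only (the function names, the fan-out weighting as a proxy for synaptic operations, and the penalty strength `lam` are all assumptions, not the paper's actual formulation):

```python
import numpy as np

# Sketch of a spiking-activity penalty added to the training loss.
# Spike counts are weighted by each neuron's fan-out, a crude proxy
# for synaptic operations (a common energy metric for SNNs).

def spike_penalty(spike_counts, fan_out, lam=1e-3):
    """Weighted spiking-activity penalty.

    spike_counts: list of per-layer arrays of spike counts over the
                  simulation window.
    fan_out:      list of per-layer outgoing-synapse counts per neuron.
    lam:          penalty strength balancing energy against accuracy.
    """
    sops = sum(float(np.sum(s)) * f for s, f in zip(spike_counts, fan_out))
    return lam * sops

def total_loss(task_loss, spike_counts, fan_out, lam=1e-3):
    # Objective = task loss (e.g. cross-entropy) + activity penalty,
    # so gradient descent trades spikes against accuracy directly.
    return task_loss + spike_penalty(spike_counts, fan_out, lam)
```

Increasing `lam` pushes the optimizer toward sparser spiking (lower energy) at some cost in accuracy; the paper's contribution is choosing the penalty so that it matches the energy metric itself.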
Related papers
- Spiking Meets Attention: Efficient Remote Sensing Image Super-Resolution with Attention Spiking Neural Networks [57.17129753411926]
Spiking neural networks (SNNs) are emerging as a promising alternative to traditional artificial neural networks (ANNs).
We propose SpikeSR, which achieves state-of-the-art performance across various remote sensing benchmarks such as AID, DOTA, and DIOR.
arXiv Detail & Related papers (2025-03-06T09:06:06Z)
- Deep-Unrolling Multidimensional Harmonic Retrieval Algorithms on Neuromorphic Hardware [78.17783007774295]
This paper explores the potential of conversion-based neuromorphic algorithms for highly accurate and energy-efficient single-snapshot multidimensional harmonic retrieval.
A novel method for converting the complex-valued convolutional layers and activations into spiking neural networks (SNNs) is developed.
The converted SNNs achieve almost five-fold power efficiency at moderate performance loss compared to the original CNNs.
arXiv Detail & Related papers (2024-12-05T09:41:33Z)
- Multi-Bit Mechanism: A Novel Information Transmission Paradigm for Spiking Neural Networks [4.552065156611815]
Spiking neural networks (SNNs) are gaining recognition for their high performance, low power consumption, and enhanced biological interpretability.
Currently, the binary nature of spikes leads to considerable information loss in SNNs, ultimately causing performance degradation.
Our research introduces a multi-bit information transmission mechanism for SNNs.
arXiv Detail & Related papers (2024-07-08T08:46:31Z)
- On Reducing Activity with Distillation and Regularization for Energy Efficient Spiking Neural Networks [0.19999259391104385]
Interest in spiking neural networks (SNNs) has been growing steadily, promising an energy-efficient alternative to formal neural networks (FNNs).
We propose to leverage Knowledge Distillation (KD) for SNNs training with surrogate gradient descent in order to optimize the trade-off between performance and spiking activity.
arXiv Detail & Related papers (2024-06-26T13:51:57Z)
- Energy-efficient Spiking Neural Network Equalization for IM/DD Systems with Optimized Neural Encoding [53.909333359654276]
We propose an energy-efficient equalizer for IM/DD systems based on spiking neural networks.
We optimize a neural spike encoding that boosts the equalizer's performance while decreasing energy consumption.
arXiv Detail & Related papers (2023-12-20T10:45:24Z)
- Adaptive Calibration: A Unified Conversion Framework of Spiking Neural Networks [1.632439547798896]
Spiking Neural Networks (SNNs) have emerged as a promising energy-efficient alternative to traditional Artificial Neural Networks (ANNs).
This paper focuses on addressing the dual objectives of enhancing the performance and efficiency of SNNs through the established SNN conversion framework.
arXiv Detail & Related papers (2023-11-24T03:43:59Z)
- Deep Multi-Threshold Spiking-UNet for Image Processing [51.88730892920031]
This paper introduces the novel concept of Spiking-UNet for image processing, which combines the power of Spiking Neural Networks (SNNs) with the U-Net architecture.
To achieve an efficient Spiking-UNet, we face two primary challenges: ensuring high-fidelity information propagation through the network via spikes and formulating an effective training strategy.
Experimental results show that, on image segmentation and denoising, our Spiking-UNet achieves comparable performance to its non-spiking counterpart.
arXiv Detail & Related papers (2023-07-20T16:00:19Z)
- SPIDE: A Purely Spike-based Method for Training Feedback Spiking Neural Networks [56.35403810762512]
Spiking neural networks (SNNs) with event-based computation are promising brain-inspired models for energy-efficient applications on neuromorphic hardware.
We study spike-based implicit differentiation on the equilibrium state (SPIDE) that extends the recently proposed training method.
arXiv Detail & Related papers (2023-02-01T04:22:59Z)
- Fast Exploration of the Impact of Precision Reduction on Spiking Neural Networks [63.614519238823206]
Spiking Neural Networks (SNNs) are a practical choice when the target hardware reaches the edge of computing.
We employ an Interval Arithmetic (IA) model to develop an exploration methodology that takes advantage of the capability of such a model to propagate the approximation error.
arXiv Detail & Related papers (2022-11-22T15:08:05Z)
- Training High-Performance Low-Latency Spiking Neural Networks by Differentiation on Spike Representation [70.75043144299168]
Spiking Neural Network (SNN) is a promising energy-efficient AI model when implemented on neuromorphic hardware.
Efficiently training SNNs is challenging due to their non-differentiability.
We propose the Differentiation on Spike Representation (DSR) method, which achieves high performance.
arXiv Detail & Related papers (2022-05-01T12:44:49Z)
- Optimizing the Consumption of Spiking Neural Networks with Activity Regularization [15.317534913990633]
Spiking Neural Networks (SNNs) are an example of bio-inspired techniques that can further save energy by using binary activations, and avoid consuming energy when not spiking.
In this work, we look into different techniques to enforce sparsity on the neural network activation maps and compare the effect of different training regularizers on the efficiency of the optimized DNNs and SNNs.
arXiv Detail & Related papers (2022-04-04T13:19:47Z)
- SpikeDyn: A Framework for Energy-Efficient Spiking Neural Networks with Continual and Unsupervised Learning Capabilities in Dynamic Environments [14.727296040550392]
Spiking Neural Networks (SNNs) bear the potential of efficient unsupervised and continual learning capabilities because of their biological plausibility.
We propose SpikeDyn, a framework for energy-efficient SNNs with continual and unsupervised learning capabilities in dynamic environments.
arXiv Detail & Related papers (2021-02-28T08:26:23Z)
- And/or trade-off in artificial neurons: impact on adversarial robustness [91.3755431537592]
The presence of a sufficient number of OR-like neurons in a network can lead to classification brittleness and increased vulnerability to adversarial attacks.
We define AND-like neurons and propose measures to increase their proportion in the network.
Experimental results on the MNIST dataset suggest that our approach holds promise as a direction for further exploration.
arXiv Detail & Related papers (2021-02-15T08:19:05Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.