Communication Trade-offs in Federated Learning of Spiking Neural
Networks
- URL: http://arxiv.org/abs/2303.00928v1
- Date: Mon, 27 Feb 2023 19:12:03 GMT
- Title: Communication Trade-offs in Federated Learning of Spiking Neural
Networks
- Authors: Soumi Chaki, David Weinberg, and Ayca Özcelikkale
- Abstract summary: Spiking Neural Networks (SNNs) are biologically inspired alternatives to conventional Artificial Neural Networks (ANNs).
We consider SNNs in a federated learning setting where a high-quality global model is created by aggregating multiple local models from the clients without sharing any data.
We evaluate the performance of the SNNs using a subset of the Spiking Heidelberg Digits (SHD) dataset.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Spiking Neural Networks (SNNs) are biologically inspired alternatives to
conventional Artificial Neural Networks (ANNs). Despite promising preliminary
results, the trade-offs in the training of SNNs in a distributed scheme are not
well understood. Here, we consider SNNs in a federated learning setting where a
high-quality global model is created by aggregating multiple local models from
the clients without sharing any data. We investigate federated learning for
training multiple SNNs at clients when two mechanisms reduce the uplink
communication cost: i) random masking of the model updates sent from the
clients to the server; and ii) client dropouts where some clients do not send
their updates to the server. We evaluate the performance of the SNNs using a
subset of the Spiking Heidelberg Digits (SHD) dataset. The results show that a
trade-off between the random masking and the client drop probabilities is
crucial to obtaining satisfactory performance for a fixed number of clients.
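The two uplink-reduction mechanisms are easy to picture in code. Below is a minimal sketch, not the authors' implementation: it applies i) random masking and ii) client dropout to plain federated averaging over dummy NumPy weight vectors. The names client_update, p_mask, and p_drop are illustrative choices, and the local SNN training step is stubbed out.

```python
import numpy as np

rng = np.random.default_rng(0)

def client_update(global_weights):
    """Stand-in for local SNN training on a client's data shard;
    returns a fake model update for illustration only."""
    return rng.normal(scale=0.01, size=global_weights.shape)

def federated_round(global_weights, n_clients=10, p_mask=0.5, p_drop=0.2):
    updates = []
    for _ in range(n_clients):
        if rng.random() < p_drop:
            continue  # mechanism ii): this client drops out and sends nothing
        update = client_update(global_weights)
        keep = rng.random(update.shape) >= p_mask
        updates.append(update * keep)  # mechanism i): masked entries are zeroed
    if updates:
        # The server averages whatever the surviving clients uploaded.
        global_weights = global_weights + np.mean(updates, axis=0)
    return global_weights

weights = np.zeros(1000)
for _ in range(5):
    weights = federated_round(weights)
```

For a model with d parameters, a round uploads on average n_clients * (1 - p_drop) * (1 - p_mask) * d values instead of n_clients * d, so many (p_mask, p_drop) pairs meet the same uplink budget; the abstract's finding is that, for a fixed number of clients, how that budget is split between the two probabilities matters for accuracy.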
Related papers
- Spiking Neural Networks in Vertical Federated Learning: Performance Trade-offs [2.1756721838833797]
Federated machine learning enables model training across multiple clients.
Vertical Federated Learning (VFL) deals with instances where the clients have different feature sets of the same samples.
Spiking Neural Networks (SNNs) are being leveraged to enable fast and accurate processing at the edge.
arXiv Detail & Related papers (2024-07-24T23:31:02Z)
- Training High-Performance Low-Latency Spiking Neural Networks by Differentiation on Spike Representation [70.75043144299168]
Spiking Neural Networks (SNNs) are a promising energy-efficient AI model when implemented on neuromorphic hardware.
Training SNNs efficiently is challenging because the spiking function is non-differentiable (a generic illustration of this issue appears after this list).
We propose the Differentiation on Spike Representation (DSR) method, which achieves high performance.
arXiv Detail & Related papers (2022-05-01T12:44:49Z)
- Deep Learning in Spiking Phasor Neural Networks [0.6767885381740952]
Spiking Neural Networks (SNNs) have attracted the attention of the deep learning community for use in low-latency, low-power neuromorphic hardware.
In this paper, we introduce Spiking Phasor Neural Networks (SPNNs).
SPNNs are based on complex-valued Deep Neural Networks (DNNs), representing phases by spike times.
arXiv Detail & Related papers (2022-04-01T15:06:15Z)
- Architecture Agnostic Federated Learning for Neural Networks [19.813602191888837]
This work introduces a novel Federated Heterogeneous Neural Networks (FedHeNN) framework.
FedHeNN allows each client to build a personalised model without enforcing a common architecture across clients.
The key idea of FedHeNN is to use the instance-level representations obtained from peer clients to guide the simultaneous training on each client.
arXiv Detail & Related papers (2022-02-15T22:16:06Z)
- Hybrid SNN-ANN: Energy-Efficient Classification and Object Detection for Event-Based Vision [64.71260357476602]
Event-based vision sensors encode local pixel-wise brightness changes in streams of events rather than image frames.
Recent progress in object recognition from event-based sensors has come from conversions of deep neural networks.
We propose a hybrid architecture for end-to-end training of deep neural networks for event-based pattern recognition and object detection.
arXiv Detail & Related papers (2021-12-06T23:45:58Z)
- Federated Learning with Spiking Neural Networks [13.09613811272936]
Spiking Neural Networks (SNNs) are emerging as an energy-efficient alternative to traditional Artificial Neural Networks (ANNs).
We propose a federated learning framework for decentralized and privacy-preserving training of SNNs.
We observe that SNNs outperform ANNs in terms of overall accuracy by over 15% when the data is distributed across a large number of clients in the federation.
arXiv Detail & Related papers (2021-06-11T19:00:58Z)
- Skip-Connected Self-Recurrent Spiking Neural Networks with Joint Intrinsic Parameter and Synaptic Weight Training [14.992756670960008]
We propose a new type of recurrent SNN (RSNN) called Skip-Connected Self-Recurrent SNNs (ScSr-SNNs).
ScSr-SNNs can boost performance by up to 2.55% compared with other types of RSNNs trained by state-of-the-art BP methods.
arXiv Detail & Related papers (2020-10-23T22:27:13Z)
- Adversarial Robustness through Bias Variance Decomposition: A New Perspective for Federated Learning [41.525434598682764]
Federated learning learns a neural network model by aggregating the knowledge from a group of distributed clients under the privacy-preserving constraint.
We show that this paradigm might inherit the adversarial vulnerability of the centralized neural network.
We propose an adversarially robust federated learning framework, named Fed_BVA, with improved server and client update mechanisms.
arXiv Detail & Related papers (2020-09-18T18:58:25Z)
- Neural Networks Enhancement with Logical Knowledge [83.9217787335878]
We propose an extension of KENN (Knowledge Enhanced Neural Networks) for relational data.
The results show that KENN can increase the performance of the underlying neural network even in the presence of relational data.
arXiv Detail & Related papers (2020-09-13T21:12:20Z)
- Multi-Sample Online Learning for Probabilistic Spiking Neural Networks [43.8805663900608]
Spiking Neural Networks (SNNs) capture some of the efficiency of biological brains for inference and learning.
This paper introduces an online learning rule based on generalized expectation-maximization (GEM).
Experimental results on structured output memorization and classification on a standard neuromorphic data set demonstrate significant improvements in terms of log-likelihood, accuracy, and calibration.
arXiv Detail & Related papers (2020-07-23T10:03:58Z)
- Progressive Tandem Learning for Pattern Recognition with Deep Spiking Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z)
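The Differentiation on Spike Representation entry above hinges on the spiking threshold having a derivative that is zero almost everywhere. The blurb does not describe DSR's actual construction, so the sketch below instead shows the generic surrogate-gradient workaround, a different but widely used technique: a hard threshold in the forward pass and a smooth stand-in derivative in the backward pass. All names are illustrative.

```python
import torch

class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike in the forward pass; fast-sigmoid derivative
    used as a surrogate gradient in the backward pass."""

    @staticmethod
    def forward(ctx, membrane_potential):
        ctx.save_for_backward(membrane_potential)
        return (membrane_potential > 0).float()  # spike when above threshold

    @staticmethod
    def backward(ctx, grad_output):
        (v,) = ctx.saved_tensors
        # The true derivative of the step is 0 almost everywhere; substitute
        # the derivative of a fast sigmoid so gradients can cross the threshold.
        return grad_output / (1.0 + 10.0 * v.abs()) ** 2

v = torch.randn(8, requires_grad=True)
spikes = SurrogateSpike.apply(v)
spikes.sum().backward()
print(v.grad)  # nonzero, even though the forward pass is a step function
```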