Slimmable Quantum Federated Learning
- URL: http://arxiv.org/abs/2207.10221v1
- Date: Wed, 20 Jul 2022 22:39:23 GMT
- Title: Slimmable Quantum Federated Learning
- Authors: Won Joon Yun, Jae Pyoung Kim, Soyi Jung, Jihong Park, Mehdi Bennis,
and Joongheon Kim
- Abstract summary: Quantum federated learning (QFL) has recently received increasing attention, where quantum neural networks (QNNs) are integrated into federated learning (FL).
We propose slimmable QFL (SlimQFL) in this article, which is a dynamic QFL framework that can cope with time-varying communication channels and computing energy limitations.
Simulation results corroborate that SlimQFL achieves higher classification accuracy than Vanilla QFL on average, particularly under poor channel conditions.
- Score: 44.89303833148191
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Quantum federated learning (QFL) has recently received increasing attention,
where quantum neural networks (QNNs) are integrated into federated learning
(FL). In contrast to the existing static QFL methods, we propose slimmable QFL
(SlimQFL) in this article, which is a dynamic QFL framework that can cope with
time-varying communication channels and computing energy limitations. This is
made viable by leveraging the unique nature of a QNN where its angle parameters
and pole parameters can be separately trained and dynamically exploited.
Simulation results corroborate that SlimQFL achieves higher classification
accuracy than Vanilla QFL on average, particularly under poor channel
conditions.
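The separable angle/pole training described in the abstract lends itself to a small illustration. The Python sketch below is a rough, hypothetical rendering of one SlimQFL-style communication round, not the authors' implementation: the class and function names and the stand-in quadratic "loss" are assumptions, and a real system would train an actual QNN and estimate the channel rather than flip a coin.

```python
import numpy as np

rng = np.random.default_rng(0)

class ToyQNNClient:
    """Stand-in for a QNN client whose weights split into two groups."""
    def __init__(self, n_params=8):
        self.angle = rng.normal(size=n_params)  # "angle" parameter group
        self.pole = rng.normal(size=n_params)   # "pole" parameter group

    def local_train(self, steps=10, lr=0.1):
        # Placeholder local update: gradient of a surrogate quadratic loss,
        # standing in for real QNN training on local data.
        for _ in range(steps):
            self.angle -= lr * 2 * self.angle
            self.pole -= lr * 2 * self.pole

def federated_round(clients, channel_good):
    # Good channel: upload and average both parameter groups.
    # Poor channel: fall back to the single (pole) group only.
    groups = ("angle", "pole") if channel_good else ("pole",)
    averaged = {g: np.mean([getattr(c, g) for c in clients], axis=0) for g in groups}
    for c in clients:  # broadcast the averaged groups back to every client
        for g, value in averaged.items():
            setattr(c, g, value.copy())

clients = [ToyQNNClient() for _ in range(4)]
for _ in range(5):
    for c in clients:
        c.local_train()
    federated_round(clients, channel_good=rng.random() > 0.5)
```

The point of the sketch is only the control flow: which parameter group is communicated is decided per round from the channel state, while both groups remain locally trainable.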
Related papers
- Federated Quantum Long Short-term Memory (FedQLSTM) [58.50321380769256]
Quantum federated learning (QFL) can facilitate collaborative learning across multiple clients using quantum machine learning (QML) models.
No prior work has focused on developing a QFL framework that utilizes temporal data to approximate functions.
A novel QFL framework that is the first to integrate quantum long short-term memory (QLSTM) models with temporal data is proposed.
arXiv Detail & Related papers (2023-12-21T21:40:47Z)
- Foundations of Quantum Federated Learning Over Classical and Quantum Networks [59.121263013213756]
Quantum federated learning (QFL) is a novel framework that integrates the advantages of classical federated learning (FL) with the computational power of quantum technologies.
QFL can be deployed over both classical and quantum communication networks.
arXiv Detail & Related papers (2023-10-23T02:56:00Z)
- Optimizing Quantum Federated Learning Based on Federated Quantum Natural Gradient Descent [17.05322956052278]
We propose an efficient optimization algorithm, namely federated quantum natural gradient descent (FQNGD).
Compared with gradient descent methods such as Adam and Adagrad, the FQNGD algorithm requires far fewer training iterations for the QFL to converge.
Our experiments on a handwritten digit classification dataset justify the effectiveness of the FQNGD for the QFL framework.
arXiv Detail & Related papers (2023-02-27T11:34:16Z)
- TeD-Q: a tensor network enhanced distributed hybrid quantum machine learning framework [59.07246314484875]
TeD-Q is an open-source software framework for quantum machine learning.
It seamlessly integrates classical machine learning libraries with quantum simulators.
It provides a graphical mode in which the quantum circuit and the training progress can be visualized in real-time.
arXiv Detail & Related papers (2023-01-13T09:35:05Z)
- Quantum Federated Learning with Entanglement Controlled Circuits and Superposition Coding [44.89303833148191]
We develop a depth-controllable architecture of entangled slimmable quantum neural networks (eSQNNs).
We propose entangled slimmable QFL (eSQFL), which communicates the superposition-coded parameters of eSQNNs.
In an image classification task, extensive simulations corroborate the effectiveness of eSQFL.
arXiv Detail & Related papers (2022-12-04T03:18:03Z)
- Federated Quantum Natural Gradient Descent for Quantum Federated Learning [7.028664795605032]
In this work, we put forth an efficient learning algorithm, namely federated quantum natural gradient descent (FQNGD).
The FQNGD algorithm requires far fewer training iterations for the QFL model to converge.
Our experiments on a handwritten digit classification dataset corroborate that FQNGD is more effective for QFL than other federated learning algorithms. (A minimal, hypothetical sketch of this style of update follows this list.)
arXiv Detail & Related papers (2022-08-15T07:17:11Z)
- On the learnability of quantum neural networks [132.1981461292324]
We consider the learnability of the quantum neural network (QNN) built on the variational hybrid quantum-classical scheme.
We show that if a concept can be efficiently learned by a QNN, then it can also be effectively learned by a QNN even in the presence of gate noise.
arXiv Detail & Related papers (2020-07-24T06:34:34Z)
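Two of the papers above center on federated quantum natural gradient descent (FQNGD). As a rough orientation only, the sketch below shows one plausible form of such an update, in which each client's gradient is preconditioned by a regularized Fisher information matrix and the server averages the preconditioned steps; the function names, size-based weighting, and learning rate are assumptions, not the cited papers' exact method.

```python
import numpy as np

def natural_gradient_step(theta, grad, fisher, lr=0.05, eps=1e-6):
    # Precondition the plain gradient with a regularized Fisher information
    # matrix: theta <- theta - lr * (F + eps*I)^(-1) grad.
    return theta - lr * np.linalg.solve(fisher + eps * np.eye(len(theta)), grad)

def federated_natural_gradient(theta, client_grads, client_fishers, client_sizes, lr=0.05):
    # Each client reports a local gradient and metric; the server averages the
    # preconditioned steps, weighted here (an assumption) by local dataset size.
    total = sum(client_sizes)
    update = sum(
        (n / total) * np.linalg.solve(F + 1e-6 * np.eye(len(theta)), g)
        for g, F, n in zip(client_grads, client_fishers, client_sizes)
    )
    return theta - lr * update

# Toy usage with two clients and identity metrics.
theta = np.zeros(4)
theta = federated_natural_gradient(
    theta,
    client_grads=[np.ones(4), 2 * np.ones(4)],
    client_fishers=[np.eye(4), np.eye(4)],
    client_sizes=[100, 300],
)
```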