Federated Quantum Long Short-term Memory (FedQLSTM)
- URL: http://arxiv.org/abs/2312.14309v1
- Date: Thu, 21 Dec 2023 21:40:47 GMT
- Title: Federated Quantum Long Short-term Memory (FedQLSTM)
- Authors: Mahdi Chehimi, Samuel Yen-Chi Chen, Walid Saad, Shinjae Yoo
- Abstract summary: Quantum federated learning (QFL) can facilitate collaborative learning across multiple clients using quantum machine learning (QML) models.
No prior work has focused on developing a QFL framework that utilizes temporal data to approximate functions.
A novel QFL framework that is the first to integrate quantum long short-term memory (QLSTM) models with temporal data is proposed.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Quantum federated learning (QFL) can facilitate collaborative learning across
multiple clients using quantum machine learning (QML) models, while preserving
data privacy. Although recent advances in QFL span different tasks like
classification while leveraging several data types, no prior work has focused
on developing a QFL framework that utilizes temporal data to approximate
functions useful to analyze the performance of distributed quantum sensing
networks. In this paper, a novel QFL framework that is the first to integrate
quantum long short-term memory (QLSTM) models with temporal data is proposed.
The proposed federated QLSTM (FedQLSTM) framework is exploited for performing
the task of function approximation. In this regard, three key use cases are
presented: Bessel function approximation, sinusoidal delayed quantum feedback
control function approximation, and Struve function approximation. Simulation
results confirm that, for all considered use cases, the proposed FedQLSTM
framework achieves a faster convergence rate under one local training epoch,
minimizing the overall computations, and saving 25-33% of the number of
communication rounds needed until convergence compared to an FL framework with
classical LSTM models.
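For orientation, below is a minimal sketch of the federated-averaging workflow the abstract describes. It uses a classical LSTM as a stand-in for the paper's hybrid quantum LSTM (no variational quantum circuits are reproduced), Bessel-function training data generated with scipy.special.jv, and illustrative hyperparameters (client count, window size, learning rate) that are assumptions rather than values from the paper.

```python
# FedAvg over clients that each fit a windowed Bessel-function series.
# Classical LSTM stand-in for the paper's QLSTM; hyperparameters are
# illustrative assumptions, not values from the paper.
import numpy as np
import torch
import torch.nn as nn
from scipy.special import jv  # Bessel function of the first kind

def make_client_data(order, n_points=200, window=4):
    """Sliding-window samples (X: past values, t: next value) of J_order."""
    x = np.linspace(0.0, 10.0, n_points)
    y = jv(order, x).astype(np.float32)
    X = np.stack([y[i:i + window] for i in range(n_points - window)])
    t = y[window:]
    return torch.from_numpy(X).unsqueeze(-1), torch.from_numpy(t)

class SeqRegressor(nn.Module):
    def __init__(self, hidden=8):
        super().__init__()
        self.lstm = nn.LSTM(1, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):
        out, _ = self.lstm(x)              # (batch, window, hidden)
        return self.head(out[:, -1]).squeeze(-1)

def fedavg(num_clients=3, rounds=20, local_epochs=1, lr=0.05):
    global_model = SeqRegressor()
    data = [make_client_data(order=k) for k in range(num_clients)]
    for _ in range(rounds):
        states = []
        for X, t in data:                  # local update on each client
            local = SeqRegressor()
            local.load_state_dict(global_model.state_dict())
            opt = torch.optim.Adam(local.parameters(), lr=lr)
            for _ in range(local_epochs):  # one local epoch, as in the paper
                opt.zero_grad()
                nn.functional.mse_loss(local(X), t).backward()
                opt.step()
            states.append(local.state_dict())
        # Server: average client parameters (equal weights, equal-size sets).
        avg = {k: torch.stack([s[k] for s in states]).mean(0)
               for k in states[0]}
        global_model.load_state_dict(avg)
    return global_model

model = fedavg()
```

Each round runs a single local epoch per client before the server averages parameters, mirroring the one-local-epoch setting highlighted in the abstract.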
Related papers
- Unifying (Quantum) Statistical and Parametrized (Quantum) Algorithms
We take inspiration from Kearns' SQ oracle and Valiant's weak evaluation oracle.
We introduce an extensive yet intuitive framework that yields unconditional lower bounds for learning from evaluation queries.
arXiv Detail & Related papers (2023-10-26T18:23:21Z)
- Efficient quantum recurrent reinforcement learning via quantum reservoir computing
Quantum reinforcement learning (QRL) has emerged as a framework to solve sequential decision-making tasks.
This work presents a novel approach that constructs QRL agents using QRNN-based quantum long short-term memory (QLSTM) models.
arXiv Detail & Related papers (2023-09-13T22:18:38Z)
- QKSAN: A Quantum Kernel Self-Attention Network
A Quantum Kernel Self-Attention Mechanism (QKSAM) is introduced to combine the data representation merit of Quantum Kernel Methods (QKM) with the efficient information extraction capability of SAM.
A Quantum Kernel Self-Attention Network (QKSAN) framework is proposed based on QKSAM, which ingeniously incorporates the Deferred Measurement Principle (DMP) and conditional measurement techniques.
Four QKSAN sub-models are deployed on PennyLane and IBM Qiskit platforms to perform binary classification on MNIST and Fashion MNIST.
arXiv Detail & Related papers (2023-08-25T15:08:19Z)
- A Novel Stochastic LSTM Model Inspired by Quantum Machine Learning
Works in quantum machine learning (QML) over the past few years indicate that QML algorithms can function just as well as their classical counterparts.
This work aims to elucidate whether some of QML's major reported benefits can be achieved on classical machines by incorporating its stochasticity.
arXiv Detail & Related papers (2023-05-17T13:44:25Z)
- Optimizing Quantum Federated Learning Based on Federated Quantum Natural Gradient Descent
We propose an efficient optimization algorithm, namely federated quantum natural gradient descent (FQNGD).
Compared with gradient descent methods such as Adam and Adagrad, the FQNGD algorithm requires far fewer training iterations for the QFL to converge.
Our experiments on a handwritten digit classification dataset justify the effectiveness of the FQNGD for the QFL framework.
arXiv Detail & Related papers (2023-02-27T11:34:16Z)
- Federated Quantum Natural Gradient Descent for Quantum Federated Learning
In this work, we put forth an efficient learning algorithm, namely federated quantum natural gradient descent (FQNGD); a single-client sketch of the natural-gradient step it federates appears after this list.
The FQNGD algorithm requires far fewer training iterations for the QFL model to converge.
Our experiments on a handwritten digit classification dataset corroborate that FQNGD is more effective for QFL than other federated learning algorithms.
arXiv Detail & Related papers (2022-08-15T07:17:11Z)
- Green, Quantized Federated Learning over Wireless Networks: An Energy-Efficient Design
The finite precision level is captured through quantized neural networks (QNNs) that represent weights and activations in fixed-precision format (a minimal quantization sketch follows this list).
The proposed FL framework can reduce energy consumption until convergence by up to 70% compared to a baseline FL algorithm.
arXiv Detail & Related papers (2022-07-19T16:37:24Z)
- QSAN: A Near-term Achievable Quantum Self-Attention Network
Self-Attention Mechanism (SAM) is good at capturing the internal connections of features.
A novel Quantum Self-Attention Network (QSAN) is proposed for image classification tasks on near-term quantum devices.
arXiv Detail & Related papers (2022-07-14T12:22:51Z)
- Quantum Federated Learning with Quantum Data
Quantum machine learning (QML) has emerged as a promising field that leans on the developments in quantum computing to explore large complex machine learning problems.
This paper proposes the first fully quantum federated learning framework that can operate over quantum data and, thus, share the learning of quantum circuit parameters in a decentralized manner.
arXiv Detail & Related papers (2021-05-30T12:19:27Z)
- Quantum Long Short-Term Memory
Long short-term memory (LSTM) is a recurrent neural network (RNN) architecture for modeling sequences and temporal dependencies.
We propose a hybrid quantum-classical model of LSTM, which we dub QLSTM.
Our work paves the way toward implementing machine learning algorithms for sequence modeling on noisy intermediate-scale quantum (NISQ) devices.
arXiv Detail & Related papers (2020-09-03T16:41:09Z)
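Several entries above build on QLSTM, so here is a gate-level sketch of the idea in the last paper: replace each LSTM gate's affine map with a variational quantum circuit (VQC) while keeping the classical gating algebra. The qubit count, the angle-embedding-plus-entangler ansatz, the two-layer depth, and the linear projection are illustrative assumptions rather than the paper's exact architecture; PennyLane's TorchLayer provides the hybrid interface.

```python
# QLSTM cell sketch: four VQCs stand in for the forget, input, update, and
# output gates of a classical LSTM. Ansatz and sizes are assumptions.
import torch
import torch.nn as nn
import pennylane as qml

n_qubits = 4
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev, interface="torch")
def vqc(inputs, weights):
    # Angle-encode the projected [h_{t-1}, x_t] features, entangle, measure.
    qml.AngleEmbedding(inputs, wires=range(n_qubits))
    qml.BasicEntanglerLayers(weights, wires=range(n_qubits))
    return [qml.expval(qml.PauliZ(w)) for w in range(n_qubits)]

weight_shapes = {"weights": (2, n_qubits)}  # two entangling layers (assumed)

class QLSTMCell(nn.Module):
    """One QLSTM step with VQC-based gates."""
    def __init__(self, input_size, hidden_size=n_qubits):
        super().__init__()
        self.proj = nn.Linear(input_size + hidden_size, n_qubits)
        self.gates = nn.ModuleList(
            qml.qnn.TorchLayer(vqc, weight_shapes) for _ in range(4)
        )

    def forward(self, x_t, h, c):
        v = self.proj(torch.cat([h, x_t], dim=-1))
        f = torch.sigmoid(self.gates[0](v))   # forget gate
        i = torch.sigmoid(self.gates[1](v))   # input gate
        g = torch.tanh(self.gates[2](v))      # candidate cell state
        o = torch.sigmoid(self.gates[3](v))   # output gate
        c = f * c + i * g
        h = o * torch.tanh(c)
        return h, c

# One step on toy data: batch of 5, input feature size 3.
cell = QLSTMCell(input_size=3)
h = c = torch.zeros(5, n_qubits)
h, c = cell(torch.randn(5, 3), h, c)
```

A full QLSTM would unroll this cell over the input sequence, exactly as a classical LSTM unrolls its cell.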
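The two FQNGD entries above federate the quantum natural gradient update θ ← θ − η F⁻¹ ∇L(θ), where F is (a block-diagonal approximation of) the quantum Fisher information matrix. Below is a single-client sketch of that step using PennyLane's QNGOptimizer on a toy two-qubit circuit; the circuit, step size, and iteration count are assumptions, and the cross-client aggregation of the federated algorithm is not shown.

```python
# Quantum natural gradient on a toy two-qubit variational circuit.
# FQNGD would average such natural-gradient updates across clients.
import pennylane as qml
from pennylane import numpy as np

dev = qml.device("default.qubit", wires=2)

@qml.qnode(dev)
def cost(params):
    qml.RY(params[0], wires=0)
    qml.RY(params[1], wires=1)
    qml.CNOT(wires=[0, 1])
    return qml.expval(qml.PauliZ(0) @ qml.PauliZ(1))

opt = qml.QNGOptimizer(stepsize=0.2)          # natural-gradient optimizer
params = np.array([0.1, 0.2], requires_grad=True)
for _ in range(50):
    params = opt.step(cost, params)           # θ -= η · F⁻¹ ∇L(θ)
print(params, cost(params))
```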
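For the quantized federated learning entry, a minimal sketch of the fixed-precision idea: weights (and, identically, activations) are mapped to a uniform n-bit grid. The symmetric uniform quantizer below is a common choice and an assumption here; the paper's exact quantizer and bit-widths are not specified in this summary.

```python
# Symmetric uniform fixed-precision quantization (illustrative assumption).
import numpy as np

def quantize_fixed_point(w, n_bits=8):
    """Quantize an array to n_bits of fixed precision and dequantize."""
    max_code = 2 ** (n_bits - 1) - 1          # e.g. 127 for 8 bits
    scale = float(np.max(np.abs(w))) / max_code
    if scale == 0.0:                          # guard the all-zero case
        scale = 1.0
    codes = np.clip(np.round(w / scale), -max_code, max_code)
    return codes * scale                      # dequantized approximation

weights = np.random.randn(4, 4).astype(np.float32)
print(quantize_fixed_point(weights, n_bits=4))
```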