Federated Quantum Kernel-Based Long Short-term Memory for Human Activity Recognition
- URL: http://arxiv.org/abs/2508.06078v2
- Date: Mon, 11 Aug 2025 06:25:26 GMT
- Title: Federated Quantum Kernel-Based Long Short-term Memory for Human Activity Recognition
- Authors: Yu-Chao Hsu, Jiun-Cheng Jiang, Chun-Hua Lin, Wei-Ting Chen, Kuo-Chung Peng, Prayag Tiwari, Samuel Yen-Chi Chen, En-Jui Kuo
- Abstract summary: We introduce the Federated Quantum Kernel-Based Long Short-term Memory (Fed-QK-LSTM) framework. Within the Fed-QK-LSTM framework, we enhance human activity recognition in privacy-sensitive environments. We showcase the potential of the Fed-QK-LSTM framework for robust and privacy-preserving human activity recognition in real-world applications.
- Score: 12.82920414864798
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: In this work, we introduce the Federated Quantum Kernel-Based Long Short-term Memory (Fed-QK-LSTM) framework, integrating quantum kernel methods and Long Short-term Memory into federated learning. Within the Fed-QK-LSTM framework, we enhance human activity recognition (HAR) in privacy-sensitive environments and leverage quantum computing for distributed learning systems. The DeepConv-QK-LSTM architecture on each client node employs convolutional layers for efficient local pattern capture; this design enables the use of a shallow QK-LSTM to model long-range relationships within the HAR data. The quantum kernel method enables the model to capture complex non-linear relationships in multivariate time-series data with fewer trainable parameters. Experimental results on the RealWorld HAR dataset demonstrate that the Fed-QK-LSTM framework achieves competitive accuracy across different client settings and local training rounds. We showcase the potential of the Fed-QK-LSTM framework for robust and privacy-preserving human activity recognition in real-world applications, especially in edge computing environments and on scarce quantum devices.
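The two ideas the abstract combines can be sketched in plain Python: a quantum kernel acting as a similarity measure inside an LSTM-style gate, and federated averaging keeping raw activity data on the client nodes. The single-qubit angle-encoding kernel, the reference points, and the gate parameterization below are illustrative assumptions for the sketch, not the paper's exact construction:

```python
import math

def qkernel(x, y):
    # Illustrative angle-encoding kernel: feature x_i rotates one qubit by
    # RY(x_i), so the product-state overlap gives
    # k(x, y) = prod_i cos^2((x_i - y_i) / 2); k(x, x) = 1.
    return math.prod(math.cos((xi - yi) / 2) ** 2 for xi, yi in zip(x, y))

def kernel_gate(x, refs, weights, bias):
    # QK-LSTM-style gate: a sigmoid over kernel similarities to a small set
    # of reference points replaces the usual dense linear map, so the
    # trainable parameters reduce to one weight per reference point.
    z = sum(w * qkernel(x, r) for w, r in zip(weights, refs)) + bias
    return 1.0 / (1.0 + math.exp(-z))

def fed_avg(client_params):
    # Federated averaging: the server averages the clients' parameter
    # vectors element-wise; raw sensor data never leaves a client.
    n = len(client_params)
    return [sum(ws) / n for ws in zip(*client_params)]
```

For example, `fed_avg([[1.0, 2.0], [3.0, 4.0]])` aggregates two clients' weights into `[2.0, 3.0]`, and `kernel_gate` always emits a gate activation in (0, 1).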
Related papers
- AQER: a scalable and efficient data loader for digital quantum computers [62.40228216126285]
We develop AQER, a scalable AQL method that constructs the loading circuit by systematically reducing entanglement in target states. We conduct systematic experiments to evaluate the effectiveness of AQER, using synthetic datasets, classical image and language datasets, and quantum many-body state datasets with up to 50 qubits.
arXiv Detail & Related papers (2026-02-02T14:39:42Z) - QKAN-LSTM: Quantum-inspired Kolmogorov-Arnold Long Short-term Memory [11.996286932948124]
Long short-term memory (LSTM) models are central to sequential modeling tasks in domains such as urban telecommunication forecasting. We propose the Quantum-inspired Kolmogorov-Arnold Long Short-Term Memory (QKAN-LSTM), which integrates Data Re-Uploading Activation modules into the gating structure of LSTMs.
arXiv Detail & Related papers (2025-12-04T18:03:23Z) - VQC-MLPNet: An Unconventional Hybrid Quantum-Classical Architecture for Scalable and Robust Quantum Machine Learning [50.95799256262098]
Variational quantum circuits (VQCs) hold promise for quantum machine learning but face challenges in expressivity, trainability, and noise resilience. We propose VQC-MLPNet, a hybrid architecture where a VQC generates the first-layer weights of a classical multilayer perceptron during training, while inference is performed entirely classically.
arXiv Detail & Related papers (2025-06-12T01:38:15Z) - Federated Quantum-Train Long Short-Term Memory for Gravitational Wave Signal [3.360429911727189]
We present Federated QT-LSTM, a novel framework that combines the Quantum-Train (QT) methodology with Long Short-Term Memory (LSTM) networks in a federated learning setup. By leveraging quantum neural networks (QNNs) to generate classical LSTM model parameters during training, the framework effectively addresses challenges in model compression, scalability, and computational efficiency.
arXiv Detail & Related papers (2025-03-20T11:34:13Z) - Quantum Kernel-Based Long Short-term Memory for Climate Time-Series Forecasting [0.24739484546803336]
We present the Quantum Kernel-Based Long Short-Term Memory (QK-LSTM) network, which integrates quantum kernel methods into classical LSTM architectures. QK-LSTM captures intricate nonlinear dependencies and temporal dynamics with fewer trainable parameters.
arXiv Detail & Related papers (2024-12-12T01:16:52Z) - FL-QDSNNs: Federated Learning with Quantum Dynamic Spiking Neural Networks [4.635820333232683]
This paper introduces the Federated Learning-Quantum Dynamic Spiking Neural Networks (FL-QDSNNs) framework. Central to our framework is a novel dynamic threshold mechanism for activating quantum gates in Quantum Spiking Neural Networks (QSNNs). Our FL-QDSNNs framework demonstrates superior accuracy, up to 94% on the Iris dataset, and markedly outperforms existing Quantum Federated Learning (QFL) approaches.
arXiv Detail & Related papers (2024-12-03T09:08:33Z) - Quantum Kernel-Based Long Short-term Memory [0.30723404270319693]
We introduce the Quantum Kernel-Based Long Short-Term Memory (QK-LSTM) network to capture complex, non-linear patterns in sequential data.
This quantum-enhanced architecture demonstrates efficient convergence, robust loss minimization, and model compactness.
Benchmark comparisons reveal that QK-LSTM achieves performance on par with classical LSTM models, yet with fewer parameters.
arXiv Detail & Related papers (2024-11-20T11:39:30Z) - Federated Quantum Long Short-term Memory (FedQLSTM) [58.50321380769256]
Quantum federated learning (QFL) can facilitate collaborative learning across multiple clients using quantum machine learning (QML) models.
No prior work has focused on developing a QFL framework that utilizes temporal data to approximate functions.
A novel QFL framework that is the first to integrate quantum long short-term memory (QLSTM) models with temporal data is proposed.
arXiv Detail & Related papers (2023-12-21T21:40:47Z) - QKSAN: A Quantum Kernel Self-Attention Network [53.96779043113156]
A Quantum Kernel Self-Attention Mechanism (QKSAM) is introduced to combine the data representation merit of Quantum Kernel Methods (QKM) with the efficient information extraction capability of SAM.
A Quantum Kernel Self-Attention Network (QKSAN) framework is proposed based on QKSAM, which ingeniously incorporates the Deferred Measurement Principle (DMP) and conditional measurement techniques.
Four QKSAN sub-models are deployed on PennyLane and IBM Qiskit platforms to perform binary classification on MNIST and Fashion MNIST.
arXiv Detail & Related papers (2023-08-25T15:08:19Z) - QSAN: A Near-term Achievable Quantum Self-Attention Network [73.15524926159702]
The Self-Attention Mechanism (SAM) excels at capturing the internal connections of features.
A novel Quantum Self-Attention Network (QSAN) is proposed for image classification tasks on near-term quantum devices.
arXiv Detail & Related papers (2022-07-14T12:22:51Z) - When BERT Meets Quantum Temporal Convolution Learning for Text Classification in Heterogeneous Computing [75.75419308975746]
This work proposes a vertical federated learning architecture based on variational quantum circuits to demonstrate the competitive performance of a quantum-enhanced pre-trained BERT model for text classification.
Our experiments on intent classification show that the proposed BERT-QTC model attains competitive results on the Snips and ATIS spoken language datasets.
arXiv Detail & Related papers (2022-02-17T09:55:21Z) - Federated Stochastic Gradient Descent Begets Self-Induced Momentum [151.4322255230084]
Federated learning (FL) is an emerging machine learning method that can be applied in mobile edge systems.
We show that running stochastic gradient descent (SGD) in such a setting can be viewed as adding a momentum-like term to the global aggregation process.
arXiv Detail & Related papers (2022-02-17T02:01:37Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.