Federated Quantum-Train Long Short-Term Memory for Gravitational Wave Signal
- URL: http://arxiv.org/abs/2503.16049v1
- Date: Thu, 20 Mar 2025 11:34:13 GMT
- Title: Federated Quantum-Train Long Short-Term Memory for Gravitational Wave Signal
- Authors: Chen-Yu Liu, Samuel Yen-Chi Chen, Kuan-Cheng Chen, Wei-Jia Huang, Yen-Jui Chang,
- Abstract summary: We present Federated QT-LSTM, a novel framework that combines the Quantum-Train (QT) methodology with Long Short-Term Memory (LSTM) networks in a federated learning setup. By leveraging quantum neural networks (QNNs) to generate classical LSTM model parameters during training, the framework effectively addresses challenges in model compression, scalability, and computational efficiency.
- Score: 3.360429911727189
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We present Federated QT-LSTM, a novel framework that combines the Quantum-Train (QT) methodology with Long Short-Term Memory (LSTM) networks in a federated learning setup. By leveraging quantum neural networks (QNNs) to generate classical LSTM model parameters during training, the framework effectively addresses challenges in model compression, scalability, and computational efficiency. Importantly, Federated QT-LSTM eliminates the reliance on quantum devices during inference, making it practical for real-world applications. Experiments on simulated gravitational wave (GW) signal datasets demonstrate the framework's superior performance compared to baseline models, including LSTM and QLSTM, achieving lower training and testing losses while significantly reducing the number of trainable parameters. The results also reveal that deeper QT layers enhance model expressiveness for complex tasks, highlighting the adaptability of the framework. Federated QT-LSTM provides a scalable and efficient solution for privacy-preserving distributed learning, showcasing the potential of quantum-inspired techniques in advancing time-series prediction and signal reconstruction tasks.
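To make the parameter-generation idea in the abstract concrete, the sketch below shows how a small quantum circuit's measurement probabilities could be turned into the weights of a classical LSTM. It is a minimal illustration rather than the authors' implementation: the hardware-efficient ansatz, the qubit and layer counts, and the crude affine rescaling of probabilities into weights are all assumptions (the QT methodology uses a learned mapping model), with PennyLane and PyTorch used only as convenient stand-ins.

```python
# Minimal sketch of the Quantum-Train idea: an n-qubit QNN's basis-state
# probabilities are turned into the weights of a classical nn.LSTM, so only
# the small set of circuit angles is trained, not the LSTM itself.
# Hypothetical details: the ansatz, the affine prob->weight rescaling, and all
# sizes are illustrative assumptions, not the paper's actual configuration.
import numpy as np
import pennylane as qml
import torch
import torch.nn as nn

# Classical target model whose parameters will be generated, not trained.
lstm = nn.LSTM(input_size=1, hidden_size=4, batch_first=True)
n_params = sum(p.numel() for p in lstm.parameters())           # 112 here

# Choose enough qubits so that 2**n_qubits >= number of LSTM parameters.
n_qubits = int(np.ceil(np.log2(n_params)))                      # 7 qubits
n_layers = 3                                                     # "deeper QT layers"
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def qnn(theta):
    """Hardware-efficient ansatz: RY rotations plus a ring of CNOTs per layer."""
    for layer in range(n_layers):
        for w in range(n_qubits):
            qml.RY(theta[layer, w], wires=w)
        for w in range(n_qubits):
            qml.CNOT(wires=[w, (w + 1) % n_qubits])
    return qml.probs(wires=range(n_qubits))                     # 2**n_qubits values

def generate_lstm_weights(theta):
    """Map measurement probabilities to LSTM weights (toy affine rescaling)."""
    probs = np.asarray(qnn(theta))
    flat = (probs[:n_params] - probs[:n_params].mean()) * n_params
    flat = torch.tensor(flat, dtype=torch.float32)
    with torch.no_grad():
        offset = 0
        for p in lstm.parameters():
            k = p.numel()
            p.copy_(flat[offset:offset + k].view_as(p))
            offset += k

theta = np.random.uniform(0, np.pi, size=(n_layers, n_qubits))  # trainable angles
generate_lstm_weights(theta)
x = torch.randn(8, 20, 1)                                        # (batch, time, feature)
y, _ = lstm(x)                                                   # purely classical inference
print(y.shape, "LSTM params generated from", theta.size, "circuit angles")
```

In the federated setting the abstract describes, each client would hold such a circuit and only the small vector of trainable angles would be exchanged and averaged across clients, while inference runs the generated classical LSTM with no quantum device in the loop.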
Related papers
- Hybrid Quantum Recurrent Neural Network For Remaining Useful Life Prediction [67.410870290301]
We introduce a Hybrid Quantum Recurrent Neural Network framework, combining Quantum Long Short-Term Memory layers with classical dense layers for Remaining Useful Life forecasting.
Experimental results demonstrate that, despite having fewer trainable parameters, the Hybrid Quantum Recurrent Neural Network achieves up to a 5% improvement over a Recurrent Neural Network.
arXiv Detail & Related papers (2025-04-29T14:41:41Z)
- Quantum Kernel-Based Long Short-term Memory for Climate Time-Series Forecasting [0.24739484546803336]
We present the Quantum Kernel-Based Long Short-Term Memory (QK-LSTM) network, which integrates quantum kernel methods into classical LSTM architectures. QK-LSTM captures intricate nonlinear dependencies and temporal dynamics with fewer trainable parameters.
arXiv Detail & Related papers (2024-12-12T01:16:52Z)
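As a rough illustration of the quantum-kernel building block that the QK-LSTM entries in this list rely on, the sketch below evaluates a state-fidelity kernel on a simulator. How QK-LSTM actually wires such kernels into the LSTM gates is not specified in these summaries; the angle-encoding feature map and all sizes here are assumptions.

```python
# Minimal sketch of a state-fidelity quantum kernel k(x, z) = |<phi(z)|phi(x)>|^2,
# a common ingredient of quantum-kernel methods like those cited above.
# The angle-encoding feature map and the 4-qubit size are illustrative only.
import numpy as np
import pennylane as qml

n_qubits = 4
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def fidelity_circuit(x, z):
    """Encode x, then un-encode z; P(all zeros) equals the squared state overlap."""
    qml.AngleEmbedding(x, wires=range(n_qubits), rotation="Y")
    qml.adjoint(qml.AngleEmbedding)(z, wires=range(n_qubits), rotation="Y")
    return qml.probs(wires=range(n_qubits))

def quantum_kernel(x, z):
    return float(fidelity_circuit(x, z)[0])   # probability of |0...0>

# Toy usage: a Gram matrix between a few 4-dimensional feature vectors, as a
# kernelized stand-in for the inner products a classical LSTM gate would use.
X = np.random.uniform(0, np.pi, size=(3, n_qubits))
gram = np.array([[quantum_kernel(a, b) for b in X] for a in X])
print(np.round(gram, 3))                       # symmetric, ones on the diagonal
```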
- Quantum Kernel-Based Long Short-term Memory [0.30723404270319693]
We introduce the Quantum Kernel-Based Long Short-Term Memory (QK-LSTM) network to capture complex, non-linear patterns in sequential data.
This quantum-enhanced architecture demonstrates efficient convergence, robust loss minimization, and model compactness.
Benchmark comparisons reveal that QK-LSTM achieves performance on par with classical LSTM models, yet with fewer parameters.
arXiv Detail & Related papers (2024-11-20T11:39:30Z)
- Quantum-Train with Tensor Network Mapping Model and Distributed Circuit Ansatz [0.8192907805418583]
Quantum-Train (QT) is a hybrid quantum-classical machine learning framework.
It maps quantum state measurements to classical neural network weights.
The traditional QT framework employs a multi-layer perceptron (MLP) for this task, but it struggles with scalability and interpretability.
We introduce a distributed circuit ansatz designed for large-scale quantum machine learning with multiple small quantum processing unit nodes.
arXiv Detail & Related papers (2024-09-11T03:51:34Z)
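The entry above describes the original QT mapping step, an MLP that turns quantum state measurements into classical weights. The sketch below is a hypothetical rendering of that mapping model only (bitstring plus probability in, one weight out); the sizes and the random stand-in for the measured probabilities are assumptions, and the tensor-network replacement the paper proposes is not shown.

```python
# Minimal sketch of an MLP mapping model G(b_i, p_i) -> w_i: each basis state
# (as a bit vector) together with its measured probability is mapped to one
# classical network weight. Hidden width and the random probability vector are
# illustrative placeholders, not the paper's configuration.
import torch
import torch.nn as nn

n_qubits = 7                      # 2**7 = 128 basis states
n_target_params = 112             # e.g. weights of a small classical LSTM

class MLPMapping(nn.Module):
    """Bitstring + probability in, one weight out."""
    def __init__(self, n_qubits, hidden=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_qubits + 1, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, bits, probs):
        return self.net(torch.cat([bits, probs.unsqueeze(-1)], dim=-1)).squeeze(-1)

# Enumerate basis states |b_i> as bit vectors and pair them with a (here random,
# in practice QNN-produced) probability vector over the 2**n_qubits outcomes.
basis = torch.tensor(
    [[(i >> q) & 1 for q in range(n_qubits)] for i in range(2 ** n_qubits)],
    dtype=torch.float32,
)
probs = torch.rand(2 ** n_qubits)
probs = probs / probs.sum()

mapper = MLPMapping(n_qubits)
weights = mapper(basis[:n_target_params], probs[:n_target_params])
print(weights.shape)              # torch.Size([112]) -> reshaped into the target model
```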
- Implementation Guidelines and Innovations in Quantum LSTM Networks [2.938337278931738]
This paper presents a theoretical analysis and an implementation plan for a Quantum LSTM model, which seeks to integrate quantum computing principles with traditional LSTM networks.
The actual architecture and its practical effectiveness in enhancing sequential data processing remain to be developed and demonstrated in future work.
arXiv Detail & Related papers (2024-06-13T10:26:14Z)
- Federated Quantum Long Short-term Memory (FedQLSTM) [58.50321380769256]
Quantum federated learning (QFL) can facilitate collaborative learning across multiple clients using quantum machine learning (QML) models.
No prior work has focused on developing a QFL framework that utilizes temporal data to approximate functions.
A novel QFL framework that is the first to integrate quantum long short-term memory (QLSTM) models with temporal data is proposed.
arXiv Detail & Related papers (2023-12-21T21:40:47Z)
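The federated pattern behind the FedQLSTM entry above (and behind Federated QT-LSTM itself) is parameter averaging: clients train locally and share only parameters, never raw sequences. The sketch below is a generic FedAvg round over a toy torch model; the model, client data, and single local epoch are placeholders rather than the cited paper's setup.

```python
# Minimal sketch of one federated-averaging (FedAvg-style) round: each client
# updates a copy of the global model on its private sequences, and the server
# averages the resulting parameter tensors. Toy sizes and data are placeholders.
import copy
import torch
import torch.nn as nn

def local_update(model, data, targets, lr=1e-2, epochs=1):
    """One client's local training pass; returns its updated parameters."""
    model = copy.deepcopy(model)
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        out, _ = model(data)
        loss = loss_fn(out, targets)
        loss.backward()
        opt.step()
    return model.state_dict()

def fed_avg(states):
    """Server step: element-wise average of the clients' parameter tensors."""
    avg = copy.deepcopy(states[0])
    for key in avg:
        avg[key] = torch.stack([s[key] for s in states], dim=0).mean(dim=0)
    return avg

# Toy round with 3 clients holding private (batch, time, feature) sequences.
global_model = nn.LSTM(input_size=1, hidden_size=4, batch_first=True)
clients = [(torch.randn(8, 20, 1), torch.randn(8, 20, 4)) for _ in range(3)]
states = [local_update(global_model, x, y) for x, y in clients]
global_model.load_state_dict(fed_avg(states))
print("completed one federated round over", len(clients), "clients")
```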
- Pre-training Tensor-Train Networks Facilitates Machine Learning with Variational Quantum Circuits [70.97518416003358]
Variational quantum circuits (VQCs) hold promise for quantum machine learning on noisy intermediate-scale quantum (NISQ) devices.
While tensor-train networks (TTNs) can enhance VQC representation and generalization, the resulting hybrid model, TTN-VQC, faces optimization challenges due to the Polyak-Lojasiewicz (PL) condition.
To mitigate this challenge, we introduce Pre+TTN-VQC, a pre-trained TTN model combined with a VQC.
arXiv Detail & Related papers (2023-05-18T03:08:18Z)
- Bayesian Neural Network Language Modeling for Speech Recognition [59.681758762712754]
State-of-the-art neural network language models (NNLMs), represented by long short-term memory recurrent neural networks (LSTM-RNNs) and Transformers, are becoming highly complex.
In this paper, an overarching full Bayesian learning framework is proposed to account for the underlying uncertainty in LSTM-RNN and Transformer LMs.
arXiv Detail & Related papers (2022-08-28T17:50:19Z)
- When BERT Meets Quantum Temporal Convolution Learning for Text Classification in Heterogeneous Computing [75.75419308975746]
This work proposes a vertical federated learning architecture based on variational quantum circuits to demonstrate the competitive performance of a quantum-enhanced pre-trained BERT model for text classification.
Our experiments on intent classification show that the proposed BERT-QTC model attains competitive results on the Snips and ATIS spoken language datasets.
arXiv Detail & Related papers (2022-02-17T09:55:21Z)
- Quantum Long Short-Term Memory [3.675884635364471]
Long short-term memory (LSTM) is a recurrent neural network (RNN) architecture for modeling sequences and temporal dependencies.
We propose a hybrid quantum-classical model of LSTM, which we dub QLSTM.
Our work paves the way toward implementing machine learning algorithms for sequence modeling on noisy intermediate-scale quantum (NISQ) devices.
arXiv Detail & Related papers (2020-09-03T16:41:09Z)
- Rewiring the Transformer with Depth-Wise LSTMs [55.50278212605607]
We present a Transformer with depth-wise LSTMs connecting cascading Transformer layers and sub-layers.
Experiments with the 6-layer Transformer show significant BLEU improvements on both the WMT 14 English-German and English-French tasks and on the OPUS-100 many-to-many multilingual NMT task.
arXiv Detail & Related papers (2020-07-13T09:19:34Z)
- Object Tracking through Residual and Dense LSTMs [67.98948222599849]
Trackers based on LSTM (Long Short-Term Memory) recurrent neural networks have emerged as a powerful deep-learning alternative.
Dense LSTMs outperform Residual and regular LSTMs, and offer higher resilience to nuisances.
Our case study supports the adoption of residual-based RNNs for enhancing the robustness of other trackers.
arXiv Detail & Related papers (2020-06-22T08:20:17Z)