Quantum Heterogeneous Distributed Deep Learning Architectures: Models,
Discussions, and Applications
- URL: http://arxiv.org/abs/2202.11200v1
- Date: Sat, 19 Feb 2022 12:59:11 GMT
- Title: Quantum Heterogeneous Distributed Deep Learning Architectures: Models,
Discussions, and Applications
- Authors: Yunseok Kwak, Won Joon Yun, Jae Pyoung Kim, Hyunhee Cho, Minseok Choi,
Soyi Jung, Joongheon Kim
- Abstract summary: Quantum deep learning (QDL) and distributed deep learning (DDL) are emerging to complement existing deep learning methods.
QDL achieves computational gains by replacing deep learning computations on local devices and servers with quantum computations.
QDDL can additionally increase data security by using a quantum-secure communication protocol between the server and the client.
- Score: 13.241451755566365
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Deep learning (DL) has already become a state-of-the-art technology
for various data processing tasks. However, data security and computational
overload problems frequently arise from DL's heavy dependence on data and
computational power. To solve these problems, quantum deep learning (QDL) and
distributed deep learning (DDL) are emerging to complement existing DL methods
by reducing computational overhead and strengthening data security.
Furthermore, quantum distributed deep learning (QDDL), a technique that
combines and maximizes these advantages, is in the spotlight. QDL achieves
computational gains by replacing deep learning computations on local devices
and servers with quantum computations. On top of the advantages of the
existing distributed learning structure, QDDL can further increase data
security by using a quantum-secure communication protocol between the server
and the client. Although many attempts have been made to confirm and
demonstrate these possibilities, QDDL research is still in its infancy. To
introduce and promote these studies, this paper discusses the model structures
investigated so far, along with their possibilities and limitations. It also
discusses current and future areas of applied research and the potential of
new methodologies.
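No implementation accompanies the abstract, so as a reading aid here is a
deliberately simplified sketch of what one QDDL round could look like, with
every quantum piece reduced to a labeled placeholder: the "quantum" gradient
step and the quantum-secure channel are toy stand-ins, and all names, sizes,
and values are hypothetical.

```python
import numpy as np

def local_quantum_update(params, data, lr=0.1):
    # Placeholder for a client's QDL step: a real system would get this
    # gradient from a variational quantum circuit, not from the toy
    # quadratic loss around the local data mean used here.
    grad = params - data.mean()
    return params - lr * grad

def quantum_secure_send(payload):
    # Placeholder for the quantum-secure client-server channel the
    # abstract mentions (e.g. a QKD-protected link); here it simply
    # forwards the payload unchanged.
    return payload

# One hypothetical QDDL round: each client updates the shared parameters
# on its own data, then the server averages the protected transmissions.
rng = np.random.default_rng(0)
client_data = [rng.normal(loc=m, size=50) for m in (0.5, 1.0, 1.5, 2.0)]
global_params = np.zeros(1)
updates = [quantum_secure_send(local_quantum_update(global_params.copy(), d))
           for d in client_data]
global_params = np.mean(updates, axis=0)
print("aggregated parameters:", global_params)
```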
Related papers
- Swarm Learning: A Survey of Concepts, Applications, and Trends [3.55026004901472]
Deep learning models have raised privacy and security concerns due to their reliance on large datasets on central servers.
Federated learning (FL) has introduced a novel approach to building a versatile, large-scale machine learning framework.
Swarm learning (SL) has been proposed in collaboration with Hewlett Packard Enterprise (HPE).
SL is a decentralized machine learning framework that leverages blockchain technology for secure, scalable, and private data management (its serverless averaging core is sketched below).
arXiv Detail & Related papers (2024-05-01T14:59:24Z)
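To make SL's serverless aggregation concrete, here is a minimal sketch of
peer-ring parameter averaging; the blockchain coordination layer the survey
covers is omitted, and the topology, sizes, and names are illustrative, not
from the paper.

```python
import numpy as np

def gossip_round(peer_params, neighbors):
    # One decentralized averaging step: every peer replaces its weights
    # with the mean of its own and its neighbors' copies, so raw data
    # and a central server are never involved.
    return [np.mean([peer_params[j] for j in [i] + neighbors[i]], axis=0)
            for i in range(len(peer_params))]

rng = np.random.default_rng(1)
params = [rng.normal(size=3) for _ in range(4)]    # locally trained weights
ring = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
for _ in range(10):               # repeated gossip drives peers to consensus
    params = gossip_round(params, ring)
print(np.round(params, 3))        # near-identical weights on every peer
```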
- Federated Fine-Tuning of LLMs on the Very Edge: The Good, the Bad, the Ugly [62.473245910234304]
This paper takes a hardware-centric approach to explore how Large Language Models can be brought to modern edge computing systems.
We provide a micro-level hardware benchmark, compare the model FLOP utilization to a state-of-the-art data center GPU, and study the network utilization in realistic conditions (the MFU arithmetic is sketched below).
arXiv Detail & Related papers (2023-10-04T20:27:20Z)
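The FLOP-utilization comparison can be illustrated with the common rule of
thumb that transformer training costs about 6 FLOPs per parameter per token;
the device throughput and peak numbers below are invented placeholders, not
the paper's measurements.

```python
def model_flop_utilization(n_params, tokens_per_sec, peak_flops):
    # Training a decoder-only transformer costs roughly 6 FLOPs per
    # parameter per token (forward + backward); MFU is the fraction of
    # the hardware's peak throughput that the model actually achieves.
    return 6 * n_params * tokens_per_sec / peak_flops

# Hypothetical numbers: a 1.1B-parameter model fine-tuned on an edge
# device versus a data-center GPU (peak FLOP/s values are illustrative).
edge = model_flop_utilization(1.1e9, tokens_per_sec=25, peak_flops=275e12)
dc = model_flop_utilization(1.1e9, tokens_per_sec=9000, peak_flops=989e12)
print(f"edge MFU: {edge:.2%}, data-center MFU: {dc:.2%}")
```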
- ShadowNet for Data-Centric Quantum System Learning [188.683909185536]
We propose a data-centric learning paradigm combining the strengths of neural-network protocols and classical shadows.
Capitalizing on the generalization power of neural networks, this paradigm can be trained offline and excels at predicting previously unseen systems.
We present the instantiation of our paradigm in quantum state tomography and direct fidelity estimation tasks and conduct numerical analysis up to 60 qubits (a single-qubit shadow-estimation toy is sketched below).
arXiv Detail & Related papers (2023-08-22T09:11:53Z)
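As a minimal illustration of the classical-shadow ingredient (the
neural-network half is omitted), here is the standard single-qubit shadow
estimator with random Pauli-basis measurements, using the usual snapshot
inversion 3 U^dag|b><b|U - I; the shot count and test state are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(7)
I2 = np.eye(2)
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
Sdg = np.diag([1, -1j])
BASES = [H, H @ Sdg, I2]          # rotations for X-, Y-, Z-basis readout

def shadow_estimate(rho, observable, shots=20000):
    # Average the single-shot snapshots 3 U^dag|b><b|U - I over randomly
    # chosen Pauli bases, then evaluate the observable on the average.
    est = np.zeros((2, 2), dtype=complex)
    for _ in range(shots):
        U = BASES[rng.integers(3)]
        probs = np.clip(np.real(np.diag(U @ rho @ U.conj().T)), 0, None)
        b = rng.choice(2, p=probs / probs.sum())
        ket = U.conj().T[:, b].reshape(2, 1)        # U^dag |b>
        est += 3 * (ket @ ket.conj().T) - I2
    return np.real(np.trace(observable @ est / shots))

plus = np.full((2, 2), 0.5)                         # |+><+|
X = np.array([[0.0, 1.0], [1.0, 0.0]])
print(shadow_estimate(plus, X))    # ~1.0 up to statistical fluctuation
```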
- Towards a Better Theoretical Understanding of Independent Subnetwork Training [56.24689348875711]
Independent Subnetwork Training (IST) is a recently proposed and highly effective technique for reducing the communication and computation costs of distributed training.
We take a closer theoretical look at IST and identify fundamental differences between it and alternative approaches, such as distributed methods with compressed communication; a toy IST round is sketched below.
arXiv Detail & Related papers (2023-06-28T18:14:22Z)
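A minimal sketch of the IST idea under common assumptions: hidden units of a
small two-layer network are partitioned across workers, each worker trains
its subnetwork in isolation, and the server writes the slices back. The toy
regression task, the per-worker target rescaling, and all sizes are invented
for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(256, 8))
y = X @ rng.normal(size=(8, 1))                     # toy regression target
workers, hidden = 4, 16
W1 = 0.1 * rng.normal(size=(8, hidden))
W2 = 0.1 * rng.normal(size=(hidden, 1))

def local_sgd(W1k, W2k, target, steps=50, lr=0.1):
    # Train one subnetwork (a slice of the hidden layer) in isolation;
    # no activations or gradients cross worker boundaries.
    for _ in range(steps):
        h = np.maximum(X @ W1k, 0)                  # ReLU forward
        err = (h @ W2k - target) / len(X)
        g2 = h.T @ err                              # backprop in the slice
        g1 = X.T @ ((err @ W2k.T) * (h > 0))
        W1k -= lr * g1
        W2k -= lr * g2
    return W1k, W2k

for _ in range(20):                                 # IST rounds
    # Partition hidden units at random, train each subnetwork on a
    # 1/workers share of the target (a crude stand-in for the activation
    # rescaling used in practice), then write the slices back.
    for idx in rng.permutation(hidden).reshape(workers, -1):
        W1[:, idx], W2[idx, :] = local_sgd(W1[:, idx].copy(),
                                           W2[idx, :].copy(), y / workers)
print("MSE:", float(np.mean((np.maximum(X @ W1, 0) @ W2 - y) ** 2)))
```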
- Quantum Federated Learning for Distributed Quantum Networks [9.766446130011706]
We propose a quantum federated learning scheme for distributed quantum networks that exploits characteristic properties of quantum mechanics.
A quantum gradient descent algorithm is provided to help clients in the distributed quantum networks train local models.
A quantum secure multi-party computation protocol is designed, which utilizes the Chinese remainder theorem (its reconstruction step is sketched below).
arXiv Detail & Related papers (2022-12-25T14:37:23Z)
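The Chinese remainder theorem at the heart of that protocol can be
illustrated with a plain CRT secret-sharing toy; this is generic classical
CRT machinery, not the paper's quantum protocol, and the moduli and secret
are arbitrary.

```python
from math import prod

def share(secret, moduli):
    # Split a secret into residues, one per party; a single residue by
    # itself reveals essentially nothing about the secret.
    return [secret % m for m in moduli]

def reconstruct(residues, moduli):
    # Chinese remainder theorem: recover the unique value modulo
    # prod(moduli) that matches every party's residue.
    M = prod(moduli)
    x = sum(r * (M // m) * pow(M // m, -1, m)
            for r, m in zip(residues, moduli))
    return x % M

moduli = [101, 103, 107, 109]     # pairwise-coprime party moduli
secret = 1234567
shares = share(secret, moduli)
print(shares, reconstruct(shares, moduli) == secret)   # True
```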
- A didactic approach to quantum machine learning with a single qubit [68.8204255655161]
We focus on the case of learning with a single qubit, using data re-uploading techniques.
We implement the different proposed formulations on toy and real-world datasets using the Qiskit quantum computing SDK (a minimal re-uploading circuit is sketched below).
arXiv Detail & Related papers (2022-11-23T18:25:32Z)
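Since the entry names Qiskit explicitly, here is a minimal single-qubit data
re-uploading sketch; the layer structure and parameter shapes are one common
choice, not necessarily the paper's.

```python
import numpy as np
from qiskit import QuantumCircuit
from qiskit.quantum_info import Pauli, Statevector

def reuploading_model(x, weights):
    # Data re-uploading: the input x is re-encoded in every layer,
    # interleaved with trainable rotations, all on a single qubit.
    qc = QuantumCircuit(1)
    for a, b, c in weights:            # one (a, b, c) triple per layer
        qc.ry(a * x + b, 0)            # data-dependent rotation
        qc.rz(c, 0)                    # trainable phase
    # Read out <Z> as the model's scalar prediction.
    return Statevector(qc).expectation_value(Pauli("Z")).real

rng = np.random.default_rng(3)
weights = rng.uniform(-np.pi, np.pi, size=(3, 3))   # 3 layers
print(reuploading_model(0.7, weights))  # train weights to fit labels
```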
- Towards Quantum-Enabled 6G Slicing [0.5156484100374059]
Quantum machine learning (QML) paradigms and their synergies with network slicing can be envisioned as a disruptive technology.
We propose a cloud-based federated learning framework based on quantum deep reinforcement learning (QDRL).
Specifically, the decision agents recast classical deep reinforcement learning (DRL) algorithms into variational quantum circuits (VQCs) to obtain optimal cooperative control over slice resources (a toy VQC Q-network is sketched below).
arXiv Detail & Related papers (2022-10-21T07:16:06Z)
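To illustrate "DRL recast into VQCs", here is a toy two-qubit variational
circuit standing in for a Q-network, simulated directly in NumPy, with
per-qubit Z expectations read out as Q-values; the circuit layout and the
epsilon-greedy wrapper are illustrative, not the paper's agent.

```python
import numpy as np

def ry(t):
    c, s = np.cos(t / 2), np.sin(t / 2)
    return np.array([[c, -s], [s, c]])

I2, Z = np.eye(2), np.diag([1.0, -1.0])
CNOT = np.array([[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 0, 1], [0, 0, 1, 0]])

def vqc_q_values(state, theta):
    # Two-qubit VQC standing in for a Q-network: angle-encode the
    # observation, apply a trainable layer plus an entangler, and read
    # one Q-value per action from the per-qubit <Z> expectations.
    psi = np.zeros(4); psi[0] = 1.0
    psi = np.kron(ry(state[0]), ry(state[1])) @ psi   # encoding layer
    psi = np.kron(ry(theta[0]), ry(theta[1])) @ psi   # variational layer
    psi = CNOT @ psi                                  # entangling gate
    return np.array([psi @ np.kron(Z, I2) @ psi,      # Q(s, a=0)
                     psi @ np.kron(I2, Z) @ psi])     # Q(s, a=1)

rng = np.random.default_rng(5)
theta = rng.uniform(-np.pi, np.pi, size=2)
q = vqc_q_values([0.3, -1.1], theta)
action = int(np.argmax(q)) if rng.random() > 0.1 else int(rng.integers(2))
print("Q-values:", q, "action:", action)              # epsilon-greedy choice
```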
- Deep Transfer Learning with Ridge Regression [7.843067454030999]
Deep models trained with massive amounts of data demonstrate promising generalisation ability on unseen data from relevant domains.
We address this issue by leveraging the low-rank property of learnt feature vectors produced by deep neural networks (DNNs) with the closed-form solution provided by kernel ridge regression (KRR); the closed form is sketched below.
Our method is successful on supervised and semi-supervised transfer learning tasks.
arXiv Detail & Related papers (2020-06-11T20:21:35Z)
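The closed-form KRR solution the paper builds on fits in a few lines; this
sketch uses a linear kernel on random vectors standing in for frozen DNN
features.

```python
import numpy as np

def krr_fit_predict(F_train, y_train, F_test, lam=1e-2):
    # Kernel ridge regression in closed form: alpha = (K + lam*I)^-1 y,
    # prediction = k(x, X_train) @ alpha, with a linear kernel K = F F^T.
    K = F_train @ F_train.T
    alpha = np.linalg.solve(K + lam * np.eye(len(K)), y_train)
    return (F_test @ F_train.T) @ alpha

# Stand-in for features extracted by a frozen pretrained DNN.
rng = np.random.default_rng(2)
F_train, F_test = rng.normal(size=(100, 64)), rng.normal(size=(10, 64))
w = rng.normal(size=64)
y_train = F_train @ w
print(np.allclose(krr_fit_predict(F_train, y_train, F_test),
                  F_test @ w, atol=1e-1))   # recovers the linear map
```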
- A Privacy-Preserving Distributed Architecture for Deep-Learning-as-a-Service [68.84245063902908]
This paper introduces a novel distributed architecture for deep-learning-as-a-service.
It preserves users' sensitive data while providing cloud-based machine learning and deep learning services.
arXiv Detail & Related papers (2020-03-30T15:12:03Z)
- Deep Learning for Ultra-Reliable and Low-Latency Communications in 6G Networks [84.2155885234293]
We first summarize how to apply data-driven supervised deep learning and deep reinforcement learning in URLLC.
To address the open problems that remain, we develop a multi-level architecture that enables device intelligence, edge intelligence, and cloud intelligence for URLLC.
arXiv Detail & Related papers (2020-02-22T14:38:11Z)
This list is automatically generated from the titles and abstracts of the papers in this site.