Exponential Quantum Communication Advantage in Distributed Inference and Learning
- URL: http://arxiv.org/abs/2310.07136v3
- Date: Thu, 26 Sep 2024 22:32:02 GMT
- Title: Exponential Quantum Communication Advantage in Distributed Inference and Learning
- Authors: Dar Gilboa, Hagay Michaeli, Daniel Soudry, Jarrod R. McClean
- Abstract summary: We present a framework for distributed computation over a quantum network.
We show that for models within this framework, inference and training using gradient descent can be performed with exponentially less communication.
We also show that models in this class can encode highly nonlinear features of their inputs, and their expressivity increases exponentially with model depth.
- Score: 19.827903766111987
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Training and inference with large machine learning models that far exceed the memory capacity of individual devices necessitates the design of distributed architectures, forcing one to contend with communication constraints. We present a framework for distributed computation over a quantum network in which data is encoded into specialized quantum states. We prove that for models within this framework, inference and training using gradient descent can be performed with exponentially less communication than their classical analogs, and with modest overhead relative to standard gradient-based methods. We show that certain graph neural networks are particularly amenable to implementation within this framework, and present empirical evidence that they perform well on standard benchmarks. To our knowledge, this is the first example of exponential quantum advantage for a generic class of machine learning problems that holds regardless of the data encoding cost. Moreover, we show that models in this class can encode highly nonlinear features of their inputs, and that their expressivity increases exponentially with model depth. We also delineate the space of models for which exponential communication advantages hold by showing that they cannot hold for linear classification. Our results can be combined with natural privacy advantages of the communicated quantum states, which limit the amount of information that can be extracted from them about the data and model parameters. Taken as a whole, these findings form a promising foundation for distributed machine learning over quantum networks.
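To make the communication claim concrete: amplitude encoding stores a d-dimensional vector in the amplitudes of ceil(log2 d) qubits, so transmitting the encoded state scales logarithmically in d rather than linearly. The numpy sketch below illustrates only this counting argument; it is not the paper's construction, and the function name is ours.

```python
import numpy as np

def amplitude_encode(x):
    """Encode a real vector as the amplitudes of a quantum state.

    A d-dimensional vector needs only ceil(log2(d)) qubits, which is
    the source of the logarithmic communication cost discussed above.
    (Illustrative only: preparing such a state on hardware has its own
    cost, and the paper's framework is more specific than this.)
    """
    d = len(x)
    n_qubits = int(np.ceil(np.log2(d)))
    padded = np.zeros(2 ** n_qubits)
    padded[:d] = x
    state = padded / np.linalg.norm(padded)  # unit-norm amplitudes
    return state, n_qubits

# A 1-million-dimensional activation vector fits in 20 qubits.
x = np.random.randn(10 ** 6)
state, n_qubits = amplitude_encode(x)
print(f"classical floats sent: {len(x)},  qubits sent: {n_qubits}")
```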
Related papers
- Expressive variational quantum circuits provide inherent privacy in federated learning [2.3255115473995134]
Federated learning has emerged as a viable solution to train machine learning models without the need to share data with the central aggregator.
Standard neural network-based federated learning models have been shown to be susceptible to data leakage from the gradients shared with the server.
We show that expressive maps lead to inherent privacy against gradient inversion attacks.
arXiv Detail & Related papers (2023-09-22T17:04:50Z)
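For context on the threat model in the paper above: in standard federated learning each client ships per-round gradients to the aggregator, and those gradients are exactly what gradient-inversion attacks reconstruct data from. A minimal, hypothetical FedAvg-style round with no privacy mechanism (all names are ours):

```python
import numpy as np

def local_gradient(w, X, y):
    """Least-squares gradient computed on one client's private data."""
    return 2 * X.T @ (X @ w - y) / len(y)

def federated_round(w, clients, lr=0.1):
    """One FedAvg-style round: clients share gradients, not raw data.

    The shared gradient vectors are what gradient-inversion attacks
    target; the paper above argues expressive quantum maps resist this.
    """
    grads = [local_gradient(w, X, y) for X, y in clients]
    return w - lr * np.mean(grads, axis=0)

rng = np.random.default_rng(0)
clients = [(rng.normal(size=(32, 4)), rng.normal(size=32)) for _ in range(3)]
w = np.zeros(4)
for _ in range(50):
    w = federated_round(w, clients)
print("trained weights:", w)
```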
- ShadowNet for Data-Centric Quantum System Learning [188.683909185536]
We propose a data-centric learning paradigm combining the strengths of neural-network protocols and classical shadows.
Capitalizing on the generalization power of neural networks, this paradigm can be trained offline and excels at predicting previously unseen systems.
We present the instantiation of our paradigm in quantum state tomography and direct fidelity estimation tasks and conduct numerical analysis up to 60 qubits.
arXiv Detail & Related papers (2023-08-22T09:11:53Z)
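Background on the classical-shadows ingredient above, in its simplest single-qubit form: measure in a uniformly random Pauli basis and invert the measurement channel via the snapshot rho_hat = 3 U^dagger |b><b| U - I; averaging Tr(O rho_hat) over snapshots gives an unbiased estimate of <O>. A self-contained sketch of that textbook protocol (not the ShadowNet pipeline itself):

```python
import numpy as np

I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
S = np.array([[1, 0], [0, 1j]])
# Unitaries rotating the X, Y, Z measurement bases into the Z basis.
BASES = [H, H @ S.conj().T, I2]

def shadow_estimate(rho, obs, n_shots, rng):
    """Estimate Tr(obs @ rho) from random-Pauli-basis measurements."""
    total = 0.0
    for _ in range(n_shots):
        U = BASES[rng.integers(3)]
        probs = np.clip(np.real(np.diag(U @ rho @ U.conj().T)), 0, None)
        b = rng.choice(2, p=probs / probs.sum())   # simulate one shot
        ket = U.conj().T[:, b]                     # U^dagger |b>
        snapshot = 3 * np.outer(ket, ket.conj()) - I2  # inverted channel
        total += np.real(np.trace(obs @ snapshot))
    return total / n_shots

rng = np.random.default_rng(1)
plus = np.array([1, 1]) / np.sqrt(2)               # |+> state
rho = np.outer(plus, plus.conj())
print("shadow <X> ~", shadow_estimate(rho, X, 20000, rng), "(exact: 1.0)")
```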
- A didactic approach to quantum machine learning with a single qubit [68.8204255655161]
We focus on the case of learning with a single qubit, using data re-uploading techniques.
We implement the proposed formulations on toy and real-world datasets using the Qiskit quantum computing SDK.
arXiv Detail & Related papers (2022-11-23T18:25:32Z)
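The data re-uploading idea referenced above interleaves data-dependent and trainable single-qubit rotations, |psi(x)> = Prod_l W(theta_l) U(x) |0>, and reads a class label off <Z>. The paper uses Qiskit; below is a hypothetical, dependency-light numpy sketch of the same circuit structure, with our own names and arbitrary parameters.

```python
import numpy as np

def ry(a):
    """Single-qubit Y rotation."""
    return np.array([[np.cos(a / 2), -np.sin(a / 2)],
                     [np.sin(a / 2),  np.cos(a / 2)]])

def reupload_expectation(x, thetas):
    """<Z> after L layers of Ry(theta_l) Ry(x) applied to |0>.

    Re-encoding x in every layer is what lets a single qubit represent
    nonlinear (Fourier-like) functions of the input.
    """
    psi = np.array([1.0, 0.0])            # |0>
    for theta in thetas:
        psi = ry(theta) @ ry(x) @ psi     # data layer, then trainable layer
    return psi[0] ** 2 - psi[1] ** 2      # <Z> = |amp0|^2 - |amp1|^2

thetas = np.array([0.3, -1.2, 0.7])       # 3 layers, untrained parameters
for x in (0.0, 0.5, 1.0):
    print(f"x={x:.1f}  <Z>={reupload_expectation(x, thetas):+.3f}")
```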
- Interpretable Quantum Advantage in Neural Sequence Learning [2.575030923243061]
We study the relative expressive power between a broad class of neural network sequence models and a class of recurrent models based on Gaussian operations with non-Gaussian measurements.
We pinpoint quantum contextuality as the source of an unconditional memory separation in the expressivity of the two model classes.
In doing so, we demonstrate that our introduced quantum models can outperform state-of-the-art classical models in practice.
arXiv Detail & Related papers (2022-09-28T18:34:04Z)
- Inducing Gaussian Process Networks [80.40892394020797]
We propose inducing Gaussian process networks (IGN), a simple framework for simultaneously learning the feature space as well as the inducing points.
The inducing points, in particular, are learned directly in the feature space, enabling a seamless representation of complex structured domains.
We report on experimental results for real-world data sets showing that IGNs provide significant advances over state-of-the-art methods.
arXiv Detail & Related papers (2022-04-21T05:27:09Z)
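For readers unfamiliar with the inducing-point machinery that IGNs build on: m << n inducing points give the classical Nystrom-style approximation K ~ K_nm K_mm^-1 K_mn, cutting GP regression cost from O(n^3) to O(n m^2); IGN's contribution is learning those points directly in a learned feature space. A generic sketch of the classical subset-of-regressors predictor (not the IGN algorithm; names and values are ours):

```python
import numpy as np

def rbf(A, B, ls=1.0):
    """RBF kernel matrix between the row vectors of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ls ** 2)

rng = np.random.default_rng(0)
Xtr = rng.uniform(-3, 3, size=(500, 1))
ytr = np.sin(Xtr[:, 0]) + 0.1 * rng.normal(size=500)
Zind = np.linspace(-3, 3, 15)[:, None]  # 15 inducing points (IGN learns these)

# Subset-of-regressors predictive mean: O(n m^2) instead of O(n^3).
Kmm = rbf(Zind, Zind) + 1e-6 * np.eye(15)
Knm = rbf(Xtr, Zind)
noise = 0.1 ** 2
A = noise * Kmm + Knm.T @ Knm           # m x m system
w = np.linalg.solve(A, Knm.T @ ytr)
Xte = np.array([[0.0], [1.5]])
mean = rbf(Xte, Zind) @ w
print("predictions at 0.0 and 1.5:", mean, "(sin values: 0.0, 0.997)")
```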
- Generalization Metrics for Practical Quantum Advantage in Generative Models [68.8204255655161]
Generative modeling is a widely accepted natural use case for quantum computers.
We construct a simple and unambiguous approach to probe practical quantum advantage for generative modeling by measuring the algorithm's generalization performance.
Our simulation results show that our quantum-inspired models have up to a $68\times$ enhancement in generating unseen unique and valid samples.
arXiv Detail & Related papers (2022-01-21T16:35:35Z)
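The metric sketched in the abstract above is straightforward to state concretely: draw samples from the trained model and score the fraction that are valid (satisfy the task's constraints), unique, and unseen (absent from the training set). A hypothetical implementation for a toy bitstring task, where "valid" means even parity (the task and names are ours, not the paper's benchmark):

```python
def generalization_scores(samples, train_set, is_valid):
    """Fraction of generated samples that are valid / unique / unseen.

    These are the kinds of quantities the paper above proposes for
    probing practical quantum advantage in generative models.
    """
    n = len(samples)
    valid = [s for s in samples if is_valid(s)]
    unique_valid = set(valid)
    unseen_valid = unique_valid - set(train_set)
    return {"valid": len(valid) / n,
            "unique": len(unique_valid) / n,
            "unseen": len(unseen_valid) / n}

# Toy task: 6-bit strings are "valid" when they have even parity.
is_valid = lambda s: s.count("1") % 2 == 0
train_set = ["000000", "000011", "001100", "110000"]
samples = ["000011", "000011", "010010", "111111", "101010", "011110"]
print(generalization_scores(samples, train_set, is_valid))
```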
- Quantum Self-Supervised Learning [22.953284192004034]
We propose a hybrid quantum-classical neural network architecture for contrastive self-supervised learning.
We apply our best quantum model to classify unseen images on the ibmq_paris quantum computer.
arXiv Detail & Related papers (2021-03-26T18:00:00Z)
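Whether the encoder is quantum or classical, the contrastive objective in self-supervised learning is the same: pull together the representations of two augmented views of an image and push apart everything else, typically via an InfoNCE-style loss. A minimal numpy sketch of that loss under the standard recipe (the hybrid encoder itself is outside the scope of this snippet):

```python
import numpy as np

def info_nce(z1, z2, tau=0.1):
    """InfoNCE loss for two batches of embeddings.

    z1[i] and z2[i] are representations of two augmented views of the
    same image (positives); all other pairings act as negatives.
    """
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / tau                 # cosine similarity / temperature
    logZ = np.log(np.exp(logits).sum(axis=1))
    return np.mean(logZ - np.diag(logits))   # -log softmax of the positives

rng = np.random.default_rng(0)
z = rng.normal(size=(8, 16))
noisy_views = z + 0.01 * rng.normal(size=z.shape)
print("aligned views:", info_nce(z, noisy_views))   # near-zero loss
print("random views :", info_nce(z, rng.normal(size=z.shape)))
```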
- Enhancing Generative Models via Quantum Correlations [1.6099403809839032]
Generative modeling using samples drawn from a probability distribution constitutes a powerful approach to unsupervised machine learning.
We show theoretically that quantum correlations provide a powerful resource for generative modeling.
We numerically test this separation on standard machine learning data sets and show that it holds for practical problems.
arXiv Detail & Related papers (2021-01-20T22:57:22Z)
- Generation of High-Resolution Handwritten Digits with an Ion-Trap Quantum Computer [55.41644538483948]
We implement a quantum-circuit based generative model to learn and sample the prior distribution of a Generative Adversarial Network.
We train this hybrid algorithm on an ion-trap device based on $^{171}$Yb$^{+}$ ion qubits to generate high-quality images.
arXiv Detail & Related papers (2020-12-07T18:51:28Z)
- Reservoir Memory Machines as Neural Computers [70.5993855765376]
Differentiable neural computers extend artificial neural networks with an explicit memory without interference.
We achieve some of the computational capabilities of differentiable neural computers with a model that can be trained very efficiently.
arXiv Detail & Related papers (2020-09-14T12:01:30Z)
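The efficiency claim in the last entry rests on the reservoir computing recipe: keep a fixed random recurrent "reservoir" and train only a linear readout, which reduces training to a single least-squares solve. A minimal echo-state sketch under that standard recipe (the paper's explicit memory mechanism is not reproduced here; all names are ours):

```python
import numpy as np

rng = np.random.default_rng(0)
n_res, n_in = 100, 1
W = rng.normal(size=(n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius < 1
W_in = rng.normal(size=(n_res, n_in))

def reservoir_states(u):
    """Run the fixed random reservoir over an input sequence u."""
    h = np.zeros(n_res)
    states = []
    for u_t in u:
        h = np.tanh(W @ h + W_in @ np.atleast_1d(u_t))
        states.append(h)
    return np.array(states)

# Task: predict the next value of a sine wave. Only the linear
# readout is trained -- one least-squares solve, no backprop.
t = np.linspace(0, 20, 400)
u, target = np.sin(t[:-1]), np.sin(t[1:])
S = reservoir_states(u)
w_out, *_ = np.linalg.lstsq(S, target, rcond=None)
print("train MSE:", np.mean((S @ w_out - target) ** 2))
```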
This list is automatically generated from the titles and abstracts of the papers indexed on this site.