Kernel Matrix Completion for Offline Quantum-Enhanced Machine Learning
- URL: http://arxiv.org/abs/2112.08449v1
- Date: Wed, 15 Dec 2021 19:44:39 GMT
- Title: Kernel Matrix Completion for Offline Quantum-Enhanced Machine Learning
- Authors: Annie Naveh, Imogen Fitzgerald, Anna Phan, Andrew Lockwood, and Travis
L. Scholten
- Abstract summary: We show quantum kernel matrices can be extended to incorporate new data using a classical (chordal-graph-based) matrix completion algorithm.
The minimal sample complexity needed for perfect completion is dependent on matrix rank.
On a real-world, industrially-relevant data set, the completion error behaves gracefully even when the minimal sample complexity is not reached.
- Score: 0.09786690381850353
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Enhancing classical machine learning (ML) algorithms through quantum kernels
is a rapidly growing research topic in quantum machine learning (QML). A key
challenge in using kernels -- both classical and quantum -- is that ML
workflows involve acquiring new observations, for which new kernel values need
to be calculated. Transferring data back-and-forth between where the new
observations are generated & a quantum computer incurs a time delay; this delay
may exceed the timescales relevant for using the QML algorithm in the first
place. In this work, we show quantum kernel matrices can be extended to
incorporate new data using a classical (chordal-graph-based) matrix completion
algorithm. The minimal sample complexity needed for perfect completion is
dependent on matrix rank. We empirically show that (a) quantum kernel matrices
can be completed using this algorithm when the minimal sample complexity is
met, (b) the error of the completion degrades gracefully in the presence of
finite-sampling noise, and (c) the rank of quantum kernel matrices depends
weakly on the expressibility of the quantum feature map generating the kernel.
Further, on a real-world, industrially-relevant data set, the completion error
behaves gracefully even when the minimal sample complexity is not reached.
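For intuition, here is a minimal Python sketch (hypothetical code, not the authors' implementation) of the kind of classical completion step the abstract describes: once a few bordering rows and columns of the enlarged kernel matrix have been evaluated on quantum hardware, the remaining block of a low-rank positive-semidefinite kernel matrix can be filled in classically. The closed form below covers only the simplest two-clique chordal pattern; the paper's chordal-graph-based algorithm handles general chordal patterns clique by clique, and all names in the snippet are illustrative.

```python
import numpy as np

# Illustrative sketch (hypothetical names): complete a missing off-diagonal
# block of a low-rank PSD kernel matrix. Partition the kernel over three
# index blocks,
#
#   K = [[K11, K12, K13],
#        [K21, K22, K23],
#        [K31, K32, K33]],
#
# and suppose only K13 and K31 = K13^T are unobserved. For this simple
# two-clique chordal pattern, the minimum-rank PSD completion has the
# closed form K31 = K32 @ pinv(K22) @ K21.

def complete_block(K21, K22, K32):
    """Minimum-rank PSD completion of the unobserved block K31."""
    return K32 @ np.linalg.pinv(K22) @ K21

# Toy check: a rank-2 kernel matrix is recovered exactly, provided the
# observed overlap block K22 also has rank 2.
rng = np.random.default_rng(0)
X = rng.normal(size=(9, 2))      # 9 points in a 2-dimensional feature space
K = X @ X.T                      # exact rank-2 kernel matrix

i1, i2, i3 = slice(0, 3), slice(3, 6), slice(6, 9)
K31_est = complete_block(K[i2, i1], K[i2, i2], K[i3, i2])
print(np.allclose(K31_est, K[i3, i1]))   # True
```

In the toy check, recovery is exact because the rank of the overlap block matches the rank of the full kernel matrix; this mirrors the abstract's point that the minimal sample complexity for perfect completion depends on the matrix rank.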
Related papers
- Efficient Learning for Linear Properties of Bounded-Gate Quantum Circuits [63.733312560668274]
Given a quantum circuit containing d tunable RZ gates and G-d Clifford gates, can a learner perform purely classical inference to efficiently predict its linear properties?
We prove that the sample complexity scaling linearly in d is necessary and sufficient to achieve a small prediction error, while the corresponding computational complexity may scale exponentially in d.
We devise a kernel-based learning model capable of trading off prediction error and computational complexity, transitioning from exponential to polynomial scaling in many practical settings.
arXiv Detail & Related papers (2024-08-22T08:21:28Z) - The curse of random quantum data [62.24825255497622]
We quantify the performance of quantum machine learning in the landscape of quantum data.
We find that the training efficiency and generalization capabilities in quantum machine learning will be exponentially suppressed with the increase in qubits.
Our findings apply to both the quantum kernel method and the large-width limit of quantum neural networks.
arXiv Detail & Related papers (2024-08-19T12:18:07Z) - QUACK: Quantum Aligned Centroid Kernel [0.0]
We introduce QUACK, a quantum kernel algorithm whose time complexity scales linearly with the number of samples during training.
Our algorithm is able to handle high-dimensional datasets such as MNIST with 784 features without any dimensionality reduction.
arXiv Detail & Related papers (2024-05-01T04:00:09Z) - Robust Dequantization of the Quantum Singular Value Transformation and Quantum Machine Learning Algorithms [0.0]
We show how many techniques from randomized linear algebra can be adapted to work under this weaker assumption.
We also apply these results to obtain a robust dequantization of many quantum machine learning algorithms.
arXiv Detail & Related papers (2023-04-11T02:09:13Z) - Quantum Clustering with k-Means: a Hybrid Approach [117.4705494502186]
We design, implement, and evaluate three hybrid quantum k-Means algorithms.
We exploit quantum phenomena to speed up the computation of distances.
We show that our hybrid quantum k-Means algorithms can be more efficient than the classical version.
arXiv Detail & Related papers (2022-12-13T16:04:16Z) - Quantum Worst-Case to Average-Case Reductions for All Linear Problems [66.65497337069792]
We study the problem of designing worst-case to average-case reductions for quantum algorithms.
We provide an explicit and efficient transformation of quantum algorithms that are only correct on a small fraction of their inputs into ones that are correct on all inputs.
arXiv Detail & Related papers (2022-12-06T22:01:49Z) - Noisy Quantum Kernel Machines [58.09028887465797]
An emerging class of quantum learning machines is based on the paradigm of quantum kernels.
We study how dissipation and decoherence affect their performance.
We show that decoherence and dissipation can be seen as an implicit regularization for the quantum kernel machines.
arXiv Detail & Related papers (2022-04-26T09:52:02Z) - Quantum Semi-Supervised Kernel Learning [4.726777092009554]
We present a quantum machine learning algorithm for training Semi-Supervised Kernel Support Vector Machines.
We show that it maintains the same speedup as the fully-supervised Quantum LS-SVM.
arXiv Detail & Related papers (2022-04-22T13:39:55Z) - Quantum Machine Learning For Classical Data [0.0]
We study the intersection of quantum computing and supervised machine learning algorithms.
In particular, we investigate to what extent quantum computers can be used to accelerate supervised machine learning algorithms.
arXiv Detail & Related papers (2021-05-08T12:11:44Z) - A rigorous and robust quantum speed-up in supervised machine learning [6.402634424631123]
In this paper, we establish a rigorous quantum speed-up for supervised classification using a general-purpose quantum learning algorithm.
Our quantum classifier is a conventional support vector machine that uses a fault-tolerant quantum computer to estimate a kernel function.
arXiv Detail & Related papers (2020-10-05T17:22:22Z) - Statistical Limits of Supervised Quantum Learning [90.0289160657379]
We show that if the bound on the accuracy is taken into account, quantum machine learning algorithms for supervised learning cannot achieve polylogarithmic runtimes in the input dimension.
We conclude that, when no further assumptions on the problem are made, quantum machine learning algorithms for supervised learning can have at most polynomial speedups over efficient classical algorithms.
arXiv Detail & Related papers (2020-01-28T17:35:32Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented (including all content) and is not responsible for any consequences of its use.