Exponential concentration in quantum kernel methods
- URL: http://arxiv.org/abs/2208.11060v2
- Date: Sun, 14 Apr 2024 18:47:09 GMT
- Title: Exponential concentration in quantum kernel methods
- Authors: Supanut Thanasilp, Samson Wang, M. Cerezo, Zoë Holmes
- Abstract summary: We study the performance of quantum kernel models from the perspective of resources needed to accurately estimate kernel values.
We identify four sources that can lead to concentration: expressivity of the data embedding, global measurements, entanglement, and noise.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Kernel methods in Quantum Machine Learning (QML) have recently gained significant attention as a potential candidate for achieving a quantum advantage in data analysis. Among other attractive properties, when training a kernel-based model one is guaranteed to find the optimal model parameters due to the convexity of the training landscape. However, this is based on the assumption that the quantum kernel can be efficiently obtained from quantum hardware. In this work we study the performance of quantum kernel models from the perspective of the resources needed to accurately estimate kernel values. We show that, under certain conditions, values of quantum kernels over different input data can be exponentially concentrated (in the number of qubits) towards some fixed value. Thus, when training with a polynomial number of measurements, one ends up with a trivial model whose predictions on unseen inputs are independent of the input data. We identify four sources that can lead to concentration: the expressivity of the data embedding, global measurements, entanglement, and noise. For each source, an associated concentration bound on quantum kernels is derived analytically. Lastly, we show that when dealing with classical data, training a parametrized data embedding with a kernel-alignment method is also susceptible to exponential concentration. Our results are verified through numerical simulations for several QML tasks. Altogether, we provide guidelines indicating which features should be avoided to ensure the efficient evaluation of quantum kernels and thus the performance of quantum kernel methods.
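To make the concentration statement above concrete, here is a minimal numerical sketch (our own illustration, not code from the paper) of the expressivity source: modeling a maximally expressive embedding with Haar-random statevectors, the fidelity kernel k(x, x') = |<phi(x)|phi(x')>|^2 has mean 1/2^n and a spread that shrinks exponentially with the number of qubits n.

```python
# Minimal sketch (assumption: a maximally expressive embedding behaves like
# Haar-random states): fidelity kernel values concentrate toward 1/2^n.
import numpy as np

rng = np.random.default_rng(0)

def haar_state(dim):
    """Sample a Haar-random pure state as a normalized complex vector."""
    v = rng.normal(size=dim) + 1j * rng.normal(size=dim)
    return v / np.linalg.norm(v)

for n_qubits in range(2, 11, 2):
    dim = 2 ** n_qubits
    overlaps = [abs(np.vdot(haar_state(dim), haar_state(dim))) ** 2
                for _ in range(200)]
    print(f"n = {n_qubits:2d}: mean k = {np.mean(overlaps):.2e} "
          f"(1/2^n = {1/dim:.2e}), std = {np.std(overlaps):.2e}")
```

Both the mean and the standard deviation shrink as 1/2^n, so resolving different kernel values from one another quickly exceeds any polynomial measurement budget.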
Related papers
- The curse of random quantum data [62.24825255497622]
We quantify the performance of quantum machine learning in the landscape of quantum data.
We find that training efficiency and generalization capabilities in quantum machine learning are exponentially suppressed as the number of qubits increases.
Our findings apply to both the quantum kernel method and the large-width limit of quantum neural networks.
arXiv Detail & Related papers (2024-08-19T12:18:07Z)
- In Search of Quantum Advantage: Estimating the Number of Shots in Quantum Kernel Methods [30.565491081930997]
We develop an approach for estimating the precision required of kernel values, which is then translated into the number of circuit runs (a back-of-envelope sketch follows this entry).
We stress that quantum kernel methods should be considered not only from the machine learning performance perspective, but also in the context of resource consumption.
arXiv Detail & Related papers (2024-07-22T16:29:35Z)
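As a hedged back-of-envelope companion to the entry above (our own sketch, not the paper's estimator): a kernel value in [0, 1] estimated as the mean of N single-shot outcomes obeys Hoeffding's inequality, so N >= ln(2/delta) / (2 eps^2) shots suffice for additive error eps with failure probability at most delta. If concentration forces eps ~ 1/2^n, the shot budget grows exponentially.

```python
# Back-of-envelope sketch (assumption: kernel estimated as a mean of
# bounded single-shot outcomes, so Hoeffding's inequality applies).
import math

def shots_needed(eps, delta=0.05):
    """N >= ln(2/delta) / (2 eps^2) guarantees |k_hat - k| <= eps w.p. >= 1 - delta."""
    return math.ceil(math.log(2 / delta) / (2 * eps ** 2))

# If kernel values concentrate at scale ~1/2^n, resolving them needs eps of
# that order, and the required shot count explodes with the qubit number:
for n_qubits in (4, 8, 12, 16):
    eps = 2.0 ** (-n_qubits)
    print(f"n = {n_qubits:2d}: need ~{shots_needed(eps):.3e} shots")
```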
- QUACK: Quantum Aligned Centroid Kernel [0.0]
We introduce QUACK, a quantum kernel algorithm whose time complexity scales linearly with the number of samples during training.
Our algorithm is able to handle high-dimensional datasets such as MNIST with 784 features without any dimensionality reduction.
arXiv Detail & Related papers (2024-05-01T04:00:09Z)
- Power Characterization of Noisy Quantum Kernels [52.47151453259434]
We show that noise may leave quantum kernel methods with only poor prediction capability, even when the generalization error is small.
We provide a crucial warning about employing noisy quantum kernel methods for quantum computation (a toy noise model follows this entry).
arXiv Detail & Related papers (2024-01-31T01:02:16Z)
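One simple way to see how hardware noise flattens kernel values (our own toy model, not this paper's analysis; the parameters p, L, and k_ideal below are made up): L layers of global depolarizing noise of strength p contract each state toward the maximally mixed state, so the kernel Tr[rho(x) rho(x')] contracts toward the trivial value 1/2^n exponentially fast in the depth L.

```python
# Toy model (assumption: global depolarizing noise, rho -> a*rho + (1-a)*I/d
# with survival weight a = (1-p)^L after L noisy layers).
def noisy_kernel(k_ideal, n_qubits, p, layers):
    d = 2 ** n_qubits
    a = (1 - p) ** layers  # weight remaining on the ideal state
    # Tr[(a rho + (1-a) I/d)(a sigma + (1-a) I/d)] expanded with Tr[rho] = 1:
    return a ** 2 * k_ideal + (2 * a * (1 - a) + (1 - a) ** 2) / d

for layers in (1, 10, 50, 100):
    k = noisy_kernel(k_ideal=0.8, n_qubits=4, p=0.02, layers=layers)
    print(f"L = {layers:3d}: k_noisy = {k:.4f}  (trivial value 1/2^n = {1/16:.4f})")
```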
- Neural auto-designer for enhanced quantum kernels [59.616404192966016]
We present a data-driven approach that automates the design of problem-specific quantum feature maps.
Our work highlights the substantial role of deep learning in advancing quantum machine learning.
arXiv Detail & Related papers (2024-01-20T03:11:59Z)
- Importance sampling for stochastic quantum simulations [68.8204255655161]
The qDrift protocol builds random product formulas by sampling terms from the Hamiltonian according to their coefficients (a toy sampler sketch follows this entry).
We show that the simulation cost can be reduced while achieving the same accuracy by accounting for the individual simulation costs during the sampling stage.
Results are confirmed by numerical simulations performed on a lattice nuclear effective field theory.
arXiv Detail & Related papers (2022-12-12T15:06:32Z)
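Here is a minimal rendering of the qDrift idea referenced above (our own toy example, not the paper's code; the 2-qubit Hamiltonian and its coefficients are made up): terms are drawn with probability proportional to the magnitude of their coefficients, and each draw contributes a fixed-angle factor to a random product formula.

```python
# Toy qDrift-style sampler: approximate exp(-iHt) for H = sum_j h_j P_j by a
# random product formula whose factors are drawn with probability |h_j|/lambda.
import numpy as np
from scipy.linalg import expm

X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)

terms = [np.kron(X, I2), np.kron(I2, X), np.kron(Z, Z)]  # toy 2-qubit Hamiltonian
coeffs = np.array([0.9, 0.3, 0.5])                       # made-up coefficients

def qdrift_unitary(t, n_samples, rng):
    lam = np.abs(coeffs).sum()
    tau = lam * t / n_samples  # fixed rotation angle per sampled term
    U = np.eye(4, dtype=complex)
    for j in rng.choice(len(terms), size=n_samples, p=np.abs(coeffs) / lam):
        P = np.sign(coeffs[j]) * terms[j]
        # P^2 = I, so exp(-i tau P) = cos(tau) I - i sin(tau) P
        U = (np.cos(tau) * np.eye(4) - 1j * np.sin(tau) * P) @ U
    return U

rng = np.random.default_rng(1)
exact = expm(-1j * sum(h * P for h, P in zip(coeffs, terms)) * 0.5)
approx = qdrift_unitary(t=0.5, n_samples=2000, rng=rng)
# Single realizations fluctuate; the qDrift guarantee holds for the channel
# averaged over realizations, with error O((lambda * t)^2 / n_samples).
print("operator-norm deviation:", np.linalg.norm(exact - approx, 2))
```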
- Noisy Quantum Kernel Machines [58.09028887465797]
An emerging class of quantum learning machines is based on the paradigm of quantum kernels.
We study how dissipation and decoherence affect their performance.
We show that decoherence and dissipation can be seen as an implicit regularization for quantum kernel machines.
arXiv Detail & Related papers (2022-04-26T09:52:02Z)
- Training Quantum Embedding Kernels on Near-Term Quantum Computers [0.08563354084119063]
Quantum embedding kernels (QEKs), constructed by embedding data into the Hilbert space of a quantum computer, are a particular quantum kernel technique (a minimal statevector example follows this entry).
We first provide an accessible introduction to quantum embedding kernels and then analyze the practical issues that arise when realizing them on a noisy near-term quantum computer.
arXiv Detail & Related papers (2021-05-05T18:41:13Z)
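A minimal statevector rendering of a quantum embedding kernel (our own toy example, not the paper's implementation; the RY-product embedding and the sample features are illustrative assumptions): each feature x_i is embedded by a single-qubit rotation RY(x_i), and the kernel is k(x, x') = |<phi(x)|phi(x')>|^2.

```python
# Toy quantum embedding kernel, evaluated by exact statevector simulation.
import numpy as np
from functools import reduce

def ry(theta):
    """Single-qubit RY rotation (real-valued, so overlaps need no conjugation)."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def embed(x):
    """|phi(x)> = (RY(x_1) (x) ... (x) RY(x_n)) |0...0>."""
    qubit_states = [ry(xi) @ np.array([1.0, 0.0]) for xi in x]
    return reduce(np.kron, qubit_states)

def kernel(x1, x2):
    return abs(np.dot(embed(x1), embed(x2))) ** 2

X_data = np.array([[0.1, 0.7], [0.5, 0.2], [1.2, 0.9]])  # made-up features
gram = np.array([[kernel(a, b) for b in X_data] for a in X_data])
print(np.round(gram, 3))  # symmetric Gram matrix with unit diagonal
```

A product embedding like this one is deliberately inexpressive, which is exactly why its kernel values stay well separated; the deeper, more entangling embeddings discussed in the main paper are the ones prone to concentration.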
- Towards understanding the power of quantum kernels in the NISQ era [79.8341515283403]
We show that the advantage of quantum kernels vanishes for large datasets, small numbers of measurements, and large system noise.
Our work provides theoretical guidance for exploring advanced quantum kernels to attain quantum advantages on NISQ devices.
arXiv Detail & Related papers (2021-03-31T02:41:36Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented (including all content) and is not responsible for any consequences of its use.