Expressibility-induced Concentration of Quantum Neural Tangent Kernels
- URL: http://arxiv.org/abs/2311.04965v1
- Date: Wed, 8 Nov 2023 19:00:01 GMT
- Title: Expressibility-induced Concentration of Quantum Neural Tangent Kernels
- Authors: Li-Wei Yu, Weikang Li, Qi Ye, Zhide Lu, Zizhao Han, Dong-Ling Deng
- Abstract summary: We study the connections between the trainability and expressibility of quantum tangent kernel models.
For global loss functions, we rigorously prove that high expressibility of both the global and local quantum encodings can lead to exponential concentration of quantum tangent kernel values to zero.
Our discoveries unveil a pivotal characteristic of quantum neural tangent kernels, offering valuable insights for the design of wide quantum variational circuit models.
- Score: 4.561685127984694
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Quantum tangent kernel methods provide an efficient approach to analyzing the
performance of quantum machine learning models in the infinite-width limit,
which is of crucial importance in designing appropriate circuit architectures
for certain learning tasks. Recently, they have been adapted to describe the
convergence rate of training errors in quantum neural networks in an analytical
manner. Here, we study the connections between the trainability and
expressibility of quantum tangent kernel models. In particular, for global loss
functions, we rigorously prove that high expressibility of both the global and
local quantum encodings can lead to exponential concentration of quantum
tangent kernel values to zero. For local loss functions, this issue of
exponential concentration persists owing to the high expressibility, but it can
be partially mitigated. We further carry out extensive numerical simulations to
support our analytical theories. Our discoveries unveil a pivotal
characteristic of quantum neural tangent kernels, offering valuable insights
for the design of wide quantum variational circuit models in practical
applications.
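The concentration effect described in the abstract can be observed numerically. Below is a minimal, self-contained sketch (not the paper's code; the hardware-efficient RY/CZ ansatz, the global Z⊗...⊗Z observable, and all sizes are illustrative assumptions) that estimates the diagonal quantum tangent kernel value K(θ) = Σ_k (∂f/∂θ_k)² for random parameters via the parameter-shift rule, and shows the mean value shrinking as the qubit count grows:

```python
import numpy as np

def ry(theta):
    """Single-qubit RY rotation matrix."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def apply_1q(state, gate, q, n):
    """Apply a 2x2 gate to qubit q of an n-qubit statevector."""
    state = state.reshape([2] * n)
    state = np.tensordot(gate, state, axes=([1], [q]))
    state = np.moveaxis(state, 0, q)
    return state.reshape(-1)

def apply_cz(state, q1, q2, n):
    """Apply a controlled-Z gate between qubits q1 and q2."""
    state = state.reshape([2] * n).copy()
    idx = [slice(None)] * n
    idx[q1], idx[q2] = 1, 1
    state[tuple(idx)] *= -1
    return state.reshape(-1)

def circuit_state(thetas, n, layers):
    """Hardware-efficient ansatz: RY layers alternating with CZ chains."""
    state = np.zeros(2 ** n)
    state[0] = 1.0
    t = thetas.reshape(layers, n)
    for l in range(layers):
        for q in range(n):
            state = apply_1q(state, ry(t[l, q]), q, n)
        for q in range(n - 1):
            state = apply_cz(state, q, q + 1, n)
    return state

def global_z_expectation(state, n):
    """Global observable <Z x Z x ... x Z>: parity-signed probabilities."""
    signs = np.array([(-1) ** bin(i).count("1") for i in range(2 ** n)])
    return float(np.sum(signs * np.abs(state) ** 2))

def qntk_diag(thetas, n, layers):
    """Diagonal tangent-kernel value K = sum_k (df/dtheta_k)^2,
    with gradients from the parameter-shift rule (exact for RY)."""
    grads = []
    for k in range(len(thetas)):
        tp, tm = thetas.copy(), thetas.copy()
        tp[k] += np.pi / 2
        tm[k] -= np.pi / 2
        fp = global_z_expectation(circuit_state(tp, n, layers), n)
        fm = global_z_expectation(circuit_state(tm, n, layers), n)
        grads.append(0.5 * (fp - fm))
    return float(np.sum(np.array(grads) ** 2))

rng = np.random.default_rng(0)
layers = 4
for n in (2, 4, 6):
    vals = [qntk_diag(rng.uniform(0, 2 * np.pi, layers * n), n, layers)
            for _ in range(10)]
    print(f"n={n}: mean K = {np.mean(vals):.5f}")
```

With the global observable, the average kernel value drops quickly with the number of qubits, consistent with the concentration-toward-zero behavior the abstract proves for highly expressive encodings.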
Related papers
- Efficient Learning for Linear Properties of Bounded-Gate Quantum Circuits [63.733312560668274]
Given a quantum circuit containing d tunable RZ gates and G-d Clifford gates, can a learner perform purely classical inference to efficiently predict its linear properties?
We prove that the sample complexity scaling linearly in d is necessary and sufficient to achieve a small prediction error, while the corresponding computational complexity may scale exponentially in d.
We devise a kernel-based learning model capable of trading off prediction error and computational complexity, transitioning from exponential to polynomial scaling in many practical settings.
arXiv Detail & Related papers (2024-08-22T08:21:28Z) - Quantum-enhanced learning with a controllable bosonic variational sensor network [0.40964539027092906]
We propose a scheme for supervised learning assisted by an entangled sensor network (SLAEN) capable of handling nonlinear data classification tasks.
We uncover a threshold phenomenon in the classification error: when the energy of the probes exceeds a certain threshold, the error drops drastically to zero.
arXiv Detail & Related papers (2024-04-28T19:41:40Z) - Neural auto-designer for enhanced quantum kernels [59.616404192966016]
We present a data-driven approach that automates the design of problem-specific quantum feature maps.
Our work highlights the substantial role of deep learning in advancing quantum machine learning.
arXiv Detail & Related papers (2024-01-20T03:11:59Z) - Coreset selection can accelerate quantum machine learning models with
provable generalization [6.733416056422756]
Quantum neural networks (QNNs) and quantum kernels stand as prominent figures in the realm of quantum machine learning.
We present coreset selection as a unified approach to expediting the training of QNNs and quantum kernels.
arXiv Detail & Related papers (2023-09-19T08:59:46Z) - The Quantum Path Kernel: a Generalized Quantum Neural Tangent Kernel for
Deep Quantum Machine Learning [52.77024349608834]
Building a quantum analog of classical deep neural networks represents a fundamental challenge in quantum computing.
A key issue is how to address the inherent non-linearity of classical deep learning.
We introduce the Quantum Path Kernel, a quantum machine learning formulation capable of replicating these non-linear aspects of deep learning.
arXiv Detail & Related papers (2022-12-22T16:06:24Z) - QuanGCN: Noise-Adaptive Training for Robust Quantum Graph Convolutional
Networks [124.7972093110732]
We propose quantum graph convolutional networks (QuanGCN), which learn the local message passing among nodes through sequences of crossing-gate quantum operations.
To mitigate the inherent noise of modern quantum devices, we apply a sparse constraint to sparsify the nodes' connections.
Our QuanGCN is functionally comparable to, or even superior to, classical algorithms on several benchmark graph datasets.
arXiv Detail & Related papers (2022-11-09T21:43:16Z) - Noisy Quantum Kernel Machines [58.09028887465797]
An emerging class of quantum learning machines is based on the paradigm of quantum kernels.
We study how dissipation and decoherence affect their performance.
We show that decoherence and dissipation can be seen as an implicit regularization for the quantum kernel machines.
arXiv Detail & Related papers (2022-04-26T09:52:02Z) - Analytic theory for the dynamics of wide quantum neural networks [7.636414695095235]
We study the dynamics of gradient descent for the training error of a class of variational quantum machine learning models.
For random quantum circuits, we predict and characterize an exponential decay of the residual training error as a function of the parameters of the system.
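The predicted exponential decay can be illustrated with a toy frozen-kernel model (my sketch, not the paper's construction; the kernel matrix, learning rate, and residual are illustrative assumptions): when the tangent kernel K stays approximately fixed during training, gradient descent on a squared loss updates the residual as r ← (I − ηK) r, so each kernel eigenmode, and hence the training error, decays exponentially:

```python
import numpy as np

# Frozen-kernel (lazy-training) toy model: gradient descent on a squared
# loss with a fixed positive-definite tangent kernel K gives the residual
# update r_{t+1} = (I - eta*K) r_t, i.e. exponential decay per eigenmode.
K = np.array([[2.0, 0.5],
              [0.5, 1.0]])      # illustrative positive-definite kernel matrix
eta = 0.1                       # learning rate; must satisfy eta < 2 / lambda_max
r = np.array([1.0, -1.0])       # initial training residual
history = [np.linalg.norm(r)]
for _ in range(200):
    r = r - eta * (K @ r)       # one gradient-flow step under the frozen kernel
    history.append(np.linalg.norm(r))
print(history[0], history[-1])  # residual shrinks by many orders of magnitude
```

The decay rate of each mode is set by the corresponding kernel eigenvalue, which is why kernel concentration (small eigenvalues) translates directly into slow training.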
arXiv Detail & Related papers (2022-03-30T23:24:06Z) - Representation Learning via Quantum Neural Tangent Kernels [10.168123455922249]
Variational quantum circuits are used in quantum machine learning and variational quantum simulation tasks.
Here we analyze variational quantum circuits using the theory of neural tangent kernels.
We analytically solve the dynamics in the frozen limit, or lazy training regime, where variational angles change slowly and a linear perturbation is good enough.
arXiv Detail & Related papers (2021-11-08T01:30:34Z) - The Hintons in your Neural Network: a Quantum Field Theory View of Deep
Learning [84.33745072274942]
We show how to represent linear and non-linear layers as unitary quantum gates, and interpret the fundamental excitations of the quantum model as particles.
On top of opening a new perspective and techniques for studying neural networks, the quantum formulation is well suited for optical quantum computing.
arXiv Detail & Related papers (2021-03-08T17:24:29Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.