Deep Quantum Neural Networks are Gaussian Process
- URL: http://arxiv.org/abs/2305.12664v1
- Date: Mon, 22 May 2023 03:07:43 GMT
- Title: Deep Quantum Neural Networks are Gaussian Process
- Authors: Ali Rad
- Abstract summary: We present a framework to examine the impact of finite width on the closed-form relationship using a $1/d$ expansion.
We elucidate the relationship between the GP and its parameter-space equivalent, characterized by the Quantum Neural Tangent Kernel (QNTK).
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The overparameterization of variational quantum circuits, as a
model of Quantum Neural Networks (QNNs), not only improves their trainability
but also serves as a way to assess the properties of a given ansatz by
studying its kernel behavior in this regime. In this study, we shift
perspective from the traditional viewpoint of training in parameter space to
training in function space, by employing Bayesian inference in the Reproducing
Kernel Hilbert Space (RKHS). We observe that initializing the parameters from
a Haar-random distribution makes the QNN behave like a Gaussian Process
(QNN-GP) at large width or, empirically, at large depth. This mirrors the
behavior of classical neural networks with Gaussian initialization in the
analogous limits. Moreover, we present a framework for examining the impact of
finite width on this closed-form relationship via a $1/d$ expansion, where $d$
is the dimension of the circuit's Hilbert space. Deviations from Gaussian
output can be monitored by introducing new quantum meta-kernels. Furthermore,
we elucidate the relationship between the GP description and its
parameter-space counterpart, characterized by the Quantum Neural Tangent
Kernel (QNTK). This study offers a systematic, perturbation-based way to study
QNN behavior in over- and under-parameterized regimes, and addresses the
limitations of tracking gradient descent through higher-order corrections such
as dQNTK and ddQNTK. Additionally, this probabilistic viewpoint lends itself
naturally to accommodating noise within our model.
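The central claim above can be checked numerically with a small experiment. The sketch below is not code from the paper; the observable, circuit model, and sample counts are illustrative assumptions. It samples Haar-random unitaries via QR decomposition of complex Ginibre matrices, treats $f(U)=\langle 0|U^\dagger O U|0\rangle$ as the QNN output for a fixed traceless observable $O$, and checks that the output distribution concentrates toward a Gaussian with variance of order $1/d$ as the Hilbert-space dimension $d$ grows.

```python
import numpy as np

def haar_unitary(d, rng):
    """Sample a d x d Haar-random unitary via QR of a complex Ginibre matrix."""
    z = (rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d))) / np.sqrt(2.0)
    q, r = np.linalg.qr(z)
    phases = np.diag(r) / np.abs(np.diag(r))
    return q * phases  # rescale columns so the distribution is exactly Haar

def qnn_output(u, obs):
    """Toy 'network output' f(U) = <0| U^dag O U |0> on the |0...0> state."""
    psi = u[:, 0]  # U|0> is the first column of U
    return np.real(psi.conj() @ obs @ psi)

rng = np.random.default_rng(0)
for n_qubits in (2, 4, 6):
    d = 2 ** n_qubits
    # Fixed traceless observable: Pauli-Z on the first qubit.
    obs = np.kron(np.diag([1.0, -1.0]), np.eye(d // 2))
    samples = np.array([qnn_output(haar_unitary(d, rng), obs)
                        for _ in range(2000)])
    std = samples.std()
    excess_kurtosis = np.mean(((samples - samples.mean()) / std) ** 4) - 3.0
    print(f"d={d:3d}  mean={samples.mean():+.4f}  var={samples.var():.5f}  "
          f"1/(d+1)={1.0 / (d + 1):.5f}  excess kurtosis={excess_kurtosis:+.3f}")
```

For this particular observable, Haar-random states give mean $\mathrm{Tr}(O)/d = 0$ and variance exactly $1/(d+1)$, so the printed variance should track $1/(d+1)$ and the excess kurtosis should shrink toward zero as $d$ grows; the residual finite-$d$ deviations are the kind of corrections the paper's $1/d$ expansion and quantum meta-kernels are designed to capture.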
Related papers
- Fourier Neural Operators for Learning Dynamics in Quantum Spin Systems [77.88054335119074]
We use FNOs to model the evolution of random quantum spin systems.
We apply FNOs to a compact set of Hamiltonian observables instead of the entire $2^n$ quantum wavefunction.
arXiv Detail & Related papers (2024-09-05T07:18:09Z)
- Neural Networks Asymptotic Behaviours for the Resolution of Inverse Problems [0.0]
This paper presents a study of the effectiveness of Neural Network (NN) techniques for deconvolution inverse problems.
We consider limits of NNs, corresponding to Gaussian Processes (GPs), where non-linearities in the parameters of the NN can be neglected.
We address the deconvolution inverse problem in the case of a quantum harmonic oscillator simulated through Monte Carlo techniques on a lattice.
arXiv Detail & Related papers (2024-02-14T17:42:24Z)
- Wide Neural Networks as Gaussian Processes: Lessons from Deep Equilibrium Models [16.07760622196666]
We study the deep equilibrium model (DEQ), an infinite-depth neural network with shared weight matrices across layers.
Our analysis reveals that as the width of DEQ layers approaches infinity, it converges to a Gaussian process.
Remarkably, this convergence holds even when the limits of depth and width are interchanged.
arXiv Detail & Related papers (2023-10-16T19:00:43Z)
- Deep quantum neural networks form Gaussian processes [0.0]
We prove an analogous result for Quantum Neural Networks (QNNs).
We show that the outputs of certain models based on Haar-random unitaries, or of deep QNNs, converge to Gaussian processes in the limit of large Hilbert space dimension $d$.
arXiv Detail & Related papers (2023-05-17T05:32:45Z)
- Analyzing Convergence in Quantum Neural Networks: Deviations from Neural Tangent Kernels [20.53302002578558]
A quantum neural network (QNN) is a parameterized mapping efficiently implementable on near-term Noisy Intermediate-Scale Quantum (NISQ) computers.
Despite the existing empirical and theoretical investigations, the convergence of QNN training is not fully understood.
arXiv Detail & Related papers (2023-03-26T22:58:06Z)
- Gradient Descent in Neural Networks as Sequential Learning in RKBS [63.011641517977644]
We construct an exact power-series representation of the neural network in a finite neighborhood of the initial weights.
We prove that, regardless of width, the training sequence produced by gradient descent can be exactly replicated by regularized sequential learning.
arXiv Detail & Related papers (2023-02-01T03:18:07Z)
- The Quantum Path Kernel: a Generalized Quantum Neural Tangent Kernel for Deep Quantum Machine Learning [52.77024349608834]
Building a quantum analog of classical deep neural networks represents a fundamental challenge in quantum computing.
A key issue is how to address the inherent non-linearity of classical deep learning.
We introduce the Quantum Path Kernel, a formulation of quantum machine learning capable of replicating those aspects of deep machine learning.
arXiv Detail & Related papers (2022-12-22T16:06:24Z)
- Interrelation of equivariant Gaussian processes and convolutional neural networks [77.34726150561087]
There is currently a promising new trend in machine learning (ML) based on the relationship between neural networks (NNs) and Gaussian processes (GPs).
In this work we establish a relationship between the many-channel limit of CNNs that are equivariant with respect to the two-dimensional Euclidean group, with vector-valued neuron activations, and the corresponding independently introduced equivariant Gaussian processes.
arXiv Detail & Related papers (2022-09-17T17:02:35Z)
- Symmetric Pruning in Quantum Neural Networks [111.438286016951]
Quantum neural networks (QNNs) harness the power of modern quantum machines.
QNNs with handcrafted symmetric ansatzes generally experience better trainability than those with asymmetric ansatzes.
We propose the effective quantum neural tangent kernel (EQNTK) to quantify the convergence of QNNs towards the global optima.
arXiv Detail & Related papers (2022-08-30T08:17:55Z)
- Chaos and Complexity from Quantum Neural Network: A study with Diffusion Metric in Machine Learning [0.0]
We study the phenomena of quantum chaos and complexity in the machine learning dynamics of Quantum Neural Networks (QNNs).
We employ a statistical and differential geometric approach to study the learning theory of QNN.
arXiv Detail & Related papers (2020-11-16T10:41:47Z)
- Kernel and Rich Regimes in Overparametrized Models [69.40899443842443]
We show that gradient descent on overparametrized multilayer networks can induce rich implicit biases that are not RKHS norms.
We also demonstrate this transition empirically for more complex matrix factorization models and multilayer non-linear networks.
arXiv Detail & Related papers (2020-02-20T15:43:02Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the information and is not responsible for any consequences of its use.