Effects of noise on the overparametrization of quantum neural networks
- URL: http://arxiv.org/abs/2302.05059v2
- Date: Mon, 26 Feb 2024 23:10:53 GMT
- Title: Effects of noise on the overparametrization of quantum neural networks
- Authors: Diego García-Martín, Martin Larocca, M. Cerezo
- Abstract summary: We show that noise can "turn on" previously-zero eigenvalues of the QFIM.
Our results imply that current QNN capacity measures are ill-defined when hardware noise is present.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Overparametrization is one of the most surprising and notorious phenomena in
machine learning. Recently, there have been several efforts to study if, and
how, Quantum Neural Networks (QNNs) acting in the absence of hardware noise can
be overparametrized. In particular, it has been proposed that a QNN can be
defined as overparametrized if it has enough parameters to explore all
available directions in state space. That is, if the rank of the Quantum Fisher
Information Matrix (QFIM) for the QNN's output state is saturated. Here, we
explore how the presence of noise affects the overparametrization phenomenon.
Our results show that noise can "turn on" previously-zero eigenvalues of the
QFIM. This enables the parametrized state to explore directions that were
otherwise inaccessible, thus potentially turning an overparametrized QNN into
an underparametrized one. For small noise levels, the QNN is
quasi-overparametrized, as large eigenvalues coexist with small ones. Then, we
prove that as the magnitude of noise increases all the eigenvalues of the QFIM
become exponentially suppressed, indicating that the state becomes insensitive
to any change in the parameters. As such, there is a pull-and-tug effect where
noise can enable new directions, but also suppress the sensitivity to parameter
updates. Finally, our results imply that current QNN capacity measures are
ill-defined when hardware noise is present.
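The QFIM rank criterion described in the abstract can be illustrated numerically. The sketch below is an illustration, not code from the paper: it builds a hypothetical one-qubit circuit with two redundant RZ rotations (noiselessly only their sum matters, so one QFIM eigenvalue is exactly zero), inserts a bit-flip channel between them, and evaluates the mixed-state QFIM, F_ij = 2 Σ_{kl} Re(⟨k|∂_i ρ|l⟩⟨l|∂_j ρ|k⟩)/(λ_k + λ_l), with finite-difference derivatives. Noise "turns on" the zero eigenvalue, while a global depolarizing channel suppresses all eigenvalues. All function names are illustrative choices.

```python
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)
I2 = np.eye(2, dtype=complex)

def rz(theta):
    """Single-qubit Z rotation."""
    return np.diag([np.exp(-1j * theta / 2), np.exp(1j * theta / 2)])

def state(theta, p, q=0.0):
    """rho(theta): |+> -> RZ(th1) -> bit-flip(p) -> RZ(th2) -> depolarizing(q).
    At p = q = 0 only th1 + th2 matters, so one QFIM direction is flat."""
    psi = np.array([1, 1], dtype=complex) / np.sqrt(2)   # |+>
    rho = np.outer(psi, psi.conj())
    rho = rz(theta[0]) @ rho @ rz(theta[0]).conj().T
    rho = (1 - p) * rho + p * X @ rho @ X                # bit-flip channel
    rho = rz(theta[1]) @ rho @ rz(theta[1]).conj().T
    return (1 - q) * rho + q * I2 / 2                    # global depolarizing

def qfim_eigenvalues(theta, p, q=0.0, eps=1e-6):
    """Eigenvalues of the mixed-state QFIM,
    F_ij = 2 sum_kl Re(<k|d_i rho|l><l|d_j rho|k>) / (lam_k + lam_l),
    with the derivatives of rho taken by central finite differences."""
    n = len(theta)
    d = []
    for i in range(n):
        tp = np.array(theta, dtype=float); tp[i] += eps
        tm = np.array(theta, dtype=float); tm[i] -= eps
        d.append((state(tp, p, q) - state(tm, p, q)) / (2 * eps))
    lam, vec = np.linalg.eigh(state(theta, p, q))
    F = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            s = 0.0
            for k in range(2):
                for l in range(2):
                    if lam[k] + lam[l] > 1e-10:          # skip singular pairs
                        a = vec[:, k].conj() @ d[i] @ vec[:, l]
                        b = vec[:, l].conj() @ d[j] @ vec[:, k]
                        s += (a * b).real / (lam[k] + lam[l])
            F[i, j] = 2 * s
    return np.linalg.eigvalsh(F)

theta = [0.3, 0.7]
print("noiseless:     ", qfim_eigenvalues(theta, p=0.0))          # one eigenvalue ~0
print("bit-flip:      ", qfim_eigenvalues(theta, p=0.1))          # both nonzero
print("+ depolarizing:", qfim_eigenvalues(theta, p=0.1, q=0.6))   # all shrink
```

The three printed lines mirror the abstract's three regimes: a flat (redundant) direction without noise, the "turned-on" eigenvalue once the bit-flip channel breaks the RZ-RZ redundancy, and the overall suppression under depolarizing noise.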
Related papers
- Noise-resistant adaptive Hamiltonian learning [30.632260870411177]
An adaptive Hamiltonian learning (AHL) model for data analysis and quantum state simulation is proposed to overcome problems such as low efficiency.
A noise-resistant quantum neural network (RQNN) based on AHL is developed, which improves the noise robustness of the quantum neural network.
arXiv Detail & Related papers (2025-01-14T11:12:59Z) - Symmetry breaking in geometric quantum machine learning in the presence of noise [0.0]
This work studies the behavior of EQNN models in the presence of noise.
We claim that the symmetry breaking grows linearly with the number of layers and the noise strength.
We provide strategies to enhance the symmetry protection of EQNN models in the presence of noise.
arXiv Detail & Related papers (2024-01-17T19:00:00Z) - Reduction of finite sampling noise in quantum neural networks [0.0]
We introduce the variance regularization, a technique for reducing the variance of the expectation value during the quantum model training.
This technique requires no additional circuit evaluations if the QNN is properly constructed.
We show that in our examples, it lowers the variance by an order of magnitude on average and leads to a significantly reduced noise level of the QNN.
arXiv Detail & Related papers (2023-06-02T15:59:47Z) - Towards Neural Variational Monte Carlo That Scales Linearly with System Size [67.09349921751341]
Quantum many-body problems are central to demystifying some exotic quantum phenomena, e.g., high-temperature superconductors.
The combination of neural networks (NN) for representing quantum states, and the Variational Monte Carlo (VMC) algorithm, has been shown to be a promising method for solving such problems.
We propose a NN architecture called Vector-Quantized Neural Quantum States (VQ-NQS) that utilizes vector-quantization techniques to leverage redundancies in the local-energy calculations of the VMC algorithm.
arXiv Detail & Related papers (2022-12-21T19:00:04Z) - Symmetric Pruning in Quantum Neural Networks [111.438286016951]
Quantum neural networks (QNNs) harness the power of modern quantum machines.
QNNs with handcraft symmetric ansatzes generally experience better trainability than those with asymmetric ansatzes.
We propose the effective quantum neural tangent kernel (EQNTK) to quantify the convergence of QNNs towards the global optima.
arXiv Detail & Related papers (2022-08-30T08:17:55Z) - Noisy Quantum Kernel Machines [58.09028887465797]
An emerging class of quantum learning machines is that based on the paradigm of quantum kernels.
We study how dissipation and decoherence affect their performance.
We show that decoherence and dissipation can be seen as an implicit regularization for the quantum kernel machines.
arXiv Detail & Related papers (2022-04-26T09:52:02Z) - Learning Noise via Dynamical Decoupling of Entangled Qubits [49.38020717064383]
Noise in entangled quantum systems is difficult to characterize due to many-body effects involving multiple degrees of freedom.
We develop and apply multi-qubit dynamical decoupling sequences that characterize noise that occurs during two-qubit gates.
arXiv Detail & Related papers (2022-01-26T20:22:38Z) - Toward Trainability of Deep Quantum Neural Networks [87.04438831673063]
Quantum Neural Networks (QNNs) with random structures have poor trainability due to the exponentially vanishing gradient as the circuit depth and the qubit number increase.
We provide the first viable solution to the vanishing gradient problem for deep QNNs with theoretical guarantees.
arXiv Detail & Related papers (2021-12-30T10:27:08Z) - Theory of overparametrization in quantum neural networks [0.0]
We rigorously analyze the overparametrization phenomenon in Quantum Neural Networks (QNNs) with periodic structure.
Our results show that the dimension of the Lie algebra obtained from the generators of the QNN is an upper bound for $M_c$.
We then connect the notion of overparametrization to the QNN capacity, so that when a QNN is overparametrized, its capacity achieves its maximum possible value.
arXiv Detail & Related papers (2021-09-23T22:39:48Z) - Evaluating the noise resilience of variational quantum algorithms [0.0]
We simulate the effects of different types of noise in state preparation circuits of variational quantum algorithms.
We find that the inclusion of redundant parameterised gates makes the quantum circuits more resilient to noise.
arXiv Detail & Related papers (2020-11-02T16:56:58Z) - On the learnability of quantum neural networks [132.1981461292324]
We consider the learnability of the quantum neural network (QNN) built on the variational hybrid quantum-classical scheme.
We show that if a concept can be efficiently learned by a QNN, then it can also be effectively learned by a QNN even in the presence of gate noise.
arXiv Detail & Related papers (2020-07-24T06:34:34Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.