Chaos and Complexity from Quantum Neural Network: A study with Diffusion
Metric in Machine Learning
- URL: http://arxiv.org/abs/2011.07145v2
- Date: Tue, 16 Mar 2021 14:36:03 GMT
- Title: Chaos and Complexity from Quantum Neural Network: A study with Diffusion
Metric in Machine Learning
- Authors: Sayantan Choudhury, Ankan Dutta and Debisree Ray
- Abstract summary: We study the phenomena of quantum chaos and complexity in the machine learning dynamics of a Quantum Neural Network (QNN).
We employ a statistical and differential geometric approach to study the learning theory of QNN.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this work, our prime objective is to study the phenomena of quantum chaos
and complexity in the machine learning dynamics of Quantum Neural Network
(QNN). A Parameterized Quantum Circuit (PQC) in the hybrid quantum-classical
framework is introduced as a universal function approximator and optimized with
Stochastic Gradient Descent (SGD). We employ a statistical
and differential geometric approach to study the learning theory of QNN. The
evolution of parametrized unitary operators is correlated with the trajectory
of parameters in the Diffusion metric. We establish the parametrized version of
Quantum Complexity and Quantum Chaos in terms of physically relevant
quantities, which are essential not only for determining stability but also for
providing a significant lower bound on the generalization capability of the QNN.
We explicitly prove that when the system executes limit cycles or oscillations
in phase space, the generalization capability of the QNN is maximized. Finally,
we determine the generalization capability bound on the variance of the QNN
parameters in the steady-state condition using the Cauchy-Schwarz inequality.
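The abstract's setup, a parameterized quantum circuit acting as a universal function approximator and trained with SGD in a hybrid quantum-classical loop, can be illustrated with a short sketch. The following is a minimal, self-contained NumPy simulation assuming a single qubit, a toy regression target, and parameter-shift gradients; the circuit layout, encoding, and hyperparameters are illustrative choices and not the construction analysed in the paper.

```python
# Minimal sketch (not the paper's construction): a single-qubit parameterized
# quantum circuit (PQC) simulated with NumPy and trained by stochastic
# gradient descent, using the parameter-shift rule for gradients.
import numpy as np

rng = np.random.default_rng(0)

# Pauli matrices used as rotation generators and as the measured observable
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def rot(axis, theta):
    """Single-qubit rotation exp(-i * theta/2 * axis)."""
    return np.cos(theta / 2) * np.eye(2) - 1j * np.sin(theta / 2) * axis

def circuit_expectation(params, x):
    """Encode the input x, apply the parameterized unitary, return <Z>."""
    state = np.array([1, 0], dtype=complex)        # |0>
    state = rot(Y, x) @ state                      # data encoding
    state = rot(Z, params[2]) @ rot(Y, params[1]) @ rot(X, params[0]) @ state
    return float(np.real(state.conj() @ (Z @ state)))

def parameter_shift_grad(params, x, y):
    """Gradient of the squared error (f(x) - y)^2 via the parameter-shift rule."""
    f = circuit_expectation(params, x)
    grad = np.zeros_like(params)
    for k in range(len(params)):
        plus, minus = params.copy(), params.copy()
        plus[k] += np.pi / 2
        minus[k] -= np.pi / 2
        df = 0.5 * (circuit_expectation(plus, x) - circuit_expectation(minus, x))
        grad[k] = 2.0 * (f - y) * df
    return grad

# Toy regression task: fit <Z> to cos(x) on a handful of sample points
xs = rng.uniform(-np.pi, np.pi, size=20)
ys = np.cos(xs)

params = rng.uniform(-np.pi, np.pi, size=3)
lr = 0.2
for step in range(500):
    i = rng.integers(len(xs))                      # stochastic sample
    params -= lr * parameter_shift_grad(params, xs[i], ys[i])

mse = np.mean([(circuit_expectation(params, x) - y) ** 2 for x, y in zip(xs, ys)])
print(f"final MSE: {mse:.4f}")
```

The parameter-shift rule yields exact gradients of expectation values for Pauli-generated rotation gates, which is why it is used here in place of finite differences; on real hardware each expectation would itself be estimated from repeated measurements.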
Related papers
- Efficient Learning for Linear Properties of Bounded-Gate Quantum Circuits [63.733312560668274]
Given a quantum circuit containing d tunable RZ gates and G - d Clifford gates, can a learner perform purely classical inference to efficiently predict its linear properties?
We prove that the sample complexity scaling linearly in d is necessary and sufficient to achieve a small prediction error, while the corresponding computational complexity may scale exponentially in d.
We devise a kernel-based learning model capable of trading off prediction error and computational complexity, transitioning from exponential to polynomial scaling in many practical settings.
arXiv Detail & Related papers (2024-08-22T08:21:28Z)
- Randomness-enhanced expressivity of quantum neural networks [7.7991930692137466]
We propose a novel approach to enhance the expressivity of QNNs by incorporating randomness into quantum circuits.
We prove that our approach can accurately approximate arbitrary target operators using Uhlmann's theorem for majorization.
We find the expressivity of QNNs is enhanced by introducing randomness for multiple learning tasks, which could have broad application in quantum machine learning.
arXiv Detail & Related papers (2023-08-09T07:17:13Z)
- Analyzing Convergence in Quantum Neural Networks: Deviations from Neural Tangent Kernels [20.53302002578558]
A quantum neural network (QNN) is a parameterized mapping efficiently implementable on near-term Noisy Intermediate-Scale Quantum (NISQ) computers.
Despite the existing empirical and theoretical investigations, the convergence of QNN training is not fully understood.
arXiv Detail & Related papers (2023-03-26T22:58:06Z)
- The Quantum Path Kernel: a Generalized Quantum Neural Tangent Kernel for Deep Quantum Machine Learning [52.77024349608834]
Building a quantum analog of classical deep neural networks represents a fundamental challenge in quantum computing.
A key issue is how to address the inherent non-linearity of classical deep learning.
We introduce the Quantum Path Kernel, a formulation of quantum machine learning capable of replicating those aspects of deep machine learning.
arXiv Detail & Related papers (2022-12-22T16:06:24Z)
- Implementation and Learning of Quantum Hidden Markov Models [0.0]
We propose a unitary parameterization and an efficient learning algorithm for Quantum Hidden Markov Models (QHMMs).
By leveraging the richer dynamics of quantum channels, we demonstrate the greater efficiency of quantum generators compared to classical ones.
We show that any QHMM can be efficiently implemented and simulated using a quantum circuit with mid-circuit measurements.
arXiv Detail & Related papers (2022-12-07T17:25:02Z)
- Symmetric Pruning in Quantum Neural Networks [111.438286016951]
Quantum neural networks (QNNs) harness the power of modern quantum machines.
QNNs with handcrafted symmetric ansatzes generally experience better trainability than those with asymmetric ansatzes.
We propose the effective quantum neural tangent kernel (EQNTK) to quantify the convergence of QNNs towards the global optima.
arXiv Detail & Related papers (2022-08-30T08:17:55Z)
- Theory of Quantum Generative Learning Models with Maximum Mean Discrepancy [67.02951777522547]
We study the learnability of quantum circuit Born machines (QCBMs) and quantum generative adversarial networks (QGANs).
We first analyze the generalization ability of QCBMs and identify their superiorities when the quantum devices can directly access the target distribution.
Next, we prove how the generalization error bound of QGANs depends on the employed Ansatz, the number of qudits, and input states.
arXiv Detail & Related papers (2022-05-10T08:05:59Z)
- Realizing Quantum Convolutional Neural Networks on a Superconducting Quantum Processor to Recognize Quantum Phases [2.1465372441653354]
Quantum neural networks tailored to recognize specific features of quantum states by combining unitary operations, measurements and feedforward promise to require fewer measurements and to tolerate errors.
We realize a quantum convolutional neural network (QCNN) on a 7-qubit superconducting quantum processor to identify symmetry-protected topological phases of a spin model characterized by a non-zero string order parameter.
We find that, despite being composed of finite-fidelity gates itself, the QCNN recognizes the topological phase with higher fidelity than direct measurements of the string order parameter for the prepared states.
arXiv Detail & Related papers (2021-09-13T12:32:57Z)
- On the learnability of quantum neural networks [132.1981461292324]
We consider the learnability of the quantum neural network (QNN) built on the variational hybrid quantum-classical scheme.
We show that if a concept can be efficiently learned by QNN, then it can also be effectively learned by QNN even with gate noise.
arXiv Detail & Related papers (2020-07-24T06:34:34Z)
- Entanglement Classification via Neural Network Quantum States [58.720142291102135]
In this paper we combine machine-learning tools and the theory of quantum entanglement to perform entanglement classification for multipartite qubit systems in pure states.
We parameterise quantum systems with artificial neural networks in a restricted Boltzmann machine (RBM) architecture, known as Neural Network Quantum States (NNS).
arXiv Detail & Related papers (2019-12-31T07:40:23Z)
This list is automatically generated from the titles and abstracts of the papers in this site.