A General Approach to Dropout in Quantum Neural Networks
- URL: http://arxiv.org/abs/2310.04120v1
- Date: Fri, 6 Oct 2023 09:39:30 GMT
- Title: A General Approach to Dropout in Quantum Neural Networks
- Authors: Francesco Scala, Andrea Ceschini, Massimo Panella, Dario Gerace
- Abstract summary: "Overfitting" is the phenomenon occurring when a given model learns the training data excessively well.
With the advent of Quantum Neural Networks as learning models, overfitting might soon become an issue.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In classical Machine Learning, "overfitting" is the phenomenon occurring when
a given model learns the training data excessively well, and it thus performs
poorly on unseen data. A commonly employed technique in Machine Learning is the
so-called "dropout", which prevents computational units from becoming too
specialized, hence reducing the risk of overfitting. With the advent of Quantum
Neural Networks as learning models, overfitting might soon become an issue,
owing to the increasing depth of quantum circuits as well as the multiple
embeddings of classical features that are employed to introduce computational
nonlinearity. Here we present a generalized approach to apply the dropout
technique in Quantum Neural Network models, defining and analysing different
quantum dropout strategies to avoid overfitting and achieve a high level of
generalization. Our study allows one to envision the power of quantum dropout in
enabling generalization, providing useful guidelines on determining the maximal
dropout probability for a given model, based on overparametrization theory. It
also highlights how quantum dropout does not impact the features of the Quantum
Neural Network model, such as expressibility and entanglement. All these
conclusions are supported by extensive numerical simulations, and may pave the
way to efficiently employing deep Quantum Machine Learning models based on
state-of-the-art Quantum Neural Networks.
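The abstract describes dropout applied to quantum circuits rather than to classical units. As an illustration only, the NumPy sketch below simulates a tiny two-qubit variational circuit in which each trainable rotation gate is randomly replaced by the identity with probability `p_drop`; the circuit layout, the RY/CNOT gate choice, and the gate-dropping rule are assumptions for illustration, not the specific dropout strategies defined in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def ry(theta):
    """2x2 Y-rotation matrix."""
    c, s = np.cos(theta / 2.0), np.sin(theta / 2.0)
    return np.array([[c, -s], [s, c]])

# CNOT with qubit 0 (left tensor factor) as control
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

def layer_unitary(thetas, mask):
    """One variational layer: RY on each qubit (dropped gates become identity), then CNOT."""
    g0 = ry(thetas[0]) if mask[0] else np.eye(2)
    g1 = ry(thetas[1]) if mask[1] else np.eye(2)
    return CNOT @ np.kron(g0, g1)

def forward(params, p_drop=0.0):
    """Expectation value of Z on qubit 0; each rotation is kept with probability 1 - p_drop."""
    state = np.zeros(4)
    state[0] = 1.0                        # start in |00>
    for thetas in params:
        mask = rng.random(2) >= p_drop    # gate-dropout mask for this layer
        state = layer_unitary(thetas, mask) @ state
    Z0 = np.kron(np.diag([1.0, -1.0]), np.eye(2))
    return float(state @ Z0 @ state)

params = rng.uniform(0.0, np.pi, size=(3, 2))   # 3 layers x 2 rotation angles
print(forward(params, p_drop=0.0))              # full circuit
print(forward(params, p_drop=0.3))              # stochastically thinned circuit
```

With `p_drop=0.0` the mask is always all-True and the output is deterministic; with a positive dropout probability each forward pass evaluates a randomly thinned circuit, mirroring how classical dropout averages over sub-networks.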
Related papers
- Quantum Latent Diffusion Models [65.16624577812436]
We propose a potential version of a quantum diffusion model that leverages the established idea of classical latent diffusion models.
This involves using a traditional autoencoder to reduce images, followed by operations with variational circuits in the latent space.
The results demonstrate an advantage in using a quantum version, as evidenced by obtaining better metrics for the images generated by the quantum version.
arXiv Detail & Related papers (2025-01-19T21:24:02Z)
- Quantum-Inspired Weight-Constrained Neural Network: Reducing Variable Numbers by 100x Compared to Standard Neural Networks [5.6805708828651]
We develop a classical weight-constrained neural network that generates weights based on quantum-inspired insights.
This approach can reduce the number of variables in a classical neural network by a factor of 135 while preserving its learnability.
In addition, we develop a dropout method to enhance the robustness of quantum machine learning models, which are highly susceptible to adversarial attacks.
arXiv Detail & Related papers (2024-12-26T21:35:12Z)
- Bridging Classical and Quantum Machine Learning: Knowledge Transfer From Classical to Quantum Neural Networks Using Knowledge Distillation [0.0]
This paper introduces a new method to transfer knowledge from classical to quantum neural networks using knowledge distillation.
We adapt classical convolutional neural network (CNN) architectures like LeNet and AlexNet to serve as teacher networks.
Quantum models achieve an average accuracy improvement of 0.80% on the MNIST dataset and 5.40% on the more complex Fashion MNIST dataset.
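The entry above adapts classical knowledge distillation to quantum student networks. For context, here is a minimal NumPy sketch of the standard distillation objective (soft cross-entropy against temperature-softened teacher targets plus hard-label cross-entropy); the temperature `T=4.0` and weight `alpha=0.7` are illustrative values, and how the quantum student consumes these targets is specific to the paper.

```python
import numpy as np

def softmax(z, T=1.0):
    """Row-wise softmax with temperature T."""
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)   # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Weighted sum of a softened-target term and a hard-label cross-entropy term."""
    p_teacher = softmax(teacher_logits, T)
    log_p_student = np.log(softmax(student_logits, T))
    soft = -(p_teacher * log_p_student).sum(axis=-1).mean() * T**2  # T^2 scaling as in standard KD
    rows = np.arange(len(labels))
    hard = -np.log(softmax(student_logits)[rows, labels]).mean()
    return alpha * soft + (1.0 - alpha) * hard

student = np.array([[2.0, 0.5, -1.0]])
teacher = np.array([[2.5, 0.3, -1.2]])
labels = np.array([0])
print(distillation_loss(student, teacher, labels))
```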
arXiv Detail & Related papers (2023-11-23T05:06:43Z)
- ShadowNet for Data-Centric Quantum System Learning [188.683909185536]
We propose a data-centric learning paradigm combining the strength of neural-network protocols and classical shadows.
Capitalizing on the generalization power of neural networks, this paradigm can be trained offline and excel at predicting previously unseen systems.
We present the instantiation of our paradigm in quantum state tomography and direct fidelity estimation tasks and conduct numerical analysis up to 60 qubits.
arXiv Detail & Related papers (2023-08-22T09:11:53Z)
- Quantum Neural Network for Quantum Neural Computing [0.0]
We propose a new quantum neural network model for quantum neural computing.
Our model circumvents the problem that the state-space size grows exponentially with the number of neurons.
We benchmark our model for handwritten digit recognition and other nonlinear classification tasks.
arXiv Detail & Related papers (2023-05-15T11:16:47Z)
- The Quantum Path Kernel: a Generalized Quantum Neural Tangent Kernel for Deep Quantum Machine Learning [52.77024349608834]
Building a quantum analog of classical deep neural networks represents a fundamental challenge in quantum computing.
A key issue is how to address the inherent non-linearity of classical deep learning.
We introduce the Quantum Path Kernel, a formulation of quantum machine learning capable of replicating those aspects of deep machine learning.
arXiv Detail & Related papers (2022-12-22T16:06:24Z)
- A didactic approach to quantum machine learning with a single qubit [68.8204255655161]
We focus on the case of learning with a single qubit, using data re-uploading techniques.
We implement the different proposed formulations in toy and real-world datasets using the Qiskit quantum computing SDK.
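Data re-uploading, as mentioned above, interleaves repeated encodings of the classical input with trainable rotations on a single qubit. A minimal NumPy sketch under simplifying assumptions follows (scalar feature, RY rotations only, probability of measuring |0> as the class score); the paper itself uses the Qiskit SDK and more general parametrized rotations.

```python
import numpy as np

def ry(theta):
    """2x2 Y-rotation matrix."""
    c, s = np.cos(theta / 2.0), np.sin(theta / 2.0)
    return np.array([[c, -s], [s, c]])

def reupload_circuit(x, weights, biases):
    """Alternate data-encoding rotations RY(w*x) with trainable rotations RY(b)."""
    state = np.array([1.0, 0.0])      # |0>
    for w, b in zip(weights, biases):
        state = ry(w * x) @ state     # re-upload the classical feature x
        state = ry(b) @ state         # trainable processing rotation
    return state

def predict(x, weights, biases):
    """Probability of measuring |0>, used as the class score."""
    return float(reupload_circuit(x, weights, biases)[0] ** 2)

rng = np.random.default_rng(1)
weights = rng.normal(size=4)          # 4 re-uploading layers
biases = rng.normal(size=4)
print(predict(0.5, weights, biases))
```

Repeating the encoding is what gives a single qubit nonlinear decision boundaries in the input: the output probability becomes a trigonometric series in `x` whose coefficients are shaped by the trainable angles.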
arXiv Detail & Related papers (2022-11-23T18:25:32Z)
- The Hintons in your Neural Network: a Quantum Field Theory View of Deep Learning [84.33745072274942]
We show how to represent linear and non-linear layers as unitary quantum gates, and interpret the fundamental excitations of the quantum model as particles.
On top of opening a new perspective and techniques for studying neural networks, the quantum formulation is well suited for optical quantum computing.
arXiv Detail & Related papers (2021-03-08T17:24:29Z)
- The power of quantum neural networks [3.327474729829121]
In the near-term, however, the benefits of quantum machine learning are not so clear.
We use tools from information geometry to define a notion of expressibility for quantum and classical models.
We show that quantum neural networks are able to achieve a significantly better effective dimension than comparable classical neural networks.
arXiv Detail & Related papers (2020-10-30T18:13:32Z)
- Quantum Deformed Neural Networks [83.71196337378022]
We develop a new quantum neural network layer designed to run efficiently on a quantum computer.
It can be simulated on a classical computer when restricted in the way it entangles input states.
arXiv Detail & Related papers (2020-10-21T09:46:12Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.