Quantum HyperNetworks: Training Binary Neural Networks in Quantum
Superposition
- URL: http://arxiv.org/abs/2301.08292v1
- Date: Thu, 19 Jan 2023 20:06:48 GMT
- Title: Quantum HyperNetworks: Training Binary Neural Networks in Quantum
Superposition
- Authors: Juan Carrasquilla, Mohamed Hibat-Allah, Estelle Inack, Alireza
Makhzani, Kirill Neklyudov, Graham W. Taylor, Giacomo Torlai
- Abstract summary: We introduce quantum hypernetworks as a mechanism to train binary neural networks on quantum computers.
We show that our approach effectively finds optimal parameters, hyperparameters and architectural choices with high probability on classification problems.
Our unified approach provides an immense scope for other applications in the field of machine learning.
- Score: 16.1356415877484
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Binary neural networks, i.e., neural networks whose parameters and
activations are constrained to only two possible values, offer a compelling
avenue for the deployment of deep learning models on energy- and memory-limited
devices. However, their training, architectural design, and hyperparameter
tuning remain challenging as these involve multiple computationally expensive
combinatorial optimization problems. Here we introduce quantum hypernetworks as
a mechanism to train binary neural networks on quantum computers, which unify
the search over parameters, hyperparameters, and architectures in a single
optimization loop. Through classical simulations, we demonstrate that our
approach effectively finds optimal parameters, hyperparameters and
architectural choices with high probability on classification problems
including a two-dimensional Gaussian dataset and a scaled-down version of the
MNIST handwritten digits. We represent our quantum hypernetworks as variational
quantum circuits, and find that an optimal circuit depth maximizes the
probability of finding performant binary neural networks. Our unified approach
provides an immense scope for other applications in the field of machine
learning.
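
The abstract describes representing the hypernetwork as a variational quantum circuit whose measured bitstrings encode the binary weights (and, in the paper, also hyperparameters and architectural bits) of the network. Below is a minimal NumPy sketch of that idea under simplifying assumptions: two qubits encode the two binary weights of a single-layer classifier on a 2D Gaussian dataset, the ansatz alternates RY rotations with a CNOT, and the circuit parameters are tuned by finite-difference gradient descent on the expected 0/1 loss. None of these choices are taken from the paper's implementation.

```python
# Minimal sketch (pure NumPy, illustrative only): a 2-qubit variational
# circuit acts as a "hypernetwork" whose measured bitstrings are decoded
# as the two binary weights of a single-layer classifier. Training the
# circuit parameters on the *expected* loss concentrates probability on
# low-loss weight configurations. Dataset, ansatz and optimizer are
# assumptions made for this sketch, not the paper's implementation.
import numpy as np

rng = np.random.default_rng(0)

# Two 2D Gaussian blobs, labels +1 / -1 (mimicking the paper's toy setting).
n_per = 100
X = np.vstack([rng.normal(loc=[+1.0, +1.0], scale=0.5, size=(n_per, 2)),
               rng.normal(loc=[-1.0, -1.0], scale=0.5, size=(n_per, 2))])
y = np.hstack([np.ones(n_per), -np.ones(n_per)])

def bnn_loss(bits):
    """0/1 loss of the classifier sign(w . x) with w_i = 2*bits[i] - 1."""
    w = 2.0 * np.asarray(bits) - 1.0
    return float(np.mean(np.sign(X @ w) != y))

# Loss of every bitstring; index i has bits [(i >> 1) & 1, i & 1].
losses = np.array([bnn_loss([(i >> 1) & 1, i & 1]) for i in range(4)])

def ry(theta):
    c, s = np.cos(theta / 2.0), np.sin(theta / 2.0)
    return np.array([[c, -s], [s, c]])

CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

def expected_loss(thetas):
    """<psi|L|psi> for the diagonal 'loss observable' L = diag(losses)."""
    psi = np.array([1.0, 0.0, 0.0, 0.0])          # start in |00>
    for t0, t1 in thetas.reshape(-1, 2):          # layers of RY + CNOT
        psi = CNOT @ (np.kron(ry(t0), ry(t1)) @ psi)
    probs = psi ** 2                               # all gates are real
    return float(probs @ losses), probs

# Finite-difference gradient descent on the circuit parameters.
n_layers = 2
thetas = rng.uniform(0.0, 2.0 * np.pi, size=2 * n_layers)
eps, lr = 1e-3, 0.5
for _ in range(300):
    base, _ = expected_loss(thetas)
    grad = np.array([(expected_loss(thetas + eps * np.eye(len(thetas))[k])[0] - base) / eps
                     for k in range(len(thetas))])
    thetas -= lr * grad

final, probs = expected_loss(thetas)
best = int(np.argmax(probs))
w_best = [2 * ((best >> 1) & 1) - 1, 2 * (best & 1) - 1]
print(f"expected 0/1 loss: {final:.3f}")
print(f"most probable weights {w_best} sampled with probability {probs[best]:.2f}")
```

Because the objective is linear in the outcome probabilities, it can be read as the expectation of a diagonal "loss observable" over the prepared superposition of weight configurations; the paper additionally lets some qubits control hyperparameters and architectural choices, and studies how the ansatz depth affects the probability of sampling performant networks.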
Related papers
- Training-efficient density quantum machine learning [2.918930150557355]
Quantum machine learning requires powerful, flexible and efficiently trainable models.
We present density quantum neural networks, a learning model incorporating randomisation over a set of trainable unitaries.
arXiv Detail & Related papers (2024-05-30T16:40:28Z) - Principled Architecture-aware Scaling of Hyperparameters [69.98414153320894]
Training a high-quality deep neural network requires choosing suitable hyperparameters, which is a non-trivial and expensive process.
In this work, we precisely characterize the dependence of initializations and maximal learning rates on the network architecture.
We demonstrate that network rankings can be easily changed by better training networks in benchmarks.
arXiv Detail & Related papers (2024-02-27T11:52:49Z) - ShadowNet for Data-Centric Quantum System Learning [188.683909185536]
We propose a data-centric learning paradigm combining the strengths of neural-network protocols and classical shadows.
Capitalizing on the generalization power of neural networks, this paradigm can be trained offline and excel at predicting previously unseen systems.
We present the instantiation of our paradigm in quantum state tomography and direct fidelity estimation tasks and conduct numerical analysis up to 60 qubits.
arXiv Detail & Related papers (2023-08-22T09:11:53Z) - Realization of a quantum neural network using repeat-until-success
circuits in a superconducting quantum processor [0.0]
In this paper, we use repeat-until-success circuits enabled by real-time control-flow feedback to realize quantum neurons with non-linear activation functions.
As an example, we construct a minimal feedforward quantum neural network capable of learning all 2-to-1-bit Boolean functions.
This model is shown to perform non-linear classification and effectively learns from multiple copies of a single training state consisting of the maximal superposition of all inputs.
arXiv Detail & Related papers (2022-12-21T03:26:32Z) - Hyperparameter Importance of Quantum Neural Networks Across Small
Datasets [1.1470070927586014]
A quantum neural network can play a role similar to that of a classical neural network.
Very little is known about suitable circuit architectures for machine learning.
This work introduces new methodologies to study quantum machine learning models.
arXiv Detail & Related papers (2022-06-20T20:26:20Z) - Multiclass classification using quantum convolutional neural networks
with hybrid quantum-classical learning [0.5999777817331318]
We propose a quantum machine learning approach based on quantum convolutional neural networks for solving multiclass classification problems.
We use the proposed approach to demonstrate 4-class classification on the MNIST dataset using eight qubits for data encoding and four ancilla qubits.
Our results demonstrate that our solution achieves accuracy comparable to that of classical convolutional neural networks with a comparable number of trainable parameters.
arXiv Detail & Related papers (2022-03-29T09:07:18Z) - A quantum algorithm for training wide and deep classical neural networks [72.2614468437919]
We show that conditions amenable to classical trainability via gradient descent coincide with those necessary for efficiently solving quantum linear systems.
We numerically demonstrate that the MNIST image dataset satisfies such conditions.
We provide empirical evidence for $O(\log n)$ training of a convolutional neural network with pooling.
arXiv Detail & Related papers (2021-07-19T23:41:03Z) - Quantum Annealing Formulation for Binary Neural Networks [40.99969857118534]
In this work, we explore binary neural networks, which are lightweight yet powerful models typically intended for resource-constrained devices.
We devise a quadratic unconstrained binary optimization formulation for the training problem.
While the problem is intractable, i.e., the cost of estimating the binary weights scales exponentially with network size, we show how it can be optimized directly on a quantum annealer (see the toy QUBO sketch after this list).
arXiv Detail & Related papers (2021-07-05T03:20:54Z) - Variational Quantum Optimization with Multi-Basis Encodings [62.72309460291971]
We introduce a new variational quantum algorithm that benefits from two innovations: multi-basis graph encodings and nonlinear activation functions.
Our approach results in increased optimization performance, a factor-of-two increase in effective quantum resources, and a reduction in measurement complexity.
arXiv Detail & Related papers (2021-06-24T20:16:02Z) - Entanglement Rate Optimization in Heterogeneous Quantum Communication
Networks [79.8886946157912]
Quantum communication networks are emerging as a promising technology that could constitute a key building block in future communication networks in the 6G era and beyond.
Recent advances led to the deployment of small- and large-scale quantum communication networks with real quantum hardware.
In quantum networks, entanglement is a key resource that allows for data transmission between different nodes.
arXiv Detail & Related papers (2021-05-30T11:34:23Z) - The Hintons in your Neural Network: a Quantum Field Theory View of Deep
Learning [84.33745072274942]
We show how to represent linear and non-linear layers as unitary quantum gates, and interpret the fundamental excitations of the quantum model as particles.
On top of opening a new perspective and techniques for studying neural networks, the quantum formulation is well suited for optical quantum computing.
arXiv Detail & Related papers (2021-03-08T17:24:29Z)
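
The quantum-annealing entry above casts binary-network training as a quadratic unconstrained binary optimization (QUBO) problem, i.e., minimizing $x^\top Q x$ over $x \in \{0,1\}^n$. The toy sketch below only illustrates that interface, using a random, hand-sized $Q$ solved by brute force; the paper's actual mapping from network weights and training data to $Q$, and the annealer itself, are not reproduced here.

```python
# Toy QUBO illustration: minimize x^T Q x over binary vectors x.
# A quantum annealer would receive Q and return low-energy samples;
# here we brute-force a tiny instance instead. The matrix Q below is
# random and stands in for the (much more involved) encoding of binary
# network weights and training data described in the paper.
import itertools
import numpy as np

rng = np.random.default_rng(1)
n = 6                                   # number of binary variables (toy size)
Q = rng.normal(size=(n, n))
Q = (Q + Q.T) / 2                       # QUBO matrices are conventionally symmetric

def qubo_energy(x, Q):
    x = np.asarray(x, dtype=float)
    return float(x @ Q @ x)

# Exhaustive search: feasible only because n is tiny (2^6 = 64 states).
best_x, best_e = None, np.inf
for bits in itertools.product([0, 1], repeat=n):
    e = qubo_energy(bits, Q)
    if e < best_e:
        best_x, best_e = bits, e

print("lowest-energy assignment:", best_x, "energy:", round(best_e, 3))
# Decoding back to binary weights maps each bit to a weight in {-1, +1}.
w = 2 * np.array(best_x) - 1
print("corresponding binary weights:", w)
```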