Tunable Quantum Neural Networks for Boolean Functions
- URL: http://arxiv.org/abs/2003.14122v2
- Date: Fri, 27 Nov 2020 16:44:42 GMT
- Title: Tunable Quantum Neural Networks for Boolean Functions
- Authors: Viet Pham Ngoc and Herbert Wiklicky
- Abstract summary: We introduce the idea of a generic quantum circuit whose gates can be tuned to learn any Boolean function.
In order to perform the learning task, we have devised an algorithm that leverages the absence of measurements.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper we propose a new approach to quantum neural networks. Our
multi-layer architecture avoids the measurements that are usually employed to
emulate the non-linear activation functions characteristic of classical
neural networks. Despite this, our proposed architecture is still able to learn
any Boolean function. This ability arises from the correspondence that exists
between a Boolean function and a particular quantum circuit made out of
multi-controlled NOT gates. This correspondence is built via a polynomial
representation of the function called the algebraic normal form. We use this
construction to introduce the idea of a generic quantum circuit whose gates can
be tuned to learn any Boolean function. In order to perform the learning task,
we have devised an algorithm that leverages the absence of measurements. When
presented with a superposition of all the binary inputs of length $n$, the
network can learn the target function in at most $n+1$ updates.
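As a rough illustration of the correspondence described in the abstract (a minimal sketch in plain Python, not the authors' implementation): the algebraic normal form of a Boolean function can be read off its truth table with the binary Moebius transform, and each monomial with a non-zero coefficient corresponds to one multi-controlled NOT gate in the associated circuit.
```python
# Minimal sketch (not the authors' implementation): recover the algebraic
# normal form (ANF) of a Boolean function from its truth table via the binary
# Moebius transform, then list the multi-controlled NOT gates of the
# corresponding circuit (one gate per ANF monomial with coefficient 1).
from itertools import product

def anf_monomials(f, n):
    """Return the ANF monomials of f as tuples of variable indices."""
    # Truth table, indexed by the integer whose most significant bit is x0.
    coeffs = [f(x) & 1 for x in product((0, 1), repeat=n)]
    # Binary Moebius transform (in-place subset XOR butterfly).
    for i in range(n):
        for m in range(1 << n):
            if m & (1 << i):
                coeffs[m] ^= coeffs[m ^ (1 << i)]
    return [tuple(j for j in range(n) if m & (1 << (n - 1 - j)))
            for m, c in enumerate(coeffs) if c]

if __name__ == "__main__":
    # Example: f(x0, x1, x2) = (x0 AND x1) XOR x2 has ANF x0*x1 + x2.
    f = lambda x: (x[0] & x[1]) ^ x[2]
    for controls in anf_monomials(f, 3):
        if controls:
            print(f"multi-controlled NOT, controls on input qubits {controls}")
        else:
            print("unconditional NOT on the output qubit")
```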
Related papers
- Permutation Equivariant Neural Functionals [92.0667671999604]
This work studies the design of neural networks that can process the weights or gradients of other neural networks.
We focus on the permutation symmetries that arise in the weights of deep feedforward networks because hidden layer neurons have no inherent order.
In our experiments, we find that permutation equivariant neural functionals are effective on a diverse set of tasks.
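An illustrative check of this symmetry (plain NumPy, not the paper's code): permuting the hidden neurons of a small MLP, consistently across both weight matrices, leaves the computed function unchanged.
```python
# Illustrative check (not the paper's code): permuting hidden neurons of a
# two-layer MLP, applied consistently to both weight matrices, does not
# change the function the network computes.
import numpy as np

rng = np.random.default_rng(0)
d_in, d_hidden, d_out = 4, 8, 3
W1, b1 = rng.normal(size=(d_hidden, d_in)), rng.normal(size=d_hidden)
W2, b2 = rng.normal(size=(d_out, d_hidden)), rng.normal(size=d_out)

def mlp(x, W1, b1, W2, b2):
    return W2 @ np.tanh(W1 @ x + b1) + b2

perm = rng.permutation(d_hidden)          # reorder the hidden neurons
W1p, b1p = W1[perm], b1[perm]             # permute rows of the first layer
W2p = W2[:, perm]                         # permute columns of the second layer

x = rng.normal(size=d_in)
assert np.allclose(mlp(x, W1, b1, W2, b2), mlp(x, W1p, b1p, W2p, b2))
```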
arXiv Detail & Related papers (2023-02-27T18:52:38Z)
- Realization of a quantum neural network using repeat-until-success circuits in a superconducting quantum processor [0.0]
In this paper, we use repeat-until-success circuits enabled by real-time control-flow feedback to realize quantum neurons with non-linear activation functions.
As an example, we construct a minimal feedforward quantum neural network capable of learning all 2-to-1-bit Boolean functions.
This model is shown to perform non-linear classification and effectively learns from multiple copies of a single training state consisting of the maximal superposition of all inputs.
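A rough sketch of such a training state (register ordering and the inclusion of a target qubit are assumptions made for the illustration, not details taken from the paper): the equal superposition of all two-bit inputs with the function value written to a target qubit.
```python
# Rough sketch (register ordering and the target qubit are assumptions):
# the training state for a 2-to-1-bit Boolean function f, i.e. the equal
# superposition of all inputs with f(x) on a target qubit:
# |psi> = (1/2) * sum_x |x>|f(x)>.
import numpy as np

def training_state(f):
    n = 2
    state = np.zeros(2 ** (n + 1))          # two input qubits + one target qubit
    for x in range(2 ** n):
        bits = ((x >> 1) & 1, x & 1)
        state[(x << 1) | f(*bits)] = 0.5     # amplitude 1/sqrt(4) on each branch
    return state

xor = lambda a, b: a ^ b
print(training_state(xor))                   # amplitude on |00,0>, |01,1>, |10,1>, |11,0>
```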
arXiv Detail & Related papers (2022-12-21T03:26:32Z)
- Power and limitations of single-qubit native quantum neural networks [5.526775342940154]
Quantum neural networks (QNNs) have emerged as a leading strategy to establish applications in machine learning, chemistry, and optimization.
We formulate a theoretical framework for the expressive ability of data re-uploading quantum neural networks.
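A minimal NumPy sketch of the kind of data re-uploading single-qubit model this framework analyses (the particular gate layout below is an illustrative choice, not the paper's ansatz): alternate data-encoding rotations with trainable rotations and read out the expectation value of Z.
```python
# Minimal numpy sketch of a data re-uploading single-qubit model (illustrative
# gate layout, not the paper's exact ansatz): alternate data rotations RZ(x)
# with trainable rotations RY(theta) and measure <Z>.
import numpy as np

def rz(a):
    return np.array([[np.exp(-1j * a / 2), 0], [0, np.exp(1j * a / 2)]])

def ry(a):
    return np.array([[np.cos(a / 2), -np.sin(a / 2)],
                     [np.sin(a / 2),  np.cos(a / 2)]], dtype=complex)

def reupload_model(x, thetas):
    """Re-upload the scalar input x once per layer, starting from |0>."""
    psi = np.array([1.0, 0.0], dtype=complex)
    for theta in thetas:
        psi = ry(theta) @ rz(x) @ psi
    z = np.array([[1, 0], [0, -1]])
    return np.real(np.conj(psi) @ z @ psi)

print(reupload_model(0.3, thetas=[0.1, 1.2, -0.7]))
```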
arXiv Detail & Related papers (2022-05-16T17:58:27Z)
- Tunable Quantum Neural Networks in the QPAC-Learning Framework [0.0]
We investigate the performances of tunable quantum neural networks in the Quantum Probably Approximately Correct (QPAC) learning framework.
In order to tune the network so that it can approximate a target concept, we have devised and implemented an algorithm based on amplitude amplification.
The numerical results show that this approach can efficiently learn concepts from a simple class.
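For reference, a textbook amplitude amplification step in NumPy (a generic sketch of the primitive, not the paper's tuning algorithm): repeatedly reflect about the marked subspace and about the initial state to boost the probability of the marked outcomes.
```python
# Generic amplitude amplification in numpy (textbook sketch, not the paper's
# tuning algorithm): alternate a phase flip on the marked states with a
# reflection about the initial state.
import numpy as np

n = 3
dim = 2 ** n
good = {5}                                   # marked basis states (arbitrary example)
psi0 = np.full(dim, 1 / np.sqrt(dim))        # uniform initial state

oracle = np.eye(dim)
for g in good:
    oracle[g, g] = -1                        # flip the phase of marked states
diffusion = 2 * np.outer(psi0, psi0) - np.eye(dim)   # reflection about psi0

psi = psi0.copy()
for _ in range(2):                           # ~ (pi/4) * sqrt(dim / |good|) rounds
    psi = diffusion @ (oracle @ psi)

print(sum(psi[g] ** 2 for g in good))        # probability of a marked outcome
```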
arXiv Detail & Related papers (2022-05-03T14:10:15Z)
- Parametrized constant-depth quantum neuron [56.51261027148046]
We propose a framework that builds quantum neurons based on kernel machines.
We present here a neuron that applies a tensor-product feature mapping to an exponentially larger space.
It turns out that parametrization allows the proposed neuron to optimally fit underlying patterns that the existing neuron cannot fit.
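An illustrative tensor-product feature map in NumPy (the local encoding chosen here is an assumption for the sketch, not necessarily the paper's exact map): each input component becomes a two-dimensional vector, and the Kronecker product of these vectors is a feature vector in a 2^n-dimensional space, so the induced kernel factorises into local inner products.
```python
# Illustrative tensor-product feature map (the local encoding is an assumed
# choice): each component x_i becomes a 2-dimensional vector and the full
# feature vector is their Kronecker product in a 2^n-dimensional space.
import numpy as np

def local_map(xi):
    return np.array([np.cos(xi), np.sin(xi)])

def tensor_feature_map(x):
    phi = np.array([1.0])
    for xi in x:
        phi = np.kron(phi, local_map(xi))    # dimension doubles per component
    return phi

x, y = np.array([0.1, 0.7, -0.4]), np.array([0.5, -0.2, 0.9])
phi_x, phi_y = tensor_feature_map(x), tensor_feature_map(y)
# The induced kernel is the product of the local inner products cos(x_i - y_i).
print(phi_x.shape, phi_x @ phi_y, np.prod(np.cos(x - y)))
```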
arXiv Detail & Related papers (2022-02-25T04:57:41Z)
- Quantum activation functions for quantum neural networks [0.0]
We show how to approximate any analytic function to any required accuracy without the need to measure the states encoding the information.
Our results recast the science of artificial neural networks in the architecture of gate-model quantum computers.
arXiv Detail & Related papers (2022-01-10T23:55:49Z)
- A quantum algorithm for training wide and deep classical neural networks [72.2614468437919]
We show that conditions amenable to classical trainability via gradient descent coincide with those necessary for efficiently solving quantum linear systems.
We numerically demonstrate that the MNIST image dataset satisfies such conditions.
We provide empirical evidence for $O(\log n)$ training of a convolutional neural network with pooling.
arXiv Detail & Related papers (2021-07-19T23:41:03Z)
- The Hintons in your Neural Network: a Quantum Field Theory View of Deep Learning [84.33745072274942]
We show how to represent linear and non-linear layers as unitary quantum gates, and interpret the fundamental excitations of the quantum model as particles.
On top of opening a new perspective and techniques for studying neural networks, the quantum formulation is well suited for optical quantum computing.
arXiv Detail & Related papers (2021-03-08T17:24:29Z)
- Variational Monte Carlo calculations of $\mathbf{A\leq 4}$ nuclei with an artificial neural-network correlator ansatz [62.997667081978825]
We introduce a neural-network quantum state ansatz to model the ground-state wave function of light nuclei.
We compute the binding energies and point-nucleon densities of $A \leq 4$ nuclei as emerging from a leading-order pionless effective field theory Hamiltonian.
arXiv Detail & Related papers (2020-07-28T14:52:28Z)
- Exposing Hardware Building Blocks to Machine Learning Frameworks [4.56877715768796]
We focus on how to design network topologies that complement a view of neurons as unique functions.
We develop a library that supports training a neural network with custom sparsity and quantization.
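A generic illustration of these two constraints (not the library's actual API): applying a fixed sparsity mask and uniform quantization to a single dense layer's weights.
```python
# Generic illustration (not the library's actual API): a fixed sparsity mask
# and uniform symmetric quantization applied to one dense layer.
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(16, 8))

mask = rng.random(W.shape) < 0.25            # keep roughly 25% of the connections
def quantize(w, bits=4):
    scale = np.max(np.abs(w)) / (2 ** (bits - 1) - 1)
    return np.round(w / scale) * scale       # uniform symmetric quantization

W_hw = quantize(W * mask)                    # weights as the hardware would see them
x = rng.normal(size=8)
print(W_hw @ x)
```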
arXiv Detail & Related papers (2020-04-10T14:26:00Z)
- Machine learning transfer efficiencies for noisy quantum walks [62.997667081978825]
We show that the process of finding the requirements on both the graph type and the quantum system's coherence can be automated.
The automation uses a convolutional neural network of a particular type that learns to predict for which networks and under which coherence requirements a quantum advantage is possible.
Our results are of importance for demonstration of advantage in quantum experiments and pave the way towards automating scientific research and discoveries.
arXiv Detail & Related papers (2020-01-15T18:36:53Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information it provides and is not responsible for any consequences of its use.