Development and Training of Quantum Neural Networks, Based on the
Principles of Grover's Algorithm
- URL: http://arxiv.org/abs/2110.01443v1
- Date: Fri, 1 Oct 2021 14:08:43 GMT
- Title: Development and Training of Quantum Neural Networks, Based on the
Principles of Grover's Algorithm
- Authors: Cesar Borisovich Pronin, Andrey Vladimirovich Ostroukh
- Abstract summary: This paper proposes the concept of combining the training process of a neural network with the functional structure of that neural network, interpreted as a quantum circuit.
As a simple example to showcase the concept, a perceptron with one trainable parameter is used: the weight of a synapse connected to a hidden neuron.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This paper highlights the possibility of creating quantum neural networks
that are trained by Grover's Search Algorithm. The purpose of this work is to
propose the concept of combining the training process of a neural network,
which is performed on the principles of Grover's algorithm, with the functional
structure of that neural network, interpreted as a quantum circuit. As a simple
example of a neural network to showcase the concept, we use a perceptron with
one trainable parameter: the weight of a synapse connected to a hidden neuron.
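
The paper's actual circuits are not reproduced here, but the core idea (Grover's search amplifying the weight value that fits the training data) can be sketched numerically. The NumPy statevector toy below is an illustrative assumption, not the authors' circuit: it searches a 2-bit weight register (four candidate weights) for the single value under which a threshold perceptron fits a two-sample training set, using a phase oracle and one Grover diffusion step.

```python
import numpy as np

# Toy setup (illustrative assumptions, not taken from the paper):
# the single trainable parameter is a 2-bit weight index -> 4 candidates.
CANDIDATE_WEIGHTS = [-1.5, -0.5, 0.5, 1.5]
TRAIN = [(1.0, 1), (-1.0, 0)]                 # (input, label) pairs

def perceptron_ok(w):
    """True if a threshold perceptron (fires when w*x >= 1) fits TRAIN."""
    return all((1 if w * x >= 1.0 else 0) == y for x, y in TRAIN)

# Uniform superposition over the weight register (the Hadamard layer).
N = len(CANDIDATE_WEIGHTS)
state = np.full(N, 1 / np.sqrt(N))

# Phase oracle: flip the sign of every basis state whose weight fits TRAIN.
marks = np.array([perceptron_ok(w) for w in CANDIDATE_WEIGHTS])
oracle = np.where(marks, -1.0, 1.0)

# One Grover iteration (optimal for one match among four candidates).
state = oracle * state                        # apply oracle
state = 2 * state.mean() - state              # diffusion: reflect about mean

probs = state**2
print("P(weight):", dict(zip(CANDIDATE_WEIGHTS, np.round(probs, 3))))
print("trained weight:", CANDIDATE_WEIGHTS[int(np.argmax(probs))])
```

With one matching weight among four candidates, a single Grover iteration drives the probability of the correct weight to 1, which is the amplification such a training procedure relies on.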
Related papers
- Novel Kernel Models and Exact Representor Theory for Neural Networks Beyond the Over-Parameterized Regime [52.00917519626559]
This paper presents two models of neural networks and their training, applicable to neural networks of arbitrary width, depth and topology.
We also present a novel, exact representor theory for layer-wise neural network training with unregularized gradient descent in terms of a local-extrinsic neural kernel (LeNK).
This representor theory gives insight into the role of higher-order statistics in neural network training and the effect of kernel evolution in neural-network kernel models.
arXiv Detail & Related papers (2024-05-24T06:30:36Z)
- Graph Neural Networks for Learning Equivariant Representations of Neural Networks [55.04145324152541]
We propose to represent neural networks as computational graphs of parameters (a minimal construction of such a graph appears after this list).
Our approach enables a single model to encode neural computational graphs with diverse architectures.
We showcase the effectiveness of our method on a wide range of tasks, including classification and editing of implicit neural representations.
arXiv Detail & Related papers (2024-03-18T18:01:01Z)
- NeuralFastLAS: Fast Logic-Based Learning from Raw Data [54.938128496934695]
Symbolic rule learners generate interpretable solutions; however, they require the input to be encoded symbolically.
Neuro-symbolic approaches overcome this issue by mapping raw data to latent symbolic concepts using a neural network.
We introduce NeuralFastLAS, a scalable and fast end-to-end approach that trains a neural network jointly with a symbolic learner.
arXiv Detail & Related papers (2023-10-08T12:33:42Z)
- Permutation Equivariant Neural Functionals [92.0667671999604]
This work studies the design of neural networks that can process the weights or gradients of other neural networks.
We focus on the permutation symmetries that arise in the weights of deep feedforward networks because hidden-layer neurons have no inherent order (a small numerical illustration appears after this list).
In our experiments, we find that permutation equivariant neural functionals are effective on a diverse set of tasks.
arXiv Detail & Related papers (2023-02-27T18:52:38Z)
- On the explainability of quantum neural networks based on variational quantum circuits [0.0]
Ridge functions are used to describe and study the lower bound of the approximation achieved by neural networks.
We show that quantum neural networks based on variational quantum circuits can be written as a linear combination of ridge functions (a one-qubit sketch of this view appears after this list).
arXiv Detail & Related papers (2023-01-12T18:46:28Z)
- Spiking neural network for nonlinear regression [68.8204255655161]
Spiking neural networks carry the potential for a massive reduction in memory and energy consumption.
They introduce temporal and neuronal sparsity, which can be exploited by next-generation neuromorphic hardware.
A framework for regression using spiking neural networks is proposed.
arXiv Detail & Related papers (2022-10-06T13:04:45Z)
- A note on the complex and bicomplex valued neural networks [0.0]
We first write a proof of the perceptron convergence algorithm for complex multivalued neural networks (CMVNNs).
Our primary goal is to formulate and prove the perceptron convergence algorithm for bicomplex multivalued neural networks (BMVNNs).
arXiv Detail & Related papers (2022-02-04T19:25:01Z)
- Quantum activation functions for quantum neural networks [0.0]
We show how to approximate any analytic function to any required accuracy without the need to measure the states encoding the information.
Our results recast the science of artificial neural networks in the architecture of gate-model quantum computers.
arXiv Detail & Related papers (2022-01-10T23:55:49Z)
- Development of Quantum Circuits for Perceptron Neural Network Training, Based on the Principles of Grover's Algorithm [0.0]
This paper highlights the possibility of forming quantum circuits for training neural networks.
The perceptron was chosen as the architecture for the example neural network.
arXiv Detail & Related papers (2021-10-15T13:07:18Z)
- The Hintons in your Neural Network: a Quantum Field Theory View of Deep Learning [84.33745072274942]
We show how to represent linear and non-linear layers as unitary quantum gates, and interpret the fundamental excitations of the quantum model as particles.
On top of opening a new perspective and techniques for studying neural networks, the quantum formulation is well suited for optical quantum computing.
arXiv Detail & Related papers (2021-03-08T17:24:29Z)
- A neural network model of perception and reasoning [0.0]
We show that a simple set of biologically consistent organizing principles confer these capabilities to neuronal networks.
We implement these principles in a novel machine learning algorithm, based on concept construction instead of optimization, to design deep neural networks that reason with explainable neuron activity.
arXiv Detail & Related papers (2020-02-26T06:26:04Z)
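
As a toy companion to the Graph Neural Networks for Learning Equivariant Representations of Neural Networks entry above, here is a minimal, assumed construction of a computational graph of parameters: one node per neuron and one weighted edge per weight. The layer sizes are arbitrary and biases are omitted; the actual encoding in that paper may differ.

```python
import numpy as np

rng = np.random.default_rng(1)
layer_sizes = [2, 3, 1]                       # arbitrary toy MLP: 2 -> 3 -> 1
weights = [rng.normal(size=(3, 2)), rng.normal(size=(1, 3))]

# One graph node per neuron; node ids are assigned layer by layer.
starts = np.cumsum([0] + layer_sizes)         # first node id of each layer
nodes = list(range(starts[-1]))

# One weighted edge per parameter (biases omitted for brevity).
edges = []
for l, W in enumerate(weights):
    for j in range(W.shape[0]):               # target neuron in layer l+1
        for i in range(W.shape[1]):           # source neuron in layer l
            edges.append((starts[l] + i, starts[l + 1] + j, float(W[j, i])))

print(len(nodes), "nodes,", len(edges), "weighted edges")
```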
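The permutation symmetry referenced in the Permutation Equivariant Neural Functionals entry can be checked directly: permuting the hidden neurons of a feedforward network (rows of the first weight matrix and bias, matching columns of the second weight matrix) leaves the network function unchanged. A small NumPy check, with arbitrary layer sizes and seed:

```python
import numpy as np

rng = np.random.default_rng(0)

# A toy 2-layer MLP: x -> tanh(W1 x + b1) -> W2 h + b2 (sizes are arbitrary).
W1, b1 = rng.normal(size=(5, 3)), rng.normal(size=5)
W2, b2 = rng.normal(size=(2, 5)), rng.normal(size=2)

def mlp(x, W1, b1, W2, b2):
    return W2 @ np.tanh(W1 @ x + b1) + b2

# Permute the 5 hidden neurons: reorder rows of W1/b1 and columns of W2.
perm = rng.permutation(5)
W1p, b1p, W2p = W1[perm], b1[perm], W2[:, perm]

x = rng.normal(size=3)
print(np.allclose(mlp(x, W1, b1, W2, b2), mlp(x, W1p, b1p, W2p, b2)))  # True
```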
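For the entry on the explainability of quantum neural networks, a one-qubit sketch (an assumption for illustration, far simpler than the circuits studied there) shows where ridge functions come from: the ⟨Z⟩ expectation of RY(w·x + b)|0⟩ equals cos(w·x + b), i.e. a ridge function g(w·x + b) with profile g = cos, and a variational circuit then yields a linear combination of such terms.

```python
import numpy as np

def ry(theta):
    """Single-qubit RY rotation gate."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

Z = np.diag([1.0, -1.0])

def vqc_output(x, w, b):
    """<Z> of RY(w.x + b)|0>: a one-qubit 'variational circuit'."""
    psi = ry(w @ x + b) @ np.array([1.0, 0.0])
    return psi @ Z @ psi

w, b = np.array([0.7, -0.3]), 0.2
x = np.array([1.1, 0.5])
# The circuit output equals cos(w.x + b): a single ridge-function term.
print(vqc_output(x, w, b), np.cos(w @ x + b))  # equal
```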