Neuromorphic Computing is Turing-Complete
- URL: http://arxiv.org/abs/2104.13983v1
- Date: Wed, 28 Apr 2021 19:25:01 GMT
- Title: Neuromorphic Computing is Turing-Complete
- Authors: Prasanna Date, Catherine Schuman, Bill Kay, Thomas Potok
- Abstract summary: Neuromorphic computing is a non-von Neumann computing paradigm that performs computation by emulating the human brain.
Neuromorphic systems are extremely energy-efficient and known to consume thousands of times less power than CPUs and GPUs.
We devise neuromorphic circuits for computing all the μ-recursive functions and all the μ-recursive operators.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neuromorphic computing is a non-von Neumann computing paradigm that performs
computation by emulating the human brain. Neuromorphic systems are extremely
energy-efficient and known to consume thousands of times less power than CPUs
and GPUs. They have the potential to drive critical use cases such as
autonomous vehicles, edge computing, and the Internet of Things. For this
reason, they are expected to become an indispensable part of the future
computing landscape. Neuromorphic systems are mainly used for spike-based
machine learning applications, although there are some non-machine learning
applications in graph theory, differential equations, and spike-based
simulations. These applications suggest that neuromorphic computing might be
capable of general-purpose computing. However, general-purpose computability of
neuromorphic computing has not been established yet. In this work, we prove
that neuromorphic computing is Turing-complete and therefore capable of
general-purpose computing. Specifically, we present a model of neuromorphic
computing, with just two neuron parameters (threshold and leak), and two
synaptic parameters (weight and delay). We devise neuromorphic circuits for
computing all the μ-recursive functions (i.e., the constant, successor, and
projection functions) and all the μ-recursive operators (i.e., the composition,
primitive recursion, and minimization operators). Given that the μ-recursive
functions and operators are precisely the ones that can be computed using a
Turing machine, this work establishes the Turing-completeness of neuromorphic
computing.
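The μ-recursive functions and operators named in the abstract can be sketched in plain Python. This is standard computability theory, not the paper's neuromorphic construction; all function names here are illustrative assumptions:

```python
# Minimal sketch of the mu-recursive primitives and operators.
# These are the textbook definitions, not the paper's spiking circuits.

def constant(c):
    """Constant function: returns c regardless of its arguments."""
    return lambda *args: c

def successor(n):
    """Successor function: n -> n + 1."""
    return n + 1

def projection(i):
    """Projection: picks the i-th argument (0-indexed)."""
    return lambda *args: args[i]

def compose(f, *gs):
    """Composition: x -> f(g1(x), ..., gk(x))."""
    return lambda *args: f(*(g(*args) for g in gs))

def primitive_recursion(base, step):
    """h(0, x) = base(x); h(n+1, x) = step(n, h(n, x), x)."""
    def h(n, *args):
        acc = base(*args)
        for k in range(n):
            acc = step(k, acc, *args)
        return acc
    return h

def minimization(f):
    """mu-operator: least n with f(n, x) == 0 (may loop forever)."""
    def mu(*args):
        n = 0
        while f(n, *args) != 0:
            n += 1
        return n
    return mu

# Example: addition built by primitive recursion:
# add(0, x) = x; add(n+1, x) = successor(add(n, x)).
add = primitive_recursion(projection(0),
                          lambda k, acc, *xs: successor(acc))
print(add(3, 4))  # 7
```

Because these six constructs generate exactly the Turing-computable functions, exhibiting a neuromorphic circuit for each of them (as the paper does) suffices for Turing-completeness.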
Related papers
- Hebbian Learning based Orthogonal Projection for Continual Learning of Spiking Neural Networks [74.3099028063756]
We develop a new method with neuronal operations based on lateral connections and Hebbian learning.
We show that Hebbian and anti-Hebbian learning on recurrent lateral connections can effectively extract the principal subspace of neural activities.
Our method consistently achieves nearly zero forgetting for spiking neural networks.
arXiv Detail & Related papers (2024-02-19T09:29:37Z)
- Concepts and Paradigms for Neuromorphic Programming [0.0]
Currently, neuromorphic computers are mostly limited to machine learning methods adapted from deep learning.
Neuromorphic computers have potential far beyond deep learning, if only we can make full use of their computational properties to harness their power.
arXiv Detail & Related papers (2023-10-27T16:48:11Z)
- A Neural Lambda Calculus: Neurosymbolic AI meets the foundations of computing and functional programming [0.0]
We analyze the ability of neural networks to learn how to execute programs as a whole.
We introduce an integration of neural learning with the formalization of calculi.
arXiv Detail & Related papers (2023-04-18T20:30:16Z)
- Encoding Integers and Rationals on Neuromorphic Computers using Virtual Neuron [0.0]
We present the virtual neuron as an encoding mechanism for integers and rational numbers.
We show that it can perform an addition operation using 23 nJ of energy on average using a mixed-signal memristor-based neuromorphic processor.
arXiv Detail & Related papers (2022-08-15T23:18:26Z)
- Neuromorphic Artificial Intelligence Systems [58.1806704582023]
Modern AI systems, based on von Neumann architecture and classical neural networks, have a number of fundamental limitations in comparison with the brain.
This article discusses such limitations and the ways they can be mitigated.
It presents an overview of currently available neuromorphic AI projects in which these limitations are overcome.
arXiv Detail & Related papers (2022-05-25T20:16:05Z)
- Neurocompositional computing: From the Central Paradox of Cognition to a new generation of AI systems [120.297940190903]
Recent progress in AI has resulted from the use of limited forms of neurocompositional computing.
New, deeper forms of neurocompositional computing create AI systems that are more robust, accurate, and comprehensible.
arXiv Detail & Related papers (2022-05-02T18:00:10Z)
- Mapping and Validating a Point Neuron Model on Intel's Neuromorphic Hardware Loihi [77.34726150561087]
We investigate the potential of Intel's fifth-generation neuromorphic chip, Loihi.
Loihi is based on the novel idea of Spiking Neural Networks (SNNs) emulating the neurons in the brain.
We find that Loihi replicates classical simulations very efficiently and scales notably well in terms of both time and energy performance as the networks get larger.
arXiv Detail & Related papers (2021-09-22T16:52:51Z)
- Neurocoder: Learning General-Purpose Computation Using Stored Neural Programs [64.56890245622822]
Neurocoder is an entirely new class of general-purpose conditional computational machines.
It "codes" itself in a data-responsive way by composing relevant programs from a set of shareable, modular programs.
We show new capacity to learn modular programs, handle severe pattern shifts and remember old programs as new ones are learnt.
arXiv Detail & Related papers (2020-09-24T01:39:16Z)
- A provably stable neural network Turing Machine [13.615420026818038]
We introduce a neural stack architecture, including a differentiable parametrized stack operator that approximates stack push and pop operations.
Using the neural stack with a recurrent neural network, we introduce a neural network Pushdown Automaton (nnPDA) and prove that nnPDA with finite/bounded neurons and time can simulate any PDA.
We prove that a differentiable nnTM with bounded neurons can simulate a Turing Machine (TM) in real-time.
arXiv Detail & Related papers (2020-06-05T19:45:49Z)
- Reservoir memory machines [79.79659145328856]
We propose reservoir memory machines, which are able to solve some of the benchmark tests for Neural Turing Machines.
Our model can also be seen as an extension of echo state networks with an external memory, enabling arbitrarily long storage without interference.
arXiv Detail & Related papers (2020-02-12T01:45:00Z)
- On the computational power and complexity of Spiking Neural Networks [0.0]
We introduce spiking neural networks as a machine model where, in contrast to the familiar Turing machine, information and the manipulation thereof are co-located in the machine.
We introduce canonical problems, define hierarchies of complexity classes and provide some first completeness results.
arXiv Detail & Related papers (2020-01-23T10:40:16Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.