On the computational power and complexity of Spiking Neural Networks
- URL: http://arxiv.org/abs/2001.08439v1
- Date: Thu, 23 Jan 2020 10:40:16 GMT
- Title: On the computational power and complexity of Spiking Neural Networks
- Authors: Johan Kwisthout, Nils Donselaar
- Abstract summary: We introduce spiking neural networks as a machine model where---in contrast to the familiar Turing machine---information and the manipulation thereof are co-located in the machine.
We introduce canonical problems, define hierarchies of complexity classes and provide some first completeness results.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The last decade has seen the rise of neuromorphic architectures based on
artificial spiking neural networks, such as the SpiNNaker, TrueNorth, and Loihi
systems. The massive parallelism and co-locating of computation and memory in
these architectures potentially allows for an energy usage that is orders of
magnitude lower compared to traditional Von Neumann architectures. However, to
date a comparison with more traditional computational architectures
(particularly with respect to energy usage) is hampered by the lack of a formal
machine model and a computational complexity theory for neuromorphic
computation. In this paper we take the first steps towards such a theory. We
introduce spiking neural networks as a machine model where---in contrast to the
familiar Turing machine---information and the manipulation thereof are
co-located in the machine. We introduce canonical problems, define hierarchies
of complexity classes and provide some first completeness results.
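The abstract's central contrast, co-locating information and its manipulation rather than separating tape and control, can be made concrete with a toy simulation. The sketch below assumes a discrete-time leaky integrate-and-fire model with made-up parameters (decay, threshold, random weights); it is not the paper's formal machine model, only an illustration of how a spiking network's membrane potentials serve as both the memory and the object of computation.

```python
import numpy as np

# A toy discrete-time leaky integrate-and-fire (LIF) network. Each
# neuron's membrane potential u[i] is both the machine's memory and
# the operand of the update rule, so state and computation are
# co-located, unlike a Turing machine's separate tape and control.
# All parameters below are illustrative assumptions, not values from
# the paper's formal machine model.

rng = np.random.default_rng(seed=0)

n = 5                                   # number of neurons
decay = 0.9                             # leak factor per time step
threshold = 1.0                         # firing threshold
W = rng.normal(0.0, 0.5, size=(n, n))   # random synaptic weights (assumed)

u = np.zeros(n)                         # membrane potentials: the full state


def step(u, external_input):
    """One synchronous update: fire, reset, leak, integrate."""
    spikes = (u >= threshold).astype(float)        # neurons at threshold fire
    u = np.where(spikes > 0, 0.0, u)               # firing neurons reset to 0
    u = decay * u + W @ spikes + external_input    # leak, then integrate
    return u, spikes


total_spikes = 0
for t in range(10):
    inp = (rng.random(n) < 0.3) * 0.6              # sparse random input current
    u, spikes = step(u, inp)
    total_spikes += int(spikes.sum())
    print(f"t={t}: spikes={spikes.astype(int)}")

print("total spikes:", total_spikes)
```

Counting the emitted spikes, as the last line does, is a natural proxy for energy in this setting, which hints at why a resource theory for neuromorphic computation needs measures beyond the classical time and space.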
Related papers
- Benchmarking the human brain against computational architectures [0.0]
We report a new methodological framework for benchmarking cognitive performance.
We determine computational efficiencies in experiments with human participants.
We show that a neuromorphic architecture with limited field-of-view size and added noise provides a good approximation to our results.
arXiv Detail & Related papers (2023-05-15T08:00:26Z) - Spiking neural network for nonlinear regression [68.8204255655161]
Spiking neural networks carry the potential for a massive reduction in memory and energy consumption.
They introduce temporal and neuronal sparsity, which can be exploited by next-generation neuromorphic hardware.
A framework for regression using spiking neural networks is proposed.
arXiv Detail & Related papers (2022-10-06T13:04:45Z) - Neuromorphic Artificial Intelligence Systems [58.1806704582023]
Modern AI systems, based on von Neumann architecture and classical neural networks, have a number of fundamental limitations in comparison with the brain.
This article discusses such limitations and the ways they can be mitigated.
It presents an overview of currently available neuromorphic AI projects in which these limitations are overcome.
arXiv Detail & Related papers (2022-05-25T20:16:05Z) - Machines of finite depth: towards a formalization of neural networks [0.0]
We provide a unifying framework where artificial neural networks and their architectures can be formally described as particular cases of a general mathematical construction--machines of finite depth.
We prove this statement theoretically and practically, via a unified implementation that generalizes several classical architectures--dense, convolutional, and recurrent neural networks with a rich shortcut structure--and their respective backpropagation rules.
arXiv Detail & Related papers (2022-04-27T09:17:15Z) - The BrainScaleS-2 accelerated neuromorphic system with hybrid plasticity [0.0]
We describe the second generation of the BrainScaleS neuromorphic architecture, emphasizing applications enabled by this architecture.
It combines a custom accelerator core supporting the accelerated physical emulation of bio-inspired spiking neural network primitives with a tightly coupled digital processor and a digital event-routing network.
arXiv Detail & Related papers (2022-01-26T17:13:46Z) - Mapping and Validating a Point Neuron Model on Intel's Neuromorphic
Hardware Loihi [77.34726150561087]
We investigate the potential of Intel's fifth-generation neuromorphic chip, Loihi.
Loihi is based on the novel idea of Spiking Neural Networks (SNNs) emulating the neurons in the brain.
We find that Loihi replicates classical simulations very efficiently and scales notably well in terms of both time and energy performance as the networks get larger.
arXiv Detail & Related papers (2021-09-22T16:52:51Z) - A deep learning theory for neural networks grounded in physics [2.132096006921048]
We argue that building large, fast and efficient neural networks on neuromorphic architectures requires rethinking the algorithms to implement and train them.
Our framework applies to a very broad class of models, namely systems whose state or dynamics are described by variational equations.
arXiv Detail & Related papers (2021-03-18T02:12:48Z) - Reservoir Memory Machines as Neural Computers [70.5993855765376]
Differentiable neural computers extend artificial neural networks with an explicit memory without interference.
We achieve some of the computational capabilities of differentiable neural computers with a model that can be trained very efficiently.
arXiv Detail & Related papers (2020-09-14T12:01:30Z) - Parametric machines: a fresh approach to architecture search [0.0]
We show how simple machines can be combined into more complex ones.
We explore finite- and infinite-depth machines, which generalize neural networks and neural ordinary differential equations.
arXiv Detail & Related papers (2020-07-06T14:27:06Z) - Hyperbolic Neural Networks++ [66.16106727715061]
We generalize the fundamental components of neural networks in a single hyperbolic geometry model, namely, the Poincaré ball model. (Möbius addition, the basic operation of this model, is sketched after this list.)
Experiments show the superior parameter efficiency of our methods compared to conventional hyperbolic components, as well as greater stability and better performance than their Euclidean counterparts.
arXiv Detail & Related papers (2020-06-15T08:23:20Z) - Spiking Neural Networks Hardware Implementations and Challenges: a
Survey [53.429871539789445]
Spiking Neural Networks are cognitive algorithms mimicking neuron and synapse operational principles.
We present the state of the art of hardware implementations of spiking neural networks.
We discuss the strategies employed to leverage the characteristics of these event-driven algorithms at the hardware level.
arXiv Detail & Related papers (2020-05-04T13:24:00Z)