Contemporary implementations of spiking bio-inspired neural networks
- URL: http://arxiv.org/abs/2412.17926v1
- Date: Mon, 23 Dec 2024 19:33:43 GMT
- Title: Contemporary implementations of spiking bio-inspired neural networks
- Authors: Andrey E. Schegolev, Marina V. Bastrakova, Michael A. Sergeev, Anastasia A. Maksimovskaya, Nikolay V. Klenov, Igor I. Soloviev
- Abstract summary: Spiking neural networks are the most bio-similar of all neural networks.
The specifics of the physical processes in the network cells affect their ability to simulate the neural activity of living neural tissue.
This survey reviews existing hardware neuromorphic implementations of bio-inspired spiking networks in the "semiconductor", "superconductor" and "optical" domains.
- Score: 0.0
- Abstract: The extensive development of the field of spiking neural networks has led to many areas of research that have a direct impact on people's lives. As the most bio-similar of all neural networks, spiking neural networks not only allow the solution of recognition and clustering problems (including dynamics), but also contribute to the growing knowledge of the human nervous system. Our analysis has shown that the hardware implementation is of great importance, since the specifics of the physical processes in the network cells affect their ability to simulate the neural activity of living neural tissue, the efficiency of certain stages of information processing, storage and transmission. This survey reviews existing hardware neuromorphic implementations of bio-inspired spiking networks in the "semiconductor", "superconductor" and "optical" domains. Special attention is given to the possibility of effective "hybrids" of different approaches.
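As a point of reference for the spiking dynamics that the surveyed hardware platforms aim to reproduce, below is a minimal leaky integrate-and-fire (LIF) neuron simulation. The model is the standard textbook one and every parameter value is an illustrative assumption, not taken from the survey.

```python
import numpy as np

# Minimal leaky integrate-and-fire (LIF) neuron: a generic illustration of
# spiking dynamics; parameter values are illustrative, not from the survey.
def simulate_lif(i_input, dt=1e-4, tau=20e-3, v_rest=-70e-3,
                 v_reset=-65e-3, v_thresh=-50e-3, r_m=1e7):
    """Integrate an input current trace (in amperes); return voltage trace and spike times."""
    v = v_rest
    v_trace, spike_times = [], []
    for step, i_t in enumerate(i_input):
        # Membrane potential decays toward rest and integrates the input current.
        v += (-(v - v_rest) + r_m * i_t) * dt / tau
        if v >= v_thresh:              # threshold crossing -> emit a spike
            spike_times.append(step * dt)
            v = v_reset                # reset after the spike
        v_trace.append(v)
    return np.array(v_trace), spike_times

# 200 ms of a constant 2.5 nA input current
current = np.full(2000, 2.5e-9)
voltage, spikes = simulate_lif(current)
print(f"{len(spikes)} spikes in {len(current) * 1e-4:.1f} s")
```

With the constant 2.5 nA drive assumed here, the membrane repeatedly charges past threshold and emits a handful of regularly spaced spikes over the 200 ms window; hardware platforms differ mainly in which physical quantity plays the role of the membrane potential.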
Related papers
- Retinal Vessel Segmentation via Neuron Programming [17.609169389489633]
This paper introduces a novel approach to neural network design, termed "neuron programming", to enhance a network's representation ability at the neuronal level.
Comprehensive experiments validate that neuron programming can achieve competitive performance in retinal blood vessel segmentation.
arXiv Detail & Related papers (2024-11-17T16:03:30Z)
- Enhancing learning in spiking neural networks through neuronal heterogeneity and neuromodulatory signaling [52.06722364186432]
We propose a biologically-informed framework for enhancing artificial neural networks (ANNs).
Our proposed dual-framework approach highlights the potential of spiking neural networks (SNNs) for emulating diverse spiking behaviors.
We outline how the proposed approach integrates brain-inspired compartmental models and task-driven SNNs, balancing bioinspiration and complexity.
arXiv Detail & Related papers (2024-07-05T14:11:28Z)
- Design and development of opto-neural processors for simulation of neural networks trained in image detection for potential implementation in hybrid robotics [0.0]
Living neural networks offer advantages of lower power consumption, faster processing, and biological realism.
This work proposes a simulated living neural network trained indirectly by backpropagating STDP-based algorithms using precision activation by optogenetics (a generic pair-based STDP update is sketched after this entry).
arXiv Detail & Related papers (2024-01-17T04:42:49Z)
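The opto-neural processor entry above relies on STDP-based learning. Below is a minimal pair-based spike-timing-dependent plasticity (STDP) weight update, given only for orientation: it is the standard textbook rule, not the backpropagated STDP algorithm of that paper, and all constants are illustrative assumptions.

```python
import numpy as np

# Generic pair-based STDP: potentiation when a presynaptic spike precedes a
# postsynaptic spike, depression when it follows. Constants are illustrative.
def stdp_delta_w(t_pre, t_post, a_plus=0.01, a_minus=0.012, tau=20e-3):
    """Weight change for a single pre/post spike pair (times in seconds)."""
    dt = t_post - t_pre
    if dt > 0:      # pre before post -> potentiation
        return a_plus * np.exp(-dt / tau)
    if dt < 0:      # post before pre -> depression
        return -a_minus * np.exp(dt / tau)
    return 0.0

# Accumulate the updates over all spike pairs seen by one synapse.
pre_spikes = [0.010, 0.050, 0.090]
post_spikes = [0.012, 0.048]
w = 0.5
for t_pre in pre_spikes:
    for t_post in post_spikes:
        w += stdp_delta_w(t_pre, t_post)
w = float(np.clip(w, 0.0, 1.0))   # keep the weight bounded
print(f"updated weight: {w:.4f}")
```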
- Brain-Inspired Machine Intelligence: A Survey of Neurobiologically-Plausible Credit Assignment [65.268245109828]
We examine algorithms for conducting credit assignment in artificial neural networks that are inspired or motivated by neurobiology.
We organize the ever-growing set of brain-inspired learning schemes into six general families and consider these in the context of backpropagation of errors.
The results of this review are meant to encourage future developments in neuro-mimetic systems and their constituent learning processes.
arXiv Detail & Related papers (2023-12-01T05:20:57Z)
- Contrastive-Signal-Dependent Plasticity: Self-Supervised Learning in Spiking Neural Circuits [61.94533459151743]
This work addresses the challenge of designing neurobiologically-motivated schemes for adjusting the synapses of spiking networks.
Our experimental simulations demonstrate a consistent advantage over other biologically-plausible approaches when training recurrent spiking networks.
arXiv Detail & Related papers (2023-03-30T02:40:28Z) - Constraints on the design of neuromorphic circuits set by the properties
of neural population codes [61.15277741147157]
In the brain, information is encoded, transmitted and used to inform behaviour.
Neuromorphic circuits need to encode information in a way compatible with that used by populations of neurons in the brain.
arXiv Detail & Related papers (2022-12-08T15:16:04Z)
- Functional Connectome: Approximating Brain Networks with Artificial Neural Networks [1.952097552284465]
We show that trained deep neural networks are able to capture the computations performed by synthetic biological networks with high accuracy.
We show that trained deep neural networks are able to perform zero-shot generalisation in novel environments.
Our study reveals a novel and promising direction in systems neuroscience, and can be expanded upon with a multitude of downstream applications.
arXiv Detail & Related papers (2022-11-23T13:12:13Z)
- Spiking neural network for nonlinear regression [68.8204255655161]
Spiking neural networks carry the potential for a massive reduction in memory and energy consumption.
They introduce temporal and neuronal sparsity, which can be exploited by next-generation neuromorphic hardware.
A framework for regression using spiking neural networks is proposed (a generic rate-decoded readout is sketched after this entry).
arXiv Detail & Related papers (2022-10-06T13:04:45Z)
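The nonlinear-regression entry above maps spiking activity onto continuous outputs. One generic way to do that is a rate-decoded linear readout over spike counts, sketched below under assumed Poisson statistics; this is an illustrative scheme, not the specific framework proposed in the paper, and all sizes and rates are assumptions.

```python
import numpy as np

# Generic rate-decoded readout for regression with spiking neurons: spike
# counts from a hypothetical hidden population are mapped to a continuous
# output by a linear readout. Illustrative only; not the paper's framework.
rng = np.random.default_rng(0)

n_hidden, t_window = 50, 0.1                  # 50 neurons, 100 ms window (assumed)
rates = rng.uniform(5.0, 50.0, n_hidden)      # assumed firing rates in Hz
spike_counts = rng.poisson(rates * t_window)  # Poisson spike counts per neuron

w_readout = rng.normal(0.0, 0.1, n_hidden)    # linear readout weights
y_hat = float(w_readout @ spike_counts)       # continuous regression output
print(f"decoded value: {y_hat:.3f}")
```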
- POPPINS : A Population-Based Digital Spiking Neuromorphic Processor with Integer Quadratic Integrate-and-Fire Neurons [50.591267188664666]
We propose a population-based digital spiking neuromorphic processor in 180 nm process technology with two hierarchical populations.
The proposed approach enables the development of biomimetic neuromorphic systems and various low-power, low-latency inference processing applications (a generic integer quadratic integrate-and-fire update is sketched after this entry).
arXiv Detail & Related papers (2022-01-19T09:26:34Z)
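The POPPINS entry above is built around integer quadratic integrate-and-fire neurons. Below is a generic quadratic integrate-and-fire update written in integer arithmetic to suggest how such a unit maps onto digital hardware; the exact formulation used in the paper may differ, and all constants are illustrative assumptions.

```python
# Generic quadratic integrate-and-fire (QIF) neuron in integer arithmetic;
# the exact IQIF formulation used in the paper may differ, and all constants
# here are illustrative.
def iqif_run(inputs, v_reset=0, v_peak=256, shift=6):
    """Integer QIF: the membrane grows quadratically; a spike fires at v_peak."""
    v, spike_steps = v_reset, []
    for step, i_in in enumerate(inputs):
        # Quadratic self-excitation with an arithmetic right shift standing in
        # for division, as is common in fixed-point digital designs.
        v += (v * v) >> shift
        v += i_in
        if v >= v_peak:         # spike and reset
            spike_steps.append(step)
            v = v_reset
        v = max(v, 0)           # clamp to the non-negative integer range
    return spike_steps

print(iqif_run([8] * 100))      # constant integer input current
```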
- Deep physical neural networks enabled by a backpropagation algorithm for arbitrary physical systems [3.7785805908699803]
We propose a radical alternative for implementing deep neural network models: Physical Neural Networks.
We introduce a hybrid physical-digital algorithm called Physics-Aware Training to efficiently train sequences of controllable physical systems to act as deep neural networks (a schematic sketch of the hybrid forward/backward idea follows this entry).
arXiv Detail & Related papers (2021-04-27T18:00:02Z)
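The Physics-Aware Training entry above pairs a physical forward pass with a digital backward pass. The following is only a schematic sketch of that hybrid idea under toy assumptions (a single scalar parameter, a noisy stand-in for the physical system, and a hand-derived surrogate gradient); it is not the authors' algorithm or code.

```python
import numpy as np

# Schematic physics-aware-training-style loop under toy assumptions: the
# forward pass uses a noisy stand-in for a physical system, while gradients
# come from a differentiable digital surrogate. Not the authors' algorithm.
rng = np.random.default_rng(1)

def digital_model(x, theta):
    """Differentiable digital surrogate of the physical transformation."""
    return np.tanh(theta * x)

def digital_grad_theta(x, theta):
    """Hand-derived gradient of the surrogate w.r.t. its single parameter."""
    return x * (1.0 - np.tanh(theta * x) ** 2)

def physical_forward(x, theta):
    """Stand-in for the physical system: the surrogate plus device noise."""
    return digital_model(x, theta) + rng.normal(0.0, 0.01, x.shape)

x = np.linspace(-1.0, 1.0, 32)
y_target = np.tanh(1.5 * x)                  # toy regression target
theta, lr = 0.5, 0.1

for _ in range(200):
    y_phys = physical_forward(x, theta)                  # forward on the "hardware"
    err = y_phys - y_target                              # error from physical outputs
    grad = np.mean(err * digital_grad_theta(x, theta))   # backward through the surrogate
    theta -= lr * grad

print(f"trained parameter: {theta:.3f}")
```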
- Under the Hood of Neural Networks: Characterizing Learned Representations by Functional Neuron Populations and Network Ablations [0.3441021278275805]
We shed light on the roles of single neurons and groups of neurons within the network fulfilling a learned task.
We find that neither a neuron's magnitude or selectivity of activation, nor its impact on network performance, is a sufficient stand-alone indicator.
arXiv Detail & Related papers (2020-04-02T20:45:01Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.