Minimal Neuron Circuits -- Part I: Resonators
- URL: http://arxiv.org/abs/2506.02341v1
- Date: Tue, 03 Jun 2025 00:32:37 GMT
- Title: Minimal Neuron Circuits -- Part I: Resonators
- Authors: Amr Nabil, T. Nandha Kumar, Haider Abbas F. Almurib
- Abstract summary: Spiking neurons act as computational units that determine the decision to fire an action potential. This work presents a methodology to implement biologically plausible yet scalable spiking neurons in hardware. We show that it is more efficient to design neurons that mimic the $I_{Na,p}+I_{K}$ model rather than the more complicated Hodgkin-Huxley model.
- Score: 1.1624569521079424
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Spiking Neural Networks have earned increased recognition in recent years owing to their biological plausibility and event-driven computation. Spiking neurons are the fundamental building components of Spiking Neural Networks. Those neurons act as computational units that determine the decision to fire an action potential. This work presents a methodology to implement biologically plausible yet scalable spiking neurons in hardware. We show that it is more efficient to design neurons that mimic the $I_{Na,p}+I_{K}$ model rather than the more complicated Hodgkin-Huxley model. We demonstrate our methodology by presenting eleven novel minimal spiking neuron circuits in Parts I and II of the paper. We categorize the neuron circuits presented into two types: Resonators and Integrators. We discuss the methodology employed in designing neurons of the resonator type in Part I, while we discuss neurons of the integrator type in Part II. In part I, we postulate that Sodium channels exhibit type-N negative differential resistance. Consequently, we present three novel minimal neuron circuits that use type-N negative differential resistance circuits or devices as the Sodium channel. Nevertheless, the aim of the paper is not to present a set of minimal neuron circuits but rather the methodology utilized to construct those circuits.
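The $I_{Na,p}+I_{K}$ model the abstract favours over Hodgkin-Huxley is a two-variable conductance model: a leak current, an instantaneous persistent sodium current, and a slower potassium current. A minimal forward-Euler sketch of its dynamics is shown below; the parameter values follow Izhikevich's textbook formulation of the model and are illustrative, not taken from this paper.

```python
import numpy as np

def simulate_inap_ik(I_ext, T=100.0, dt=0.01):
    """Forward-Euler simulation of the I_Na,p + I_K neuron model.

    Two state variables: membrane voltage V and the K+ gating
    variable n. Na+ activation m is treated as instantaneous.
    Parameters are illustrative (Izhikevich-style), not the paper's.
    """
    C, g_L, E_L = 1.0, 8.0, -80.0      # capacitance (uF/cm^2), leak
    g_Na, E_Na = 20.0, 60.0            # persistent Na+ channel
    g_K, E_K = 10.0, -90.0             # K+ channel
    m_inf = lambda V: 1.0 / (1.0 + np.exp((-20.0 - V) / 15.0))  # Na activation
    n_inf = lambda V: 1.0 / (1.0 + np.exp((-25.0 - V) / 5.0))   # K activation
    tau_n = 1.0                        # K+ gating time constant (ms)

    steps = int(T / dt)
    V = np.empty(steps)
    V[0], n = -65.0, n_inf(-65.0)
    for t in range(1, steps):
        v = V[t - 1]
        I_ion = (g_L * (v - E_L)
                 + g_Na * m_inf(v) * (v - E_Na)
                 + g_K * n * (v - E_K))
        V[t] = v + dt * (I_ext - I_ion) / C   # membrane equation
        n += dt * (n_inf(v) - n) / tau_n      # slow K+ gate
    return V
```

With $I_{ext}=0$ the trace settles near rest; with a sufficiently strong injected current the interplay of the fast Na+ and slow K+ currents produces spikes with no explicit reset rule, which is what makes the model attractive as a compact circuit target.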
Related papers
- Integrated Artificial Neurons from Metal Halide Perovskites [0.0]
Hardware neural networks could perform certain computational tasks orders of magnitude more energy-efficiently than conventional computers. Artificial neurons are a key component of these networks and are currently implemented with electronic circuits based on capacitors and transistors. Here we demonstrate a fully on-chip artificial neuron based on microscale electrodes and perovskite semiconductors.
arXiv Detail & Related papers (2024-11-29T16:30:23Z) - PAON: A New Neuron Model using Padé Approximants [6.337675203577426]
Convolutional neural networks (CNN) are built upon the classical McCulloch-Pitts neuron model.
We introduce a brand new neuron model called Padé neurons (Paons), inspired by Padé approximants.
Our experiments on the single-image super-resolution task show that PadéNets can obtain better results than competing architectures.
arXiv Detail & Related papers (2024-03-18T13:49:30Z) - Neuroscience inspired scientific machine learning (Part-1): Variable spiking neuron for regression [2.1756081703276]
We introduce in this paper a novel spiking neuron, termed Variable Spiking Neuron (VSN).
It can reduce redundant firing by drawing on lessons from biologically inspired Leaky Integrate-and-Fire Spiking Neurons (LIF-SN).
arXiv Detail & Related papers (2023-11-15T08:59:06Z) - A versatile circuit for emulating active biological dendrites applied to sound localisation and neuron imitation [0.0]
We introduce a versatile circuit that emulates a segment of a dendrite which exhibits gain, introduces delays, and performs integration.
We also find that dendrites can form bursting neurons.
This significant discovery suggests the potential to fabricate neural networks solely comprised of dendrite circuits.
arXiv Detail & Related papers (2023-10-25T09:42:24Z) - Constraints on the design of neuromorphic circuits set by the properties of neural population codes [61.15277741147157]
In the brain, information is encoded, transmitted and used to inform behaviour.
Neuromorphic circuits need to encode information in a way compatible with that used by populations of neurons in the brain.
arXiv Detail & Related papers (2022-12-08T15:16:04Z) - Spiking neural network for nonlinear regression [68.8204255655161]
Spiking neural networks carry the potential for a massive reduction in memory and energy consumption.
They introduce temporal and neuronal sparsity, which can be exploited by next-generation neuromorphic hardware.
A framework for regression using spiking neural networks is proposed.
arXiv Detail & Related papers (2022-10-06T13:04:45Z) - Neuromorphic Artificial Intelligence Systems [58.1806704582023]
Modern AI systems, based on von Neumann architecture and classical neural networks, have a number of fundamental limitations in comparison with the brain.
This article discusses such limitations and the ways they can be mitigated.
It presents an overview of currently available neuromorphic AI projects in which these limitations are overcome.
arXiv Detail & Related papers (2022-05-25T20:16:05Z) - Parametrized constant-depth quantum neuron [56.51261027148046]
We propose a framework that builds quantum neurons based on kernel machines.
We present here a neuron that applies a tensor-product feature mapping to an exponentially larger space.
It turns out that parametrization allows the proposed neuron to optimally fit underlying patterns that the existing neuron cannot fit.
arXiv Detail & Related papers (2022-02-25T04:57:41Z) - Energy-Efficient High-Accuracy Spiking Neural Network Inference Using Time-Domain Neurons [0.18352113484137625]
This paper presents a low-power highly linear time-domain I&F neuron circuit.
The proposed neuron yields a more than 4.3x lower error rate in MNIST inference.
The power consumed by the proposed neuron circuit is simulated to be 0.230 µW per neuron, which is orders of magnitude lower than that of existing voltage-domain neurons.
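The integrate-and-fire (I&F) behaviour this entry refers to can be summarised by the classic leaky integrate-and-fire equations: the membrane voltage integrates input current with a leak, and a spike plus reset occurs at threshold. A voltage-domain sketch with illustrative parameters follows (the paper's circuit realises this in the time domain, which this sketch does not model):

```python
def lif_spike_times(I, T=100.0, dt=0.01):
    """Leaky integrate-and-fire neuron (illustrative parameters).

    tau * dV/dt = -(V - V_rest) + R * I; on crossing V_th the
    neuron emits a spike and V is reset to V_reset.
    """
    tau, R = 10.0, 1.0                       # time constant (ms), resistance
    V_rest, V_reset, V_th = -65.0, -70.0, -50.0   # potentials (mV)
    V, spikes = V_rest, []
    for step in range(int(T / dt)):
        V += dt * (-(V - V_rest) + R * I) / tau   # leaky integration
        if V >= V_th:
            spikes.append(step * dt)              # record spike time (ms)
            V = V_reset                           # hard reset
    return spikes
```

Sub-threshold input (steady-state voltage below V_th) produces no spikes; supra-threshold input produces a regular spike train whose rate grows with the input current.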
arXiv Detail & Related papers (2022-02-04T08:24:03Z) - POPPINS: A Population-Based Digital Spiking Neuromorphic Processor with Integer Quadratic Integrate-and-Fire Neurons [50.591267188664666]
We propose a population-based digital spiking neuromorphic processor in a 180 nm process technology with two hierarchical populations.
The proposed approach enables the development of biomimetic neuromorphic systems and various low-power, low-latency inference processing applications.
arXiv Detail & Related papers (2022-01-19T09:26:34Z) - And/or trade-off in artificial neurons: impact on adversarial robustness [91.3755431537592]
The presence of a sufficient number of OR-like neurons in a network can lead to classification brittleness and increased vulnerability to adversarial attacks.
We define AND-like neurons and propose measures to increase their proportion in the network.
Experimental results on the MNIST dataset suggest that our approach holds promise as a direction for further exploration.
arXiv Detail & Related papers (2021-02-15T08:19:05Z) - Non-linear Neurons with Human-like Apical Dendrite Activations [81.18416067005538]
We show that a standard neuron followed by our novel apical dendrite activation (ADA) can learn the XOR logical function with 100% accuracy.
We conduct experiments on six benchmark data sets from computer vision, signal processing and natural language processing.
arXiv Detail & Related papers (2020-02-02T21:09:39Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.