Izhikevich-Inspired Optoelectronic Neurons with Excitatory and
Inhibitory Inputs for Energy-Efficient Photonic Spiking Neural Networks
- URL: http://arxiv.org/abs/2105.02809v1
- Date: Mon, 3 May 2021 03:58:03 GMT
- Title: Izhikevich-Inspired Optoelectronic Neurons with Excitatory and
Inhibitory Inputs for Energy-Efficient Photonic Spiking Neural Networks
- Authors: Yun-jhu Lee, Mehmet Berkay On, Xian Xiao, Roberto Proietti, S. J. Ben
Yoo
- Abstract summary: We develop a detailed optoelectronic neuron model in Verilog-A.
We conduct simulations using fully connected (FC) and convolutional neural networks (CNNs).
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: We designed, prototyped, and experimentally demonstrated, for the first time
to our knowledge, an optoelectronic spiking neuron inspired by the Izhikevich
model incorporating both excitatory and inhibitory optical spiking inputs and
producing optical spiking outputs accordingly. The optoelectronic neurons
consist of three transistors acting as electrical spiking circuits, a
vertical-cavity surface-emitting laser (VCSEL) for optical spiking outputs, and
two photodetectors for excitatory and inhibitory optical spiking inputs.
Additional inclusion of capacitors and resistors completes the
Izhikevich-inspired optoelectronic neurons, which receive excitatory and
inhibitory optical spikes as inputs from other optoelectronic neurons. We
developed a detailed optoelectronic neuron model in Verilog-A and simulated the
circuit-level operation of various cases with excitatory input and inhibitory
input signals. The experimental results closely resemble the simulated results
and demonstrate how the excitatory inputs trigger the optical spiking outputs
while the inhibitory inputs suppress the outputs. Utilizing the simulated
neuron model, we conducted simulations using fully connected (FC) and
convolutional neural networks (CNNs). The simulation results for MNIST
handwritten digit recognition show 90% accuracy with unsupervised learning and
97% accuracy with a supervised, modified FC neural network. We further designed a
nanoscale optoelectronic neuron utilizing quantum impedance conversion, where a
200 aJ/spike input can trigger a 10 fJ/spike output from on-chip nanolasers.
The nanoscale neuron can support a fanout of ~80 or overcome 19 dB
excess optical loss while running at 10 GSpikes/second in the neural network,
which corresponds to a 100x throughput and 1000x energy-efficiency improvement
compared to state-of-the-art electrical neuromorphic hardware such as Loihi and
NeuroGrid.
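For intuition, the sketch below simulates the Izhikevich dynamics that the circuit emulates, with excitatory and inhibitory spike trains standing in for the two photodetector inputs. It is a minimal Python illustration, not the authors' Verilog-A circuit model; the synaptic weights (w_exc, w_inh), time constant (tau_syn), and input spike times are assumptions chosen only to show that excitatory inputs trigger output spikes while coincident inhibitory inputs suppress them.

```python
# Minimal behavioral sketch (Python), not the authors' Verilog-A circuit model:
# an Izhikevich neuron driven by excitatory and inhibitory input spike trains.
# Weights, time constant, and spike times below are illustrative assumptions.
import numpy as np

def izhikevich_ei(exc_times, inh_times, T=200.0, dt=0.25,
                  a=0.02, b=0.2, c=-65.0, d=8.0,
                  w_exc=20.0, w_inh=20.0, tau_syn=5.0):
    """Simulate one Izhikevich neuron; returns (t, v trace, output spike times in ms)."""
    n = int(T / dt)
    t = np.arange(n) * dt
    v, u, i_syn = c, b * c, 0.0
    v_trace, out_spikes = np.empty(n), []
    exc = set(int(round(ts / dt)) for ts in exc_times)
    inh = set(int(round(ts / dt)) for ts in inh_times)
    for k in range(n):
        # Each input spike kicks an exponentially decaying synaptic current,
        # standing in for the photocurrent pulses from the two photodetectors.
        if k in exc:
            i_syn += w_exc      # excitatory input raises the drive current
        if k in inh:
            i_syn -= w_inh      # inhibitory input lowers it
        i_syn -= dt * i_syn / tau_syn
        # Izhikevich (2003) dynamics: dv/dt = 0.04 v^2 + 5 v + 140 - u + I
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + i_syn)
        u += dt * a * (b * v - u)
        if v >= 30.0:           # threshold crossed: emit an output spike, reset
            out_spikes.append(t[k])
            v, u = c, u + d
        v_trace[k] = v
    return t, v_trace, out_spikes

# Excitatory spikes alone drive output spikes; coincident inhibitory spikes
# cancel the net input current and suppress the output.
_, _, fired = izhikevich_ei(exc_times=[20, 60, 100, 140], inh_times=[])
_, _, quiet = izhikevich_ei(exc_times=[20, 60, 100, 140],
                            inh_times=[20, 60, 100, 140])
print(f"excitatory only: {len(fired)} output spikes, "
      f"with inhibition: {len(quiet)} output spikes")
```

As a side note on the quoted nanoscale figures, a fanout of ~80 is consistent with the 19 dB excess-loss budget, since 10*log10(80) ≈ 19 dB.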
Related papers
- Single Neuromorphic Memristor closely Emulates Multiple Synaptic
Mechanisms for Energy Efficient Neural Networks [71.79257685917058]
We demonstrate memristive nano-devices based on SrTiO3 that inherently emulate all these synaptic functions.
These memristors operate in a non-filamentary, low conductance regime, which enables stable and energy efficient operation.
arXiv Detail & Related papers (2024-02-26T15:01:54Z)
- Learning with Chemical versus Electrical Synapses -- Does it Make a Difference? [61.85704286298537]
Bio-inspired neural networks have the potential to advance our understanding of neural computation and improve the state-of-the-art of AI systems.
We conduct experiments with autonomous lane-keeping through a photorealistic autonomous driving simulator to evaluate their performance under diverse conditions.
arXiv Detail & Related papers (2023-11-21T13:07:20Z)
- The Expressive Leaky Memory Neuron: an Efficient and Expressive Phenomenological Neuron Model Can Solve Long-Horizon Tasks [64.08042492426992]
We introduce the Expressive Leaky Memory (ELM) neuron model, a biologically inspired model of a cortical neuron.
Our ELM neuron can accurately match the input-output behavior of a detailed cortical neuron model with under ten thousand trainable parameters.
We evaluate it on various tasks with demanding temporal structures, including the Long Range Arena (LRA) datasets.
arXiv Detail & Related papers (2023-06-14T13:34:13Z)
- Phenomenological Model of Superconducting Optoelectronic Loop Neurons [0.0]
Superconducting optoelectronic loop neurons are a class of circuits potentially conducive to networks for large-scale artificial cognition.
To date, all simulations of loop neurons have used first-principles circuit analysis to model the behavior of synapses, dendrites, and neurons.
Here we introduce a modeling framework that captures the behavior of the relevant synaptic, dendritic, and neuronal circuits.
arXiv Detail & Related papers (2022-10-18T16:38:35Z)
- Adapting Brain-Like Neural Networks for Modeling Cortical Visual Prostheses [68.96380145211093]
Cortical prostheses are devices implanted in the visual cortex that attempt to restore lost vision by electrically stimulating neurons.
Currently, the vision provided by these devices is limited, and accurately predicting the visual percepts resulting from stimulation is an open challenge.
We propose to address this challenge by utilizing 'brain-like' convolutional neural networks (CNNs), which have emerged as promising models of the visual system.
arXiv Detail & Related papers (2022-09-27T17:33:19Z)
- Artificial optoelectronic spiking neuron based on a resonant tunnelling diode coupled to a vertical cavity surface emitting laser [0.17354071459927545]
Excitable optoelectronic devices represent one of the key building blocks for implementation of artificial spiking neurons.
This work introduces and experimentally investigates an opto-electro-optical (O/E/O) artificial neuron built with a resonant tunnelling diode (RTD) coupled to a vertical-cavity surface-emitting laser (VCSEL).
We demonstrate a well defined excitability threshold, above which this neuron produces 100 ns optical spiking responses with characteristic neural-like refractory period.
arXiv Detail & Related papers (2022-06-22T14:43:03Z)
- Energy-Efficient High-Accuracy Spiking Neural Network Inference Using Time-Domain Neurons [0.18352113484137625]
This paper presents a low-power, highly linear time-domain integrate-and-fire (I&F) neuron circuit.
The proposed neuron leads to a more than 4.3x lower error rate on MNIST inference.
The power consumed by the proposed neuron circuit is simulated to be 0.230 µW per neuron, which is orders of magnitude lower than that of existing voltage-domain neurons.
arXiv Detail & Related papers (2022-02-04T08:24:03Z)
- POPPINS: A Population-Based Digital Spiking Neuromorphic Processor with Integer Quadratic Integrate-and-Fire Neurons [50.591267188664666]
We propose a population-based digital spiking neuromorphic processor in a 180 nm process technology with two hierarchical populations.
The proposed approach enables the development of biomimetic neuromorphic systems and various low-power, low-latency inference processing applications.
arXiv Detail & Related papers (2022-01-19T09:26:34Z)
- Resonant tunnelling diode nano-optoelectronic spiking nodes for neuromorphic information processing [0.0]
We introduce an optoelectronic artificial neuron capable of operating at ultrafast rates and with low energy consumption.
The proposed system combines an excitable resonant tunnelling diode (RTD) element with a nanoscale light source.
arXiv Detail & Related papers (2021-07-14T14:11:04Z)
- A superconducting nanowire spiking element for neural networks [0.0]
Key to the success of large-scale neural networks is a power-efficient spiking element that is scalable and easily interfaced with traditional control electronics.
We present a spiking element fabricated from superconducting nanowires that has pulse energies on the order of 10 aJ.
We demonstrate that the device reproduces essential characteristics of biological neurons, such as a refractory period and a firing threshold.
arXiv Detail & Related papers (2020-07-29T20:48:36Z)
- Flexible Transmitter Network [84.90891046882213]
Current neural networks are mostly built upon the McCulloch-Pitts (MP) model, which usually formulates the neuron as executing an activation function on the real-valued weighted aggregation of signals received from other neurons.
We propose the Flexible Transmitter (FT) model, a novel bio-plausible neuron model with flexible synaptic plasticity.
We present the Flexible Transmitter Network (FTNet), which is built on the most common fully-connected feed-forward architecture.
arXiv Detail & Related papers (2020-04-08T06:55:12Z)