A spiking photonic neural network of 40.000 neurons, trained with rank-order coding for leveraging sparsity
- URL: http://arxiv.org/abs/2411.19209v2
- Date: Wed, 29 Jan 2025 12:04:40 GMT
- Title: A spiking photonic neural network of 40.000 neurons, trained with rank-order coding for leveraging sparsity
- Authors: Ria Talukder, Anas Skalli, Xavier Porte, Simon Thorpe, Daniel Brunner
- Abstract summary: We present a photonic spiking neural network (SNN) comprising 40,000 neurons using off-the-shelf components.
The network achieves 83.5% accuracy on MNIST using 22% of neurons, and 77.5% with 8.5% neuron utilization.
This demonstration integrates photonic nonlinearity, excitability, and sparse computation, paving the way for efficient large-scale photonic neuromorphic systems.
- Abstract: Spiking neural networks (SNNs) are neuromorphic systems that emulate certain aspects of biological neurons, offering potential advantages in energy efficiency and speed by, for example, leveraging sparsity. While CMOS-based electronic SNN hardware has shown promise, scalability and parallelism challenges remain. Photonics provides a promising platform for SNNs due to the speed of excitable photonic devices standing in as neurons and the parallelism and low latency of optical signal conduction. Here, we present a photonic SNN comprising 40,000 neurons using off-the-shelf components, including a spatial light modulator and a CMOS camera, enabling scalable and cost-effective implementations for photonic SNN proof-of-concept studies. The system is governed by a modified Ikeda map, where adding an additional inhibitory feedback forcing introduces excitability akin to biological dynamics. Using latency encoding and sparsity, the network achieves 83.5% accuracy on MNIST using 22% of neurons, and 77.5% with 8.5% neuron utilization. Training is performed via liquid state machine concepts combined with the hardware-compatible SPSA algorithm, marking its first use in photonic neural networks. This demonstration integrates photonic nonlinearity, excitability, and sparse computation, paving the way for efficient large-scale photonic neuromorphic systems.
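Two of the ingredients named in the abstract, latency (rank-order) encoding and SPSA training, can be sketched in software. The sketch below uses the textbook forms of both with illustrative parameters; it is not the paper's hardware implementation, and the 22% active fraction is simply the neuron-utilization figure quoted above reused as a default.

```python
import numpy as np

rng = np.random.default_rng(0)

def latency_encode(x, active_fraction=0.22):
    """Rank-order / latency encoding: stronger inputs spike earlier.
    Only the earliest `active_fraction` of neurons ever fire, mirroring
    the 22% / 8.5% neuron-utilization figures quoted in the abstract."""
    order = np.argsort(-x)                    # strongest input first
    k = max(1, int(active_fraction * x.size))
    latencies = np.full(x.size, np.inf)       # inactive neurons never spike
    latencies[order[:k]] = np.arange(k)       # spike time = rank
    return latencies

def spsa_step(loss, theta, a=0.05, c=0.01):
    """One SPSA update: a gradient estimate from only two loss evaluations
    under a random simultaneous perturbation -- gradient-free, hence
    compatible with hardware whose internals are not differentiable."""
    delta = rng.choice([-1.0, 1.0], size=theta.shape)  # Rademacher vector
    g_hat = (loss(theta + c * delta) - loss(theta - c * delta)) / (2 * c) * delta
    return theta - a * g_hat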
Related papers
- Single Neuromorphic Memristor closely Emulates Multiple Synaptic
Mechanisms for Energy Efficient Neural Networks [71.79257685917058]
We demonstrate memristive nano-devices based on SrTiO3 that inherently emulate all these synaptic functions.
These memristors operate in a non-filamentary, low conductance regime, which enables stable and energy efficient operation.
arXiv Detail & Related papers (2024-02-26T15:01:54Z) - Free-Space Optical Spiking Neural Network [0.0]
We introduce the Free-space Optical deep Spiking Convolutional Neural Network (OSCNN)
This novel approach draws inspiration from computational models of the human eye.
Our results demonstrate promising performance with minimal latency and power consumption compared to their electronic ONN counterparts.
arXiv Detail & Related papers (2023-11-08T09:41:14Z) - SpikingJelly: An open-source machine learning infrastructure platform
for spike-based intelligence [51.6943465041708]
Spiking neural networks (SNNs) aim to realize brain-inspired intelligence on neuromorphic chips with high energy efficiency.
We contribute a full-stack toolkit for pre-processing neuromorphic datasets, building deep SNNs, optimizing their parameters, and deploying SNNs on neuromorphic chips.
arXiv Detail & Related papers (2023-10-25T13:15:17Z) - Spatially Varying Nanophotonic Neural Networks [39.1303097259564]
Photonic processors that execute operations using photons instead of electrons promise to enable optical neural networks with ultra-low latency and power consumption.
Existing optical neural networks, limited by the underlying network designs, have achieved image recognition accuracy far below that of state-of-the-art electronic neural networks.
arXiv Detail & Related papers (2023-08-07T08:48:46Z) - Sharing Leaky-Integrate-and-Fire Neurons for Memory-Efficient Spiking
Neural Networks [9.585985556876537]
Non-linear activation of Leaky-Integrate-and-Fire (LIF) neurons requires additional memory to store a membrane voltage to capture the temporal dynamics of spikes.
We propose a simple and effective solution, EfficientLIF-Net, which shares the LIF neurons across different layers and channels.
Our EfficientLIF-Net achieves comparable accuracy with the standard SNNs while bringing up to 4.3X forward memory efficiency and 21.9X backward memory efficiency for LIF neurons.
arXiv Detail & Related papers (2023-05-26T22:55:26Z) - A Hybrid Neural Coding Approach for Pattern Recognition with Spiking
Neural Networks [53.31941519245432]
Brain-inspired spiking neural networks (SNNs) have demonstrated promising capabilities in solving pattern recognition tasks.
These SNNs are grounded on homogeneous neurons that utilize a uniform neural coding for information representation.
In this study, we argue that SNN architectures should be holistically designed to incorporate heterogeneous coding schemes.
arXiv Detail & Related papers (2023-05-26T02:52:12Z) - A Resource-efficient Spiking Neural Network Accelerator Supporting
Emerging Neural Encoding [6.047137174639418]
Spiking neural networks (SNNs) recently gained momentum due to their low-power multiplication-free computing.
SNNs require very long spike trains (up to 1000) to reach an accuracy similar to their artificial neural network (ANN) counterparts for large models.
We present a novel hardware architecture that can efficiently support SNN with emerging neural encoding.
arXiv Detail & Related papers (2022-06-06T10:56:25Z) - Single-Shot Optical Neural Network [55.41644538483948]
'Weight-stationary' analog optical and electronic hardware has been proposed to reduce the compute resources required by deep neural networks.
We present a scalable, single-shot-per-layer weight-stationary optical processor.
arXiv Detail & Related papers (2022-05-18T17:49:49Z) - Event-based Video Reconstruction via Potential-assisted Spiking Neural
Network [48.88510552931186]
Bio-inspired neural networks can potentially lead to greater computational efficiency on event-driven hardware.
We propose a novel Event-based Video reconstruction framework based on a fully Spiking Neural Network (EVSNN)
We find that the spiking neurons have the potential to store useful temporal information (memory) to complete such time-dependent tasks.
arXiv Detail & Related papers (2022-01-25T02:05:20Z) - POPPINS : A Population-Based Digital Spiking Neuromorphic Processor with
Integer Quadratic Integrate-and-Fire Neurons [50.591267188664666]
We propose a population-based digital spiking neuromorphic processor in 180nm process technology with two hierarchy populations.
The proposed approach enables the developments of biomimetic neuromorphic system and various low-power, and low-latency inference processing applications.
arXiv Detail & Related papers (2022-01-19T09:26:34Z) - Photonic neural field on a silicon chip: large-scale, high-speed
neuro-inspired computing and sensing [0.0]
Photonic neural networks have significant potential for high-speed neural processing with low latency and ultralow energy consumption.
We propose the concept of a photonic neural field and implement it experimentally on a silicon chip to realize highly scalable neuro-inspired computing.
In this study, we use the on-chip photonic neural field as a reservoir of information and demonstrate a high-speed chaotic time-series prediction with low errors.
arXiv Detail & Related papers (2021-05-22T09:28:51Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this site (including all information) and is not responsible for any consequences.