Hybrid Synaptic Structure for Spiking Neural Network Realization
- URL: http://arxiv.org/abs/2311.07787v1
- Date: Mon, 13 Nov 2023 22:42:07 GMT
- Title: Hybrid Synaptic Structure for Spiking Neural Network Realization
- Authors: Sasan Razmkhah, Mustafa Altay Karamuftuoglu and Ali Bozbey
- Abstract summary: This paper introduces a compact SFQ-based synapse design that applies positive and negative weighted inputs to the JJ-Soma.
The JJ-Synapse can operate at ultra-high frequencies, exhibits orders of magnitude lower power consumption than CMOS counterparts, and can be conveniently fabricated using commercial Nb processes.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neural networks and neuromorphic computing play pivotal roles in deep
learning and machine vision. Due to their dissipative nature and inherent
limitations, traditional semiconductor-based circuits face challenges in
realizing ultra-fast and low-power neural networks. However, the spiking
behavior characteristic of single flux quantum (SFQ) circuits positions them as
promising candidates for spiking neural networks (SNNs). Our previous work
showcased a JJ-Soma design capable of operating at tens of gigahertz while
consuming only a fraction of the power compared to traditional circuits, as
documented in [1]. This paper introduces a compact SFQ-based synapse design
that applies positive and negative weighted inputs to the JJ-Soma. Using an
RSFQ synapse empowers us to replicate the functionality of a biological neuron,
a crucial step in realizing a complete SNN. The JJ-Synapse can operate at
ultra-high frequencies, exhibits orders of magnitude lower power consumption
than CMOS counterparts, and can be conveniently fabricated using commercial Nb
processes. Furthermore, the network's flexibility enables modifications by
incorporating cryo-CMOS circuits for weight value adjustments. In our endeavor,
we have successfully designed, fabricated, and partially tested the JJ-Synapse
within our cryocooler system. Integration with the JJ-Soma further facilitates
the realization of a high-speed inference SNN.
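A rough behavioral sketch of the architecture described in the abstract (not the superconducting circuit itself): the JJ-Synapse is abstracted as a signed weight applied to incoming pulse events, and the JJ-Soma as a leaky accumulator that emits a pulse once a threshold is crossed. All names and parameter values below are illustrative assumptions.

```python
# Behavioral sketch only: positive/negative weighted synapses feeding a
# threshold-and-fire soma, abstracting the JJ-Synapse / JJ-Soma pair.
# All parameters are illustrative, not taken from the fabricated circuit.
import numpy as np

rng = np.random.default_rng(0)

n_inputs, n_steps = 8, 100
# Signed synaptic weights: excitatory (+) and inhibitory (-) inputs.
weights = rng.choice([-1.0, 1.0], size=n_inputs) * rng.uniform(0.2, 1.0, n_inputs)
threshold = 2.0          # soma firing threshold (arbitrary units)
leak = 0.95              # per-step decay of the soma's internal state

state = 0.0
output_spikes = []
for t in range(n_steps):
    in_spikes = rng.random(n_inputs) < 0.3              # incoming pulse events
    state = leak * state + np.dot(weights, in_spikes)   # weighted accumulation
    if state >= threshold:                              # soma fires and resets
        output_spikes.append(t)
        state = 0.0

print(f"soma fired at steps: {output_spikes}")
```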
Related papers
- Predicting Chaotic Systems with Quantum Echo-state Networks [1.349950008899546]
We present and examine a quantum circuit (QC) that implements and aims to improve upon the classical echo-state network (ESN).
Quantum echo-state networks (QESNs) aim to reduce the need for prohibitively large reservoirs by leveraging the unique capabilities of quantum computers.
We conducted simulated QC experiments on the chaotic Lorenz system, both with noisy and noiseless models, to demonstrate the circuit's performance.
arXiv Detail & Related papers (2024-12-10T20:39:16Z)
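For intuition about the reservoir-computing baseline this entry builds on, a minimal classical echo-state network with a ridge-regression readout can predict the Lorenz system one step ahead. The quantum circuit itself is not reproduced here, and every size and hyperparameter below is an illustrative assumption.

```python
# Minimal classical ESN sketch on the Lorenz system (one-step-ahead prediction).
# Illustrative baseline only; the QESN's quantum circuit is not reproduced.
import numpy as np

rng = np.random.default_rng(1)

def lorenz(n, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Generate a Lorenz trajectory with simple Euler integration (illustrative)."""
    xs = np.empty((n, 3))
    x = np.array([1.0, 1.0, 1.0])
    for i in range(n):
        dx = np.array([sigma * (x[1] - x[0]),
                       x[0] * (rho - x[2]) - x[1],
                       x[0] * x[1] - beta * x[2]])
        x = x + dt * dx
        xs[i] = x
    return xs

data = lorenz(3000)
train_in, train_out = data[:-1], data[1:]        # one-step-ahead targets

# Random reservoir, spectral radius scaled below 1 (echo-state property).
n_res = 300
W_in = rng.uniform(-0.5, 0.5, (n_res, 3))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

states = np.zeros((len(train_in), n_res))
s = np.zeros(n_res)
for i, u in enumerate(train_in):
    s = np.tanh(W_in @ u + W @ s)
    states[i] = s

# Linear readout trained by ridge regression.
ridge = 1e-6
W_out = np.linalg.solve(states.T @ states + ridge * np.eye(n_res),
                        states.T @ train_out)
pred = states @ W_out
print("train MSE:", np.mean((pred - train_out) ** 2))
```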
- Hardware-Friendly Implementation of Physical Reservoir Computing with CMOS-based Time-domain Analog Spiking Neurons [0.26963330643873434]
This paper introduces a spiking neural network (SNN) for hardware-friendly physical reservoir computing (RC) on a complementary metal-oxide-semiconductor (CMOS) platform.
We demonstrate RC through short-term memory and exclusive OR tasks, and the spoken digit recognition task with an accuracy of 97.7%.
arXiv Detail & Related papers (2024-09-18T00:23:00Z)
- Contrastive Learning in Memristor-based Neuromorphic Systems [55.11642177631929]
Spiking neural networks have become an important family of neuron-based models that sidestep many of the key limitations facing modern-day backpropagation-trained deep networks.
In this work, we design and investigate a proof-of-concept instantiation of contrastive-signal-dependent plasticity (CSDP), a neuromorphic form of forward-forward-based, backpropagation-free learning.
arXiv Detail & Related papers (2024-09-17T04:48:45Z)
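To make the forward-forward-based, backpropagation-free idea concrete, the sketch below applies a generic forward-forward-style local update in which a layer's "goodness" (sum of squared activations) is raised for positive samples and lowered for negative ones. This is an illustrative stand-in with synthetic data, not the exact CSDP rule from the paper.

```python
# Generic forward-forward-style local learning: per-layer goodness is pushed up
# for positive samples and down for negative samples, with no backpropagated
# error. Illustrative stand-in, not the paper's CSDP rule; data are synthetic.
import numpy as np

rng = np.random.default_rng(2)
dim_in, dim_hidden, lr, theta = 16, 32, 0.01, 2.0
W = rng.normal(0.0, 0.1, (dim_hidden, dim_in))

for step in range(200):
    x_pos = rng.normal(1.0, 0.5, dim_in)    # stand-in "positive" input
    x_neg = rng.normal(-1.0, 0.5, dim_in)   # stand-in "negative" input
    for x, is_pos in ((x_pos, True), (x_neg, False)):
        h = np.maximum(0.0, W @ x)                            # ReLU layer
        p = 1.0 / (1.0 + np.exp(-(np.sum(h ** 2) - theta)))   # P(sample is positive)
        coeff = (1.0 - p) if is_pos else -p                   # raise or lower goodness
        W += lr * coeff * 2.0 * np.outer(h, x)                # purely local update

g_pos = np.sum(np.maximum(0.0, W @ rng.normal(1.0, 0.5, dim_in)) ** 2)
g_neg = np.sum(np.maximum(0.0, W @ rng.normal(-1.0, 0.5, dim_in)) ** 2)
print(f"goodness on positive-like input: {g_pos:.2f}, negative-like: {g_neg:.2f}")
```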
- Hybrid Spiking Neural Networks for Low-Power Intra-Cortical Brain-Machine Interfaces [42.72938925647165]
Intra-cortical brain-machine interfaces (iBMIs) have the potential to dramatically improve the lives of people with paraplegia.
Current iBMIs suffer from scalability and mobility limitations due to bulky hardware and wiring.
We are investigating hybrid spiking neural networks for embedded neural decoding in wireless iBMIs.
arXiv Detail & Related papers (2024-09-06T17:48:44Z)
- Single Neuromorphic Memristor closely Emulates Multiple Synaptic Mechanisms for Energy Efficient Neural Networks [71.79257685917058]
We demonstrate memristive nano-devices based on SrTiO3 that inherently emulate all these synaptic functions.
These memristors operate in a non-filamentary, low conductance regime, which enables stable and energy efficient operation.
arXiv Detail & Related papers (2024-02-26T15:01:54Z)
- Deep Neuromorphic Networks with Superconducting Single Flux Quanta [45.60688252288563]
Neuromorphic circuits are a promising approach to computing where techniques used by the brain to achieve high efficiency are exploited.
Many existing neuromorphic circuits rely on unconventional and useful properties of novel technologies to better mimic the operation of the brain.
One such technology is single flux quantum (SFQ) logic -- a cryogenic superconductive technology in which the data are represented by quanta of magnetic flux (fluxons).
The movement of a fluxon within a circuit produces a quantized voltage pulse (SFQ pulse), resembling a neuronal spiking event.
arXiv Detail & Related papers (2023-09-21T10:44:02Z)
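The quantized voltage pulse mentioned in this entry has a fixed time-integral of one flux quantum, Phi_0 = h/(2e) ≈ 2.07 mV·ps. The short check below uses an illustrative Gaussian pulse shape of that area; the real pulse shape depends on the junction and circuit parameters.

```python
# The area (time-integral) of an SFQ pulse equals one flux quantum
# Phi_0 = h / (2e) ~ 2.07e-15 V*s, regardless of its exact shape.
# Quick numerical check with an illustrative ~1 ps Gaussian pulse of that area.
import numpy as np

h, e = 6.62607015e-34, 1.602176634e-19
phi0 = h / (2.0 * e)                         # flux quantum in V*s

t = np.linspace(-5e-12, 5e-12, 2001)         # 10 ps window
sigma = 1e-12                                # ~1 ps pulse width (assumed)
v = phi0 / (sigma * np.sqrt(2.0 * np.pi)) * np.exp(-t**2 / (2.0 * sigma**2))

area = np.sum(v) * (t[1] - t[0])             # numerical integral of V(t) dt
print(f"pulse area = {area:.3e} V*s, flux quantum = {phi0:.3e} V*s")
print(f"peak voltage ~ {v.max() * 1e3:.2f} mV")
```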
- A Hybrid Neural Coding Approach for Pattern Recognition with Spiking Neural Networks [53.31941519245432]
Brain-inspired spiking neural networks (SNNs) have demonstrated promising capabilities in solving pattern recognition tasks.
These SNNs are grounded on homogeneous neurons that utilize a uniform neural coding for information representation.
In this study, we argue that SNN architectures should be holistically designed to incorporate heterogeneous coding schemes.
arXiv Detail & Related papers (2023-05-26T02:52:12Z)
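As an illustration of what heterogeneous coding schemes can mean in practice, the sketch below encodes the same scalar with rate coding and with latency (time-to-first-spike) coding; the paper's actual per-layer coding assignment is not reproduced here.

```python
# Two common spike-coding schemes that a heterogeneous SNN could mix:
# rate coding (value -> spike probability per step) and latency coding
# (value -> time of a single spike). Illustrative only.
import numpy as np

rng = np.random.default_rng(3)
T = 20                                  # simulation window in time steps

def rate_encode(value, T):
    """Bernoulli spike at each step with probability = value (value in [0, 1])."""
    return (rng.random(T) < value).astype(int)

def latency_encode(value, T):
    """A single spike: larger values spike earlier (time-to-first-spike code)."""
    train = np.zeros(T, dtype=int)
    train[int(round((1.0 - value) * (T - 1)))] = 1
    return train

x = 0.8
print("rate   :", rate_encode(x, T))     # ~80% of steps carry a spike
print("latency:", latency_encode(x, T))  # one spike near the start of the window
```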
- A Resource-efficient Spiking Neural Network Accelerator Supporting Emerging Neural Encoding [6.047137174639418]
Spiking neural networks (SNNs) recently gained momentum due to their low-power multiplication-free computing.
SNNs require very long spike trains (up to 1000 time steps) to reach an accuracy similar to their artificial neural network (ANN) counterparts for large models.
We present a novel hardware architecture that can efficiently support SNN with emerging neural encoding.
arXiv Detail & Related papers (2022-06-06T10:56:25Z)
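A small numerical illustration of why such long spike trains are needed under rate coding: a value recovered from a spike count has an error that shrinks only as roughly 1/sqrt(T), so matching high-precision ANN activations can require hundreds of time steps. This is a generic argument, not the accelerator's own encoding scheme.

```python
# Precision/latency trade-off of rate coding: the error of a value estimated
# from spike counts shrinks roughly as 1/sqrt(T). Illustrative values only.
import numpy as np

rng = np.random.default_rng(4)
true_value = 0.37

for T in (10, 100, 1000):
    # 200 independent spike trains of length T; each row yields a rate estimate.
    estimates = (rng.random((200, T)) < true_value).mean(axis=1)
    print(f"T={T:4d}  mean abs error = {np.abs(estimates - true_value).mean():.4f}")
```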
- Event-based Video Reconstruction via Potential-assisted Spiking Neural Network [48.88510552931186]
Bio-inspired neural networks can potentially lead to greater computational efficiency on event-driven hardware.
We propose a novel Event-based Video reconstruction framework based on a fully Spiking Neural Network (EVSNN).
We find that the spiking neurons have the potential to store useful temporal information (memory) to complete such time-dependent tasks.
arXiv Detail & Related papers (2022-01-25T02:05:20Z)
- Flexible Transmitter Network [84.90891046882213]
Current neural networks are mostly built upon the MP (McCulloch-Pitts) model, which usually formulates the neuron as executing an activation function on the real-valued weighted aggregation of signals received from other neurons.
We propose the Flexible Transmitter (FT) model, a novel bio-plausible neuron model with flexible synaptic plasticity.
We present the Flexible Transmitter Network (FTNet), which is built on the most common fully-connected feed-forward architecture.
arXiv Detail & Related papers (2020-04-08T06:55:12Z)
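For reference, the MP-model neuron described in this entry is just an activation function applied to a real-valued weighted sum of inputs, as in the minimal sketch below; the FT model's richer, plasticity-aware synapse is not reproduced here.

```python
# Minimal MP-style neuron: activation of a real-valued weighted aggregation.
# Illustrative only; the FT model generalizes this synapse.
import numpy as np

def mp_neuron(x, w, b):
    """Activation function applied to the weighted sum of incoming signals."""
    return np.tanh(np.dot(w, x) + b)

x = np.array([0.5, -1.0, 2.0])
w = np.array([0.3, 0.8, -0.2])
print(mp_neuron(x, w, b=0.1))
```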
- LogicNets: Co-Designed Neural Networks and Circuits for Extreme-Throughput Applications [6.9276012494882835]
We present a novel method for designing neural network topologies that directly map to a highly efficient FPGA implementation.
We show that the combination of sparsity and low-bit activation quantization results in high-speed circuits with small logic depth and low LUT cost.
arXiv Detail & Related papers (2020-04-06T22:15:41Z)
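A back-of-envelope illustration of why the sparsity and low-bit combination matters once neurons are lowered to lookup tables: a neuron with fan-in F and b-bit inputs has 2^(F*b) possible input patterns, so truth-table (and hence LUT) cost grows exponentially unless both stay small. The figures below are illustrative, not taken from the paper.

```python
# Truth-table size for a neuron lowered to logic: 2**(fan_in * bits) input
# patterns. Sparsity bounds fan_in; quantization bounds bits. Illustrative.
for fan_in in (3, 7):
    for bits in (1, 2, 4):
        entries = 2 ** (fan_in * bits)
        print(f"fan-in={fan_in}, {bits}-bit activations -> {entries} truth-table rows")
```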