SuperNeuro: A Fast and Scalable Simulator for Neuromorphic Computing
- URL: http://arxiv.org/abs/2305.02510v1
- Date: Thu, 4 May 2023 02:43:01 GMT
- Title: SuperNeuro: A Fast and Scalable Simulator for Neuromorphic Computing
- Authors: Prasanna Date, Chathika Gunaratne, Shruti Kulkarni, Robert Patton,
Mark Coletti, Thomas Potok
- Abstract summary: SuperNeuro is a fast and scalable simulator for neuromorphic computing.
We demonstrate that SuperNeuro can be approximately 10--300 times faster than some of the other simulators for small sparse networks.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In many neuromorphic workflows, simulators play a vital role for important
tasks such as training spiking neural networks (SNNs), running neuroscience
simulations, and designing, implementing and testing neuromorphic algorithms.
Currently available simulators are catered to either neuroscience workflows
(such as NEST and Brian2) or deep learning workflows (such as BindsNET). While
the neuroscience-based simulators are slow and not very scalable, the deep
learning-based simulators do not support certain functionalities such as
synaptic delay that are typical of neuromorphic workloads. In this paper, we
address this gap in the literature and present SuperNeuro, which is a fast and
scalable simulator for neuromorphic computing, capable of both homogeneous and
heterogeneous simulations as well as GPU acceleration. We also present
preliminary results comparing SuperNeuro to widely used neuromorphic simulators
such as NEST, Brian2 and BindsNET in terms of computation times. We demonstrate
that SuperNeuro can be approximately 10--300 times faster than some of the
other simulators for small sparse networks. On large sparse and large dense
networks, SuperNeuro can be approximately 2.2 and 3.4 times faster than the
other simulators respectively.
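To make the synaptic-delay functionality concrete, here is a minimal sketch of a matrix-based leaky integrate-and-fire (LIF) simulation with per-synapse delays, the kind of feature the abstract says deep-learning-based simulators lack. All names, weights, and parameters below are illustrative assumptions, not SuperNeuro's actual API.

```python
import numpy as np

# Illustrative sketch, NOT SuperNeuro's API: LIF neurons with per-synapse
# delays implemented via a circular buffer of in-flight spike contributions.
n, steps, max_delay = 3, 20, 4
leak, thresh = 0.9, 1.0
W = np.array([[0.0, 0.6, 0.0],       # synaptic weights W[pre, post]
              [0.0, 0.0, 0.8],
              [0.5, 0.0, 0.0]])
D = np.array([[0, 1, 0],             # synaptic delays in time steps (>= 1)
              [0, 0, 3],
              [2, 0, 0]])
v = np.zeros(n)                      # membrane potentials
buf = np.zeros((max_delay + 1, n))   # circular buffer of delayed inputs
I_ext = np.zeros((steps, n))
I_ext[:, 0] = 0.5                    # constant external drive to neuron 0
spike_count = np.zeros(n, dtype=int)

for t in range(steps):
    slot = t % (max_delay + 1)
    v = leak * v + buf[slot] + I_ext[t]   # leak + delayed input + drive
    buf[slot] = 0.0
    spikes = v >= thresh
    v[spikes] = 0.0                       # reset neurons that spiked
    spike_count += spikes
    for pre in np.flatnonzero(spikes):    # schedule delayed deliveries
        for post in np.flatnonzero(W[pre]):
            buf[(t + D[pre, post]) % (max_delay + 1), post] += W[pre, post]

print(spike_count)
```

A synaptic delay of d steps simply deposits the weight d slots ahead in the buffer, so the matrix update itself stays fully vectorized per time step.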
Related papers
- Neurosim: A Fast Simulator for Neuromorphic Robot Perception [18.380205726829356]
Neurosim is a high-performance library for simulating sensors. It can achieve high frame rates on a desktop GPU. It integrates with a ZeroMQ-based communication library called Cortex.
arXiv Detail & Related papers (2026-02-16T18:57:04Z)
- Channel-wise Parallelizable Spiking Neuron with Multiplication-free Dynamics and Large Temporal Receptive Fields [32.349167886062105]
Spiking Neural Networks (SNNs) are distinguished from Artificial Neural Networks (ANNs) for their sophisticated neuronal dynamics and sparse binary activations (spikes) inspired by the biological neural system.
Traditional neuron models use iterative step-by-step dynamics, resulting in serial computation and slow training speed of SNNs.
Recent parallelizable spiking neuron models have been proposed to fully utilize the massive parallel computing ability of graphics processing units to accelerate the training of SNNs.
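The serial-vs-parallel point above can be illustrated with a toy example (this is a generic sketch, not the cited paper's neuron model): the subthreshold LIF recurrence v[t] = a*v[t-1] + x[t] is a linear scan, so with spike reset omitted it can be evaluated in parallel via one cumulative sum rather than a step-by-step time loop.

```python
import numpy as np

# Toy illustration (not the cited paper's model): the subthreshold LIF
# recurrence v[t] = a*v[t-1] + x[t] has the closed form
#   v[t] = a**t * sum_{k<=t} a**(-k) * x[k],
# so a cumulative sum replaces the serial time loop. Spike generation and
# reset are omitted; a**(-k) is numerically safe for small T.
rng = np.random.default_rng(0)
T, a = 16, 0.9
x = rng.random(T)

# Serial reference: iterative step-by-step dynamics.
v_serial = np.empty(T)
v = 0.0
for t in range(T):
    v = a * v + x[t]
    v_serial[t] = v

# Parallelizable form: one vectorized cumulative sum.
pows = a ** np.arange(T)
v_parallel = pows * np.cumsum(x / pows)

assert np.allclose(v_serial, v_parallel)
```

Real parallelizable neuron models handle the nonlinear spike/reset coupling as well, but this shows why linear membrane dynamics map so well onto GPU-friendly scan primitives.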
arXiv Detail & Related papers (2025-01-24T13:44:08Z)
- An Integrated Toolbox for Creating Neuromorphic Edge Applications [3.671692919685993]
Spiking Neural Networks (SNNs) and neuromorphic models offer greater efficiency and biological realism than conventional artificial neural networks.
CARLsim++ is an integrated toolbox that enables fast and easy creation of neuromorphic applications.
arXiv Detail & Related papers (2024-04-12T16:34:55Z)
- Single Neuromorphic Memristor closely Emulates Multiple Synaptic Mechanisms for Energy Efficient Neural Networks [71.79257685917058]
We demonstrate memristive nano-devices based on SrTiO3 that inherently emulate all these synaptic functions.
These memristors operate in a non-filamentary, low conductance regime, which enables stable and energy efficient operation.
arXiv Detail & Related papers (2024-02-26T15:01:54Z)
- SparseProp: Efficient Event-Based Simulation and Training of Sparse Recurrent Spiking Neural Networks [4.532517021515834]
Spiking Neural Networks (SNNs) are biologically-inspired models that are capable of processing information in streams of action potentials.
We introduce SparseProp, a novel event-based algorithm for simulating and training sparse SNNs.
arXiv Detail & Related papers (2023-12-28T18:48:10Z)
- Addressing the speed-accuracy simulation trade-off for adaptive spiking neurons [0.0]
We present an algorithmic reinterpretation of the adaptive integrate-and-fire (ALIF) model.
We obtain over a $50\times$ training speedup using small simulation time steps (DTs) on synthetic benchmarks.
We also showcase how our model makes it possible to quickly and accurately fit real electrophysiological recordings of cortical neurons.
arXiv Detail & Related papers (2023-11-19T18:21:45Z)
- NeuralClothSim: Neural Deformation Fields Meet the Thin Shell Theory [70.10550467873499]
We propose NeuralClothSim, a new quasistatic cloth simulator using thin shells.
Our memory-efficient solver operates on a new continuous coordinate-based surface representation called neural deformation fields.
arXiv Detail & Related papers (2023-08-24T17:59:54Z)
- Intelligence Processing Units Accelerate Neuromorphic Learning [52.952192990802345]
Spiking neural networks (SNNs) have achieved orders of magnitude improvement in terms of energy consumption and latency.
We present an IPU-optimized release of our custom SNN Python package, snnTorch.
arXiv Detail & Related papers (2022-11-19T15:44:08Z)
- Mapping and Validating a Point Neuron Model on Intel's Neuromorphic Hardware Loihi [77.34726150561087]
We investigate the potential of Intel's fifth-generation neuromorphic chip, Loihi.
Loihi is based on the novel idea of Spiking Neural Networks (SNNs) emulating the neurons in the brain.
We find that Loihi replicates classical simulations very efficiently and scales notably well in terms of both time and energy performance as the networks get larger.
arXiv Detail & Related papers (2021-09-22T16:52:51Z)
- Neuroevolution of a Recurrent Neural Network for Spatial and Working Memory in a Simulated Robotic Environment [57.91534223695695]
We evolved weights in a biologically plausible recurrent neural network (RNN) using an evolutionary algorithm to replicate the behavior and neural activity observed in rats.
Our method demonstrates how the dynamic activity in evolved RNNs can capture interesting and complex cognitive behavior.
arXiv Detail & Related papers (2021-02-25T02:13:52Z)
- Adaptive Neural Network-Based Approximation to Accelerate Eulerian Fluid Simulation [9.576796509480445]
We introduce Smartfluidnet, a framework that automates model generation and application.
Smartfluidnet generates multiple neural networks before the simulation to meet the execution time and simulation quality requirement.
We show that Smartfluidnet achieves 1.46x and 590x speedup compared with a state-of-the-art neural network model and the original fluid simulation respectively.
arXiv Detail & Related papers (2020-08-26T21:44:44Z)
- Non-linear Neurons with Human-like Apical Dendrite Activations [81.18416067005538]
We show that a standard neuron followed by our novel apical dendrite activation (ADA) can learn the XOR logical function with 100% accuracy.
We conduct experiments on six benchmark data sets from computer vision, signal processing and natural language processing.
arXiv Detail & Related papers (2020-02-02T21:09:39Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.