Spiking Hyperdimensional Network: Neuromorphic Models Integrated with
Memory-Inspired Framework
- URL: http://arxiv.org/abs/2110.00214v1
- Date: Fri, 1 Oct 2021 05:01:21 GMT
- Title: Spiking Hyperdimensional Network: Neuromorphic Models Integrated with
Memory-Inspired Framework
- Authors: Zhuowen Zou, Haleh Alimohamadi, Farhad Imani, Yeseong Kim, Mohsen
Imani
- Abstract summary: We propose SpikeHD, the first framework that fundamentally combines spiking neural networks and hyperdimensional computing.
SpikeHD exploits spiking neural networks to extract low-level features by preserving the spatial and temporal correlation of raw event-based spike data.
Our evaluation on a set of benchmark classification problems shows that SpikeHD provides several benefits compared to a pure SNN architecture.
- Score: 8.910420030964172
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recently, brain-inspired computing models have shown great potential to
outperform today's deep learning solutions in terms of robustness and energy
efficiency. Particularly, Spiking Neural Networks (SNNs) and HyperDimensional
Computing (HDC) have shown promising results in enabling efficient and robust
cognitive learning. Despite the success, these two brain-inspired models have
different strengths. While SNN mimics the physical properties of the human
brain, HDC models the brain on a more abstract and functional level. Their
design philosophies demonstrate complementary patterns that motivate their
combination. Guided by the classical psychological model of memory, we
propose SpikeHD, the first framework that fundamentally combines spiking neural
networks and hyperdimensional computing. SpikeHD yields a scalable and robust
cognitive learning system that better mimics brain functionality. SpikeHD
exploits spiking neural networks to extract low-level features by preserving
the spatial and temporal correlation of raw event-based spike data. Then, it
utilizes HDC to operate over SNN output by mapping the signal into
high-dimensional space, learning the abstract information, and classifying the
data. Our extensive evaluation on a set of benchmark classification problems
shows that SpikeHD provides the following benefits compared to a pure SNN
architecture: (1) it significantly enhances learning capability by exploiting
two-stage information processing, (2) it is substantially more robust to noise
and failure, and (3) it reduces the network size and the parameters required to
learn complex information.
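The two-stage pipeline described above (an SNN front-end feeding an HDC back-end) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the feature size, the hypervector dimensionality `D`, the random bipolar projection, and the cosine-similarity classifier are all assumed choices, and the SNN front-end is stood in for by a plain feature vector.

```python
import numpy as np

rng = np.random.default_rng(0)

D = 10_000          # hypervector dimensionality (illustrative)
N_FEATURES = 128    # size of the hypothetical SNN feature vector

# Fixed random bipolar projection: one base hypervector per SNN output neuron.
projection = rng.choice([-1.0, 1.0], size=(N_FEATURES, D))

def encode(snn_features):
    """Map an SNN feature vector into high-dimensional space."""
    return np.sign(snn_features @ projection)

def train(features, labels, n_classes):
    """Bundle (sum) encoded samples into one prototype hypervector per class."""
    prototypes = np.zeros((n_classes, D))
    for x, y in zip(features, labels):
        prototypes[y] += encode(x)
    return prototypes

def classify(prototypes, x):
    """Predict the class whose prototype is most cosine-similar to the query."""
    h = encode(x)
    sims = prototypes @ h / (
        np.linalg.norm(prototypes, axis=1) * np.linalg.norm(h) + 1e-9
    )
    return int(np.argmax(sims))
```

Bundling by summation and nearest-prototype search are the standard HDC training and inference primitives; the random projection plays the role the abstract assigns to "mapping the signal into high-dimensional space".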
Related papers
- Unsupervised representation learning with Hebbian synaptic and structural plasticity in brain-like feedforward neural networks [0.0]
We introduce and evaluate a brain-like neural network model capable of unsupervised representation learning.
The model was tested on a diverse set of popular machine learning benchmarks.
arXiv Detail & Related papers (2024-06-07T08:32:30Z)
- Spiking representation learning for associative memories [0.0]
We introduce a novel artificial spiking neural network (SNN) that performs unsupervised representation learning and associative memory operations.
The architecture of our model derives from the neocortical columnar organization and combines feedforward projections for learning hidden representations and recurrent projections for forming associative memories.
arXiv Detail & Related papers (2024-06-05T08:30:11Z)
- SpikingJelly: An open-source machine learning infrastructure platform for spike-based intelligence [51.6943465041708]
Spiking neural networks (SNNs) aim to realize brain-inspired intelligence on neuromorphic chips with high energy efficiency.
We contribute a full-stack toolkit for pre-processing neuromorphic datasets, building deep SNNs, optimizing their parameters, and deploying SNNs on neuromorphic chips.
arXiv Detail & Related papers (2023-10-25T13:15:17Z)
- To Spike or Not To Spike: A Digital Hardware Perspective on Deep Learning Acceleration [4.712922151067433]
As deep learning models scale, they become increasingly competitive across domains spanning computer vision to natural language processing.
Yet the power efficiency of the biological brain outperforms any large-scale deep learning (DL) model.
Neuromorphic computing tries to mimic brain operations to improve the efficiency of DL models.
arXiv Detail & Related papers (2023-06-27T19:04:00Z)
- A Hybrid Neural Coding Approach for Pattern Recognition with Spiking Neural Networks [53.31941519245432]
Brain-inspired spiking neural networks (SNNs) have demonstrated promising capabilities in solving pattern recognition tasks.
These SNNs are grounded in homogeneous neurons that utilize a uniform neural coding for information representation.
In this study, we argue that SNN architectures should be holistically designed to incorporate heterogeneous coding schemes.
arXiv Detail & Related papers (2023-05-26T02:52:12Z)
- Exploiting High Performance Spiking Neural Networks with Efficient Spiking Patterns [4.8416725611508244]
Spiking Neural Networks (SNNs) use discrete spike sequences to transmit information, which significantly mimics the information transmission of the brain.
This paper introduces the dynamic Burst pattern and designs the Leaky Integrate and Fire or Burst (LIFB) neuron that can make a trade-off between short-time performance and dynamic temporal performance.
arXiv Detail & Related papers (2023-01-29T04:22:07Z)
- Hyperdimensional Computing vs. Neural Networks: Comparing Architecture and Learning Process [3.244375684001034]
We present a comparative study of HDC and neural networks, offering a different angle: an HDC model can be derived from an extremely compact neural network trained upfront.
Experimental results show that such a neural-network-derived HDC model can achieve up to 21% and 5% accuracy improvements over conventional and learning-based HDC models, respectively.
arXiv Detail & Related papers (2022-07-24T21:23:50Z)
- Training High-Performance Low-Latency Spiking Neural Networks by Differentiation on Spike Representation [70.75043144299168]
Spiking Neural Network (SNN) is a promising energy-efficient AI model when implemented on neuromorphic hardware.
Efficiently training SNNs is challenging due to their non-differentiability.
We propose the Differentiation on Spike Representation (DSR) method, which achieves high performance with low latency.
arXiv Detail & Related papers (2022-05-01T12:44:49Z)
- CogNGen: Constructing the Kernel of a Hyperdimensional Predictive Processing Cognitive Architecture [79.07468367923619]
We present a new cognitive architecture that combines two neurobiologically plausible, computational models.
We aim to develop a cognitive architecture that has the power of modern machine learning techniques.
arXiv Detail & Related papers (2022-03-31T04:44:28Z)
- Deep Reinforcement Learning with Spiking Q-learning [51.386945803485084]
Spiking neural networks (SNNs) are expected to realize artificial intelligence (AI) with less energy consumption.
Combining SNNs with deep reinforcement learning (RL) provides a promising energy-efficient approach to realistic control tasks.
arXiv Detail & Related papers (2022-01-21T16:42:11Z)
- Mapping and Validating a Point Neuron Model on Intel's Neuromorphic Hardware Loihi [77.34726150561087]
We investigate the potential of Intel's fifth-generation neuromorphic chip, Loihi.
Loihi is based on the novel idea of Spiking Neural Networks (SNNs) emulating the neurons in the brain.
We find that Loihi replicates classical simulations very efficiently and scales notably well in terms of both time and energy performance as the networks get larger.
arXiv Detail & Related papers (2021-09-22T16:52:51Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it contains and is not responsible for any consequences of its use.