Neuromorphic Computing for Content-based Image Retrieval
- URL: http://arxiv.org/abs/2008.01380v2
- Date: Tue, 17 Aug 2021 22:59:17 GMT
- Title: Neuromorphic Computing for Content-based Image Retrieval
- Authors: Te-Yuan Liu, Ata Mahjoubfar, Daniel Prusinski, Luis Stevens
- Abstract summary: We explore the application of Loihi, a neuromorphic computing chip developed by Intel, for the computer vision task of image retrieval.
Our results show that the neuromorphic solution is about 2.5 times more energy-efficient than an ARM Cortex-A72 CPU and 12.5 times more energy-efficient than an NVIDIA T4 GPU running inference with a lightweight convolutional neural network.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neuromorphic computing mimics the neural activity of the brain through
emulating spiking neural networks. In numerous machine learning tasks,
neuromorphic chips are expected to provide superior solutions in terms of cost
and power efficiency. Here, we explore the application of Loihi, a neuromorphic
computing chip developed by Intel, for the computer vision task of image
retrieval. We evaluated the functionalities and the performance metrics that
are critical in content-based visual search and recommender systems using
deep-learning embeddings. Our results show that the neuromorphic solution is
about 2.5 times more energy-efficient compared with an ARM Cortex-A72 CPU and
12.5 times more energy-efficient compared with an NVIDIA T4 GPU for inference with a
lightweight convolutional neural network without batching, while maintaining the
same level of matching accuracy. The study validates the potential of
neuromorphic computing in low-power image retrieval, as a complementary
paradigm to the existing von Neumann architectures.
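
At matching time, the retrieval task described in the abstract reduces to nearest-neighbour search over deep-learning embeddings. The snippet below is a minimal sketch of that matching step in plain NumPy; the embedding network, its conversion to a spiking network, and all Loihi-specific details are out of scope here, and the function name, array shapes, and toy data are illustrative assumptions rather than the authors' code.

```python
import numpy as np

def cosine_retrieve(query_emb, gallery_embs, top_k=5):
    """Return indices of the top_k gallery embeddings most similar to the query."""
    # Normalize so that dot products equal cosine similarities.
    q = query_emb / np.linalg.norm(query_emb)
    g = gallery_embs / np.linalg.norm(gallery_embs, axis=1, keepdims=True)
    scores = g @ q                       # similarity of every gallery item to the query
    return np.argsort(-scores)[:top_k]   # best matches first

# Toy usage: random 128-dimensional vectors standing in for CNN embeddings.
rng = np.random.default_rng(0)
gallery = rng.normal(size=(1000, 128))
query = rng.normal(size=128)
print(cosine_retrieve(query, gallery, top_k=3))
```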
Related papers
- Resistive Memory-based Neural Differential Equation Solver for Score-based Diffusion Model [55.116403765330084]
  Current AIGC methods, such as score-based diffusion, are still deficient in terms of speed and efficiency.
  We propose a time-continuous, analog in-memory neural differential equation solver for score-based diffusion.
  We experimentally validate our solution with 180 nm resistive memory in-memory computing macros.
  arXiv Detail & Related papers (2024-04-08T16:34:35Z)
- Hebbian Learning based Orthogonal Projection for Continual Learning of Spiking Neural Networks [74.3099028063756]
  We develop a new method with neuronal operations based on lateral connections and Hebbian learning.
  We show that Hebbian and anti-Hebbian learning on recurrent lateral connections can effectively extract the principal subspace of neural activities.
  Our method enables continual learning of spiking neural networks with nearly zero forgetting.
  arXiv Detail & Related papers (2024-02-19T09:29:37Z)
- Sparse Multitask Learning for Efficient Neural Representation of Motor Imagery and Execution [30.186917337606477]
  We introduce a sparse multitask learning framework for motor imagery (MI) and motor execution (ME) tasks.
  Given a dual-task CNN model for MI-ME classification, we apply a saliency-based sparsification approach to prune superfluous connections.
  Our results indicate that this tailored sparsity can mitigate overfitting and improve test performance with a small amount of data.
  arXiv Detail & Related papers (2023-12-10T09:06:16Z)
- Spike-based Neuromorphic Computing for Next-Generation Computer Vision [1.2367795537503197]
  Neuromorphic computing promises orders-of-magnitude improvements in energy efficiency compared with the traditional von Neumann computing paradigm.
  The goal is to develop an adaptive, fault-tolerant, low-footprint, fast, low-energy intelligent system by learning and emulating brain functionality.
  arXiv Detail & Related papers (2023-10-15T01:05:35Z)
- Computational and Storage Efficient Quadratic Neurons for Deep Neural Networks [10.379191500493503]
  This work introduces an efficient quadratic neuron architecture distinguished by its enhanced utilization of second-order computational information.
  Experimental results demonstrate that the proposed quadratic neuron structure exhibits superior computational and storage efficiency across various tasks.
  arXiv Detail & Related papers (2023-06-10T11:25:31Z)
- Benchmarking the human brain against computational architectures [0.0]
  We report a new methodological framework for benchmarking cognitive performance.
  We determine computational efficiencies in experiments with human participants.
  We show that a neuromorphic architecture with limited field-of-view size and added noise provides a good approximation to our results.
  arXiv Detail & Related papers (2023-05-15T08:00:26Z)
- POPPINS: A Population-Based Digital Spiking Neuromorphic Processor with Integer Quadratic Integrate-and-Fire Neurons [50.591267188664666]
  We propose a population-based digital spiking neuromorphic processor in 180 nm process technology with two hierarchical populations.
  The proposed approach enables the development of biomimetic neuromorphic systems and various low-power, low-latency inference processing applications.
  arXiv Detail & Related papers (2022-01-19T09:26:34Z)
- Mapping and Validating a Point Neuron Model on Intel's Neuromorphic Hardware Loihi [77.34726150561087]
  We investigate the potential of Intel's fifth-generation neuromorphic chip, Loihi.
  Loihi is based on the novel idea of Spiking Neural Networks (SNNs) emulating the neurons in the brain (a simple integrate-and-fire sketch follows this list).
  We find that Loihi replicates classical simulations very efficiently and scales notably well in terms of both time and energy performance as the networks get larger.
  arXiv Detail & Related papers (2021-09-22T16:52:51Z)
- Spiking Neural Networks Hardware Implementations and Challenges: a Survey [53.429871539789445]
  Spiking Neural Networks are cognitive algorithms mimicking neuron and synapse operational principles.
  We present the state of the art of hardware implementations of spiking neural networks.
  We discuss the strategies employed to leverage the characteristics of these event-driven algorithms at the hardware level.
  arXiv Detail & Related papers (2020-05-04T13:24:00Z)
- Non-linear Neurons with Human-like Apical Dendrite Activations [81.18416067005538]
  We show that a standard neuron followed by our novel apical dendrite activation (ADA) can learn the XOR logical function with 100% accuracy.
  We conduct experiments on six benchmark data sets from computer vision, signal processing and natural language processing.
  arXiv Detail & Related papers (2020-02-02T21:09:39Z)
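
Several of the entries above (the POPPINS processor and the Loihi mapping study) are built around integrate-and-fire neuron models. For orientation only, here is a minimal discrete-time leaky integrate-and-fire simulation; the threshold, leak factor, and input values are illustrative assumptions, not parameters taken from any of the listed papers.

```python
def simulate_lif(input_current, v_th=1.0, leak=0.9, v_reset=0.0):
    """Discrete-time leaky integrate-and-fire neuron.

    The membrane potential decays by `leak` each step, integrates the
    input current, and emits a spike (1) when it crosses `v_th`,
    after which it is reset to `v_reset`.
    """
    v = 0.0
    spikes = []
    for current in input_current:
        v = leak * v + current    # leak, then integrate the input
        if v >= v_th:             # threshold crossed: fire and reset
            spikes.append(1)
            v = v_reset
        else:
            spikes.append(0)
    return spikes

# Constant drive: the neuron charges up and fires periodically.
print(simulate_lif([0.3] * 20))
```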
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.