From Convolutions towards Spikes: The Environmental Metric that the
Community currently Misses
- URL: http://arxiv.org/abs/2111.08361v1
- Date: Tue, 16 Nov 2021 11:04:42 GMT
- Title: From Convolutions towards Spikes: The Environmental Metric that the
Community currently Misses
- Authors: Aviral Chharia, Shivu Chauhan, Rahul Upadhyay, Vinay Kumar
- Abstract summary: We show that the ANNs in current use are not what we find in nature, and we explain why spiking neural networks, despite their lower performance, have attracted much interest.
We highlight the hardware gaps that keep researchers from using spike-based computation to develop energy-efficient neuromorphic microchips.
We also define a new evaluation metric 'NATURE' for reporting the carbon footprint of AI models.
- Score: 3.498371632913735
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Today, the AI community is obsessed with 'state-of-the-art' scores (80% of papers in NeurIPS) as the major performance metric, and as a result an important parameter, the environmental metric, remains unreported. Computational capability was the limiting factor a decade ago; in the foreseeable future, however, the challenge will be to develop environment-friendly and power-efficient algorithms. The human brain, which has been optimizing itself for almost a million years, consumes about the same amount of power as a typical laptop, so developing nature-inspired algorithms is one possible solution. In this study, we show that the ANNs in current use are not what we find in nature, and we explain why spiking neural networks, which mirror the mammalian visual cortex, have attracted much interest despite their lower performance. We further highlight the hardware gaps that keep researchers from using spike-based computation to develop energy-efficient neuromorphic microchips at scale. Using neuromorphic processors instead of traditional GPUs could be more environment-friendly and efficient, and such processors would make SNNs an ideal solution to the problem. This paper examines the current gaps and the lack of comparative research in depth, and proposes new research directions at the intersection of two fields, neuroscience and deep learning. Further, we define a new evaluation metric, 'NATURE', for reporting the carbon footprint of AI models.
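The abstract names NATURE only as a metric for reporting the carbon footprint of AI models and does not spell out how it is computed. The snippet below is a minimal sketch of what such a report could look like, assuming a simple combination of measured energy, grid carbon intensity, and (optionally) task accuracy; the function name, the default carbon-intensity figure, and the per-accuracy-point normalization are illustrative assumptions, not the paper's definition of NATURE.

```python
from typing import Optional

# Minimal sketch of a carbon-footprint report for an AI model.
# NOTE: the NATURE metric itself is not specified in the abstract above; the
# combination below (energy x grid carbon intensity, optionally normalized by
# accuracy) is an illustrative assumption, not the paper's definition.

GLOBAL_AVG_CARBON_INTENSITY = 0.475  # kg CO2-eq per kWh; rough global-average assumption


def estimate_carbon_footprint(avg_power_watts: float,
                              runtime_hours: float,
                              carbon_intensity: float = GLOBAL_AVG_CARBON_INTENSITY,
                              accuracy: Optional[float] = None) -> dict:
    """Return energy use and CO2-equivalent for a training or inference run."""
    energy_kwh = avg_power_watts * runtime_hours / 1000.0
    co2_kg = energy_kwh * carbon_intensity
    report = {"energy_kWh": round(energy_kwh, 3), "co2_kg": round(co2_kg, 3)}
    if accuracy is not None:
        # A footprint-per-accuracy-point view, so efficiency and score are read together.
        report["co2_kg_per_accuracy_point"] = round(co2_kg / accuracy, 5)
    return report


if __name__ == "__main__":
    # Example: a 300 W accelerator running for 48 hours on a model reaching 92.1% accuracy.
    print(estimate_carbon_footprint(300.0, 48.0, accuracy=92.1))
```

Run as a script, this prints the energy in kWh and the kilograms of CO2-equivalent attributable to the run, plus the footprint per accuracy point when a score is supplied.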
Related papers
- Topology Optimization of Random Memristors for Input-Aware Dynamic SNN [44.38472635536787]
We introduce pruning optimization for an input-aware dynamic memristive spiking neural network (PRIME).
For signal representation, PRIME employs leaky integrate-and-fire (LIF) neurons to emulate the brain's inherent spiking mechanism (a minimal LIF sketch appears after this list).
For reconfigurability, inspired by the brain's dynamic adjustment of computational depth, PRIME employs an input-aware dynamic early stop policy.
arXiv Detail & Related papers (2024-07-26T09:35:02Z)
- On the Opportunities of Green Computing: A Survey [80.21955522431168]
Artificial Intelligence (AI) has achieved significant advancements in technology and research over several decades of development.
The need for high computing power brings higher carbon emissions and undermines research fairness.
To tackle the challenges of computing resources and the environmental impact of AI, Green Computing has become a hot research topic.
arXiv Detail & Related papers (2023-11-01T11:16:41Z)
- Spike-based Neuromorphic Computing for Next-Generation Computer Vision [1.2367795537503197]
Neuromorphic computing promises orders-of-magnitude improvements in energy efficiency compared to the traditional von Neumann computing paradigm.
The goal is to develop an adaptive, fault-tolerant, low-footprint, fast, low-energy intelligent system by learning and emulating brain functionality.
arXiv Detail & Related papers (2023-10-15T01:05:35Z)
- Brain-Inspired Computational Intelligence via Predictive Coding [89.6335791546526]
Predictive coding (PC) has shown promising performance in machine intelligence tasks.
PC can model information processing in different brain areas and can be used in cognitive control and robotics.
arXiv Detail & Related papers (2023-08-15T16:37:16Z)
- To Spike or Not To Spike: A Digital Hardware Perspective on Deep Learning Acceleration [4.712922151067433]
As deep learning models scale, they become increasingly competitive across domains spanning computer vision to natural language processing.
The power efficiency of the biological brain outperforms that of any large-scale deep learning (DL) model.
Neuromorphic computing tries to mimic brain operations to improve the efficiency of DL models.
arXiv Detail & Related papers (2023-06-27T19:04:00Z)
- Neuromorphic Computing and Sensing in Space [69.34740063574921]
Neuromorphic computer chips are designed to mimic the architecture of a biological brain.
The emphasis on low power and energy efficiency of neuromorphic devices is a perfect match for space applications.
arXiv Detail & Related papers (2022-12-10T07:46:29Z)
- Benchmarking energy consumption and latency for neuromorphic computing in condensed matter and particle physics [0.309894133212992]
We present a methodology for measuring the energy cost and compute time for inference tasks with artificial neural networks (ANNs) on conventional hardware.
We estimate the same metrics on a state-of-the-art analog in-memory computing (AIMC) platform, one of the key paradigms in neuromorphic computing.
We find that AIMC can achieve up to one order of magnitude shorter times than conventional hardware, at an energy cost that is up to three orders of magnitude smaller.
arXiv Detail & Related papers (2022-09-21T16:33:44Z)
- MS-RANAS: Multi-Scale Resource-Aware Neural Architecture Search [94.80212602202518]
We propose Multi-Scale Resource-Aware Neural Architecture Search (MS-RANAS).
We employ a one-shot architecture search approach to reduce the search cost.
We achieve state-of-the-art results in terms of the accuracy-speed trade-off.
arXiv Detail & Related papers (2020-09-29T11:56:01Z)
- Neuromorphic Computing for Content-based Image Retrieval [0.0]
We explore the application of Loihi, a neuromorphic computing chip developed by Intel, for the computer vision task of image retrieval.
Our results show that the neuromorphic solution is about 2.5 times more energy-efficient compared with an ARM Cortex-A72 CPU and 12.5 times more energy-efficient compared with a lightweight convolutional neural network.
arXiv Detail & Related papers (2020-08-04T07:34:07Z)
- Optimizing Memory Placement using Evolutionary Graph Reinforcement Learning [56.83172249278467]
We introduce Evolutionary Graph Reinforcement Learning (EGRL), a method designed for large search spaces.
We train and validate our approach directly on the Intel NNP-I chip for inference.
We additionally achieve 28-78% speed-up compared to the native NNP-I compiler on all three workloads.
arXiv Detail & Related papers (2020-07-14T18:50:12Z)
- Spiking Neural Networks Hardware Implementations and Challenges: a Survey [53.429871539789445]
Spiking neural networks are cognitive algorithms that mimic the operational principles of neurons and synapses.
We present the state of the art of hardware implementations of spiking neural networks.
We discuss the strategies employed to leverage the characteristics of these event-driven algorithms at the hardware level.
arXiv Detail & Related papers (2020-05-04T13:24:00Z)
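The main abstract and the PRIME entry in this list both lean on leaky integrate-and-fire (LIF) neurons as the elementary unit of spike-based computation. Below is a minimal discrete-time LIF sketch, a standard textbook formulation offered only to illustrate what "spiking" means computationally; it is not PRIME's model or anything defined by the paper above, and all parameter values are assumptions.

```python
import numpy as np

# Minimal discrete-time leaky integrate-and-fire (LIF) neuron.
# Standard textbook formulation, not PRIME's model or the paper's; all
# parameter values here are illustrative assumptions.

def lif_simulate(input_current, tau=20.0, v_rest=0.0, v_reset=0.0,
                 v_threshold=1.0, dt=1.0):
    """Integrate an input-current trace; return membrane voltages and binary spikes."""
    v = v_rest
    voltages, spikes = [], []
    for i_t in input_current:
        # Leaky integration: the membrane decays toward rest while accumulating input.
        v = v + (dt / tau) * (-(v - v_rest) + i_t)
        if v >= v_threshold:
            spikes.append(1)   # threshold crossing emits an event (a spike)
            v = v_reset        # hard reset after the spike
        else:
            spikes.append(0)
        voltages.append(v)
    return np.array(voltages), np.array(spikes)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    current = rng.uniform(0.0, 3.0, size=200)  # a toy 200-step input trace
    _, spike_train = lif_simulate(current)
    # Sparse, event-driven output is what spike-based hardware exploits for efficiency.
    print(f"{spike_train.sum()} spikes over {spike_train.size} steps")
```

The neuron stays silent until enough input accumulates, then emits a single binary event and resets; this sparse, event-driven behavior is the property neuromorphic processors exploit to reduce energy relative to dense ANN activations.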