Synaptic Scaling and Optimal Bias Adjustments for Power Reduction in
Neuromorphic Systems
- URL: http://arxiv.org/abs/2306.07416v1
- Date: Mon, 12 Jun 2023 20:47:59 GMT
- Title: Synaptic Scaling and Optimal Bias Adjustments for Power Reduction in
Neuromorphic Systems
- Authors: Cory Merkel
- Abstract summary: Recent animal studies have shown that biological brains can enter a low power mode in times of food scarcity.
This paper explores the possibility of applying similar mechanisms to a broad class of neuromorphic systems.
We show through mathematical models and simulations that careful scaling of synaptic weights can significantly reduce power consumption.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recent animal studies have shown that biological brains can enter a low power
mode in times of food scarcity. This paper explores the possibility of applying
similar mechanisms to a broad class of neuromorphic systems where power
consumption is strongly dependent on the magnitude of synaptic weights. In
particular, we show through mathematical models and simulations that careful
scaling of synaptic weights can significantly reduce power consumption (by over
80% in some of the cases tested) while having a relatively small impact on
accuracy. These results uncover an exciting opportunity to design neuromorphic
systems for edge AI applications, where power consumption can be dynamically
adjusted based on energy availability and performance requirements.
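For intuition, here is a toy sketch of the trade-off the abstract describes. It assumes power is proportional to total absolute synaptic weight (as the abstract suggests) and uses a made-up layer; the scale factor alpha and the mean-matching bias correction are illustrative stand-ins, not the paper's derived optimum.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for a pre-trained layer (not the paper's models).
W = rng.normal(0.0, 1.0, size=(64, 32))   # synaptic weights
b = rng.normal(0.0, 0.1, size=32)         # biases
x = rng.normal(size=(100, 64))            # a batch of inputs

def power_proxy(weights):
    # The abstract ties power to synaptic weight magnitude, so total
    # absolute weight serves as a crude proxy for power draw.
    return np.abs(weights).sum()

baseline = np.tanh(x @ W + b)

alpha = 0.2                                 # illustrative scale, not the derived optimum
z = x @ W
W_low = alpha * W                           # synaptic scaling
b_low = b + (1.0 - alpha) * z.mean(axis=0)  # mean-matching stand-in for the
                                            # paper's optimal bias adjustment

savings = 1.0 - power_proxy(W_low) / power_proxy(W)   # 80% here, by construction
drift = np.mean((np.tanh(x @ W_low + b_low) - baseline) ** 2)
print(f"power savings: {savings:.0%}, output drift (MSE): {drift:.4f}")
```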
Related papers
- Langevin Flows for Modeling Neural Latent Dynamics [81.81271685018284]
We introduce LangevinFlow, a sequential Variational Auto-Encoder where the time evolution of latent variables is governed by the underdamped Langevin equation.
Our approach incorporates physical priors -- such as inertia, damping, a learned potential function, and forces -- to represent both autonomous and non-autonomous processes in neural systems.
Our method outperforms state-of-the-art baselines on synthetic neural populations generated by a Lorenz attractor.
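For reference, a minimal Euler-Maruyama discretization of the underdamped Langevin equation named above; the quadratic potential, damping, and noise scale are placeholders here, whereas LangevinFlow learns its potential function and forces.

```python
import numpy as np

rng = np.random.default_rng(1)

def langevin_step(x, v, dt=0.01, gamma=1.0, sigma=0.5, force=0.0):
    """One Euler-Maruyama step of the underdamped Langevin equation.
    LangevinFlow learns the potential and forces; here U(x) = 0.5*x^2
    is a fixed placeholder, so grad U(x) = x."""
    grad_U = x
    noise = rng.normal(size=np.shape(x))
    v = v + (-gamma * v - grad_U + force) * dt + sigma * np.sqrt(dt) * noise
    x = x + v * dt   # inertia: position is driven by velocity
    return x, v

x, v = np.ones(8), np.zeros(8)   # an 8-dimensional latent state
for _ in range(1000):
    x, v = langevin_step(x, v)
print(x)
```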
arXiv Detail & Related papers (2025-07-15T17:57:48Z)
- Homeostatic Adaptation of Optimal Population Codes under Metabolic Stress [2.388418486046813]
We show that neurons in mouse visual cortex go into a "low power mode" in which they maintain firing rate homeostasis while expending less energy.
This adaptation leads to increased neuronal noise and tuning curve flattening in response to metabolic stress.
We analytically derive the optimal coding strategy for neurons under varying energy budgets and coding goals.
arXiv Detail & Related papers (2025-07-10T15:58:57Z)
- NOBLE -- Neural Operator with Biologically-informed Latent Embeddings to Capture Experimental Variability in Biological Neuron Models [68.89389652724378]
NOBLE is a neural operator framework that learns a mapping from a continuous frequency-modulated embedding of interpretable neuron features to the somatic voltage response induced by current injection.
It predicts distributions of neural dynamics accounting for the intrinsic experimental variability.
NOBLE is the first scaled-up deep learning framework validated on real experimental data.
arXiv Detail & Related papers (2025-06-05T01:01:18Z)
- Understanding Artificial Neural Network's Behavior from Neuron Activation Perspective [8.251799609350725]
This paper explores the intricate behavior of deep neural networks (DNNs) through the lens of neuron activation dynamics.
We propose a probabilistic framework that can analyze models' neuron activation patterns as a process.
arXiv Detail & Related papers (2024-12-24T01:01:06Z)
- Demonstrating the Advantages of Analog Wafer-Scale Neuromorphic Hardware [1.6218106536237746]
We show the capabilities and advantages of the BrainScaleS-1 system and how it can be used in combination with conventional software simulations.
We report the emulation time and energy consumption for two biologically inspired networks adapted to the neuromorphic hardware substrate.
arXiv Detail & Related papers (2024-12-03T17:46:43Z)
- Bio-Inspired Adaptive Neurons for Dynamic Weighting in Artificial Neural Networks [6.931200003384122]
Traditional neural networks employ fixed weights during inference, limiting their ability to adapt to changing input conditions.
We propose a novel framework for adaptive neural networks, where neuron weights are modeled as functions of the input signal.
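One hedged way such input-dependent weights could look in code; the multiplicative sigmoid gate below is an assumption for illustration, not the paper's construction.

```python
import numpy as np

rng = np.random.default_rng(2)

W_base = rng.normal(size=(16, 4))          # conventional static weights
W_gate = rng.normal(size=(16, 4)) * 0.1    # hypothetical gating parameters

def adaptive_forward(x):
    # Per-output gains computed from the current input make the
    # effective weighting a function of the signal being processed.
    gain = 1.0 / (1.0 + np.exp(-(x @ W_gate)))   # sigmoid gate, shape (4,)
    return (x @ W_base) * gain

x = rng.normal(size=16)
print(adaptive_forward(x))
```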
arXiv Detail & Related papers (2024-12-02T12:45:30Z)
- Contrastive Learning in Memristor-based Neuromorphic Systems [55.11642177631929]
Spiking neural networks have become an important family of neuron-based models that sidestep many of the key limitations facing modern-day backpropagation-trained deep networks.
In this work, we design and investigate a proof-of-concept instantiation of contrastive-signal-dependent plasticity (CSDP), a neuromorphic form of forward-forward-based, backpropagation-free learning.
arXiv Detail & Related papers (2024-09-17T04:48:45Z)
- The Potential of Combined Learning Strategies to Enhance Energy Efficiency of Spiking Neuromorphic Systems [0.0]
This manuscript focuses on enhancing brain-inspired perceptual computing machines through a novel combined learning approach for Convolutional Spiking Neural Networks (CSNNs).
CSNNs present a promising alternative to traditional power-intensive and complex machine learning methods like backpropagation, offering energy-efficient spiking neuron processing inspired by the human brain.
arXiv Detail & Related papers (2024-08-13T18:40:50Z)
- Expressivity of Neural Networks with Random Weights and Learned Biases [44.02417750529102]
Recent work has pushed the bounds of universal approximation by showing that arbitrary functions can similarly be learned by tuning smaller subsets of parameters.
We provide theoretical and numerical evidence demonstrating that feedforward neural networks with fixed random weights can be trained to perform multiple tasks by learning biases only.
Our results are relevant to neuroscience, where they demonstrate the potential for behaviourally relevant changes in dynamics without modifying synaptic weights.
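A small sketch of the setup described here: frozen random weights with gradient steps on the biases alone. The toy task, layer widths, and learning rate are arbitrary choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

# Fixed random weights (never updated); only the biases are trained.
W1 = rng.normal(0, 2**-0.5, size=(2, 32))
W2 = rng.normal(0, 32**-0.5, size=(32, 1))
b1, b2 = np.zeros(32), np.zeros(1)

X = rng.uniform(-1, 1, size=(256, 2))
y = X[:, :1] * X[:, 1:2]          # toy regression target

lr = 0.1
for _ in range(2000):
    h = np.tanh(X @ W1 + b1)
    pred = h @ W2 + b2
    err = pred - y
    # Gradient steps on the biases alone; W1 and W2 stay frozen.
    b2 -= lr * err.mean(axis=0)
    b1 -= lr * ((err @ W2.T) * (1 - h**2)).mean(axis=0)

print("final MSE:", float((err**2).mean()))
```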
arXiv Detail & Related papers (2024-07-01T04:25:49Z)
- Single Neuromorphic Memristor closely Emulates Multiple Synaptic Mechanisms for Energy Efficient Neural Networks [71.79257685917058]
We demonstrate memristive nano-devices based on SrTiO3 that inherently emulate all these synaptic functions.
These memristors operate in a non-filamentary, low conductance regime, which enables stable and energy efficient operation.
arXiv Detail & Related papers (2024-02-26T15:01:54Z)
- The Expressive Leaky Memory Neuron: an Efficient and Expressive Phenomenological Neuron Model Can Solve Long-Horizon Tasks [64.08042492426992]
We introduce the Expressive Leaky Memory (ELM) neuron model, a biologically inspired model of a cortical neuron.
Our ELM neuron can accurately match a cortical neuron's input-output relationship with under ten thousand trainable parameters.
We evaluate it on various tasks with demanding temporal structures, including the Long Range Arena (LRA) datasets.
arXiv Detail & Related papers (2023-06-14T13:34:13Z)
- An Adiabatic Capacitive Artificial Neuron with RRAM-based Threshold Detection for Energy-Efficient Neuromorphic Computing [62.997667081978825]
We present an artificial neuron featuring adiabatic synapse capacitors to produce membrane potentials for the somas of neurons.
Our initial 4-bit adiabatic capacitive neuron proof-of-concept example shows 90% synaptic energy saving.
arXiv Detail & Related papers (2022-02-02T17:12:22Z)
- Increasing Liquid State Machine Performance with Edge-of-Chaos Dynamics Organized by Astrocyte-modulated Plasticity [0.0]
The liquid state machine (LSM) tunes internal weights without backpropagation of gradients.
Recent findings suggest that astrocytes, a long-neglected non-neuronal brain cell, modulate synaptic plasticity and brain dynamics.
We propose the neuron-astrocyte liquid state machine (NALSM) that addresses under-performance through self-organized near-critical dynamics.
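For orientation, a rate-based echo-state-style sketch of the LSM principle (fixed random recurrent weights, trained linear readout); the spiking dynamics and the astrocyte-modulated plasticity that NALSM adds are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(4)

N = 200
W_res = rng.normal(size=(N, N)) * (rng.random((N, N)) < 0.1)  # sparse, fixed recurrence
W_res *= 0.9 / np.max(np.abs(np.linalg.eigvals(W_res)))       # spectral radius just below 1
W_in = rng.normal(size=N)

def run_reservoir(u):
    # Drive the fixed reservoir; internal weights are never trained.
    x, states = np.zeros(N), []
    for u_t in u:
        x = np.tanh(W_res @ x + W_in * u_t)
        states.append(x.copy())
    return np.array(states)

u = np.sin(np.linspace(0, 20 * np.pi, 1000))
S = run_reservoir(u)
target = np.roll(u, -5)                                       # toy task: 5-step-ahead prediction
W_out = np.linalg.lstsq(S[:-5], target[:-5], rcond=None)[0]   # only the readout is fit
print("readout MSE:", float(np.mean((S[:-5] @ W_out - target[:-5]) ** 2)))
```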
arXiv Detail & Related papers (2021-10-26T23:04:40Z)
- Evolving spiking neuron cellular automata and networks to emulate in vitro neuronal activity [0.0]
We produce spiking neural systems that emulate the patterns of behavior of biological neurons in vitro.
Our models were able to produce a level of network-wide synchrony.
The genomes of the top-performing models indicate the excitability and density of connections in the model play an important role in determining the complexity of the produced activity.
arXiv Detail & Related papers (2021-10-15T17:55:04Z)
- Mapping and Validating a Point Neuron Model on Intel's Neuromorphic Hardware Loihi [77.34726150561087]
We investigate the potential of Intel's fifth-generation neuromorphic chip, Loihi.
Loihi is based on the novel idea of Spiking Neural Networks (SNNs) emulating the neurons in the brain.
We find that Loihi replicates classical simulations very efficiently and scales notably well in terms of both time and energy performance as the networks get larger.
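As background, a minimal discrete-time leaky integrate-and-fire neuron, the textbook building block of SNNs; the parameters are arbitrary, and this is not Loihi's actual neuron model or programming interface.

```python
import numpy as np

def lif_simulate(input_current, v_th=1.0, leak=0.9, v_reset=0.0):
    """Discrete-time leaky integrate-and-fire neuron: the membrane
    potential leaks each step, integrates input, and emits a spike
    (then resets) once it crosses the threshold."""
    v, spikes = 0.0, []
    for i_t in input_current:
        v = leak * v + i_t
        if v >= v_th:
            spikes.append(1)
            v = v_reset
        else:
            spikes.append(0)
    return spikes

rng = np.random.default_rng(5)
spikes = lif_simulate(rng.uniform(0.0, 0.3, size=100))
print("firing rate:", sum(spikes) / len(spikes))
```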
arXiv Detail & Related papers (2021-09-22T16:52:51Z)
- Continuous Learning and Adaptation with Membrane Potential and Activation Threshold Homeostasis [91.3755431537592]
This paper presents the Membrane Potential and Activation Threshold Homeostasis (MPATH) neuron model.
The model allows neurons to maintain a form of dynamic equilibrium by automatically regulating their activity when presented with input.
Experiments demonstrate the model's ability to adapt to and continually learn from its input.
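In the spirit of (but not reproducing) the MPATH model's equations, a toy neuron whose activation threshold rises with each spike and relaxes back toward rest, so that firing activity is regulated as input statistics change.

```python
import numpy as np

rng = np.random.default_rng(6)

def homeostatic_neuron(inputs, theta_rest=1.0, decay=0.99, bump=0.1):
    # Each spike raises the activation threshold; between spikes the
    # threshold relaxes back toward rest, regulating the firing rate.
    v, theta, spikes = 0.0, theta_rest, []
    for i_t in inputs:
        v = 0.9 * v + i_t
        if v >= theta:
            spikes.append(1)
            v = 0.0
            theta += bump
        else:
            spikes.append(0)
        theta = theta_rest + decay * (theta - theta_rest)
    return spikes

weak = homeostatic_neuron(rng.uniform(0.0, 0.2, size=2000))
strong = homeostatic_neuron(rng.uniform(0.0, 0.6, size=2000))
# Compare rates under weak vs. strong drive; the rising threshold
# counteracts the stronger input.
print(sum(weak) / 2000, sum(strong) / 2000)
```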
arXiv Detail & Related papers (2021-04-22T04:01:32Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.