The Neuron as a Direct Data-Driven Controller
- URL: http://arxiv.org/abs/2401.01489v1
- Date: Wed, 3 Jan 2024 01:24:10 GMT
- Title: The Neuron as a Direct Data-Driven Controller
- Authors: Jason Moore, Alexander Genkin, Magnus Tournoy, Joshua Pughe-Sanford,
Rob R. de Ruyter van Steveninck, and Dmitri B. Chklovskii
- Abstract summary: This study extends the current normative models, which primarily optimize prediction, by conceptualizing neurons as optimal feedback controllers.
We model neurons as biologically feasible controllers which implicitly identify loop dynamics, infer latent states and optimize control.
Our model presents a significant departure from the traditional, feedforward, instant-response McCulloch-Pitts-Rosenblatt neuron, offering a novel and biologically-informed fundamental unit for constructing neural networks.
- Score: 43.8450722109081
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In the quest to model neuronal function amidst gaps in physiological data, a
promising strategy is to develop a normative theory that interprets neuronal
physiology as optimizing a computational objective. This study extends the
current normative models, which primarily optimize prediction, by
conceptualizing neurons as optimal feedback controllers. We posit that neurons,
especially those beyond early sensory areas, act as controllers, steering their
environment towards a specific desired state through their output. This
environment comprises both synaptically interlinked neurons and external
motor-sensory feedback loops, enabling neurons to evaluate the effectiveness of their
control via synaptic feedback. Utilizing the novel Direct Data-Driven Control
(DD-DC) framework, we model neurons as biologically feasible controllers which
implicitly identify loop dynamics, infer latent states and optimize control.
Our DD-DC neuron model explains various neurophysiological phenomena: the shift
from potentiation to depression in Spike-Timing-Dependent Plasticity (STDP)
with its asymmetry, the duration and adaptive nature of feedforward and
feedback neuronal filters, the imprecision in spike generation under constant
stimulation, and the characteristic operational variability and noise in the
brain. Our model presents a significant departure from the traditional,
feedforward, instant-response McCulloch-Pitts-Rosenblatt neuron, offering a
novel and biologically-informed fundamental unit for constructing neural
networks.
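To make the direct data-driven control idea concrete, the sketch below is an illustrative toy in the spirit of Willems-lemma-style DD-DC, not the paper's neuron model: a controller steers a simple unknown loop toward a setpoint using only recorded input/output data, with no explicit system identification. The scalar plant, window lengths, regularization weight, and all function names are assumptions chosen for the example.

```python
import numpy as np

def hankel(w, depth):
    """Stack a 1-D signal into a Hankel matrix whose columns are depth-sample windows."""
    return np.column_stack([w[i:i + depth] for i in range(len(w) - depth + 1)])

rng = np.random.default_rng(0)

# Toy scalar "loop" the controller steers: y[t+1] = a*y[t] + b*u[t].
# The controller never sees a or b -- only the recorded data below.
a, b, T = 0.9, 0.5, 200
u_data = rng.standard_normal(T)            # persistently exciting probe input
y_data = np.zeros(T)
for t in range(T - 1):
    y_data[t + 1] = a * y_data[t] + b * u_data[t]

n_past, depth = 4, 6                       # match 4 recent inputs + 5 recent outputs, target 1 future output
Hu, Hy = hankel(u_data, depth), hankel(y_data, depth)

def dd_dc_step(u_past, y_past, y_ref, lam=1e-3):
    """Pick the next input so the predicted next output matches y_ref.

    Finds a weighting g over recorded trajectory snippets (Hankel columns)
    that reproduces the measured recent past and pins the next output to
    y_ref, then reads the corresponding input directly off the data.
    """
    A = np.vstack([Hu[:n_past], Hy[:n_past + 1], Hy[n_past + 1:]])
    rhs = np.concatenate([u_past, y_past, [y_ref]])
    g = np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ rhs)
    return float(Hu[n_past] @ g)           # input to apply now

# Closed loop: drive the output toward a desired setpoint starting from rest.
y_ref = 1.0
u_hist, y_hist = [0.0] * n_past, [0.0] * (n_past + 1)
for t in range(30):
    u = dd_dc_step(np.array(u_hist[-n_past:]), np.array(y_hist[-(n_past + 1):]), y_ref)
    y_hist.append(a * y_hist[-1] + b * u)  # plant response to the chosen input
    u_hist.append(u)

print(f"output after 30 steps: {y_hist[-1]:.3f}  (target {y_ref})")
```

In the paper's framing, the controlled loop would be the neuron's synaptic and motor-sensory environment, the control input its output, and the recorded data its synaptic feedback; the sketch only illustrates that a control signal can be computed directly from past data without fitting an explicit model.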
Related papers
- Contrastive Learning in Memristor-based Neuromorphic Systems [55.11642177631929]
Spiking neural networks have become an important family of neuron-based models that sidestep many of the key limitations facing modern-day backpropagation-trained deep networks.
In this work, we design and investigate a proof-of-concept instantiation of contrastive-signal-dependent plasticity (CSDP), a neuromorphic form of forward-forward-based, backpropagation-free learning.
arXiv Detail & Related papers (2024-09-17T04:48:45Z)
- Contribute to balance, wire in accordance: Emergence of backpropagation from a simple, bio-plausible neuroplasticity rule [0.0]
We introduce a novel neuroplasticity rule that offers a potential mechanism for implementing BP in the brain.
We demonstrate mathematically that our learning rule precisely replicates BP in layered neural networks without any approximations.
arXiv Detail & Related papers (2024-05-23T03:28:52Z)
- Learning Control Policies of Hodgkin-Huxley Neuronal Dynamics [1.629803445577911]
We approximate the value function offline using a neural network to enable generating controls (stimuli) in real time via the feedback form.
Our numerical experiments illustrate the accuracy of our approach for out-of-distribution samples and the robustness to moderate shocks and disturbances in the system.
arXiv Detail & Related papers (2023-11-13T18:53:50Z)
- WaLiN-GUI: a graphical and auditory tool for neuron-based encoding [73.88751967207419]
Neuromorphic computing relies on spike-based, energy-efficient communication.
We develop a tool to identify suitable configurations for neuron-based encoding of sample-based data into spike trains.
The WaLiN-GUI is provided open source and with documentation.
arXiv Detail & Related papers (2023-10-25T20:34:08Z)
- Contrastive-Signal-Dependent Plasticity: Self-Supervised Learning in Spiking Neural Circuits [61.94533459151743]
This work addresses the challenge of designing neurobiologically-motivated schemes for adjusting the synapses of spiking networks.
Our experimental simulations demonstrate a consistent advantage over other biologically-plausible approaches when training recurrent spiking networks.
arXiv Detail & Related papers (2023-03-30T02:40:28Z)
- Constraints on the design of neuromorphic circuits set by the properties of neural population codes [61.15277741147157]
In the brain, information is encoded, transmitted and used to inform behaviour.
Neuromorphic circuits need to encode information in a way compatible with that used by populations of neurons in the brain.
arXiv Detail & Related papers (2022-12-08T15:16:04Z)
- STNDT: Modeling Neural Population Activity with a Spatiotemporal Transformer [19.329190789275565]
We introduce SpatioTemporal Neural Data Transformer (STNDT), an NDT-based architecture that explicitly models responses of individual neurons.
We show that our model achieves state-of-the-art performance at the ensemble level in estimating neural activity across four neural datasets.
arXiv Detail & Related papers (2022-06-09T18:54:23Z)
- Improving Spiking Neural Network Accuracy Using Time-based Neurons [0.24366811507669117]
Research on neuromorphic computing systems based on low-power spiking neural networks using analog neurons is in the spotlight.
As technology scales down, analog neurons are difficult to scale, and they suffer from reduced voltage headroom/dynamic range and circuit nonlinearities.
This paper first models the nonlinear behavior of existing current-mirror-based voltage-domain neurons designed in a 28nm process, and shows that SNN inference accuracy can be severely degraded by the neurons' nonlinearity.
We propose a novel neuron that processes incoming spikes in the time domain and greatly improves linearity, thereby improving inference accuracy compared to existing voltage-domain neurons.
arXiv Detail & Related papers (2022-01-05T00:24:45Z)
- Continuous Learning and Adaptation with Membrane Potential and Activation Threshold Homeostasis [91.3755431537592]
This paper presents the Membrane Potential and Activation Threshold Homeostasis (MPATH) neuron model.
The model allows neurons to maintain a form of dynamic equilibrium by automatically regulating their activity when presented with input.
Experiments demonstrate the model's ability to adapt to and continually learn from its input.
arXiv Detail & Related papers (2021-04-22T04:01:32Z)