Theory of coupled neuronal-synaptic dynamics
- URL: http://arxiv.org/abs/2302.08985v2
- Date: Wed, 10 Jan 2024 22:41:29 GMT
- Title: Theory of coupled neuronal-synaptic dynamics
- Authors: David G. Clark, L.F. Abbott
- Abstract summary: In neural circuits, synaptic strengths influence neuronal activity by shaping network dynamics.
We study a recurrent-network model in which neuronal units and synaptic couplings are interacting dynamic variables.
We show that adding Hebbian plasticity slows activity in chaotic networks and can induce chaos in otherwise quiescent networks.
- Score: 3.626013617212667
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In neural circuits, synaptic strengths influence neuronal activity by shaping
network dynamics, and neuronal activity influences synaptic strengths through
activity-dependent plasticity. Motivated by this fact, we study a
recurrent-network model in which neuronal units and synaptic couplings are
interacting dynamic variables, with couplings subject to Hebbian modification
with decay around quenched random strengths. Rather than assigning a specific
role to the plasticity, we use dynamical mean-field theory and other techniques
to systematically characterize the neuronal-synaptic dynamics, revealing a rich
phase diagram. Adding Hebbian plasticity slows activity in chaotic networks and
can induce chaos in otherwise quiescent networks. Anti-Hebbian plasticity
quickens activity and produces an oscillatory component. Analysis of the
Jacobian shows that Hebbian and anti-Hebbian plasticity push locally unstable
modes toward the real and imaginary axes, explaining these behaviors. Both
random-matrix and Lyapunov analysis show that strong Hebbian plasticity
segregates network timescales into two bands with a slow, synapse-dominated
band driving the dynamics, suggesting a flipped view of the network as synapses
connected by neurons. For increasing strength, Hebbian plasticity initially
raises the complexity of the dynamics, measured by the maximum Lyapunov
exponent and attractor dimension, but then decreases these metrics, likely due
to the proliferation of stable fixed points. We compute the marginally stable
spectra of such fixed points as well as their number, showing exponential
growth with network size. In chaotic states with strong Hebbian plasticity, a
stable fixed point of neuronal dynamics is destabilized by synaptic dynamics,
allowing any neuronal state to be stored as a stable fixed point by halting the
plasticity. This phase of freezable chaos offers a new mechanism for working
memory.
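To make the model class concrete, here is a minimal simulation sketch. It assumes the standard rate-network form of such models, dx_i/dt = -x_i + sum_j J_ij phi(x_j) with phi = tanh and J = A + B, where A holds the quenched random strengths and B is the plastic part subject to Hebbian modification with decay. The parameter values, the Euler scheme, and the exact update form are illustrative assumptions, not the paper's precise equations.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200        # network size
g = 2.0        # gain of quenched couplings (g > 1 gives chaos without plasticity)
k = 1.0        # plasticity strength: k > 0 Hebbian, k < 0 anti-Hebbian
tau = 10.0     # synaptic timescale, slower than the neuronal one
dt, steps = 0.05, 4000

A = rng.normal(0.0, g / np.sqrt(N), (N, N))  # quenched random strengths
B = np.zeros((N, N))                         # plastic couplings, decaying toward 0
x = rng.normal(size=N)                       # neuronal state

for _ in range(steps):
    r = np.tanh(x)                                      # firing rates phi(x)
    x += dt * (-x + (A + B) @ r)                        # neuronal dynamics
    B += (dt / tau) * (-B + (k / N) * np.outer(r, r))   # Hebbian rule with decay

# "Freezable chaos": halt the plasticity and let the neuronal dynamics relax.
# Per the abstract, with sufficiently strong Hebbian plasticity the visited
# state is stored as a stable fixed point once B stops evolving.
x_stored = x.copy()
for _ in range(steps):
    x += dt * (-x + (A + B) @ np.tanh(x))               # B held fixed

print("drift after freezing:", np.linalg.norm(x - x_stored) / np.sqrt(N))
```

With k = 0 this reduces to the classic Sompolinsky-Crisanti-Sommers rate network; sweeping g and k traces out the kind of phase diagram the abstract describes.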
Related papers
- Artificial Kuramoto Oscillatory Neurons [65.16453738828672]
We introduce Artificial Kuramoto Oscillatory Neurons (AKOrN) as a dynamical alternative to threshold units.
We show that this idea provides performance improvements across a wide spectrum of tasks.
We believe these empirical results show the importance of our assumptions at the most basic level of neural representation.
arXiv Detail & Related papers (2024-10-17T17:47:54Z)
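For reference, the classical Kuramoto phase dynamics that AKOrN generalizes can be sketched in a few lines. This is the textbook model with illustrative parameters, not the paper's neuron.

```python
import numpy as np

# Textbook Kuramoto dynamics: dtheta_i/dt = omega_i + (K/N) sum_j sin(theta_j - theta_i)
rng = np.random.default_rng(1)
N, K, dt, steps = 100, 2.5, 0.01, 5000
omega = rng.normal(0.0, 1.0, N)          # intrinsic frequencies
theta = rng.uniform(0.0, 2 * np.pi, N)   # initial phases

for _ in range(steps):
    # row i of the sine matrix holds theta_j - theta_i over all j
    coupling = np.sin(theta[None, :] - theta[:, None]).mean(axis=1)
    theta += dt * (omega + K * coupling)

# Kuramoto order parameter: |r| near 1 means the population has synchronized.
r = abs(np.exp(1j * theta).mean())
print(f"synchrony |r| = {r:.2f}")
```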
- Confidence Regulation Neurons in Language Models [91.90337752432075]
This study investigates the mechanisms by which large language models represent and regulate uncertainty in next-token predictions.
Entropy neurons are characterized by an unusually high weight norm and influence the final layer normalization (LayerNorm) scale to effectively scale down the logits.
Token frequency neurons, which we describe here for the first time, boost or suppress each token's logit proportionally to its log frequency, thereby shifting the output distribution towards or away from the unigram distribution.
arXiv Detail & Related papers (2024-06-24T01:31:03Z)
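The token-frequency-neuron mechanism described above amounts to adding a multiple of each token's log frequency to its logit. A toy sketch of that arithmetic follows; the vocabulary, counts, and the scalar alpha are invented for illustration.

```python
import numpy as np

def softmax(z):
    z = z - z.max()          # subtract the max for numerical stability
    e = np.exp(z)
    return e / e.sum()

counts = np.array([5000.0, 800.0, 150.0, 40.0, 10.0])  # toy unigram counts
log_freq = np.log(counts / counts.sum())               # log unigram frequencies

logits = np.array([1.0, 2.0, 0.5, 1.5, 0.0])           # model logits before adjustment

# alpha stands in for the activation of a token frequency neuron:
# positive values shift the output toward the unigram distribution,
# negative values shift it away.
for alpha in (-0.5, 0.0, 0.5):
    print(f"alpha={alpha:+.1f} ->", np.round(softmax(logits + alpha * log_freq), 3))
```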
- Single Neuromorphic Memristor closely Emulates Multiple Synaptic Mechanisms for Energy Efficient Neural Networks [71.79257685917058]
We demonstrate memristive nano-devices based on SrTiO3 that inherently emulate multiple synaptic functions.
These memristors operate in a non-filamentary, low conductance regime, which enables stable and energy efficient operation.
arXiv Detail & Related papers (2024-02-26T15:01:54Z)
- Learning with Chemical versus Electrical Synapses -- Does it Make a Difference? [61.85704286298537]
Bio-inspired neural networks have the potential to advance our understanding of neural computation and improve the state-of-the-art of AI systems.
We conduct experiments on autonomous lane-keeping in a photorealistic driving simulator to evaluate the performance of networks with each synapse type under diverse conditions.
arXiv Detail & Related papers (2023-11-21T13:07:20Z)
- Equivalence of Additive and Multiplicative Coupling in Spiking Neural Networks [0.0]
Spiking neural network models characterize the emergent collective dynamics of circuits of biological neurons.
We show that spiking neural network models with additive coupling are equivalent to models with multiplicative coupling.
arXiv Detail & Related papers (2023-03-31T20:19:11Z)
- A Step Towards Uncovering The Structure of Multistable Neural Networks [1.14219428942199]
We study the structure of multistable recurrent neural networks.
The activation function is simplified to a nonsmooth Heaviside step function.
We derive how multistability is encoded within the network architecture.
arXiv Detail & Related papers (2022-10-06T22:54:17Z)
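Because the activation is a Heaviside step, fixed points of dynamics of the assumed form dx/dt = -x + W H(x) can be enumerated exactly for small networks: any fixed point must satisfy x* = W s for a binary vector s with H(W s) = s. A minimal sketch under that assumed model form (the paper's exact architecture may differ):

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(2)
N = 8
W = rng.normal(0.0, 1.0, (N, N))    # recurrent weights (illustrative)

def H(x):
    return (x > 0).astype(float)    # Heaviside step activation

# For dx/dt = -x + W H(x), a fixed point satisfies x* = W H(x*).
# Substituting s = H(x*) in {0,1}^N gives x* = W s with the
# self-consistency condition H(W s) = s, checkable by brute force.
# Away from the switching surface the Jacobian is -I, so each such
# fixed point is locally stable.
fixed_codes = [s for s in product((0.0, 1.0), repeat=N)
               if np.array_equal(H(W @ np.array(s)), np.array(s))]
print(f"{len(fixed_codes)} fixed points among {2**N} binary patterns")
```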
- Probing dynamics of a two-dimensional dipolar spin ensemble using single qubit sensor [62.997667081978825]
We experimentally investigate individual spin dynamics in a two-dimensional ensemble of electron spins on the surface of a diamond crystal.
We show that this anomalously slow relaxation rate is due to the presence of strong dynamical disorder.
Our work paves the way towards microscopic study and control of quantum thermalization in strongly interacting disordered spin ensembles.
arXiv Detail & Related papers (2022-07-21T18:00:17Z)
- Latent Equilibrium: A unified learning theory for arbitrarily fast computation with arbitrarily slow neurons [0.7340017786387767]
We introduce Latent Equilibrium, a new framework for inference and learning in networks of slow components.
We derive disentangled neuron and synapse dynamics from a prospective energy function.
We show how our principle can be applied to detailed models of cortical microcircuitry.
arXiv Detail & Related papers (2021-10-27T16:15:55Z)
- SpikePropamine: Differentiable Plasticity in Spiking Neural Networks [0.0]
We introduce a framework for learning the dynamics of synaptic plasticity and neuromodulated synaptic plasticity in Spiking Neural Networks (SNNs).
We show that SNNs augmented with differentiable plasticity are sufficient for solving a set of challenging temporal learning tasks.
These networks are also shown to be capable of producing locomotion on a high-dimensional robotic learning task.
arXiv Detail & Related papers (2021-06-04T19:29:07Z)
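SpikePropamine's core idea, making the plasticity rule itself a differentiable, trainable object, follows a line of work on differentiable Hebbian plasticity. The sketch below shows only the forward dynamics of that style of model in a rate-based (non-spiking) simplification: a fixed weight plus a learned coefficient gating a running Hebbian trace. The variable names and the rate-based form are assumptions; in training, w, alpha, and eta would be optimized by backpropagation.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 50
w = rng.normal(0.0, 1.0 / np.sqrt(N), (N, N))   # fixed weight component (trainable)
alpha = rng.normal(0.0, 0.1, (N, N))            # per-synapse plasticity gains (trainable)
eta = 0.05                                      # trace update rate (trainable)
hebb = np.zeros((N, N))                         # fast Hebbian trace, reset each episode
r = np.tanh(rng.normal(size=N))                 # initial activity

for _ in range(100):
    r_new = np.tanh((w + alpha * hebb) @ r)               # effective weights = fixed + plastic
    hebb = (1.0 - eta) * hebb + eta * np.outer(r_new, r)  # running Hebbian trace
    r = r_new

print("mean |plastic weight|:", np.abs(alpha * hebb).mean())
```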
- Continuous Learning and Adaptation with Membrane Potential and Activation Threshold Homeostasis [91.3755431537592]
This paper presents the Membrane Potential and Activation Threshold Homeostasis (MPATH) neuron model.
The model allows neurons to maintain a form of dynamic equilibrium by automatically regulating their activity when presented with input.
Experiments demonstrate the model's ability to adapt to and continually learn from its input.
arXiv Detail & Related papers (2021-04-22T04:01:32Z)
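A minimal sketch of activation-threshold homeostasis in the spirit of MPATH: a leaky neuron whose threshold drifts so that its firing rate tracks a target. The equations and constants here are illustrative assumptions, not the MPATH model itself.

```python
import numpy as np

rng = np.random.default_rng(4)
dt, steps = 1.0, 20000
tau_v, tau_h = 10.0, 500.0   # membrane and homeostatic timescales
target = 0.05                # desired spikes per step
v, theta, rate = 0.0, 1.0, 0.0
spikes = 0

for _ in range(steps):
    I = rng.normal(0.5, 0.5)                 # noisy input current
    v += (dt / tau_v) * (-v + I)             # leaky membrane integration
    s = 1.0 if v > theta else 0.0            # threshold crossing emits a spike
    if s:
        v = 0.0                              # reset after a spike
        spikes += 1
    rate += (dt / tau_h) * (s - rate)        # running estimate of firing rate
    theta += (dt / tau_h) * (rate - target)  # raise threshold if too active, lower if too quiet

print(f"rate = {spikes / steps:.3f}, threshold = {theta:.3f}")
```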
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.