Firing Rate Models as Associative Memory: Excitatory-Inhibitory Balance for Robust Retrieval
- URL: http://arxiv.org/abs/2411.07388v1
- Date: Mon, 11 Nov 2024 21:40:57 GMT
- Title: Firing Rate Models as Associative Memory: Excitatory-Inhibitory Balance for Robust Retrieval
- Authors: Simone Betteti, Giacomo Baggio, Francesco Bullo, Sandro Zampieri
- Abstract summary: Firing rate models are dynamical systems widely used in applied and theoretical neuroscience to describe local cortical dynamics in neuronal populations.
We propose a general framework that ensures the emergence of re-scaled memory patterns as stable equilibria in the firing rate dynamics.
We analyze the conditions under which the memories are locally and globally stable, providing insights into constructing biologically-plausible and robust systems for associative memory retrieval.
- Score: 3.961279440272764
- Abstract: Firing rate models are dynamical systems widely used in applied and theoretical neuroscience to describe local cortical dynamics in neuronal populations. By providing a macroscopic perspective of neuronal activity, these models are essential for investigating oscillatory phenomena, chaotic behavior, and associative memory processes. Despite their widespread use, the application of firing rate models to associative memory networks has received limited mathematical exploration, and most existing studies are focused on specific models. Conversely, well-established associative memory designs, such as Hopfield networks, lack key biologically-relevant features intrinsic to firing rate models, including positivity and interpretable synaptic matrices that reflect excitatory and inhibitory interactions. To address this gap, we propose a general framework that ensures the emergence of re-scaled memory patterns as stable equilibria in the firing rate dynamics. Furthermore, we analyze the conditions under which the memories are locally and globally asymptotically stable, providing insights into constructing biologically-plausible and robust systems for associative memory retrieval.
Related papers
- Temporal Model On Quantum Logic [0.0]
The framework formalizes the evolution of propositions over time using linear and branching temporal models.
The hierarchical organization of memory is represented using directed acyclic graphs.
arXiv Detail & Related papers (2025-02-09T17:16:53Z) - Generative Modeling of Neural Dynamics via Latent Stochastic Differential Equations [1.5467259918426441]
We propose a framework for developing computational models of biological neural systems.
We employ a system of coupled differential equations with differentiable drift and diffusion functions.
We show that these hybrid models achieve competitive performance in predicting stimulus-evoked neural and behavioral responses.
arXiv Detail & Related papers (2024-12-01T09:36:03Z) - Artificial Kuramoto Oscillatory Neurons [65.16453738828672]
It has long been known in both neuroscience and AI that "binding" between neurons leads to a form of competitive learning.
We introduce Artificial Kuramoto Oscillatory Neurons, which can be combined with arbitrary connectivity designs such as fully connected, convolutional, or attentive mechanisms.
We show that this idea provides performance improvements across a wide spectrum of tasks such as unsupervised object discovery, adversarial robustness, uncertainty, and reasoning.
arXiv Detail & Related papers (2024-10-17T17:47:54Z) - Explosive neural networks via higher-order interactions in curved statistical manifolds [43.496401697112695]
We introduce curved neural networks as a class of prototypical models with a limited number of parameters.
We show that these curved neural networks implement a self-regulating process that can accelerate memory retrieval.
We analytically explore their memory-retrieval capacity using the replica trick near ferromagnetic and spin-glass phase boundaries.
arXiv Detail & Related papers (2024-08-05T09:10:29Z) - Interpretable Spatio-Temporal Embedding for Brain Structural-Effective Network with Ordinary Differential Equation [56.34634121544929]
In this study, we first construct the brain-effective network via the dynamic causal model.
We then introduce an interpretable graph learning framework termed Spatio-Temporal Embedding ODE (STE-ODE)
This framework incorporates specifically designed directed node embedding layers, aiming at capturing the dynamic interplay between structural and effective networks.
arXiv Detail & Related papers (2024-05-21T20:37:07Z) - Exploring neural oscillations during speech perception via surrogate gradient spiking neural networks [59.38765771221084]
We present a physiologically inspired speech recognition architecture compatible and scalable with deep learning frameworks.
We show end-to-end gradient descent training leads to the emergence of neural oscillations in the central spiking neural network.
Our findings highlight the crucial inhibitory role of feedback mechanisms, such as spike frequency adaptation and recurrent connections, in regulating and synchronising neural activity to improve recognition performance.
arXiv Detail & Related papers (2024-04-22T09:40:07Z) - Semantically-correlated memories in a dense associative model [2.7195102129095003]
I introduce a novel associative memory model named Correlated Dense Associative Memory (CDAM)
CDAM integrates both auto- and hetero-association in a unified framework for continuous-valued memory patterns.
It is theoretically and numerically analysed, revealing four distinct dynamical modes.
arXiv Detail & Related papers (2024-04-10T16:04:07Z) - The Expressive Leaky Memory Neuron: an Efficient and Expressive Phenomenological Neuron Model Can Solve Long-Horizon Tasks [64.08042492426992]
We introduce the Expressive Leaky Memory (ELM) neuron model, a biologically inspired model of a cortical neuron.
Our ELM neuron can accurately match the input-output relationship of a detailed cortical neuron model with under ten thousand trainable parameters.
We evaluate it on various tasks with demanding temporal structures, including the Long Range Arena (LRA) datasets.
arXiv Detail & Related papers (2023-06-14T13:34:13Z) - Cross-Frequency Coupling Increases Memory Capacity in Oscillatory Neural Networks [69.42260428921436]
Cross-frequency coupling (CFC) is associated with information integration across populations of neurons.
We construct a model of CFC which predicts a computational role for observed $\theta$-$\gamma$ oscillatory circuits in the hippocampus and cortex.
We show that the presence of CFC increases the memory capacity of a population of neurons connected by plastic synapses.
arXiv Detail & Related papers (2022-04-05T17:13:36Z) - Continuous Learning and Adaptation with Membrane Potential and Activation Threshold Homeostasis [91.3755431537592]
This paper presents the Membrane Potential and Activation Threshold Homeostasis (MPATH) neuron model.
The model allows neurons to maintain a form of dynamic equilibrium by automatically regulating their activity when presented with input.
Experiments demonstrate the model's ability to adapt to and continually learn from its input.
arXiv Detail & Related papers (2021-04-22T04:01:32Z) - Slow manifolds in recurrent networks encode working memory efficiently and robustly [0.0]
Working memory is a cognitive function involving the storage and manipulation of latent information over brief intervals of time.
We use a top-down modeling approach to examine network-level mechanisms of working memory.
arXiv Detail & Related papers (2021-01-08T18:47:02Z)
This list is automatically generated from the titles and abstracts of the papers in this site.