Biological learning in key-value memory networks
- URL: http://arxiv.org/abs/2110.13976v1
- Date: Tue, 26 Oct 2021 19:26:53 GMT
- Title: Biological learning in key-value memory networks
- Authors: Danil Tyulmankov, Ching Fang, Annapurna Vadaparty, Guangyu Robert Yang
- Abstract summary: Memory-augmented neural networks in machine learning commonly use a key-value mechanism to store and read out memories in a single step.
We propose an implementation of basic key-value memory that stores inputs using a combination of biologically plausible three-factor plasticity rules.
Our results suggest a compelling alternative to the classical Hopfield network as a model of biological long-term memory.
- Score: 0.45880283710344055
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: In neuroscience, classical Hopfield networks are the standard biologically
plausible model of long-term memory, relying on Hebbian plasticity for storage
and attractor dynamics for recall. In contrast, memory-augmented neural
networks in machine learning commonly use a key-value mechanism to store and
read out memories in a single step. Such augmented networks achieve impressive
feats of memory compared to traditional variants, yet their biological
relevance is unclear. We propose an implementation of basic key-value memory
that stores inputs using a combination of biologically plausible three-factor
plasticity rules. The same rules are recovered when network parameters are
meta-learned. Our network performs on par with classical Hopfield networks on
autoassociative memory tasks and can be naturally extended to continual recall,
heteroassociative memory, and sequence learning. Our results suggest a
compelling alternative to the classical Hopfield network as a model of
biological long-term memory.
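As a rough illustration of the mechanism the abstract describes, and not the authors' actual model, the NumPy sketch below stores each input with outer-product ("Hebbian-like") updates gated by a third plasticity factor, and reads memories back in a single step by matching a query against the stored keys. The slot assignment, the softmax readout temperature, and all variable names are assumptions made for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

D_IN, N_SLOTS = 64, 32          # input dimension, number of memory slots

# Synaptic weights: keys map inputs to slots, values map slots back to outputs.
W_key = np.zeros((N_SLOTS, D_IN))
W_val = np.zeros((D_IN, N_SLOTS))

def write(x, slot, gate=1.0):
    """Three-factor-style write (illustrative): presynaptic input x,
    postsynaptic slot activity (one-hot here), and a global gate that
    enables or suppresses plasticity."""
    post = np.zeros(N_SLOTS)
    post[slot] = 1.0
    W_key[:] += gate * np.outer(post, x)   # key synapses store the input
    W_val[:] += gate * np.outer(x, post)   # value synapses store the target (here, x itself)

def read(query, beta=8.0):
    """Single-step readout: match the query against stored keys,
    take a softmax over slots, and return the attention-weighted values."""
    scores = beta * (W_key @ query)
    attn = np.exp(scores - scores.max())
    attn /= attn.sum()
    return W_val @ attn

# Autoassociative demo: store a few random patterns, then recall from a noisy cue.
patterns = rng.standard_normal((5, D_IN))
patterns /= np.linalg.norm(patterns, axis=1, keepdims=True)
for i, p in enumerate(patterns):
    write(p, slot=i, gate=1.0)

cue = patterns[2] + 0.3 * rng.standard_normal(D_IN)
recalled = read(cue / np.linalg.norm(cue))
cos = recalled @ patterns[2] / (np.linalg.norm(recalled) + 1e-12)
print("cosine similarity to stored pattern:", float(cos))
```

As the abstract notes, the paper's contribution is showing that writes of this general form can be realized with local, biologically plausible three-factor plasticity rules (and are recovered by meta-learning), while recall is a single feedforward pass rather than the attractor dynamics of a classical Hopfield network.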
Related papers
- Dense Associative Memory Through the Lens of Random Features [48.17520168244209]
Dense Associative Memories are high-storage-capacity variants of Hopfield networks.
We show that the proposed random-feature network closely approximates the energy function and dynamics of conventional Dense Associative Memories.
arXiv Detail & Related papers (2024-10-31T17:10:57Z)
- Sequential Learning in the Dense Associative Memory [1.2289361708127877]
We investigate the performance of the Dense Associative Memory in sequential learning problems.
We show that existing sequential learning methods can be applied to the Dense Associative Memory to improve sequential learning performance.
arXiv Detail & Related papers (2024-09-24T04:23:00Z)
- In search of dispersed memories: Generative diffusion models are associative memory networks [6.4322891559626125]
Generative diffusion models are a class of generative machine learning techniques that have shown great performance in many tasks.
We show that generative diffusion models can be interpreted as energy-based models and that, when trained on discrete patterns, their energy function is identical to that of modern Hopfield networks.
This equivalence allows us to interpret the supervised training of diffusion models as a synaptic learning process that encodes the associative dynamics of a modern Hopfield network in the weight structure of a deep neural network.
arXiv Detail & Related papers (2023-09-29T14:48:24Z)
- Memory-enriched computation and learning in spiking neural networks through Hebbian plasticity [9.453554184019108]
Hebbian plasticity is believed to play a pivotal role in biological memory.
We introduce a novel spiking neural network architecture that is enriched by Hebbian synaptic plasticity.
We show that Hebbian enrichment renders spiking neural networks surprisingly versatile in terms of their computational as well as learning capabilities.
arXiv Detail & Related papers (2022-05-23T12:48:37Z)
- Pin the Memory: Learning to Generalize Semantic Segmentation [68.367763672095]
We present a novel memory-guided domain generalization method for semantic segmentation based on a meta-learning framework.
Our method abstracts the conceptual knowledge of semantic classes into a categorical memory that is constant across domains.
arXiv Detail & Related papers (2022-04-07T17:34:01Z)
- Hierarchical Associative Memory [2.66512000865131]
Associative Memories or Modern Hopfield Networks have many appealing properties.
They can do pattern completion, store a large number of memories, and can be described using a recurrent neural network.
This paper tackles a gap and describes a fully recurrent model of associative memory with an arbitrarily large number of layers.
arXiv Detail & Related papers (2021-07-14T01:38:40Z)
- Slow manifolds in recurrent networks encode working memory efficiently and robustly [0.0]
Working memory is a cognitive function involving the storage and manipulation of latent information over brief intervals of time.
We use a top-down modeling approach to examine network-level mechanisms of working memory.
arXiv Detail & Related papers (2021-01-08T18:47:02Z)
- Learning to Learn Variational Semantic Memory [132.39737669936125]
We introduce variational semantic memory into meta-learning to acquire long-term knowledge for few-shot learning.
The semantic memory is grown from scratch and gradually consolidated by absorbing information from tasks it experiences.
We formulate memory recall as the variational inference of a latent memory variable from addressed contents.
arXiv Detail & Related papers (2020-10-20T15:05:26Z)
- Reservoir Memory Machines as Neural Computers [70.5993855765376]
Differentiable neural computers extend artificial neural networks with an explicit memory without interference.
We achieve some of the computational capabilities of differentiable neural computers with a model that can be trained very efficiently.
arXiv Detail & Related papers (2020-09-14T12:01:30Z)
- Self-Attentive Associative Memory [69.40038844695917]
We propose to separate the storage of individual experiences (item memory) from the relationships between them (relational memory).
We achieve competitive results with our proposed two-memory model in a diversity of machine learning tasks.
arXiv Detail & Related papers (2020-02-10T03:27:48Z)
- Encoding-based Memory Modules for Recurrent Neural Networks [79.42778415729475]
We study the memorization subtask from the point of view of the design and training of recurrent neural networks.
We propose a new model, the Linear Memory Network, which features an encoding-based memorization component built with a linear autoencoder for sequences.
arXiv Detail & Related papers (2020-01-31T11:14:27Z)
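For reference when reading the entries above, the following are the standard textbook forms of the classical Hopfield storage rule and of the dense/modern Hopfield energies that several of these papers build on. They are a summary of well-known results, not formulas taken from the listed abstracts.

```latex
% Classical Hopfield network: Hebbian storage of P patterns \xi^\mu \in \{-1,+1\}^N,
% recall by attractor dynamics that descend the energy E.
W_{ij} = \frac{1}{N} \sum_{\mu=1}^{P} \xi_i^{\mu} \xi_j^{\mu}, \qquad W_{ii} = 0,
\qquad E(s) = -\tfrac{1}{2} \sum_{i,j} W_{ij}\, s_i s_j

% Dense Associative Memory (Krotov & Hopfield): a rapidly growing separation
% function F, e.g. F(x) = x^n, raises the storage capacity.
E(s) = -\sum_{\mu=1}^{P} F\!\left( \sum_i \xi_i^{\mu} s_i \right)

% Modern (continuous) Hopfield network (Ramsauer et al.): log-sum-exp energy whose
% one-step update is the attention read-out \xi^{\mathrm{new}} = X\,\mathrm{softmax}(\beta X^{\top}\xi).
E(\xi) = -\beta^{-1} \log \sum_{\mu=1}^{P} \exp\!\left(\beta\, x_{\mu}^{\top} \xi\right)
         + \tfrac{1}{2}\, \xi^{\top} \xi + \mathrm{const}
```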
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.