Large Associative Memory Problem in Neurobiology and Machine Learning
- URL: http://arxiv.org/abs/2008.06996v3
- Date: Tue, 27 Apr 2021 22:20:05 GMT
- Title: Large Associative Memory Problem in Neurobiology and Machine Learning
- Authors: Dmitry Krotov, John Hopfield
- Abstract summary: We present a valid model of large associative memory with a degree of biological plausibility.
The dynamics of our network and its reduced dimensional equivalent both minimize energy (Lyapunov) functions.
- Score: 6.41804410246642
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Dense Associative Memories or modern Hopfield networks permit storage and
reliable retrieval of an exponentially large (in the dimension of feature
space) number of memories. At the same time, their naive implementation is
non-biological, since it seemingly requires the existence of many-body synaptic
junctions between the neurons. We show that these models are effective
descriptions of a more microscopic (written in terms of biological degrees of
freedom) theory that has additional (hidden) neurons and only requires two-body
interactions between them. For this reason our proposed microscopic theory is a
valid model of large associative memory with a degree of biological
plausibility. The dynamics of our network and its reduced dimensional
equivalent both minimize energy (Lyapunov) functions. When certain dynamical
variables (hidden neurons) are integrated out from our microscopic theory, one
can recover many of the models that were previously discussed in the
literature, e.g. the model presented in "Hopfield Networks is All You Need"
paper. We also provide an alternative derivation of the energy function and the
update rule proposed in the aforementioned paper and clarify the relationships
between various models of this class.
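To make this concrete, here is a minimal numerical sketch (not the authors' code) of the reduced update rule from the "Hopfield Networks is All You Need" paper, together with one Euler step of the two-body visible/hidden dynamics described in the abstract. The memory count, dimension, inverse temperature beta, and time constants are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
N, D, beta = 16, 64, 8.0                 # illustrative sizes and temperature
X = rng.standard_normal((N, D))          # stored memories, one per row

def retrieve(query, n_steps=5):
    """Reduced update rule xi <- X^T softmax(beta * X xi). Each step
    decreases the energy E(xi) = -1/beta * logsumexp(beta * X xi)
    + 0.5 * ||xi||^2, the Lyapunov function mentioned in the abstract."""
    xi = query.copy()
    for _ in range(n_steps):
        a = beta * X @ xi
        p = np.exp(a - a.max()); p /= p.sum()   # softmax over the N memories
        xi = X.T @ p
    return xi

def microscopic_step(v, h, dt=0.1, tau_v=1.0, tau_h=0.2):
    """One Euler step of the two-body dynamics: hidden neurons h (one per
    memory) and visible neurons v interact only through the weights X.
    With a softmax hidden activation, integrating out h recovers the
    reduced rule above; dt and the time constants are illustrative."""
    a = beta * h
    f = np.exp(a - a.max()); f /= f.sum()       # softmax hidden activation
    v_new = v + dt / tau_v * (X.T @ f - v)
    h_new = h + dt / tau_h * (X @ v - h)
    return v_new, h_new

cue = X[3] + 0.3 * rng.standard_normal(D)       # noisy version of pattern 3
print(np.argmax(X @ retrieve(cue)))             # recovers index 3
```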
Related papers
- Dense Associative Memory Through the Lens of Random Features [48.17520168244209]
Dense Associative Memories are high-storage-capacity variants of Hopfield networks.
We show that a network built from random features closely approximates the energy function and dynamics of conventional Dense Associative Memories.
arXiv Detail & Related papers (2024-10-31T17:10:57Z)
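As a rough sketch of the idea in the entry above (a generic kernel approximation in the style of positive random features, not the paper's exact construction): the exponential similarities in the Dense Associative Memory update can be approximated so that the stored patterns enter retrieval only through fixed-size summaries. All names and parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
N, D, M, beta = 16, 32, 4096, 2.0
X = rng.standard_normal((N, D)) / np.sqrt(D)   # stored memories (rows)
W = rng.standard_normal((M, D))                # shared random projection

def phi(Z):
    # Positive random features with E[phi(x) . phi(y)] = exp(beta * x . y).
    Zb = np.sqrt(beta) * np.atleast_2d(Z)
    F = np.exp(Zb @ W.T - 0.5 * np.sum(Zb**2, axis=1, keepdims=True)) / np.sqrt(M)
    return F.squeeze()

# Memories enter retrieval only through two precomputed summaries, so the
# per-query cost no longer scales with the number of stored patterns.
Phi = phi(X)                  # (N, M) feature map of all memories
A = X.T @ Phi                 # (D, M) pattern-weighted feature sum
s = Phi.sum(axis=0)           # (M,)  feature normalizer

def retrieve_exact(q):
    a = beta * X @ q
    p = np.exp(a - a.max()); p /= p.sum()
    return X.T @ p

def retrieve_rf(q):
    f = phi(q)
    return (A @ f) / (s @ f)

q = X[5] + 0.1 * rng.standard_normal(D)
print(np.linalg.norm(retrieve_exact(q) - retrieve_rf(q)))   # small approx. error
```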
- Don't Cut Corners: Exact Conditions for Modularity in Biologically Inspired Representations [52.48094670415497]
We develop a theory of when biologically inspired representations modularise with respect to source variables (sources).
We derive necessary and sufficient conditions on a sample of sources that determine whether the neurons in an optimal biologically-inspired linear autoencoder modularise.
Our theory applies to any dataset, extending far beyond the case of statistical independence studied in previous work.
arXiv Detail & Related papers (2024-10-08T17:41:37Z)
- Exploring Biological Neuronal Correlations with Quantum Generative Models [0.0]
We introduce a quantum generative model framework for generating synthetic data that captures the spatial and temporal correlations of biological neuronal activity.
Our model achieves reliable outcomes with fewer trainable parameters than classical methods.
arXiv Detail & Related papers (2024-09-13T18:00:06Z)
- MindBridge: A Cross-Subject Brain Decoding Framework [60.58552697067837]
Brain decoding aims to reconstruct stimuli from acquired brain signals.
Currently, brain decoding is confined to a per-subject-per-model paradigm.
We present MindBridge, which achieves cross-subject brain decoding by employing only one model.
arXiv Detail & Related papers (2024-04-11T15:46:42Z)
- Single Neuromorphic Memristor closely Emulates Multiple Synaptic Mechanisms for Energy Efficient Neural Networks [71.79257685917058]
We demonstrate memristive nano-devices based on SrTiO3 that inherently emulate multiple synaptic mechanisms.
These memristors operate in a non-filamentary, low conductance regime, which enables stable and energy efficient operation.
arXiv Detail & Related papers (2024-02-26T15:01:54Z)
- Bridging Associative Memory and Probabilistic Modeling [29.605203018237457]
Associative memory and probabilistic modeling are two fundamental topics in artificial intelligence.
We build a bridge between the two that enables a useful flow of ideas in both directions.
arXiv Detail & Related papers (2024-02-15T18:56:46Z)
- In search of dispersed memories: Generative diffusion models are associative memory networks [6.4322891559626125]
Generative diffusion models are a class of generative machine learning techniques that have shown strong performance on many tasks.
We show that generative diffusion models can be interpreted as energy-based models and that, when trained on discrete patterns, their energy function is identical to that of modern Hopfield networks.
This equivalence allows us to interpret the supervised training of diffusion models as a synaptic learning process that encodes the associative dynamics of a modern Hopfield network in the weight structure of a deep neural network.
arXiv Detail & Related papers (2023-09-29T14:48:24Z)
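The stated equivalence can be illustrated numerically: gradient descent on the modern Hopfield energy plays the role of the reverse (denoising) diffusion dynamics. In a real diffusion model the score would be learned by a network rather than computed from stored patterns; beta, the step size, and the toy binary patterns below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
N, D, beta = 8, 32, 4.0
X = np.sign(rng.standard_normal((N, D)))    # discrete (binary) stored patterns

def energy(xi):
    # Modern Hopfield energy: -1/beta * logsumexp(beta * X xi) + 0.5 * ||xi||^2
    a = beta * X @ xi
    return -(a.max() + np.log(np.exp(a - a.max()).sum())) / beta + 0.5 * xi @ xi

def score(xi):
    # -grad E(xi) = X^T softmax(beta * X xi) - xi: a denoising direction that
    # points from a noisy state back toward the stored patterns, the role the
    # learned score plays in a diffusion model.
    a = beta * X @ xi
    p = np.exp(a - a.max()); p /= p.sum()
    return X.T @ p - xi

xi = X[0] + 0.5 * rng.standard_normal(D)    # noisy version of pattern 0
for _ in range(50):
    xi = xi + 0.1 * score(xi)               # energy descent ~ reverse diffusion
print(np.corrcoef(X[0], xi)[0, 1])          # close to 1: pattern 0 recovered
```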
- The Expressive Leaky Memory Neuron: an Efficient and Expressive Phenomenological Neuron Model Can Solve Long-Horizon Tasks [64.08042492426992]
We introduce the Expressive Leaky Memory (ELM) neuron model, a biologically inspired model of a cortical neuron.
Our ELM neuron can accurately match a cortical neuron's input-output relationship with under ten thousand trainable parameters.
We evaluate it on various tasks with demanding temporal structures, including the Long Range Arena (LRA) datasets.
arXiv Detail & Related papers (2023-06-14T13:34:13Z)
- Cones: Concept Neurons in Diffusion Models for Customized Generation [41.212255848052514]
This paper finds a small cluster of neurons in a diffusion model corresponding to a particular subject.
The concept neurons demonstrate magnetic properties in interpreting and manipulating generation results.
For large-scale applications, the concept neurons are environmentally friendly, as we only need to store a sparse cluster of integer indices instead of dense float32 values.
arXiv Detail & Related papers (2023-03-09T09:16:04Z)
- Cross-Frequency Coupling Increases Memory Capacity in Oscillatory Neural Networks [69.42260428921436]
Cross-frequency coupling (CFC) is associated with information integration across populations of neurons.
We construct a model of CFC which predicts a computational role for observed $\theta$-$\gamma$ oscillatory circuits in the hippocampus and cortex.
We show that the presence of CFC increases the memory capacity of a population of neurons connected by plastic synapses.
arXiv Detail & Related papers (2022-04-05T17:13:36Z)
- Hierarchical Associative Memory [2.66512000865131]
Associative Memories or Modern Hopfield Networks have many appealing properties.
They can do pattern completion, store a large number of memories, and can be described using a recurrent neural network.
This paper tackles a gap in the literature and describes a fully recurrent model of associative memory with an arbitrarily large number of layers.
arXiv Detail & Related papers (2021-07-14T01:38:40Z)
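A toy sketch of the layered construction described above, assuming the usual energy-based setup in which each layer relaxes toward input from its neighboring layers through shared symmetric weights. The layer sizes, activation, and integration scheme are illustrative, not the paper's exact equations.

```python
import numpy as np

rng = np.random.default_rng(3)
sizes = [64, 32, 16]                        # illustrative layer widths
Ws = [rng.standard_normal((sizes[l + 1], sizes[l])) / np.sqrt(sizes[l])
      for l in range(len(sizes) - 1)]       # weights shared by both directions

def g(x):
    return np.tanh(x)                       # illustrative activation

def relax(x0, n_steps=200, dt=0.1):
    """Euler-integrate the layered dynamics: each layer decays toward the
    drive it receives from the activations of adjacent layers. Because the
    same weights carry the bottom-up and top-down signals, these dynamics
    descend an energy function, as in single-layer associative memories."""
    xs = [x0] + [np.zeros(s) for s in sizes[1:]]
    for _ in range(n_steps):
        new = []
        for l, x in enumerate(xs):
            drive = np.zeros_like(x)
            if l > 0:
                drive += Ws[l - 1] @ g(xs[l - 1])   # bottom-up input
            if l < len(xs) - 1:
                drive += Ws[l].T @ g(xs[l + 1])     # top-down input
            new.append(x + dt * (drive - x))
        xs = new
    return xs

xs = relax(rng.standard_normal(sizes[0]))
print([float(np.linalg.norm(x)) for x in xs])       # relaxed layer states
```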
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.