Latent Structured Hopfield Network for Semantic Association and Retrieval
- URL: http://arxiv.org/abs/2506.01303v2
- Date: Sun, 15 Jun 2025 16:12:36 GMT
- Title: Latent Structured Hopfield Network for Semantic Association and Retrieval
- Authors: Chong Li, Xiangyang Xue, Jianfeng Feng, Taiping Zeng
- Abstract summary: Episodic memory enables humans to recall past experiences by associating semantic elements such as objects, locations, and time into coherent event representations. We propose the Latent Structured Hopfield Network (LSHN), a framework that integrates continuous Hopfield attractor dynamics into an autoencoder architecture. Unlike traditional Hopfield networks, our model is trained end-to-end with gradient descent, achieving scalable and robust memory retrieval.
- Score: 52.634915010996835
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Episodic memory enables humans to recall past experiences by associating semantic elements such as objects, locations, and time into coherent event representations. While large pretrained models have shown remarkable progress in modeling semantic memory, the mechanisms for forming associative structures that support episodic memory remain underexplored. Inspired by hippocampal CA3 dynamics and its role in associative memory, we propose the Latent Structured Hopfield Network (LSHN), a biologically inspired framework that integrates continuous Hopfield attractor dynamics into an autoencoder architecture. LSHN mimics the cortical-hippocampal pathway: a semantic encoder extracts compact latent representations, a latent Hopfield network performs associative refinement through attractor convergence, and a decoder reconstructs perceptual input. Unlike traditional Hopfield networks, our model is trained end-to-end with gradient descent, achieving scalable and robust memory retrieval. Experiments on MNIST, CIFAR-10, and a simulated episodic memory task demonstrate superior performance in recalling corrupted inputs under occlusion and noise, outperforming existing associative memory models. Our work provides a computational perspective on how semantic elements can be dynamically bound into episodic memory traces through biologically grounded attractor mechanisms. Code: https://github.com/fudan-birlab/LSHN.
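To make the encoder-attractor-decoder pipeline described in the abstract concrete, below is a minimal, hypothetical PyTorch sketch of an LSHN-style model: an encoder compresses the input into a latent code, a latent module relaxes that code toward an attractor by descending a simple quadratic Hopfield energy, and a decoder reconstructs the input, with the whole chain differentiable so it trains end-to-end with gradient descent. The class names (LSHNSketch, LatentHopfield), the layer widths, the specific energy and tanh-bounded update, and the masking-based corruption are illustrative assumptions, not the authors' implementation; see the linked repository for the actual code.

```python
# Hypothetical sketch of an LSHN-style pipeline; layer sizes, energy, and update rule are
# assumptions for illustration, not the implementation at https://github.com/fudan-birlab/LSHN.
import torch
import torch.nn as nn


class LatentHopfield(nn.Module):
    """Continuous Hopfield-style attractor refinement in latent space.

    The latent code is relaxed for a few steps toward a minimum of a simple quadratic
    energy E(z) = -1/2 z^T W z with symmetric W -- one common way to realize
    differentiable attractor dynamics (an assumption here, not the paper's exact energy).
    """

    def __init__(self, dim: int, steps: int = 5, step_size: float = 0.1):
        super().__init__()
        self.W = nn.Parameter(torch.randn(dim, dim) * 0.01)
        self.steps = steps
        self.step_size = step_size

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        W_sym = 0.5 * (self.W + self.W.T)  # symmetric weights -> well-defined energy
        for _ in range(self.steps):
            grad_E = -z @ W_sym                          # dE/dz for E(z) = -1/2 z^T W z
            z = torch.tanh(z - self.step_size * grad_E)  # bounded gradient-descent step
        return z


class LSHNSketch(nn.Module):
    """Encoder -> latent Hopfield attractor -> decoder, trained end-to-end."""

    def __init__(self, in_dim: int = 784, latent_dim: int = 64):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, 256), nn.ReLU(), nn.Linear(256, latent_dim))
        self.hopfield = LatentHopfield(latent_dim)
        self.decoder = nn.Sequential(nn.Linear(latent_dim, 256), nn.ReLU(), nn.Linear(256, in_dim), nn.Sigmoid())

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        z = self.encoder(x)     # compact semantic code ("cortical" encoding)
        z = self.hopfield(z)    # associative refinement via attractor convergence ("CA3")
        return self.decoder(z)  # reconstruct the perceptual input


if __name__ == "__main__":
    model = LSHNSketch()
    x_clean = torch.rand(8, 784)
    x_corrupt = x_clean * (torch.rand_like(x_clean) > 0.3).float()  # simulate occlusion/noise
    recon = model(x_corrupt)
    loss = nn.functional.mse_loss(recon, x_clean)  # train recall of the clean pattern
    loss.backward()                                # end-to-end gradient descent
    print(recon.shape, float(loss))
```

Because the attractor step is just a few differentiable updates, the reconstruction loss on corrupted inputs backpropagates through the Hopfield dynamics into both the attractor weights and the autoencoder, which is the property that distinguishes this kind of setup from classical, analytically constructed Hopfield memories.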
Related papers
- Retrospective Memory for Camouflaged Object Detection [18.604039107883317]
We propose a recall-augmented COD architecture, namely RetroMem, which dynamically modulates camouflage pattern perception and inference. In the recall stage, we propose a dynamic memory mechanism and an inference pattern reconstruction. Our RetroMem significantly outperforms existing state-of-the-art methods.
arXiv Detail & Related papers (2025-06-18T08:22:19Z)
- Input-Driven Dynamics for Robust Memory Retrieval in Hopfield Networks [3.961279440272764]
The Hopfield model provides a mathematically idealized yet insightful framework for understanding the mechanisms of memory storage and retrieval in the human brain.
We propose a novel system framework in which the external input directly influences the neural synapses and shapes the energy landscape of the Hopfield model.
This plasticity-based mechanism provides a clear energetic interpretation of the memory retrieval process and proves effective at correctly classifying highly mixed inputs.
arXiv Detail & Related papers (2024-11-06T17:24:25Z)
- Modern Hopfield Networks meet Encoded Neural Representations -- Addressing Practical Considerations [5.272882258282611]
This paper introduces Hopfield HEN, a framework that integrates encoded representations into MHNs to improve pattern separability and reduce meta-stable states.
We show that HEN can also be used for retrieval in the context of hetero association of images with natural language queries, thus removing the limitation of requiring access to partial content in the same domain.
arXiv Detail & Related papers (2024-09-24T19:17:15Z)
- B'MOJO: Hybrid State Space Realizations of Foundation Models with Eidetic and Fading Memory [91.81390121042192]
We develop a class of models called B'MOJO to seamlessly combine eidetic and fading memory within a composable module.
B'MOJO's ability to modulate eidetic and fading memory results in better inference on longer sequences tested up to 32K tokens.
arXiv Detail & Related papers (2024-07-08T18:41:01Z)
- In search of dispersed memories: Generative diffusion models are associative memory networks [6.4322891559626125]
Generative diffusion models are a class of generative machine learning techniques that have shown great performance in many tasks.
We show that generative diffusion models can be interpreted as energy-based models and that, when trained on discrete patterns, their energy function is identical to that of modern Hopfield networks.
This equivalence allows us to interpret the supervised training of diffusion models as a synaptic learning process that encodes the associative dynamics of a modern Hopfield network in the weight structure of a deep neural network (the standard classical and modern Hopfield energies are recalled in the sketch after this list).
arXiv Detail & Related papers (2023-09-29T14:48:24Z)
- On the Relationship Between Variational Inference and Auto-Associative Memory [68.8204255655161]
We study how different neural network approaches to variational inference can be applied in this framework.
We evaluate the obtained algorithms on the CIFAR10 and CLEVR image datasets and compare them with other associative memory models.
arXiv Detail & Related papers (2022-10-14T14:18:47Z)
- Brain-like combination of feedforward and recurrent network components achieves prototype extraction and robust pattern recognition [0.0]
Associative memory has been a prominent candidate for the computation performed by the massively recurrent neocortical networks.
We combine a recurrent attractor network with a feedforward network that learns distributed representations using an unsupervised Hebbian-Bayesian learning rule.
We demonstrate that the recurrent attractor component implements associative memory when trained on the feedforward-driven internal (hidden) representations.
arXiv Detail & Related papers (2022-06-30T06:03:11Z)
- PredRNN: A Recurrent Neural Network for Spatiotemporal Predictive Learning [109.84770951839289]
We present PredRNN, a new recurrent network for learning visual dynamics from historical context.
We show that our approach obtains highly competitive results on three standard datasets.
arXiv Detail & Related papers (2021-03-17T08:28:30Z)
- Kanerva++: extending The Kanerva Machine with differentiable, locally block allocated latent memory [75.65949969000596]
Episodic and semantic memory are critical components of the human memory model.
We develop a new principled Bayesian memory allocation scheme that bridges the gap between episodic and semantic memory.
We demonstrate that this allocation scheme improves performance in memory conditional image generation.
arXiv Detail & Related papers (2021-02-20T18:40:40Z)
- Incremental Training of a Recurrent Neural Network Exploiting a Multi-Scale Dynamic Memory [79.42778415729475]
We propose a novel incrementally trained recurrent architecture targeting explicitly multi-scale learning.
We show how to extend the architecture of a simple RNN by separating its hidden state into different modules.
We discuss a training algorithm where new modules are iteratively added to the model to learn progressively longer dependencies.
arXiv Detail & Related papers (2020-06-29T08:35:49Z)
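Several of the entries above (the input-driven Hopfield dynamics, the encoded-representation MHN, and the diffusion-model equivalence) build on classical or modern Hopfield energies. As background, here is a compact reminder of the textbook formulations (classical: Hopfield, 1982; modern: Ramsauer et al., 2020, with constant terms omitted); these are stated from the standard literature, not taken from the listed abstracts.

```latex
% Classical binary Hopfield energy for states s in {-1,+1}^d with symmetric weights W:
E_{\mathrm{classical}}(\mathbf{s}) = -\tfrac{1}{2}\,\mathbf{s}^{\top} W \mathbf{s}

% Modern (continuous) Hopfield energy for stored patterns X = [\mathbf{x}_1,\dots,\mathbf{x}_N]
% and query state \boldsymbol{\xi}, up to additive constants:
E_{\mathrm{modern}}(\boldsymbol{\xi}) = -\beta^{-1} \log \sum_{i=1}^{N} \exp\!\left(\beta\,\mathbf{x}_i^{\top}\boldsymbol{\xi}\right) + \tfrac{1}{2}\,\boldsymbol{\xi}^{\top}\boldsymbol{\xi}

% One retrieval step: a softmax-weighted combination of the stored patterns.
\boldsymbol{\xi}^{\mathrm{new}} = X\,\mathrm{softmax}\!\left(\beta\, X^{\top} \boldsymbol{\xi}\right)
```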
This list is automatically generated from the titles and abstracts of the papers in this site.