Entropic Hetero-Associative Memory
- URL: http://arxiv.org/abs/2411.02438v1
- Date: Sat, 02 Nov 2024 08:30:57 GMT
- Title: Entropic Hetero-Associative Memory
- Authors: Rafael Morales, Luis A. Pineda
- Abstract summary: The Entropic Associative Memory holds objects in a 2D relation or "memory plane" using a finite table as the medium.
Stored objects are "overlapped" on the medium, hence the memory is indeterminate and has an entropy value at each state.
- Abstract: The Entropic Associative Memory holds objects in a 2D relation or "memory plane" using a finite table as the medium. Memory objects are stored by simultaneously reinforcing the cells used by the cue, implementing a form of Hebb's learning rule. Stored objects are "overlapped" on the medium, hence the memory is indeterminate and has an entropy value at each state. The retrieval operation constructs an object from the cue and this indeterminate content. In this paper we present the extension to the hetero-associative case, in which these properties are preserved. Pairs of hetero-associated objects, possibly of different domains and/or modalities, are held in a 4D relation. The memory retrieval operation selects a largely indeterminate 2D memory plane that is specific to the input cue; however, no cue is left to retrieve an object from that plane. We propose three incremental methods to address this missing-cue problem, which we call random, sample and test, and search and test. The model is assessed with composite recollections consisting of manuscript digits and letters selected from the MNIST and EMNIST corpora, respectively, such that cue digits retrieve their associated letters and vice versa. We report the memory performance and illustrate the memory retrieval operation using all three methods. The system shows promise for storing, recognizing and retrieving very large sets of objects with very limited computing resources.
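The storage and retrieval operations described in the abstract can be made concrete with a short sketch. The following is a minimal, illustrative Python implementation, not the authors' code: the class name EHAM, the count-based cell weights, the column-wise plane construction, and the plane_entropy measure are all assumptions, and only the "random" retrieval method is sketched.

```python
# Minimal sketch of a 4D entropic hetero-associative memory, assuming
# objects are discretized as column->row functions (integer vectors).
import numpy as np

class EHAM:
    """Toy hetero-associative memory over a 4D table of cell-usage counts.

    An object of domain A is a vector of length cols_a with values in
    [0, rows_a); likewise for domain B. Registering a pair reinforces
    every jointly used cell (a Hebb-like rule), so stored pairs overlap
    and each memory state is indeterminate.
    """

    def __init__(self, cols_a, rows_a, cols_b, rows_b):
        self.shape = (cols_a, rows_a, cols_b, rows_b)
        self.table = np.zeros(self.shape, dtype=np.uint32)  # usage counts

    def register(self, a, b):
        # Reinforce every cell used simultaneously by the pair (a, b).
        for i, va in enumerate(a):
            for j, vb in enumerate(b):
                self.table[i, va, j, vb] += 1

    def plane_for_cue(self, a):
        """Select the (largely indeterminate) 2D plane keyed by cue `a`
        by accumulating the evidence contributed by each of its cells."""
        return sum(self.table[i, a[i]] for i in range(self.shape[0]))

    def retrieve_random(self, a, rng=None):
        """The 'random' method: construct an object from the cue's plane
        by sampling one row per column in proportion to cell weights."""
        rng = rng or np.random.default_rng()
        plane = self.plane_for_cue(a).astype(float)
        out = np.empty(plane.shape[0], dtype=int)
        for j in range(plane.shape[0]):
            w = plane[j]
            if w.sum() == 0:                     # empty column: no evidence
                out[j] = rng.integers(plane.shape[1])
            else:
                out[j] = rng.choice(plane.shape[1], p=w / w.sum())
        return out

    def plane_entropy(self, a):
        """Mean column-wise Shannon entropy of the cue's plane (an assumed
        stand-in for the paper's memory-state entropy measure)."""
        p = self.plane_for_cue(a).astype(float)
        p = p / np.clip(p.sum(axis=1, keepdims=True), 1, None)
        with np.errstate(divide="ignore", invalid="ignore"):
            h = -np.where(p > 0, p * np.log2(p), 0.0).sum(axis=1)
        return h.mean()

# Toy usage with assumed sizes: 4-column cues, 8 row values per column.
mem = EHAM(cols_a=4, rows_a=8, cols_b=4, rows_b=8)
digit = np.array([1, 5, 2, 7])   # e.g., a discretized MNIST digit
letter = np.array([3, 0, 6, 4])  # e.g., a discretized EMNIST letter
mem.register(digit, letter)
print(mem.retrieve_random(digit))  # constructs the associated letter
```

Per the abstract, the sample and test and search and test methods would extend this by repeatedly drawing, or systematically searching for, candidate objects from the plane and accepting one that passes a test (for instance, a classifier over the constructed object); those loops are omitted here.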
Related papers
- Associative Memories in the Feature Space [68.1903319310263]
We propose a class of memory models that store only low-dimensional semantic embeddings and use them to retrieve similar, but not identical, memories.
We demonstrate a proof of concept of this method on a simple task on the MNIST dataset; a generic sketch of this retrieval scheme is shown after this list.
arXiv Detail & Related papers (2024-02-16T16:37:48Z)
- Lift Yourself Up: Retrieval-augmented Text Generation with Self Memory [72.36736686941671]
We propose a novel framework, Selfmem, for improving retrieval-augmented generation models.
Selfmem iteratively employs a retrieval-augmented generator to create an unbounded memory pool and a memory selector to choose one output as memory for the subsequent generation round.
We evaluate the effectiveness of Selfmem on three distinct text generation tasks.
arXiv Detail & Related papers (2023-05-03T21:40:54Z)
- Classification and Generation of real-world data with an Associative Memory Model [0.0]
We extend the capabilities of the basic Associative Memory Model by using a Multiple-Modality framework.
By storing both the images and labels as modalities, a single Memory can be used to retrieve and complete patterns.
arXiv Detail & Related papers (2022-07-11T12:51:27Z)
- LaMemo: Language Modeling with Look-Ahead Memory [50.6248714811912]
We propose Look-Ahead Memory (LaMemo) that enhances the recurrence memory by incrementally attending to the right-side tokens.
LaMemo embraces bi-directional attention and segment recurrence with an additional overhead only linearly proportional to the memory length.
Experiments on widely used language modeling benchmarks demonstrate its superiority over the baselines equipped with different types of memory.
arXiv Detail & Related papers (2022-04-15T06:11:25Z)
- MeMOT: Multi-Object Tracking with Memory [97.48960039220823]
Our model, called MeMOT, consists of three main modules, all Transformer-based.
MeMOT achieves very competitive performance on widely adopted MOT datasets.
arXiv Detail & Related papers (2022-03-31T02:33:20Z)
- Entropic Associative Memory for Manuscript Symbols [0.0]
Manuscript symbols can be stored, recognized and retrieved from an entropic digital memory that is associative and distributed yet declarative.
We discuss the operational characteristics of the entropic associative memory for retrieving objects with both complete and incomplete information.
arXiv Detail & Related papers (2022-02-17T02:29:33Z)
- Rethinking Space-Time Networks with Improved Memory Coverage for Efficient Video Object Segmentation [68.45737688496654]
We establish correspondences directly between frames without re-encoding the mask features for every object.
With the correspondences, every node in the current query frame is inferred by aggregating features from the past in an associative fashion.
We validated that every memory node now has a chance to contribute, and experimentally showed that such diversified voting is beneficial to both memory efficiency and inference accuracy.
arXiv Detail & Related papers (2021-06-09T16:50:57Z)
- Self-Attentive Associative Memory [69.40038844695917]
We propose to separate the storage of individual experiences (item memory) from that of the relationships among them (relational memory).
We achieve competitive results with our proposed two-memory model in a diversity of machine learning tasks.
arXiv Detail & Related papers (2020-02-10T03:27:48Z)
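As a point of contrast with the table-based scheme sketched above, here is a minimal sketch of the embedding-store-and-retrieve idea from "Associative Memories in the Feature Space". It assumes cosine-similarity nearest-neighbor lookup, which is one natural reading of the summary, not the paper's actual model; the class and method names are illustrative.

```python
# Generic sketch of an embedding-based associative memory: store only
# low-dimensional embeddings and retrieve the most similar stored item.
import numpy as np

class EmbeddingMemory:
    """Retrieval returns a similar, but not necessarily identical, memory."""

    def __init__(self):
        self.keys = []    # normalized embeddings
        self.values = []  # associated payloads (labels, ids, ...)

    def store(self, embedding, payload):
        e = np.asarray(embedding, dtype=float)
        self.keys.append(e / np.linalg.norm(e))  # normalize for cosine
        self.values.append(payload)

    def retrieve(self, query):
        q = np.asarray(query, dtype=float)
        q = q / np.linalg.norm(q)
        sims = np.stack(self.keys) @ q           # cosine similarities
        return self.values[int(np.argmax(sims))]

# Toy usage with assumed 3-dimensional embeddings.
mem = EmbeddingMemory()
mem.store([0.9, 0.1, 0.0], "seven")
mem.store([0.0, 0.8, 0.6], "four")
print(mem.retrieve([0.8, 0.2, 0.1]))  # -> "seven"
```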
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences of its use.