'Memory States' from Almost Nothing: Representing and Computing in a Non-associative Algebra
- URL: http://arxiv.org/abs/2506.13768v1
- Date: Tue, 13 May 2025 08:43:02 GMT
- Title: 'Memory States' from Almost Nothing: Representing and Computing in a Non-associative Algebra
- Authors: Stefan Reimann
- Abstract summary: This note presents a non-associative framework for the representation and computation of information items in high-dimensional space. It is consistent with the principles of spatial computing and with the empirical findings in cognitive science about memory.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This note presents a non-associative algebraic framework for the representation and computation of information items in high-dimensional space. This framework is consistent with the principles of spatial computing and with the empirical findings in cognitive science about memory. Computations are performed through a process of multiplication-like binding and non-associative interference-like bundling. Models that rely on associative bundling typically lose order information, which necessitates the use of auxiliary order structures, such as position markers, to represent sequential information that is important for cognitive tasks. In contrast, the non-associative bundling proposed allows the construction of sparse representations of arbitrarily long sequences that maintain their temporal structure across arbitrary lengths. In this operation, noise is a constituent element of the representation of order information, rather than a means of obscuring it. The non-associative nature of the proposed framework results in the representation of a single sequence by two distinct states. The L-state, generated through left-associative bundling, continuously updates and emphasises a recency effect, while the R-state, formed through right-associative bundling, encodes finite sequences or chunks, capturing a primacy effect. The construction of these states may be associated with activity in the prefrontal cortex in relation to short-term memory and hippocampal encoding in long-term memory, respectively. The accuracy of retrieval is contingent upon a decision-making process that is based on the mutual information between the memory states and the cue. The model is able to replicate the Serial Position Curve, which reflects the empirical recency and primacy effects observed in cognitive experiments.
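As a reading aid, the following is a minimal sketch of how such L- and R-states could behave, assuming binary hypervectors, XOR binding, and a pairwise "majority with random tie-break" bundling; the function names, the noise rule, and the sequence length are illustrative assumptions, not the paper's definitions.

```python
import numpy as np
from functools import reduce

rng = np.random.default_rng(0)
DIM = 10_000  # high dimension: random vectors are quasi-orthogonal

def random_item() -> np.ndarray:
    """A random binary hypervector standing for one item."""
    return rng.integers(0, 2, size=DIM, dtype=np.int8)

def bind(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Multiplication-like binding (XOR for binary vectors); shown for
    completeness, not used in this demo."""
    return np.bitwise_xor(a, b)

def bundle(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Interference-like bundling: agreeing components are kept, and
    disagreements are resolved by a fresh coin flip. The noise is
    constitutive: it dilutes earlier contributions, so the grouping of
    repeated bundling matters and the operation is non-associative."""
    coin = rng.integers(0, 2, size=DIM, dtype=np.int8)
    return np.where(a == b, a, coin)

def similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Component agreement rate: about 0.5 if unrelated, 1.0 if identical."""
    return float(np.mean(a == b))

items = [random_item() for _ in range(8)]

# L-state: left-associative fold (((x1 + x2) + x3) + ...) + x8
l_state = reduce(bundle, items)

# R-state: right-associative fold x1 + (x2 + (... + (x7 + x8)))
r_state = reduce(lambda acc, x: bundle(x, acc), reversed(items))

print("position   sim to L-state   sim to R-state")
for pos, x in enumerate(items, start=1):
    print(f"{pos:>8}   {similarity(x, l_state):14.3f}   {similarity(x, r_state):14.3f}")
# Expected: similarity to the L-state rises toward late positions
# (recency), while similarity to the R-state is highest for early
# positions (primacy); together, a Serial-Position-Curve-like profile.
```

Because this bundling is commutative but not associative, only the grouping distinguishes the two states: the left fold geometrically discounts older items, while the right fold discounts newer ones.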
Related papers
- Sequential Memory with Temporal Predictive Coding [6.228559238589584]
We propose a PC-based model for sequential memory, called temporal predictive coding (tPC).
We show that our tPC models can memorize and retrieve sequential inputs accurately with a biologically plausible neural implementation.
arXiv Detail & Related papers (2023-05-19T20:03:31Z)
- Neural Eigenfunctions Are Structured Representation Learners [93.53445940137618]
This paper introduces a structured, adaptive-length deep representation called Neural Eigenmap.
We show that, when the eigenfunction is derived from positive relations in a data augmentation setup, applying NeuralEF results in an objective function that resembles those of popular self-supervised learning methods.
We demonstrate the use of such representations as adaptive-length codes in image retrieval systems.
arXiv Detail & Related papers (2022-10-23T07:17:55Z)
- On the Relationship Between Variational Inference and Auto-Associative Memory [68.8204255655161]
We study how different neural network approaches to variational inference can be applied in this framework.
We evaluate the obtained algorithms on the CIFAR10 and CLEVR image datasets and compare them with other associative memory models.
arXiv Detail & Related papers (2022-10-14T14:18:47Z)
- Learning Sequence Representations by Non-local Recurrent Neural Memory [61.65105481899744]
We propose a Non-local Recurrent Neural Memory (NRNM) for supervised sequence representation learning.
Our model is able to capture long-range dependencies and to distill latent high-level features.
Our model compares favorably against other state-of-the-art methods specifically designed for each of these sequence applications.
arXiv Detail & Related papers (2022-07-20T07:26:15Z)
- Hybrid Predictive Coding: Inferring, Fast and Slow [62.997667081978825]
We propose a hybrid predictive coding network that combines both iterative and amortized inference in a principled manner.
We demonstrate that our model is inherently sensitive to its uncertainty and adaptively balances the two to obtain accurate beliefs with minimal computational expense.
arXiv Detail & Related papers (2022-04-05T12:52:45Z)
- A novel HD Computing Algebra: Non-associative superposition of states creating sparse bundles representing order information [0.0]
Cognitive computing requires representing item information as well as sequential information.
A simple binary bundling rule inspired by the summation of neuronal activities allows the resulting memory state to represent both item and sequential information.
The memory state resulting from bundling together an arbitrary number of items is non-homogeneous and has a degree of sparseness that is controlled by the activation threshold in the summation (see the sketch after this list).
arXiv Detail & Related papers (2022-02-17T12:40:32Z)
- Entropic Associative Memory for Manuscript Symbols [0.0]
Manuscript symbols can be stored, recognized and retrieved from an entropic digital memory that is associative and distributed, yet declarative.
We discuss the operational characteristics of the entropic associative memory for retrieving objects with both complete and incomplete information.
arXiv Detail & Related papers (2022-02-17T02:29:33Z)
- Rethinking Space-Time Networks with Improved Memory Coverage for Efficient Video Object Segmentation [68.45737688496654]
We establish correspondences directly between frames without re-encoding the mask features for every object.
With the correspondences, every node in the current query frame is inferred by aggregating features from the past in an associative fashion.
We validated that every memory node now has a chance to contribute, and experimentally showed that such diversified voting is beneficial to both memory efficiency and inference accuracy.
arXiv Detail & Related papers (2021-06-09T16:50:57Z)
- A STDP-based Encoding Algorithm for Associative and Composite Data [0.0]
This work proposes a practical memory model based on STDP that can store and retrieve high-dimensional associative data.
The model combines STDP dynamics with an encoding scheme for distributed representations and can handle multiple composite data in a continuous manner.
arXiv Detail & Related papers (2021-04-25T20:26:52Z)
- Representation Learning for Sequence Data with Deep Autoencoding Predictive Components [96.42805872177067]
We propose a self-supervised representation learning method for sequence data, based on the intuition that useful representations of sequence data should exhibit a simple structure in the latent space.
We encourage this latent structure by maximizing an estimate of predictive information of latent feature sequences, which is the mutual information between past and future windows at each time step.
We demonstrate that our method recovers the latent space of noisy dynamical systems, extracts predictive features for forecasting tasks, and improves automatic speech recognition when used to pretrain the encoder on large amounts of unlabeled data.
arXiv Detail & Related papers (2020-10-07T03:34:01Z)
- An Entropic Associative Memory [0.0]
We use Relational-Indeterminate Computing to retrieve associative memory registers that hold representations of individual objects.
The system has been used to model a visual memory holding the representations of hand-written digits.
The similarity between the cue and the object recovered in memory operations depends on the entropy of the memory register.
arXiv Detail & Related papers (2020-09-28T04:24:21Z)
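A companion sketch for the threshold-controlled bundling described in the "A novel HD Computing Algebra" entry above: binary activities are summed componentwise over an arbitrary number of items, and only components whose summed activity reaches the activation threshold stay active. The n-ary formulation and all parameter values are assumptions for illustration, not that paper's exact rule.

```python
import numpy as np

rng = np.random.default_rng(1)
DIM = 10_000

def bundle(items: list[np.ndarray], theta: int) -> np.ndarray:
    """Bundle an arbitrary number of binary items: sum the componentwise
    activities and activate the components whose summed activity reaches
    the threshold theta. A higher theta yields a sparser memory state."""
    return (np.sum(items, axis=0) >= theta).astype(np.int8)

items = [rng.integers(0, 2, size=DIM, dtype=np.int8) for _ in range(10)]

for theta in (3, 5, 7, 9):
    state = bundle(items, theta)
    # Stored items agree with the state above chance (about 0.5), so the
    # state keeps item information even as the threshold makes it sparser.
    stored = float(np.mean(items[0] == state))
    novel = float(np.mean(rng.integers(0, 2, size=DIM, dtype=np.int8) == state))
    print(f"theta={theta}: active fraction {state.mean():.3f}, "
          f"agreement with a stored item {stored:.3f} vs a novel item {novel:.3f}")
```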