Memory, Space, and Planning: Multiscale Predictive Representations
- URL: http://arxiv.org/abs/2401.09491v2
- Date: Mon, 19 Feb 2024 21:01:23 GMT
- Authors: Ida Momennejad
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Memory is inherently entangled with prediction and planning. Flexible
behavior in biological and artificial agents depends on the interplay of
learning from the past and predicting the future in ever-changing environments.
This chapter reviews computational, behavioral, and neural evidence suggesting
these processes rely on learning the relational structure of experiences, known
as cognitive maps, and draws two key takeaways. First, that these memory
structures are organized as multiscale, compact predictive representations in
hippocampal and prefrontal cortex, or PFC, hierarchies. Second, we argue that
such predictive memory structures are crucial to the complementary functions of
the hippocampus and PFC, enabling both the recall of detailed, coherent past
episodes and the generalization of experiences at varying scales for efficient
prediction and planning. These insights advance our understanding of
memory and planning mechanisms in the brain and hold significant implications
for advancing artificial intelligence systems.
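The multiscale predictive representations the chapter describes are commonly formalized with the successor representation (SR), in which each discount factor defines a predictive horizon: small discounts yield fine-grained, short-range maps and large discounts yield coarse, long-range ones. A minimal sketch follows; the 3-state chain environment and the specific discount values are illustrative assumptions, not taken from the chapter.

```python
import numpy as np

# Transition matrix for a hypothetical 3-state chain: 0 -> 1 -> 2, with 2 absorbing.
T = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [0.0, 0.0, 1.0]])

def successor_representation(T, gamma):
    """SR: M = sum_k gamma^k T^k = (I - gamma*T)^{-1}.
    M[s, s2] is the discounted expected future occupancy of s2 starting from s."""
    n = T.shape[0]
    return np.linalg.inv(np.eye(n) - gamma * T)

# A multiscale "hierarchy": each gamma is one predictive scale.
for gamma in (0.3, 0.9):
    M = successor_representation(T, gamma)
    print(f"gamma={gamma}: expected occupancy of state 2 from state 0 = {M[0, 2]:.3f}")
```

With the chain above, occupancy of the absorbing state from state 0 is gamma^2 / (1 - gamma), so the long-horizon map (gamma=0.9) weights the distant state far more heavily than the short-horizon map (gamma=0.3), which is the sense in which a bank of SRs forms a multiscale predictive hierarchy.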
Related papers
- Neuron: Learning Context-Aware Evolving Representations for Zero-Shot Skeleton Action Recognition [64.56321246196859]
We propose a novel dyNamically Evolving dUal skeleton-semantic syneRgistic framework.
We first construct the spatial-temporal evolving micro-prototypes and integrate dynamic context-aware side information.
We introduce the spatial compression and temporal memory mechanisms to guide the growth of spatial-temporal micro-prototypes.
arXiv Detail & Related papers (2024-11-18T05:16:11Z)
- Predictive Attractor Models [9.947717243638289]
We propose Predictive Attractor Models (PAM), a novel sequence memory architecture with desirable generative properties.
PAM avoids catastrophic forgetting by uniquely representing past context through lateral inhibition in cortical minicolumns.
We show that PAM is trained with local computations through Hebbian plasticity rules in a biologically plausible framework.
arXiv Detail & Related papers (2024-10-03T12:25:01Z)
- Hierarchical Working Memory and a New Magic Number [1.024113475677323]
We propose a recurrent neural network model for chunking within the framework of the synaptic theory of working memory.
Our work provides a novel conceptual and analytical framework for understanding the on-the-fly organization of information in the brain that is crucial for cognition.
arXiv Detail & Related papers (2024-08-14T16:03:47Z)
- Spatially-Aware Transformer for Embodied Agents [20.498778205143477]
This paper explores the use of Spatially-Aware Transformer models that incorporate spatial information.
We demonstrate that memory utilization efficiency can be improved, leading to enhanced accuracy in various place-centric downstream tasks.
We also propose the Adaptive Memory Allocator, a memory management method based on reinforcement learning.
arXiv Detail & Related papers (2024-02-23T07:46:30Z)
- A Framework for Inference Inspired by Human Memory Mechanisms [9.408704431898279]
We propose a PMI framework that consists of perception, memory and inference components.
The memory module comprises working and long-term memory, with the latter endowed with a higher-order structure to retain extensive and complex relational knowledge and experience.
We apply our PMI framework to improve prevailing Transformer and CNN models on question-answering tasks using the bAbI-20k and Sort-of-CLEVR datasets.
arXiv Detail & Related papers (2023-10-01T08:12:55Z)
- Memory-and-Anticipation Transformer for Online Action Understanding [52.24561192781971]
We propose a novel memory-anticipation-based paradigm to model an entire temporal structure, including the past, present, and future.
We present Memory-and-Anticipation Transformer (MAT), a memory-anticipation-based approach, to address the online action detection and anticipation tasks.
arXiv Detail & Related papers (2023-08-15T17:34:54Z)
- Sequential Memory with Temporal Predictive Coding [6.228559238589584]
We propose a predictive coding (PC) based model for sequential memory, called temporal predictive coding (tPC).
We show that our tPC models can memorize and retrieve sequential inputs accurately with a biologically plausible neural implementation.
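The core idea in temporal predictive coding, predicting the next input and updating weights from the prediction error with a local Hebbian-style rule, can be illustrated with a single-layer sketch. The toy one-hot patterns and learning rate below are assumptions for illustration, not the paper's full model.

```python
import numpy as np

# Toy sequence of orthogonal (one-hot) patterns: A -> B -> C (illustrative).
patterns = np.eye(3)

# W predicts the next pattern from the current one.
W = np.zeros((3, 3))
lr = 1.0  # learning rate (illustrative)

# Local learning from the temporal prediction error:
#   error = x_t - W @ x_{t-1};  dW = lr * outer(error, x_{t-1})
for prev, curr in zip(patterns[:-1], patterns[1:]):
    error = curr - W @ prev
    W += lr * np.outer(error, prev)

# Recall: cue with the first pattern and roll the sequence forward.
x = patterns[0]
recalled = []
for _ in range(2):
    x = W @ x
    recalled.append(int(np.argmax(x)))
print(recalled)  # -> [1, 2]: the stored sequence B, C is retrieved from cue A
```

Because the update uses only pre- and post-synaptic activity at each step, it is Hebbian and local, which is the biological-plausibility property the abstract emphasizes.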
arXiv Detail & Related papers (2023-05-19T20:03:31Z)
- On the Relationship Between Variational Inference and Auto-Associative Memory [68.8204255655161]
We study how different neural network approaches to variational inference can be applied in this framework.
We evaluate the obtained algorithms on the CIFAR10 and CLEVR image datasets and compare them with other associative memory models.
arXiv Detail & Related papers (2022-10-14T14:18:47Z)
- CogNGen: Constructing the Kernel of a Hyperdimensional Predictive Processing Cognitive Architecture [79.07468367923619]
We present a new cognitive architecture that combines two neurobiologically plausible, computational models.
We aim to develop a cognitive architecture that has the power of modern machine learning techniques.
arXiv Detail & Related papers (2022-03-31T04:44:28Z)
- Long-range and hierarchical language predictions in brains and algorithms [82.81964713263483]
We show that while deep language algorithms are optimized to predict adjacent words, the human brain would be tuned to make long-range and hierarchical predictions.
This study strengthens predictive coding theory and suggests a critical role of long-range and hierarchical predictions in natural language processing.
arXiv Detail & Related papers (2021-11-28T20:26:07Z)
- Towards a Neural Model for Serial Order in Frontal Cortex: a Brain Theory from Memory Development to Higher-Level Cognition [53.816853325427424]
We propose that the immature prefrontal cortex (PFC) uses its primary functionality of detecting hierarchical patterns in temporal signals.
Our hypothesis is that the PFC detects the hierarchical structure of temporal sequences in the form of ordinal patterns and uses them to index information hierarchically in different parts of the brain.
By doing so, it gives the tools to the language-ready brain for manipulating abstract knowledge and planning temporally ordered information.
arXiv Detail & Related papers (2020-05-22T14:29:51Z)
This list is automatically generated from the titles and abstracts of the papers in this site.