Structured Episodic Event Memory
- URL: http://arxiv.org/abs/2601.06411v1
- Date: Sat, 10 Jan 2026 03:17:25 GMT
- Title: Structured Episodic Event Memory
- Authors: Zhengxuan Lu, Dongfang Li, Yukun Shi, Beilun Wang, Longyue Wang, Baotian Hu,
- Abstract summary: We propose Structured Episodic Event Memory (SEEM), a hierarchical framework that synergizes a graph memory layer for relational facts with a dynamic episodic memory layer for narrative progression. Experimental results on the LoCoMo and LongMemEval benchmarks demonstrate that SEEM significantly outperforms baselines.
- Score: 37.643537420763344
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Current approaches to memory in Large Language Models (LLMs) predominantly rely on static Retrieval-Augmented Generation (RAG), which often results in scattered retrieval and fails to capture the structural dependencies required for complex reasoning. For autonomous agents, these passive and flat architectures lack the cognitive organization necessary to model the dynamic and associative nature of long-term interaction. To address this, we propose Structured Episodic Event Memory (SEEM), a hierarchical framework that synergizes a graph memory layer for relational facts with a dynamic episodic memory layer for narrative progression. Grounded in cognitive frame theory, SEEM transforms interaction streams into structured Episodic Event Frames (EEFs) anchored by precise provenance pointers. Furthermore, we introduce an agentic associative fusion and Reverse Provenance Expansion (RPE) mechanism to reconstruct coherent narrative contexts from fragmented evidence. Experimental results on the LoCoMo and LongMemEval benchmarks demonstrate that SEEM significantly outperforms baselines, enabling agents to maintain superior narrative coherence and logical consistency.
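The abstract describes Episodic Event Frames (EEFs) anchored by provenance pointers, plus a Reverse Provenance Expansion (RPE) step that rebuilds narrative context around retrieved fragments. The sketch below is a minimal illustration under assumed data shapes; every name here (`EpisodicEventFrame`, `Provenance`, the `window` parameter) is hypothetical and not the paper's API.

```python
from dataclasses import dataclass

# Hypothetical sketch of an Episodic Event Frame with a provenance pointer.
# Field names and the expansion logic are illustrative assumptions only.

@dataclass
class Provenance:
    session_id: str   # which interaction stream the event came from
    start: int        # index of the first source turn
    end: int          # index of the last source turn (inclusive)

@dataclass
class EpisodicEventFrame:
    event: str              # short description of the episodic event
    entities: list          # relational facts could feed a graph layer
    provenance: Provenance  # precise pointer back to the raw turns

def reverse_provenance_expansion(frames, transcript, window=2):
    """Given retrieved frames, pull the raw turns around each provenance
    span so the agent sees coherent narrative, not isolated fragments."""
    spans = []
    for f in frames:
        lo = max(0, f.provenance.start - window)
        hi = min(len(transcript), f.provenance.end + 1 + window)
        spans.append((lo, hi))
    # Merge overlapping spans to avoid duplicated context.
    spans.sort()
    merged = []
    for lo, hi in spans:
        if merged and lo <= merged[-1][1]:
            merged[-1] = (merged[-1][0], max(merged[-1][1], hi))
        else:
            merged.append((lo, hi))
    return ["\n".join(transcript[lo:hi]) for lo, hi in merged]
```

Nearby frames expand into one merged context window, which matches the stated goal of reconstructing a coherent narrative from fragmented evidence.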
Related papers
- Understand Then Memory: A Cognitive Gist-Driven RAG Framework with Global Semantic Diffusion [14.538534837583931]
Retrieval-Augmented Generation (RAG) effectively mitigates hallucinations in LLMs by incorporating external knowledge. We propose CogitoRAG, a RAG framework that simulates human cognitive memory processes. We show that CogitoRAG significantly outperforms state-of-the-art RAG methods, showcasing superior capabilities in complex knowledge integration and reasoning.
arXiv Detail & Related papers (2026-02-11T12:58:08Z)
- E-mem: Multi-agent based Episodic Context Reconstruction for LLM Agent Memory [4.8183840404266185]
E-mem is a framework shifting from Memory Preprocessing to Episodic Context Reconstruction. E-mem achieves over 54% F1, surpassing the state-of-the-art GAM by 7.75%, while reducing token cost by over 70%.
arXiv Detail & Related papers (2026-01-29T13:42:42Z)
- Temporal Complexity and Self-Organization in an Exponential Dense Associative Memory Model [0.0]
Temporal Complexity (TC) is a framework that characterizes complex systems by intermittent transition events between order and disorder. Our results reveal that the SEDAM model exhibits regimes of complex intermittency characterized by nontrivial temporal correlations and scale-free behavior. This study highlights the relevance of TC as a complementary framework for understanding learning and information processing in artificial and biological neural systems.
arXiv Detail & Related papers (2026-01-16T18:01:14Z)
- The AI Hippocampus: How Far are We From Human Memory? [77.04745635827278]
Implicit memory refers to the knowledge embedded within the internal parameters of pre-trained transformers. Explicit memory involves external storage and retrieval components designed to augment model outputs with dynamic, queryable knowledge representations. Agentic memory introduces persistent, temporally extended memory structures within autonomous agents.
arXiv Detail & Related papers (2026-01-14T03:24:08Z)
- Disco-RAG: Discourse-Aware Retrieval-Augmented Generation [81.53888908988756]
We propose Disco-RAG, a discourse-aware framework that injects discourse signals into the generation process. Our method constructs intra-chunk discourse trees to capture local hierarchies and builds inter-chunk rhetorical graphs to model cross-passage coherence. Experiments on question answering and long-document summarization benchmarks show the efficacy of our approach.
arXiv Detail & Related papers (2026-01-07T20:32:50Z)
- Improving Multi-step RAG with Hypergraph-based Memory for Long-Context Complex Relational Modeling [83.29209853451697]
Multi-step retrieval-augmented generation (RAG) has become a widely adopted strategy for enhancing large language models (LLMs). We introduce HGMem, a hypergraph-based memory mechanism that extends the concept of memory into a dynamic, expressive structure for complex reasoning and global understanding. In our approach, memory is represented as a hypergraph whose hyperedges correspond to distinct memory units, enabling the progressive formation of higher-order interactions within memory.
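A hyperedge-as-memory-unit structure can be sketched as follows; this is an illustrative toy in the spirit of the HGMem abstract, with assumed names and API (`HypergraphMemory`, `add_memory`, `query`), not the paper's implementation.

```python
from collections import defaultdict

# Toy hypergraph memory: each hyperedge is one memory unit linking an
# arbitrary set of entities, so one unit can relate many items at once.

class HypergraphMemory:
    def __init__(self):
        self.hyperedges = {}               # edge_id -> (entities, summary)
        self.incidence = defaultdict(set)  # entity -> ids of touching edges
        self._next_id = 0

    def add_memory(self, entities, summary):
        """Store one memory unit as a hyperedge over its entities."""
        eid = self._next_id
        self._next_id += 1
        self.hyperedges[eid] = (frozenset(entities), summary)
        for e in entities:
            self.incidence[e].add(eid)
        return eid

    def query(self, entities):
        """Return summaries of memory units touching any query entity,
        ranked by how many query entities each hyperedge covers."""
        hits = defaultdict(int)
        for e in entities:
            for eid in self.incidence[e]:
                hits[eid] += 1
        ranked = sorted(hits, key=lambda eid: -hits[eid])
        return [self.hyperedges[eid][1] for eid in ranked]
```

Because a hyperedge joins any number of entities, a single stored unit captures a higher-order relation that pairwise graph edges would have to decompose.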
arXiv Detail & Related papers (2025-12-30T03:13:10Z)
- CAM: A Constructivist View of Agentic Memory for LLM-Based Reading Comprehension [55.29309306566238]
Current Large Language Models (LLMs) are confronted with overwhelming information volume when comprehending long-form documents. This challenge raises the imperative of a cohesive memory module, which can elevate vanilla LLMs into autonomous reading agents. We draw inspiration from Jean Piaget's Constructivist Theory, illuminating three traits of the agentic memory -- structured schemata, flexible assimilation, and dynamic accommodation.
arXiv Detail & Related papers (2025-10-07T02:16:30Z)
- Cognitive Weave: Synthesizing Abstracted Knowledge with a Spatio-Temporal Resonance Graph [2.800801614127705]
This paper introduces Cognitive Weave, a memory framework centered around a multi-layered dynamic resonance graph (GSTR). GSTR manages information as semantically rich insight particles (IPs), which are enriched with resonance keys, signifiers, and situational imprints via a dedicated semantic oracle interface (ISO). A key component of Cognitive Weave is the cognitive process, which includes the synthesis of insight aggregates (AsI): condensed, higher-level knowledge structures.
arXiv Detail & Related papers (2025-06-09T18:00:46Z)
- Latent Structured Hopfield Network for Semantic Association and Retrieval [52.634915010996835]
Episodic memory enables humans to recall past experiences by associating semantic elements such as objects, locations, and time into coherent event representations. We propose the Latent Structured Hopfield Network (LSHN), a framework that integrates continuous Hopfield attractor dynamics into an autoencoder architecture. Unlike traditional Hopfield networks, our model is trained end-to-end with gradient descent, achieving scalable and robust memory retrieval.
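For intuition about the attractor retrieval that LSHN builds on, here is a minimal classical Hopfield associative memory. This binary sketch does not reproduce LSHN itself, which embeds continuous Hopfield dynamics inside a gradient-trained autoencoder; it only shows the basic attractor idea.

```python
import numpy as np

# Classical binary Hopfield network: Hebbian storage, sign-update recall.
# Patterns are +/-1 vectors; retrieval converges to a stored attractor.

class HopfieldMemory:
    def __init__(self, dim):
        self.W = np.zeros((dim, dim))

    def store(self, patterns):
        """Hebbian learning: W += p p^T per pattern, zero diagonal."""
        for p in patterns:
            self.W += np.outer(p, p)
        np.fill_diagonal(self.W, 0)

    def recall(self, probe, steps=10):
        """Iterate s <- sign(W s) until a fixed point (attractor)."""
        s = probe.copy()
        for _ in range(steps):
            nxt = np.sign(self.W @ s)
            nxt[nxt == 0] = 1  # break ties toward +1
            if np.array_equal(nxt, s):
                break
            s = nxt
        return s
```

Corrupting a few bits of a stored pattern and running `recall` pulls the state back to the original pattern, which is the associative-retrieval behavior the paper's latent variant scales up.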
arXiv Detail & Related papers (2025-06-02T04:24:36Z)
- Temporal Model On Quantum Logic [0.0]
The framework formalizes the evolution of propositions over time using linear and branching temporal models. The hierarchical organization of memory is represented using directed acyclic graphs.
arXiv Detail & Related papers (2025-02-09T17:16:53Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.