Integrating Temporal Representations for Dynamic Memory Retrieval and Management in Large Language Models
- URL: http://arxiv.org/abs/2410.13553v1
- Date: Thu, 17 Oct 2024 13:51:03 GMT
- Title: Integrating Temporal Representations for Dynamic Memory Retrieval and Management in Large Language Models
- Authors: Yuki Hou, Haruki Tamoto, Homei Miyashita
- Abstract summary: We propose SynapticRAG, a novel approach integrating synaptic dynamics into Retrieval-Augmented Generation (RAG)
Our approach advances context-aware dialogue AI systems by enhancing long-term context maintenance and specific information extraction from conversations.
- Score: 8.943924354248622
- Abstract: Conventional dialogue agents often struggle with effective memory recall, leading to redundant retrieval and inadequate management of unique user associations. To address this, we propose SynapticRAG, a novel approach integrating synaptic dynamics into Retrieval-Augmented Generation (RAG). SynapticRAG integrates temporal representations into memory vectors, mimicking biological synapses by differentiating events based on occurrence times and dynamically updating memory significance. This model employs temporal scoring for memory connections and a synaptic-inspired propagation control mechanism. Experiments across English, Japanese, and Chinese datasets demonstrate SynapticRAG's superiority over existing methods, including traditional RAG, with up to 14.66\% improvement in memory retrieval accuracy. Our approach advances context-aware dialogue AI systems by enhancing long-term context maintenance and specific information extraction from conversations.
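The abstract describes temporal scoring for memory connections but does not publish the scoring function. The sketch below illustrates the general idea of weighting semantic similarity by recency; the exponential decay form, the `tau` time constant, and all function names are assumptions for illustration, not the paper's implementation.

```python
import math

# Illustrative assumption: an exponential recency decay stands in for
# SynapticRAG's (unpublished here) temporal scoring of memory connections.
def temporal_score(similarity: float, last_access: float, now: float,
                   tau: float = 3600.0) -> float:
    """Weight a memory's semantic similarity by a recency factor that
    decays exponentially with the time since the memory was last used,
    loosely mimicking fading synaptic potentiation."""
    recency = math.exp(-(now - last_access) / tau)
    return similarity * recency

def retrieve(memories, now, top_k=2):
    """memories: list of (similarity, last_access_timestamp) pairs.
    Returns indices of the top_k memories under the temporal score."""
    ranked = sorted(range(len(memories)),
                    key=lambda i: temporal_score(*memories[i], now),
                    reverse=True)
    return ranked[:top_k]
```

Under this scoring, a moderately similar but recently accessed memory can outrank a more similar but stale one, which is the time-differentiated retrieval behaviour the abstract attributes to the model.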
Related papers
- Embodied-RAG: General Non-parametric Embodied Memory for Retrieval and Generation [65.23793829741014]
Embodied-RAG is a framework that enhances the model of an embodied agent with a non-parametric memory system.
At its core, Embodied-RAG's memory is structured as a semantic forest, storing language descriptions at varying levels of detail.
We demonstrate that Embodied-RAG effectively bridges RAG to the robotics domain, successfully handling over 200 explanation and navigation queries.
arXiv Detail & Related papers (2024-09-26T21:44:11Z)
- DyG-Mamba: Continuous State Space Modeling on Dynamic Graphs [59.434893231950205]
Dynamic graph learning aims to uncover evolutionary laws in real-world systems.
We propose DyG-Mamba, a new continuous state space model for dynamic graph learning.
We show that DyG-Mamba achieves state-of-the-art performance on most datasets.
arXiv Detail & Related papers (2024-08-13T15:21:46Z)
- Hello Again! LLM-powered Personalized Agent for Long-term Dialogue [63.65128176360345]
We introduce a model-agnostic framework, the Long-term Dialogue Agent (LD-Agent)
It incorporates three independently tunable modules dedicated to event perception, persona extraction, and response generation.
The effectiveness, generality, and cross-domain capabilities of LD-Agent are empirically demonstrated.
arXiv Detail & Related papers (2024-06-09T21:58:32Z)
- Semantically-correlated memories in a dense associative model [2.7195102129095003]
I introduce a novel associative memory model named Correlated Associative Memory (CDAM)
CDAM integrates both auto- and hetero-association in a unified framework for continuous-valued memory patterns.
It is theoretically and numerically analysed, revealing four distinct dynamical modes.
arXiv Detail & Related papers (2024-04-10T16:04:07Z)
- A Framework for Inference Inspired by Human Memory Mechanisms [9.408704431898279]
We propose a PMI framework that consists of perception, memory and inference components.
The memory module comprises working and long-term memory, with the latter endowed with a higher-order structure to retain extensive and complex relational knowledge and experience.
We apply our PMI to improve prevailing Transformers and CNN models on question-answering tasks like bAbI-20k and Sort-of-CLEVR datasets.
arXiv Detail & Related papers (2023-10-01T08:12:55Z)
- Relational Temporal Graph Reasoning for Dual-task Dialogue Language Understanding [39.76268402567324]
Dual-task dialogue language understanding aims to tackle two correlated dialogue language understanding tasks simultaneously via their inherent correlations.
We put forward a new framework, whose core is relational temporal graph reasoning.
Our models outperform state-of-the-art models by a large margin.
arXiv Detail & Related papers (2023-06-15T13:19:08Z)
- Sparse Coding in a Dual Memory System for Lifelong Learning [13.041607703862724]
The brain efficiently encodes information in non-overlapping sparse codes.
We employ sparse coding in a multiple-memory replay mechanism.
Our method maintains an additional long-term semantic memory that aggregates and consolidates information encoded in the synaptic weights of the working model.
arXiv Detail & Related papers (2022-12-28T12:56:15Z)
- Canonical Cortical Graph Neural Networks and its Application for Speech Enhancement in Future Audio-Visual Hearing Aids [0.726437825413781]
This paper proposes a more biologically plausible self-supervised machine learning approach that combines multimodal information using intra-layer modulations together with canonical correlation analysis (CCA)
The approach outperformed recent state-of-the-art results, achieving both better clean-audio reconstruction and improved energy efficiency, reflected in a reduced and smoother neuron firing-rate distribution.
arXiv Detail & Related papers (2022-06-06T15:20:07Z)
- Temporal Memory Relation Network for Workflow Recognition from Surgical Video [53.20825496640025]
We propose a novel end-to-end temporal memory relation network (TMNet) for relating long-range and multi-scale temporal patterns.
We have extensively validated our approach on two benchmark surgical video datasets.
arXiv Detail & Related papers (2021-03-30T13:20:26Z)
- PredRNN: A Recurrent Neural Network for Spatiotemporal Predictive Learning [109.84770951839289]
We present PredRNN, a new recurrent network for learning visual dynamics from historical context.
We show that our approach obtains highly competitive results on three standard datasets.
arXiv Detail & Related papers (2021-03-17T08:28:30Z)
- Sequential Recommender via Time-aware Attentive Memory Network [67.26862011527986]
We propose a temporal gating methodology to improve attention mechanism and recurrent units.
We also propose a Multi-hop Time-aware Attentive Memory network to integrate long-term and short-term preferences.
Our approach is scalable for candidate retrieval tasks and can be viewed as a non-linear generalization of latent factorization for dot-product based Top-K recommendation.
arXiv Detail & Related papers (2020-05-18T11:29:38Z)
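The temporal gating idea in the last entry above can be sketched as a gate that scales attention scores by the time elapsed since each interaction. The paper learns its gating parameters; the fixed reciprocal gate and the `scale` constant here are illustrative assumptions only.

```python
import math

# Illustrative assumption: a fixed reciprocal decay replaces the paper's
# learned temporal gate over attention scores.
def time_gate(delta_t: float, scale: float = 86400.0) -> float:
    """Map the gap since an interaction (seconds) to a (0, 1] gate,
    so recent interactions contribute more to the attention read-out."""
    return 1.0 / (1.0 + delta_t / scale)

def gated_attention(scores, deltas):
    """Scale raw attention scores by the temporal gate, then softmax."""
    gated = [s * time_gate(d) for s, d in zip(scores, deltas)]
    m = max(gated)                      # subtract max for numerical stability
    exps = [math.exp(g - m) for g in gated]
    z = sum(exps)
    return [e / z for e in exps]
```

Given two equally relevant items, the one seen a day ago receives less attention mass than the one seen just now, which is the short-term-preference effect the summary describes.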
This list is automatically generated from the titles and abstracts of the papers in this site.