Transfer between long-term and short-term memory using Conceptors
- URL: http://arxiv.org/abs/2003.11640v1
- Date: Wed, 11 Mar 2020 09:13:58 GMT
- Title: Transfer between long-term and short-term memory using Conceptors
- Authors: Anthony Strock (Mnemosyne, LaBRI, IMN), Nicolas Rougier (Mnemosyne,
LaBRI, IMN), Xavier Hinaut (Mnemosyne, LaBRI, IMN)
- Abstract summary: We introduce a recurrent neural network model of working memory combining short-term and long-term components.
We show how standard operations on conceptors allow us to combine long-term memories and describe their effect on short-term memory.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We introduce a recurrent neural network model of working memory combining
short-term and long-term components. The short-term component is modelled using a
gated reservoir model that is trained to hold a value from an input stream when
a gate signal is on. The long-term component is modelled using conceptors in
order to store inner temporal patterns (which correspond to values). We combine
these two components to obtain a model where information can go from long-term
memory to short-term memory and vice versa, and we show how standard operations
on conceptors allow us to combine long-term memories and describe their effect on
short-term memory.
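To make the conceptor machinery concrete, below is a minimal numpy sketch following Jaeger's conceptor framework, on which this model builds: a conceptor C = R(R + α⁻²I)⁻¹ is computed from the correlation matrix R of reservoir states, and stored memories are combined with the boolean operations NOT, AND and OR. The reservoir states, dimensions and aperture below are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def conceptor(states, aperture=10.0):
    """Conceptor C = R (R + aperture^-2 I)^-1 computed from
    reservoir states of shape (timesteps, units)."""
    L, N = states.shape
    R = states.T @ states / L                      # state correlation matrix
    return R @ np.linalg.inv(R + aperture**-2 * np.eye(N))

# Boolean operations on conceptors (Jaeger, 2014):
def NOT(C):
    return np.eye(len(C)) - C

def AND(C1, C2):
    # Assumes C1 and C2 are invertible (full-rank correlation matrices).
    I = np.eye(len(C1))
    return np.linalg.inv(np.linalg.inv(C1) + np.linalg.inv(C2) - I)

def OR(C1, C2):
    # De Morgan: C1 OR C2 = NOT(NOT C1 AND NOT C2)
    return NOT(AND(NOT(C1), NOT(C2)))

# Toy usage: conceptors for two stored temporal patterns, merged with OR.
rng = np.random.default_rng(0)
X1 = rng.standard_normal((200, 50))   # stand-in reservoir states, pattern 1
X2 = rng.standard_normal((200, 50))   # stand-in reservoir states, pattern 2
C_merged = OR(conceptor(X1), conceptor(X2))   # admits both patterns
```

In this framework, recalling a stored pattern amounts to constraining the running reservoir state with its conceptor (x ← C x); the OR of two conceptors admits both patterns, which is the sense in which boolean operations combine long-term memories.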
Related papers
- MeMSVD: Long-Range Temporal Structure Capturing Using Incremental SVD [27.472705540825316]
This paper is on long-term video understanding, where the goal is to recognise human actions over long temporal windows (up to minutes long).
We propose an alternative to attention-based schemes which is based on a low-rank approximation of the memory obtained using Singular Value Decomposition.
Our scheme has two advantages: (a) it reduces complexity by more than an order of magnitude, and (b) it is amenable to an efficient implementation for the calculation of the memory bases.
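As a rough illustration of this low-rank idea, and not the paper's incremental algorithm (which updates the decomposition as new frames arrive rather than recomputing it), a memory of per-frame features can be summarised by its dominant singular vectors; the shapes and names below are assumptions:

```python
import numpy as np

def compress_memory(memory, k):
    """Low-rank memory via truncated SVD.
    memory: (T, d) matrix of per-frame features; returns the k right
    singular vectors (k, d) spanning the dominant subspace."""
    U, S, Vt = np.linalg.svd(memory, full_matrices=False)
    return Vt[:k]                                # top-k memory bases

rng = np.random.default_rng(0)
memory = rng.standard_normal((3000, 256))        # toy per-frame features
basis = compress_memory(memory, k=32)            # 3000x256 -> 32x256 summary
coords = memory @ basis.T                        # frames in the compressed basis
```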
arXiv Detail & Related papers (2024-06-11T12:03:57Z)
- MemoNav: Selecting Informative Memories for Visual Navigation [43.185016165039116]
We present MemoNav, a novel memory mechanism for image-goal navigation.
MemoNav retains the agent's informative short-term and long-term memories to improve navigation performance.
We evaluate our model on a new multi-goal navigation dataset.
arXiv Detail & Related papers (2022-08-20T05:57:21Z)
- XMem: Long-Term Video Object Segmentation with an Atkinson-Shiffrin Memory Model [137.50614198301733]
We present XMem, a video object segmentation architecture for long videos with unified feature memory stores.
We develop an architecture that incorporates multiple independent yet deeply-connected feature memory stores.
XMem greatly exceeds state-of-the-art performance on long-video datasets.
arXiv Detail & Related papers (2022-07-14T17:59:37Z) - LaMemo: Language Modeling with Look-Ahead Memory [50.6248714811912]
We propose Look-Ahead Memory (LaMemo) that enhances the recurrence memory by incrementally attending to the right-side tokens.
LaMemo embraces bi-directional attention and segment recurrence with an additional overhead only linearly proportional to the memory length.
Experiments on widely used language modeling benchmarks demonstrate its superiority over the baselines equipped with different types of memory.
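A hedged sketch of the look-ahead idea: memory slots attend to the tokens on their right in the current segment and are refreshed from what they retrieve. Plain scaled dot-product attention and a fixed interpolation weight stand in for LaMemo's learned components.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def look_ahead_refresh(memory, segment):
    """memory: (M, d) old memory states; segment: (S, d) states of the
    current (right-side) tokens. Each memory slot attends to the new
    tokens and is nudged toward what it retrieves."""
    d = memory.shape[1]
    attn = softmax(memory @ segment.T / np.sqrt(d))  # (M, S)
    retrieved = attn @ segment                       # (M, d)
    return 0.5 * memory + 0.5 * retrieved            # naive interpolation

rng = np.random.default_rng(0)
mem = rng.standard_normal((16, 64))
seg = rng.standard_normal((128, 64))
mem = look_ahead_refresh(mem, seg)   # memory now reflects right-side context
```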
arXiv Detail & Related papers (2022-04-15T06:11:25Z)
- RotLSTM: Rotating Memories in Recurrent Neural Networks [0.0]
We introduce the concept of modifying the cell state (memory) of LSTMs using rotation matrices parametrised by a new set of trainable weights.
This addition yields significant performance gains on some tasks from the bAbI dataset.
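A minimal sketch of the rotation idea, assuming 2-D Givens rotations applied to consecutive pairs of cell-state coordinates with angles produced by trainable weights; the paper's exact parametrisation may differ.

```python
import numpy as np

def rotate_cell_state(c, angles):
    """Rotate an LSTM cell state c (even-dimensional) by applying a
    2x2 rotation to each consecutive coordinate pair. `angles` (one
    per pair) would come from trainable weights, e.g. angles = W_rot @ x_t."""
    pairs = c.reshape(-1, 2)                          # (d/2, 2)
    cos, sin = np.cos(angles), np.sin(angles)
    rotated = np.stack([cos * pairs[:, 0] - sin * pairs[:, 1],
                        sin * pairs[:, 0] + cos * pairs[:, 1]], axis=1)
    return rotated.reshape(-1)

rng = np.random.default_rng(0)
c = rng.standard_normal(64)                           # cell state (memory)
angles = rng.uniform(-np.pi, np.pi, size=32)          # stand-in for learned angles
c_rot = rotate_cell_state(c, angles)                  # rotated memory, pairwise norms preserved
```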
arXiv Detail & Related papers (2021-05-01T23:48:58Z)
- Temporal Memory Relation Network for Workflow Recognition from Surgical Video [53.20825496640025]
We propose a novel end-to-end temporal memory relation network (TMNet) for relating long-range and multi-scale temporal patterns.
We have extensively validated our approach on two benchmark surgical video datasets.
arXiv Detail & Related papers (2021-03-30T13:20:26Z)
- Dynamic Memory based Attention Network for Sequential Recommendation [79.5901228623551]
We propose a novel long sequential recommendation model called Dynamic Memory-based Attention Network (DMAN).
It segments the overall long behavior sequence into a series of sub-sequences, trains the model on them, and maintains a set of memory blocks to preserve users' long-term interests.
Based on the dynamic memory, a user's short-term and long-term interests can be explicitly extracted and combined for efficient joint recommendation.
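A toy sketch of the segment-and-memorise idea; mean-pooled memory blocks and a dot-product read-out are simplifying assumptions, not DMAN's learned components.

```python
import numpy as np

def build_memory_blocks(sequence, segment_len):
    """Split a long behaviour sequence (T, d) into sub-sequences and
    summarise each as one memory block by mean pooling."""
    T = (len(sequence) // segment_len) * segment_len
    segments = sequence[:T].reshape(-1, segment_len, sequence.shape[1])
    return segments.mean(axis=1)                    # (num_blocks, d)

rng = np.random.default_rng(0)
seq = rng.standard_normal((1000, 32))               # long user behaviour sequence
blocks = build_memory_blocks(seq, segment_len=50)   # long-term memory blocks

short_term = seq[-50:].mean(axis=0)                 # recent (short-term) interest
scores = blocks @ short_term                        # attend over memory blocks
w = np.exp(scores - scores.max()); w /= w.sum()
long_term = w @ blocks                              # long-term interest read-out
user_vec = np.concatenate([short_term, long_term])  # combined for recommendation
```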
arXiv Detail & Related papers (2021-02-18T11:08:54Z)
- Memformer: A Memory-Augmented Transformer for Sequence Modeling [55.780849185884996]
We present Memformer, an efficient neural network for sequence modeling.
Our model achieves linear time complexity and constant memory space complexity when processing long sequences.
arXiv Detail & Related papers (2020-10-14T09:03:36Z)
- Sequential Recommender via Time-aware Attentive Memory Network [67.26862011527986]
We propose a temporal gating methodology to improve attention mechanisms and recurrent units.
We also propose a Multi-hop Time-aware Attentive Memory network to integrate long-term and short-term preferences.
Our approach is scalable for candidate retrieval tasks and can be viewed as a non-linear generalization of latent factorization for dot-product based Top-K recommendation.
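One plausible form of temporal gating, sketched under the assumption that the gate down-weights past interactions as a function of the time gap; the paper's exact formulation may differ.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def temporal_gate(item_vecs, delta_t, w, b):
    """Down-weight past interactions by a learned function of recency.
    item_vecs: (T, d); delta_t: (T,) time gaps to now; w, b: scalars
    standing in for trainable gate parameters."""
    g = sigmoid(-w * delta_t + b)        # older interaction -> smaller gate
    return item_vecs * g[:, None]        # gated item representations

rng = np.random.default_rng(0)
items = rng.standard_normal((20, 16))    # a user's interaction history
dt = rng.uniform(0, 30, size=20)         # days since each interaction
gated = temporal_gate(items, dt, w=0.2, b=1.0)
preference = gated.mean(axis=0)          # simple pooled preference vector
```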
arXiv Detail & Related papers (2020-05-18T11:29:38Z)
This list is automatically generated from the titles and abstracts of the papers on this site.