Memory-Modular Classification: Learning to Generalize with Memory Replacement
- URL: http://arxiv.org/abs/2504.06021v1
- Date: Tue, 08 Apr 2025 13:26:24 GMT
- Title: Memory-Modular Classification: Learning to Generalize with Memory Replacement
- Authors: Dahyun Kang, Ahmet Iscen, Eunchan Jo, Sua Choi, Minsu Cho, Cordelia Schmid
- Abstract summary: We propose a memory-modular learner for image classification that separates knowledge memorization from reasoning. Our model enables effective generalization to new classes by simply replacing the memory contents. Experimental results demonstrate the promising performance and versatility of our approach.
- Score: 79.772454831493
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We propose a novel memory-modular learner for image classification that separates knowledge memorization from reasoning. Our model enables effective generalization to new classes by simply replacing the memory contents, without the need for model retraining. Unlike traditional models that encode both world knowledge and task-specific skills into their weights during training, our model stores knowledge in an external memory of web-crawled image and text data. At inference time, the model dynamically selects relevant content from the memory based on the input image, allowing it to adapt to arbitrary classes by simply replacing the memory contents. The key differentiator is that our learner meta-learns to perform classification tasks with noisy web data from unseen classes, resulting in robust performance across various classification scenarios. Experimental results demonstrate the promising performance and versatility of our approach in handling diverse classification tasks, including zero-shot/few-shot classification of unseen classes, fine-grained classification, and class-incremental classification.
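The core idea of the abstract, classifying by retrieving from a replaceable external memory rather than from knowledge baked into the weights, can be sketched as a nearest-neighbor vote over memory entries. This is a minimal illustration under assumptions of our own (cosine-similarity scoring, a top-k vote, and the function names), not the paper's actual meta-learned architecture:

```python
import numpy as np

def classify_from_memory(query_emb, memory_embs, memory_labels, k=5):
    """Pick the label whose memory entries best match the query.

    query_emb: (d,) L2-normalized embedding of the input image.
    memory_embs: (n, d) L2-normalized embeddings of memory entries,
                 e.g. web-crawled images/text for the current class set.
    memory_labels: (n,) class index of each memory entry.
    """
    sims = memory_embs @ query_emb        # cosine similarity to every entry
    top = np.argsort(-sims)[:k]           # k most relevant memory entries
    scores = {}
    for i in top:                         # similarity-weighted vote per class
        scores[memory_labels[i]] = scores.get(memory_labels[i], 0.0) + sims[i]
    return max(scores, key=scores.get)
```

Under this view, adapting to new classes needs no retraining: one simply swaps `memory_embs` and `memory_labels` for entries describing the new class set.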
Related papers
- Improving Image Recognition by Retrieving from Web-Scale Image-Text Data [68.63453336523318]
We introduce an attention-based memory module, which learns the importance of each retrieved example from the memory.
Compared to existing approaches, our method removes the influence of the irrelevant retrieved examples, and retains those that are beneficial to the input query.
We show that it achieves state-of-the-art accuracy on the ImageNet-LT, Places-LT, and WebVision datasets.
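The attention-based memory module described above, which down-weights irrelevant retrieved examples, can be sketched as softmax attention over the retrieved set. The names, the dot-product relevance score, and the temperature are illustrative assumptions; the paper's module learns this weighting:

```python
import numpy as np

def attend_over_retrieved(query, retrieved, values, temperature=0.1):
    """Aggregate retrieved examples, weighting each by its relevance to the
    query so that irrelevant retrievals contribute almost nothing."""
    logits = retrieved @ query / temperature   # relevance of each example
    weights = np.exp(logits - logits.max())
    weights /= weights.sum()                   # softmax attention weights
    return weights @ values                    # weighted aggregate of values
```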
arXiv Detail & Related papers (2023-04-11T12:12:05Z)
- A Model or 603 Exemplars: Towards Memory-Efficient Class-Incremental Learning [56.450090618578]
Class-Incremental Learning (CIL) aims to learn new classes continually while keeping the total memory budget limited.
We show that when the model size is counted into the total budget and methods are compared at an aligned memory size, saving models does not consistently work.
We propose a simple yet effective baseline, denoted as MEMO for Memory-efficient Expandable MOdel.
arXiv Detail & Related papers (2022-05-26T08:24:01Z)
- Hierarchical Variational Memory for Few-shot Learning Across Domains [120.87679627651153]
We introduce a hierarchical prototype model, where each level of the prototype fetches corresponding information from the hierarchical memory.
The model is endowed with the ability to flexibly rely on features at different semantic levels if the domain shift circumstances so demand.
We conduct thorough ablation studies to demonstrate the effectiveness of each component in our model.
arXiv Detail & Related papers (2021-12-15T15:01:29Z)
- Memory Wrap: a Data-Efficient and Interpretable Extension to Image Classification Models [9.848884631714451]
Memory Wrap is a plug-and-play extension to any image classification model.
It improves both data efficiency and model interpretability by adopting a content-attention mechanism.
We show that Memory Wrap outperforms standard classifiers when it learns from a limited set of data.
arXiv Detail & Related papers (2021-06-01T07:24:19Z)
- Fine-grained Classification via Categorical Memory Networks [42.413523046712896]
We present a class-specific memory module for fine-grained feature learning.
The memory module stores the prototypical feature representation for each category as a moving average.
We integrate our class-specific memory module into a standard convolutional neural network, yielding a Categorical Memory Network.
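A per-class prototype kept as a moving average, as described above, can be sketched as follows; the class name, momentum value, and exact update rule are illustrative assumptions rather than the paper's formulation:

```python
import numpy as np

class CategoricalMemory:
    """Stores one prototype per class as an exponential moving average
    of the features observed for that class."""

    def __init__(self, num_classes, dim, momentum=0.9):
        self.protos = np.zeros((num_classes, dim))
        self.momentum = momentum

    def update(self, feature, label):
        # Blend the new feature into the running prototype for its class.
        m = self.momentum
        self.protos[label] = m * self.protos[label] + (1 - m) * feature

    def read(self, label):
        return self.protos[label]
```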
arXiv Detail & Related papers (2020-12-12T11:50:13Z)
- Learning to Learn Variational Semantic Memory [132.39737669936125]
We introduce variational semantic memory into meta-learning to acquire long-term knowledge for few-shot learning.
The semantic memory is grown from scratch and gradually consolidated by absorbing information from tasks it experiences.
We formulate memory recall as the variational inference of a latent memory variable from addressed contents.
arXiv Detail & Related papers (2020-10-20T15:05:26Z)
- Memory-Efficient Incremental Learning Through Feature Adaptation [71.1449769528535]
We introduce an approach for incremental learning that preserves feature descriptors of training images from previously learned classes.
Keeping the much lower-dimensional feature embeddings of images reduces the memory footprint significantly.
Experimental results show that our method achieves state-of-the-art classification accuracy in incremental learning benchmarks.
arXiv Detail & Related papers (2020-04-01T21:16:05Z)
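The memory-saving claim of the last entry, keeping low-dimensional feature embeddings instead of raw exemplar images, can be illustrated with back-of-the-envelope arithmetic. Both sizes below are assumptions chosen for illustration (a 224x224 RGB uint8 image versus a 512-dimensional float32 embedding), not figures from the paper:

```python
# Memory footprint per stored exemplar: raw image vs. feature embedding.
image_bytes = 224 * 224 * 3        # uint8 RGB image: 150,528 bytes (~147 KB)
feature_bytes = 512 * 4            # 512-d float32 embedding: 2,048 bytes (2 KB)
savings = image_bytes / feature_bytes
print(f"one stored embedding is {savings:.1f}x smaller than one stored image")
```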
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.