Class Distribution Shifts in Zero-Shot Learning: Learning Robust Representations
- URL: http://arxiv.org/abs/2311.18575v3
- Date: Mon, 27 May 2024 21:19:20 GMT
- Title: Class Distribution Shifts in Zero-Shot Learning: Learning Robust Representations
- Authors: Yuli Slavutsky, Yuval Benjamini
- Abstract summary: We propose a model that treats the attribute responsible for the shift as unknown in advance.
We show that our approach improves generalization across diverse class distributions in both simulations and real-world datasets.
- Score: 3.8980564330208662
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Zero-shot learning methods typically assume that the new, unseen classes encountered at deployment come from the same distribution as the training classes. However, real-world scenarios often involve class distribution shifts (e.g., in age or gender for person identification), posing challenges for zero-shot classifiers that rely on representations learned from training classes. In this work, we propose a model in which the attribute responsible for the shift is unknown in advance, and show that standard training may lead to non-robust representations. To mitigate this, we propose an algorithm for learning robust representations by (a) constructing synthetic data environments via hierarchical sampling and (b) applying environment balancing penalization, inspired by out-of-distribution problems. We show that our approach improves generalization across diverse class distributions in both simulations and real-world datasets.
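The abstract only outlines the two-step recipe, so the following is a minimal PyTorch sketch under assumptions of our own: "hierarchical sampling" is read as splitting the *classes* at random into groups whose samples form synthetic environments, and the environment balancing penalization is stood in for by a V-REx-style variance penalty over per-environment risks. The names `make_environments` and `balanced_loss`, and all hyperparameters, are illustrative, not the authors' API.

```python
import torch
import torch.nn as nn


def make_environments(features, labels, n_envs=4, seed=0):
    """Hierarchical sampling (assumed reading): first split the classes at
    random into n_envs disjoint groups, then let each synthetic environment
    inherit all samples of its classes."""
    g = torch.Generator().manual_seed(seed)
    classes = labels.unique()                                  # sorted class ids
    perm = classes[torch.randperm(len(classes), generator=g)]  # shuffled classes
    envs = []
    for chunk in perm.chunk(n_envs):                           # disjoint class subsets
        mask = torch.isin(labels, chunk)
        envs.append((features[mask], labels[mask]))
    return envs


def balanced_loss(model, envs, beta=10.0):
    """Environment balancing via a V-REx-style penalty: mean risk plus
    beta times the variance of the per-environment risks, which pushes the
    representation to perform uniformly well across environments."""
    criterion = nn.CrossEntropyLoss()
    risks = torch.stack([criterion(model(x), y) for x, y in envs])
    return risks.mean() + beta * risks.var()


# Toy usage: a linear probe on random features stands in for the
# representation network.
X = torch.randn(512, 16)
y = torch.randint(0, 8, (512,))
model = nn.Linear(16, 8)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
envs = make_environments(X, y)
for _ in range(100):
    opt.zero_grad()
    balanced_loss(model, envs).backward()
    opt.step()
```

Because each environment contains whole classes rather than random samples, the variance penalty discourages the representation from exploiting attributes that separate one class group from another, which is the kind of robustness to class distribution shift the abstract describes.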