'Less Than One'-Shot Learning: Learning N Classes From M<N Samples
  • URL: http://arxiv.org/abs/2009.08449v1
  • Date: Thu, 17 Sep 2020 17:55:29 GMT
  • Title: 'Less Than One'-Shot Learning: Learning N Classes From M<N Samples
  • Authors: Ilia Sucholutsky, Matthias Schonlau
  • Abstract summary: In the few-shot learning setting, a model must learn a new class given only a small number of samples from that class. We propose the `less than one'-shot learning task where models must learn $N$ new classes given only $M<N$ examples.
  • Score: 13.70633147306388
  • License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
  • Abstract: Deep neural networks require large training sets but suffer from high computational cost and long training times. Training on much smaller training sets while maintaining nearly the same accuracy would be very beneficial. In the few-shot learning setting, a model must learn a new class given only a small number of samples from that class. One-shot learning is an extreme form of few-shot learning where the model must learn a new class from a single example. We propose the `less than one'-shot learning task where models must learn $N$ new classes given only $M<N$ examples, and we show that this is achievable with the help of soft labels. We use a soft-label generalization of the k-Nearest Neighbors classifier to explore the intricate decision landscapes that can be created in the `less than one'-shot learning setting. We analyze these decision landscapes to derive theoretical lower bounds for separating $N$ classes using $M<N$ soft-label samples and investigate the robustness of the resulting systems.
