Abstract: Few-shot learning (FSL) aims to learn a classifier that can be easily adapted
to new tasks unseen during training, given only a few examples.
To handle the limited-data problem in few-shot regimes, recent methods tend to
collectively use a set of local features to densely represent an image instead
of using a mixed global feature. They generally explore a unidirectional
query-to-support paradigm in FSL, e.g., finding the nearest/optimal support
feature for each query feature and aggregating these local matches for joint
classification. In this paper, we propose a new method, Mutual Centralized
Learning (MCL), to fully affiliate the two disjoint sets of dense features in a
bidirectional paradigm. We associate each local feature with a particle that
performs a bidirectional random walk in a discrete feature space defined by
these affiliations. To estimate the class probability, we propose the features'
accessibility, which measures the expected number of visits to the support
features of that class in a Markov process. We relate our method to learning a
centrality on an affiliation network and demonstrate its capability to be
plugged into existing methods by highlighting centralized local features.
Experiments show that our method achieves state-of-the-art performance on both
miniImageNet and tieredImageNet.
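The accessibility idea described above can be illustrated with a minimal sketch (not the authors' implementation): treat cosine similarities between query and support local features as bipartite affiliations, turn them into transition probabilities, and accumulate the expected number of visits each class's support features receive as particles walk back and forth. All names here (`l2norm`, `class_accessibilities`, `n_steps`, `temperature`) are illustrative assumptions.

```python
# A minimal sketch (not the authors' implementation) of computing class
# "accessibility" as the expected number of visits to each class's support
# features during a bidirectional query<->support random walk.
import numpy as np

def l2norm(x):
    # Row-wise L2 normalization so dot products become cosine similarities.
    return x / np.linalg.norm(x, axis=1, keepdims=True)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def class_accessibilities(query_feats, support_feats_per_class,
                          n_steps=20, temperature=0.1):
    """query_feats: (nq, d) dense local features of one query image.
    support_feats_per_class: list of (ns_c, d) arrays, one per class.
    Returns one accessibility score per class (expected support visits)."""
    q = l2norm(query_feats)
    s = np.vstack([l2norm(f) for f in support_feats_per_class])
    sim = q @ s.T                                # bipartite affiliations
    p_qs = softmax(sim / temperature, axis=1)    # query -> support transitions
    p_sq = softmax(sim.T / temperature, axis=1)  # support -> query transitions
    dist_q = np.full(len(q), 1.0 / len(q))       # particles start on queries
    visits = np.zeros(len(s))
    for _ in range(n_steps):                     # alternate sides of the walk
        dist_s = dist_q @ p_qs                   # step to the support side
        visits += dist_s                         # accumulate expected visits
        dist_q = dist_s @ p_sq                   # step back to the query side
    # Sum the visit counts over each class's block of support features.
    bounds = np.cumsum([0] + [len(f) for f in support_feats_per_class])
    return np.array([visits[a:b].sum()
                     for a, b in zip(bounds[:-1], bounds[1:])])
```

In this toy setup, a query whose local features lie near one class's support features accumulates more expected visits on that class, so the class with the highest accessibility is predicted.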