DMN4: Few-shot Learning via Discriminative Mutual Nearest Neighbor
Neural Network
- URL: http://arxiv.org/abs/2103.08160v1
- Date: Mon, 15 Mar 2021 06:57:09 GMT
- Title: DMN4: Few-shot Learning via Discriminative Mutual Nearest Neighbor
Neural Network
- Authors: Yang Liu, Tu Zheng, Jie Song, Deng Cai, Xiaofei He
- Abstract summary: Few-shot learning aims to classify images under low-data regimes.
Recent work has achieved promising performance by using deep descriptors.
We propose a Mutual Nearest Neighbor (MNN) relation to explicitly select the query descriptors that are most relevant to each task.
- Score: 40.18613107849442
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Few-shot learning (FSL) aims to classify images under low-data regimes, where
the conventional pooled global representation is likely to lose useful local
characteristics. Recent work has achieved promising performance by using deep
descriptors. They generally take all deep descriptors from neural networks into
consideration while ignoring that some of them are useless in classification
due to their limited receptive field, e.g., task-irrelevant descriptors could
be misleading and multiple aggregative descriptors from background clutter
could even overwhelm the object's presence. In this paper, we argue that a
Mutual Nearest Neighbor (MNN) relation should be established to explicitly
select the query descriptors that are most relevant to each task and discard
less relevant ones from aggregative clutters in FSL. Specifically, we propose
Discriminative Mutual Nearest Neighbor Neural Network (DMN4) for FSL. Extensive
experiments demonstrate that our method not only qualitatively selects
task-relevant descriptors but also quantitatively outperforms the existing
state of the art by a large margin of 1.8~4.9% on fine-grained CUB, a
considerable margin of 1.4~2.2% on both supervised and semi-supervised
miniImageNet, and ~1.4% on the challenging tieredImageNet.
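The core idea in the abstract is the Mutual Nearest Neighbor (MNN) relation: a query descriptor is kept only if its nearest support descriptor also has that query descriptor as its own nearest neighbor. The sketch below illustrates this selection rule; it is a hedged reading of the abstract, not the authors' released implementation, and the cosine similarity, the function name `mnn_select`, and the tensor shapes are assumptions.
```python
# Minimal sketch of Mutual Nearest Neighbor (MNN) descriptor selection.
# Illustrative only; names, shapes, and the similarity measure are assumptions.
import torch
import torch.nn.functional as F

def mnn_select(query_desc: torch.Tensor, support_desc: torch.Tensor) -> torch.Tensor:
    """Keep only query descriptors that form a mutual nearest neighbor
    pair with some support descriptor.

    query_desc:   (Nq, D) deep descriptors of one query image
    support_desc: (Ns, D) deep descriptors pooled from the support set
    returns:      boolean mask of shape (Nq,) over the query descriptors
    """
    q = F.normalize(query_desc, dim=-1)
    s = F.normalize(support_desc, dim=-1)
    sim = q @ s.t()                       # (Nq, Ns) cosine similarities

    nn_of_query = sim.argmax(dim=1)       # nearest support index for each query descriptor
    nn_of_support = sim.argmax(dim=0)     # nearest query index for each support descriptor

    # q_i is kept iff its nearest support descriptor points back to q_i.
    idx = torch.arange(q.size(0), device=q.device)
    keep = nn_of_support[nn_of_query] == idx
    return keep
```
The retained descriptors could then be scored against per-class support descriptors with a nearest-neighbor measure, in the spirit of descriptor-based few-shot methods; the "Discriminative" component of DMN4 is not detailed in the abstract and is not modeled here.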
Related papers
- Nearest Neighbor-Based Contrastive Learning for Hyperspectral and LiDAR
Data Classification [45.026868970899514]
We propose a Nearest Neighbor-based Contrastive Learning Network (NNCNet) to learn discriminative feature representations.
Specifically, we propose a nearest neighbor-based data augmentation scheme to use enhanced semantic relationships among nearby regions.
In addition, we design a bilinear attention module to exploit the second-order and even high-order feature interactions between the HSI and LiDAR data.
arXiv Detail & Related papers (2023-01-09T13:43:54Z)
- Neural Implicit Dictionary via Mixture-of-Expert Training [111.08941206369508]
We present a generic INR framework that achieves both data and training efficiency by learning a Neural Implicit Dictionary (NID).
Our NID assembles a group of coordinate-based implicit networks that are tuned to span the desired function space.
Our experiments show that NID can reconstruct 2D images or 3D scenes up to 2 orders of magnitude faster with up to 98% less input data.
arXiv Detail & Related papers (2022-07-08T05:07:19Z)
- Denoised Non-Local Neural Network for Semantic Segmentation [18.84185406522064]
We propose a Denoised Non-Local Network (Denoised NL) to eliminate both inter-class and intra-class noise.
The proposed Denoised NL achieves state-of-the-art performance of 83.5% and 46.69% mIoU on Cityscapes and ADE20K, respectively.
arXiv Detail & Related papers (2021-10-27T06:16:31Z)
- Dual-Neighborhood Deep Fusion Network for Point Cloud Analysis [7.696435157444049]
A Dual-Neighborhood Deep Fusion Network (DNDFN) is proposed for point cloud analysis.
DNDFN has two key points. One is the combination of the local neighborhood and the global neighborhood.
TN-Learning is combined with them to obtain richer neighborhood information.
The other is information transfer convolution (IT-Conv) which can learn the structural information between two points and transfer features through it.
arXiv Detail & Related papers (2021-08-20T15:37:13Z)
- External-Memory Networks for Low-Shot Learning of Targets in
Forward-Looking-Sonar Imagery [8.767175335575386]
We propose a memory-based framework for real-time, data-efficient target analysis in forward-looking-sonar (FLS) imagery.
Our framework relies on first removing non-discriminative details from the imagery using a small-scale DenseNet-inspired network.
We then cascade the filtered imagery into a novel NeuralRAM-based convolutional matching network, NRMN, for low-shot target recognition.
arXiv Detail & Related papers (2021-07-22T07:50:44Z)
- Discriminative Nearest Neighbor Few-Shot Intent Detection by
Transferring Natural Language Inference [150.07326223077405]
Few-shot learning is attracting much attention as a way to mitigate data scarcity.
We present a discriminative nearest neighbor classification method with deep self-attention.
We propose to boost the discriminative ability by transferring a natural language inference (NLI) model.
arXiv Detail & Related papers (2020-10-25T00:39:32Z)
- ReMarNet: Conjoint Relation and Margin Learning for Small-Sample Image
Classification [49.87503122462432]
We introduce a novel neural network termed Relation-and-Margin learning Network (ReMarNet).
Our method assembles two networks with different backbones to learn features that perform well under both the relation-based and margin-based classification mechanisms.
Experiments on four image datasets demonstrate that our approach is effective in learning discriminative features from a small set of labeled samples.
arXiv Detail & Related papers (2020-06-27T13:50:20Z)
- OSLNet: Deep Small-Sample Classification with an Orthogonal Softmax
Layer [77.90012156266324]
This paper aims to find a subspace of neural networks that can facilitate a large decision margin.
We propose the Orthogonal Softmax Layer (OSL), which makes the weight vectors in the classification layer remain orthogonal during both the training and test processes (see the sketch after this list).
Experimental results demonstrate that the proposed OSL has better performance than the methods used for comparison on four small-sample benchmark datasets.
arXiv Detail & Related papers (2020-04-20T02:41:01Z)
- Weakly-Supervised Semantic Segmentation by Iterative Affinity Learning [86.45526827323954]
Weakly-supervised semantic segmentation is a challenging task as no pixel-wise label information is provided for training.
We propose an iterative algorithm to learn such pairwise relations.
We show that the proposed algorithm performs favorably against the state-of-the-art methods.
arXiv Detail & Related papers (2020-02-19T10:32:03Z)
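For the OSLNet entry above, the stated property is that the classification-layer weight vectors remain orthogonal throughout training and test. One simple way to guarantee this is to restrict each class's weights to a disjoint slice of the feature vector, since vectors with disjoint supports are orthogonal; the sketch below illustrates that idea and is not necessarily the paper's exact construction.
```python
# Hedged sketch of a classification layer whose class weight vectors stay
# orthogonal by construction (disjoint feature slices per class).
# Illustrative only; not necessarily the OSLNet paper's exact layer.
import torch
import torch.nn as nn

class OrthogonalSoftmaxLayer(nn.Module):
    def __init__(self, feat_dim: int, num_classes: int):
        super().__init__()
        assert feat_dim % num_classes == 0, "illustrative constraint for equal blocks"
        self.weight = nn.Parameter(torch.randn(num_classes, feat_dim) * 0.01)
        # Fixed binary mask assigning each class a disjoint block of features,
        # so masked weight vectors are mutually orthogonal at every step.
        block = feat_dim // num_classes
        mask = torch.zeros(num_classes, feat_dim)
        for c in range(num_classes):
            mask[c, c * block:(c + 1) * block] = 1.0
        self.register_buffer("mask", mask)

    def forward(self, x: torch.Tensor) -> torch.Tensor:   # x: (B, feat_dim)
        w = self.weight * self.mask                        # orthogonality enforced by the mask
        return x @ w.t()                                   # logits for softmax / cross-entropy
```
Because the mask is fixed, orthogonality holds at every training step and at test time without any extra regularization term.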
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this information and is not responsible for any consequences arising from its use.