PLATINUM: Semi-Supervised Model Agnostic Meta-Learning using Submodular
Mutual Information
- URL: http://arxiv.org/abs/2201.12928v1
- Date: Sun, 30 Jan 2022 22:07:17 GMT
- Title: PLATINUM: Semi-Supervised Model Agnostic Meta-Learning using Submodular
Mutual Information
- Authors: Changbin Li, Suraj Kothawade, Feng Chen, Rishabh Iyer
- Abstract summary: Few-shot classification (FSC) requires training models using a few (typically one to five) data points per class.
We propose PLATINUM, a novel semi-supervised model agnostic meta-learning framework that uses the submodular mutual information (SMI) functions to boost the performance of FSC.
- Score: 3.1845305066053347
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Few-shot classification (FSC) requires training models using a few (typically
one to five) data points per class. Meta learning has proven to be able to
learn a parametrized model for FSC by training on various other classification
tasks. In this work, we propose PLATINUM (semi-suPervised modeL Agnostic
meTa-learnIng usiNg sUbmodular Mutual information), a novel semi-supervised
model agnostic meta-learning framework that uses the submodular mutual
information (SMI) functions to boost the performance of FSC. PLATINUM leverages
unlabeled data in the inner and outer loop using SMI functions during
meta-training and obtains richer meta-learned parameterizations for meta-test.
We study the performance of PLATINUM in two scenarios - 1) where the unlabeled
data points belong to the same set of classes as the labeled set of a certain
episode, and 2) where there exist out-of-distribution classes that do not
belong to the labeled set. We evaluate our method on various settings on the
miniImageNet, tieredImageNet and Fewshot-CIFAR100 datasets. Our experiments
show that PLATINUM outperforms MAML and semi-supervised approaches like
pseudo-labeling for semi-supervised FSC, especially for small ratios of labeled
examples per class.
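The abstract describes selecting unlabeled points with submodular mutual information (SMI) functions during the MAML inner and outer loops. As a minimal sketch, the snippet below greedily maximizes one common SMI instantiation, the facility-location mutual information between a candidate unlabeled subset and the labeled support set; the specific SMI function, similarity kernel, and selection budget used by PLATINUM are assumptions here, not taken from the abstract.

```python
import numpy as np

def select_unlabeled_flmi(sim, k):
    """Greedily pick k unlabeled points maximizing the facility-location
    mutual information I(A; S) = sum over labeled s of max_{a in A} sim[a, s],
    i.e. the unlabeled subset that best "covers" the labeled support set.

    sim: (num_unlabeled, num_labeled) similarity matrix (e.g. cosine
         similarities between embeddings). Returns indices of selected points.
    """
    n_unlabeled, n_labeled = sim.shape
    selected = []
    # coverage[j] = best similarity achieved so far for labeled point j
    coverage = np.zeros(n_labeled)
    for _ in range(k):
        # Marginal gain of each candidate: how much it improves coverage.
        gains = np.maximum(sim, coverage).sum(axis=1) - coverage.sum()
        gains[selected] = -np.inf  # never re-select a point
        best = int(np.argmax(gains))
        selected.append(best)
        coverage = np.maximum(coverage, sim[best])
    return selected
```

In a semi-supervised MAML setup, such a selection step would run per episode: the chosen unlabeled points are added to the inner-loop adaptation set (and analogously in the outer loop) before computing gradients.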
Related papers
- Meta-UAD: A Meta-Learning Scheme for User-level Network Traffic Anomaly Detection [15.038762892493219]
We propose Meta-UAD, a Meta-learning scheme for User-level network traffic Anomaly Detection.
We use the CICFlowMeter to extract 81 flow-level statistical features and remove some invalid ones.
Compared with existing models, the results further demonstrate the superiority of Meta-UAD with 15% - 43% gains in F1-score.
arXiv Detail & Related papers (2024-08-30T06:05:15Z) - Architecture, Dataset and Model-Scale Agnostic Data-free Meta-Learning [119.70303730341938]
We propose ePisode cUrriculum inveRsion (ECI) during data-free meta training and invErsion calibRation following inner loop (ICFIL) during meta testing.
ECI adaptively increases the difficulty level of pseudo episodes according to the real-time feedback of the meta model.
We formulate the optimization process of meta training with ECI as an adversarial form in an end-to-end manner.
arXiv Detail & Related papers (2023-03-20T15:10:41Z) - Pseudo-Labeling Based Practical Semi-Supervised Meta-Training for Few-Shot Learning [93.63638405586354]
We propose a simple and effective meta-training framework, called pseudo-labeling based meta-learning (PLML).
Firstly, we train a classifier via common semi-supervised learning (SSL) and use it to obtain the pseudo-labels of unlabeled data.
We build few-shot tasks from labeled and pseudo-labeled data and design a novel finetuning method with feature smoothing and noise suppression.
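The PLML entry above describes training an SSL classifier and using it to pseudo-label unlabeled data before building few-shot tasks. A minimal sketch of that filtering step is below; the confidence threshold and the keep-if-confident rule are illustrative assumptions, not details from the abstract.

```python
import numpy as np

def pseudo_label(probs, threshold=0.9):
    """Assign pseudo-labels to unlabeled examples from classifier outputs.

    probs: (n, num_classes) softmax probabilities produced by a classifier
           trained with semi-supervised learning.
    Keeps only examples whose top-class probability reaches `threshold`,
    returning (kept indices, pseudo-labels for those indices).
    """
    confidence = probs.max(axis=1)          # top-class probability per example
    keep = confidence >= threshold          # confident predictions only
    labels = probs.argmax(axis=1)           # predicted class = pseudo-label
    return np.nonzero(keep)[0], labels[keep]
```

Few-shot episodes would then be sampled from the union of the labeled set and these confidently pseudo-labeled examples.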
arXiv Detail & Related papers (2022-07-14T10:53:53Z) - Self-Adaptive Label Augmentation for Semi-supervised Few-shot
Classification [121.63992191386502]
Few-shot classification aims to learn a model that can generalize well to new tasks when only a few labeled samples are available.
We propose SALA (Self-Adaptive Label Augmentation), a semi-supervised few-shot classification method that assigns an appropriate label to each unlabeled sample by a manually defined metric.
A major novelty of SALA is the task-adaptive metric, which can learn the metric adaptively for different tasks in an end-to-end fashion.
arXiv Detail & Related papers (2022-06-16T13:14:03Z) - CIM: Class-Irrelevant Mapping for Few-Shot Classification [58.02773394658623]
Few-shot classification (FSC) has been one of the most actively studied problems in recent years.
How to appraise the pre-trained feature embedding model (FEM) is the most crucial question in the FSC community.
We propose a simple, flexible method, dubbed Class-Irrelevant Mapping (CIM).
arXiv Detail & Related papers (2021-09-07T03:26:24Z) - Explaining the Performance of Multi-label Classification Methods with
Data Set Properties [1.1278903078792917]
We present a comprehensive meta-learning study of data sets and methods for multi-label classification (MLC).
Here, we analyze 40 MLC data sets by using 50 meta features describing different properties of the data.
The most prominent meta features that describe the space of MLC data sets are the ones assessing different aspects of the label space.
arXiv Detail & Related papers (2021-06-28T11:00:05Z) - MM-FSOD: Meta and metric integrated few-shot object detection [14.631208179789583]
We present an effective object detection framework (MM-FSOD) that integrates metric learning and meta-learning.
Our model is a class-agnostic detection model that can accurately recognize new categories which do not appear in the training samples.
arXiv Detail & Related papers (2020-12-30T14:02:52Z) - Explanation-Guided Training for Cross-Domain Few-Shot Classification [96.12873073444091]
Cross-domain few-shot classification task (CD-FSC) combines few-shot classification with the requirement to generalize across domains represented by datasets.
We introduce a novel training approach for existing FSC models.
We show that explanation-guided training effectively improves the model generalization.
arXiv Detail & Related papers (2020-07-17T07:28:08Z) - Incremental Meta-Learning via Indirect Discriminant Alignment [118.61152684795178]
We develop a notion of incremental learning during the meta-training phase of meta-learning.
Our approach performs favorably at test time as compared to training a model with the full meta-training set.
arXiv Detail & Related papers (2020-02-11T01:39:12Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.