Few-Shot Hyperspectral Image Classification With Unknown Classes Using
Multitask Deep Learning
- URL: http://arxiv.org/abs/2009.03508v1
- Date: Tue, 8 Sep 2020 03:53:10 GMT
- Authors: Shengjie Liu, Qian Shi, and Liangpei Zhang
- Abstract summary: Current hyperspectral image classification assumes that a predefined classification system is closed and complete.
We propose a deep learning method that simultaneously conducts classification and reconstruction in the open world.
Our method achieved more accurate hyperspectral image classification, especially under the few-shot context.
- Score: 24.02524697784525
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Current hyperspectral image classification assumes that a predefined
classification system is closed and complete, and there are no unknown or novel
classes in the unseen data. However, this assumption may be too strict for the
real world. Often, novel classes are overlooked when the classification system
is constructed. The closed nature forces a model to assign a known label to
every new sample, which may lead to overestimation of known land covers (e.g., crop area).
To tackle this issue, we propose a multitask deep learning method that
simultaneously conducts classification and reconstruction in the open world
(named MDL4OW) where unknown classes may exist. The reconstructed data are
compared with the original data; those failing to be reconstructed are
considered unknown, based on the assumption that they are not well represented
in the latent features due to the lack of labels. A threshold needs to be
defined to separate the unknown and known classes; we propose two strategies
based on extreme value theory for the few-shot and many-shot scenarios. The
proposed method was tested on real-world hyperspectral images; state-of-the-art
results were achieved, e.g., improving the overall accuracy by 4.94% for the
Salinas data. By considering the existence of unknown classes in the open
world, our method achieved more accurate hyperspectral image classification,
especially under the few-shot context.
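The core mechanism described in the abstract — flagging samples whose reconstruction error exceeds a threshold as unknown — can be sketched roughly as follows. This is an illustrative outline only, not the authors' MDL4OW implementation: the function names are hypothetical, and a simple percentile rule stands in for the paper's extreme-value-theory thresholding strategies.

```python
import numpy as np

def fit_threshold(train_errors, tail_frac=0.05):
    """Stand-in for the paper's EVT-based strategies: take a high
    percentile of reconstruction errors observed on labeled data."""
    return np.quantile(train_errors, 1.0 - tail_frac)

def open_set_predict(errors, class_scores, threshold):
    """Label a sample 'unknown' when its reconstruction error exceeds
    the threshold; otherwise keep the classifier's argmax label."""
    preds = np.argmax(class_scores, axis=1)
    preds[errors > threshold] = -1  # -1 marks the unknown class
    return preds
```

A well-reconstructed sample keeps its classifier label, while a sample the autoencoder branch fails to reconstruct (large error) is rejected as unknown, matching the assumption that unlabeled novel classes are poorly represented in the latent features.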
Related papers
- Open-world Semi-supervised Novel Class Discovery [12.910670907071523]
We introduce a new open-world semi-supervised novel class discovery approach named OpenNCD.
The proposed method is composed of two reciprocally enhanced parts. First, a bi-level contrastive learning method is introduced, which maintains the pair-wise similarity of the prototypes.
The results show the effectiveness of the proposed method in open-world scenarios, especially with scarce known classes and labels.
arXiv Detail & Related papers (2023-05-22T14:59:50Z)
- Fine-Grained ImageNet Classification in the Wild [0.0]
Robustness tests can uncover several vulnerabilities and biases which go unnoticed during the typical model evaluation stage.
In our work, we perform fine-grained classification on closely related categories, which are identified with the help of hierarchical knowledge.
arXiv Detail & Related papers (2023-03-04T12:25:07Z)
- Novel Class Discovery without Forgetting [72.52222295216062]
We identify and formulate a new, pragmatic problem setting of NCDwF: Novel Class Discovery without Forgetting.
We propose a machine learning model to incrementally discover novel categories of instances from unlabeled data.
We introduce experimental protocols based on CIFAR-10, CIFAR-100 and ImageNet-1000 to measure the trade-off between knowledge retention and novel class discovery.
arXiv Detail & Related papers (2022-07-21T17:54:36Z)
- Few-shot Open-set Recognition Using Background as Unknowns [58.04165813493666]
Few-shot open-set recognition aims to classify both seen and novel images given only limited training data of seen classes.
Our proposed method not only outperforms multiple baselines but also sets new results on three popular benchmarks.
arXiv Detail & Related papers (2022-07-19T04:19:29Z)
- Learning from Multiple Unlabeled Datasets with Partial Risk Regularization [80.54710259664698]
In this paper, we aim to learn an accurate classifier without any class labels.
We first derive an unbiased estimator of the classification risk that can be estimated from the given unlabeled sets.
We then find that the classifier obtained as such tends to cause overfitting as its empirical risks go negative during training.
Experiments demonstrate that our method effectively mitigates overfitting and outperforms state-of-the-art methods for learning from multiple unlabeled sets.
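The negative-risk pathology mentioned above can be illustrated with a toy correction: clipping each partial risk at zero before summing keeps the empirical risk non-negative. This is a generic sketch in the spirit of the paper's partial risk regularization, not its exact estimator; the function names are hypothetical.

```python
import numpy as np

def empirical_risk(partial_risks):
    # Unbiased risk estimators built from unlabeled sets can yield
    # negative partial risks, so the plain sum may dip below zero
    # during training, which is a symptom of overfitting.
    return float(np.sum(partial_risks))

def corrected_risk(partial_risks):
    # Clip each partial risk at zero before summing, so the total
    # empirical risk can no longer turn negative.
    return float(np.sum(np.maximum(partial_risks, 0.0)))
```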
arXiv Detail & Related papers (2022-07-04T16:22:44Z)
- Generalized Category Discovery [148.32255950504182]
We consider a highly general image recognition setting wherein, given a labelled and unlabelled set of images, the task is to categorize all images in the unlabelled set.
Here, the unlabelled images may come from labelled classes or from novel ones.
We first establish strong baselines by taking state-of-the-art algorithms from novel category discovery and adapting them for this task.
We then introduce a simple yet effective semi-supervised $k$-means method to cluster the unlabelled data into seen and unseen classes.
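A semi-supervised k-means of the kind this summary describes can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's exact algorithm: labeled points are pinned to their own class's cluster, seen-class centroids are initialized from labeled means, and extra centroids are left free to absorb potential novel classes.

```python
import numpy as np

def semi_supervised_kmeans(X, labels, n_total, n_iter=50, seed=0):
    """X: (n, d) features; labels: class index for labeled points,
    -1 for unlabeled; n_total: seen classes plus assumed novel ones."""
    rng = np.random.default_rng(seed)
    # Seen-class centroids start at labeled means; the rest at random points.
    centroids = np.array([X[labels == k].mean(axis=0) if (labels == k).any()
                          else X[rng.integers(len(X))]
                          for k in range(n_total)])
    for _ in range(n_iter):
        d = np.linalg.norm(X[:, None] - centroids[None], axis=2)
        assign = d.argmin(axis=1)
        assign[labels >= 0] = labels[labels >= 0]  # pin labeled points
        for k in range(n_total):
            if (assign == k).any():
                centroids[k] = X[assign == k].mean(axis=0)
    return assign, centroids
```

Pinning the labeled points is what makes the clustering semi-supervised: the seen-class clusters stay anchored to their labels while the free centroids partition the remaining unlabeled data.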
arXiv Detail & Related papers (2022-01-07T18:58:35Z)
- Open-World Semi-Supervised Learning [66.90703597468377]
We introduce a new open-world semi-supervised learning setting in which the model is required both to recognize previously seen classes and to discover novel classes in the unlabeled data.
We propose ORCA, an approach that learns to simultaneously classify and cluster the data.
We demonstrate that ORCA accurately discovers novel classes and assigns samples to previously seen classes on benchmark image classification datasets.
arXiv Detail & Related papers (2021-02-06T07:11:07Z)
- Entropy-Based Uncertainty Calibration for Generalized Zero-Shot Learning [49.04790688256481]
The goal of generalized zero-shot learning (GZSL) is to recognise both seen and unseen classes.
Most GZSL methods typically learn to synthesise visual representations from semantic information on the unseen classes.
We propose a novel framework that leverages dual variational autoencoders with a triplet loss to learn discriminative latent features.
arXiv Detail & Related papers (2021-01-09T05:21:27Z)
- No Subclass Left Behind: Fine-Grained Robustness in Coarse-Grained Classification Problems [20.253644336965042]
In real-world classification tasks, each class often comprises multiple finer-grained "subclasses".
As the subclass labels are frequently unavailable, models trained using only the coarser-grained class labels often exhibit highly variable performance across different subclasses.
We propose GEORGE, a method to both measure and mitigate hidden stratification even when subclass labels are unknown.
arXiv Detail & Related papers (2020-11-25T18:50:32Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences.