HOpenCls: Training Hyperspectral Image Open-Set Classifiers in Their Living Environments
- URL: http://arxiv.org/abs/2502.15163v1
- Date: Fri, 21 Feb 2025 02:59:18 GMT
- Title: HOpenCls: Training Hyperspectral Image Open-Set Classifiers in Their Living Environments
- Authors: Hengwei Zhao, Xinyu Wang, Zhuo Zheng, Jingtao Li, Yanfei Zhong,
- Abstract summary: Hyperspectral image (HSI) open-set classification is critical for HSI classification models deployed in real-world environments. This paper proposes a novel framework, HOpenCls, to leverage unlabeled wild data. Experiment results demonstrate that wild data has the potential to significantly enhance open-set HSI classification in complex real-world scenarios.
- Score: 12.486470619228776
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Hyperspectral image (HSI) open-set classification is critical for HSI classification models deployed in real-world environments, where classifiers must simultaneously classify known classes and reject unknown classes. Recent methods utilize auxiliary unknown-class data to improve classification performance. However, such auxiliary data is strongly assumed to be completely separable from the known classes and requires labor-intensive annotation. To address this limitation, this paper proposes a novel framework, HOpenCls, that leverages unlabeled wild data, i.e., a mixture of known and unknown classes. Such wild data is abundant and can be collected freely while deploying classifiers in their living environments. The key insight is to reformulate open-set HSI classification with unlabeled wild data as a positive-unlabeled (PU) learning problem. Specifically, a multi-label strategy is introduced to bridge PU learning and open-set HSI classification, and a gradient contraction and gradient expansion module is proposed to make this PU learning problem tractable, motivated by the observation of abnormal gradient weights associated with wild data. Extensive experimental results demonstrate that incorporating wild data has the potential to significantly enhance open-set HSI classification in complex real-world scenarios.
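The abstract casts open-set HSI classification with wild data as PU learning under a multi-label (one-vs-rest) strategy. The sketch below illustrates what such a formulation can look like with a standard non-negative PU risk estimator; it is only an illustration of the general PU idea, since the abstract does not detail the gradient contraction and gradient expansion module, and the class priors (`prior`, `priors`) and function names are assumptions rather than the authors' implementation.

```python
import torch
import torch.nn.functional as F

def nn_pu_risk(pos_logits, unl_logits, prior):
    """Non-negative PU risk for one known class treated as "positive" vs. the rest.

    pos_logits: logits for pixels labeled with this known class
    unl_logits: logits for unlabeled wild pixels (mixture of known and unknown)
    prior:      assumed prior of this class inside the wild data (a hyperparameter here)
    """
    risk_pos = F.softplus(-pos_logits).mean()             # positives scored as positive
    risk_pos_as_neg = F.softplus(pos_logits).mean()       # positives scored as negative
    risk_unl_as_neg = F.softplus(unl_logits).mean()       # unlabeled scored as negative
    risk_neg = risk_unl_as_neg - prior * risk_pos_as_neg  # corrected negative risk
    return prior * risk_pos + torch.clamp(risk_neg, min=0.0)  # clamp keeps the estimator non-negative

def multilabel_pu_loss(pos_logits_per_class, unl_logits, priors):
    """One-vs-rest (multi-label) combination: one binary PU head per known class."""
    return sum(
        nn_pu_risk(pos_logits_per_class[c], unl_logits[:, c], priors[c])
        for c in range(len(priors))
    )
```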
Related papers
- Robust Semi-Supervised Learning for Self-learning Open-World Classes [5.714673612282175]
In real-world applications, unlabeled data always contain classes not present in the labeled set.
We propose an open-world SSL method for Self-learning Open-world Classes (SSOC), which can explicitly self-learn multiple unknown classes.
SSOC outperforms the state-of-the-art baselines on multiple popular classification benchmarks.
arXiv Detail & Related papers (2024-01-15T09:27:46Z)
- Learning Large Margin Sparse Embeddings for Open Set Medical Diagnosis [8.131130865777346]
Open set recognition (OSR) states that categories unseen in training could appear in testing.
OSR requires an algorithm to not only correctly classify known classes, but also recognize unknown classes and forward them to experts for further diagnosis.
We propose Open Margin Cosine Loss (OMCL), unifying two mechanisms. The former, called Margin Loss with Adaptive Scale (MLAS), introduces an angular margin to reinforce intra-class compactness and inter-class separability.
The latter, called Open-Space Suppression (OSS), opens the classifier by recognizing sparse embedding space as unknowns using the proposed feature-space descriptors.
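The angular-margin idea in MLAS can be illustrated with a generic additive-margin cosine softmax; this is a sketch of that family of losses, not the exact OMCL formulation, and the margin `m` and scale `s` values are placeholder hyperparameters.

```python
import torch
import torch.nn.functional as F

def margin_cosine_loss(features, weights, labels, m=0.35, s=30.0):
    """Generic additive-margin cosine softmax (illustrative, not the paper's exact loss).

    features: [N, D] embeddings, weights: [C, D] class prototypes,
    labels:   [N] ground-truth class indices,
    m, s:     assumed margin and scale hyperparameters.
    """
    cos = F.normalize(features) @ F.normalize(weights).t()        # [N, C] cosine similarities
    margin = torch.zeros_like(cos).scatter_(1, labels.unsqueeze(1), m)
    return F.cross_entropy(s * (cos - margin), labels)            # subtract margin on the true class
```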
arXiv Detail & Related papers (2023-07-10T13:09:42Z)
- Open World Classification with Adaptive Negative Samples [89.2422451410507]
Open world classification is a task in natural language processing with key practical relevance and impact.
We propose an approach based on adaptive negative samples (ANS), designed to generate effective synthetic open category samples in the training stage.
ANS achieves significant improvements over state-of-the-art methods.
arXiv Detail & Related papers (2023-03-09T21:12:46Z)
- Parametric Classification for Generalized Category Discovery: A Baseline Study [70.73212959385387]
Generalized Category Discovery (GCD) aims to discover novel categories in unlabelled datasets using knowledge learned from labelled samples.
We investigate the failure of parametric classifiers, verify the effectiveness of previous design choices when high-quality supervision is available, and identify unreliable pseudo-labels as a key problem.
We propose a simple yet effective parametric classification method that benefits from entropy regularisation, achieves state-of-the-art performance on multiple GCD benchmarks and shows strong robustness to unknown class numbers.
arXiv Detail & Related papers (2022-11-21T18:47:11Z)
- Generalized Category Discovery [148.32255950504182]
We consider a highly general image recognition setting wherein, given a labelled and unlabelled set of images, the task is to categorize all images in the unlabelled set.
Here, the unlabelled images may come from labelled classes or from novel ones.
We first establish strong baselines by taking state-of-the-art algorithms from novel category discovery and adapting them for this task.
We then introduce a simple yet effective semi-supervised $k$-means method to cluster the unlabelled data into seen and unseen classes.
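As a rough illustration of the semi-supervised k-means idea (labelled samples are anchored to their known-class centroid while extra centroids absorb potential novel classes), a minimal NumPy sketch might look like the following; the initialisation scheme and the `n_novel` parameter are assumptions rather than the paper's exact procedure.

```python
import numpy as np

def semi_supervised_kmeans(X_lab, y_lab, X_unlab, n_novel, n_iters=50, seed=0):
    """Cluster labelled + unlabelled features; labelled points stay with their class centroid."""
    rng = np.random.default_rng(seed)
    classes = np.unique(y_lab)
    # Seen-class centroids start at labelled class means; novel centroids at random unlabelled points.
    centroids = np.stack([X_lab[y_lab == c].mean(axis=0) for c in classes])
    centroids = np.concatenate([centroids, X_unlab[rng.choice(len(X_unlab), n_novel, replace=False)]])
    assign_lab = np.searchsorted(classes, y_lab)           # labelled points are fixed to their own class
    for _ in range(n_iters):
        dists = ((X_unlab[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=-1)
        assign_unlab = dists.argmin(axis=1)                 # unlabelled points pick the nearest centroid
        for k in range(len(centroids)):
            members = np.concatenate([X_lab[assign_lab == k], X_unlab[assign_unlab == k]])
            if len(members):
                centroids[k] = members.mean(axis=0)
    return assign_unlab, centroids
```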
arXiv Detail & Related papers (2022-01-07T18:58:35Z)
- No Fear of Heterogeneity: Classifier Calibration for Federated Learning with Non-IID Data [78.69828864672978]
A central challenge in training classification models in the real-world federated system is learning with non-IID data.
We propose a novel and simple algorithm called Classifier Calibration with Virtual Representations (CCVR), which adjusts the classifier using virtual representations sampled from an approximated Gaussian mixture model.
Experimental results demonstrate that CCVR achieves state-of-the-art performance on popular federated learning benchmarks including CIFAR-10, CIFAR-100, and CINIC-10.
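The calibration-with-virtual-representations idea can be sketched as fitting a per-class Gaussian over features and refitting only the classifier head on features sampled from those Gaussians; this is a simplified, centralised stand-in (with an sklearn head and an assumed per-class sample count), not the federated CCVR procedure itself.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def calibrate_with_virtual_features(feats, labels, n_virtual=200, seed=0):
    """Fit per-class Gaussians over features, then refit only the classifier head
    on balanced virtual features sampled from them."""
    rng = np.random.default_rng(seed)
    X_virt, y_virt = [], []
    for c in np.unique(labels):
        fc = feats[labels == c]
        mu = fc.mean(axis=0)
        cov = np.cov(fc, rowvar=False) + 1e-6 * np.eye(fc.shape[1])  # regularised covariance
        X_virt.append(rng.multivariate_normal(mu, cov, size=n_virtual))
        y_virt.append(np.full(n_virtual, c))
    return LogisticRegression(max_iter=1000).fit(np.vstack(X_virt), np.concatenate(y_virt))
```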
arXiv Detail & Related papers (2021-06-09T12:02:29Z)
- Learning Placeholders for Open-Set Recognition [38.57786747665563]
We propose PlaceholdeRs for Open-SEt Recognition (Proser) to maintain classification performance on known classes and reject unknowns.
Proser efficiently generates novel-class samples by manifold mixup, and adaptively sets the value of the reserved open-set classifier during training.
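A minimal sketch of generating placeholder data by mixing embeddings of two known classes follows; the Beta parameter, function name, and use of a single reserved label are assumptions, and Proser's adaptive tuning of the reserved classifier is not shown.

```python
import torch

def manifold_mixup_placeholders(feats_a, feats_b, novel_label, alpha=2.0):
    """Mix intermediate features of two different known classes and label the
    result with the reserved placeholder ("novel") class index."""
    lam = torch.distributions.Beta(alpha, alpha).sample()
    mixed = lam * feats_a + (1.0 - lam) * feats_b                        # interpolated embeddings
    labels = torch.full((feats_a.size(0),), novel_label, dtype=torch.long)
    return mixed, labels
```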
arXiv Detail & Related papers (2021-03-28T09:18:15Z)
- Learning Open Set Network with Discriminative Reciprocal Points [70.28322390023546]
Open set recognition aims to simultaneously classify samples from predefined classes and identify the rest as 'unknown'.
In this paper, we propose a new concept, Reciprocal Point, which is the potential representation of the extra-class space corresponding to each known category.
Based on the bounded space constructed by reciprocal points, the risk of unknown samples is reduced through multi-category interaction.
arXiv Detail & Related papers (2020-10-31T03:20:31Z)
- Open Set Recognition with Conditional Probabilistic Generative Models [51.40872765917125]
We propose Conditional Probabilistic Generative Models (CPGM) for open set recognition.
CPGM can not only detect unknown samples but also classify known classes by forcing different latent features to approximate conditional Gaussian distributions.
Experiment results on multiple benchmark datasets reveal that the proposed method significantly outperforms the baselines.
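The "latent features approximate class-conditional Gaussians" idea can be illustrated by fitting one Gaussian per known class in latent space and rejecting test samples whose best class log-likelihood falls below a threshold; this sketch shows only that generic scoring rule, not CPGM's generative model or its exact decision criterion, and the threshold is an assumed hyperparameter.

```python
import numpy as np
from scipy.stats import multivariate_normal

def fit_class_gaussians(latents, labels):
    """Fit one Gaussian per known class over latent features."""
    return {
        c: multivariate_normal(
            latents[labels == c].mean(axis=0),
            np.cov(latents[labels == c], rowvar=False) + 1e-6 * np.eye(latents.shape[1]),
        )
        for c in np.unique(labels)
    }

def predict_open_set(z, gaussians, threshold):
    """Return the best-scoring known class, or 'unknown' if its log-likelihood is too low."""
    scores = {c: g.logpdf(z) for c, g in gaussians.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] >= threshold else "unknown"
```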
arXiv Detail & Related papers (2020-08-12T06:23:49Z)
- Classify and Generate Reciprocally: Simultaneous Positive-Unlabelled Learning and Conditional Generation with Extra Data [77.31213472792088]
The scarcity of class-labeled data is a ubiquitous bottleneck in many machine learning problems.
We address this problem by leveraging Positive-Unlabeled (PU) classification and conditional generation with extra unlabeled data.
We present a novel training framework to jointly target both PU classification and conditional generation when exposed to extra data.
arXiv Detail & Related papers (2020-06-14T08:27:40Z)
- Fuzziness-based Spatial-Spectral Class Discriminant Information Preserving Active Learning for Hyperspectral Image Classification [0.456877715768796]
This work proposes a novel fuzziness-based spatial-spectral method that preserves within-class and between-class discriminant information at both local and global levels.
Experimental results on benchmark HSI datasets demonstrate the effectiveness of the FLG method with Generative, Extreme Learning Machine, and Sparse Multinomial Logistic Regression classifiers.
arXiv Detail & Related papers (2020-05-28T18:58:11Z)
- Conditional Gaussian Distribution Learning for Open Set Recognition [10.90687687505665]
We propose Conditional Gaussian Distribution Learning (CGDL) for open set recognition.
In addition to detecting unknown samples, this method can also classify known samples by forcing different latent features to approximate different Gaussian models.
Experiments on several standard image datasets reveal that the proposed method significantly outperforms the baseline method and achieves new state-of-the-art results.
arXiv Detail & Related papers (2020-03-19T14:32:08Z)
- Open-set learning with augmented categories by exploiting unlabelled data [1.2691047660244337]
This research is the first to generalise between observed-novel and unobserved-novel categories within a new learning policy called open-set learning with augmented category.
We introduce Open-LACU as a unified policy of positive and unlabelled learning, semi-supervised learning and open-set recognition.
The proposed Open-LACU achieves state-of-the-art and first-of-its-kind results.
arXiv Detail & Related papers (2020-02-04T15:32:23Z)
This list is automatically generated from the titles and abstracts of the papers on this site.