Confidence-Guided Semi-supervised Learning in Land Cover Classification
- URL: http://arxiv.org/abs/2305.10344v2
- Date: Tue, 30 May 2023 21:15:10 GMT
- Title: Confidence-Guided Semi-supervised Learning in Land Cover Classification
- Authors: Wanli Ma, Oktay Karakus, Paul L. Rosin
- Abstract summary: We develop a confidence-guided semi-supervised learning (CGSSL) approach to make use of high-confidence pseudo labels.
The proposed semi-supervised learning approach significantly improves the performance of land cover classification.
It even outperforms fully supervised learning with a complete set of labelled imagery of the benchmark Potsdam land cover dataset.
- Score: 31.28174940820374
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Semi-supervised learning has been well developed to help reduce the cost of
manual labelling by exploiting a large quantity of unlabelled data. Especially
in the application of land cover classification, pixel-level manual labelling
in large-scale imagery is labour-intensive, time-consuming and expensive.
However, existing semi-supervised learning methods pay limited attention to the
quality of pseudo-labels during training even though the quality of training
data is one of the critical factors determining network performance. In order
to fill this gap, we develop a confidence-guided semi-supervised learning
(CGSSL) approach to make use of high-confidence pseudo labels and reduce the
negative effect of low-confidence ones for land cover classification.
Meanwhile, the proposed semi-supervised learning approach uses multiple network
architectures to increase the diversity of pseudo labels. The proposed
semi-supervised learning approach significantly improves the performance of
land cover classification compared to the classic semi-supervised learning
methods and even outperforms fully supervised learning with a complete set of
labelled imagery of the benchmark Potsdam land cover dataset.
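The abstract describes keeping high-confidence pseudo-labels and down-weighting low-confidence ones. A minimal sketch of that general idea (the function name and the 0.9 threshold are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def select_confident_pseudo_labels(probs, threshold=0.9):
    """Keep pseudo-labels only where the predicted class probability
    exceeds `threshold`; low-confidence predictions are masked out
    so they do not contribute to the unsupervised loss.

    probs: (N, C) array of softmax outputs on unlabelled samples.
    Returns (pseudo_labels, mask): mask[i] is True when sample i's
    prediction is confident enough to train on.
    """
    confidence = probs.max(axis=1)        # per-sample max probability
    pseudo_labels = probs.argmax(axis=1)  # predicted class index
    mask = confidence >= threshold        # high-confidence selector
    return pseudo_labels, mask

# Example: 3 samples, 2 classes
probs = np.array([[0.95, 0.05],
                  [0.55, 0.45],
                  [0.10, 0.90]])
labels, mask = select_confident_pseudo_labels(probs, threshold=0.9)
# labels -> [0, 0, 1]; mask -> [True, False, True]
```

In a pixel-level land cover setting the same masking would be applied per pixel rather than per sample; the paper additionally uses multiple network architectures to diversify the pseudo-labels, which this sketch does not cover.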
Related papers
- Performance Evaluation of Semi-supervised Learning Frameworks for Multi-Class Weed Detection [15.828967396019143]

Effective weed control plays a crucial role in optimizing crop yield and enhancing agricultural product quality.
Recent advances in precision weed management enabled by ML and DL provide a sustainable alternative.
Semi-supervised learning methods have gained increased attention in the broader domain of computer vision.
arXiv Detail & Related papers (2024-03-06T00:59:51Z)
- A Self Supervised StyleGAN for Image Annotation and Classification with Extremely Limited Labels [35.43549147657739]
We propose SS-StyleGAN, a self-supervised approach for image annotation and classification suitable for extremely small annotated datasets.
We show that the proposed method attains strong classification results using small labeled datasets of sizes 50 and even 10.
arXiv Detail & Related papers (2023-12-26T09:46:50Z)
- PCA: Semi-supervised Segmentation with Patch Confidence Adversarial Training [52.895952593202054]
We propose a new semi-supervised adversarial method called Patch Confidence Adversarial Training (PCA) for medical image segmentation.
PCA learns the pixel structure and context information in each patch to get enough gradient feedback, which aids the discriminator in converging to an optimal state.
Our method outperforms the state-of-the-art semi-supervised methods, which demonstrates its effectiveness for medical image segmentation.
arXiv Detail & Related papers (2022-07-24T07:45:47Z)
- Class-Aware Contrastive Semi-Supervised Learning [51.205844705156046]
We propose a general method named Class-aware Contrastive Semi-Supervised Learning (CCSSL) to improve pseudo-label quality and enhance the model's robustness in the real-world setting.
Our proposed CCSSL has significant performance improvements over the state-of-the-art SSL methods on the standard datasets CIFAR100 and STL10.
arXiv Detail & Related papers (2022-03-04T12:18:23Z)
- STEdge: Self-training Edge Detection with Multi-layer Teaching and Regularization [15.579360385857129]
We study the problem of self-training edge detection, leveraging the untapped wealth of large-scale unlabeled image datasets.
We design a self-supervised framework with multi-layer regularization and self-teaching.
Our method attains 4.8% improvement for ODS and 5.8% for OIS when tested on the unseen BIPED dataset.
arXiv Detail & Related papers (2022-01-13T18:26:36Z)
- Guided Point Contrastive Learning for Semi-supervised Point Cloud Semantic Segmentation [90.2445084743881]
We present a method for semi-supervised point cloud semantic segmentation to adopt unlabeled point clouds in training to boost the model performance.
Inspired by the recent contrastive loss in self-supervised tasks, we propose the guided point contrastive loss to enhance the feature representation and model generalization ability.
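The guided point contrastive loss builds on the standard contrastive (InfoNCE) formulation used in self-supervised learning. A minimal sketch of the generic InfoNCE form only, not the paper's guided variant (the function name and temperature value are illustrative assumptions):

```python
import numpy as np

def info_nce_loss(anchor, positive, negatives, temperature=0.1):
    """Generic InfoNCE loss for one anchor embedding: pull the anchor
    towards its positive and push it away from the negatives.

    anchor, positive: 1-D embedding vectors.
    negatives: list of 1-D embedding vectors.
    """
    def cos(a, b):
        # cosine similarity between two vectors
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

    # positive similarity first, then all negatives
    logits = np.array([cos(anchor, positive)] +
                      [cos(anchor, n) for n in negatives]) / temperature
    logits -= logits.max()  # numerical stability before exponentiation
    probs = np.exp(logits) / np.exp(logits).sum()
    return -np.log(probs[0])  # cross-entropy against the positive

# The loss is near zero when the anchor matches its positive,
# and large when it matches a negative instead.
a = np.array([1.0, 0.0])
low = info_nce_loss(a, np.array([1.0, 0.0]), [np.array([0.0, 1.0])])
high = info_nce_loss(a, np.array([0.0, 1.0]), [np.array([1.0, 0.0])])
```

The paper's "guided" aspect selects which points serve as positives and negatives using pseudo-label guidance; that selection logic is not shown here.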
arXiv Detail & Related papers (2021-10-15T16:38:54Z)
- Self-supervised learning for joint SAR and multispectral land cover classification [38.8529535887097]
We present a framework and specific tasks for self-supervised training of multichannel models.
We show that the proposed self-supervised approach is highly effective at learning features that correlate with the labels for land cover classification.
arXiv Detail & Related papers (2021-08-20T09:02:07Z)
- Flip Learning: Erase to Segment [65.84901344260277]
Weakly-supervised segmentation (WSS) can help reduce time-consuming and cumbersome manual annotation.
We propose a novel and general WSS framework called Flip Learning, which only needs the box annotation.
Our proposed approach achieves competitive performance and shows great potential to narrow the gap between fully-supervised and weakly-supervised learning.
arXiv Detail & Related papers (2021-08-02T09:56:10Z)
- SCARF: Self-Supervised Contrastive Learning using Random Feature Corruption [72.35532598131176]
We propose SCARF, a technique for contrastive learning, where views are formed by corrupting a random subset of features.
We show that SCARF complements existing strategies and outperforms alternatives like autoencoders.
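The SCARF abstract describes forming contrastive views by corrupting a random subset of features. A minimal sketch of that corruption step (the function name and corruption rate are illustrative assumptions; the common choice is to resample each corrupted feature from its empirical marginal, here approximated by borrowing the value from a random row of the same batch):

```python
import numpy as np

def scarf_corrupt(batch, corruption_rate=0.6, rng=None):
    """Form a SCARF-style view of a tabular batch: for each sample,
    replace a random subset of features with values drawn from the
    same feature column in a randomly chosen other row.

    batch: (N, D) array of tabular features.
    Returns an array of the same shape.
    """
    rng = np.random.default_rng(rng)
    n, d = batch.shape
    # mask[i, j] = True means feature j of sample i is corrupted
    mask = rng.random((n, d)) < corruption_rate
    # for each (i, j), pick a random row to borrow feature j from
    donor_rows = rng.integers(0, n, size=(n, d))
    donors = batch[donor_rows, np.arange(d)]
    return np.where(mask, donors, batch)
```

The corrupted view and the original row then form a positive pair for a contrastive loss; corrupted values stay within each feature's observed range because they are borrowed column-wise.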
arXiv Detail & Related papers (2021-06-29T08:08:33Z)
- A Realistic Evaluation of Semi-Supervised Learning for Fine-Grained Classification [38.68079253627819]
Our benchmark consists of two fine-grained classification datasets obtained by sampling classes from the Aves and Fungi taxonomy.
We find that recently proposed SSL methods provide significant benefits, and can effectively use out-of-class data to improve performance when deep networks are trained from scratch.
Our work suggests that semi-supervised learning with experts on realistic datasets may require different strategies than those currently prevalent in the literature.
arXiv Detail & Related papers (2021-04-01T17:59:41Z)
- Deep Semi-supervised Knowledge Distillation for Overlapping Cervical Cell Instance Segmentation [54.49894381464853]
We propose to leverage both labeled and unlabeled data for instance segmentation with improved accuracy by knowledge distillation.
We propose a novel Mask-guided Mean Teacher framework with Perturbation-sensitive Sample Mining.
Experiments show that the proposed method improves the performance significantly compared with the supervised method learned from labeled data only.
arXiv Detail & Related papers (2020-07-21T13:27:09Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.