PixelDINO: Semi-Supervised Semantic Segmentation for Detecting
Permafrost Disturbances
- URL: http://arxiv.org/abs/2401.09271v1
- Date: Wed, 17 Jan 2024 15:20:10 GMT
- Title: PixelDINO: Semi-Supervised Semantic Segmentation for Detecting
Permafrost Disturbances
- Authors: Konrad Heidler, Ingmar Nitze, Guido Grosse, Xiao Xiang Zhu
- Abstract summary: We focus on the remote detection of retrogressive thaw slumps (RTS), a permafrost disturbance comparable to landslides induced by thawing.
We present a semi-supervised learning approach to train semantic segmentation models to detect RTS.
Our framework called PixelDINO is trained in parallel on labelled data as well as unlabelled data.
- Score: 15.78884578132055
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Arctic Permafrost is facing significant changes due to global climate change.
As these regions are largely inaccessible, remote sensing plays a crucial role
in better understanding the underlying processes not just on a local scale, but
across the Arctic. In this study, we focus on the remote detection of
retrogressive thaw slumps (RTS), a permafrost disturbance comparable to
landslides induced by thawing. For such analyses from space, deep learning has
become an indispensable tool, but limited labelled training data remains a
challenge for training accurate models. To improve model generalization across
the Arctic without the need for additional labelled data, we present a
semi-supervised learning approach to train semantic segmentation models to
detect RTS. Our framework called PixelDINO is trained in parallel on labelled
data as well as unlabelled data. For the unlabelled data, the model segments
the imagery into self-taught pseudo-classes and the training procedure ensures
consistency of these pseudo-classes across strong augmentations of the input
data. Our experimental results demonstrate that PixelDINO can improve model
performance both over supervised baseline methods as well as existing
semi-supervised semantic segmentation approaches, highlighting its potential
for training robust models that generalize well to regions that were not
included in the training data. The project page containing code and other
materials for this study can be found at
\url{https://khdlr.github.io/PixelDINO/}.
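To make the training scheme described above concrete, the following is a minimal PyTorch-style sketch of one training step that combines a supervised cross-entropy loss on labelled tiles with a DINO-style pixel-wise consistency loss on unlabelled tiles under strong augmentation. It is an illustration derived from the abstract only: the temperatures, the EMA teacher update, and the function names are assumptions, not the authors' implementation (which is available on the project page above).

```python
# Illustrative sketch of a PixelDINO-style training step (not the authors' code).
# Assumptions: `student`/`teacher` are identical segmentation networks mapping
# images (B, C, H, W) to per-pixel logits (B, K, H, W); temperatures, loss weight,
# and the EMA momentum are placeholder values.
import torch
import torch.nn.functional as F


def pixeldino_style_step(student, teacher, optimizer,
                         labelled_imgs, labels, unlabelled_imgs,
                         strong_augment, t_student=0.1, t_teacher=0.04,
                         unsup_weight=1.0, ema_momentum=0.996):
    # --- Supervised branch: ordinary cross-entropy on labelled tiles. ---
    sup_logits = student(labelled_imgs)              # (B, K, H, W) logits
    sup_loss = F.cross_entropy(sup_logits, labels)   # labels: (B, H, W) class ids

    # --- Unsupervised branch: consistency of pixel-wise pseudo-classes. ---
    with torch.no_grad():
        # The teacher's sharpened soft assignments act as per-pixel targets.
        teacher_probs = F.softmax(teacher(unlabelled_imgs) / t_teacher, dim=1)

    # The student sees a strongly augmented view of the same tiles and must
    # reproduce the teacher's assignments (cross-entropy between distributions).
    # Geometric augmentations would also require warping the teacher targets;
    # omitted here for brevity.
    student_logp = F.log_softmax(
        student(strong_augment(unlabelled_imgs)) / t_student, dim=1)
    unsup_loss = -(teacher_probs * student_logp).sum(dim=1).mean()

    loss = sup_loss + unsup_weight * unsup_loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

    # The teacher tracks the student as an exponential moving average
    # (a common choice in DINO-style self-distillation; assumed here).
    with torch.no_grad():
        for p_t, p_s in zip(teacher.parameters(), student.parameters()):
            p_t.mul_(ema_momentum).add_(p_s, alpha=1 - ema_momentum)
    return loss.item()
```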
Related papers
- Exploring Beyond Logits: Hierarchical Dynamic Labeling Based on Embeddings for Semi-Supervised Classification [49.09505771145326]
We propose a Hierarchical Dynamic Labeling (HDL) algorithm that does not depend on model predictions and utilizes image embeddings to generate sample labels.
Our approach has the potential to change the paradigm of pseudo-label generation in semi-supervised learning.
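As a loose illustration of generating labels from image embeddings, the sketch below assigns each unlabelled sample the majority label of its k nearest labelled neighbours in embedding space; this is a plain k-NN simplification, not the hierarchical dynamic labeling scheme proposed in the paper.

```python
# Simplistic embedding-based pseudo-labelling: majority vote over the k nearest
# labelled neighbours in embedding space. Generic sketch, not the HDL algorithm.
import torch
import torch.nn.functional as F


def knn_pseudo_labels(unlabelled_emb, labelled_emb, labelled_y, k=5):
    u = F.normalize(unlabelled_emb, dim=1)
    l = F.normalize(labelled_emb, dim=1)
    sims = u @ l.T                            # cosine similarities (N_unlab, N_lab)
    knn_idx = sims.topk(k, dim=1).indices     # indices of k most similar labelled samples
    knn_labels = labelled_y[knn_idx]          # (N_unlab, k) neighbour labels
    pseudo, _ = knn_labels.mode(dim=1)        # majority vote per unlabelled sample
    return pseudo
```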
arXiv Detail & Related papers (2024-04-26T06:00:27Z)
- Learning Cross-view Visual Geo-localization without Ground Truth [48.51859322439286]
Cross-View Geo-Localization (CVGL) involves determining the geographical location of a query image by matching it with a corresponding GPS-tagged reference image.
Current state-of-the-art methods rely on training models with labeled paired images, incurring substantial annotation costs and training burdens.
We investigate the adaptation of frozen models for CVGL without requiring ground truth pair labels.
arXiv Detail & Related papers (2024-03-19T13:01:57Z)
- Group Distributionally Robust Dataset Distillation with Risk Minimization [18.07189444450016]
We introduce an algorithm that combines clustering with the minimization of a risk measure on the loss to conduct dataset distillation (DD).
We demonstrate its effective generalization and robustness across subgroups through numerical experiments.
arXiv Detail & Related papers (2024-02-07T09:03:04Z)
- Learn to Unlearn for Deep Neural Networks: Minimizing Unlearning Interference with Gradient Projection [56.292071534857946]
Recent data-privacy laws have sparked interest in machine unlearning.
The challenge is to discard information about the "forget" data without altering knowledge about the remaining dataset.
We adopt a projected-gradient based learning method, named Projected-Gradient Unlearning (PGU).
We provide empirical evidence that our unlearning method can produce models that behave similarly to models retrained from scratch across various metrics, even when the training dataset is no longer accessible.
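As a rough sketch of the gradient-projection idea, the snippet below takes an unlearning step while removing the component of the update that aligns with the retained-data gradient. It projects against a single aggregated gradient rather than a full gradient subspace, and the forget/retain loss callables are placeholders, so this is not the paper's PGU procedure.

```python
# Generic gradient-projection unlearning step (illustrative only).
# `forget_loss_fn` could be, e.g., a KL divergence to a uniform distribution on
# the forget set; `retain_loss_fn` is the ordinary loss on the remaining data.
import torch


def projected_unlearning_step(model, forget_loss_fn, retain_loss_fn, lr=1e-3):
    params = [p for p in model.parameters() if p.requires_grad]

    # Gradient of the unlearning objective and of the retained-data loss.
    g_forget = torch.autograd.grad(forget_loss_fn(model), params)
    g_retain = torch.autograd.grad(retain_loss_fn(model), params)

    gf = torch.cat([g.flatten() for g in g_forget])
    gr = torch.cat([g.flatten() for g in g_retain])

    # Remove the component of the unlearning gradient that lies along the
    # retained-data gradient, so the update (to first order) does not
    # interfere with performance on the remaining dataset.
    gf_proj = gf - (gf @ gr) / (gr @ gr + 1e-12) * gr

    with torch.no_grad():
        offset = 0
        for p in params:
            n = p.numel()
            p -= lr * gf_proj[offset:offset + n].view_as(p)
            offset += n
```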
arXiv Detail & Related papers (2023-12-07T07:17:24Z)
- Robust Source-Free Domain Adaptation for Fundus Image Segmentation [3.585032903685044]
Unsupervised Domain Adaptation (UDA) is a learning technique that transfers knowledge learned from labelled source-domain data to a target domain where only unlabelled data is available.
In this study, we propose a two-stage training strategy for robust domain adaptation.
We propose a novel robust pseudo-label and pseudo-boundary (PLPB) method, which effectively utilizes unlabeled target data to generate pseudo labels and pseudo boundaries.
arXiv Detail & Related papers (2023-10-25T14:25:18Z)
- Uncertainty-Aware Semi-Supervised Learning for Prostate MRI Zonal Segmentation [0.9176056742068814]
We propose a novel semi-supervised learning (SSL) approach that requires only a relatively small number of annotations.
Our method uses a pseudo-labeling technique that employs recent deep learning uncertainty estimation models.
Our proposed model outperformed the semi-supervised model in experiments with the ProstateX dataset and an external test set.
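A minimal illustration of uncertainty-gated pseudo-labelling, here using Monte-Carlo dropout as the uncertainty estimate and a fixed standard-deviation threshold; the paper's actual uncertainty model and thresholds differ.

```python
# Generic uncertainty-aware pseudo-labelling sketch: keep a pixel's pseudo-label
# only where Monte-Carlo dropout predictions agree with low variance.
import torch


def uncertainty_gated_pseudo_labels(model, images, n_samples=8, max_std=0.05,
                                    ignore_index=255):
    model.train()  # keep dropout active for Monte-Carlo sampling
    with torch.no_grad():
        probs = torch.stack([model(images).softmax(dim=1)
                             for _ in range(n_samples)])    # (T, B, K, H, W)
    mean_probs = probs.mean(dim=0)
    std = probs.std(dim=0)                                   # per-class std
    pseudo = mean_probs.argmax(dim=1)                        # (B, H, W) labels
    # Uncertainty of the winning class at each pixel.
    win_std = std.gather(1, pseudo.unsqueeze(1)).squeeze(1)
    pseudo[win_std > max_std] = ignore_index                 # drop uncertain pixels
    return pseudo
```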
arXiv Detail & Related papers (2023-05-10T08:50:04Z)
- Few-Shot Non-Parametric Learning with Deep Latent Variable Model [50.746273235463754]
We propose Non-Parametric learning by Compression with Latent Variables (NPC-LV).
NPC-LV is a learning framework for any dataset with abundant unlabeled data but very few labeled ones.
We show that NPC-LV outperforms supervised methods on image classification across all three datasets in the low-data regime.
arXiv Detail & Related papers (2022-06-23T09:35:03Z)
- Semi-supervised Deep Learning for Image Classification with Distribution Mismatch: A Survey [1.5469452301122175]
Deep learning models rely on an abundance of labelled observations to train a prospective model.
Gathering labelled observations is expensive, which often makes deep learning models impractical to use.
In many situations different unlabelled data sources might be available.
This raises the risk of a significant distribution mismatch between the labelled and unlabelled datasets.
arXiv Detail & Related papers (2022-03-01T02:46:00Z)
- Graph Embedding with Data Uncertainty [113.39838145450007]
Spectral-based subspace learning is a common data preprocessing step in many machine learning pipelines.
Most subspace learning methods do not take into consideration possible measurement inaccuracies or artifacts that can lead to data with high uncertainty.
arXiv Detail & Related papers (2020-09-01T15:08:23Z)
- Deep Semi-supervised Knowledge Distillation for Overlapping Cervical Cell Instance Segmentation [54.49894381464853]
We propose to leverage both labeled and unlabeled data for instance segmentation with improved accuracy by knowledge distillation.
We propose a novel Mask-guided Mean Teacher framework with Perturbation-sensitive Sample Mining.
Experiments show that the proposed method improves the performance significantly compared with the supervised method learned from labeled data only.
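For context, a bare-bones Mean Teacher consistency term (without the mask guidance or perturbation-sensitive sample mining that the paper adds) can be sketched as follows.

```python
# Bare-bones Mean Teacher consistency for segmentation: the student matches the
# EMA teacher's predictions on perturbed unlabelled images. Illustrative only.
import torch
import torch.nn.functional as F


def mean_teacher_consistency(student, teacher, unlabelled_imgs, perturb):
    with torch.no_grad():
        teacher_probs = teacher(unlabelled_imgs).softmax(dim=1)
    student_probs = student(perturb(unlabelled_imgs)).softmax(dim=1)
    return F.mse_loss(student_probs, teacher_probs)


@torch.no_grad()
def update_teacher(student, teacher, momentum=0.99):
    # Teacher weights track the student as an exponential moving average.
    for p_t, p_s in zip(teacher.parameters(), student.parameters()):
        p_t.mul_(momentum).add_(p_s, alpha=1 - momentum)
```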
arXiv Detail & Related papers (2020-07-21T13:27:09Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences of its use.