A semi-supervised Teacher-Student framework for surgical tool detection
and localization
- URL: http://arxiv.org/abs/2208.09926v1
- Date: Sun, 21 Aug 2022 17:21:31 GMT
- Title: A semi-supervised Teacher-Student framework for surgical tool detection
and localization
- Authors: Mansoor Ali and Gilberto Ochoa-Ruiz and Sharib Ali
- Abstract summary: We introduce a semi-supervised learning (SSL) framework for surgical tool detection.
In the proposed work, we first train a model on labeled data, which initialises Teacher-Student joint learning.
Our results on the m2cai16-tool-locations dataset indicate the superiority of our approach across different supervised data settings.
- Score: 2.41710192205034
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Surgical tool detection in minimally invasive surgery is an essential part of
computer-assisted interventions. Current approaches are mostly based on
supervised methods, which require large, fully labeled datasets to train
and suffer from pseudo-label bias caused by class imbalance. However,
large image datasets with bounding-box annotations are rarely available.
Semi-supervised learning (SSL) has recently emerged as a means of
training large models using only a modest amount of annotated data. Apart from
reducing the annotation cost, SSL has also shown promise in producing models that
are more robust and generalizable. In this paper, we therefore introduce an
SSL framework for surgical tool detection
that aims to mitigate the scarcity of training data and the data imbalance
through a knowledge distillation approach. In the proposed work, we first train a
model on the labeled data to initialise Teacher-Student joint learning,
in which the Student is trained on Teacher-generated pseudo labels from unlabeled
data. We propose a multi-class, margin-based distance classification loss
in the region-of-interest head of the detector to effectively
separate foreground classes from the background region. Our results on
the m2cai16-tool-locations dataset demonstrate the superiority of our approach across
different supervision settings (1%, 2%, 5%, and 10% of annotated data), where
our model achieves overall improvements of 8%, 12%, and 27% in mAP (on 1%
labeled data) over state-of-the-art SSL methods and a fully supervised
baseline, respectively. The code is available at
https://github.com/Mansoor-at/Semi-supervised-surgical-tool-det
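The abstract describes the Teacher-Student joint learning only at a high level, so the following is a minimal sketch of one common way to realise it (burn-in on labeled data, an EMA-updated Teacher, confidence-filtered pseudo labels), assuming a torchvision-style detector interface. The burn-in length, confidence threshold, EMA keep rate, and unsupervised loss weight are illustrative placeholders, not values taken from the paper.

```python
# Hedged sketch of semi-supervised Teacher-Student detector training.
# Assumes a torchvision-style detector: returns a loss dict in train mode
# and a list of {"boxes", "labels", "scores"} dicts in eval mode.
import copy
import torch


@torch.no_grad()
def ema_update(teacher, student, keep_rate=0.996):
    """Exponential-moving-average update of the Teacher from the Student."""
    for t_p, s_p in zip(teacher.parameters(), student.parameters()):
        t_p.mul_(keep_rate).add_(s_p, alpha=1.0 - keep_rate)


@torch.no_grad()
def make_pseudo_labels(teacher, images, score_thresh=0.7):
    """Keep only confident Teacher detections as pseudo ground truth."""
    teacher.eval()
    preds = teacher(images)
    pseudo = []
    for p in preds:
        keep = p["scores"] >= score_thresh
        pseudo.append({"boxes": p["boxes"][keep], "labels": p["labels"][keep]})
    return pseudo


def train_ssl(student, labeled_loader, unlabeled_loader, optimizer,
              burn_in_steps=1000, total_steps=10000, unsup_weight=1.0):
    # 1) Burn-in: supervised training on the labeled split initialises the model.
    student.train()
    for _, (images, targets) in zip(range(burn_in_steps), labeled_loader):
        loss = sum(student(images, targets).values())
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

    # 2) The Teacher starts as a copy of the burned-in Student.
    teacher = copy.deepcopy(student)

    # 3) Joint learning: the Student trains on labeled data plus Teacher-generated
    #    pseudo labels; the Teacher tracks the Student through EMA updates.
    #    (Assumes the loaders cycle or are long enough for total_steps batches.)
    for _, (l_batch, u_batch) in zip(range(total_steps),
                                     zip(labeled_loader, unlabeled_loader)):
        l_imgs, l_tgts = l_batch
        u_imgs, _ = u_batch
        pseudo_tgts = make_pseudo_labels(teacher, u_imgs)
        student.train()
        sup_loss = sum(student(l_imgs, l_tgts).values())
        unsup_loss = sum(student(u_imgs, pseudo_tgts).values())
        loss = sup_loss + unsup_weight * unsup_loss
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        ema_update(teacher, student)
    return student, teacher
```

With a torchvision detection model, calling `train_ssl(model, labeled_loader, unlabeled_loader, torch.optim.SGD(model.parameters(), lr=0.01))` would run the full burn-in plus joint-learning schedule.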
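The abstract also mentions a margin-based, multi-class distance classification loss in the region-of-interest head. The exact formulation is not given here, so the sketch below shows one plausible variant only: logits are negative squared distances to learnable class centres, and a margin is applied to the ground-truth class of foreground proposals to push them away from the background. The margin, the scale factor, and the background-at-index-0 convention are assumptions.

```python
# Hedged sketch of a distance-with-margin classification loss for an RoI head.
# Not the paper's exact loss; margin and scale values are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MarginDistanceRoIClassifier(nn.Module):
    def __init__(self, feat_dim, num_classes, margin=0.5, scale=10.0):
        super().__init__()
        # One learnable centre per class; index 0 is assumed to be background.
        self.centres = nn.Parameter(torch.randn(num_classes, feat_dim))
        self.margin = margin
        self.scale = scale

    def forward(self, roi_feats, labels):
        # Negative squared Euclidean distance to each class centre as the logit.
        dists = torch.cdist(roi_feats, self.centres) ** 2   # (N, C)
        logits = -dists
        # Subtract a margin from the ground-truth logit of foreground RoIs only,
        # enlarging the gap between foreground classes and the background.
        margin_mask = torch.zeros_like(logits)
        idx = torch.arange(labels.size(0), device=logits.device)
        margin_mask[idx, labels] = self.margin
        margin_mask[labels == 0] = 0.0   # no margin for background RoIs
        logits = logits - margin_mask
        return F.cross_entropy(self.scale * logits, labels)
```

Scaling the distance-based logits before the softmax is a common trick to keep the cross-entropy gradients well conditioned; without it, large squared distances can saturate the softmax early in training.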
Related papers
- A Closer Look at Benchmarking Self-Supervised Pre-training with Image Classification [51.35500308126506]
Self-supervised learning (SSL) is a machine learning approach where the data itself provides supervision, eliminating the need for external labels.
We study how classification-based evaluation protocols for SSL correlate and how well they predict downstream performance on different dataset types.
arXiv Detail & Related papers (2024-07-16T23:17:36Z)
- Class-Imbalanced Semi-Supervised Learning for Large-Scale Point Cloud Semantic Segmentation via Decoupling Optimization [64.36097398869774]
Semi-supervised learning (SSL) has been an active research topic for large-scale 3D scene understanding.
The existing SSL-based methods suffer from severe training bias due to class imbalance and long-tail distributions of the point cloud data.
We introduce a new decoupling optimization framework, which disentangles feature representation learning and the classifier in an alternating optimization manner to shift the biased decision boundary effectively.
arXiv Detail & Related papers (2024-01-13T04:16:40Z)
- Progressive Feature Adjustment for Semi-supervised Learning from Pretrained Models [39.42802115580677]
Semi-supervised learning (SSL) can leverage both labeled and unlabeled data to build a predictive model.
Recent literature suggests that naively applying state-of-the-art SSL with a pretrained model fails to unleash the full potential of training data.
We propose to use pseudo-labels from the unlabelled data to update the feature extractor that is less sensitive to incorrect labels.
arXiv Detail & Related papers (2023-09-09T01:57:14Z)
- Uncertainty-Aware Semi-Supervised Learning for Prostate MRI Zonal Segmentation [0.9176056742068814]
We propose a novel semi-supervised learning (SSL) approach that requires only a relatively small number of annotations.
Our method uses a pseudo-labeling technique that employs recent deep learning uncertainty estimation models.
Our proposed model outperformed the semi-supervised model in experiments with the ProstateX dataset and an external test set.
arXiv Detail & Related papers (2023-05-10T08:50:04Z)
- Universal Semi-Supervised Learning for Medical Image Classification [21.781201758182135]
Semi-supervised learning (SSL) has attracted much attention since it reduces the expensive costs of collecting adequate well-labeled training data.
Traditional SSL is built upon an assumption that labeled and unlabeled data should be from the same distribution.
We propose a unified framework to leverage unseen unlabeled data for open-scenario semi-supervised medical image classification.
arXiv Detail & Related papers (2023-04-08T16:12:36Z)
- PCA: Semi-supervised Segmentation with Patch Confidence Adversarial Training [52.895952593202054]
We propose a new semi-supervised adversarial method called Patch Confidence Adversarial Training (PCA) for medical image segmentation.
PCA learns the pixel structure and context information in each patch to get enough gradient feedback, which aids the discriminator in converging to an optimal state.
Our method outperforms the state-of-the-art semi-supervised methods, which demonstrates its effectiveness for medical image segmentation.
arXiv Detail & Related papers (2022-07-24T07:45:47Z)
- Training image classifiers using Semi-Weak Label Data [26.04162590798731]
In Multiple Instance learning (MIL), weak labels are provided at the bag level with only presence/absence information known.
This paper introduces a novel semi-weak label learning paradigm as a middle ground to mitigate the problem.
We propose a two-stage framework to address the problem of learning from semi-weak labels.
arXiv Detail & Related papers (2021-03-19T03:06:07Z)
- Unbiased Teacher for Semi-Supervised Object Detection [50.0087227400306]
We revisit the Semi-Supervised Object Detection (SS-OD) and identify the pseudo-labeling bias issue in SS-OD.
We introduce Unbiased Teacher, a simple yet effective approach that jointly trains a student and a gradually progressing teacher in a mutually-beneficial manner.
arXiv Detail & Related papers (2021-02-18T17:02:57Z)
- Neural Semi-supervised Learning for Text Classification Under Large-Scale Pretraining [51.19885385587916]
We conduct studies on semi-supervised learning in the task of text classification under the context of large-scale LM pretraining.
Our work marks an initial step in understanding the behavior of semi-supervised learning models under the context of large-scale pretraining.
arXiv Detail & Related papers (2020-11-17T13:39:05Z) - Semi-Automatic Data Annotation guided by Feature Space Projection [117.9296191012968]
We present a semi-automatic data annotation approach based on suitable feature space projection and semi-supervised label estimation.
We validate our method on the popular MNIST dataset and on images of human intestinal parasites with and without fecal impurities.
Our results demonstrate the added-value of visual analytics tools that combine complementary abilities of humans and machines for more effective machine learning.
arXiv Detail & Related papers (2020-07-27T17:03:50Z)
This list is automatically generated from the titles and abstracts of the papers on this site.