Semi-Supervised Semantic Segmentation with Cross Teacher Training
- URL: http://arxiv.org/abs/2209.01327v1
- Date: Sat, 3 Sep 2022 05:02:03 GMT
- Title: Semi-Supervised Semantic Segmentation with Cross Teacher Training
- Authors: Hui Xiao, Li Dong, Kangkang Song, Hao Xu, Shuibo Fu, Diqun Yan,
Chengbin Peng
- Abstract summary: This work proposes a cross-teacher training framework with three modules that significantly improves traditional semi-supervised learning approaches.
The core is a cross-teacher module, which simultaneously reduces the coupling among peer networks and the error accumulation between teacher and student networks.
The high-level module transfers high-quality knowledge from labeled data to unlabeled data and promotes separation between classes in feature space.
The low-level module encourages low-quality features to learn from high-quality features across peer networks.
- Score: 14.015346488847902
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Convolutional neural networks can achieve remarkable performance in semantic
segmentation tasks. However, such approaches rely heavily on costly pixel-level
annotation. Semi-supervised learning is a promising solution to this issue, but
its performance still falls far behind that of its fully supervised counterpart.
This work proposes a cross-teacher training framework with three modules that
significantly improves traditional semi-supervised learning approaches. The core
is a cross-teacher module, which simultaneously reduces the coupling among peer
networks and the error accumulation between teacher and student networks. In
addition, we propose two complementary contrastive learning modules. The
high-level module transfers high-quality knowledge from labeled data to
unlabeled data and promotes separation between classes in feature space. The
low-level module encourages low-quality features to learn from high-quality
features across peer networks. In experiments, the cross-teacher module
significantly improves the performance of traditional student-teacher
approaches, and our framework outperforms state-of-the-art methods on benchmark
datasets. The source code of CTT will be released.
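The abstract describes two student-teacher pairs in which each student learns from the pseudo-labels of the *other* pair's teacher, while each teacher tracks its own student by exponential moving average. The following NumPy sketch illustrates that cross-supervision pattern only; the linear "networks", learning rate, and momentum value are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def ema_update(teacher_w, student_w, momentum=0.99):
    """Teacher weights track the student via an exponential moving average."""
    return momentum * teacher_w + (1.0 - momentum) * student_w

def pseudo_label(teacher_w, x):
    """Toy 'network': a linear map followed by an argmax over two classes."""
    return (x @ teacher_w).argmax(axis=1)

# Two peer student-teacher pairs; on unlabeled data, each student is
# supervised by the teacher of the *other* pair (the cross-teacher idea),
# which reduces coupling inside each pair.
rng = np.random.default_rng(0)
students = [rng.normal(size=(4, 2)), rng.normal(size=(4, 2))]
teachers = [w.copy() for w in students]
x_unlabeled = rng.normal(size=(8, 4))

for step in range(10):
    for i in (0, 1):
        j = 1 - i  # the peer pair
        targets = pseudo_label(teachers[j], x_unlabeled)  # cross supervision
        # One gradient step on a softmax cross-entropy surrogate (sketch only).
        logits = x_unlabeled @ students[i]
        probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
        grad_logits = probs.copy()
        grad_logits[np.arange(len(targets)), targets] -= 1.0
        students[i] -= 0.1 * x_unlabeled.T @ grad_logits / len(targets)
        teachers[i] = ema_update(teachers[i], students[i])
```

In a real segmentation setting, the linear maps would be segmentation networks and the pseudo-labels would be per-pixel; the update pattern, however, stays the same.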
Related papers
- Exploiting Minority Pseudo-Labels for Semi-Supervised Semantic Segmentation in Autonomous Driving [2.638145329894673]
We propose a professional training module to enhance minority class learning and a general training module to learn more comprehensive semantic information.
In experiments, our framework demonstrates superior performance compared to state-of-the-art methods on benchmark datasets.
arXiv Detail & Related papers (2024-09-19T11:47:25Z)
- HPFF: Hierarchical Locally Supervised Learning with Patch Feature Fusion [7.9514535887836795]
We propose a novel model that performs hierarchical locally supervised learning and patch-level feature fusion on auxiliary networks.
We conduct experiments on the CIFAR-10, STL-10, SVHN, and ImageNet datasets, and the results demonstrate that our proposed HPFF significantly outperforms previous approaches.
arXiv Detail & Related papers (2024-07-08T06:05:19Z)
- Competitive Ensembling Teacher-Student Framework for Semi-Supervised Left Atrium MRI Segmentation [8.338801567668233]
Semi-supervised learning has greatly advanced medical image segmentation, since it effectively alleviates the need to acquire abundant annotations from experts.
In this paper, we present a simple yet efficient competitive ensembling teacher-student framework for semi-supervised left atrium segmentation from 3D MR images.
arXiv Detail & Related papers (2023-10-21T09:23:34Z)
- Deep Image Clustering with Contrastive Learning and Multi-scale Graph Convolutional Networks [58.868899595936476]
This paper presents a new deep clustering approach termed image clustering with contrastive learning and multi-scale graph convolutional networks (IcicleGCN).
Experiments on multiple image datasets demonstrate the superior clustering performance of IcicleGCN over the state-of-the-art.
arXiv Detail & Related papers (2022-07-14T19:16:56Z)
- Weakly Supervised Semantic Segmentation via Alternative Self-Dual Teaching [82.71578668091914]
This paper establishes a compact learning framework that embeds the classification and mask-refinement components into a unified deep model.
We propose a novel alternative self-dual teaching (ASDT) mechanism to encourage high-quality knowledge interaction.
arXiv Detail & Related papers (2021-12-17T11:56:56Z)
- Semi-supervised Left Atrium Segmentation with Mutual Consistency Training [60.59108570938163]
We propose a novel Mutual Consistency Network (MC-Net) for semi-supervised left atrium segmentation from 3D MR images.
Our MC-Net consists of one encoder and two slightly different decoders, and the prediction discrepancy between the two decoders is used as an unsupervised loss.
We evaluate MC-Net on the public Left Atrium (LA) database, where it obtains impressive performance gains by exploiting unlabeled data effectively.
arXiv Detail & Related papers (2021-03-04T09:34:32Z)
- A Combinatorial Perspective on Transfer Learning [27.7848044115664]
We study how learning modular solutions can enable effective generalization to both unseen and potentially differently distributed data.
Our main postulate is that the combination of task segmentation, modular learning, and memory-based ensembling can give rise to generalization on an exponentially growing number of unseen tasks.
arXiv Detail & Related papers (2020-10-23T09:53:31Z)
- Neural Function Modules with Sparse Arguments: A Dynamic Approach to Integrating Information across Layers [84.57980167400513]
Neural Function Modules (NFM) aim to introduce this structural capability into deep learning.
Most prior work on feed-forward networks that combine top-down and bottom-up feedback is limited to classification problems.
The key contribution of our work is to combine attention, sparsity, and top-down and bottom-up feedback in a flexible algorithm.
arXiv Detail & Related papers (2020-10-15T20:43:17Z)
- ReMarNet: Conjoint Relation and Margin Learning for Small-Sample Image Classification [49.87503122462432]
We introduce a novel neural network termed Relation-and-Margin learning Network (ReMarNet).
Our method assembles two networks with different backbones to learn features that perform well under both of the aforementioned classification mechanisms.
Experiments on four image datasets demonstrate that our approach is effective in learning discriminative features from a small set of labeled samples.
arXiv Detail & Related papers (2020-06-27T13:50:20Z)
- Efficient Crowd Counting via Structured Knowledge Transfer [122.30417437707759]
Crowd counting is an application-oriented task whose inference efficiency is crucial for real-world applications.
We propose a novel Structured Knowledge Transfer framework to generate a lightweight yet highly effective student network.
Our models obtain at least a 6.5× speed-up on an Nvidia 1080 GPU and even achieve state-of-the-art performance.
arXiv Detail & Related papers (2020-03-23T08:05:41Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.