Adaptive Annotation Distribution for Weakly Supervised Point Cloud Semantic Segmentation
- URL: http://arxiv.org/abs/2312.06259v1
- Date: Mon, 11 Dec 2023 09:57:09 GMT
- Title: Adaptive Annotation Distribution for Weakly Supervised Point Cloud Semantic Segmentation
- Authors: Zhiyi Pan and Nan Zhang and Wei Gao and Shan Liu and Ge Li
- Abstract summary: We propose an adaptive annotation distribution method for weakly supervised point cloud semantic segmentation.
Specifically, we introduce the probability density function into the gradient sampling approximation analysis.
We design the multiplicative dynamic entropy as the gradient calibration function to mitigate the gradient bias caused by non-uniformly distributed sparse annotations.
- Score: 41.49585975597466
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Weakly supervised point cloud semantic segmentation has attracted a lot of
attention due to its ability to alleviate the heavy reliance on fine-grained
annotations of point clouds. However, in practice, sparse annotation usually
exhibits a distinctly non-uniform distribution in point clouds, which poses
challenges for weak supervision. To address this issue, we propose an
adaptive annotation distribution method for weakly supervised point cloud
semantic segmentation. Specifically, we introduce the probability density
function into the gradient sampling approximation analysis and investigate the
impact of sparse annotation distributions. Based on our analysis, we propose a
label-aware point cloud downsampling strategy to increase the proportion of
annotations involved in the training stage. Furthermore, we design the
multiplicative dynamic entropy as the gradient calibration function to mitigate
the gradient bias caused by non-uniformly distributed sparse annotations and
explicitly reduce the epistemic uncertainty. Without any prior restrictions and
additional information, our proposed method achieves comprehensive performance
improvements at multiple label rates with different annotation distributions on
S3DIS, ScanNetV2 and SemanticKITTI.
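The abstract names two concrete mechanisms: a label-aware downsampling that keeps annotated points in the training stage, and a multiplicative dynamic entropy that rescales the supervised gradient. No code is given here, so the PyTorch sketch below is only one plausible reading of those two ideas; the function names, the "keep all labeled points, fill with random unlabeled ones" rule, and the "1 + normalized entropy" weighting are illustrative assumptions, not the authors' actual formulation.
```python
# Hedged sketch of label-aware downsampling and a multiplicative
# dynamic-entropy loss weight; all names and functional forms are assumptions.
import torch
import torch.nn.functional as F


def label_aware_downsample(points: torch.Tensor, labels: torch.Tensor,
                           num_out: int, ignore_index: int = -1):
    """Keep every annotated point, then fill up to `num_out` with random unlabeled points."""
    labeled = (labels != ignore_index).nonzero(as_tuple=False).squeeze(1)
    unlabeled = (labels == ignore_index).nonzero(as_tuple=False).squeeze(1)
    n_fill = max(num_out - labeled.numel(), 0)
    fill = unlabeled[torch.randperm(unlabeled.numel())[:n_fill]]
    idx = torch.cat([labeled, fill])[:num_out]
    return points[idx], labels[idx]


def dynamic_entropy_weight(logits: torch.Tensor) -> torch.Tensor:
    """Per-point weight in [1, 2] derived from normalized predictive entropy."""
    probs = F.softmax(logits, dim=-1)
    entropy = -(probs * probs.clamp_min(1e-12).log()).sum(dim=-1)
    max_entropy = torch.log(torch.tensor(float(logits.shape[-1])))
    return 1.0 + entropy / max_entropy


def calibrated_loss(logits: torch.Tensor, labels: torch.Tensor,
                    ignore_index: int = -1) -> torch.Tensor:
    """Cross-entropy on annotated points, multiplicatively rescaled by dynamic entropy."""
    mask = labels != ignore_index
    ce = F.cross_entropy(logits[mask], labels[mask], reduction="none")
    weight = dynamic_entropy_weight(logits[mask].detach())  # no gradient through the weight
    return (weight * ce).mean()
```
Under this reading, annotated points the model is still uncertain about receive a larger multiplicative weight, which is one simple way to push back against the gradient bias that non-uniformly distributed sparse labels introduce.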
Related papers
- EAUWSeg: Eliminating annotation uncertainty in weakly-supervised medical image segmentation [4.334357692599945]
Weakly-supervised medical image segmentation is gaining traction as it requires only rough annotations rather than accurate pixel-to-pixel labels.
We propose a novel weak annotation method coupled with its learning framework EAUWSeg to eliminate the annotation uncertainty.
We show that EAUWSeg outperforms existing weakly-supervised segmentation methods.
arXiv Detail & Related papers (2025-01-03T06:21:02Z)
- Beyond Point Annotation: A Weakly Supervised Network Guided by Multi-Level Labels Generated from Four-Point Annotation for Thyroid Nodule Segmentation in Ultrasound Image [8.132809580086565]
We propose a weakly-supervised network that generates multi-level labels from four-point annotation to refine constraints for delicate nodule segmentation.
Our proposed network outperforms existing weakly-supervised methods on two public datasets in terms of accuracy and robustness.
arXiv Detail & Related papers (2024-10-25T06:34:53Z)
- Distribution Guidance Network for Weakly Supervised Point Cloud Semantic Segmentation [40.17482809009576]
We introduce a novel perspective that imparts auxiliary constraints by regulating the feature space under weak supervision.
We develop a Distribution Guidance Network (DGNet), which comprises a weakly supervised learning branch and a distribution alignment branch.
arXiv Detail & Related papers (2024-10-10T16:33:27Z)
- Annotation-Efficient Polyp Segmentation via Active Learning [45.59503015577479]
We propose a deep active learning framework for annotation-efficient polyp segmentation.
In practice, we measure the uncertainty of each sample by examining the similarity between features masked by the prediction map of the polyp and the background area.
We show that our proposed method achieves state-of-the-art performance compared to other competitors on both a public dataset and a large-scale in-house dataset.
arXiv Detail & Related papers (2024-03-21T12:25:17Z)
- Towards the Uncharted: Density-Descending Feature Perturbation for Semi-supervised Semantic Segmentation [51.66997548477913]
We propose a novel feature-level consistency learning framework named Density-Descending Feature Perturbation (DDFP).
Inspired by the low-density separation assumption in semi-supervised learning, our key insight is that feature density can shed light on the most promising direction for the segmentation classifier to explore.
The proposed DDFP outperforms other feature-level perturbation designs and shows state-of-the-art performance on both the Pascal VOC and Cityscapes datasets.
arXiv Detail & Related papers (2024-03-11T06:59:05Z)
- Regressor-Segmenter Mutual Prompt Learning for Crowd Counting [70.49246560246736]
We propose mutual prompt learning (mPrompt) to solve bias and inaccuracy caused by annotation variance.
Experiments show that mPrompt significantly reduces the Mean Absolute Error (MAE).
arXiv Detail & Related papers (2023-12-04T07:53:59Z)
- Multi-View Knowledge Distillation from Crowd Annotations for Out-of-Domain Generalization [53.24606510691877]
We propose new methods for acquiring soft-labels from crowd-annotations by aggregating the distributions produced by existing methods.
We demonstrate that these aggregation methods lead to the most consistent performance across four NLP tasks on out-of-domain test sets.
arXiv Detail & Related papers (2022-12-19T12:40:18Z)
- Weakly Supervised Semantic Segmentation for Large-Scale Point Cloud [69.36717778451667]
Existing methods for large-scale point cloud semantic segmentation require expensive, tedious and error-prone manual point-wise annotations.
We propose an effective weakly supervised method containing two components to solve the problem.
The experimental results show a large gain over existing weakly supervised methods and results comparable to fully supervised methods.
arXiv Detail & Related papers (2022-12-09T09:42:26Z)
- End-to-End Label Uncertainty Modeling in Speech Emotion Recognition using Bayesian Neural Networks and Label Distribution Learning [0.0]
We propose an end-to-end Bayesian neural network capable of being trained on a distribution of annotations to capture the subjectivity-based label uncertainty.
We show that the proposed t-distribution based approach achieves state-of-the-art uncertainty modeling results in speech emotion recognition.
arXiv Detail & Related papers (2022-09-30T12:55:43Z)
- Bayesian Graph Contrastive Learning [55.36652660268726]
We propose a novel perspective on graph contrastive learning methods, showing that random augmentations lead to stochastic encoders.
Our proposed method represents each node by a distribution in the latent space in contrast to existing techniques which embed each node to a deterministic vector.
We show a considerable improvement in performance compared to existing state-of-the-art methods on several benchmark datasets.
arXiv Detail & Related papers (2021-12-15T01:45:32Z)
- Guided Point Contrastive Learning for Semi-supervised Point Cloud Semantic Segmentation [90.2445084743881]
We present a method for semi-supervised point cloud semantic segmentation that adopts unlabeled point clouds during training to boost model performance.
Inspired by the recent contrastive loss in self-supervised tasks, we propose the guided point contrastive loss to enhance the feature representation and model generalization ability.
arXiv Detail & Related papers (2021-10-15T16:38:54Z)
- Unsupervised Embedding Learning from Uncertainty Momentum Modeling [37.674449317054716]
We propose a novel solution to explicitly model and explore the uncertainty of the given unlabeled learning samples.
We leverage such uncertainty modeling momentum during learning, which is helpful for tackling outliers.
arXiv Detail & Related papers (2021-07-19T14:06:19Z)
- Learning from Crowds with Sparse and Imbalanced Annotations [29.596070201105274]
Crowdsourcing has established itself as an efficient labeling solution by resorting to non-expert crowds.
One common practice is to distribute each instance to multiple workers, whereas each worker only annotates a subset of the data, resulting in the sparse annotation phenomenon.
We propose a self-training based approach named Self-Crowd that progressively adds confident pseudo-annotations and rebalances the annotation distribution.
arXiv Detail & Related papers (2021-07-11T13:06:20Z)
- SQN: Weakly-Supervised Semantic Segmentation of Large-Scale 3D Point Clouds [69.97213386812969]
We propose a new weak supervision method to implicitly augment highly sparse supervision signals.
The proposed Semantic Query Network (SQN) achieves promising performance on seven large-scale open datasets.
SQN requires only 0.1% randomly annotated points for training, greatly reducing annotation cost and effort.
arXiv Detail & Related papers (2021-04-11T01:29:50Z)
- Scribble-Supervised Semantic Segmentation by Uncertainty Reduction on Neural Representation and Self-Supervision on Neural Eigenspace [21.321005898976253]
Scribble-supervised semantic segmentation has gained much attention recently for its promising performance without high-quality annotations.
This work aims to achieve semantic segmentation directly from scribble annotations, without extra information or other limitations.
We propose holistic operations, including entropy minimization and a network-embedded random walk on the neural representation, to reduce uncertainty.
arXiv Detail & Related papers (2021-02-19T12:33:57Z)
- Deep Semi-supervised Knowledge Distillation for Overlapping Cervical Cell Instance Segmentation [54.49894381464853]
We propose to leverage both labeled and unlabeled data for instance segmentation with improved accuracy by knowledge distillation.
We propose a novel Mask-guided Mean Teacher framework with Perturbation-sensitive Sample Mining.
Experiments show that the proposed method improves the performance significantly compared with the supervised method learned from labeled data only.
arXiv Detail & Related papers (2020-07-21T13:27:09Z)