Neighborhood Spatial Aggregation MC Dropout for Efficient
Uncertainty-aware Semantic Segmentation in Point Clouds
- URL: http://arxiv.org/abs/2201.07676v1
- Date: Sun, 5 Dec 2021 02:22:32 GMT
- Title: Neighborhood Spatial Aggregation MC Dropout for Efficient
Uncertainty-aware Semantic Segmentation in Point Clouds
- Authors: Chao Qi and Jianqin Yin
- Abstract summary: Uncertainty-aware semantic segmentation of point clouds includes predictive uncertainty estimation and uncertainty-guided model optimization.
The widely-used MC dropout establishes the distribution by computing the standard deviation of samples using multiple forward propagations.
A framework embedded with NSA-MC dropout, a variant of MC dropout, is proposed to establish distributions in just one forward pass.
- Score: 8.98036662506975
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Uncertainty-aware semantic segmentation of point clouds includes
predictive uncertainty estimation and uncertainty-guided model optimization.
One key challenge in this task is the efficiency of establishing point-wise
predictive distributions. The widely used MC dropout establishes the
distribution by computing the standard deviation of samples drawn from
multiple stochastic forward propagations, which is time-consuming for tasks on
point clouds containing massive numbers of points. Hence, a framework embedded
with NSA-MC dropout, a variant of MC dropout, is proposed to establish the
distributions in just one forward pass. Specifically, NSA-MC dropout samples
the model many times in a space-dependent way, producing point-wise
distributions by aggregating the stochastic inference results of neighbors.
Based on this, aleatoric and predictive uncertainties are obtained from the
predictive distribution. The aleatoric uncertainty is integrated into the loss
function to penalize noisy points, mitigating over-fitting of the model to
some degree. In addition, the predictive uncertainty quantifies the confidence
of the predictions. Experimental results show that our framework obtains better
segmentation results on real-world point clouds and efficiently quantifies the
credibility of those results. Our NSA-MC dropout is several times faster than
MC dropout, and its inference time is decoupled from the number of sampling
times. The code will be available if the paper is accepted.
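To make the single-pass idea concrete, here is a minimal PyTorch-style sketch of the neighborhood aggregation step. It assumes per-point logits from one dropout-active forward pass; the function name, the brute-force k-NN construction, and the entropy-based uncertainty measures are illustrative assumptions rather than the paper's exact formulation (the paper derives uncertainty from the standard deviation of samples).

```python
import torch
import torch.nn.functional as F

def nsa_mc_dropout(logits, xyz, k=8):
    """Sketch of neighborhood spatial aggregation (NSA) MC dropout.

    logits: (N, C) point-wise class logits from ONE forward pass with
            dropout kept active at test time.
    xyz:    (N, 3) point coordinates.
    Each point's k nearest spatial neighbors are treated as stochastic
    samples, so a predictive distribution is built without repeating
    forward passes.
    """
    dist = torch.cdist(xyz, xyz)                    # (N, N) pairwise distances
    knn = dist.topk(k, largest=False).indices       # (N, k) neighbor indices

    probs = F.softmax(logits, dim=-1)               # (N, C)
    neighbor_probs = probs[knn]                     # (N, k, C) samples per point

    mean_probs = neighbor_probs.mean(dim=1)         # aggregated prediction (N, C)
    # Predictive uncertainty: entropy of the aggregated distribution.
    predictive = -(mean_probs * mean_probs.clamp_min(1e-8).log()).sum(-1)
    # Aleatoric uncertainty: mean entropy of the individual samples.
    sample_ent = -(neighbor_probs
                   * neighbor_probs.clamp_min(1e-8).log()).sum(-1)   # (N, k)
    aleatoric = sample_ent.mean(dim=1)
    return mean_probs, predictive, aleatoric
```

The aleatoric term could then be folded into the training loss, for instance by down-weighting points whose neighborhoods disagree strongly.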
Related papers
- Favour: FAst Variance Operator for Uncertainty Rating [0.034530027457862]
Bayesian Neural Networks (BNN) have emerged as a crucial approach for interpreting ML predictions.
By sampling from the posterior distribution, data scientists may estimate the uncertainty of an inference.
Previous work proposed propagating the first and second moments of the posterior directly through the network.
This method is even slower than sampling, so the propagated variance needs to be approximated.
Our contribution is a more principled variance propagation framework.
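As a toy illustration of what moment propagation means (not Favour's actual operator), the first two moments can be pushed through a single linear layer in closed form when input dimensions are treated as independent:

```python
import numpy as np

def propagate_linear(mu, var, W, b):
    """Push the mean and (diagonal) variance of an input distribution
    through a linear layer y = W x + b, assuming independent inputs."""
    mu_out = W @ mu + b           # E[Wx + b] = W E[x] + b
    var_out = (W ** 2) @ var      # Var[Wx] = W^2 Var[x] under independence
    return mu_out, var_out
```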
arXiv Detail & Related papers (2023-11-21T22:53:20Z)
- SMURF-THP: Score Matching-based UnceRtainty quantiFication for Transformer Hawkes Process [76.98721879039559]
We propose SMURF-THP, a score-based method for learning Transformer Hawkes process and quantifying prediction uncertainty.
Specifically, SMURF-THP learns the score function of events' arrival time based on a score-matching objective.
We conduct extensive experiments in both event type prediction and uncertainty quantification of arrival time.
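The score-matching flavor of the objective can be sketched with a generic denoising score-matching loss over arrival times; `score_fn`, the Gaussian perturbation, and the fixed noise scale are assumptions for illustration, not SMURF-THP's exact objective:

```python
import torch

def denoising_score_matching_loss(score_fn, t, sigma=0.1):
    """Generic denoising score-matching loss on event arrival times t.
    score_fn(t_noisy) models d/dt log p(t_noisy | history)."""
    noise = torch.randn_like(t) * sigma
    t_noisy = t + noise
    target = -noise / sigma ** 2   # score of the Gaussian perturbation kernel
    return ((score_fn(t_noisy) - target) ** 2).mean()
```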
arXiv Detail & Related papers (2023-10-25T03:33:45Z)
- Score Matching-based Pseudolikelihood Estimation of Neural Marked Spatio-Temporal Point Process with Uncertainty Quantification [59.81904428056924]
We introduce SMASH: a Score MAtching estimator for learning marked spatio-temporal point processes (STPPs) with uncertainty quantification.
Specifically, our framework adopts a normalization-free objective by estimating the pseudolikelihood of marked STPPs through score-matching.
The superior performance of our proposed framework is demonstrated through extensive experiments in both event prediction and uncertainty quantification.
arXiv Detail & Related papers (2023-10-25T02:37:51Z)
- Distributional Shift-Aware Off-Policy Interval Estimation: A Unified Error Quantification Framework [8.572441599469597]
We study high-confidence off-policy evaluation in the context of infinite-horizon Markov decision processes.
The objective is to establish a confidence interval (CI) for the target policy value using only offline data pre-collected from unknown behavior policies.
We show that our algorithm is sample-efficient, error-robust, and provably convergent even in non-linear function approximation settings.
arXiv Detail & Related papers (2023-09-23T06:35:44Z)
- ZigZag: Universal Sampling-free Uncertainty Estimation Through Two-Step Inference [54.17205151960878]
We introduce a sampling-free approach that is generic and easy to deploy.
We produce reliable uncertainty estimates on par with state-of-the-art methods at a significantly lower computational cost.
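One way such a two-step scheme can look (the `model(x, feedback)` interface and the zero placeholder are assumptions, not the paper's exact design): run the network once with a neutral placeholder, run it again conditioned on its own first prediction, and use the disagreement as the uncertainty score.

```python
import torch

def two_step_uncertainty(model, x, out_dim):
    """Hypothetical sampling-free, two-pass uncertainty estimate."""
    neutral = torch.zeros(x.shape[0], out_dim)   # pass 1: no feedback signal
    y1 = model(x, neutral)
    y2 = model(x, y1.detach())                   # pass 2: fed its own prediction
    uncertainty = (y1 - y2).abs().mean(dim=-1)   # disagreement as the score
    return y1, uncertainty
```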
arXiv Detail & Related papers (2022-11-21T13:23:09Z)
- NUQ: Nonparametric Uncertainty Quantification for Deterministic Neural Networks [151.03112356092575]
We show a principled way to measure the uncertainty of predictions for a classifier based on Nadaraya-Watson's nonparametric estimate of the conditional label distribution.
We demonstrate the strong performance of the method in uncertainty estimation tasks on a variety of real-world image datasets.
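The underlying estimator is simple to write down; a minimal sketch with a Gaussian kernel follows (the kernel choice and bandwidth h are illustrative):

```python
import numpy as np

def nadaraya_watson_label_dist(x, X_train, y_train, n_classes, h=1.0):
    """Nadaraya-Watson estimate of the conditional label distribution
    p(y | x): kernel-weighted frequencies of the training labels."""
    # Gaussian kernel weights between the query and every training point.
    w = np.exp(-np.sum((X_train - x) ** 2, axis=1) / (2 * h ** 2))
    p = np.array([w[y_train == c].sum() for c in range(n_classes)])
    p /= max(w.sum(), 1e-12)   # normalize into a distribution
    return p                   # uncertainty can be scored, e.g., via entropy
```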
arXiv Detail & Related papers (2022-02-07T12:30:45Z)
- Quantifying Uncertainty in Deep Spatiotemporal Forecasting [67.77102283276409]
We describe two types of forecasting problems: regular grid-based and graph-based.
We analyze UQ methods from both the Bayesian and the frequentist points of view, casting them in a unified framework via statistical decision theory.
Through extensive experiments on real-world road network traffic, epidemics, and air quality forecasting tasks, we reveal the statistical and computational trade-offs for different UQ methods.
arXiv Detail & Related papers (2021-05-25T14:35:46Z)
- Contextual Dropout: An Efficient Sample-Dependent Dropout Module [60.63525456640462]
Dropout has been demonstrated as a simple and effective module to regularize the training process of deep neural networks.
We propose contextual dropout with an efficient structural design as a simple and scalable sample-dependent dropout module.
Our experimental results show that the proposed method outperforms baseline methods in terms of both accuracy and quality of uncertainty estimation.
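A minimal sketch of a sample-dependent dropout module: a small network predicts per-unit keep probabilities from the input itself, and a relaxed-Bernoulli (concrete) mask keeps training differentiable. The architecture and relaxation here are plausible choices for illustration, not the paper's exact design.

```python
import torch
import torch.nn as nn

class ContextualDropout(nn.Module):
    """Dropout whose probabilities depend on the input sample."""
    def __init__(self, dim, temperature=0.1):
        super().__init__()
        self.prob_net = nn.Sequential(nn.Linear(dim, dim), nn.Sigmoid())
        self.temperature = temperature

    def forward(self, h):
        keep_prob = self.prob_net(h).clamp(1e-4, 1 - 1e-4)   # per-unit, per-sample
        if self.training:
            # Binary concrete (Gumbel-sigmoid) relaxation of a Bernoulli mask.
            u = torch.rand_like(keep_prob).clamp(1e-6, 1 - 1e-6)
            logits = (keep_prob.log() - (1 - keep_prob).log()
                      + u.log() - (1 - u).log())
            mask = torch.sigmoid(logits / self.temperature)
        else:
            mask = keep_prob                  # expected mask at test time
        return h * mask / keep_prob           # inverted-dropout scaling
```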
arXiv Detail & Related papers (2021-03-06T19:30:32Z)
- A Novel Regression Loss for Non-Parametric Uncertainty Optimization [7.766663822644739]
Quantification of uncertainty is one of the most promising approaches to establish safe machine learning.
One of the most commonly used approaches so far is Monte Carlo dropout, which is computationally cheap and easy to apply in practice.
We propose a new objective, referred to as second-moment loss (SML), to address this issue.
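An illustrative form of such an objective (the paper's exact weighting and parameterization may differ): fit the mean with squared error while training a variance head to match the observed squared residual, i.e., the second moment of the error.

```python
import torch

def second_moment_loss(mu, var, y):
    """Illustrative second-moment-style regression objective."""
    residual_sq = (y - mu) ** 2
    mean_term = residual_sq.mean()
    # Fit the predicted variance to the squared residual; detaching stops
    # the variance head from distorting the mean prediction.
    var_term = ((var - residual_sq.detach()) ** 2).mean()
    return mean_term + var_term
```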
arXiv Detail & Related papers (2021-01-07T19:12:06Z)
- Single Shot MC Dropout Approximation [0.0]
We present a single shot MC dropout approximation that preserves the advantages of BDNNs without being slower than a DNN.
Our approach analytically approximates, for each layer of a fully connected network, the expected value and the variance of the MC dropout signal.
We demonstrate that our single shot MC dropout approximation closely matches both the point estimate and the uncertainty estimate of the predictive distribution.
arXiv Detail & Related papers (2020-07-07T09:17:17Z)
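For a fully connected layer with inverted dropout, that layer-wise moment propagation can be sketched in closed form; unit independence is assumed, and this illustrates the idea rather than reproducing the paper's derivation.

```python
import numpy as np

def single_shot_dropout_layer(mu, var, W, b, p):
    """Propagate the mean/variance of activations through inverted dropout
    (keep prob q = 1 - p, scaling 1/q) followed by a linear layer."""
    q = 1.0 - p
    mu_drop = mu                              # E[mask * x / q] = E[x]
    var_drop = var / q + (mu ** 2) * p / q    # Var[mask * x / q]
    mu_out = W @ mu_drop + b                  # linear layer on the moments
    var_out = (W ** 2) @ var_drop             # independence assumption
    return mu_out, var_out
```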