Distributional Gaussian Process Layers for Outlier Detection in Image
Segmentation
- URL: http://arxiv.org/abs/2104.13756v1
- Date: Wed, 28 Apr 2021 13:37:10 GMT
- Title: Distributional Gaussian Process Layers for Outlier Detection in Image
Segmentation
- Authors: Sebastian G. Popescu, David J. Sharp, James H. Cole, Konstantinos
Kamnitsas, Ben Glocker
- Abstract summary: We propose a parameter efficient Bayesian layer for hierarchical convolutional Gaussian Processes.
Our experiments on brain tissue-segmentation show that the resulting architecture approaches the performance of well-established deterministic segmentation algorithms.
Our uncertainty estimates result in out-of-distribution detection that outperforms the capabilities of previous Bayesian networks.
- Score: 15.086527565572073
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We propose a parameter-efficient Bayesian layer for hierarchical
convolutional Gaussian Processes that incorporates Gaussian Processes operating
in Wasserstein-2 space to reliably propagate uncertainty. This directly
replaces convolving Gaussian Processes with a distance-preserving affine
operator on distributions. Our experiments on brain tissue segmentation show
that the resulting architecture approaches the performance of well-established
deterministic segmentation algorithms (U-Net), a level of performance not
previously reached by hierarchical Gaussian Processes. Moreover, by applying
the same segmentation model to out-of-distribution data (i.e., images with
pathology such as brain tumors), we show that our uncertainty estimates result
in out-of-distribution detection that outperforms previous Bayesian networks
and reconstruction-based approaches that learn normative distributions.
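One piece of background that makes layers operating in Wasserstein-2 space tractable is that the Wasserstein-2 distance between two Gaussians has a closed form. The snippet below is a minimal illustration of that formula only, not the proposed layer; the function name and the toy values are ours.

```python
import numpy as np
from scipy.linalg import sqrtm

def gaussian_w2_squared(m1, S1, m2, S2):
    """Squared Wasserstein-2 distance between N(m1, S1) and N(m2, S2).

    Closed form: ||m1 - m2||^2 + Tr(S1 + S2 - 2 (S2^{1/2} S1 S2^{1/2})^{1/2}).
    """
    root_S2 = sqrtm(S2)
    cross = sqrtm(root_S2 @ S1 @ root_S2)
    mean_term = float(np.sum((m1 - m2) ** 2))
    cov_term = float(np.trace(S1 + S2 - 2.0 * np.real(cross)))
    return mean_term + cov_term

# Toy check with two 2-D Gaussians (values are arbitrary).
m1, S1 = np.zeros(2), np.eye(2)
m2, S2 = np.ones(2), 2.0 * np.eye(2)
print(gaussian_w2_squared(m1, S1, m2, S2))  # ~2.34 for these toy values
```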
Related papers
- Distributed Markov Chain Monte Carlo Sampling based on the Alternating
Direction Method of Multipliers [143.6249073384419]
In this paper, we propose a distributed sampling scheme based on the alternating direction method of multipliers.
We provide both theoretical guarantees of our algorithm's convergence and experimental evidence of its superiority to the state-of-the-art.
In simulation, we deploy our algorithm on linear and logistic regression tasks and illustrate its fast convergence compared to existing gradient-based methods.
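For background only, the sketch below applies consensus ADMM to a distributed least-squares (linear regression) problem split across workers; it illustrates the local-update / consensus / dual-update loop such methods build on, not the proposed sampling scheme. All variable names, sizes, and data are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic regression data split across K workers (sizes are arbitrary).
K, n, d = 4, 50, 3
x_true = rng.normal(size=d)
A = [rng.normal(size=(n, d)) for _ in range(K)]
b = [Ai @ x_true + 0.1 * rng.normal(size=n) for Ai in A]

rho = 1.0
x = [np.zeros(d) for _ in range(K)]   # local estimates
u = [np.zeros(d) for _ in range(K)]   # scaled dual variables
z = np.zeros(d)                       # global consensus variable

for _ in range(100):
    # Local step: each worker solves a small regularized least-squares system.
    for i in range(K):
        x[i] = np.linalg.solve(A[i].T @ A[i] + rho * np.eye(d),
                               A[i].T @ b[i] + rho * (z - u[i]))
    # Consensus step: average the dual-corrected local estimates.
    z = np.mean([x[i] + u[i] for i in range(K)], axis=0)
    # Dual step: accumulate each worker's disagreement with the consensus.
    for i in range(K):
        u[i] += x[i] - z

print("consensus estimate:", np.round(z, 3))
print("ground truth:      ", np.round(x_true, 3))
```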
arXiv Detail & Related papers (2024-01-29T02:08:40Z)
- Implicit Variational Inference for High-Dimensional Posteriors [7.924706533725115]
In variational inference, the benefits of Bayesian models rely on accurately capturing the true posterior distribution.
We propose using neural samplers that specify implicit distributions, which are well-suited for approximating complex multimodal and correlated posteriors.
Our approach introduces novel bounds for approximate inference using implicit distributions by locally linearising the neural sampler.
arXiv Detail & Related papers (2023-10-10T14:06:56Z)
- Posterior Contraction Rates for Matérn Gaussian Processes on Riemannian Manifolds [51.68005047958965]
We show that intrinsic Gaussian processes can achieve better performance in practice.
Our work shows that finer-grained analyses are needed to distinguish between different levels of data-efficiency.
arXiv Detail & Related papers (2023-09-19T20:30:58Z)
- Towards Better Certified Segmentation via Diffusion Models [62.21617614504225]
Segmentation models can be vulnerable to adversarial perturbations, which hinders their use in safety-critical domains such as healthcare or autonomous driving.
Recently, randomized smoothing has been proposed to certify segmentation predictions by adding Gaussian noise to the input to obtain theoretical guarantees.
In this paper, we address the problem of certifying segmentation prediction using a combination of randomized smoothing and diffusion models.
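For context, the smoothing step alone can be sketched as a per-pixel majority vote over Gaussian-perturbed copies of the input. The snippet below is a minimal illustration, not the paper's diffusion-based method; `base_model`, the shapes, and the class count are hypothetical.

```python
import numpy as np

def smoothed_segmentation(base_model, image, sigma=0.25, n_samples=100, n_classes=4):
    """Per-pixel majority vote of `base_model` over Gaussian-perturbed copies of `image`.

    `base_model(x)` is assumed to return an (H, W) array of hard class labels.
    """
    h, w = image.shape[:2]
    votes = np.zeros((n_classes, h, w), dtype=np.int64)
    for _ in range(n_samples):
        noisy = image + sigma * np.random.randn(*image.shape)
        labels = base_model(noisy)
        for c in range(n_classes):
            votes[c] += (labels == c)
    # A Cohen-et-al.-style certificate is derived from the margin between the
    # top two vote counts; here we only return the majority-vote labels.
    return votes.argmax(axis=0)
```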
arXiv Detail & Related papers (2023-06-16T16:30:39Z)
- Distributional Gaussian Processes Layers for Out-of-Distribution Detection [18.05109901753853]
It is unclear whether out-of-distribution detection models reliant on deep neural networks are suitable for detecting domain shifts in medical imaging.
We propose a parameter-efficient Bayesian layer for hierarchical convolutional Gaussian Processes that incorporates Gaussian Processes operating in Wasserstein-2 space.
Our uncertainty estimates result in out-of-distribution detection that outperforms the capabilities of previous Bayesian networks.
arXiv Detail & Related papers (2022-06-27T14:49:48Z)
- Gaussian Processes and Statistical Decision-making in Non-Euclidean Spaces [96.53463532832939]
We develop techniques for broadening the applicability of Gaussian processes.
We introduce a wide class of efficient approximations built from this viewpoint.
We develop a collection of Gaussian process models over non-Euclidean spaces.
arXiv Detail & Related papers (2022-02-22T01:42:57Z)
- Pathwise Conditioning of Gaussian Processes [72.61885354624604]
Conventional approaches for simulating Gaussian process posteriors view samples as draws from marginal distributions of process values at finite sets of input locations.
This distribution-centric characterization leads to generative strategies that scale cubically in the size of the desired random vector.
We show how this pathwise interpretation of conditioning gives rise to a general family of approximations that lend themselves to efficiently sampling Gaussian process posteriors.
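At the core of the pathwise view is Matheron's rule: a posterior sample equals a prior sample plus a data-driven correction, f_post(x*) = f(x*) + k(x*, X)(K + sigma^2 I)^{-1}(y - f(X) - eps). A minimal NumPy sketch under our own illustrative choices (RBF kernel, toy 1-D data):

```python
import numpy as np

def rbf(a, b, lengthscale=0.2):
    """RBF kernel matrix for 1-D inputs a and b."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / lengthscale ** 2)

rng = np.random.default_rng(0)
X = np.linspace(0.0, 1.0, 10)                      # training inputs
y = np.sin(2 * np.pi * X) + 0.1 * rng.normal(size=X.size)
Xs = np.linspace(0.0, 1.0, 100)                    # test inputs
noise = 0.1 ** 2

# One joint prior draw over the training and test locations.
Z = np.concatenate([X, Xs])
K = rbf(Z, Z) + 1e-6 * np.eye(Z.size)              # jitter for stability
f = np.linalg.cholesky(K) @ rng.normal(size=Z.size)
f_train, f_test = f[:X.size], f[X.size:]
eps = np.sqrt(noise) * rng.normal(size=X.size)     # simulated observation noise

# Matheron's rule: posterior sample = prior sample + pathwise update.
Kxx = rbf(X, X) + noise * np.eye(X.size)
update = rbf(Xs, X) @ np.linalg.solve(Kxx, y - (f_train + eps))
posterior_sample = f_test + update                 # one posterior draw at Xs
```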
arXiv Detail & Related papers (2020-11-08T17:09:37Z)
- Beyond the Mean-Field: Structured Deep Gaussian Processes Improve the Predictive Uncertainties [12.068153197381575]
We propose a novel variational family that allows for retaining covariances between latent processes while achieving fast convergence.
We provide an efficient implementation of our new approach and apply it to several benchmark datasets.
It yields excellent results and strikes a better balance between accuracy and calibrated uncertainty estimates than its state-of-the-art alternatives.
arXiv Detail & Related papers (2020-05-22T11:10:59Z)
- Efficiently Sampling Functions from Gaussian Process Posteriors [76.94808614373609]
We propose an easy-to-use and general-purpose approach for fast posterior sampling.
We demonstrate how decoupled sample paths accurately represent Gaussian process posteriors at a fraction of the usual cost.
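A minimal sketch of the decoupled idea, under our own illustrative assumptions: approximate the prior draw with random Fourier features (here for an RBF kernel), then add the same pathwise update used in the sketch above, so that the cubic-cost solve touches only the training points and evaluating the posterior draw at new inputs stays cheap.

```python
import numpy as np

rng = np.random.default_rng(1)
lengthscale, noise = 0.2, 0.1 ** 2

# Random Fourier features approximating a GP prior with an RBF kernel:
# f(x) ~ sqrt(2/m) * sum_j w_j * cos(omega_j * x + phase_j), w_j ~ N(0, 1).
m = 500
omega = rng.normal(scale=1.0 / lengthscale, size=m)
phase = rng.uniform(0.0, 2.0 * np.pi, size=m)
w = rng.normal(size=m)

def prior_sample(x):
    feats = np.sqrt(2.0 / m) * np.cos(np.outer(x, omega) + phase)
    return feats @ w

def rbf(a, b, ls=lengthscale):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls ** 2)

X = np.linspace(0.0, 1.0, 10)
y = np.sin(2 * np.pi * X) + 0.1 * rng.normal(size=X.size)
Xs = np.linspace(0.0, 1.0, 100)

# Pathwise update: the solve involves only the 10 training points; new test
# locations reuse the precomputed vector v.
eps = np.sqrt(noise) * rng.normal(size=X.size)
v = np.linalg.solve(rbf(X, X) + noise * np.eye(X.size), y - (prior_sample(X) + eps))
posterior_at_test = prior_sample(Xs) + rbf(Xs, X) @ v
```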
arXiv Detail & Related papers (2020-02-21T14:03:16Z)