Effect of latent space distribution on the segmentation of images with
multiple annotations
- URL: http://arxiv.org/abs/2304.13476v1
- Date: Wed, 26 Apr 2023 12:00:00 GMT
- Title: Effect of latent space distribution on the segmentation of images with
multiple annotations
- Authors: Ishaan Bhat and Josien P.W. Pluim and Max A. Viergever and Hugo J.
Kuijf
- Abstract summary: We propose the Generalized Probabilistic U-Net, which extends the Probabilistic U-Net by allowing more general forms of the Gaussian distribution as the latent space distribution.
We study the effect the choice of latent space distribution has on capturing the variation in the reference segmentations for lung tumors and white matter hyperintensities in the brain.
- Score: 5.054729045700466
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We propose the Generalized Probabilistic U-Net, which extends the
Probabilistic U-Net by allowing more general forms of the Gaussian distribution
as the latent space distribution that can better approximate the uncertainty in
the reference segmentations. We study the effect the choice of latent space
distribution has on capturing the variation in the reference segmentations for
lung tumors and white matter hyperintensities in the brain. We show that the
choice of distribution affects the sample diversity of the predictions and
their overlap with respect to the reference segmentations. We have made our
implementation available at
https://github.com/ishaanb92/GeneralizedProbabilisticUNet
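The core idea, replacing the axis-aligned Gaussian latent of the Probabilistic U-Net with a more general Gaussian, can be sketched in plain NumPy. This is a hypothetical illustration, not the authors' implementation (see their repository above); the encoder networks that would predict the mean and covariance parameters are omitted and replaced with random placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)
latent_dim = 6

# Stand-in for the output of a prior/posterior network head.
mu = rng.normal(size=latent_dim)

# Axis-aligned Gaussian (original Probabilistic U-Net):
# diagonal covariance, so latent dimensions are independent.
log_sigma = rng.normal(size=latent_dim)
z_diag = mu + np.exp(log_sigma) * rng.normal(size=latent_dim)

# Full-covariance Gaussian (one "more general form"):
# parameterize a lower-triangular Cholesky factor L, so that
# Sigma = L @ L.T is symmetric positive semi-definite by construction.
L = np.tril(rng.normal(size=(latent_dim, latent_dim)))
np.fill_diagonal(L, np.exp(np.diag(L)))  # enforce a positive diagonal
z_full = mu + L @ rng.normal(size=latent_dim)

assert z_diag.shape == z_full.shape == (latent_dim,)
```

The Cholesky parameterization is the standard way to learn an unconstrained full covariance, since any lower-triangular matrix with positive diagonal yields a valid covariance; the diagonal case is recovered when the off-diagonal entries of L are zero.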
Related papers
- Theoretical Insights for Diffusion Guidance: A Case Study for Gaussian
Mixture Models [59.331993845831946]
Diffusion models benefit from instillation of task-specific information into the score function to steer the sample generation towards desired properties.
This paper provides the first theoretical study towards understanding the influence of guidance on diffusion models in the context of Gaussian mixture models.
arXiv Detail & Related papers (2024-03-03T23:15:48Z)
- Distributed Markov Chain Monte Carlo Sampling based on the Alternating Direction Method of Multipliers [143.6249073384419]
In this paper, we propose a distributed sampling scheme based on the alternating direction method of multipliers.
We provide both theoretical guarantees of our algorithm's convergence and experimental evidence of its superiority to the state-of-the-art.
In simulation, we deploy our algorithm on linear and logistic regression tasks and illustrate its fast convergence compared to existing gradient-based methods.
arXiv Detail & Related papers (2024-01-29T02:08:40Z)
- Investigating and Improving Latent Density Segmentation Models for Aleatoric Uncertainty Quantification in Medical Imaging [21.311726807879456]
In image segmentation, latent density models can be utilized to capture aleatoric uncertainty.
The most popular approach is the Probabilistic U-Net (PU-Net), which uses latent Normal densities to optimize the conditional data log-likelihood Evidence Lower Bound.
We introduce mutual information updates and entropy-regularized Sinkhorn updates in the latent space to promote homogeneity across all latent dimensions.
arXiv Detail & Related papers (2023-07-31T14:09:03Z)
- StyleGenes: Discrete and Efficient Latent Distributions for GANs [149.0290830305808]
We propose a discrete latent distribution for Generative Adversarial Networks (GANs)
Instead of drawing latent vectors from a continuous prior, we sample from a finite set of learnable latents.
We take inspiration from the encoding of information in biological organisms.
arXiv Detail & Related papers (2023-04-30T23:28:46Z)
- Generalized Probabilistic U-Net for medical image segmentation [3.398241562010881]
We propose the Generalized Probabilistic U-Net, which extends the Probabilistic U-Net by allowing more general forms of the Gaussian distribution.
We study the effect the choice of latent space distribution has on capturing the uncertainty in the reference segmentations using the LIDC-IDRI dataset.
arXiv Detail & Related papers (2022-07-26T13:03:37Z)
- Structured Uncertainty in the Observation Space of Variational Autoencoders [20.709989481734794]
In image synthesis, sampling from such distributions produces spatially-incoherent results with uncorrelated pixel noise.
We propose an alternative model for the observation space, encoding spatial dependencies via a low-rank parameterisation.
In contrast to pixel-wise independent distributions, our samples seem to contain semantically meaningful variations from the mean allowing the prediction of multiple plausible outputs.
arXiv Detail & Related papers (2022-05-25T07:12:50Z)
- Wrapped Distributions on homogeneous Riemannian manifolds [58.720142291102135]
Control over the distributions' properties, such as parameters, symmetry, and modality, yields a family of flexible distributions.
We empirically validate our approach by utilizing our proposed distributions within a variational autoencoder and a latent space network model.
arXiv Detail & Related papers (2022-04-20T21:25:21Z)
- Cycle Consistent Probability Divergences Across Different Spaces [38.43511529063335]
Discrepancy measures between probability distributions are at the core of statistical inference and machine learning.
This work proposes a novel unbalanced Monge optimal transport formulation for matching, up to isometries, distributions on different spaces.
arXiv Detail & Related papers (2021-11-22T16:35:58Z)
- Personalized Trajectory Prediction via Distribution Discrimination [78.69458579657189]
Trajectory prediction is confronted with the dilemma of capturing the multi-modal nature of future dynamics.
We present a distribution discrimination (DisDis) method to predict personalized motion patterns.
Our method can be integrated with existing multi-modal predictive models as a plug-and-play module.
arXiv Detail & Related papers (2021-07-29T17:42:12Z)
- Bayesian Deep Learning and a Probabilistic Perspective of Generalization [56.69671152009899]
We show that deep ensembles provide an effective mechanism for approximate Bayesian marginalization.
We also propose a related approach that further improves the predictive distribution by marginalizing within basins of attraction.
arXiv Detail & Related papers (2020-02-20T15:13:27Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.