Probabilistic Spatial Analysis in Quantitative Microscopy with
Uncertainty-Aware Cell Detection using Deep Bayesian Regression of Density
Maps
- URL: http://arxiv.org/abs/2102.11865v1
- Date: Tue, 23 Feb 2021 18:52:16 GMT
- Title: Probabilistic Spatial Analysis in Quantitative Microscopy with
Uncertainty-Aware Cell Detection using Deep Bayesian Regression of Density
Maps
- Authors: Alvaro Gomariz, Tiziano Portenier, César Nombela-Arrieta, Orcun Goksel
- Abstract summary: 3D microscopy is key in the investigation of diverse biological systems.
We propose a deep learning-based cell detection framework that can operate on large microscopy images.
- Score: 8.534825157831387
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: 3D microscopy is key in the investigation of diverse biological systems, and
the ever-increasing availability of large datasets demands automatic cell identification
methods that are not only accurate, but can also convey the uncertainty in their
predictions, to inform about potential errors and hence the confidence in conclusions
drawn from them. While conventional deep learning methods often yield deterministic
results, advances in deep Bayesian learning allow for accurate predictions with a
probabilistic interpretation in numerous image classification and segmentation tasks.
It is, however, nontrivial to extend such Bayesian methods to cell detection, which
requires specialized learning frameworks. In particular, regression of density maps is
a popular and successful approach in which cell coordinates are extracted from local
peaks in a postprocessing step, which hinders any meaningful probabilistic output. We
herein propose a deep learning-based cell detection framework that operates on large
microscopy images and outputs the desired probabilistic predictions by (i) integrating
Bayesian techniques for the regression of uncertainty-aware density maps, on which peak
detection can be applied to generate cell proposals, and (ii) learning a mapping from
the numerous proposals to a probabilistic space that is calibrated, i.e., one that
accurately represents the chances of a successful prediction. Utilizing such calibrated
predictions, we propose a probabilistic spatial analysis with Monte-Carlo sampling. We
demonstrate this by revising an existing description of the distribution of a
mesenchymal stromal cell type within the bone marrow, where our proposed methods reveal
spatial patterns that are otherwise undetectable. Introducing such probabilistic
analyses into quantitative microscopy pipelines will allow confidence intervals to be
reported when testing biological hypotheses about spatial distributions.
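To make the two-stage pipeline above concrete, below is a minimal Python sketch assuming PyTorch with Monte-Carlo dropout as the Bayesian technique and a simple local-maximum filter for peak detection. The toy DensityRegressor, its dropout rate, and the peak-detection thresholds are illustrative placeholders rather than the authors' implementation, and the learned calibration mapping is only indicated in comments.

```python
# Minimal sketch, not the authors' code: MC-dropout regression of a density map,
# peak detection on the mean map to generate cell proposals, and per-proposal
# summary statistics that a learned calibration mapping could consume.
import numpy as np
import torch
import torch.nn as nn
from scipy.ndimage import maximum_filter

class DensityRegressor(nn.Module):
    """Toy fully-convolutional regressor with dropout (stand-in for a U-Net-like model)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.Dropout2d(p=0.5),                       # kept stochastic at test time
            nn.Conv2d(16, 1, 3, padding=1), nn.ReLU(),
        )

    def forward(self, x):
        return self.net(x)

def mc_density_samples(model, image, n_samples=20):
    """Draw density-map samples with dropout active (Monte-Carlo dropout)."""
    model.train()  # keeps dropout on; in practice batch-norm statistics are frozen
    with torch.no_grad():
        return torch.stack([model(image) for _ in range(n_samples)])

def cell_proposals(mean_density, min_distance=3, threshold=0.1):
    """Local maxima of the mean density map serve as cell proposals."""
    footprint = np.ones((2 * min_distance + 1,) * 2)
    is_peak = mean_density == maximum_filter(mean_density, footprint=footprint)
    is_peak &= mean_density > threshold
    return np.argwhere(is_peak)  # (N, 2) array of (row, col) coordinates

# Purely illustrative usage with random weights and a random image:
model = DensityRegressor()
image = torch.rand(1, 1, 64, 64)
samples = mc_density_samples(model, image).squeeze().numpy()   # (n_samples, H, W)
mean_map, std_map = samples.mean(axis=0), samples.std(axis=0)
proposals = cell_proposals(mean_map)
features = [(mean_map[r, c], std_map[r, c]) for r, c in proposals]
# A second, learned mapping (e.g. a small classifier on such per-proposal features)
# would then output calibrated detection probabilities, as described in the abstract.
```

In the paper's setting, the calibrated per-proposal probabilities would then be resampled with Monte-Carlo draws so that downstream spatial statistics can be reported with confidence intervals.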
Related papers
- Tractable Function-Space Variational Inference in Bayesian Neural
Networks [72.97620734290139]
A popular approach for estimating the predictive uncertainty of neural networks is to define a prior distribution over the network parameters.
We propose a scalable function-space variational inference method that allows incorporating prior information.
We show that the proposed method leads to state-of-the-art uncertainty estimation and predictive performance on a range of prediction tasks.
arXiv Detail & Related papers (2023-12-28T18:33:26Z)
- Calibrating Neural Simulation-Based Inference with Differentiable Coverage Probability [50.44439018155837]
We propose to include a calibration term directly into the training objective of the neural model.
By introducing a relaxation of the classical formulation of calibration error, we enable end-to-end backpropagation.
It is directly applicable to existing computational pipelines, allowing reliable black-box posterior inference.
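For intuition only, the sketch below shows one generic way to keep a calibration-style penalty differentiable, using soft bin assignments so that it can be added to a training loss and backpropagated end-to-end; this is not the coverage-based relaxation of the cited paper, and the bin count and kernel temperature are arbitrary illustrative choices.

```python
# Generic illustration only (not this paper's exact coverage-based formulation):
# a soft-binned calibration penalty that stays differentiable, so it can be added
# to the usual training loss and optimized by backpropagation.
import torch

def soft_calibration_penalty(confidences, correct, n_bins=10, temperature=50.0):
    """Differentiable ECE-style term: soft bin assignments instead of hard histogram bins.

    confidences: (N,) predicted probabilities in [0, 1] (requires grad)
    correct:     (N,) float tensor, 1.0 where the prediction was correct
    """
    centers = torch.linspace(0.0, 1.0, n_bins, device=confidences.device)
    # Soft assignment of each sample to each bin via an RBF-like kernel.
    weights = torch.softmax(-temperature * (confidences.unsqueeze(1) - centers) ** 2, dim=1)
    bin_mass = weights.sum(dim=0) + 1e-8                      # (n_bins,)
    bin_conf = (weights * confidences.unsqueeze(1)).sum(dim=0) / bin_mass
    bin_acc = (weights * correct.unsqueeze(1)).sum(dim=0) / bin_mass
    # Weighted absolute gap between confidence and accuracy per (soft) bin.
    return ((bin_mass / bin_mass.sum()) * (bin_conf - bin_acc).abs()).sum()

# Example: total_loss = task_loss + lambda_cal * soft_calibration_penalty(conf, correct)
conf = torch.rand(128, requires_grad=True)
correct = (torch.rand(128) > 0.5).float()
penalty = soft_calibration_penalty(conf, correct)
penalty.backward()   # gradients flow back to the confidences
```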
arXiv Detail & Related papers (2023-10-20T10:20:45Z)
- Uncertainty Quantification in Deep Neural Networks through Statistical Inference on Latent Space [0.0]
We develop an algorithm that exploits the latent-space representation of data points fed into the network to assess the accuracy of their prediction.
We show on a synthetic dataset that commonly used methods are mostly overconfident.
In contrast, our method can detect such out-of-distribution data points as inaccurately predicted, thus aiding in the automatic detection of outliers.
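As a rough picture of assessing predictions through their latent representation, the sketch below scores test points by Mahalanobis distance to the training latents; this is a generic stand-in heuristic rather than the statistical inference procedure of the cited paper, and the synthetic latent vectors are placeholders.

```python
# Generic latent-space heuristic (a Mahalanobis-distance check), shown only to
# illustrate judging a prediction by where its latent representation falls
# relative to the training data; the paper's actual statistical test may differ.
import numpy as np

def fit_latent_stats(train_latents):
    """Mean and (regularized) inverse covariance of training latent vectors (N, D)."""
    mu = train_latents.mean(axis=0)
    cov = np.cov(train_latents, rowvar=False) + 1e-6 * np.eye(train_latents.shape[1])
    return mu, np.linalg.inv(cov)

def latent_outlier_score(latent, mu, cov_inv):
    """Mahalanobis distance: large values flag likely out-of-distribution inputs."""
    diff = latent - mu
    return float(np.sqrt(diff @ cov_inv @ diff))

# Illustrative usage with synthetic 32-dimensional latent vectors:
rng = np.random.default_rng(0)
train_latents = rng.normal(size=(1000, 32))
mu, cov_inv = fit_latent_stats(train_latents)
in_dist = latent_outlier_score(rng.normal(size=32), mu, cov_inv)
out_dist = latent_outlier_score(rng.normal(loc=5.0, size=32), mu, cov_inv)
# out_dist >> in_dist: such points would be flagged as unreliably predicted.
```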
arXiv Detail & Related papers (2023-05-18T09:52:06Z)
- End-to-end cell recognition by point annotation [5.130998755172569]
In this paper, we introduce an end-to-end framework that applies direct regression and classification for preset anchor points.
Specifically, we propose a pyramidal feature aggregation strategy to combine low-level features and high-level semantics simultaneously.
In addition, an optimized cost function is designed to adapt our multi-task learning framework by matching ground truth and predicted points.
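One generic way to match ground-truth and predicted points, sketched below, is Hungarian assignment on pairwise Euclidean distances with a cut-off radius; the cited paper's optimized cost function may differ, and max_dist is an arbitrary illustrative threshold.

```python
# Generic point-matching sketch (one common way to pair predicted and ground-truth
# points via the Hungarian algorithm); the cost function in the cited paper may differ.
import numpy as np
from scipy.optimize import linear_sum_assignment
from scipy.spatial.distance import cdist

def match_points(pred, gt, max_dist=5.0):
    """Return matched index pairs (pred_idx, gt_idx) whose distance is below max_dist."""
    if len(pred) == 0 or len(gt) == 0:
        return []
    cost = cdist(pred, gt)                        # pairwise Euclidean distances
    rows, cols = linear_sum_assignment(cost)      # minimum-cost one-to-one assignment
    return [(r, c) for r, c in zip(rows, cols) if cost[r, c] <= max_dist]

# Illustrative usage: two predictions, each close to one ground-truth point.
pred = np.array([[10.0, 12.0], [40.0, 41.0]])
gt = np.array([[11.0, 12.0], [39.0, 43.0]])
matches = match_points(pred, gt)
tp = len(matches)
fp, fn = len(pred) - tp, len(gt) - tp             # counts for precision/recall
```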
arXiv Detail & Related papers (2022-07-01T02:44:58Z)
- Distributional Gaussian Processes Layers for Out-of-Distribution Detection [18.05109901753853]
It is unclear whether out-of-distribution detection models reliant on deep neural networks are suitable for detecting domain shifts in medical imaging.
We propose a parameter-efficient Bayesian layer for hierarchical convolutional Gaussian Processes that incorporates Gaussian Processes operating in Wasserstein-2 space.
Our uncertainty estimates result in out-of-distribution detection that outperforms the capabilities of previous Bayesian networks.
arXiv Detail & Related papers (2022-06-27T14:49:48Z)
- Marginalization in Bayesian Networks: Integrating Exact and Approximate Inference [0.0]
Missing data and hidden variables require calculating the marginal probability distribution of a subset of the variables.
We develop a divide-and-conquer approach using the graphical properties of Bayesian networks.
We present an efficient and scalable algorithm for estimating the marginal probability distribution for categorical variables.
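For reference, the operation such work scales up is plain summation over the unwanted axes of a categorical joint distribution; the brute-force baseline below is only an illustration and does not reproduce the divide-and-conquer algorithm itself.

```python
# Brute-force baseline only: exact marginalization of a small categorical joint
# distribution by summing over the unwanted axes. The cited paper contributes a
# divide-and-conquer scheme that scales this up; that algorithm is not shown here.
import numpy as np

rng = np.random.default_rng(1)
# Joint P(A, B, C) over three categorical variables with 2, 3, and 4 states.
joint = rng.random((2, 3, 4))
joint /= joint.sum()

def marginal(joint, keep_axes):
    """Sum out every axis not listed in keep_axes."""
    drop = tuple(ax for ax in range(joint.ndim) if ax not in keep_axes)
    return joint.sum(axis=drop)

p_a = marginal(joint, keep_axes=(0,))       # P(A), shape (2,)
p_ac = marginal(joint, keep_axes=(0, 2))    # P(A, C), shape (2, 4)
assert np.isclose(p_a.sum(), 1.0)           # marginals remain normalized
```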
arXiv Detail & Related papers (2021-12-16T21:49:52Z)
- Dense Uncertainty Estimation [62.23555922631451]
In this paper, we investigate neural networks and uncertainty estimation techniques to achieve both accurate deterministic prediction and reliable uncertainty estimation.
We work on two types of uncertainty estimation solutions, namely ensemble-based methods and generative-model-based methods, and explain their pros and cons when using them in fully-, semi-, and weakly-supervised frameworks.
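As a minimal sketch of the ensemble-based flavour (the generative-model-based one is not shown), the disagreement across independently trained members can serve as the uncertainty estimate; the tiny untrained networks below are placeholders for illustration.

```python
# Sketch of ensemble-based uncertainty: the spread of predictions across
# independently trained models serves as an uncertainty estimate.
import torch
import torch.nn as nn

ensemble = [nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1)) for _ in range(5)]
# In practice each member is trained separately (different initialization / data order).

x = torch.rand(32, 8)
with torch.no_grad():
    preds = torch.stack([m(x) for m in ensemble])   # (n_members, batch, 1)

mean_pred = preds.mean(dim=0)        # deterministic prediction
uncertainty = preds.std(dim=0)       # member disagreement as an epistemic-uncertainty proxy
```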
arXiv Detail & Related papers (2021-10-13T01:23:48Z)
- PDC-Net+: Enhanced Probabilistic Dense Correspondence Network [161.76275845530964]
We present PDC-Net+, an Enhanced Probabilistic Dense Correspondence Network capable of estimating accurate dense correspondences.
We develop an architecture and an enhanced training strategy tailored for robust and generalizable uncertainty prediction.
Our approach obtains state-of-the-art results on multiple challenging geometric matching and optical flow datasets.
arXiv Detail & Related papers (2021-09-28T17:56:41Z)
- Exploring the Intrinsic Probability Distribution for Hyperspectral Anomaly Detection [9.653976364051564]
We propose a novel probability distribution representation detector (PDRD) that explores the intrinsic distribution of both the background and the anomalies in the original data for hyperspectral anomaly detection.
We conduct experiments on four real datasets to evaluate the performance of our proposed method.
arXiv Detail & Related papers (2021-05-14T11:42:09Z)
- Scalable Marginal Likelihood Estimation for Model Selection in Deep Learning [78.83598532168256]
Marginal-likelihood-based model selection is rarely used in deep learning due to estimation difficulties.
Our work shows that marginal likelihoods can improve generalization and be useful when validation data is unavailable.
arXiv Detail & Related papers (2021-04-11T09:50:24Z)
- Improving Uncertainty Calibration via Prior Augmented Data [56.88185136509654]
Neural networks have proven successful at learning from complex data distributions by acting as universal function approximators.
However, they are often overconfident, which leads to inaccurate and miscalibrated probabilistic predictions.
We propose a solution by seeking out regions of feature space where the model is unjustifiably overconfident, and conditionally raising the entropy of those predictions towards that of the prior distribution of the labels.
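The entropy-raising step can be pictured with the generic sketch below, which penalizes the KL divergence from the label prior to the predictions on inputs flagged as overconfident; how those regions are identified is the cited paper's contribution and is not reproduced here, so the flagged batch is just random data for illustration.

```python
# Generic sketch of the entropy-raising idea: on inputs flagged as lying in
# overconfident regions (how they are found is not reproduced here), penalize
# the KL divergence from the label prior to the predictive distribution.
import torch
import torch.nn.functional as F

def prior_entropy_penalty(logits, label_prior):
    """KL(label_prior || softmax(logits)), averaged over the flagged batch.

    Minimizing this pulls the predictive distribution toward the (higher-entropy)
    prior over labels, deflating unjustified confidence.
    """
    log_probs = F.log_softmax(logits, dim=1)                  # (N, C)
    return F.kl_div(log_probs, label_prior.expand_as(log_probs), reduction="batchmean")

# Illustrative usage: uniform label prior over 10 classes and random "flagged" logits.
label_prior = torch.full((1, 10), 0.1)
flagged_logits = torch.randn(16, 10, requires_grad=True)
penalty = prior_entropy_penalty(flagged_logits, label_prior)
penalty.backward()   # combined with the usual task loss during training
```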
arXiv Detail & Related papers (2021-02-22T07:02:37Z)
This list is automatically generated from the titles and abstracts of the papers on this site.