Homodyned K-distribution: parameter estimation and uncertainty
quantification using Bayesian neural networks
- URL: http://arxiv.org/abs/2211.00175v1
- Date: Mon, 31 Oct 2022 22:38:33 GMT
- Authors: Ali K. Z. Tehrani, Ivan M. Rosado-Mendez, and Hassan Rivaz
- Abstract summary: The parameters of Homodyned K-distribution (HK-distribution) are the speckle statistics that can model the envelope data in diverse scattering conditions.
We propose a Bayesian Neural Network (BNN) to estimate the parameters of HK-distribution and quantify the uncertainty of the estimator.
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Quantitative ultrasound (QUS) enables estimation of intrinsic
tissue properties. Speckle statistics are the QUS parameters that describe the
first-order statistics of ultrasound (US) envelope data. The parameters of the
Homodyned K-distribution (HK-distribution) are speckle statistics that can
model the envelope data under diverse scattering conditions. However, they
require a large amount of data to be estimated reliably. Quantifying the
intrinsic uncertainty of the estimated parameters therefore helps us interpret
them with an appropriate level of confidence. In this paper, we propose a
Bayesian Neural Network (BNN) to estimate the parameters of the
HK-distribution and to quantify the uncertainty of the estimator.
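The HK-distribution models the envelope as the magnitude of a coherent component plus a diffuse complex Gaussian component whose power is modulated by a gamma random variable. A minimal simulation sketch follows, assuming a common parameterization with coherent amplitude `eps`, diffuse power `sigma2`, and clustering parameter `alpha`; the function name and parameter choices are illustrative, not the paper's code:

```python
import math
import random

def sample_hk(eps, sigma2, alpha, n, seed=0):
    """Draw n envelope samples from a Homodyned K-distribution.

    eps    : amplitude of the coherent (specular) component
    sigma2 : mean power of the diffuse component
    alpha  : scatterer-clustering parameter of the gamma mixing variable
    """
    rng = random.Random(seed)
    sigma = math.sqrt(sigma2)
    samples = []
    for _ in range(n):
        # Unit-mean gamma variable modulates the diffuse power (the "K" part).
        w = rng.gammavariate(alpha, 1.0 / alpha)
        s = sigma * math.sqrt(w / 2.0)
        x = rng.gauss(0.0, s)  # in-phase diffuse component
        y = rng.gauss(0.0, s)  # quadrature diffuse component
        samples.append(math.hypot(eps + x, y))  # envelope |eps + x + iy|
    return samples

# Sanity check: under this parameterization E[A^2] = eps^2 + sigma2.
a = sample_hk(eps=1.0, sigma2=0.5, alpha=3.0, n=200_000)
mean_power = sum(v * v for v in a) / len(a)
```

The closed-form mean envelope power gives a quick check on the sampler before it is used to generate training data for an estimator.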
Related papers
- Uncertainty Decomposition and Error Margin Detection of Homodyned-K Distribution in Quantitative Ultrasound [1.912429179274357]
Homodyned K-distribution (HK-distribution) parameter estimation in quantitative ultrasound (QUS) has been recently addressed using Bayesian Neural Networks (BNNs)
BNNs have been shown to significantly reduce computational time in speckle statistics-based QUS without compromising accuracy and precision.
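The aleatoric/epistemic split that such BNN-based uncertainty decomposition relies on can be illustrated with the law of total variance over posterior (or ensemble) samples. This is a generic sketch, not the paper's implementation, and the function name is hypothetical:

```python
from statistics import fmean, pvariance

def decompose_uncertainty(means, variances):
    """Split total predictive variance via the law of total variance:
    total = E[var_i] (aleatoric) + Var[mean_i] (epistemic),
    where each posterior sample / ensemble member i predicts a mean
    and a variance for the target quantity."""
    aleatoric = fmean(variances)   # average within-member variance
    epistemic = pvariance(means)   # spread of the member means
    return aleatoric, epistemic, aleatoric + epistemic

# Example: three posterior samples predicting an HK parameter.
alea, epis, total = decompose_uncertainty([1.0, 2.0, 3.0], [0.5, 0.5, 0.5])
```

Here the members agree on their noise estimate (aleatoric 0.5) but disagree on the mean, so the epistemic term captures model uncertainty that more data or a larger patch could reduce.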
arXiv Detail & Related papers (2024-09-17T22:16:49Z)
- Tractable Function-Space Variational Inference in Bayesian Neural Networks [72.97620734290139]
A popular approach for estimating the predictive uncertainty of neural networks is to define a prior distribution over the network parameters.
We propose a scalable function-space variational inference method that allows incorporating prior information.
We show that the proposed method leads to state-of-the-art uncertainty estimation and predictive performance on a range of prediction tasks.
arXiv Detail & Related papers (2023-12-28T18:33:26Z)
- Bayesian Analysis for Over-parameterized Linear Model without Sparsity [8.1585306387285]
This study introduces a Bayesian approach that employs a prior distribution dependent on the eigenvectors of data covariance matrices without inducing parameter sparsity.
We also provide contraction rates of the derived posterior estimation and develop a truncated Gaussian approximation of the posterior distribution.
These findings suggest that Bayesian methods capable of handling data spectra and estimating non-sparse high-dimensional parameters are feasible.
arXiv Detail & Related papers (2023-05-25T06:07:47Z)
- NUQ: Nonparametric Uncertainty Quantification for Deterministic Neural Networks [151.03112356092575]
We show the principled way to measure the uncertainty of predictions for a classifier based on Nadaraya-Watson's nonparametric estimate of the conditional label distribution.
We demonstrate the strong performance of the method in uncertainty estimation tasks on a variety of real-world image datasets.
arXiv Detail & Related papers (2022-02-07T12:30:45Z)
- Dense Uncertainty Estimation [62.23555922631451]
In this paper, we investigate neural networks and uncertainty estimation techniques to achieve both accurate deterministic prediction and reliable uncertainty estimation.
We study two types of uncertainty estimation solutions, namely ensemble-based methods and generative-model-based methods, and explain their pros and cons when used in fully-, semi-, and weakly-supervised frameworks.
arXiv Detail & Related papers (2021-10-13T01:23:48Z)
- Non-parametric Kernel-Based Estimation of Probability Distributions for Precipitation Modeling [0.0]
We derive non-parametric estimates of the cumulative distribution function (CDF) of precipitation amount for wet time intervals.
We show that kernel-based CDF estimation (KCDE) provides better estimates of the probability distribution than the standard empirical (staircase) estimate.
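A kernel CDF estimate of this kind replaces the empirical step indicator with a smooth kernel CDF averaged over the data. A minimal sketch with a Gaussian kernel; the data values and bandwidth are hypothetical:

```python
import math

def kcde(x, data, bandwidth):
    """Kernel-based CDF estimate: average of Gaussian CDFs centred at the
    data points, a smooth alternative to the empirical staircase CDF."""
    return sum(
        0.5 * (1.0 + math.erf((x - xi) / (bandwidth * math.sqrt(2.0))))
        for xi in data
    ) / len(data)

rain = [0.3, 1.2, 2.5, 4.0, 7.5]  # hypothetical wet-interval amounts (mm)
f_low = kcde(0.0, rain, 1.0)
f_mid = kcde(2.5, rain, 1.0)
f_high = kcde(50.0, rain, 1.0)
```

Because each kernel CDF is monotone, the estimate is automatically a valid, smooth CDF; bandwidth selection then controls the bias-variance trade-off.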
arXiv Detail & Related papers (2021-09-21T04:52:00Z)
- Machine Learning Based Parameter Estimation of Gaussian Quantum States [14.85374185122389]
We propose a machine learning framework for parameter estimation of single mode Gaussian quantum states.
Under a Bayesian framework, our approach estimates parameters of suitable prior distributions from measured data.
arXiv Detail & Related papers (2021-08-13T04:59:16Z)
- The $s$-value: evaluating stability with respect to distributional shifts [3.330229314824913]
In practice, distributions change between locations and across time. This makes it difficult to gather knowledge that transfers across data sets.
We propose a measure of instability that quantifies the distributional instability of a statistical parameter with respect to Kullback-Leibler divergence.
We evaluate the performance of the proposed measure on real data and show that it can elucidate the distributional instability of a parameter with respect to certain shifts.
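The divergence at the core of such an instability measure can be computed directly for discrete distributions. A minimal sketch, illustrative rather than the paper's estimator:

```python
import math

def kl_divergence(p, q):
    """KL(p || q) for discrete distributions given as probability lists.
    An instability measure of this kind compares how a parameter changes
    across distributions that lie within a given divergence of each other."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0.0)

same = kl_divergence([0.5, 0.5], [0.5, 0.5])   # identical distributions -> 0
shift = kl_divergence([0.7, 0.3], [0.5, 0.5])  # shifted distribution -> positive
```

KL divergence is zero only for identical distributions and grows with the shift, which is what makes it a natural "budget" for quantifying distributional perturbations.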
arXiv Detail & Related papers (2021-05-07T05:18:12Z)
- Sampling-free Variational Inference for Neural Networks with Multiplicative Activation Noise [51.080620762639434]
We propose a more efficient parameterization of the posterior approximation for sampling-free variational inference.
Our approach yields competitive results for standard regression problems and scales well to large-scale image classification tasks.
arXiv Detail & Related papers (2021-03-15T16:16:18Z)
- Unlabelled Data Improves Bayesian Uncertainty Calibration under Covariate Shift [100.52588638477862]
We develop an approximate Bayesian inference scheme based on posterior regularisation.
We demonstrate the utility of our method in the context of transferring prognostic models of prostate cancer across globally diverse populations.
arXiv Detail & Related papers (2020-06-26T13:50:19Z)
- Nonparametric Estimation of the Fisher Information and Its Applications [82.00720226775964]
This paper considers the problem of estimation of the Fisher information for location from a random sample of size $n$.
An estimator proposed by Bhattacharya is revisited and improved convergence rates are derived.
A new estimator, termed a clipped estimator, is proposed.
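For a location family, the Fisher information equals the variance of the score at the true parameter, which suggests a simple Monte-Carlo check against the known Gaussian value 1/sigma^2. This is a generic sketch, not the paper's estimator, and the names are illustrative:

```python
import math
import random

def fisher_location_mc(score, sampler, n=100_000, seed=1):
    """Monte-Carlo estimate of the Fisher information for a location
    parameter: I = E[(d/dtheta log f(x - theta))^2], i.e. the second
    moment of the score evaluated at samples from the true density."""
    rng = random.Random(seed)
    return sum(score(sampler(rng)) ** 2 for _ in range(n)) / n

# Gaussian with sigma = 2: score(x) = x / sigma^2, so I = 1/sigma^2 = 0.25.
sigma = 2.0
est = fisher_location_mc(lambda x: x / sigma**2,
                         lambda rng: rng.gauss(0.0, sigma))
```

The practical difficulty the paper addresses is that the density (and hence the score) is unknown and must itself be estimated from the sample, which is where convergence rates become delicate.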
arXiv Detail & Related papers (2020-05-07T17:21:56Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.