Model-based multi-parameter mapping
- URL: http://arxiv.org/abs/2102.01604v1
- Date: Tue, 2 Feb 2021 17:00:11 GMT
- Title: Model-based multi-parameter mapping
- Authors: Yael Balbastre, Mikael Brudfors, Michela Azzarito, Christian Lambert,
Martina F. Callaghan, John Ashburner
- Abstract summary: Quantitative MR imaging is increasingly favoured for its richer information content and standardised measures.
Estimations often assume noise-free measurements and use subsets of the data to solve for different quantities in isolation.
Instead, a generative model can be formulated and inverted to jointly recover parameter estimates.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Quantitative MR imaging is increasingly favoured for its richer information
content and standardised measures. However, extracting quantitative parameters
such as the longitudinal relaxation rate (R1), apparent transverse relaxation
rate (R2*), or magnetisation-transfer saturation (MTsat) involves inverting a
highly non-linear function. Estimations often assume noise-free measurements
and use subsets of the data to solve for different quantities in isolation,
with error propagating through each computation. Instead, a probabilistic
generative model of the entire dataset can be formulated and inverted to
jointly recover parameter estimates with a well-defined probabilistic meaning
(e.g., maximum likelihood or maximum a posteriori). In practice, iterative
methods must be used but convergence is difficult due to the non-convexity of
the log-likelihood; yet, we show that it can be achieved thanks to a novel
approximate Hessian and, with it, reliable parameter estimates obtained. Here,
we demonstrate the utility of this flexible framework in the context of the
popular multi-parameter mapping framework and further show how to incorporate a
denoising prior and predict posterior uncertainty. Our implementation uses a
PyTorch backend and benefits from GPU acceleration. It is available at
https://github.com/balbasty/nitorch.
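The joint-inversion idea can be made concrete with a toy example. The sketch below fits a single mono-exponential decay S(TE) = S0 * exp(-R2* * TE) per voxel by Gauss-Newton iterations from a log-linear initialisation. This is an illustration only, under simplified assumptions (one parameter pair, Gaussian noise, no prior); the paper's nitorch implementation jointly fits R1, R2* and MTsat with an approximate Hessian and an optional denoising prior.

```python
import numpy as np

def fit_r2star_gauss_newton(signal, te, n_iter=20):
    """Maximum-likelihood fit of S(TE) = S0 * exp(-R2* * TE) by Gauss-Newton.

    Illustrative sketch only, not the authors' implementation: a single
    mono-exponential decay is fitted per voxel under Gaussian noise.
    """
    # Log-linear initialisation: log S = log S0 - R2* * TE
    A = np.stack([np.ones_like(te), -te], axis=1)
    coef, *_ = np.linalg.lstsq(A, np.log(signal), rcond=None)
    theta = np.array([coef[0], coef[1]])            # (log S0, R2*)

    for _ in range(n_iter):
        pred = np.exp(theta[0] - theta[1] * te)     # model prediction
        resid = signal - pred                       # residuals
        # Jacobian of the prediction w.r.t. (log S0, R2*)
        J = np.stack([pred, -te * pred], axis=1)
        # Gauss-Newton step: solve (J^T J) dtheta = J^T r
        dtheta = np.linalg.solve(J.T @ J, J.T @ resid)
        theta = theta + dtheta
    return np.exp(theta[0]), theta[1]               # (S0, R2*)
```

The log-linear initialisation is exact for noiseless data; with noise, the Gauss-Newton iterations refine the estimate in the original (non-log) domain, where the Gaussian noise assumption actually holds.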
Related papers
- Multivariate root-n-consistent smoothing parameter free matching estimators and estimators of inverse density weighted expectations [51.000851088730684]
We develop novel modifications of nearest-neighbor and matching estimators which converge at the parametric $\sqrt{n}$-rate.
We stress that our estimators do not involve nonparametric function estimators and in particular do not rely on sample-size-dependent smoothing parameters.
arXiv Detail & Related papers (2024-07-11T13:28:34Z) - Relaxed Quantile Regression: Prediction Intervals for Asymmetric Noise [51.87307904567702]
Quantile regression is a leading approach for obtaining such intervals via the empirical estimation of quantiles in the distribution of outputs.
We propose Relaxed Quantile Regression (RQR), a direct alternative to quantile regression based interval construction that removes this arbitrary constraint.
We demonstrate that this added flexibility results in intervals with an improvement in desirable qualities.
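For context, standard quantile regression obtains the q-th conditional quantile by minimising the pinball loss; RQR relaxes the constraint this construction imposes. A minimal sketch of the pinball loss itself (not of the RQR method):

```python
import numpy as np

def pinball_loss(y_true, y_pred, q):
    """Pinball (quantile) loss used by standard quantile regression.

    Minimising this over a constant prediction recovers the empirical
    q-th quantile of y_true; included only to ground the summary above.
    """
    diff = y_true - y_pred
    # Under-predictions are weighted by q, over-predictions by (1 - q)
    return np.mean(np.maximum(q * diff, (q - 1) * diff))
```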
arXiv Detail & Related papers (2024-06-05T13:36:38Z) - A probabilistic, data-driven closure model for RANS simulations with aleatoric, model uncertainty [1.8416014644193066]
We propose a data-driven, closure model for Reynolds-averaged Navier-Stokes (RANS) simulations that incorporates aleatoric, model uncertainty.
A fully Bayesian formulation is proposed, combined with a sparsity-inducing prior in order to identify regions in the problem domain where the parametric closure is insufficient.
arXiv Detail & Related papers (2023-07-05T16:53:31Z) - Kernel-based off-policy estimation without overlap: Instance optimality
beyond semiparametric efficiency [53.90687548731265]
We study optimal procedures for estimating a linear functional based on observational data.
For any convex and symmetric function class $\mathcal{F}$, we derive a non-asymptotic local minimax bound on the mean-squared error.
arXiv Detail & Related papers (2023-01-16T02:57:37Z) - Learning Summary Statistics for Bayesian Inference with Autoencoders [58.720142291102135]
We use the inner dimension of deep neural network based Autoencoders as summary statistics.
To create an incentive for the encoder to encode all the parameter-related information but not the noise, we give the decoder access to explicit or implicit information that has been used to generate the training data.
arXiv Detail & Related papers (2022-01-28T12:00:31Z) - Inverting brain grey matter models with likelihood-free inference: a
tool for trustable cytoarchitecture measurements [62.997667081978825]
Characterisation of the brain grey matter cytoarchitecture with quantitative sensitivity to soma density and volume remains an unsolved challenge in dMRI.
We propose a new forward model, specifically a new system of equations, requiring a few relatively sparse b-shells.
We then apply modern tools from Bayesian analysis known as likelihood-free inference (LFI) to invert our proposed model.
arXiv Detail & Related papers (2021-11-15T09:08:27Z) - Multivariate Probabilistic Regression with Natural Gradient Boosting [63.58097881421937]
We propose a Natural Gradient Boosting (NGBoost) approach based on nonparametrically modeling the conditional parameters of the multivariate predictive distribution.
Our method is robust, works out-of-the-box without extensive tuning, is modular with respect to the assumed target distribution, and performs competitively in comparison to existing approaches.
arXiv Detail & Related papers (2021-06-07T17:44:49Z) - Joint Total Variation ESTATICS for Robust Multi-Parameter Mapping [0.0]
ESTATICS performs a joint loglinear fit of multiple echo series to extract R2* and multiple extrapolated intercepts.
We evaluate the proposed algorithm by predicting left-out echoes in a rich single-subject dataset.
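The joint log-linear fit at the heart of ESTATICS can be written as a single ordinary least-squares problem with one intercept per contrast and a shared decay rate. The following is a minimal sketch assuming noiseless mono-exponential echoes, not the authors' implementation:

```python
import numpy as np

def estatics_loglinear(echoes, tes):
    """Joint log-linear ESTATICS-style fit.

    echoes: list of 1D decay curves, one per contrast (e.g. PDw, T1w, MTw).
    tes:    list of matching echo-time arrays.
    Returns per-contrast extrapolated intercepts (at TE=0) and the shared R2*.
    Minimal sketch of the joint fit described in the summary above.
    """
    n = len(echoes)
    rows, rhs = [], []
    for c, (s, te) in enumerate(zip(echoes, tes)):
        for sj, tj in zip(s, te):
            row = np.zeros(n + 1)
            row[c] = 1.0          # intercept log S0 for contrast c
            row[n] = -tj          # shared decay-rate column for R2*
            rows.append(row)
            rhs.append(np.log(sj))
    coef, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
    return np.exp(coef[:n]), coef[n]   # (intercepts, R2*)
```

Sharing the R2* column across all echo series is what couples the contrasts and makes the fit more robust than estimating each series in isolation.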
arXiv Detail & Related papers (2020-05-28T19:08:42Z) - Robust subgaussian estimation with VC-dimension [0.0]
This work proposes a new general way to bound the excess risk for MOM estimators.
The core technique is the use of VC-dimension (instead of Rademacher complexity) to measure the statistical complexity.
arXiv Detail & Related papers (2020-04-24T13:21:09Z) - Generalized Gumbel-Softmax Gradient Estimator for Various Discrete
Random Variables [16.643346012854156]
Estimating the gradients of stochastic nodes is one of the crucial research questions in the deep generative modeling community.
This paper proposes a general version of the Gumbel-Softmax estimator with continuous relaxation.
arXiv Detail & Related papers (2020-03-04T01:13:15Z)
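For context, the standard (non-generalised) Gumbel-Softmax estimator draws a relaxed one-hot sample by perturbing category logits with Gumbel noise and applying a temperature-scaled softmax. A minimal sketch, assuming categorical logits:

```python
import numpy as np

def gumbel_softmax(logits, tau=1.0, rng=None):
    """Sample a relaxed one-hot vector via the Gumbel-Softmax trick.

    Standard estimator for a categorical variable, included only to
    illustrate the continuous relaxation the paper generalises.
    """
    rng = np.random.default_rng() if rng is None else rng
    # Gumbel(0, 1) noise: -log(-log(U)) for U ~ Uniform(0, 1)
    g = -np.log(-np.log(rng.uniform(size=logits.shape)))
    y = (logits + g) / tau          # lower tau -> closer to one-hot
    y = y - y.max()                 # shift for numerical stability
    e = np.exp(y)
    return e / e.sum()              # softmax: non-negative, sums to 1
```

Because the output is a differentiable function of the logits, gradients can flow through the sample, unlike a hard `argmax` draw.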
This list is automatically generated from the titles and abstracts of the papers in this site.