Gluformer: Transformer-Based Personalized Glucose Forecasting with
Uncertainty Quantification
- URL: http://arxiv.org/abs/2209.04526v1
- Date: Fri, 9 Sep 2022 21:03:43 GMT
- Title: Gluformer: Transformer-Based Personalized Glucose Forecasting with
Uncertainty Quantification
- Authors: Renat Sergazinov, Mohammadreza Armandpour, Irina Gaynanova
- Abstract summary: We propose to model the future glucose trajectory conditioned on the past as an infinite mixture of basis distributions.
This change allows us to learn the uncertainty and predict more accurately in the cases when the trajectory has a heterogeneous or multi-modal distribution.
We empirically demonstrate the superiority of our method over existing state-of-the-art techniques both in terms of accuracy and uncertainty on the synthetic and benchmark glucose data sets.
- Score: 7.451722745955049
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Deep learning models achieve state-of-the-art results in predicting blood
glucose trajectories, with a wide range of architectures being proposed.
However, the adaptation of such models in clinical practice is slow, largely
due to the lack of uncertainty quantification of provided predictions. In this
work, we propose to model the future glucose trajectory conditioned on the past
as an infinite mixture of basis distributions (i.e., Gaussian, Laplace, etc.).
This change allows us to learn the uncertainty and predict more accurately in
the cases when the trajectory has a heterogeneous or multi-modal distribution.
To estimate the parameters of the predictive distribution, we utilize the
Transformer architecture. We empirically demonstrate the superiority of our
method over existing state-of-the-art techniques both in terms of accuracy and
uncertainty on the synthetic and benchmark glucose data sets.
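The predictive distribution described in the abstract — an output modeled as a weighted mixture of basis distributions (e.g., Gaussians) whose parameters are produced by a Transformer — can be illustrated with a minimal numpy sketch. This is not the authors' implementation; the function names and the choice of Gaussian components are illustrative assumptions, and the sketch only shows how mixture weights, means, and variances combine into a (possibly multi-modal) predictive density with a closed-form mean and total variance.

```python
import numpy as np

def mixture_predictive(weights, means, variances, y_grid):
    """Evaluate a Gaussian-mixture predictive distribution.

    weights, means, variances: per-component parameters (as a model
    head like Gluformer's would emit for one forecast horizon).
    y_grid: glucose values (mg/dL) at which to evaluate the density.
    Returns (density_on_grid, mixture_mean, mixture_variance).
    """
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()  # normalize so the weights form a valid mixture
    mu = np.asarray(means, dtype=float)
    var = np.asarray(variances, dtype=float)
    y = np.asarray(y_grid, dtype=float)

    # Density: sum_k w_k * N(y; mu_k, var_k) -- can be multi-modal.
    dens = np.zeros_like(y)
    for wk, mk, vk in zip(w, mu, var):
        dens += wk * np.exp(-(y - mk) ** 2 / (2.0 * vk)) / np.sqrt(2.0 * np.pi * vk)

    # Moments via the law of total variance:
    # E[y] = sum_k w_k mu_k;  Var[y] = sum_k w_k (var_k + mu_k^2) - E[y]^2
    mean = float(np.sum(w * mu))
    total_var = float(np.sum(w * (var + mu ** 2)) - mean ** 2)
    return dens, mean, total_var

# Hypothetical two-component forecast: a normal-range mode near
# 100 mg/dL and a hyperglycemic mode near 160 mg/dL.
grid = np.linspace(50.0, 210.0, 161)
dens, mean, total_var = mixture_predictive([0.5, 0.5], [100.0, 160.0], [25.0, 25.0], grid)
```

The total variance splits into a within-component term (each basis distribution's spread) and a between-component term (disagreement among modes), which is how such a mixture can express wider uncertainty for heterogeneous trajectories than a single Gaussian head could.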
Related papers
- Towards Theoretical Understandings of Self-Consuming Generative Models [56.84592466204185]
This paper tackles the emerging challenge of training generative models within a self-consuming loop.
We construct a theoretical framework to rigorously evaluate how this training procedure impacts the data distributions learned by future models.
We present results for kernel density estimation, delivering nuanced insights such as the impact of mixed data training on error propagation.
arXiv Detail & Related papers (2024-02-19T02:08:09Z) - Estimating Epistemic and Aleatoric Uncertainty with a Single Model [5.871583927216653]
We introduce a new approach to ensembling, hyper-diffusion models (HyperDM).
HyperDM offers prediction accuracy on par with, and in some cases superior to, multi-model ensembles.
We validate our method on two distinct real-world tasks: x-ray computed tomography reconstruction and weather temperature forecasting.
arXiv Detail & Related papers (2024-02-05T19:39:52Z) - DiffHybrid-UQ: Uncertainty Quantification for Differentiable Hybrid
Neural Modeling [4.76185521514135]
We introduce a novel method, DiffHybrid-UQ, for effective and efficient uncertainty propagation and estimation in hybrid neural differentiable models.
Specifically, our approach effectively discerns and quantifies both aleatoric uncertainties, arising from data noise, and epistemic uncertainties, resulting from model-form discrepancies and data sparsity.
arXiv Detail & Related papers (2023-12-30T07:40:47Z) - The Implicit Delta Method [61.36121543728134]
In this paper, we propose an alternative, the implicit delta method, which works by infinitesimally regularizing the training loss of uncertainty.
We show that the change in the evaluation due to regularization is consistent for the variance of the evaluation estimator, even when the infinitesimal change is approximated by a finite difference.
arXiv Detail & Related papers (2022-11-11T19:34:17Z) - Dense Uncertainty Estimation [62.23555922631451]
In this paper, we investigate neural networks and uncertainty estimation techniques to achieve both accurate deterministic prediction and reliable uncertainty estimation.
We work on two types of uncertainty estimation solutions, namely ensemble-based methods and generative-model-based methods, and explain their pros and cons when used in fully-, semi-, and weakly-supervised frameworks.
arXiv Detail & Related papers (2021-10-13T01:23:48Z) - Training on Test Data with Bayesian Adaptation for Covariate Shift [96.3250517412545]
Deep neural networks often make inaccurate predictions with unreliable uncertainty estimates.
We derive a Bayesian model that provides a well-defined relationship between unlabeled inputs under distributional shift and model parameters.
We show that our method improves both accuracy and uncertainty estimation.
arXiv Detail & Related papers (2021-09-27T01:09:08Z) - Quantifying Model Predictive Uncertainty with Perturbation Theory [21.591460685054546]
We propose a framework for predictive uncertainty quantification of a neural network.
We use perturbation theory from quantum physics to formulate a moment decomposition problem.
Our approach provides fast model predictive uncertainty estimates with much greater precision and calibration.
arXiv Detail & Related papers (2021-09-22T17:55:09Z) - CovarianceNet: Conditional Generative Model for Correct Covariance
Prediction in Human Motion Prediction [71.31516599226606]
We present a new method to correctly predict the uncertainty associated with the predicted distribution of future trajectories.
Our approach, CovarianceNet, is based on a Conditional Generative Model with Gaussian latent variables.
arXiv Detail & Related papers (2021-09-07T09:38:24Z) - Calibration and Uncertainty Quantification of Bayesian Convolutional
Neural Networks for Geophysical Applications [0.0]
It is common to incorporate the uncertainty of predictions: such subsurface models should provide calibrated probabilities and the associated uncertainties in their predictions.
It has been shown that popular Deep Learning-based models are often miscalibrated, and due to their deterministic nature, provide no means to interpret the uncertainty of their predictions.
We compare three different approaches to obtaining probabilistic models based on convolutional neural networks in a Bayesian formalism.
arXiv Detail & Related papers (2021-05-25T17:54:23Z) - Unlabelled Data Improves Bayesian Uncertainty Calibration under
Covariate Shift [100.52588638477862]
We develop an approximate Bayesian inference scheme based on posterior regularisation.
We demonstrate the utility of our method in the context of transferring prognostic models of prostate cancer across globally diverse populations.
arXiv Detail & Related papers (2020-06-26T13:50:19Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.