Incorporating Expert Guidance in Epidemic Forecasting
- URL: http://arxiv.org/abs/2101.10247v1
- Date: Thu, 24 Dec 2020 06:21:53 GMT
- Title: Incorporating Expert Guidance in Epidemic Forecasting
- Authors: Alexander Rodríguez, Bijaya Adhikari, Naren Ramakrishnan, B. Aditya Prakash
- Abstract summary: We propose a new approach leveraging the Seldonian optimization framework from AI safety.
We study two types of guidance: smoothness and regional consistency of errors.
We show that successfully incorporating this guidance not only bounds the probability of undesirable behavior but also reduces RMSE on test data by up to 17%.
- Score: 79.91855362871496
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Forecasting influenza-like illnesses (ILI) has rapidly progressed in recent
years from an art to a science with a plethora of data-driven methods. While
these methods have achieved qualified success, their applicability is limited
by their inability to incorporate expert feedback and guidance systematically
into the forecasting framework. We propose a new approach leveraging the
Seldonian optimization framework from AI safety and demonstrate how it can be
adapted to epidemic forecasting. We study two types of guidance: smoothness
and regional consistency of errors, and show that successfully incorporating
this guidance not only bounds the probability of undesirable behavior but also
reduces RMSE on test data by up to 17%.
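
The abstract does not include implementation details, but the Seldonian recipe it references is typically split into candidate selection followed by a held-out safety test. Below is a minimal, hypothetical Python sketch of that recipe applied to the paper's smoothness guidance; every name and threshold (`smoothness_violations`, the Hoeffding bound, `p_max`, `delta`) is an illustrative assumption, not the authors' implementation.

```python
import numpy as np

def smoothness_violations(preds, tol=0.2):
    """Per-step 0/1 indicator that consecutive forecasts jump by more than tol (relative)."""
    jumps = np.abs(np.diff(preds)) / (np.abs(preds[:-1]) + 1e-8)
    return (jumps > tol).astype(float)

def hoeffding_upper_bound(samples, delta=0.05):
    """(1 - delta)-confidence upper bound on the mean of samples bounded in [0, 1]."""
    n = len(samples)
    return samples.mean() + np.sqrt(np.log(1.0 / delta) / (2.0 * n))

def seldonian_select(candidates, predict, X_safety, p_max=0.1, delta=0.05):
    """Return the first candidate whose guidance-violation rate is bounded below p_max
    with confidence 1 - delta on a held-out safety set; otherwise return None."""
    for model in candidates:
        violations = smoothness_violations(predict(model, X_safety))
        if hoeffding_upper_bound(violations, delta) <= p_max:
            return model  # passes the safety test
    return None  # "no solution found", in Seldonian terms
```

The design point borrowed from the Seldonian framework is the separate safety split: a candidate forecaster is returned only if a high-confidence upper bound on its guidance-violation rate clears the tolerance; otherwise the procedure declines to return a model.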
Related papers
- EPL: Evidential Prototype Learning for Semi-supervised Medical Image Segmentation [0.0]
We propose Evidential Prototype Learning (EPL), which fuses voxel probability predictions from different sources and exploits prototype fusion of labeled and unlabeled data.
The uncertainty not only enables the model to self-correct its predictions but also improves guided learning with pseudo-labels and feeds back into the construction of hidden features.
arXiv Detail & Related papers (2024-04-09T10:04:06Z) - Is Epistemic Uncertainty Faithfully Represented by Evidential Deep Learning Methods? [26.344949402398917]
This paper presents novel theoretical insights into evidential deep learning.
It highlights the difficulties in optimizing second-order loss functions.
It provides novel insights into issues of identifiability and convergence in second-order loss minimization.
arXiv Detail & Related papers (2024-02-14T10:07:05Z) - Conservative Prediction via Data-Driven Confidence Minimization [70.93946578046003]
In safety-critical applications of machine learning, it is often desirable for a model to be conservative.
We propose the Data-Driven Confidence Minimization framework, which minimizes confidence on an uncertainty dataset.
arXiv Detail & Related papers (2023-06-08T07:05:36Z) - On the Practicality of Deterministic Epistemic Uncertainty [106.06571981780591]
Deterministic uncertainty methods (DUMs) achieve strong performance on detecting out-of-distribution data.
It remains unclear whether DUMs are well calibrated and can seamlessly scale to real-world applications.
arXiv Detail & Related papers (2021-07-01T17:59:07Z) - Stratified Learning: A General-Purpose Statistical Method for Improved
Learning under Covariate Shift [1.1470070927586016]
We propose a simple, statistically principled, and theoretically justified method to improve supervised learning when the training set is not representative.
We build upon a well-established methodology in causal inference, and show that the effects of covariate shift can be reduced or eliminated by conditioning on propensity scores.
We demonstrate the effectiveness of our general-purpose method on two contemporary research questions in cosmology, outperforming state-of-the-art importance weighting methods.
arXiv Detail & Related papers (2021-06-21T15:53:20Z) - DEUP: Direct Epistemic Uncertainty Prediction [56.087230230128185]
Epistemic uncertainty is the part of out-of-sample prediction error that is due to the learner's lack of knowledge.
We propose a principled approach for directly estimating epistemic uncertainty by learning to predict generalization error and subtracting an estimate of aleatoric uncertainty.
arXiv Detail & Related papers (2021-02-16T23:50:35Z) - Clinical Outcome Prediction from Admission Notes using Self-Supervised
Knowledge Integration [55.88616573143478]
Outcome prediction from clinical text can prevent doctors from overlooking possible risks.
Diagnoses at discharge, procedures performed, in-hospital mortality and length-of-stay prediction are four common outcome prediction targets.
We propose clinical outcome pre-training to integrate knowledge about patient outcomes from multiple public sources.
arXiv Detail & Related papers (2021-02-08T10:26:44Z) - STELAR: Spatio-temporal Tensor Factorization with Latent Epidemiological
Regularization [76.57716281104938]
We develop a tensor method to predict the evolution of epidemic trends for many regions simultaneously.
STELAR enables long-term prediction by incorporating latent temporal regularization through a system of discrete-time difference equations.
We conduct experiments using both county- and state-level COVID-19 data and show that our model can identify interesting latent patterns of the epidemic.
arXiv Detail & Related papers (2020-12-08T21:21:47Z) - Forecasting COVID-19 daily cases using phone call data [0.0]
We propose a simple Multiple Linear Regression model optimised to use call data to forecast the number of daily confirmed cases.
Our proposed approach outperforms ARIMA, ETS and a regression model without call data, evaluated by three point forecast error metrics, one prediction interval and two probabilistic forecast accuracy measures.
arXiv Detail & Related papers (2020-10-05T18:07:07Z)