Deep Learning for Quantile Regression under Right Censoring:
DeepQuantreg
- URL: http://arxiv.org/abs/2007.07056v2
- Date: Mon, 12 Apr 2021 03:35:52 GMT
- Title: Deep Learning for Quantile Regression under Right Censoring:
DeepQuantreg
- Authors: Yichen Jia and Jong-Hyeon Jeong
- Abstract summary: This paper presents a novel application of the neural network to the quantile regression for survival data with right censoring.
The main purpose of this work is to show that the deep learning method could be flexible enough to predict nonlinear patterns more accurately compared to existing quantile regression methods.
- Score: 1.0152838128195467
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The computational prediction algorithm of the neural network, or deep
learning, has drawn much attention recently in statistics as well as in image
recognition and natural language processing. Particularly in statistical
applications for censored survival data, the loss function used for
optimization has mainly been based on the partial likelihood from Cox's model
and its variations, to utilize existing neural network libraries such as Keras,
which was built upon the open-source library TensorFlow. This paper presents a novel application of the
neural network to the quantile regression for survival data with right
censoring, which is adjusted by the inverse of the estimated censoring
distribution in the check function. The main purpose of this work is to show
that the deep learning method could be flexible enough to predict nonlinear
patterns more accurately compared to existing quantile regression methods such
as traditional linear quantile regression and nonparametric quantile regression
with total variation regularization, emphasizing practicality of the method for
censored survival data. Simulation studies were performed to generate nonlinear
censored survival data and compare the deep learning method with existing
quantile regression methods in terms of prediction accuracy. The proposed
method is illustrated with two publicly available breast cancer data sets with
gene signatures. The method has been built into a package and is freely
available at \url{https://github.com/yicjia/DeepQuantreg}.
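The core idea in the abstract, a check (pinball) loss reweighted by the inverse of an estimated censoring distribution, can be sketched as follows. This is a minimal NumPy illustration, not the DeepQuantreg implementation: the helper names `km_censoring_survival` and `ipcw_check_loss` are hypothetical, and the Kaplan-Meier step below ignores refinements such as tie handling and left limits.

```python
import numpy as np

def km_censoring_survival(time, event):
    """Kaplan-Meier estimate of the censoring survival function G(t),
    obtained by treating censoring (event == 0) as the event of interest.
    Simplified sketch: no tie handling, right-continuous step function."""
    order = np.argsort(time)
    t = time[order]
    d = 1.0 - event[order]          # d = 1 means the observation was censored
    n = len(t)
    at_risk = n - np.arange(n)      # number still at risk at each ordered time
    surv = np.cumprod(1.0 - d / at_risk)

    def G(s):
        # evaluate the step function at the points in s
        idx = np.searchsorted(t, s, side="right") - 1
        return np.where(idx >= 0, surv[np.clip(idx, 0, n - 1)], 1.0)

    return G

def ipcw_check_loss(pred, time, event, tau):
    """Check loss for quantile level tau under right censoring:
    only uncensored rows contribute, each reweighted by 1 / G(T_i)
    (inverse probability of censoring weighting)."""
    G = km_censoring_survival(time, event)
    w = event / np.maximum(G(time), 1e-8)   # censored rows get weight 0
    u = time - pred
    check = np.maximum(tau * u, (tau - 1.0) * u)
    return np.mean(w * check)
```

In a deep learning setting, a loss of this form would replace the usual mean-squared-error or partial-likelihood objective when fitting a network that predicts the tau-th conditional quantile; with no censoring, the weights are all one and the loss reduces to the ordinary check loss.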
Related papers
- Beta quantile regression for robust estimation of uncertainty in the
presence of outliers [1.6377726761463862]
Quantile Regression can be used to estimate aleatoric uncertainty in deep neural networks.
We propose a robust solution for quantile regression that incorporates concepts from robust divergence.
arXiv Detail & Related papers (2023-09-14T01:18:57Z)
- Engression: Extrapolation through the Lens of Distributional Regression [2.519266955671697]
We propose a neural network-based distributional regression methodology called 'engression'.
An engression model is generative in the sense that we can sample from the fitted conditional distribution and is also suitable for high-dimensional outcomes.
We show that engression can successfully perform extrapolation under some assumptions such as monotonicity, whereas traditional regression approaches such as least-squares or quantile regression fall short under the same assumptions.
arXiv Detail & Related papers (2023-07-03T08:19:00Z)
- What learning algorithm is in-context learning? Investigations with
linear models [87.91612418166464]
We investigate the hypothesis that transformer-based in-context learners implement standard learning algorithms implicitly.
We show that trained in-context learners closely match the predictors computed by gradient descent, ridge regression, and exact least-squares regression.
We present preliminary evidence that in-context learners share algorithmic features with these predictors.
arXiv Detail & Related papers (2022-11-28T18:59:51Z)
- Conditional Distribution Function Estimation Using Neural Networks for
Censored and Uncensored Data [0.0]
We consider estimating the conditional distribution function using neural networks for both censored and uncensored data.
We show the proposed method possesses desirable performance, whereas the partial likelihood method yields biased estimates when model assumptions are violated.
arXiv Detail & Related papers (2022-07-06T01:12:22Z)
- Invariance Learning in Deep Neural Networks with Differentiable Laplace
Approximations [76.82124752950148]
We develop a convenient gradient-based method for selecting the data augmentation.
We use a differentiable Kronecker-factored Laplace approximation to the marginal likelihood as our objective.
arXiv Detail & Related papers (2022-02-22T02:51:11Z)
- X-model: Improving Data Efficiency in Deep Learning with A Minimax Model [78.55482897452417]
We aim at improving data efficiency for both classification and regression setups in deep learning.
To take the power of both worlds, we propose a novel X-model.
X-model plays a minimax game between the feature extractor and task-specific heads.
arXiv Detail & Related papers (2021-10-09T13:56:48Z)
- Scalable Marginal Likelihood Estimation for Model Selection in Deep
Learning [78.83598532168256]
Marginal-likelihood based model-selection is rarely used in deep learning due to estimation difficulties.
Our work shows that marginal likelihoods can improve generalization and be useful when validation data is unavailable.
arXiv Detail & Related papers (2021-04-11T09:50:24Z)
- Flexible Model Aggregation for Quantile Regression [92.63075261170302]
Quantile regression is a fundamental problem in statistical learning motivated by a need to quantify uncertainty in predictions.
We investigate methods for aggregating any number of conditional quantile models.
All of the models we consider in this paper can be fit using modern deep learning toolkits.
arXiv Detail & Related papers (2021-02-26T23:21:16Z)
- Censored Quantile Regression Forest [81.9098291337097]
We develop a new estimating equation that adapts to censoring and reduces to the quantile score when the data exhibit no censoring.
The proposed procedure, named censored quantile regression forest, allows us to estimate quantiles of time-to-event data without any parametric modeling assumption.
arXiv Detail & Related papers (2020-01-08T23:20:23Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.