Learning Survival Models with Right-Censored Reporting Delays
- URL: http://arxiv.org/abs/2510.04421v2
- Date: Fri, 24 Oct 2025 00:58:47 GMT
- Title: Learning Survival Models with Right-Censored Reporting Delays
- Authors: Yuta Shikuri, Hironori Fujisawa
- Abstract summary: Survival analysis is a statistical technique used to estimate the time until an event occurs. Adjusting for reporting delays under practical constraints remains a significant challenge in the insurance industry. Our study addresses this challenge by jointly modeling the parametric hazard functions of event occurrences and report timings.
- Score: 10.312968200748118
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Survival analysis is a statistical technique used to estimate the time until an event occurs. Although it is applied across a wide range of fields, adjusting for reporting delays under practical constraints remains a significant challenge in the insurance industry. Such delays render event occurrences unobservable when their reports are subject to right censoring. This issue becomes particularly critical when estimating hazard rates for newly enrolled cohorts with limited follow-up due to administrative censoring. Our study addresses this challenge by jointly modeling the parametric hazard functions of event occurrences and report timings. The joint probability distribution is marginalized over the latent event occurrence status. We construct an estimator for the proposed survival model and establish its asymptotic consistency. Furthermore, we develop an expectation-maximization algorithm to compute its estimates. Using these findings, we propose a two-stage estimation procedure based on a parametric proportional hazards model to evaluate observations subject to administrative censoring. Experimental results demonstrate that our method effectively improves the timeliness of risk evaluation for newly enrolled cohorts.
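The marginalization over the latent event-occurrence status can be illustrated with a minimal sketch. It assumes exponential hazards for both the event time and the reporting delay, and maximizes the marginal likelihood by a crude grid search rather than the paper's EM algorithm and proportional hazards model; all names and parameter values below are illustrative, not taken from the paper.

```python
import math
import random

def no_report_prob(lam, mu, tau):
    """P(no report by tau) = P(no event yet) + P(event occurred, report still pending)."""
    if abs(lam - mu) < 1e-12:
        pending = lam * tau * math.exp(-lam * tau)
    else:
        pending = lam * math.exp(-mu * tau) * (1.0 - math.exp(-(lam - mu) * tau)) / (lam - mu)
    return math.exp(-lam * tau) + pending

def log_likelihood(lam, mu, tau, n_rep, sum_x, sum_d, n_unrep):
    # Reported pairs contribute the joint density of (event time, delay);
    # unreported units contribute the marginal over the latent occurrence status.
    ll = n_rep * math.log(lam) - lam * sum_x + n_rep * math.log(mu) - mu * sum_d
    ll += n_unrep * math.log(no_report_prob(lam, mu, tau))
    return ll

def simulate(lam, mu, tau, n, rng):
    n_rep, sum_x, sum_d, n_unrep = 0, 0.0, 0.0, 0
    for _ in range(n):
        x = rng.expovariate(lam)   # event time
        d = rng.expovariate(mu)    # reporting delay
        if x + d <= tau:
            n_rep += 1; sum_x += x; sum_d += d
        else:
            n_unrep += 1           # event status is latent here
    return n_rep, sum_x, sum_d, n_unrep

rng = random.Random(0)
tau = 3.0  # administrative censoring time
n_rep, sum_x, sum_d, n_unrep = simulate(0.5, 1.0, tau, 4000, rng)
grid = [0.1 * k for k in range(1, 31)]
lam_hat, mu_hat = max(
    ((l, m) for l in grid for m in grid),
    key=lambda p: log_likelihood(p[0], p[1], tau, n_rep, sum_x, sum_d, n_unrep),
)
print(lam_hat, mu_hat)  # should land near the true (0.5, 1.0)
```

The key term is `no_report_prob`: a unit with no report by `tau` may either have had no event, or have an unreported event, and the likelihood sums over both latent possibilities instead of treating the unit as event-free.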
Related papers
- Observationally Informed Adaptive Causal Experimental Design [55.998153710215654]
We propose Active Residual Learning, a new paradigm that leverages the observational model as a foundational prior. This approach shifts the experimental focus from learning target causal quantities from scratch to efficiently estimating the residuals required to correct observational bias. Experiments on synthetic and semi-synthetic benchmarks demonstrate that R-Design significantly outperforms baselines.
arXiv Detail & Related papers (2026-03-04T06:52:37Z) - Evidential time-to-event prediction with calibrated uncertainty quantification [12.446406577462069]
Time-to-event analysis provides insights into clinical prognosis and treatment recommendations. We propose an evidential regression model specifically designed for time-to-event prediction. We show that our model delivers both accurate and reliable performance, outperforming state-of-the-art methods.
arXiv Detail & Related papers (2024-11-12T15:06:04Z) - Risk and cross validation in ridge regression with correlated samples [72.59731158970894]
We characterize the in- and out-of-sample risks of ridge regression when the data points have arbitrary correlations. We demonstrate that in this setting, the generalized cross validation estimator (GCV) fails to correctly predict the out-of-sample risk. We further extend our analysis to the case where the test point has nontrivial correlations with the training set, a setting often encountered in time series forecasting.
arXiv Detail & Related papers (2024-08-08T17:27:29Z) - Pseudo-Observations and Super Learner for the Estimation of the Restricted Mean Survival Time [0.0]
We propose a flexible and easy-to-use ensemble algorithm that combines pseudo-observations and super learner.
We complement the predictions obtained from our method with our RMST-adapted risk measure, prediction intervals and variable importance measures.
arXiv Detail & Related papers (2024-04-26T07:38:10Z) - Score Matching-based Pseudolikelihood Estimation of Neural Marked Spatio-Temporal Point Process with Uncertainty Quantification [59.81904428056924]
We introduce SMASH: a Score MAtching estimator for learning marked spatio-temporal point processes with uncertainty quantification.
Specifically, our framework adopts a normalization-free objective by estimating the pseudolikelihood of marked spatio-temporal point processes through score-matching.
The superior performance of our proposed framework is demonstrated through extensive experiments in both event prediction and uncertainty quantification.
arXiv Detail & Related papers (2023-10-25T02:37:51Z) - CenTime: Event-Conditional Modelling of Censoring in Survival Analysis [49.44664144472712]
We introduce CenTime, a novel approach to survival analysis that directly estimates the time to event.
Our method features an innovative event-conditional censoring mechanism that performs robustly even when uncensored data is scarce.
Our results indicate that CenTime offers state-of-the-art performance in predicting time-to-death while maintaining comparable ranking performance.
arXiv Detail & Related papers (2023-09-07T17:07:33Z) - Causal inference for the expected number of recurrent events in the presence of a terminal event [0.2446672595462589]
We develop a multiply robust estimation framework for causal inference in recurrent event data with a terminal failure event. We show that the estimand can be identified under a weaker condition than conditionally independent censoring.
arXiv Detail & Related papers (2023-06-28T21:31:25Z) - Continuous Risk Measures for Driving Support [0.0]
We compare three model-based risk measures by evaluating their strengths qualitatively and testing them quantitatively.
We derive a novel risk measure based on the statistics of sparse critical events and so-called survival conditions.
The resulting survival analysis is shown to achieve earlier crash detection times and fewer false positive detections in near-crash and non-crash cases, supported by its solid theoretical grounding.
arXiv Detail & Related papers (2023-03-14T15:54:37Z) - Improved Policy Evaluation for Randomized Trials of Algorithmic Resource Allocation [54.72195809248172]
We present a new estimator that leverages a novel concept: retrospective reshuffling of participants across experimental arms at the end of an RCT.
We prove theoretically that such an estimator is more accurate than common estimators based on sample means.
arXiv Detail & Related papers (2023-02-06T05:17:22Z) - Gaussian Process Nowcasting: Application to COVID-19 Mortality Reporting [2.8712862578745018]
Updating observations of a signal due to the delays in the measurement process is a common problem in signal processing.
We present a flexible approach using a latent Gaussian process that is capable of describing the changing auto-correlation structure present in the reporting time-delay surface.
This approach also yields robust estimates of uncertainty for the estimated nowcasted numbers of deaths.
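The nowcasting correction the paper performs with a latent Gaussian process can be contrasted with a much simpler completion-factor (chain-ladder style) baseline: estimate the reporting-delay distribution from fully observed occurrence days and inflate the partially observed recent days accordingly. The toy reporting triangle below is invented purely for illustration.

```python
# Toy reporting triangle: triangle[t][d] = events occurring on day t
# that were reported with delay d; recent days are only partially observed.
triangle = [
    [10, 5, 3, 2],   # fully reported day
    [12, 6, 2, 1],   # fully reported day
    [11, 4, 3],      # delay-3 reports not yet in
    [9, 5],
    [13],
]
max_delay = 4

# Empirical cumulative fraction reported within d+1 days,
# estimated from the fully observed rows only.
full = [row for row in triangle if len(row) == max_delay]
grand_total = sum(sum(row) for row in full)
cum_frac = [sum(sum(row[:d + 1]) for row in full) / grand_total
            for d in range(max_delay)]

# Nowcast: inflate each partially observed row by the fraction
# expected to have been reported so far.
nowcast = [sum(row) / cum_frac[len(row) - 1] for row in triangle]
print([round(v, 1) for v in nowcast])  # → [20.0, 21.0, 19.4, 17.4, 24.2]
```

Unlike the Gaussian-process approach, this baseline yields only point estimates: it cannot describe a changing auto-correlation structure in the delay surface or quantify uncertainty, which is exactly what the latent-GP model adds.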
arXiv Detail & Related papers (2021-02-22T18:32:44Z) - Enabling Counterfactual Survival Analysis with Balanced Representations [64.17342727357618]
Survival data are frequently encountered across diverse medical applications, e.g., drug development, risk profiling, and clinical trials.
We propose a theoretically grounded unified framework for counterfactual inference applicable to survival outcomes.
arXiv Detail & Related papers (2020-06-14T01:15:00Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.