TripleSurv: Triplet Time-adaptive Coordinate Loss for Survival Analysis
- URL: http://arxiv.org/abs/2401.02708v1
- Date: Fri, 5 Jan 2024 08:37:57 GMT
- Title: TripleSurv: Triplet Time-adaptive Coordinate Loss for Survival Analysis
- Authors: Liwen Zhang, Lianzhen Zhong, Fan Yang, Di Dong, Hui Hui, Jie Tian
- Abstract summary: We propose a time-adaptive coordinate loss function, TripleSurv, to handle the complexities of the learning process and exploit valuable survival time values.
Our TripleSurv is evaluated on three real-world survival datasets and a public synthetic dataset.
- Score: 15.496918127515665
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A core challenge in survival analysis is to model the distribution of
censored time-to-event data, where the event of interest may be a death,
failure, or occurrence of a specific event. Previous studies have shown that
ranking and maximum likelihood estimation (MLE) loss functions are widely used
for survival analysis. However, ranking losses focus only on the ordering of
survival times and ignore the exact survival time values. Furthermore, the MLE
is unbounded and easily affected by outliers (e.g., censored data), which may
degrade modeling performance.
To handle the complexities of the learning process and exploit the valuable
survival time values, we propose a time-adaptive coordinate loss function,
TripleSurv, which achieves adaptive adjustment by incorporating the differences
in survival time between sample pairs into the ranking. This encourages the
model to quantitatively rank the relative risk of pairs, ultimately enhancing
prediction accuracy. Most importantly, TripleSurv quantifies the relative risk
between samples through the pairwise ranking order, and uses the time interval
as a trade-off to calibrate the robustness of the model over the sample
distribution. Our TripleSurv is evaluated on three real-world survival datasets
and a public synthetic dataset. The results show that our method outperforms
state-of-the-art methods and exhibits good performance and robustness in
modeling various sophisticated data distributions with different censoring
rates. Our code will be available upon acceptance.
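As a concrete illustration of the pairwise, time-weighted ranking idea summarized above, the sketch below weights a standard exponential pairwise ranking penalty by the survival-time gap between comparable pairs. This is a minimal PyTorch sketch of the general idea, not the authors' released TripleSurv implementation (the paper's code is stated to be available upon acceptance); the function name, the `sigma` temperature, and the exponential gap weighting are illustrative assumptions.

```python
import torch


def time_weighted_ranking_loss(risk, time, event, sigma=1.0):
    """Pairwise ranking penalty weighted by the survival-time gap (illustrative).

    risk  : (N,) predicted risk scores (higher = earlier expected event)
    time  : (N,) observed survival or censoring times
    event : (N,) 1.0 if the event was observed, 0.0 if censored
    sigma : hypothetical temperature controlling both the gap weighting
            and the sharpness of the ranking penalty
    """
    # Comparable pairs (i, j): subject i had an observed event strictly
    # earlier than subject j's time, so i should receive the higher risk.
    t_i, t_j = time.unsqueeze(1), time.unsqueeze(0)          # (N,1), (1,N)
    comparable = ((t_i < t_j) & (event.unsqueeze(1) > 0)).float()

    # The time gap acts as a per-pair weight: pairs separated by a larger
    # survival-time difference contribute more, nudging the model toward a
    # quantitative (not merely ordinal) ranking of relative risk.
    gap_weight = 1.0 - torch.exp(-(t_j - t_i).clamp(min=0.0) / sigma)

    # Exponential ranking penalty on the risk margin risk_j - risk_i.
    margin = risk.unsqueeze(0) - risk.unsqueeze(1)
    pair_loss = gap_weight * torch.exp(margin / sigma)

    n_pairs = comparable.sum().clamp(min=1.0)
    return (pair_loss * comparable).sum() / n_pairs
```

In practice such a ranking term would typically be combined with a bounded likelihood-based term, mirroring the abstract's point that MLE alone is unbounded and sensitive to censored outliers; how the two terms are weighted is a tuning choice not specified here.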
Related papers
- Survival Models: Proper Scoring Rule and Stochastic Optimization with Competing Risks [6.9648613217501705]
SurvivalBoost outperforms 12 state-of-the-art models on 4 real-life datasets.
It provides great calibration, the ability to predict across any time horizon, and computation times faster than existing methods.
arXiv Detail & Related papers (2024-10-22T07:33:34Z) - Risk and cross validation in ridge regression with correlated samples [72.59731158970894]
We characterize the in- and out-of-sample risks of ridge regression when the data points have arbitrary correlations.
We further extend our analysis to the case where the test point has non-trivial correlations with the training set, a setting often encountered in time series forecasting.
We validate our theory across a variety of high dimensional data.
arXiv Detail & Related papers (2024-08-08T17:27:29Z) - Teaching Models To Survive: Proper Scoring Rule and Stochastic Optimization with Competing Risks [6.9648613217501705]
When data are right-censored, survival analysis can compute the "time to event".
We introduce a strictly proper censoring-adjusted separable scoring rule that can be optimized on a subpart of the data.
Compared to 11 state-of-the-art models, this model, MultiIncidence, performs best in estimating the probability of outcomes in survival and competing risks.
arXiv Detail & Related papers (2024-06-20T08:00:42Z) - Variational Deep Survival Machines: Survival Regression with Censored Outcomes [11.82370259688716]
Survival regression aims to predict the time when an event of interest will take place, typically a death or a failure.
We present a novel method to predict the survival time by better clustering the survival data and combining primitive distributions.
arXiv Detail & Related papers (2024-04-24T02:16:00Z) - CenTime: Event-Conditional Modelling of Censoring in Survival Analysis [49.44664144472712]
We introduce CenTime, a novel approach to survival analysis that directly estimates the time to event.
Our method features an innovative event-conditional censoring mechanism that performs robustly even when uncensored data is scarce.
Our results indicate that CenTime offers state-of-the-art performance in predicting time-to-death while maintaining comparable ranking performance.
arXiv Detail & Related papers (2023-09-07T17:07:33Z) - Copula-Based Deep Survival Models for Dependent Censoring [10.962520289040336]
This paper presents a parametric model of survival that extends modern non-linear survival analysis by relaxing the assumption of conditional independence.
On synthetic and semi-synthetic data, our approach significantly improves estimates of survival distributions compared to the standard approach that assumes conditional independence in the data.
arXiv Detail & Related papers (2023-06-20T21:51:13Z) - SurvivalGAN: Generating Time-to-Event Data for Survival Analysis [121.84429525403694]
Imbalances in censoring and time horizons cause generative models to experience three new failure modes specific to survival analysis.
We propose SurvivalGAN, a generative model that handles survival data by addressing the imbalance in the censoring and event horizons.
We evaluate this method via extensive experiments on medical datasets.
arXiv Detail & Related papers (2023-02-24T17:03:51Z) - Rethinking Missing Data: Aleatoric Uncertainty-Aware Recommendation [59.500347564280204]
We propose a new Aleatoric Uncertainty-aware Recommendation (AUR) framework.
AUR consists of a new uncertainty estimator along with a normal recommender model.
Since the chance of mislabeling reflects the potential of a pair, AUR makes recommendations according to the estimated uncertainty.
arXiv Detail & Related papers (2022-09-22T04:32:51Z) - X-model: Improving Data Efficiency in Deep Learning with A Minimax Model [78.55482897452417]
We aim at improving data efficiency for both classification and regression setups in deep learning.
To take the power of both worlds, we propose a novel X-model.
X-model plays a minimax game between the feature extractor and task-specific heads.
arXiv Detail & Related papers (2021-10-09T13:56:48Z) - Survival Cluster Analysis [93.50540270973927]
There is an unmet need in survival analysis for identifying subpopulations with distinct risk profiles.
An approach that addresses this need is likely to improve characterization of individual outcomes.
arXiv Detail & Related papers (2020-02-29T22:41:21Z)
This list is automatically generated from the titles and abstracts of the papers listed on this site.