Survival Analysis as Imprecise Classification with Trainable Kernels
- URL: http://arxiv.org/abs/2506.10140v1
- Date: Wed, 11 Jun 2025 19:40:09 GMT
- Title: Survival Analysis as Imprecise Classification with Trainable Kernels
- Authors: Andrei V. Konstantinov, Vlada A. Efremenko, Lev V. Utkin
- Abstract summary: iSurvM, iSurvQ, and iSurvJ combine imprecise probability theory with attention mechanisms to handle censored data without parametric assumptions. Experiments on synthetic and real datasets demonstrate that the proposed models consistently outperform the Beran estimator in terms of both accuracy and computational complexity.
- Score: 2.2120851074630177
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Survival analysis is a fundamental tool for modeling time-to-event data in healthcare, engineering, and finance, where censored observations pose significant challenges. While traditional methods like the Beran estimator offer nonparametric solutions, they often struggle with complex data structures and heavy censoring. This paper introduces three novel survival models, iSurvM (the imprecise Survival model based on Mean likelihood functions), iSurvQ (the imprecise Survival model based on the Quantiles of likelihood functions), and iSurvJ (the imprecise Survival model based on Joint learning), that combine imprecise probability theory with attention mechanisms to handle censored data without parametric assumptions. The first idea behind the models is to represent censored observations by interval-valued probability distributions for each instance over the time intervals between event moments. The second idea is to employ kernel-based Nadaraya-Watson regression with trainable attention weights to compute the imprecise probability distribution over time intervals for the entire dataset. The third idea is to consider three decision strategies for training, which correspond to the three proposed models. Experiments on synthetic and real datasets demonstrate that the proposed models, especially iSurvJ, consistently outperform the Beran estimator in terms of both accuracy and computational complexity. Code implementing the proposed models is publicly available.
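The second idea in the abstract, aggregating per-instance distributions over time intervals via Nadaraya-Watson regression with attention weights, can be sketched as follows. This is a minimal illustration under assumptions, not the authors' implementation: a Gaussian kernel with a single `temperature` parameter stands in for whatever trainable attention parameterization the paper actually uses.

```python
import numpy as np

def nw_attention_weights(x_query, X_train, temperature=1.0):
    """Softmax-normalized Gaussian-kernel attention over training instances.

    `temperature` is a stand-in for the trainable attention parameters
    (an assumption for illustration).
    """
    # Squared Euclidean distances between the query and each training point
    d2 = np.sum((X_train - x_query) ** 2, axis=1)
    logits = -d2 / temperature
    logits -= logits.max()          # numerical stability
    w = np.exp(logits)
    return w / w.sum()              # weights are nonnegative and sum to 1

def nw_estimate(x_query, X_train, P_train, temperature=1.0):
    """Combine per-instance probability vectors P_train
    (shape: n_train x n_intervals) into one distribution for x_query."""
    w = nw_attention_weights(x_query, X_train, temperature)
    return w @ P_train              # convex combination of rows

# Toy usage: 5 instances, 4 time intervals between event moments
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))
P = rng.dirichlet(np.ones(4), size=5)   # each row is a distribution
p = nw_estimate(X[0], X, P)
```

Because the weights form a convex combination, the output is again a valid probability distribution over the time intervals.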
Related papers
- Self-Consistent Equation-guided Neural Networks for Censored Time-to-Event Data [11.550402345767141]
We propose a novel approach to non-parametric estimation of conditional survival functions using generative adversarial networks that leverage self-consistent equations. The proposed method is model-free and does not require any parametric assumptions on the structure of the conditional survival function.
arXiv Detail & Related papers (2025-03-12T06:24:35Z) - SurvBETA: Ensemble-Based Survival Models Using Beran Estimators and Several Attention Mechanisms [2.024925013349319]
We propose a new ensemble-based model called SurvBETA (the Survival Beran estimator Ensemble using Three Attention mechanisms). The proposed model is presented in two forms: a general form that requires solving a complex optimization problem for its training, and a simplified form that considers a special representation of the attention weights.
arXiv Detail & Related papers (2024-12-10T16:17:38Z) - Generating Survival Interpretable Trajectories and Data [2.4861619769660637]
The paper demonstrates the efficiency and properties of the proposed model using numerical experiments on synthetic and real datasets.
The code of the algorithm implementing the proposed model is publicly available.
arXiv Detail & Related papers (2024-02-19T18:02:10Z) - TripleSurv: Triplet Time-adaptive Coordinate Loss for Survival Analysis [15.496918127515665]
We propose a time-adaptive coordinate loss function, TripleSurv, to handle the complexities of the learning process and exploit valuable survival time values.
Our TripleSurv is evaluated on three real-world survival datasets and a public synthetic dataset.
arXiv Detail & Related papers (2024-01-05T08:37:57Z) - Performative Prediction with Neural Networks [24.880495520422]
Performative prediction is a framework for learning models that influence the data they intend to predict. Standard convergence results for finding a performatively stable classifier with the method of repeated risk minimization assume that the data distribution is Lipschitz continuous with respect to the model's parameters. In this work, we instead assume that the data distribution is Lipschitz continuous with respect to the model's predictions, a more natural assumption for performative systems.
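The repeated risk minimization scheme this summary refers to can be illustrated with a toy scalar sketch under squared loss, where re-fitting reduces to matching the mean the current model induces. The response function `induced_mean` and the scalar setting are assumptions for illustration, not the paper's setup.

```python
def repeated_risk_minimization(theta0, induced_mean, n_steps=50):
    """Toy repeated risk minimization: at each round, re-fit the model on
    the distribution its current deployment induces. Under squared loss,
    the re-fit model is simply the induced mean.
    """
    theta = theta0
    for _ in range(n_steps):
        theta = induced_mean(theta)   # data reacts to the model, then re-fit
    return theta

# If the induced mean is eps-Lipschitz in the model with eps < 1, the
# iteration contracts to a performatively stable point (here: 0).
eps = 0.3
stable = repeated_risk_minimization(theta0=10.0,
                                    induced_mean=lambda t: eps * t)
```

With a contractive response (`eps < 1`) the iterates converge geometrically to the fixed point, which is exactly the performatively stable model of the framework.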
arXiv Detail & Related papers (2023-04-14T01:12:48Z) - SurvivalGAN: Generating Time-to-Event Data for Survival Analysis [121.84429525403694]
Imbalances in censoring and time horizons cause generative models to experience three new failure modes specific to survival analysis.
We propose SurvivalGAN, a generative model that handles survival data by addressing the imbalance in the censoring and event horizons.
We evaluate this method via extensive experiments on medical datasets.
arXiv Detail & Related papers (2023-02-24T17:03:51Z) - TACTiS: Transformer-Attentional Copulas for Time Series [76.71406465526454]
Estimation of time-varying quantities is a fundamental component of decision making in fields such as healthcare and finance.
We propose a versatile method that estimates joint distributions using an attention-based decoder.
We show that our model produces state-of-the-art predictions on several real-world datasets.
arXiv Detail & Related papers (2022-02-07T21:37:29Z) - X-model: Improving Data Efficiency in Deep Learning with A Minimax Model [78.55482897452417]
We aim at improving data efficiency for both classification and regression setups in deep learning.
To combine the best of both worlds, we propose a novel X-model.
X-model plays a minimax game between the feature extractor and task-specific heads.
arXiv Detail & Related papers (2021-10-09T13:56:48Z) - Probabilistic Modeling for Human Mesh Recovery [73.11532990173441]
This paper focuses on the problem of 3D human reconstruction from 2D evidence.
We recast the problem as learning a mapping from the input to a distribution of plausible 3D poses.
arXiv Detail & Related papers (2021-08-26T17:55:11Z) - Learning Interpretable Deep State Space Model for Probabilistic Time Series Forecasting [98.57851612518758]
Probabilistic time series forecasting involves estimating the distribution of future values based on the history of the series.
We propose a deep state space model for probabilistic time series forecasting whereby the non-linear emission model and transition model are parameterized by networks.
We show in experiments that our model produces accurate and sharp probabilistic forecasts.
arXiv Detail & Related papers (2021-01-31T06:49:33Z) - Trust but Verify: Assigning Prediction Credibility by Counterfactual Constrained Learning [123.3472310767721]
Prediction credibility measures are fundamental in statistics and machine learning.
These measures should account for the wide variety of models used in practice.
The framework developed in this work expresses the credibility as a risk-fit trade-off.
arXiv Detail & Related papers (2020-11-24T19:52:38Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.