Metaparametric Neural Networks for Survival Analysis
- URL: http://arxiv.org/abs/2110.06610v1
- Date: Wed, 13 Oct 2021 10:06:57 GMT
- Title: Metaparametric Neural Networks for Survival Analysis
- Authors: Fabio Luis de Mello, J Mark Wilkinson and Visakan Kadirkamanathan
- Abstract summary: We present the metaparametric neural network framework that encompasses existing survival analysis methods.
This framework allows survival neural networks to satisfy the same independence of generic function estimation from the underlying data structure that characterizes their regression and classification counterparts.
- Score: 0.6445605125467573
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Survival analysis is a critical tool for the modelling of time-to-event data,
such as life expectancy after a cancer diagnosis or optimal maintenance
scheduling for complex machinery. However, current neural network models
provide an imperfect solution for survival analysis as they either restrict the
shape of the target probability distribution or restrict the estimation to
pre-determined times. As a consequence, current survival neural networks lack
the ability to estimate a generic function without prior knowledge of its
structure. In this article, we present the metaparametric neural network
framework that encompasses existing survival analysis methods and enables their
extension to solve the aforementioned issues. This framework allows survival
neural networks to satisfy the same independence of generic function estimation
from the underlying data structure that characterizes their regression and
classification counterparts. Further, we demonstrate the application of the
metaparametric framework using both simulated and large real-world datasets and
show that it outperforms the current state-of-the-art methods in (i) capturing
nonlinearities, and (ii) identifying temporal patterns, leading to more
accurate overall estimations whilst placing no restrictions on the underlying
function structure.
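To make the abstract's two limitations concrete, the sketch below (our illustration, not the paper's method; the toy network, weights, and time grid are hypothetical) contrasts the two common survival output heads: a parametric head that restricts the shape of the target distribution, and a discrete-time head that restricts estimation to pre-determined times.

```python
# Illustrative sketch (not the paper's method): the two output-head styles
# the abstract critiques, in plain NumPy. All weights are toy values.
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(3, 2))  # toy hidden-to-output weights

def parametric_head(h):
    """Weibull head: S(t) = exp(-(t / scale)^shape).

    The network only chooses (shape, scale), so every subject's survival
    curve is forced into the Weibull family -- the 'restricted shape' case.
    """
    shape, scale = np.exp(h @ W)          # positivity via exp
    return lambda t: np.exp(-((t / scale) ** shape))

def discrete_time_head(h, n_bins=5):
    """Discrete-time head: one conditional hazard per pre-fixed time bin.

    The curve can take any shape, but is only defined at the n_bins grid
    points -- the 'pre-determined times' case.
    """
    V = rng.normal(scale=0.1, size=(3, n_bins))  # toy weights
    hazards = 1.0 / (1.0 + np.exp(-(h @ V)))     # sigmoid per bin
    return np.cumprod(1.0 - hazards)             # survival at each grid point

h = np.array([0.5, -0.2, 1.0])                   # a toy hidden representation
S_param = parametric_head(h)
S_disc = discrete_time_head(h)
print(S_param(1.0))   # survival evaluated at any continuous time
print(S_disc)         # survival only at the 5 fixed bins
```

The metaparametric framework described above aims to remove both constraints at once: a generic survival function estimable at arbitrary times without a pre-committed distributional family.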
Related papers
- Fast and Reliable Probabilistic Reflectometry Inversion with Prior-Amortized Neural Posterior Estimation [73.81105275628751]
Finding all structures compatible with reflectometry data is computationally prohibitive for standard algorithms.
We address this lack of reliability with a probabilistic deep learning method that identifies all realistic structures in seconds.
Our method, Prior-Amortized Neural Posterior Estimation (PANPE), combines simulation-based inference with novel adaptive priors.
arXiv Detail & Related papers (2024-07-26T10:29:16Z)
- Nonlinear classification of neural manifolds with contextual information [6.292933471495322]
Manifold capacity has emerged as a promising framework linking population geometry to the separability of neural manifolds.
We propose a theoretical framework that overcomes this limitation by leveraging contextual input information.
Our framework's increased expressivity captures representation untanglement in deep networks at early stages of the layer hierarchy, previously inaccessible to analysis.
arXiv Detail & Related papers (2024-05-10T23:37:31Z)
- Tractable Function-Space Variational Inference in Bayesian Neural Networks [72.97620734290139]
A popular approach for estimating the predictive uncertainty of neural networks is to define a prior distribution over the network parameters.
We propose a scalable function-space variational inference method that allows incorporating prior information.
We show that the proposed method leads to state-of-the-art uncertainty estimation and predictive performance on a range of prediction tasks.
arXiv Detail & Related papers (2023-12-28T18:33:26Z)
- Continuous time recurrent neural networks: overview and application to forecasting blood glucose in the intensive care unit [56.801856519460465]
Continuous time autoregressive recurrent neural networks (CTRNNs) are deep learning models that account for irregular observations.
We demonstrate the application of these models to probabilistic forecasting of blood glucose in a critical care setting.
arXiv Detail & Related papers (2023-04-14T09:39:06Z)
- Probabilistic AutoRegressive Neural Networks for Accurate Long-range Forecasting [6.295157260756792]
We introduce Probabilistic AutoRegressive Neural Networks (PARNN).
PARNN is capable of handling complex time series data exhibiting non-stationarity, nonlinearity, non-seasonality, long-range dependence, and chaotic patterns.
We evaluate the performance of PARNN against standard statistical, machine learning, and deep learning models, including Transformers, NBeats, and DeepAR.
arXiv Detail & Related papers (2022-04-01T17:57:36Z)
- Interpretable Additive Recurrent Neural Networks For Multivariate Clinical Time Series [4.125698836261585]
We present the Interpretable-RNN (I-RNN) that balances model complexity and accuracy by forcing the relationship between variables in the model to be additive.
I-RNN specifically captures the unique characteristics of clinical time series, which are unevenly sampled in time, asynchronously acquired, and have missing data.
We evaluate the I-RNN model on the Physionet 2012 Challenge dataset to predict in-hospital mortality, and on a real-world clinical decision support task: predicting hemodynamic interventions in the intensive care unit.
arXiv Detail & Related papers (2021-09-15T22:30:19Z)
- The Causal Neural Connection: Expressiveness, Learnability, and Inference [125.57815987218756]
An object called a structural causal model (SCM) represents a collection of mechanisms and sources of random variation of the system under investigation.
In this paper, we show that the causal hierarchy theorem (Thm. 1, Bareinboim et al., 2020) still holds for neural models.
We introduce a special type of SCM called a neural causal model (NCM), and formalize a new type of inductive bias to encode structural constraints necessary for performing causal inferences.
arXiv Detail & Related papers (2021-07-02T01:55:18Z)
- BDNNSurv: Bayesian deep neural networks for survival analysis using pseudo values [7.707091943385522]
We propose a Bayesian hierarchical deep neural network model for modelling and prediction of survival data.
Compared with previously studied methods, the new proposal can provide a point estimate of the survival probability.
Python code implementing the proposed approach is provided.
arXiv Detail & Related papers (2021-01-07T20:18:43Z)
- Gradient Starvation: A Learning Proclivity in Neural Networks [97.02382916372594]
Gradient Starvation arises when cross-entropy loss is minimized by capturing only a subset of features relevant for the task.
This work provides a theoretical explanation for the emergence of such feature imbalance in neural networks.
arXiv Detail & Related papers (2020-11-18T18:52:08Z)
- Semi-Structured Deep Piecewise Exponential Models [2.7728956081909346]
We propose a versatile framework for survival analysis that combines advanced concepts from statistics with deep learning.
A proof of concept is provided by using the framework to predict Alzheimer's disease progression.
arXiv Detail & Related papers (2020-11-11T14:41:19Z)
- Lipschitz Recurrent Neural Networks [100.72827570987992]
We show that our Lipschitz recurrent unit is more robust with respect to input and parameter perturbations as compared to other continuous-time RNNs.
Our experiments demonstrate that the Lipschitz RNN can outperform existing recurrent units on a range of benchmark tasks.
arXiv Detail & Related papers (2020-06-22T08:44:52Z)
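The robustness claim in the last entry rests on bounding how strongly a recurrent update can amplify perturbations. A standard way to obtain such a bound (our illustration, not the paper's exact construction; all weights here are hypothetical) is to normalize the hidden-to-hidden matrix by its spectral norm, estimated via power iteration:

```python
# Illustrative sketch (not the paper's exact construction): bounding the
# Lipschitz constant of a recurrent step by spectral-norm-normalizing
# the hidden-to-hidden matrix. All weights here are hypothetical.
import numpy as np

def spectral_norm(W, n_iters=50, seed=0):
    """Estimate the largest singular value of W via power iteration."""
    rng = np.random.default_rng(seed)
    v = rng.normal(size=W.shape[1])
    for _ in range(n_iters):
        u = W @ v
        u /= np.linalg.norm(u)
        v = W.T @ u
        v /= np.linalg.norm(v)
    return float(u @ W @ v)

def lipschitz_step(h, x, W_h, W_x):
    """One recurrent step with a 1-Lipschitz hidden-to-hidden map.

    tanh is 1-Lipschitz, so normalizing W_h to unit spectral norm bounds
    the step's sensitivity to perturbations of the hidden state h.
    """
    W_h = W_h / max(spectral_norm(W_h), 1.0)  # only shrink, never grow
    return np.tanh(W_h @ h + W_x @ x)

rng = np.random.default_rng(1)
W_h = rng.normal(size=(4, 4)) * 2.0   # deliberately large spectral norm
W_x = rng.normal(size=(4, 3))
h, x = rng.normal(size=4), rng.normal(size=3)

# Nearby hidden states stay nearby after the normalized update.
h2 = h + 1e-3 * rng.normal(size=4)
d_in = np.linalg.norm(h2 - h)
d_out = np.linalg.norm(lipschitz_step(h2, x, W_h, W_x)
                       - lipschitz_step(h, x, W_h, W_x))
print(d_out <= d_in)
```

Because the normalized transition has spectral norm at most one and tanh is 1-Lipschitz, the output distance can never exceed the input distance, which is one route to the perturbation robustness the entry describes.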
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.