Adaptive tumor growth forecasting via neural & universal ODEs
- URL: http://arxiv.org/abs/2511.22292v1
- Date: Thu, 27 Nov 2025 10:17:35 GMT
- Title: Adaptive tumor growth forecasting via neural & universal ODEs
- Authors: Kavya Subramanian, Prathamesh Dinesh Joshi, Raj Abhijit Dandekar, Rajat Dandekar, Sreedath Panat
- Abstract summary: We build adaptive tumor growth models capable of learning from experimental data. Our approach has the potential to improve predictive accuracy, guiding dynamic and effective treatment strategies.
- Score: 4.285464959472458
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Forecasting tumor growth is critical for optimizing treatment. Classical growth models such as the Gompertz and Bertalanffy equations capture general tumor dynamics but may fail to adapt to patient-specific variability, particularly with limited data available. In this study, we leverage Neural Ordinary Differential Equations (Neural ODEs) and Universal Differential Equations (UDEs), two pillars of Scientific Machine Learning (SciML), to construct adaptive tumor growth models capable of learning from experimental data. Using the Gompertz model as a baseline, we replace rigid terms with adaptive neural networks to capture hidden dynamics through robust modeling in the Julia programming language. We use our models to perform forecasting under data constraints and symbolic recovery to transform the learned dynamics into explicit mathematical expressions. Our approach has the potential to improve predictive accuracy, guiding dynamic and effective treatment strategies for improved clinical outcomes.
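To make the abstract's core idea concrete, the sketch below shows a Gompertz right-hand side and a universal-differential-equation (UDE) variant in which the rigid ln(K/V) term is swapped for a small neural network. This is a minimal pure-Python illustration, not the paper's implementation: the paper works in Julia with SciML tooling, and the function names, MLP architecture, and weights here are hypothetical and untrained, shown only to convey the structure.

```python
import math

# Classical Gompertz growth law: dV/dt = a * V * ln(K / V),
# where V is tumor volume, a the growth rate, K the carrying capacity.
def gompertz_rhs(v, a=0.3, K=100.0):
    return a * v * math.log(K / v)

# UDE-style right-hand side: the rigid ln(K/V) term is replaced by a small
# neural network. The tiny MLP below uses illustrative, untrained weights
# purely to show the structure; in the paper this component is trained
# against experimental data.
def tiny_mlp(v, w1=(0.05, -0.02), b1=(0.1, 0.0), w2=(1.0, 1.0), b2=0.0):
    hidden = [math.tanh(w * v + b) for w, b in zip(w1, b1)]
    return sum(wo * h for wo, h in zip(w2, hidden)) + b2

def ude_rhs(v, a=0.3):
    return a * v * tiny_mlp(v)  # learned term stands in for ln(K / v)

# Forward-Euler integration of a volume trajectory (a real pipeline would
# use an adaptive ODE solver and fit the network weights to data).
def simulate(rhs, v0=1.0, dt=0.1, steps=200):
    traj = [v0]
    for _ in range(steps):
        traj.append(traj[-1] + dt * rhs(traj[-1]))
    return traj

gompertz_traj = simulate(gompertz_rhs)  # saturates toward K = 100
```

Once trained, the learned `tiny_mlp` component can be handed to symbolic regression to recover an explicit expression, which is the "symbolic recovery" step the abstract describes.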
Related papers
- Langevin Flows for Modeling Neural Latent Dynamics [81.81271685018284]
We introduce LangevinFlow, a sequential Variational Auto-Encoder where the time evolution of latent variables is governed by the underdamped Langevin equation. Our approach incorporates physical priors -- such as inertia, damping, a learned potential function, and forces -- to represent both autonomous and non-autonomous processes in neural systems. Our method outperforms state-of-the-art baselines on synthetic neural populations generated by a Lorenz attractor.
arXiv Detail & Related papers (2025-07-15T17:57:48Z) - NOBLE -- Neural Operator with Biologically-informed Latent Embeddings to Capture Experimental Variability in Biological Neuron Models [63.592664795493725]
NOBLE is a neural operator framework that learns a mapping from a continuous frequency-modulated embedding of interpretable neuron features to the somatic voltage response induced by current injection. It predicts distributions of neural dynamics accounting for the intrinsic experimental variability. NOBLE is the first scaled-up deep learning framework that validates its generalization with real experimental data.
arXiv Detail & Related papers (2025-06-05T01:01:18Z) - Efficient Deep Learning-based Forward Solvers for Brain Tumor Growth Models [18.222267022441734]
Glioblastoma, a highly aggressive brain tumor, poses major challenges due to its poor prognosis and high morbidity rates. Partial differential equation-based models offer promising potential to enhance therapeutic outcomes. We introduce an approach leveraging a neural forward solver with gradient-based optimization to significantly reduce calibration time.
arXiv Detail & Related papers (2025-01-14T16:10:25Z) - Patient-specific prediction of glioblastoma growth via reduced order modeling and neural networks [0.0]
We present a proof-of-concept for a mathematical model of GBL growth, enabling real-time prediction and patient-specific parameter identification. A neural network surrogate learns the inverse mapping from tumor evolution to model parameters, achieving significant computational speed-up.
arXiv Detail & Related papers (2024-12-04T18:46:05Z) - Physics-Regularized Multi-Modal Image Assimilation for Brain Tumor Localization [3.666412718346211]
We introduce a novel method that integrates data-driven and physics-based cost functions.
We propose a unique discretization scheme that quantifies how well the learned distributions of tumor and brain tissues adhere to their respective growth and elasticity equations.
arXiv Detail & Related papers (2024-09-30T15:36:14Z) - Integration of Graph Neural Network and Neural-ODEs for Tumor Dynamic Prediction [4.850774880198265]
We propose a graph encoder that utilizes a bipartite Graph Convolutional Network (GCN) combined with Neural Ordinary Differential Equations (Neural-ODEs).
We first show that the methodology is able to discover a tumor dynamic model that significantly improves upon an empirical model.
Our findings indicate that the methodology holds significant promise and offers potential applications in pre-clinical settings.
arXiv Detail & Related papers (2023-10-02T06:39:08Z) - Explainable Deep Learning for Tumor Dynamic Modeling and Overall Survival Prediction using Neural-ODE [0.0]
We propose the use of the Tumor Dynamic Neural-ODE (TDNODE) as a pharmacology-informed neural network.
We show that TDNODE overcomes a key limitation of existing models in its ability to make unbiased predictions from truncated data.
We show that the generated metrics can be used to predict patients' overall survival (OS) with high accuracy.
arXiv Detail & Related papers (2023-08-02T18:08:27Z) - Individualized Dosing Dynamics via Neural Eigen Decomposition [51.62933814971523]
We introduce the Neural Eigen Differential Equation algorithm (NESDE).
NESDE provides individualized modeling, tunable generalization to new treatment policies, and fast, continuous, closed-form prediction.
We demonstrate the robustness of NESDE in both synthetic and real medical problems, and use the learned dynamics to publish simulated medical gym environments.
arXiv Detail & Related papers (2023-06-24T17:01:51Z) - On the Generalization and Adaption Performance of Causal Models [99.64022680811281]
Differentiable causal discovery proposes to factorize the data-generating process into a set of modules.
We study the generalization and adaption performance of such modular neural causal models.
Our analysis shows that the modular neural causal models outperform other models on both zero and few-shot adaptation in low data regimes.
arXiv Detail & Related papers (2022-06-09T17:12:32Z) - EINNs: Epidemiologically-Informed Neural Networks [75.34199997857341]
We introduce EINNs, a new class of physics-informed neural networks crafted for epidemic forecasting.
We investigate how to leverage both the theoretical flexibility provided by mechanistic models and the data-driven expressivity afforded by AI models.
arXiv Detail & Related papers (2022-02-21T18:59:03Z) - A multi-stage machine learning model on diagnosis of esophageal manometry [50.591267188664666]
The framework includes deep-learning models at the swallow-level stage and feature-based machine learning models at the study-level stage.
This is the first artificial-intelligence-based model to automatically predict a CC diagnosis of an HRM study from raw multi-swallow data.
arXiv Detail & Related papers (2021-06-25T20:09:23Z) - Stochasticity in Neural ODEs: An Empirical Study [68.8204255655161]
Regularization of neural networks (e.g. dropout) is a widespread technique in deep learning that allows for better generalization.
We show that data augmentation during training improves the performance of both deterministic and stochastic versions of the same model.
However, the improvements obtained by data augmentation eliminate the empirical regularization gains of stochasticity, making the performance gap between neural ODE and neural SDE negligible.
arXiv Detail & Related papers (2020-02-22T22:12:56Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information above and is not responsible for any consequences of its use.