MRI Contrast Enhancement Kinetics World Model
- URL: http://arxiv.org/abs/2602.19285v1
- Date: Sun, 22 Feb 2026 17:39:31 GMT
- Title: MRI Contrast Enhancement Kinetics World Model
- Authors: Jindi Kong, Yuting He, Cong Xia, Rongjun Ge, Shuo Li
- Abstract summary: Applying world models to simulate the contrast enhancement kinetics in the human body enables continuous contrast-free dynamics. The low temporal resolution in MRI acquisition restricts the training of world models, leading to a sparsely sampled dataset. We propose the MRI Contrast Enhancement Kinetics World model (MRI CEKWorld) with SpatioTemporal Consistency Learning (STCL).
- Score: 8.691568608551444
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Clinical MRI contrast acquisition suffers from inefficient information yield, which presents as a mismatch between the risky and costly acquisition protocol and the fixed and sparse acquisition sequence. Applying world models to simulate the contrast enhancement kinetics in the human body enables continuous contrast-free dynamics. However, the low temporal resolution in MRI acquisition restricts the training of world models, leading to a sparsely sampled dataset. Directly training a generative model to capture the kinetics leads to two limitations: (a) Due to the absence of data at missing time points, the model tends to overfit to irrelevant features, leading to content distortion. (b) Due to the lack of continuous temporal supervision, the model fails to learn the continuous kinetics law over time, causing temporal discontinuities. For the first time, we propose the MRI Contrast Enhancement Kinetics World model (MRI CEKWorld) with SpatioTemporal Consistency Learning (STCL). For (a), guided by the spatial law that patient-level structures remain consistent during enhancement, we propose Latent Alignment Learning (LAL), which constructs a patient-specific template and constrains contents to align with it. For (b), guided by the temporal law that the kinetics follow a consistent smooth trend, we propose Latent Difference Learning (LDL), which extends the unobserved intervals by interpolation and constrains smooth variations in the latent space among interpolated sequences. Extensive experiments on two datasets show that MRI CEKWorld produces more realistic content and kinetics. Code will be available at https://github.com/DD0922/MRI-Contrast-Enhancement-Kinetics-World-Model.
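The abstract's two consistency terms can be illustrated with toy losses. This is a minimal sketch of the idea only, not the paper's implementation: the template construction, latent shapes, and the linear interpolation scheme are all assumptions.

```python
import numpy as np

def latent_alignment_loss(latents, template):
    # LAL-style term (sketch): keep every frame's latent close to a
    # patient-specific structural template (here, a fixed reference vector).
    return float(np.mean((latents - template) ** 2))

def latent_difference_loss(latents, factor=2):
    # LDL-style term (sketch): densify the sparse sequence by linear
    # interpolation, then penalize second-order differences so the
    # enhancement trajectory varies smoothly over time.
    T, D = latents.shape
    grid = np.linspace(0.0, T - 1.0, (T - 1) * factor + 1)
    dense = np.stack(
        [np.interp(grid, np.arange(T), latents[:, d]) for d in range(D)],
        axis=1,
    )
    second_diff = dense[2:] - 2.0 * dense[1:-1] + dense[:-2]
    return float(np.mean(second_diff ** 2))

# Toy latent sequence: 4 observed time points, 3-dim latents.
z = np.array([[0.0, 0.0, 0.0],
              [1.0, 0.5, 0.2],
              [2.0, 1.0, 0.4],
              [3.0, 1.5, 0.6]])
template = z.mean(axis=0)
lal = latent_alignment_loss(z, template)
ldl = latent_difference_loss(z)  # near zero: this trajectory is already linear
```

In a training loop, both terms would be added to the generative objective; here they simply show how a template constraint and an interpolated smoothness constraint can be expressed.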
Related papers
- Revisiting Global Token Mixing in Task-Dependent MRI Restoration: Insights from Minimal Gated CNN Baselines [43.505945728449774]
Global token mixing has become a popular model design choice for MRI restoration. We ask whether global token mixing is actually beneficial in each individual task across three representative settings. For accelerated MRI reconstruction, the minimal unrolled gated-CNN baseline is already highly competitive. For super-resolution, where low-frequency k-space data are largely preserved by the controlled low-pass degradation, local gated models remain competitive. For denoising with pronounced spatially heteroscedastic noise, token-mixing models achieve the strongest overall performance.
arXiv Detail & Related papers (2026-03-02T04:57:52Z)
- Moving Beyond Functional Connectivity: Time-Series Modeling for fMRI-Based Brain Disorder Classification [8.837732238971187]
Functional magnetic resonance imaging (fMRI) enables non-invasive brain disorder classification by capturing blood-oxygen-level-dependent (BOLD) signals. Most existing methods rely on functional connectivity (FC) via Pearson correlation. We benchmark state-of-the-art temporal models on raw BOLD signals across five public datasets.
arXiv Detail & Related papers (2026-02-09T04:42:42Z)
- Scalable Spatio-Temporal SE(3) Diffusion for Long-Horizon Protein Dynamics [51.85385061275941]
Molecular dynamics (MD) simulations remain the gold standard for studying protein dynamics. Recent generative models have shown promise in accelerating simulations, yet they struggle with long-horizon generation. We present STAR-MD, a scalable diffusion model that generates physically plausible protein trajectories over micro-scale timescales.
arXiv Detail & Related papers (2026-02-02T14:13:28Z)
- NeuroSSM: Multiscale Differential State-Space Modeling for Context-Aware fMRI Analysis [4.753690672619091]
We propose NeuroSSM, a selective state-space architecture designed for end-to-end analysis of raw BOLD signals in fMRI time series. NeuroSSM addresses these limitations through two complementary design components. Experiments on clinical and non-clinical datasets demonstrate that NeuroSSM achieves competitive performance and efficiency against state-of-the-art fMRI analysis methods.
arXiv Detail & Related papers (2026-01-03T16:35:45Z)
- Learning Patient-Specific Disease Dynamics with Latent Flow Matching for Longitudinal Imaging Generation [17.33607122354623]
Understanding disease progression is a central clinical challenge with implications for early diagnosis and personalized treatment. We propose to treat the disease dynamics as a velocity field and leverage Flow Matching (FM) to align the temporal evolution of patient data. We present $$-LFM, a framework for modeling patient-specific latent progression with flow matching.
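The flow-matching idea in this summary, treating the dynamics as a velocity field, can be sketched generically. This is not the paper's method; the straight-line path, the oracle model, and all names here are illustrative assumptions.

```python
import numpy as np

def flow_matching_loss(velocity_model, x0, x1, rng):
    # Generic conditional flow matching (sketch): sample a time t, move
    # along the straight path x_t = (1 - t) * x0 + t * x1, and regress the
    # model's velocity onto the constant target velocity x1 - x0.
    t = rng.uniform(size=(x0.shape[0], 1))
    x_t = (1.0 - t) * x0 + t * x1
    target_velocity = x1 - x0
    pred = velocity_model(x_t, t)
    return float(np.mean((pred - target_velocity) ** 2))

rng = np.random.default_rng(0)
x_early = rng.normal(size=(8, 4))  # e.g. latents of an early scan
x_late = rng.normal(size=(8, 4))   # e.g. latents of a later scan

# An oracle that returns the true path velocity drives the loss to zero.
oracle = lambda x_t, t: x_late - x_early
loss = flow_matching_loss(oracle, x_early, x_late, rng)  # 0.0
```

A trained velocity model could then be integrated forward from an early-scan latent to generate later time points, which is the general mechanism the abstract alludes to.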
arXiv Detail & Related papers (2025-12-09T23:13:54Z)
- Flow-Guided Implicit Neural Representation for Motion-Aware Dynamic MRI Reconstruction [19.66198868468091]
Dynamic magnetic resonance imaging (dMRI) captures temporally resolved anatomy but is often challenged by limited sampling and motion-induced artifacts. In this work, we propose a novel implicit neural representation framework that jointly models both the dynamic image sequence and its underlying motion field. Experiments on dynamic cardiac MRI datasets demonstrate that the method outperforms state-of-the-art motion-inspired and deep learning approaches.
arXiv Detail & Related papers (2025-11-21T04:51:23Z)
- Drift No More? Context Equilibria in Multi-Turn LLM Interactions [58.69551510148673]
Context drift is the gradual divergence of a model's outputs from goal-consistent behavior across turns. Unlike single-turn errors, drift unfolds temporally and is poorly captured by static evaluation metrics. We show that multi-turn drift can be understood as a controllable equilibrium phenomenon rather than as inevitable decay.
arXiv Detail & Related papers (2025-10-09T04:48:49Z)
- X$^{2}$-Gaussian: 4D Radiative Gaussian Splatting for Continuous-time Tomographic Reconstruction [64.2059940799033]
Current methods discretize temporal resolution into fixed phases with respiratory gating devices. X$^{2}$-Gaussian, a novel framework, enables continuous-time 4DCT reconstruction by integrating dynamic radiative splatting with self-supervised respiratory motion learning.
arXiv Detail & Related papers (2025-03-27T17:59:57Z)
- Trajectory Flow Matching with Applications to Clinical Time Series Modeling [77.58277281319253]
Trajectory Flow Matching (TFM) trains a Neural SDE in a simulation-free manner, bypassing backpropagation through the dynamics. We demonstrate improved performance on three clinical time series datasets in terms of absolute performance and uncertainty prediction.
arXiv Detail & Related papers (2024-10-28T15:54:50Z)
- Synthesizing Late-Stage Contrast Enhancement in Breast MRI: A Comprehensive Pipeline Leveraging Temporal Contrast Enhancement Dynamics [0.3499870393443268]
This study presents a pipeline for synthesizing late-phase DCE-MRI images from early-phase data. The proposed approach introduces a novel loss function, Time Intensity Loss (TI-loss), leveraging the temporal behavior of contrast agents to guide the training of a generative model. Two metrics are proposed to evaluate image quality: the Contrast Agent Pattern Score ($\mathcal{CP}_s$), which validates enhancement patterns in annotated regions, and the Average Difference in Enhancement ($\mathcal{ED}$), measuring differences between real and generated enhancements.
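The summary does not give the exact definition of $\mathcal{ED}$; a minimal sketch of the stated idea, assuming "enhancement" means post-contrast minus pre-contrast signal and that the average is taken over an optional annotated region:

```python
import numpy as np

def average_enhancement_difference(pre, real_post, gen_post, mask=None):
    # Enhancement is the signal change relative to the pre-contrast image;
    # this metric averages the absolute gap between real and generated
    # enhancement, optionally restricted to an annotated region.
    real_enh = real_post - pre
    gen_enh = gen_post - pre
    gap = np.abs(real_enh - gen_enh)
    if mask is not None:
        gap = gap[mask]
    return float(gap.mean())

pre = np.zeros((4, 4))
real_post = np.full((4, 4), 0.75)  # uniform real enhancement
gen_post = np.full((4, 4), 0.5)    # weaker generated enhancement
ed = average_enhancement_difference(pre, real_post, gen_post)  # 0.25
```

The function names and the masking behavior are assumptions; the actual metric in the cited paper may normalize or aggregate differently.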
arXiv Detail & Related papers (2024-09-03T04:31:49Z)
- Learning Successor Features with Distributed Hebbian Temporal Memory [44.99833362998488]
This paper presents a novel approach to address the challenge of online sequence learning for decision making under uncertainty. The proposed algorithm, Distributed Hebbian Temporal Memory (DHTM), is based on the factor graph formalism and a multi-component neuron model. Experimental results show that DHTM outperforms LSTM, RWKV and a biologically inspired HMM-like algorithm, CSCG, on non-stationary data sets.
arXiv Detail & Related papers (2023-10-20T10:03:14Z)
- Individualized Dosing Dynamics via Neural Eigen Decomposition [51.62933814971523]
We introduce the Neural Eigen Differential Equation algorithm (NESDE).
NESDE provides individualized modeling, tunable generalization to new treatment policies, and fast, continuous, closed-form prediction.
We demonstrate the robustness of NESDE in both synthetic and real medical problems, and use the learned dynamics to publish simulated medical gym environments.
arXiv Detail & Related papers (2023-06-24T17:01:51Z)
- A Neural PDE Solver with Temporal Stencil Modeling [44.97241931708181]
Recent Machine Learning (ML) models have shown new promises in capturing important dynamics in high-resolution signals.
This study shows that significant information is often lost in the low-resolution down-sampled features.
We propose a new approach, which combines the strengths of advanced time-series sequence modeling and state-of-the-art neural PDE solvers.
arXiv Detail & Related papers (2023-02-16T06:13:01Z)
This list is automatically generated from the titles and abstracts of the papers in this site.