Training-free score-based diffusion for parameter-dependent stochastic dynamical systems
- URL: http://arxiv.org/abs/2602.02113v1
- Date: Mon, 02 Feb 2026 13:54:36 GMT
- Title: Training-free score-based diffusion for parameter-dependent stochastic dynamical systems
- Authors: Minglei Yang, Sicheng He
- Abstract summary: We present a training-free conditional diffusion model framework for learning flow maps of parameter-dependent SDEs. A joint kernel-weighted Monte Carlo estimator approximates the conditional score function using trajectory data sampled at discrete parameter values. The resulting generative model produces sample trajectories for any parameter value within the training range without retraining.
- Score: 2.4755898204110642
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Simulating parameter-dependent stochastic differential equations (SDEs) presents significant computational challenges, as separate high-fidelity simulations are typically required for each parameter value of interest. Despite the success of machine learning methods in learning SDE dynamics, existing approaches either require expensive neural network training for score function estimation or lack the ability to handle continuous parameter dependence. We present a training-free conditional diffusion model framework for learning stochastic flow maps of parameter-dependent SDEs, where both drift and diffusion coefficients depend on physical parameters. The key technical innovation is a joint kernel-weighted Monte Carlo estimator that approximates the conditional score function using trajectory data sampled at discrete parameter values, enabling interpolation across both state space and the continuous parameter domain. Once trained, the resulting generative model produces sample trajectories for any parameter value within the training range without retraining, significantly accelerating parameter studies, uncertainty quantification, and real-time filtering applications. The performance of the proposed approach is demonstrated via three numerical examples of increasing complexity, showing accurate approximation of conditional distributions across varying parameter values.
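As a rough illustration of the joint kernel-weighted Monte Carlo idea described in the abstract, the sketch below estimates a conditional score directly from stored trajectory samples, interpolating across the parameter axis with a Gaussian kernel. The noise schedules, kernel choice, and function names are illustrative assumptions, not the paper's exact estimator.

```python
import numpy as np

def conditional_score(z, theta, t, data_x, data_theta, h_theta=0.1):
    """Kernel-weighted Monte Carlo estimate of the conditional score
    grad_z log p_t(z | theta) for a forward process z = a_t * x + s_t * eps.
    Illustrative sketch: schedules and kernel bandwidth are assumptions."""
    a_t = np.exp(-0.5 * t)                 # assumed mean-decay schedule
    s2_t = 1.0 - np.exp(-t)                # assumed noise-variance schedule
    # Gaussian kernel weights along the parameter axis (interpolation in theta)
    w_theta = np.exp(-0.5 * ((data_theta - theta) / h_theta) ** 2)
    # Gaussian likelihood of z under each noised training sample
    diff = z - a_t * data_x                # shape (N, d)
    log_like = -0.5 * np.sum(diff**2, axis=1) / s2_t
    w = w_theta * np.exp(log_like - log_like.max())
    w /= w.sum()
    # score of the resulting Gaussian mixture: weighted component scores
    return (w[:, None] * (a_t * data_x - z)).sum(axis=0) / s2_t
```

Because the estimator is a closed-form function of the stored data, no network training is needed; the cost is one weighted pass over the trajectory samples per score evaluation.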
Related papers
- Conditioning on PDE Parameters to Generalise Deep Learning Emulation of Stochastic and Chaotic Dynamics [0.1753733541634709]
We present a deep learning emulator for stochastic and chaotic dynamical systems conditioned on the parameter values of the underlying partial differential equations (PDEs). Our approach involves pre-training the model on a single parameter domain, followed by fine-tuning on a smaller, yet diverse dataset, enabling generalisation across a broad range of parameter values. This enables computationally efficient pre-training on smaller domains while requiring only a small additional dataset to learn how to generalise to larger domain sizes.
arXiv Detail & Related papers (2025-09-11T16:37:45Z) - Efficient Transformed Gaussian Process State-Space Models for Non-Stationary High-Dimensional Dynamical Systems [49.819436680336786]
We propose an efficient transformed Gaussian process state-space model (ETGPSSM) for scalable and flexible modeling of high-dimensional, non-stationary dynamical systems. Specifically, our ETGPSSM integrates a single shared GP with input-dependent normalizing flows, yielding an expressive implicit process prior that captures complex, non-stationary transition dynamics. Our ETGPSSM outperforms existing GPSSMs and neural network-based SSMs in terms of computational efficiency and accuracy.
arXiv Detail & Related papers (2025-03-24T03:19:45Z) - In-Context Learning of Stochastic Differential Equations with Foundation Inference Models [6.785438664749581]
Stochastic differential equations (SDEs) describe dynamical systems where deterministic flows, governed by a drift function, are superimposed with random fluctuations, dictated by a diffusion function. We introduce FIM-SDE (Foundation Inference Model for SDEs), a pretrained recognition model that delivers accurate in-context estimation of the drift and diffusion functions of low-dimensional SDEs. We demonstrate that FIM-SDE achieves robust in-context function estimation across a wide range of synthetic and real-world processes.
arXiv Detail & Related papers (2025-02-26T11:04:02Z) - Learning Controlled Stochastic Differential Equations [61.82896036131116]
This work proposes a novel method for estimating both drift and diffusion coefficients of continuous, multidimensional, nonlinear controlled stochastic differential equations with non-uniform diffusion.
We provide strong theoretical guarantees, including finite-sample bounds for $L^2$, $L^\infty$, and risk metrics, with learning rates adaptive to the coefficients' regularity.
Our method is available as an open-source Python library.
arXiv Detail & Related papers (2024-11-04T11:09:58Z) - On the Trajectory Regularity of ODE-based Diffusion Sampling [79.17334230868693]
Diffusion-based generative models use differential equations to establish a smooth connection between a complex data distribution and a tractable prior distribution.
In this paper, we identify several intriguing trajectory properties in the ODE-based sampling process of diffusion models.
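The smooth ODE connection mentioned above can be made concrete with a minimal Euler integrator for the probability-flow ODE of a variance-preserving diffusion (here with beta fixed to 1). This is a generic sketch under stated assumptions, not the sampler analyzed in the paper.

```python
import numpy as np

def pf_ode_sample(z0, score_fn, t_grid):
    """Euler integration of the probability-flow ODE
    dz/dt = -0.5 * (z + score(z, t)) for a VP diffusion with beta = 1,
    run backwards from the prior at t_grid[0] to data at t_grid[-1].
    Minimal sketch; practical samplers use higher-order solvers."""
    z = np.array(z0, dtype=float)
    for t_hi, t_lo in zip(t_grid[:-1], t_grid[1:]):
        dt = t_hi - t_lo                        # positive step size
        z = z + 0.5 * (z + score_fn(z, t_hi)) * dt
    return z
```

For Gaussian data the score is available in closed form, so the deterministic trajectory from prior noise to a data sample is easy to verify numerically.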
arXiv Detail & Related papers (2024-05-18T15:59:41Z) - Estimating the Distribution of Parameters in Differential Equations with Repeated Cross-Sectional Data [5.79648227233365]
In economics, politics, and biology, the observed data points in a time series are often obtained independently.
Traditional methods for parameter estimation in differential equations have limitations in estimating the shape of parameter distributions.
We introduce a novel method, Estimation of Parameter Distribution (EPD), providing an accurate distribution of parameters without loss of data information.
arXiv Detail & Related papers (2024-04-23T10:01:43Z) - Diffusion Tempering Improves Parameter Estimation with Probabilistic Integrators for Ordinary Differential Equations [34.500484733973536]
Ordinary differential equations (ODEs) are widely used to describe dynamical systems in science, but identifying parameters that explain experimental measurements is challenging.
We propose diffusion tempering, a novel regularization technique for probabilistic numerical methods which improves convergence of gradient-based parameter optimization in ODEs.
We demonstrate that our method is effective for dynamical systems of different complexity and show that it obtains reliable parameter estimates for a Hodgkin-Huxley model with a practically relevant number of parameters.
arXiv Detail & Related papers (2024-02-19T15:36:36Z) - Online Variational Sequential Monte Carlo [49.97673761305336]
We build upon the variational sequential Monte Carlo (VSMC) method, which provides computationally efficient and accurate model parameter estimation and Bayesian latent-state inference.
Online VSMC is capable of performing efficiently, entirely on-the-fly, both parameter estimation and particle proposal adaptation.
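The sequential Monte Carlo machinery that VSMC wraps in a variational objective reduces, in its simplest form, to a resample-propagate-reweight loop. The sketch below is that generic bootstrap particle filter step, not the paper's adapted proposal.

```python
import numpy as np

def bootstrap_pf_step(particles, weights, y, transition, log_lik, rng):
    """One resample-propagate-reweight step of a bootstrap particle filter,
    the basic SMC building block that VSMC-style methods optimize over.
    Illustrative sketch under assumed model interfaces."""
    n = len(particles)
    idx = rng.choice(n, size=n, p=weights)        # multinomial resampling
    particles = transition(particles[idx], rng)   # propagate through dynamics
    logw = log_lik(y, particles)                  # reweight by the observation
    w = np.exp(logw - logw.max())                 # stabilized normalization
    return particles, w / w.sum()
```

Variational SMC replaces the fixed transition proposal with a learned, parameterized one and tunes it by maximizing a lower bound on the marginal likelihood.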
arXiv Detail & Related papers (2023-12-19T21:45:38Z) - Deep learning-based estimation of time-dependent parameters in Markov models with application to nonlinear regression and SDEs [0.0]
We present a novel deep learning method for estimating time-dependent parameters in Markov processes through discrete sampling.
Our work contributes to SDE-based model parameter estimation, offering a versatile tool for diverse fields.
arXiv Detail & Related papers (2023-12-13T20:13:38Z) - Learning minimal representations of stochastic processes with variational autoencoders [52.99137594502433]
We introduce an unsupervised machine learning approach to determine the minimal set of parameters required to describe a process.
Our approach enables the autonomous discovery of unknown parameters describing processes.
arXiv Detail & Related papers (2023-07-21T14:25:06Z) - MINIMALIST: Mutual INformatIon Maximization for Amortized Likelihood Inference from Sampled Trajectories [61.3299263929289]
Simulation-based inference enables learning the parameters of a model even when its likelihood cannot be computed in practice.
One class of methods uses data simulated with different parameters to infer an amortized estimator for the likelihood-to-evidence ratio.
We show that this approach can be formulated in terms of mutual information between model parameters and simulated data.
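For intuition, in a fully Gaussian toy model the likelihood-to-evidence ratio that such amortized classifiers target has a closed form; the Bayes-optimal classifier's logit converges to its logarithm. The model below (x | theta ~ N(theta, 1), theta ~ N(0, 1)) is an illustrative assumption, not taken from the paper.

```python
import numpy as np

def likelihood_to_evidence_ratio(x, theta, prior_mu=0.0, prior_sd=1.0):
    """Closed-form ratio r(x, theta) = p(x | theta) / p(x) for the toy model
    x | theta ~ N(theta, 1), theta ~ N(prior_mu, prior_sd^2).  An amortized
    classifier's logit converges to log r; this is its analytic target."""
    # likelihood p(x | theta) in log space
    log_lik = -0.5 * (x - theta) ** 2 - 0.5 * np.log(2 * np.pi)
    # evidence p(x) is N(prior_mu, prior_sd^2 + 1) for this conjugate model
    ev_var = prior_sd**2 + 1.0
    log_ev = (-0.5 * (x - prior_mu) ** 2 / ev_var
              - 0.5 * np.log(2 * np.pi * ev_var))
    return np.exp(log_lik - log_ev)
```

By construction the ratio averages to 1 under the evidence distribution, which is the consistency property the mutual-information formulation exploits.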
arXiv Detail & Related papers (2021-06-03T12:59:16Z) - Online Statistical Inference for Stochastic Optimization via Kiefer-Wolfowitz Methods [8.890430804063705]
We first present the distribution of Polyak-Ruppert-averaging-type Kiefer-Wolfowitz (AKW) estimators.
The distributional result reflects the trade-off between statistical efficiency and function query complexity.
arXiv Detail & Related papers (2021-02-05T19:22:41Z)
This list is automatically generated from the titles and abstracts of the papers on this site.