Post-Regularization Confidence Bands for Ordinary Differential Equations
- URL: http://arxiv.org/abs/2110.12510v2
- Date: Sun, 4 Feb 2024 19:55:36 GMT
- Title: Post-Regularization Confidence Bands for Ordinary Differential Equations
- Authors: Xiaowu Dai and Lexin Li
- Abstract summary: We construct confidence bands for individual regulatory functions in ODEs with unknown functionals and noisy data observations.
We show that the constructed confidence band has the desired asymptotic coverage probability, and the recovered regulatory network approaches the truth with probability tending to one.
- Score: 6.3582148777824115
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Ordinary differential equations (ODEs) are an important tool to study the
dynamics of a system of biological and physical processes. A central question
in ODE modeling is to infer the significance of the individual regulatory effect of
one signal variable on another. However, building a confidence band for an ODE with
unknown regulatory relations is challenging, and it remains largely an open
question. In this article, we construct a post-regularization confidence band for
an individual regulatory function in an ODE with unknown functionals and noisy data
observations. Our proposal is the first of its kind, and is built on two novel
ingredients. The first is a new localized kernel learning approach that
combines reproducing kernel learning with local Taylor approximation, and the
second is a new de-biasing method that tackles infinite-dimensional functionals
and additional measurement errors. We show that the constructed confidence band
has the desired asymptotic coverage probability, and the recovered regulatory
network approaches the truth with probability tending to one. We establish the
theoretical properties when the number of variables in the system can be either
smaller or larger than the number of sampling time points, and we study the
regime-switching phenomenon. We demonstrate the efficacy of the proposed method
through both simulations and illustrations with two data applications.
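To make the two ingredients concrete, here is a minimal Python sketch of the general recipe on a one-variable toy ODE: kernel ridge (reproducing kernel) smoothing of the noisy trajectory, a local Taylor (local-linear) step to read off derivatives, and a second kernel regression of derivatives on states to estimate a regulatory function, followed by a naive residual-based band. The toy model, bandwidths, penalties, and the plug-in band are illustrative assumptions, not the paper's estimator; in particular, the de-biasing step that delivers valid post-regularization coverage is not reproduced here.
```python
# Illustrative sketch only: kernel smoothing + local-Taylor derivative
# estimation + a naive plug-in band for a one-variable toy ODE. The paper's
# de-biasing step, which is what yields valid post-regularization coverage,
# is NOT implemented here.
import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(0)

# Noisy observations of the logistic trajectory x(t) = 1 / (1 + exp(-t)),
# which solves x'(t) = F(x(t)) with regulatory function F(x) = x * (1 - x).
t = np.linspace(-4.0, 4.0, 100)
y = 1.0 / (1.0 + np.exp(-t)) + 0.02 * rng.standard_normal(t.size)

# Step 1: reproducing-kernel (kernel ridge) smoothing of the trajectory.
smoother = KernelRidge(kernel="rbf", alpha=1e-2, gamma=0.5)
smoother.fit(t[:, None], y)
x_hat = smoother.predict(t[:, None])

# Step 2: local Taylor (local-linear) approximation of the derivative x'(t).
def local_slope(t0, bandwidth=0.5):
    w = np.exp(-0.5 * ((t - t0) / bandwidth) ** 2)
    X = np.column_stack([np.ones_like(t), t - t0])
    beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * x_hat))
    return beta[1]

dx_hat = np.array([local_slope(t0) for t0 in t])

# Step 3: estimate F by kernel ridge regression of derivatives on states.
reg = KernelRidge(kernel="rbf", alpha=1e-2, gamma=5.0)
reg.fit(x_hat[:, None], dx_hat)
grid = np.linspace(0.05, 0.95, 5)
F_hat = reg.predict(grid[:, None])

# Step 4: naive residual-based pointwise band (NOT the paper's de-biased band).
resid = dx_hat - reg.predict(x_hat[:, None])
half_width = 1.96 * resid.std()
print(np.round(F_hat, 3), "+/-", round(half_width, 3))
print(np.round(grid * (1 - grid), 3))   # true F on the same grid
```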
Related papers
- Learning Controlled Stochastic Differential Equations [61.82896036131116]
This work proposes a novel method for estimating both drift and diffusion coefficients of continuous, multidimensional, nonlinear controlled stochastic differential equations with non-uniform diffusion.
We provide strong theoretical guarantees, including finite-sample bounds for $L^2$, $L^\infty$, and risk metrics, with learning rates adaptive to coefficients' regularity.
Our method is available as an open-source Python library.
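As a rough illustration of what estimating drift and diffusion from discrete data involves (not the paper's method or its guarantees), the sketch below simulates a one-dimensional controlled SDE with an Euler-Maruyama scheme, regresses increments on features of the state and control to recover the drift, and regresses squared increments to recover the state-dependent diffusion. The model, features, and step size are assumptions made for the toy.
```python
# Minimal sketch of discretization-based drift/diffusion estimation for a
# 1-d controlled SDE  dX_t = b(X_t, u_t) dt + sigma(X_t) dW_t.
# This is a generic Euler-Maruyama regression, not the paper's estimator.
import numpy as np

rng = np.random.default_rng(1)
dt, n = 1e-3, 200_000

def b_true(x, u):          # drift (unknown in practice)
    return -x + u

def sigma_true(x):         # non-uniform diffusion (unknown in practice)
    return 0.5 + 0.2 * np.tanh(x)

# Simulate one long controlled trajectory.
x = np.empty(n); x[0] = 0.0
u = np.sin(np.linspace(0.0, 50.0, n))        # known control input
for k in range(n - 1):
    x[k + 1] = (x[k] + b_true(x[k], u[k]) * dt
                + sigma_true(x[k]) * np.sqrt(dt) * rng.standard_normal())

dX = np.diff(x)
feats = np.column_stack([np.ones(n - 1), x[:-1], u[:-1]])   # drift features

# Drift: least squares of increments/dt on features of (state, control).
drift_coef, *_ = np.linalg.lstsq(feats, dX / dt, rcond=None)

# Diffusion: squared increments estimate sigma^2(x) dt; regress on simple
# state features (a constant plus a tanh term, chosen only for this toy).
sq_feats = np.column_stack([np.ones(n - 1), np.tanh(x[:-1])])
var_coef, *_ = np.linalg.lstsq(sq_feats, dX**2 / dt, rcond=None)

print("drift coefficients, roughly [0, -1, 1]:", drift_coef.round(2))
print("approx. sigma^2 at x = 0, true 0.25:", round(float(var_coef[0]), 3))
```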
arXiv Detail & Related papers (2024-11-04T11:09:58Z) - Foundational Inference Models for Dynamical Systems [5.549794481031468]
We offer a fresh perspective on the classical problem of imputing missing time series data, whose underlying dynamics are assumed to be determined by ODEs.
We propose a novel supervised learning framework for zero-shot time series imputation, through parametric functions satisfying some (hidden) ODEs.
We empirically demonstrate that one and the same (pretrained) recognition model can perform zero-shot imputation across 63 distinct time series with missing values.
arXiv Detail & Related papers (2024-02-12T11:48:54Z) - Generative Adversarial Networks to infer velocity components in rotating
turbulent flows [2.0873604996221946]
We show that the CNN and GAN always outperform EPOD in both point-wise and statistical reconstructions.
The analysis is performed using standard validation tools based on the $L_2$ spatial distance between the prediction and the ground truth.
arXiv Detail & Related papers (2023-01-18T13:59:01Z) - Discovering ordinary differential equations that govern time-series [65.07437364102931]
We propose a transformer-based sequence-to-sequence model that recovers scalar autonomous ordinary differential equations (ODEs) in symbolic form from time-series data of a single observed solution of the ODE.
Our method is efficiently scalable: after one-time pretraining on a large set of ODEs, we can infer the governing laws of a new observed solution in a few forward passes of the model.
arXiv Detail & Related papers (2022-11-05T07:07:58Z) - Identifiability and Asymptotics in Learning Homogeneous Linear ODE Systems from Discrete Observations [114.17826109037048]
Ordinary Differential Equations (ODEs) have recently gained a lot of attention in machine learning.
However, theoretical aspects, e.g., identifiability and properties of statistical estimation, are still obscure.
This paper derives a sufficient condition for the identifiability of homogeneous linear ODE systems from a sequence of equally-spaced error-free observations sampled from a single trajectory.
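In that setting, consecutive error-free observations of x' = Ax at spacing delta satisfy x(t + delta) = exp(A * delta) x(t), so a natural estimator recovers the one-step transition matrix by least squares and takes its principal matrix logarithm; whether that logarithm pins down A uniquely is exactly where the identifiability questions arise. The sketch below illustrates this standard construction, not the paper's analysis, and the system matrix and spacing are made-up values.
```python
# Sketch: recover A in x' = A x from one equally spaced, error-free trajectory
# via the transition matrix exp(A * delta) and its principal matrix logarithm.
# Identifiability subtleties (when logm is unique/real) are the paper's topic.
import numpy as np
from scipy.linalg import expm, logm

rng = np.random.default_rng(2)
A = np.array([[-0.5, 1.0],
              [-1.0, -0.5]])          # true system matrix (toy choice)
delta, n_steps = 0.1, 50

# Error-free observations x_k = exp(A * k * delta) x_0 from a single trajectory.
x0 = rng.standard_normal(2)
T = expm(A * delta)
X = np.stack([np.linalg.matrix_power(T, k) @ x0 for k in range(n_steps)])

# Least-squares estimate of the one-step transition matrix, then its logarithm.
X_now, X_next = X[:-1], X[1:]
T_hat = np.linalg.lstsq(X_now, X_next, rcond=None)[0].T
A_hat = logm(T_hat).real / delta

print(np.round(A_hat, 3))             # should closely match A
```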
arXiv Detail & Related papers (2022-10-12T06:46:38Z) - Calibrating multi-dimensional complex ODE from noisy data via deep
neural networks [7.77129750333676]
Ordinary differential equations (ODEs) are widely used to model complex dynamics that arise in biology, chemistry, engineering, finance, physics, etc.
We propose a two-stage nonparametric approach to address this problem.
We first extract the de-noised data and their higher-order derivatives using a boundary kernel method, and then feed them into a sparsely connected deep neural network with a ReLU activation function.
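A hedged sketch of that two-stage pipeline on a small toy system follows: a local polynomial smoother stands in for the boundary kernel method, and a small dense ReLU network (scikit-learn's MLPRegressor) stands in for the sparsely connected architecture in the paper; both substitutions and all tuning choices are assumptions for illustration.
```python
# Sketch of the two-stage idea for a 2-d ODE x'(t) = f(x(t)) from noisy data:
# stage 1 smooths each coordinate and estimates its derivative (a local
# polynomial smoother here, standing in for the boundary kernel method);
# stage 2 fits a ReLU network mapping states to derivatives (a small dense
# MLP here, standing in for the paper's sparsely connected network).
import numpy as np
from scipy.integrate import solve_ivp
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(3)

def f_true(t, x):                    # damped oscillator (unknown in practice)
    return [x[1], -x[0] - 0.2 * x[1]]

t = np.linspace(0.0, 10.0, 300)
sol = solve_ivp(f_true, (0.0, 10.0), [1.0, 0.0], t_eval=t)
y = sol.y.T + 0.02 * rng.standard_normal((t.size, 2))   # noisy observations

# Stage 1: local-quadratic smoothing of each coordinate -> states/derivatives.
def smooth_and_derivative(series, t0, bandwidth=0.4):
    w = np.exp(-0.5 * ((t - t0) / bandwidth) ** 2)
    X = np.column_stack([np.ones_like(t), t - t0, (t - t0) ** 2])
    beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * series))
    return beta[0], beta[1]          # value and first derivative at t0

states, derivs = [], []
for t0 in t:
    vals = [smooth_and_derivative(y[:, j], t0) for j in range(2)]
    states.append([v[0] for v in vals])
    derivs.append([v[1] for v in vals])
states, derivs = np.array(states), np.array(derivs)

# Stage 2: ReLU network regression of derivatives on states.
net = MLPRegressor(hidden_layer_sizes=(64, 64), activation="relu",
                   max_iter=5000, random_state=0)
net.fit(states, derivs)

test = np.array([[0.5, -0.3]])
print(np.round(net.predict(test), 3))        # compare with f_true at the test state
print(np.round(f_true(0.0, test[0]), 3))
```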
arXiv Detail & Related papers (2021-06-07T13:17:16Z) - Data-driven discovery of interacting particle systems using Gaussian
processes [3.0938904602244346]
We study the data-driven discovery of distance-based interaction laws in second-order interacting particle systems.
We propose a learning approach that models the latent interaction kernel functions as Gaussian processes.
Numerical results on systems that exhibit different collective behaviors demonstrate that our approach learns efficiently from scarce, noisy trajectory data.
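The toy below illustrates only the modeling choice of placing a Gaussian process on the radial interaction kernel: with two particles the pairwise contribution can be read off directly from the accelerations, so the problem collapses to ordinary GP regression. The aggregation over many particle pairs, which is what the paper actually handles, is deliberately avoided here, and all data-generating choices are assumptions.
```python
# Toy sketch: with only two particles, the pairwise interaction can be read
# off directly from the accelerations, so learning the radial interaction
# kernel phi(r) reduces to Gaussian-process regression on
# (distance, projected acceleration) pairs. With many particles, phi enters
# only through sums over all pairs, the harder setting the paper addresses.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(4)
phi_true = lambda r: np.exp(-r)               # true interaction kernel (toy)

# Random two-particle configurations in the plane; noisy accelerations follow
# a1 = 0.5 * phi(||x2 - x1||) * (x2 - x1)  (unit masses, N = 2).
n = 150
x1 = rng.uniform(-2.0, 2.0, (n, 2))
x2 = rng.uniform(-2.0, 2.0, (n, 2))
diff = x2 - x1
r = np.linalg.norm(diff, axis=1)
keep = r > 0.3                                # avoid near-collisions in the toy
diff, r = diff[keep], r[keep]
a1 = 0.5 * phi_true(r)[:, None] * diff + 0.02 * rng.standard_normal(diff.shape)

# Recover noisy samples of phi(r) by projecting a1 onto the separation vector.
phi_obs = 2.0 * np.sum(a1 * diff, axis=1) / r**2

# Model phi as a Gaussian process and fit it to the recovered samples.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0)
                              + WhiteKernel(noise_level=1e-2),
                              normalize_y=True)
gp.fit(r[:, None], phi_obs)

grid = np.linspace(0.5, 3.5, 4)[:, None]
print(np.round(gp.predict(grid), 3))          # compare with exp(-grid)
print(np.round(np.exp(-grid).ravel(), 3))
```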
arXiv Detail & Related papers (2021-06-04T22:00:53Z) - Leveraging Global Parameters for Flow-based Neural Posterior Estimation [90.21090932619695]
Inferring the parameters of a model based on experimental observations is central to the scientific method.
A particularly challenging setting is when the model is strongly indeterminate, i.e., when distinct sets of parameters yield identical observations.
We present a method for cracking such indeterminacy by exploiting additional information conveyed by an auxiliary set of observations sharing global parameters.
arXiv Detail & Related papers (2021-02-12T12:23:13Z) - Disentangling Observed Causal Effects from Latent Confounders using
Method of Moments [67.27068846108047]
We provide guarantees on identifiability and learnability under mild assumptions.
We develop efficient algorithms based on coupled tensor decomposition with linear constraints to obtain scalable and guaranteed solutions.
arXiv Detail & Related papers (2021-01-17T07:48:45Z) - Provably Efficient Neural Estimation of Structural Equation Model: An
Adversarial Approach [144.21892195917758]
We study estimation in a class of generalized structural equation models (SEMs).
We formulate the linear operator equation as a min-max game, where both players are parameterized by neural networks (NNs), and learn the parameters of these neural networks using gradient descent.
For the first time, we provide a tractable estimation procedure for SEMs based on NNs with provable convergence and without the need for sample splitting.
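The sketch below shows a generic adversarial (min-max) treatment of a conditional moment restriction, with both the estimator and the test function parameterized by small neural networks and trained by simultaneous gradient descent/ascent. The specific objective, regularizer, data-generating process, and architectures are illustrative assumptions and are not claimed to match the paper's formulation.
```python
# Generic adversarial (min-max) sketch for a conditional moment restriction
#   E[ Y - f(X) | Z ] = 0,
# with the estimator f and the test function g both small neural networks
# trained by simultaneous gradient descent/ascent. Illustrative only; not the
# paper's exact objective or architecture.
import torch
import torch.nn as nn

torch.manual_seed(0)

def mlp():
    return nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, 1))

# Toy instrumental-variable-style data: Z instrument, X endogenous, Y outcome.
n = 2000
Z = torch.randn(n, 1)
U = torch.randn(n, 1)                            # unobserved confounder
X = Z + 0.5 * U + 0.1 * torch.randn(n, 1)
Y = torch.sin(X) + U + 0.1 * torch.randn(n, 1)   # structural part: sin(X)

f, g = mlp(), mlp()
opt_f = torch.optim.Adam(f.parameters(), lr=1e-3)
opt_g = torch.optim.Adam(g.parameters(), lr=1e-3)

for step in range(3000):
    resid = Y - f(X)
    gz = g(Z)
    # min over f, max over g of  E[resid * g(Z)] - E[g(Z)^2]
    game = (resid * gz).mean() - (gz ** 2).mean()

    opt_f.zero_grad(); opt_g.zero_grad()
    game.backward()
    opt_f.step()                                 # descent step for f
    for p in g.parameters():                     # ascent step for g
        p.grad = -p.grad
    opt_g.step()

# Rough check: values should trend toward sin(0) = 0 and sin(1) ~ 0.84.
print(f(torch.tensor([[0.0], [1.0]])).detach().ravel())
```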
arXiv Detail & Related papers (2020-07-02T17:55:47Z)
This list is automatically generated from the titles and abstracts of the papers on this site.