Calibrating multi-dimensional complex ODE from noisy data via deep
neural networks
- URL: http://arxiv.org/abs/2106.03591v2
- Date: Tue, 19 Sep 2023 01:32:14 GMT
- Title: Calibrating multi-dimensional complex ODE from noisy data via deep
neural networks
- Authors: Kexuan Li, Fangfang Wang, Ruiqi Liu, Fan Yang, Zuofeng Shang
- Abstract summary: Ordinary differential equations (ODEs) are widely used to model complex dynamics that arise in biology, chemistry, engineering, finance, physics, etc.
We propose a two-stage nonparametric approach to address this problem.
We first extract the de-noised data and their higher-order derivatives using a boundary kernel method, and then feed them into a sparsely connected deep neural network with ReLU activation function.
- Score: 7.77129750333676
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Ordinary differential equations (ODEs) are widely used to model complex
dynamics that arise in biology, chemistry, engineering, finance, physics, etc.
Calibration of a complicated ODE system using noisy data is generally very
difficult. In this work, we propose a two-stage nonparametric approach to
address this problem. We first extract the de-noised data and their higher-order
derivatives using a boundary kernel method, and then feed them into a
sparsely connected deep neural network with ReLU activation function. Our
method is able to recover the ODE system without being subject to the curse of
dimensionality and complicated ODE structure. When the ODE possesses a general
modular structure, with each modular component involving only a few input
variables, and the network architecture is properly chosen, our method is
proven to be consistent. Theoretical properties are corroborated by an
extensive simulation study that demonstrates the validity and effectiveness of
the proposed method. Finally, we use our method to simultaneously characterize
the growth rate of Covid-19 infection cases across the 50 states of the USA.
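The two-stage recipe described in the abstract (kernel-based denoising with derivative estimation, then a regression of derivatives on states) can be sketched on a toy one-dimensional ODE. This is an illustrative reconstruction, not the authors' code: a local quadratic fit stands in for the boundary kernel smoother, and an ordinary least-squares fit stands in for the sparse ReLU network.

```python
import numpy as np

# Toy problem: recover the rate a in x'(t) = a * x(t) (true a = -0.5)
# from noisy observations of x(t).

rng = np.random.default_rng(0)
t = np.linspace(0.0, 4.0, 400)
x_true = np.exp(-0.5 * t)                    # solution with x(0) = 1
y = x_true + rng.normal(0.0, 0.01, t.size)   # noisy trajectory

# Stage 1: local quadratic smoothing.  At each interior point, fit a
# quadratic over a sliding window; the constant term is the de-noised
# value and the linear term is the derivative estimate.
half = 10
xs, dxs = [], []
for i in range(half, t.size - half):
    sl = slice(i - half, i + half + 1)
    c = np.polyfit(t[sl] - t[i], y[sl], deg=2)  # coeffs: [c2, c1, c0]
    xs.append(c[2])    # smoothed x(t_i)
    dxs.append(c[1])   # estimated x'(t_i)
xs, dxs = np.array(xs), np.array(dxs)

# Stage 2: regress the derivative on the state to recover f(x) = a * x.
a = np.sum(xs * dxs) / np.sum(xs * xs)   # least-squares slope, near -0.5
```

In the paper the second stage is a sparsely connected deep network, which lets the same idea scale to multi-dimensional, modular vector fields; the linear fit here is the simplest stand-in for that regression step.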
Related papers
- Balanced Neural ODEs: nonlinear model order reduction and Koopman operator approximations [0.0]
Variational Autoencoders (VAEs) are a powerful framework for learning compact latent representations.
NeuralODEs excel in learning transient system dynamics.
This work combines the strengths of both to create fast surrogate models with adjustable complexity.
arXiv Detail & Related papers (2024-10-14T05:45:52Z)
- FMint: Bridging Human Designed and Data Pretrained Models for Differential Equation Foundation Model [5.748690310135373]
We propose a novel multi-modal foundation model, named FMint, to bridge the gap between human-designed and data-driven models.
Built on a decoder-only transformer architecture with in-context learning, FMint utilizes both numerical and textual data to learn a universal error correction scheme.
Our results demonstrate the effectiveness of the proposed model in terms of both accuracy and efficiency compared to classical numerical solvers.
arXiv Detail & Related papers (2024-04-23T02:36:47Z) - ODE Discovery for Longitudinal Heterogeneous Treatment Effects Inference [69.24516189971929]
In this paper, we introduce a new type of solution in the longitudinal setting: a closed-form ordinary differential equation (ODE)
While we still rely on continuous optimization to learn an ODE, the resulting inference machine is no longer a neural network.
arXiv Detail & Related papers (2024-03-16T02:07:45Z) - Discovering ordinary differential equations that govern time-series [65.07437364102931]
We propose a transformer-based sequence-to-sequence model that recovers scalar autonomous ordinary differential equations (ODEs) in symbolic form from time-series data of a single observed solution of the ODE.
Our method is efficiently scalable: after one-time pretraining on a large set of ODEs, we can infer the governing laws of a new observed solution in a few forward passes of the model.
arXiv Detail & Related papers (2022-11-05T07:07:58Z) - Neural Laplace: Learning diverse classes of differential equations in
the Laplace domain [86.52703093858631]
We propose a unified framework for learning diverse classes of differential equations (DEs) including all the aforementioned ones.
Instead of modelling the dynamics in the time domain, we model them in the Laplace domain, where the history-dependencies and discontinuities in time can be represented as summations of complex exponentials.
In the experiments, Neural Laplace shows superior performance in modelling and extrapolating the trajectories of diverse classes of DEs.
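The Laplace-domain view in this abstract can be made concrete on a toy signal. The sketch below is my own illustration, not the Neural Laplace implementation: it shows how a trajectory whose Laplace transform has complex poles decomposes into a sum of complex exponentials in the time domain.

```python
import cmath
import math

def from_poles(pole_residues, t):
    """Time-domain signal from (pole, residue) pairs of its transform."""
    return sum((c * cmath.exp(p * t)).real for p, c in pole_residues)

# One complex pole p = -1 + 2j with residue 1 reproduces the damped
# oscillation e^{-t} cos(2t): complex exponentials encode both the decay
# (real part of the pole) and the oscillation (imaginary part).
t = 0.7
x_laplace = from_poles([(-1 + 2j, 1.0)], t)
x_direct = math.exp(-t) * math.cos(2 * t)
err = abs(x_laplace - x_direct)   # the two representations agree
```

Neural Laplace learns such pole/residue-style representations with a network rather than fixing them by hand, which is what lets it cover delayed and discontinuous dynamics that time-domain neural ODEs struggle with.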
arXiv Detail & Related papers (2022-06-10T02:14:59Z) - A memory-efficient neural ODE framework based on high-level adjoint
differentiation [4.063868707697316]
We present a new neural ODE framework, PNODE, based on high-level discrete algorithmic differentiation.
We show that PNODE achieves the highest memory efficiency when compared with other reverse-accurate methods.
arXiv Detail & Related papers (2022-06-02T20:46:26Z) - Message Passing Neural PDE Solvers [60.77761603258397]
We build a neural message passing solver, replacing allally designed components in the graph with backprop-optimized neural function approximators.
We show that neural message passing solvers representationally contain some classical methods, such as finite differences, finite volumes, and WENO schemes.
We validate our method on various fluid-like flow problems, demonstrating fast, stable, and accurate performance across different domain topologies, equation parameters, discretizations, etc., in 1D and 2D.
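The claim that message passing representationally contains classical schemes can be checked on a tiny example. The construction below is mine, not the paper's code: averaging the two neighbour difference-quotient "messages" on a uniform 1D grid reproduces the central finite-difference derivative exactly.

```python
import numpy as np

def mp_derivative(u, dx):
    """One 'message passing' step on a uniform 1D grid (interior nodes):
    each node averages the difference-quotient messages from its two
    neighbours, which is exactly the central finite-difference stencil."""
    fwd = (u[2:] - u[1:-1]) / dx    # message from the right neighbour
    bwd = (u[1:-1] - u[:-2]) / dx   # message from the left neighbour
    return 0.5 * (fwd + bwd)        # mean aggregation = central difference

x = np.linspace(0.0, 1.0, 101)
u = np.sin(2 * np.pi * x)
du = mp_derivative(u, x[1] - x[0])
exact = 2 * np.pi * np.cos(2 * np.pi * x[1:-1])
max_err = np.max(np.abs(du - exact))   # small second-order error
```

A learned message/aggregation function generalizes this hand-designed stencil, which is the sense in which the neural solver subsumes finite differences, finite volumes, and WENO schemes.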
arXiv Detail & Related papers (2022-02-07T17:47:46Z)
- Post-Regularization Confidence Bands for Ordinary Differential Equations [6.3582148777824115]
We construct confidence bands for individual regulatory functions in ODEs with unknown functionals and noisy data observations.
We show that the constructed confidence band has the desired coverage probability, and the recovered regulatory network approaches the truth with probability tending to one.
arXiv Detail & Related papers (2021-10-24T19:21:10Z)
- A Probabilistic State Space Model for Joint Inference from Differential Equations and Data [23.449725313605835]
We present a new class of solvers for ordinary differential equations (ODEs) that phrase the solution process directly in terms of Bayesian filtering.
It then becomes possible to perform approximate Bayesian inference on the latent force as well as the ODE solution in a single, linear complexity pass of an extended Kalman filter.
We demonstrate the expressiveness and performance of the algorithm by training a non-parametric SIRD model on data from the COVID-19 outbreak.
arXiv Detail & Related papers (2021-03-18T10:36:09Z)
- Meta-Solver for Neural Ordinary Differential Equations [77.8918415523446]
We investigate how variability in the solver space can improve neural ODE performance.
We show that the right choice of solver parameterization can significantly affect neural ODE models in terms of robustness to adversarial attacks.
arXiv Detail & Related papers (2021-03-15T17:26:34Z)
- STEER: Simple Temporal Regularization For Neural ODEs [80.80350769936383]
We propose a new regularization technique: randomly sampling the end time of the ODE during training.
The proposed regularization is simple to implement, has negligible overhead and is effective across a wide variety of tasks.
We show through experiments on normalizing flows, time series models and image recognition that the proposed regularization can significantly decrease training time and even improve performance over baseline models.
arXiv Detail & Related papers (2020-06-18T17:44:50Z)
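STEER's regularizer is simple enough to sketch directly. The helper names below are mine, not the paper's API: each training step draws the integration end time uniformly from an interval around the nominal horizon T, here paired with a toy fixed-step Euler solver in place of a neural ODE.

```python
import math
import random

def euler_integrate(f, x0, t_end, steps=100):
    """Fixed-step Euler solve of x' = f(x) on [0, t_end]."""
    x, dt = x0, t_end / steps
    for _ in range(steps):
        x = x + dt * f(x)
    return x

def steer_end_time(T, b, rng=random):
    """STEER-style regularization: instead of always integrating to the
    fixed horizon T, draw the end time uniformly from (T - b, T + b)."""
    assert 0.0 < b < T
    return rng.uniform(T - b, T + b)

random.seed(0)
T, b = 1.0, 0.25
t_end = steer_end_time(T, b)                     # lies in (0.75, 1.25)
x_T = euler_integrate(lambda x: -x, 1.0, t_end)  # toy dynamics x' = -x
err = abs(x_T - math.exp(-t_end))                # close to e^{-t_end}
```

In an actual neural ODE training loop, `t_end` would vary per step (or per batch), so the model cannot overfit to a single integration horizon; at evaluation time the fixed horizon T is used as usual.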
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.