Optimizing differential equations to fit data and predict outcomes
- URL: http://arxiv.org/abs/2204.07833v1
- Date: Sat, 16 Apr 2022 16:08:08 GMT
- Title: Optimizing differential equations to fit data and predict outcomes
- Authors: Steven A. Frank
- Abstract summary: Recent technical advances in automatic differentiation through numerical differential equation solvers potentially change the fitting process into a relatively easy problem.
This article illustrates how to overcome a variety of common challenges, using the classic ecological data for oscillations in hare and lynx populations.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Many scientific problems focus on observed patterns of change or on how to
design a system to achieve particular dynamics. Those problems often require
fitting differential equation models to target trajectories. Fitting such
models can be difficult because each evaluation of the fit must calculate the
distance between the model and target patterns at numerous points along a
trajectory. Computing the gradient of the fit with respect to the model
parameters can also be challenging. Recent technical advances in automatic
differentiation through
numerical differential equation solvers potentially change the fitting process
into a relatively easy problem, opening up new possibilities to study dynamics.
However, application of the new tools to real data may fail to achieve a good
fit. This article illustrates how to overcome a variety of common challenges,
using the classic ecological data for oscillations in hare and lynx
populations. Models include simple ordinary differential equations (ODEs) and
neural ordinary differential equations (NODEs), which use artificial neural
networks to estimate the derivatives of differential equation systems.
Comparing the fits obtained with ODEs versus NODEs, representing small and
large parameter spaces, and changing the number of variable dimensions provide
insight into the geometry of the observed and model trajectories. To analyze
the quality of the models for predicting future observations, a
Bayesian-inspired preconditioned stochastic gradient Langevin dynamics (pSGLD)
calculation of the posterior distribution of predicted model trajectories
clarifies the tendency for various models to underfit or overfit the data.
Coupling fitted differential equation systems with pSGLD sampling provides a
powerful way to study the properties of optimization surfaces, raising an
analogy with mutation-selection dynamics on fitness landscapes.
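To make the fitting workflow concrete, the hypothetical sketch below fits a classic Lotka-Volterra predator-prey ODE to noisy synthetic trajectory data. The paper's approach differentiates automatically through the solver; to stay self-contained, this sketch instead uses a derivative-free Nelder-Mead optimizer as a stand-in for gradient-based fitting, and the data, parameter values, and helper names are purely illustrative.

```python
# Illustrative sketch (not the paper's code): fit a Lotka-Volterra ODE
# to noisy synthetic "hare/lynx"-style data. Derivative-free optimization
# stands in for autodiff through the solver.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def lotka_volterra(t, y, a, b, c, d):
    hare, lynx = y
    return [a * hare - b * hare * lynx,
            -c * lynx + d * hare * lynx]

# Synthetic observations from known parameters, plus noise.
t_obs = np.linspace(0.0, 10.0, 40)
true_params = (1.1, 0.4, 0.4, 0.1)
sol = solve_ivp(lotka_volterra, (0.0, 10.0), [10.0, 5.0],
                t_eval=t_obs, args=true_params, rtol=1e-6)
data = sol.y + rng.normal(scale=0.1, size=sol.y.shape)

def loss(params):
    # Mean squared distance between model and target trajectories,
    # evaluated at the observation times.
    s = solve_ivp(lotka_volterra, (0.0, 10.0), [10.0, 5.0],
                  t_eval=t_obs, args=tuple(params), rtol=1e-6)
    if not s.success or s.y.shape != data.shape:
        return 1e6  # penalize solver failure
    return float(np.mean((s.y - data) ** 2))

x0 = np.array(true_params) * 1.3   # start away from the truth
res = minimize(loss, x0, method="Nelder-Mead",
               options={"maxiter": 200})
print("initial loss:", loss(x0), "fitted loss:", res.fun)
```

With automatic differentiation through the solver, the gradient-free `minimize` call would be replaced by a gradient-based optimizer, which scales far better as the parameter space grows (e.g., for NODEs).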
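The pSGLD sampling step mentioned above can be sketched in minimal form. The sketch below uses the RMSprop-style diagonal preconditioner of Li et al. (2016) to sample the posterior of a one-dimensional Gaussian mean; the paper's actual models, data, and preconditioner settings differ, and every constant and name here is an illustrative assumption.

```python
# Minimal pSGLD sketch on a toy problem (assumed RMSprop-style
# preconditioner; not the paper's exact implementation).
import numpy as np

rng = np.random.default_rng(1)
data = rng.normal(loc=2.0, scale=1.0, size=200)  # toy observations

def grad_neg_log_post(theta, batch):
    # N(0, 10^2) prior on theta, unit-variance Gaussian likelihood;
    # the minibatch gradient is rescaled to the full dataset size.
    n, m = len(data), len(batch)
    return theta / 100.0 + (n / m) * np.sum(theta - batch)

theta, V = 0.0, 0.0
eps, alpha, lam = 1e-2, 0.99, 1e-5
samples = []
for step in range(3000):
    batch = rng.choice(data, size=20)
    g = grad_neg_log_post(theta, batch)
    V = alpha * V + (1 - alpha) * g * g          # running gradient statistic
    G = 1.0 / (lam + np.sqrt(V))                 # diagonal preconditioner
    noise = rng.normal(scale=np.sqrt(eps * G))   # injected Langevin noise
    theta = theta - 0.5 * eps * G * g + noise    # preconditioned update
    if step >= 1000:                             # discard burn-in
        samples.append(theta)
```

After burn-in, the retained `samples` approximate the posterior over the mean; in the paper's setting, the sampled quantity is instead the vector of differential equation parameters, yielding a posterior distribution of predicted trajectories.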
Related papers
- Projected Neural Differential Equations for Learning Constrained Dynamics [3.570367665112327]
We introduce a new method for constraining neural differential equations based on projection of the learned vector field to the tangent space of the constraint manifold.
PNDEs outperform existing methods while requiring fewer hyperparameters.
The proposed approach demonstrates significant potential for enhancing the modeling of constrained dynamical systems.
arXiv Detail & Related papers (2024-10-31T06:32:43Z) - On the Trajectory Regularity of ODE-based Diffusion Sampling [79.17334230868693]
Diffusion-based generative models use differential equations to establish a smooth connection between a complex data distribution and a tractable prior distribution.
In this paper, we identify several intriguing trajectory properties in the ODE-based sampling process of diffusion models.
arXiv Detail & Related papers (2024-05-18T15:59:41Z) - Towards Learning Stochastic Population Models by Gradient Descent [0.0]
We show that simultaneous estimation of parameters and structure poses major challenges for optimization procedures.
We demonstrate accurate estimation of models but find that enforcing the inference of parsimonious, interpretable models drastically increases the difficulty.
arXiv Detail & Related papers (2024-04-10T14:38:58Z) - A Geometric Perspective on Diffusion Models [57.27857591493788]
We inspect the ODE-based sampling of a popular variance-exploding SDE.
We establish a theoretical relationship between the optimal ODE-based sampling and the classic mean-shift (mode-seeking) algorithm.
arXiv Detail & Related papers (2023-05-31T15:33:16Z) - VTAE: Variational Transformer Autoencoder with Manifolds Learning [144.0546653941249]
Deep generative models have demonstrated successful applications in learning non-linear data distributions through a number of latent variables.
The nonlinearity of the generator means the latent space gives only a poor projection of the data space, which results in weak representation learning.
We show that geodesics and accurate computation can substantially improve the performance of deep generative models.
arXiv Detail & Related papers (2023-04-03T13:13:19Z) - Random Grid Neural Processes for Parametric Partial Differential Equations [5.244037702157957]
We introduce a new class of spatially probabilistic physics and data informed deep latent models for PDEs.
We solve forward and inverse problems for parametric PDEs in a way that leads to the construction of Gaussian process models of solution fields.
We show how to incorporate noisy data in a principled manner into our physics informed model to improve predictions for problems where data may be available.
arXiv Detail & Related papers (2023-01-26T11:30:56Z) - On the Influence of Enforcing Model Identifiability on Learning Dynamics of Gaussian Mixture Models [14.759688428864159]
We propose a technique for extracting submodels from singular models.
Our method enforces model identifiability during training.
We show how the method can be applied to more complex models like deep neural networks.
arXiv Detail & Related papers (2022-06-17T07:50:22Z) - Mixed Effects Neural ODE: A Variational Approximation for Analyzing the Dynamics of Panel Data [50.23363975709122]
We propose a probabilistic model called ME-NODE to incorporate (fixed + random) mixed effects for analyzing panel data.
We show that our model can be derived using smooth approximations of SDEs provided by the Wong-Zakai theorem.
We then derive Evidence Based Lower Bounds for ME-NODE, and develop (efficient) training algorithms.
arXiv Detail & Related papers (2022-02-18T22:41:51Z) - Equivariant vector field network for many-body system modeling [65.22203086172019]
Equivariant Vector Field Network (EVFN) is built on a novel equivariant basis and the associated scalarization and vectorization layers.
We evaluate our method on predicting trajectories of simulated Newton mechanics systems with both full and partially observed data.
arXiv Detail & Related papers (2021-10-26T14:26:25Z) - Compositional Modeling of Nonlinear Dynamical Systems with ODE-based Random Features [0.0]
We present a novel, domain-agnostic approach to tackling this problem.
We use compositions of physics-informed random features, derived from ordinary differential equations.
We find that our approach achieves comparable performance to a number of other probabilistic models on benchmark regression tasks.
arXiv Detail & Related papers (2021-06-10T17:55:13Z) - Probabilistic Circuits for Variational Inference in Discrete Graphical Models [101.28528515775842]
Inference in discrete graphical models with variational methods is difficult.
Many sampling-based methods have been proposed for estimating the Evidence Lower Bound (ELBO).
We propose a new approach that leverages the tractability of probabilistic circuit models, such as Sum Product Networks (SPNs).
We show that selective-SPNs are suitable as an expressive variational distribution, and prove that when the log-density of the target model is a polynomial, the corresponding ELBO can be computed analytically.
arXiv Detail & Related papers (2020-10-22T05:04:38Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.