Discovering ordinary differential equations that govern time-series
- URL: http://arxiv.org/abs/2211.02830v1
- Date: Sat, 5 Nov 2022 07:07:58 GMT
- Title: Discovering ordinary differential equations that govern time-series
- Authors: S\"oren Becker, Michal Klein, Alexander Neitz, Giambattista
Parascandolo, Niki Kilbertus
- Abstract summary: We propose a transformer-based sequence-to-sequence model that recovers scalar autonomous ordinary differential equations (ODEs) in symbolic form from time-series data of a single observed solution of the ODE.
Our method is efficiently scalable: after one-time pretraining on a large set of ODEs, we can infer the governing laws of a new observed solution in a few forward passes of the model.
- Score: 65.07437364102931
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Natural laws are often described through differential equations, yet finding a
differential equation that describes the governing law underlying observed data
is a challenging and still mostly manual task. In this paper we take a step
towards automating this process: we propose a transformer-based
sequence-to-sequence model that recovers scalar autonomous ordinary
differential equations (ODEs) in symbolic form from time-series data of a
single observed solution of the ODE. Our method is efficiently scalable: after
one-time pretraining on a large set of ODEs, we can infer the governing laws of
a new observed solution in a few forward passes of the model. We then show that
our model performs on par with or better than existing methods in various test
cases in terms of accurate symbolic recovery of the ODE, especially for more
complex expressions.
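The pretrain-once, infer-cheaply workflow described in the abstract can be illustrated with a minimal sketch. Everything named below (ODERecoveryModel, the toy prefix vocabulary, all sizes) is a hypothetical stand-in rather than the authors' released code; only the overall interface, a sampled trajectory in and a symbolic skeleton out via a few forward passes, mirrors the paper.

```python
# Minimal sketch of the pretrain-then-infer interface described in the abstract.
# ODERecoveryModel, the toy vocabulary, and all sizes are hypothetical; only the
# overall shape (trajectory in, symbolic skeleton out) mirrors the paper.
import torch
import torch.nn as nn

# Toy prefix-notation vocabulary for right-hand sides f(y) of y' = f(y).
VOCAB = ["<pad>", "<bos>", "<eos>", "add", "mul", "sin", "exp", "pow", "y", "c"]
TOK = {tok: i for i, tok in enumerate(VOCAB)}

class ODERecoveryModel(nn.Module):
    """Encodes a sampled trajectory (t_i, y_i) and decodes a symbolic ODE skeleton."""

    def __init__(self, d_model=128, nhead=4, num_layers=3):
        super().__init__()
        self.point_embed = nn.Linear(2, d_model)            # one encoder token per (t, y) sample
        self.sym_embed = nn.Embedding(len(VOCAB), d_model)  # decoder-side symbol embeddings
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=nhead,
            num_encoder_layers=num_layers, num_decoder_layers=num_layers,
            batch_first=True,
        )
        self.head = nn.Linear(d_model, len(VOCAB))

    @torch.no_grad()
    def infer(self, t, y, max_len=32):
        """Greedy decoding: a handful of forward passes per observed solution."""
        src = self.point_embed(torch.stack([t, y], dim=-1)).unsqueeze(0)
        out = [TOK["<bos>"]]
        for _ in range(max_len):
            tgt = self.sym_embed(torch.tensor([out]))
            hidden = self.transformer(src, tgt)
            nxt = self.head(hidden[:, -1]).argmax(-1).item()
            out.append(nxt)
            if nxt == TOK["<eos>"]:
                break
        return [VOCAB[i] for i in out[1:] if VOCAB[i] != "<eos>"]

# After (one-time) pretraining on a large synthetic set of ODEs, inference on a
# new trajectory needs no further optimization:
model = ODERecoveryModel()              # pretrained weights would be loaded here
t = torch.linspace(0.0, 5.0, 100)
y = torch.exp(0.3 * t)                  # one observed solution of y' = 0.3 * y
print(model.infer(t, y))                # a trained model would emit e.g. ['mul', 'c', 'y']
```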
Related papers
- On the Trajectory Regularity of ODE-based Diffusion Sampling [79.17334230868693]
Diffusion-based generative models use differential equations to establish a smooth connection between a complex data distribution and a tractable prior distribution.
In this paper, we identify several intriguing trajectory properties in the ODE-based sampling process of diffusion models.
arXiv Detail & Related papers (2024-05-18T15:59:41Z) - Predicting Ordinary Differential Equations with Transformers [65.07437364102931]
We develop a transformer-based sequence-to-sequence model that recovers scalar ordinary differential equations (ODEs) in symbolic form from irregularly sampled and noisy observations of a single solution trajectory.
Our method is efficiently scalable: after one-time pretraining on a large set of ODEs, we can infer the governing law of a new observed solution in a few forward passes of the model.
arXiv Detail & Related papers (2023-07-24T08:46:12Z) - Symbolic Recovery of Differential Equations: The Identifiability Problem [52.158782751264205]
Symbolic recovery of differential equations is the ambitious attempt at automating the derivation of governing equations.
We provide both necessary and sufficient conditions for a function to uniquely determine the corresponding differential equation.
We then use our results to devise numerical algorithms aiming to determine whether a function solves a differential equation uniquely.
arXiv Detail & Related papers (2022-10-15T17:32:49Z) - Constraining Gaussian Processes to Systems of Linear Ordinary
- Constraining Gaussian Processes to Systems of Linear Ordinary Differential Equations [5.33024001730262]
LODE-GPs follow a system of linear homogeneous ODEs with constant coefficients.
We show the effectiveness of LODE-GPs in a number of experiments.
arXiv Detail & Related papers (2022-08-26T09:16:53Z) - D-CIPHER: Discovery of Closed-form Partial Differential Equations [80.46395274587098]
- D-CIPHER: Discovery of Closed-form Partial Differential Equations [80.46395274587098]
We propose D-CIPHER, which is robust to measurement artifacts and can uncover a new and very general class of differential equations.
We further design a novel optimization procedure, CoLLie, to help D-CIPHER search through this class efficiently.
arXiv Detail & Related papers (2022-06-21T17:59:20Z) - A Probabilistic State Space Model for Joint Inference from Differential
Equations and Data [23.449725313605835]
We present a new class of solvers for ordinary differential equations (ODEs) that phrases the solution process directly in terms of Bayesian filtering.
It then becomes possible to perform approximate Bayesian inference on the latent force as well as the ODE solution in a single, linear-complexity pass of an extended Kalman filter.
We demonstrate the expressiveness and performance of the algorithm by training a non-parametric SIRD model on data from the COVID-19 outbreak.
arXiv Detail & Related papers (2021-03-18T10:36:09Z) - STEER: Simple Temporal Regularization For Neural ODEs [80.80350769936383]
- STEER: Simple Temporal Regularization For Neural ODEs [80.80350769936383]
We propose a new regularization technique: randomly sampling the end time of the ODE during training.
The proposed regularization is simple to implement, has negligible overhead and is effective across a wide variety of tasks.
We show through experiments on normalizing flows, time series models and image recognition that the proposed regularization can significantly decrease training time and even improve performance over baseline models.
arXiv Detail & Related papers (2020-06-18T17:44:50Z) - Neural Controlled Differential Equations for Irregular Time Series [17.338923885534197]
- Neural Controlled Differential Equations for Irregular Time Series [17.338923885534197]
An ordinary differential equation is determined by its initial condition, and there is no mechanism for adjusting the trajectory based on subsequent observations.
Here we demonstrate how this may be resolved through the well-understood mathematics of controlled differential equations.
We show that our model achieves state-of-the-art performance against similar (ODE or RNN based) models in empirical studies on a range of datasets.
arXiv Detail & Related papers (2020-05-18T17:52:21Z)