ODEFormer: Symbolic Regression of Dynamical Systems with Transformers
- URL: http://arxiv.org/abs/2310.05573v1
- Date: Mon, 9 Oct 2023 09:54:12 GMT
- Title: ODEFormer: Symbolic Regression of Dynamical Systems with Transformers
- Authors: Stéphane d'Ascoli, Sören Becker, Alexander Mathis, Philippe
Schwaller, Niki Kilbertus
- Abstract summary: We introduce ODEFormer, the first transformer able to infer multidimensional ordinary differential equation (ODE) systems in symbolic form.
We perform extensive evaluations on two datasets: (i) the existing "Strogatz" dataset featuring two-dimensional systems; (ii) ODEBench, a collection of one- to four-dimensional systems.
- Score: 47.75031734856786
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We introduce ODEFormer, the first transformer able to infer multidimensional
ordinary differential equation (ODE) systems in symbolic form from the
observation of a single solution trajectory. We perform extensive evaluations
on two datasets: (i) the existing "Strogatz" dataset featuring two-dimensional
systems; (ii) ODEBench, a collection of one- to four-dimensional systems that
we carefully curated from the literature to provide a more holistic benchmark.
ODEFormer consistently outperforms existing methods while displaying
substantially improved robustness to noisy and irregularly sampled
observations, as well as faster inference. We release our code, model and
benchmark dataset publicly.
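The setting described above, recovering a symbolic ODE system from a single noisy, irregularly sampled trajectory and judging the prediction by how well it reproduces that trajectory, can be illustrated with standard tools. The sketch below is not the authors' released code; it hand-writes a "candidate" system where a trained model such as ODEFormer would predict one, and uses a generic trajectory R² as the evaluation signal.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Ground-truth 2D system (a damped oscillator), standing in for an
# unknown dynamical system observed only through one trajectory.
def true_rhs(t, y):
    x, v = y
    return [v, -x - 0.1 * v]

# A single solution trajectory, irregularly sampled with additive noise:
# the observation setting the abstract describes.
rng = np.random.default_rng(0)
t_obs = np.sort(rng.uniform(0.0, 10.0, size=100))
sol = solve_ivp(true_rhs, (0.0, 10.0), [1.0, 0.0], t_eval=t_obs, rtol=1e-8)
y_obs = sol.y.T + 0.01 * rng.standard_normal(sol.y.T.shape)

# A candidate symbolic model (written by hand here; a trained model
# would infer this expression from (t_obs, y_obs)).
def candidate_rhs(t, y):
    x, v = y
    return [v, -1.0 * x - 0.1 * v]

# Score the candidate by re-integrating it from the first observation
# and comparing against the observed trajectory via R^2.
pred = solve_ivp(candidate_rhs, (t_obs[0], t_obs[-1]), y_obs[0],
                 t_eval=t_obs, rtol=1e-8).y.T
ss_res = np.sum((y_obs - pred) ** 2)
ss_tot = np.sum((y_obs - y_obs.mean(axis=0)) ** 2)
r2 = 1.0 - ss_res / ss_tot
print(f"trajectory R^2 of candidate model: {r2:.3f}")
```

Because the candidate matches the true system, the reconstructed trajectory tracks the noisy observations closely and R² is near 1; an incorrect symbolic guess would drift from the data and score lower.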
Related papers
- On the Trajectory Regularity of ODE-based Diffusion Sampling [79.17334230868693]
Diffusion-based generative models use differential equations to establish a smooth connection between a complex data distribution and a tractable prior distribution.
In this paper, we identify several intriguing trajectory properties in the ODE-based sampling process of diffusion models.
arXiv Detail & Related papers (2024-05-18T15:59:41Z)
- Foundational Inference Models for Dynamical Systems [3.95944314850151]
We propose a novel supervised learning framework for zero-shot inference of ODEs from noisy data.
We first generate large datasets of one-dimensional ODEs, by sampling distributions over the space of initial conditions.
We then learn neural maps between noisy observations on the solutions of these equations, and their corresponding initial condition and vector fields.
arXiv Detail & Related papers (2024-02-12T11:48:54Z)
- Predicting Ordinary Differential Equations with Transformers [65.07437364102931]
We develop a transformer-based sequence-to-sequence model that recovers scalar ordinary differential equations (ODEs) in symbolic form from irregularly sampled and noisy observations of a single solution trajectory.
Our method is efficiently scalable: after one-time pretraining on a large set of ODEs, we can infer the governing law of a new observed solution in a few forward passes of the model.
arXiv Detail & Related papers (2023-07-24T08:46:12Z)
- Reflected Diffusion Models [93.26107023470979]
We present Reflected Diffusion Models, which reverse a reflected differential equation evolving on the support of the data.
Our approach learns the score function through a generalized score matching loss and extends key components of standard diffusion models.
arXiv Detail & Related papers (2023-04-10T17:54:38Z)
- Discovering ordinary differential equations that govern time-series [65.07437364102931]
We propose a transformer-based sequence-to-sequence model that recovers scalar autonomous ordinary differential equations (ODEs) in symbolic form from time-series data of a single observed solution of the ODE.
Our method is efficiently scalable: after one-time pretraining on a large set of ODEs, we can infer the governing laws of a new observed solution in a few forward passes of the model.
arXiv Detail & Related papers (2022-11-05T07:07:58Z)
- Discovering Sparse Interpretable Dynamics from Partial Observations [0.0]
We propose a machine learning framework for discovering these governing equations using only partial observations.
Our tests show that this method can successfully reconstruct the full system state.
arXiv Detail & Related papers (2021-07-22T18:23:23Z)
- Variational Inference and Learning of Piecewise-linear Dynamical Systems [33.23231229260119]
We propose a variational approximation of piecewise linear dynamical systems.
We show that the model parameters can be split into two sets, static and dynamic, and that the static parameters can be estimated off-line together with the number of linear modes, i.e. the number of states of the switching variable.
arXiv Detail & Related papers (2020-06-02T14:40:35Z)
- Variational Hyper RNN for Sequence Modeling [69.0659591456772]
We propose a novel probabilistic sequence model that excels at capturing high variability in time series data.
Our method uses temporal latent variables to capture information about the underlying data pattern.
The efficacy of the proposed method is demonstrated on a range of synthetic and real-world sequential data.
arXiv Detail & Related papers (2020-02-24T19:30:32Z)
This list is automatically generated from the titles and abstracts of the papers on this site.