ODEFormer: Symbolic Regression of Dynamical Systems with Transformers
- URL: http://arxiv.org/abs/2310.05573v1
- Date: Mon, 9 Oct 2023 09:54:12 GMT
- Title: ODEFormer: Symbolic Regression of Dynamical Systems with Transformers
- Authors: Stéphane d'Ascoli, Sören Becker, Alexander Mathis, Philippe
Schwaller, Niki Kilbertus
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We introduce ODEFormer, the first transformer able to infer multidimensional
ordinary differential equation (ODE) systems in symbolic form from the
observation of a single solution trajectory. We perform extensive evaluations
on two datasets: (i) the existing "Strogatz" dataset featuring two-dimensional
systems; (ii) ODEBench, a collection of one- to four-dimensional systems that
we carefully curated from the literature to provide a more holistic benchmark.
ODEFormer consistently outperforms existing methods while displaying
substantially improved robustness to noisy and irregularly sampled
observations, as well as faster inference. We release our code, model and
benchmark dataset publicly.
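The task ODEFormer addresses can be made concrete with a short sketch: given an observed trajectory, a candidate symbolic ODE is scored by integrating it from the same initial condition and comparing the two trajectories. The system (a harmonic oscillator) and the R² metric below are illustrative choices, not details taken from the paper:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Ground-truth 2D system (illustrative): a simple harmonic oscillator,
#   dx/dt = y,  dy/dt = -x
def true_rhs(t, state):
    x, y = state
    return [y, -x]

# A candidate symbolic model (here the exact form, standing in for an
# expression inferred by a symbolic-regression method).
def candidate_rhs(t, state):
    x, y = state
    return [y, -x]

t_eval = np.linspace(0.0, 10.0, 200)
obs = solve_ivp(true_rhs, (0, 10), [1.0, 0.0], t_eval=t_eval).y
pred = solve_ivp(candidate_rhs, (0, 10), [1.0, 0.0], t_eval=t_eval).y

# Coefficient of determination (R^2) over the whole trajectory, a common
# reconstruction metric for this kind of task.
ss_res = np.sum((obs - pred) ** 2)
ss_tot = np.sum((obs - obs.mean(axis=1, keepdims=True)) ** 2)
r2 = 1.0 - ss_res / ss_tot
print(round(r2, 3))
```

Since the candidate here matches the ground truth, R² comes out at 1; an imperfect inferred expression would score lower, and noisy or irregularly sampled observations make the inference step itself harder, which is the regime the paper evaluates.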
Related papers
- Path-minimizing Latent ODEs for improved extrapolation and inference
Latent ODE models provide flexible descriptions of dynamic systems, but they can struggle with extrapolation and predicting complicated non-linear dynamics.
In this paper, the authors address this by encouraging time-independent latent representations.
By replacing the common variational penalty in latent space with an $\ell$ penalty on the path length of each system, the models learn data representations that can easily be distinguished from those of systems with different configurations.
This results in faster training, smaller models, and more accurate long-time extrapolation compared to baseline latent ODE models with GRU, RNN, and LSTM encoder/decoders.
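The path-length penalty described above can be sketched in a few lines; this is a minimal NumPy illustration, and the function name and shapes are assumptions rather than the paper's API:

```python
import numpy as np

def path_length_penalty(z):
    """Sum of Euclidean step lengths along a latent trajectory.

    z: array of shape (T, d) holding T latent states of dimension d.
    Penalizing total path length pushes the model toward nearly
    time-independent latent representations, in place of the usual
    variational (KL) penalty on the latent space.
    """
    steps = np.diff(z, axis=0)                  # (T-1, d) displacements
    return np.sum(np.linalg.norm(steps, axis=1))

# A trajectory that never moves in latent space incurs zero penalty...
still = np.zeros((10, 3))
# ...while one that drifts accrues penalty equal to the distance travelled.
line = np.linspace(0.0, 1.0, 10)[:, None] * np.ones((1, 3))
print(path_length_penalty(still), round(path_length_penalty(line), 3))
```

In training this term would be added to the reconstruction loss with a weight, so systems with different parameters settle into distinguishable, nearly static latent codes.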
arXiv Detail & Related papers (2024-10-11T15:50:01Z)
- On the Trajectory Regularity of ODE-based Diffusion Sampling
Diffusion-based generative models use differential equations to establish a smooth connection between a complex data distribution and a tractable prior distribution.
In this paper, we identify several intriguing trajectory properties in the ODE-based sampling process of diffusion models.
arXiv Detail & Related papers (2024-05-18T15:59:41Z)
- Foundational Inference Models for Dynamical Systems
We offer a fresh perspective on the classical problem of imputing missing time series data, whose underlying dynamics are assumed to be determined by ODEs.
We propose a novel supervised learning framework for zero-shot time series imputation, through parametric functions satisfying some (hidden) ODEs.
We empirically demonstrate that one and the same (pretrained) recognition model can perform zero-shot imputation across 63 distinct time series with missing values.
arXiv Detail & Related papers (2024-02-12T11:48:54Z)
- Predicting Ordinary Differential Equations with Transformers
We develop a transformer-based sequence-to-sequence model that recovers scalar ordinary differential equations (ODEs) in symbolic form from irregularly sampled and noisy observations of a single solution trajectory.
Our method is efficiently scalable: after one-time pretraining on a large set of ODEs, we can infer the governing law of a new observed solution in a few forward passes of the model.
arXiv Detail & Related papers (2023-07-24T08:46:12Z)
- Reflected Diffusion Models
We present Reflected Diffusion Models, which reverse a reflected stochastic differential equation evolving on the support of the data.
Our approach learns the score function through a generalized score matching loss and extends key components of standard diffusion models.
arXiv Detail & Related papers (2023-04-10T17:54:38Z)
- Discovering ordinary differential equations that govern time-series
We propose a transformer-based sequence-to-sequence model that recovers scalar autonomous ordinary differential equations (ODEs) in symbolic form from time-series data of a single observed solution of the ODE.
Our method is efficiently scalable: after one-time pretraining on a large set of ODEs, we can infer the governing laws of a new observed solution in a few forward passes of the model.
arXiv Detail & Related papers (2022-11-05T07:07:58Z)
- Discovering Sparse Interpretable Dynamics from Partial Observations
We propose a machine learning framework for discovering these governing equations using only partial observations.
Our tests show that this method can successfully reconstruct the full system state.
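Sparse recovery of governing equations is commonly done SINDy-style: differentiate the observed states numerically, regress the derivatives on a library of candidate terms, and threshold small coefficients. The sketch below shows that generic recipe on a toy system; it is not the authors' implementation, and the library and threshold are illustrative:

```python
import numpy as np

# Observed trajectory of dx/dt = -2x (illustrative), sampled densely.
t = np.linspace(0.0, 2.0, 400)
x = np.exp(-2.0 * t)

# Numerical derivative of the observed state.
dx = np.gradient(x, t)

# Library of candidate terms: [1, x, x^2].
theta = np.column_stack([np.ones_like(x), x, x**2])

# Least-squares fit followed by hard thresholding for sparsity.
coef, *_ = np.linalg.lstsq(theta, dx, rcond=None)
coef[np.abs(coef) < 0.1] = 0.0
print(np.round(coef, 2))  # expect roughly [0, -2, 0], i.e. dx/dt ≈ -2x
```

The partial-observation setting of the paper is harder than this sketch, since some state variables are never measured and must themselves be inferred before any such regression can be run.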
arXiv Detail & Related papers (2021-07-22T18:23:23Z)
- Variational Hyper RNN for Sequence Modeling
We propose a novel probabilistic sequence model that excels at capturing high variability in time series data.
Our method uses temporal latent variables to capture information about the underlying data pattern.
The efficacy of the proposed method is demonstrated on a range of synthetic and real-world sequential data.
arXiv Detail & Related papers (2020-02-24T19:30:32Z)