Foundational Inference Models for Dynamical Systems
- URL: http://arxiv.org/abs/2402.07594v1
- Date: Mon, 12 Feb 2024 11:48:54 GMT
- Title: Foundational Inference Models for Dynamical Systems
- Authors: Patrick Seifner, Kostadin Cvejoski, Ramses J. Sanchez
- Abstract summary: We propose a novel supervised learning framework for zero-shot inference of ODEs from noisy data.
We first generate large datasets of one-dimensional ODEs by sampling distributions over the space of initial conditions and the space of vector fields defining them.
We then learn neural maps between noisy observations of the solutions of these equations and their corresponding initial conditions and vector fields.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Ordinary differential equations (ODEs) underlie dynamical systems which serve
as models for a vast number of natural and social phenomena. Yet inferring the
ODE that best describes a set of noisy observations on one such phenomenon can
be remarkably challenging, and the models available to achieve it tend to be
highly specialized and complex too. In this work we propose a novel supervised
learning framework for zero-shot inference of ODEs from noisy data. We first
generate large datasets of one-dimensional ODEs, by sampling distributions over
the space of initial conditions, and the space of vector fields defining them.
We then learn neural maps between noisy observations on the solutions of these
equations, and their corresponding initial conditions and vector fields. The
resulting models, which we call foundational inference models (FIM), can be (i)
copied and matched along the time dimension to increase their resolution; and
(ii) copied and composed to build inference models of any dimensionality,
without the need of any finetuning. We use FIM to model both ground-truth
dynamical systems of different dimensionalities and empirical time series data
in a zero-shot fashion, and outperform state-of-the-art models which are
finetuned to these systems. Our (pretrained) FIMs are available online.
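The data-generation recipe described in the abstract can be sketched as follows. This is an illustrative toy version, not the authors' pipeline: the bounded sinusoidal family of vector fields, and all function and parameter names, are assumptions made for this example, while the paper defines its own sampling distributions.

```python
# Toy sketch of the FIM training-data recipe: sample an initial condition and a
# vector field, integrate the resulting 1D ODE, and keep noisy observations of
# the solution paired with the quantities the model must infer.
# NOTE: the sinusoidal vector-field family is an assumption made for this
# example (bounded, so solutions cannot blow up in finite time).
import numpy as np
from scipy.integrate import solve_ivp

rng = np.random.default_rng(0)

def sample_ode_instance(n_obs=32, t_max=1.0, noise_std=0.05):
    a, b, c = rng.normal(0.0, 1.0, size=3)   # parameters of the vector field
    x0 = rng.uniform(-1.0, 1.0)              # random initial condition
    f = lambda t, x: a * np.sin(b * x + c)   # dx/dt = f(x)
    ts = np.linspace(0.0, t_max, n_obs)
    sol = solve_ivp(f, (0.0, t_max), [x0], t_eval=ts, rtol=1e-6)
    noisy = sol.y[0] + rng.normal(0.0, noise_std, size=n_obs)
    # Supervision targets: the field parameters and the initial condition.
    return {"t": ts, "obs": noisy, "field": (a, b, c), "x0": x0}

dataset = [sample_ode_instance() for _ in range(100)]
```

A supervised model can then be trained to map `obs` back to `field` and `x0`, and applied zero-shot to new noisy series at inference time.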
Related papers
- Understanding the differences in Foundation Models: Attention, State Space Models, and Recurrent Neural Networks [50.29356570858905]
We introduce the Dynamical Systems Framework (DSF), which allows a principled investigation of all these architectures in a common representation.
We provide principled comparisons between softmax attention and other model classes, discussing the theoretical conditions under which softmax attention can be approximated.
This shows the DSF's potential to guide the systematic development of future more efficient and scalable foundation models.
arXiv Detail & Related papers (2024-05-24T17:19:57Z)
- Synthetic location trajectory generation using categorical diffusion models [50.809683239937584]
Diffusion probabilistic models (DPMs) have rapidly evolved to become one of the predominant generative models for the simulation of synthetic data.
We propose using DPMs for the generation of synthetic individual location trajectories (ILTs) which are sequences of variables representing physical locations visited by individuals.
arXiv Detail & Related papers (2024-02-19T15:57:39Z)
- Anamnesic Neural Differential Equations with Orthogonal Polynomial Projections [6.345523830122166]
We propose PolyODE, a formulation that enforces long-range memory and preserves a global representation of the underlying dynamical system.
Our construction is backed by favourable theoretical guarantees and we demonstrate that it outperforms previous works in the reconstruction of past and future data.
arXiv Detail & Related papers (2023-03-03T10:49:09Z)
- Learning Differential Operators for Interpretable Time Series Modeling [34.32259687441212]
We propose a learning framework that can automatically obtain interpretable PDE models from sequential data.
Our model can provide valuable interpretability and achieve comparable performance to state-of-the-art models.
arXiv Detail & Related papers (2022-09-03T20:14:31Z)
- Domain-aware Control-oriented Neural Models for Autonomous Underwater Vehicles [2.4779082385578337]
We present control-oriented parametric models with varying levels of domain-awareness.
We employ universal differential equations to construct data-driven blackbox and graybox representations of the AUV dynamics.
arXiv Detail & Related papers (2022-08-15T17:01:14Z)
- Neural Modal ODEs: Integrating Physics-based Modeling with Neural ODEs for Modeling High Dimensional Monitored Structures [9.065343126886093]
This paper proposes a framework - termed Neural Modal ODEs - to integrate physics-based modeling with deep learning.
An autoencoder learns the abstract mappings from the first few items of observational data to the initial values of latent variables.
The decoder of the proposed model adopts the eigenmodes derived from an eigen-analysis applied to the linearized portion of a physics-based model.
arXiv Detail & Related papers (2022-07-16T09:30:20Z)
- Neural Laplace: Learning diverse classes of differential equations in the Laplace domain [86.52703093858631]
We propose a unified framework for learning diverse classes of differential equations (DEs) including all the aforementioned ones.
Instead of modelling the dynamics in the time domain, we model it in the Laplace domain, where the history-dependencies and discontinuities in time can be represented as summations of complex exponentials.
In the experiments, Neural Laplace shows superior performance in modelling and extrapolating the trajectories of diverse classes of DEs.
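The Laplace-domain representation mentioned above can be illustrated with a small numerical check. This is a hand-worked example of the underlying correspondence, not the Neural Laplace model itself: a damped oscillation in the time domain has a rational Laplace transform whose poles and residues reconstruct the signal as a sum of complex exponentials.

```python
import numpy as np

# A damped oscillation in the time domain...
t = np.linspace(0.0, 5.0, 200)
x = np.exp(-t) * np.cos(2.0 * t)

# ...has Laplace transform (s + 1) / ((s + 1)^2 + 4): a rational function with
# poles at s = -1 ± 2i, each with residue 1/2. Summing the corresponding
# complex exponentials recovers the signal exactly.
poles = np.array([-1.0 + 2.0j, -1.0 - 2.0j])
residues = np.array([0.5, 0.5])
x_reconstructed = np.real(sum(r * np.exp(p * t) for r, p in zip(residues, poles)))

assert np.allclose(x, x_reconstructed)
```

Neural Laplace learns such Laplace-domain representations with a neural network and recovers time-domain trajectories through a numerical inverse Laplace transform; the closed-form pole–residue pair here is just the simplest instance of the correspondence.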
arXiv Detail & Related papers (2022-06-10T02:14:59Z)
- Low-Rank Constraints for Fast Inference in Structured Models [110.38427965904266]
This work demonstrates a simple approach to reduce the computational and memory complexity of a large class of structured models.
Experiments with neural parameterized structured models for language modeling, polyphonic music modeling, unsupervised grammar induction, and video modeling show that our approach matches the accuracy of standard models at large state spaces.
arXiv Detail & Related papers (2022-01-08T00:47:50Z)
- Analysis of ODE2VAE with Examples [0.0]
Ordinary Differential Equation Variational Auto-Encoder (ODE2VAE) is a deep latent variable model.
We show that the model is able to learn meaningful latent representations to an extent without any supervision.
arXiv Detail & Related papers (2021-08-10T20:12:26Z)
- Closed-form Continuous-Depth Models [99.40335716948101]
Continuous-depth neural models rely on advanced numerical differential equation solvers.
We present a new family of models, termed Closed-form Continuous-depth (CfC) networks, that are simple to describe and at least one order of magnitude faster.
arXiv Detail & Related papers (2021-06-25T22:08:51Z)
- S2RMs: Spatially Structured Recurrent Modules [105.0377129434636]
We take a step towards models that can simultaneously exploit both modular and temporal structure.
We find our models to be robust to the number of available views and better capable of generalization to novel tasks without additional training.
arXiv Detail & Related papers (2020-07-13T17:44:30Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.