Extracting Dynamical Models from Data
- URL: http://arxiv.org/abs/2110.06917v6
- Date: Wed, 31 Jan 2024 16:29:09 GMT
- Title: Extracting Dynamical Models from Data
- Authors: Michael F. Zimmer
- Abstract summary: The problem of determining the underlying dynamics of a system when only given data of its state over time has challenged scientists for decades.
In this paper, the approach of using machine learning to model the updates of the phase space variables is introduced.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The problem of determining the underlying dynamics of a system when only
given data of its state over time has challenged scientists for decades. In
this paper, the approach of using machine learning to model the updates of the
phase space variables is introduced; this is done as a function of the phase
space variables. (More generally, the modeling is done over functions of the
jet space.) This approach (named FJet) allows one to accurately replicate the
dynamics, and is demonstrated on the examples of the damped harmonic
oscillator, the damped pendulum, and the Duffing oscillator; the underlying
differential equation is also accurately recovered for each example. In
addition, the results in no way depend on how the data is sampled over time
(i.e., regularly or irregularly). It is demonstrated that a regression
implementation of FJet is similar to the model resulting from a Taylor series
expansion of the Runge-Kutta (RK) numerical integration scheme. This
identification confers the advantage of explicitly revealing the function space
to use in the modeling, as well as the associated uncertainty quantification
for the updates. Finally, in the undamped harmonic oscillator
example, the updates are shown to remain stable $10^9$ times longer than
with $4$th-order RK (with time step $0.1$).
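The core idea above (learning the per-step update of the phase-space variables as a function of those variables) can be illustrated with a minimal regression sketch. This is not the paper's FJet implementation; the oscillator parameters, feature choice, and use of RK4-generated training pairs are assumptions made here for illustration. For this linear oscillator a linear feature map suffices, and the learned update map reproduces the RK4 step:

```python
import numpy as np

# Damped harmonic oscillator: x'' = -x - 0.1 x', written as s' = (v, -x - 0.1 v)
def rhs(s):
    x, v = s
    return np.array([v, -x - 0.1 * v])

def rk4_step(s, dt):
    k1 = rhs(s); k2 = rhs(s + dt / 2 * k1)
    k3 = rhs(s + dt / 2 * k2); k4 = rhs(s + dt * k3)
    return s + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

# Training pairs: sampled states and their one-step updates (here from RK4;
# in practice these would come from observed trajectory data)
dt = 0.1
rng = np.random.default_rng(0)
states = rng.uniform(-1, 1, size=(200, 2))
updates = np.array([rk4_step(s, dt) - s for s in states])

# Feature map over the phase-space variables; linear features suffice for a
# linear ODE (FJet more generally uses functions over the jet space)
def features(S):
    return np.column_stack([S[:, 0], S[:, 1]])

Phi = features(states)
coef, *_ = np.linalg.lstsq(Phi, updates, rcond=None)  # coef: (2, 2) update map

# Roll out the learned update map and compare with direct RK4 integration
s_learned = np.array([1.0, 0.0]); s_ref = s_learned.copy()
for _ in range(100):
    s_learned = s_learned + features(s_learned[None])[0] @ coef
    s_ref = rk4_step(s_ref, dt)
print(np.allclose(s_learned, s_ref, atol=1e-6))  # prints True
```

Because the RK4 step for a linear ODE is itself linear in the state, the least-squares fit recovers that linear map essentially exactly, which mirrors the paper's observation that a regression implementation of FJet resembles a Taylor-series expansion of the RK scheme.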
Related papers
- Turning mechanistic models into forecasters by using machine learning [5.9650173644260605]
We develop a forecasting model for complex dynamical systems using time-varying parameters.
Our model achieves a mean absolute error below 3% for learning a time series and below 6% for forecasting up to a month ahead.
Our findings demonstrate that integrating time-varying parameters into data-driven discovery of differential equations improves both modeling accuracy and forecasting performance.
arXiv Detail & Related papers (2026-02-04T01:00:08Z) - Generative Learning for Slow Manifolds and Bifurcation Diagrams [0.35587965024910395]
Conditional score-based generative models (cSGMs) have demonstrated capabilities in generating plausible data from target distributions conditioned on some given label.
We present a framework for using cSGMs to quickly initialize on a low-dimensional (reduced-order) slow manifold of a multi-time-scale system.
This conditional sampling can help uncover the geometry of the reduced slow manifold and/or approximately "fill in" missing segments of steady states in a bifurcation diagram.
arXiv Detail & Related papers (2025-04-29T02:38:44Z) - Efficient dynamic modal load reconstruction using physics-informed Gaussian processes based on frequency-sparse Fourier basis functions [0.0]
This paper presents an efficient dynamic load reconstruction method using physics-informed Gaussian processes (GPs).
The GP's covariance matrices are built using the description of the system dynamics, and the model is trained using structural response measurements.
The developed model holds potential for applications in structural health monitoring, damage prognosis, and load model validation.
arXiv Detail & Related papers (2025-03-12T14:16:27Z) - Value function estimation using conditional diffusion models for control [62.27184818047923]
We propose a simple algorithm called Diffused Value Function (DVF)
It learns a joint multi-step model of the environment-robot interaction dynamics using a diffusion model.
We show how DVF can be used to efficiently capture the state visitation measure for multiple controllers.
arXiv Detail & Related papers (2023-06-09T18:40:55Z) - Generative modeling for time series via Schrödinger bridge [0.0]
We propose a novel generative model for time series based on the Schrödinger bridge (SB) approach.
This consists in entropic interpolation via optimal transport between a reference probability measure on path space and a target measure consistent with the joint data distribution of the time series.
arXiv Detail & Related papers (2023-04-11T09:45:06Z) - Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z) - Score-based Diffusion Models in Function Space [140.792362459734]
Diffusion models have recently emerged as a powerful framework for generative modeling.
We introduce a mathematically rigorous framework called Denoising Diffusion Operators (DDOs) for training diffusion models in function space.
We show that the corresponding discretized algorithm generates accurate samples at a fixed cost independent of the data resolution.
arXiv Detail & Related papers (2023-02-14T23:50:53Z) - Score-based Continuous-time Discrete Diffusion Models [102.65769839899315]
We extend diffusion models to discrete variables by introducing a Markov jump process where the reverse process denoises via a continuous-time Markov chain.
We show that an unbiased estimator can be obtained via simple matching the conditional marginal distributions.
We demonstrate the effectiveness of the proposed method on a set of synthetic and real-world music and image benchmarks.
arXiv Detail & Related papers (2022-11-30T05:33:29Z) - Discovering Dynamic Patterns from Spatiotemporal Data with Time-Varying Low-Rank Autoregression [12.923271427789267]
We develop a time-reduced-rank vector autoregression model whose coefficients are parameterized by low-rank tensor factorization.
In the temporal context, the complex time-varying system behaviors can be revealed by the temporal modes in the proposed model.
arXiv Detail & Related papers (2022-11-28T15:59:52Z) - On the Dynamics of Inference and Learning [0.0]
We present a treatment of this Bayesian updating process as a continuous dynamical system.
We show that when the Cramér-Rao bound is saturated the learning rate is governed by a simple $1/T$ power-law.
arXiv Detail & Related papers (2022-04-19T18:04:36Z) - Learning continuous models for continuous physics [94.42705784823997]
We develop a test based on numerical analysis theory to validate machine learning models for science and engineering applications.
Our results illustrate how principled numerical analysis methods can be coupled with existing ML training/testing methodologies to validate models for science and engineering applications.
arXiv Detail & Related papers (2022-02-17T07:56:46Z) - Closed-form discovery of structural errors in models of chaotic systems by integrating Bayesian sparse regression and data assimilation [0.0]
We introduce a framework named MEDIDA: Model Error Discovery with Interpretability and Data Assimilation.
In MEDIDA, first the model error is estimated from differences between the observed states and model-predicted states.
If observations are noisy, a data assimilation technique such as ensemble Kalman filter (EnKF) is first used to provide a noise-free analysis state of the system.
Finally, an equation-discovery technique, such as the relevance vector machine (RVM), a sparsity-promoting Bayesian method, is used to identify an interpretable, parsimonious, closed-form representation of the model error.
arXiv Detail & Related papers (2021-10-01T17:19:28Z) - Anomaly Detection of Time Series with Smoothness-Inducing Sequential Variational Auto-Encoder [59.69303945834122]
We present a Smoothness-Inducing Sequential Variational Auto-Encoder (SISVAE) model for robust estimation and anomaly detection of time series.
Our model parameterizes mean and variance for each time-stamp with flexible neural networks.
We show the effectiveness of our model on both synthetic datasets and public real-world benchmarks.
arXiv Detail & Related papers (2021-02-02T06:15:15Z) - Stochastic embeddings of dynamical phenomena through variational autoencoders [1.7205106391379026]
We use a recognition network to increase the observed space dimensionality during the reconstruction of the phase space.
Our validation shows that this approach not only recovers a state space that resembles the original one, but is also able to synthesize new time series.
arXiv Detail & Related papers (2020-10-13T10:10:24Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.