HyperSINDy: Deep Generative Modeling of Nonlinear Stochastic Governing
Equations
- URL: http://arxiv.org/abs/2310.04832v1
- Date: Sat, 7 Oct 2023 14:41:59 GMT
- Title: HyperSINDy: Deep Generative Modeling of Nonlinear Stochastic Governing
Equations
- Authors: Mozes Jacobs, Bingni W. Brunton, Steven L. Brunton, J. Nathan Kutz,
Ryan V. Raut
- Abstract summary: We introduce HyperSINDy, a framework for modeling dynamics via a deep generative model of sparse governing equations from data.
Once trained, HyperSINDy generates dynamics via a differential equation whose coefficients are driven by Gaussian white noise.
In experiments, HyperSINDy recovers ground truth governing equations, with learned stochasticity scaling to match that of the data.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The discovery of governing differential equations from data is an open
frontier in machine learning. The sparse identification of nonlinear dynamics
(SINDy) \citep{brunton_discovering_2016} framework enables data-driven
discovery of interpretable models in the form of sparse, deterministic
governing laws. Recent works have sought to adapt this approach to the
stochastic setting, though these adaptations are severely hampered by the curse
of dimensionality. On the other hand, Bayesian-inspired deep learning methods
have achieved widespread success in high-dimensional probabilistic modeling via
computationally efficient approximate inference techniques, suggesting the use
of these techniques for efficient stochastic equation discovery. Here, we
introduce HyperSINDy, a framework for modeling stochastic dynamics via a deep
generative model of sparse governing equations whose parametric form is
discovered from data. HyperSINDy employs a variational encoder to approximate
the distribution of observed states and derivatives. A hypernetwork
\citep{ha_hypernetworks_2016} transforms samples from this distribution into
the coefficients of a differential equation whose sparse form is learned
simultaneously using a trainable binary mask \citep{louizos_learning_2018}.
Once trained, HyperSINDy generates stochastic dynamics via a differential
equation whose coefficients are driven by a Gaussian white noise. In
experiments, HyperSINDy accurately recovers ground truth stochastic governing
equations, with learned stochasticity scaling to match that of the data.
Finally, HyperSINDy provides uncertainty quantification that scales to
high-dimensional systems. Taken together, HyperSINDy offers a promising
framework for model discovery and uncertainty quantification in real-world
systems, integrating sparse equation discovery methods with advances in
statistical machine learning and deep generative modeling.
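The generative mechanism described in the abstract (latent Gaussian sample → hypernetwork → masked sparse coefficients → integration step) can be sketched in a few lines of NumPy. This is a minimal illustration under stated assumptions, not the paper's trained model: the mask, hypernetwork weights, and feature library below are random or hand-picked stand-ins for quantities that HyperSINDy learns from data.

```python
import numpy as np

rng = np.random.default_rng(0)

def library(x):
    """Polynomial feature library up to degree 2 for a 2-D state: (1, x, y, x^2, xy, y^2)."""
    x1, x2 = x
    return np.array([1.0, x1, x2, x1 ** 2, x1 * x2, x2 ** 2])

# Hypothetical "learned" pieces (random stand-ins, NOT a trained model):
# a binary mask encoding the sparse structure, and a linear hypernetwork
# mapping a latent Gaussian sample z to all library coefficients at once.
n_terms, n_state, latent_dim = 6, 2, 4
mask = np.array([[0, 1, 1, 0, 0, 0],
                 [0, 1, 0, 0, 1, 0]], dtype=float)   # sparse term selection
W = rng.normal(scale=0.1, size=(n_state * n_terms, latent_dim))
b = rng.normal(scale=0.1, size=n_state * n_terms)

def sample_coefficients():
    """Draw z ~ N(0, I), map it through the hypernetwork, apply the binary mask."""
    z = rng.standard_normal(latent_dim)
    xi = (W @ z + b).reshape(n_state, n_terms)
    return xi * mask

def simulate(x0, dt=0.01, steps=1000):
    """Generate stochastic dynamics: a fresh coefficient sample at every
    step plays the role of the white-noise drive on the equation."""
    x = np.array(x0, dtype=float)
    traj = [x.copy()]
    for _ in range(steps):
        xi = sample_coefficients()        # new coefficients each step
        x = x + dt * (xi @ library(x))    # sparse ODE step with sampled coefficients
        traj.append(x.copy())
    return np.array(traj)

traj = simulate([1.0, 0.5])
```

Because the mask multiplies the hypernetwork output elementwise, terms excluded from the learned sparse structure contribute exactly zero at every step, regardless of the latent sample.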
Related papers
- Learning Controlled Stochastic Differential Equations [61.82896036131116]
This work proposes a novel method for estimating both drift and diffusion coefficients of continuous, multidimensional, nonlinear controlled differential equations with non-uniform diffusion.
We provide strong theoretical guarantees, including finite-sample bounds for $L^2$, $L^\infty$, and risk metrics, with learning rates adaptive to coefficients' regularity.
Our method is available as an open-source Python library.
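Once drift and diffusion estimates are in hand, trajectories of the identified SDE are typically generated with the standard Euler–Maruyama scheme. The sketch below simulates a simple Ornstein–Uhlenbeck process; it is a generic illustration of that scheme, not the API of the paper's open-source library.

```python
import numpy as np

def euler_maruyama(f, g, x0, dt, steps, rng):
    """Simulate dX = f(X) dt + g(X) dW with the Euler-Maruyama scheme."""
    x = float(x0)
    path = [x]
    for _ in range(steps):
        dw = rng.normal(scale=np.sqrt(dt))   # Brownian increment, Var = dt
        x = x + f(x) * dt + g(x) * dw
        path.append(x)
    return np.array(path)

# Example: Ornstein-Uhlenbeck process dX = -theta*X dt + sigma dW,
# with theta = 1.0 and sigma = 0.5 (illustrative parameter choices).
rng = np.random.default_rng(1)
path = euler_maruyama(f=lambda x: -1.0 * x, g=lambda x: 0.5, x0=2.0,
                      dt=0.01, steps=2000, rng=rng)
```

Starting from x0 = 2.0, the mean-reverting drift pulls the path toward zero, after which it fluctuates with stationary standard deviation sigma / sqrt(2 * theta) ≈ 0.35.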
arXiv Detail & Related papers (2024-11-04T11:09:58Z)
- Towards Theoretical Understandings of Self-Consuming Generative Models [56.84592466204185]
This paper tackles the emerging challenge of training generative models within a self-consuming loop.
We construct a theoretical framework to rigorously evaluate how this training procedure impacts the data distributions learned by future models.
We present results for kernel density estimation, delivering nuanced insights such as the impact of mixed data training on error propagation.
arXiv Detail & Related papers (2024-02-19T02:08:09Z)
- A Geometric Perspective on Diffusion Models [57.27857591493788]
We inspect the ODE-based sampling of a popular variance-exploding SDE.
We establish a theoretical relationship between the optimal ODE-based sampling and the classic mean-shift (mode-seeking) algorithm.
arXiv Detail & Related papers (2023-05-31T15:33:16Z)
- Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z)
- Score-based Diffusion Models in Function Space [140.792362459734]
Diffusion models have recently emerged as a powerful framework for generative modeling.
We introduce a mathematically rigorous framework called Denoising Diffusion Operators (DDOs) for training diffusion models in function space.
We show that the corresponding discretized algorithm generates accurate samples at a fixed cost independent of the data resolution.
arXiv Detail & Related papers (2023-02-14T23:50:53Z)
- Extracting Governing Laws from Sample Path Data of Non-Gaussian Stochastic Dynamical Systems [4.527698247742305]
We infer equations with non-Gaussian Lévy noise from available data to reasonably predict dynamical behaviors.
We establish a theoretical framework and design a numerical algorithm to compute the asymmetric Lévy jump measure, drift and diffusion.
This method will become an effective tool in discovering the governing laws from available data sets and in understanding the mechanisms underlying complex random phenomena.
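A common first step in this kind of inference is the conditional-moment (Kramers–Moyal) estimate of drift and diffusion from sample-path increments; the sketch below applies it to synthetic Ornstein–Uhlenbeck data. Estimating the Lévy jump measure itself is more involved and is not shown; the binning scheme, thresholds, and test system here are illustrative choices, not the paper's algorithm.

```python
import numpy as np

def estimate_drift_diffusion(x, dt, bins=20):
    """Binned conditional-moment (Kramers-Moyal) estimates:
    drift(x)     ~ E[dX   | X = x] / dt
    diffusion(x) ~ E[dX^2 | X = x] / dt
    """
    dx = np.diff(x)
    centers = np.linspace(x.min(), x.max(), bins)
    half = (centers[1] - centers[0]) / 2
    drift = np.full(bins, np.nan)
    diff = np.full(bins, np.nan)
    for i, c in enumerate(centers):
        sel = np.abs(x[:-1] - c) < half      # samples falling in this bin
        if sel.sum() > 10:                   # skip poorly populated bins
            drift[i] = dx[sel].mean() / dt
            diff[i] = (dx[sel] ** 2).mean() / dt
    return centers, drift, diff

# Synthetic sample path: dX = -X dt + 0.3 dW (so drift = -x, diffusion = 0.09).
rng = np.random.default_rng(2)
dt, n = 0.01, 200_000
x = np.empty(n)
x[0] = 0.0
for t in range(n - 1):
    x[t + 1] = x[t] - x[t] * dt + 0.3 * np.sqrt(dt) * rng.standard_normal()

centers, drift, diff = estimate_drift_diffusion(x, dt)
```

For well-populated bins near the origin, the estimates should approach the true drift slope of -1 and the true diffusion level of sigma^2 = 0.09.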
arXiv Detail & Related papers (2021-07-21T14:50:36Z)
- Discovery of Nonlinear Dynamical Systems using a Runge-Kutta Inspired Dictionary-based Sparse Regression Approach [9.36739413306697]
We blend machine learning and dictionary-based learning with numerical analysis tools to discover governing differential equations.
We obtain interpretable and parsimonious models which tend to generalize better beyond the sampling regime.
We discuss its extension to governing equations, containing rational nonlinearities that typically appear in biological networks.
arXiv Detail & Related papers (2021-05-11T08:46:51Z)
- Using Data Assimilation to Train a Hybrid Forecast System that Combines Machine-Learning and Knowledge-Based Components [52.77024349608834]
We consider the problem of data-assisted forecasting of chaotic dynamical systems when the available data is noisy partial measurements.
We show that by using partial measurements of the state of the dynamical system, we can train a machine learning model to improve predictions made by an imperfect knowledge-based model.
arXiv Detail & Related papers (2021-02-15T19:56:48Z)
- Stochastic Differential Equations with Variational Wishart Diffusions [18.590352916158093]
We present a non-parametric way of inferring differential equations for both regression tasks and continuous-time dynamical modelling.
The work places strong emphasis on the stochastic part of the differential equation, also known as the diffusion, modelling it by means of Wishart processes.
arXiv Detail & Related papers (2020-06-26T10:21:35Z)
- A Data-Driven Approach for Discovering Stochastic Dynamical Systems with Non-Gaussian Lévy Noise [5.17900889163564]
We develop a new data-driven approach to extract governing laws from noisy data sets.
First, we establish a feasible theoretical framework by expressing the drift coefficient, diffusion coefficient, and jump measure.
We then design a numerical algorithm to compute the drift, diffusion coefficient and jump measure, and thus extract a governing equation with Gaussian and non-Gaussian noise.
arXiv Detail & Related papers (2020-05-07T21:29:17Z)
- Data-Driven Discovery of Coarse-Grained Equations [0.0]
Multiscale modeling and simulations are two areas where learning on simulated data can lead to such discovery.
We replace the human discovery of such models with a machine-learning strategy based on sparse regression that can be executed in two modes.
A series of examples demonstrates the accuracy, robustness, and limitations of our approach to equation discovery.
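The sparse-regression strategy mentioned above is, at its core, the sequential thresholded least-squares (STLSQ) procedure popularized by SINDy: solve a least-squares problem over a candidate library, zero out small coefficients, and refit on the surviving terms. A minimal version is sketched below on synthetic linear dynamics; the library, threshold, and data are illustrative choices, not the paper's setup.

```python
import numpy as np

def stlsq(theta, dxdt, threshold=0.1, iters=10):
    """Sequential thresholded least squares: alternate between a
    least-squares fit and zeroing coefficients below the threshold."""
    xi = np.linalg.lstsq(theta, dxdt, rcond=None)[0]
    for _ in range(iters):
        small = np.abs(xi) < threshold
        xi[small] = 0.0
        for k in range(dxdt.shape[1]):       # refit each state dimension
            big = ~small[:, k]
            if big.any():
                xi[big, k] = np.linalg.lstsq(theta[:, big], dxdt[:, k],
                                             rcond=None)[0]
    return xi

# Recover dx/dt = -2x + y, dy/dt = x - 0.5y from noisy derivative data.
rng = np.random.default_rng(3)
X = rng.normal(size=(500, 2))
theta = np.column_stack([np.ones(500), X[:, 0], X[:, 1],
                         X[:, 0] ** 2, X[:, 0] * X[:, 1], X[:, 1] ** 2])
true_xi = np.array([[0, 0], [-2, 1], [1, -0.5],
                    [0, 0], [0, 0], [0, 0]], dtype=float)
dxdt = theta @ true_xi + 0.01 * rng.normal(size=(500, 2))

xi = stlsq(theta, dxdt)
```

With low noise and a well-conditioned library, the thresholding step prunes the spurious quadratic terms and the refit recovers the four true coefficients closely.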
arXiv Detail & Related papers (2020-01-30T23:41:37Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences of its use.