Learning Quantities of Interest from Dynamical Systems for
Observation-Consistent Inversion
- URL: http://arxiv.org/abs/2009.06918v3
- Date: Fri, 16 Jul 2021 17:08:10 GMT
- Title: Learning Quantities of Interest from Dynamical Systems for
Observation-Consistent Inversion
- Authors: Steven Mattis and Kyle Robert Steffen and Troy Butler and Clint N.
Dawson and Donald Estep
- Abstract summary: We present a new framework, Learning Uncertain Quantities (LUQ), that facilitates the tractable solution of SIPs in dynamical systems.
LUQ provides routines for filtering data, unsupervised learning of the underlying dynamics, classifying observations, and feature extraction to learn the QoI map.
For scientific use, we provide links to our Python implementation of LUQ and to all data and scripts required to reproduce the results in this manuscript.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Dynamical systems arise in a wide variety of mathematical models from science
and engineering. A common challenge is to quantify uncertainties on model
inputs (parameters) that correspond to a quantitative characterization of
uncertainties on observable Quantities of Interest (QoI). To this end, we
consider a stochastic inverse problem (SIP) with a solution described by a
pullback probability measure. We call this an observation-consistent solution,
as its subsequent push-forward through the QoI map matches the observed
probability distribution on model outputs. A distinction is made between QoI
useful for solving the SIP and arbitrary model output data. In dynamical
systems, model output data are often given as a series of state variable
responses recorded over a particular time window. Consequently, the dimension
of output data can easily reach $\mathcal{O}(10^4)$ or more due to the
frequency of observations, and the correct choice or construction of a QoI from
this data is not self-evident. We present a new framework, Learning Uncertain
Quantities (LUQ), that facilitates the tractable solution of SIPs for dynamical
systems. Given ensembles of predicted (simulated) time series and (noisy)
observed data, LUQ provides routines for filtering data, unsupervised learning
of the underlying dynamics, classifying observations, and feature extraction to
learn the QoI map. Subsequently, time series data are transformed into samples
of the underlying predicted and observed distributions associated with the QoI
so that solutions to the SIP are computable. Following the introduction and
demonstration of LUQ, numerical results from several SIPs are presented for a
variety of dynamical systems arising in the life and physical sciences. For
scientific reproducibility, we provide links to our Python implementation of
LUQ and to all data and scripts required to reproduce the results in this
manuscript.
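The observation-consistent (pullback) solution described above has a simple sampling characterization: reweight prior parameter samples by the ratio of the observed density to the push-forward (predicted) density evaluated at their QoI values, so that the accepted samples push forward to the observed distribution. A minimal rejection-sampling sketch of this idea, using a synthetic one-dimensional `qoi_map` and an assumed Gaussian observed density (both illustrative stand-ins, not from the paper's examples):

```python
import numpy as np
from scipy.stats import gaussian_kde

# Hypothetical 1-D model: the QoI map takes a parameter sample to an output.
def qoi_map(lam):
    return lam ** 2

rng = np.random.default_rng(0)

# Initial (prior) parameter samples and their predicted QoI values.
lam = rng.uniform(0.0, 2.0, size=20000)
q_pred = qoi_map(lam)

# Observed QoI density (a synthetic Gaussian, mean 1.0, std 0.1, for illustration).
def pi_obs(q):
    return np.exp(-0.5 * ((q - 1.0) / 0.1) ** 2) / (0.1 * np.sqrt(2 * np.pi))

# Push-forward density of the prior through the QoI map, estimated via KDE.
pi_pred = gaussian_kde(q_pred)

# Rejection sampling with the ratio r = pi_obs(Q(lam)) / pi_pred(Q(lam)):
# accepted samples are distributed per the observation-consistent (pullback) measure.
r = pi_obs(q_pred) / pi_pred(q_pred)
accept = rng.uniform(0.0, r.max(), size=lam.size) < r
lam_post = lam[accept]

# The push-forward of the accepted samples should match the observed density:
# Q(lam_post) concentrates near 1.0 with spread close to 0.1.
print(lam_post.size, np.mean(qoi_map(lam_post)), np.std(qoi_map(lam_post)))
```

Here the density ratio plays the role of the Radon-Nikodym derivative between the observed and predicted measures on QoI values; in practice the quality of the result hinges on having a QoI map for which both densities can be estimated well, which is precisely the map LUQ is designed to learn.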
Related papers
- From Displacements to Distributions: A Machine-Learning Enabled
Framework for Quantifying Uncertainties in Parameters of Computational Models
This work presents novel extensions for combining two frameworks for quantifying uncertainties in the modeling of engineered systems.
The data-consistent iteration (DC) framework poses an inverse problem and solution for quantifying aleatoric uncertainties in terms of pullback and push-forward measures for a given Quantity of Interest (QoI) map.
The Learning Uncertain Quantities (LUQ) framework defines a formal three-step machine-learning enabled process for transforming noisy datasets into samples of a learned QoI map.
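The three-step process described above (filter noisy time series, learn low-dimensional features, evaluate the learned QoI on each sample) can be sketched with generic stand-ins; the Savitzky-Golay filter, plain PCA features, and synthetic sinusoid ensemble below are illustrative choices, not the LUQ implementation:

```python
import numpy as np
from scipy.signal import savgol_filter

rng = np.random.default_rng(1)
t = np.linspace(0, 2 * np.pi, 200)

# Synthetic ensemble: sinusoids with uncertain amplitude, plus observation noise.
amps = rng.uniform(0.5, 1.5, size=300)
series = amps[:, None] * np.sin(t)[None, :] \
    + 0.1 * rng.standard_normal((300, t.size))

# Step 1: filter each noisy series (Savitzky-Golay smoothing as one simple choice).
filtered = savgol_filter(series, window_length=31, polyorder=3, axis=1)

# Step 2: learn low-dimensional features, here via PCA (SVD of the centered ensemble).
centered = filtered - filtered.mean(axis=0)
_, s, vt = np.linalg.svd(centered, full_matrices=False)

# Step 3: project each series onto the leading mode; these projections serve as
# samples of a learned scalar QoI, which here tracks the uncertain amplitude.
qoi_samples = centered @ vt[0]
print(qoi_samples.shape)
```

In this toy setup the leading principal component recovers the sinusoidal shape, so the projections are (up to sign and scale) samples of the uncertain amplitude, turning high-dimensional time series into a one-dimensional QoI suitable for inversion.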
arXiv Detail & Related papers (2024-03-04T20:40:50Z)
- InVAErt networks: a data-driven framework for model synthesis and
identifiability analysis
inVAErt is a framework for data-driven analysis and synthesis of physical systems.
It uses a deterministic decoder to represent the forward and inverse maps, a normalizing flow to capture the probabilistic distribution of system outputs, and a variational encoder to learn a compact latent representation that accounts for the lack of bijectivity between inputs and outputs.
arXiv Detail & Related papers (2023-07-24T07:58:18Z)
- Interpretable reduced-order modeling with time-scale separation
Partial Differential Equations (PDEs) with high dimensionality are commonly encountered in computational physics and engineering.
We propose a data-driven scheme that automates the identification of the time-scales involved.
We show that this data-driven scheme can automatically learn the independent processes that decompose a system of linear ODEs.
arXiv Detail & Related papers (2023-03-03T19:23:59Z)
- Capturing Actionable Dynamics with Structured Latent Ordinary
Differential Equations
We propose a structured latent ODE model that captures system input variations within its latent representation.
Building on a static variable specification, our model learns factors of variation for each input to the system, thus separating the effects of the system inputs in the latent space.
arXiv Detail & Related papers (2022-02-25T20:00:56Z)
- Mixed Effects Neural ODE: A Variational Approximation for Analyzing the
Dynamics of Panel Data
We propose a probabilistic model called ME-NODE that incorporates mixed (fixed and random) effects for analyzing panel data.
We show that our model can be derived using smooth approximations of SDEs provided by the Wong-Zakai theorem.
We then derive evidence lower bounds for ME-NODE and develop efficient training algorithms.
arXiv Detail & Related papers (2022-02-18T22:41:51Z)
- Learning continuous models for continuous physics
We develop a test based on numerical analysis theory to validate machine learning models for science and engineering applications.
Our results illustrate how principled numerical analysis methods can be coupled with existing ML training/testing methodologies to validate such models.
arXiv Detail & Related papers (2022-02-17T07:56:46Z)
- Leveraging the structure of dynamical systems for data-driven modeling
We consider the impact of the training set and its structure on the quality of the long-term prediction.
We show how an informed design of the training set, based on invariants of the system and the structure of the underlying attractor, significantly improves the resulting models.
arXiv Detail & Related papers (2021-12-15T20:09:20Z)
- Modeling Systems with Machine Learning based Differential Equations
We propose the design of time-continuous models of dynamical systems as solutions of differential equations.
Our results suggest that this approach can be a useful technique for both synthetic and experimental data.
arXiv Detail & Related papers (2021-09-09T19:10:46Z)
- Using Data Assimilation to Train a Hybrid Forecast System that Combines
Machine-Learning and Knowledge-Based Components
We consider the problem of data-assisted forecasting of chaotic dynamical systems when the available data consists of noisy partial measurements.
We show that by using partial measurements of the state of the dynamical system, we can train a machine learning model to improve predictions made by an imperfect knowledge-based model.
arXiv Detail & Related papers (2021-02-15T19:56:48Z)
- Stochastically forced ensemble dynamic mode decomposition for
forecasting and analysis of near-periodic systems
We introduce a novel load forecasting method in which observed dynamics are modeled as a forced linear system.
We show that its use of intrinsic linear dynamics offers a number of desirable properties in terms of interpretability and parsimony.
Results are presented for a test case using load data from an electrical grid.
arXiv Detail & Related papers (2020-10-08T20:25:52Z)
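The forced-linear-system viewpoint above builds on dynamic mode decomposition (DMD), which fits a linear operator to time-shifted snapshot matrices. A plain, unforced DMD sketch on synthetic near-periodic data (an illustration of the base technique, not the paper's stochastically forced ensemble variant):

```python
import numpy as np

# Minimal DMD: fit a linear operator A such that x_{k+1} ~= A x_k
# from snapshot data, then forecast one step past the data.
t = np.arange(0, 200)
X = np.vstack([np.cos(0.1 * t), np.sin(0.1 * t)])  # 2-D near-periodic snapshots

X1, X2 = X[:, :-1], X[:, 1:]   # time-shifted snapshot matrices
A = X2 @ np.linalg.pinv(X1)    # least-squares linear dynamics

# For this exactly periodic signal, A recovers the rotation by 0.1 radians,
# so the one-step forecast matches the true continuation of the signal.
x_next = A @ X[:, -1]
print(x_next)
```

The eigenvalues of `A` encode the oscillation frequencies and growth/decay rates, which is the source of the interpretability the summary above mentions; the forced variant augments the snapshot regression with an exogenous input term.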
This list is automatically generated from the titles and abstracts of the papers in this site.