Operator inference with roll outs for learning reduced models from
scarce and low-quality data
- URL: http://arxiv.org/abs/2212.01418v1
- Date: Fri, 2 Dec 2022 19:41:31 GMT
- Title: Operator inference with roll outs for learning reduced models from
scarce and low-quality data
- Authors: Wayne Isaac Tan Uy and Dirk Hartmann and Benjamin Peherstorfer
- Abstract summary: We propose to combine data-driven modeling via operator inference with the dynamic training via roll outs of neural ordinary differential equations.
We show that operator inference with roll outs provides predictive models from training trajectories even if data are sampled sparsely in time and polluted with noise of up to 10%.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Data-driven modeling has become a key building block in computational science
and engineering. However, data that are available in science and engineering
are typically scarce, often polluted with noise and affected by measurement
errors and other perturbations, which makes learning the dynamics of systems
challenging. In this work, we propose to combine data-driven modeling via
operator inference with the dynamic training via roll outs of neural ordinary
differential equations. Operator inference with roll outs inherits
interpretability, scalability, and structure preservation of traditional
operator inference while leveraging the dynamic training via roll outs over
multiple time steps to increase stability and robustness for learning from
low-quality and noisy data. Numerical experiments with data describing shallow
water waves and surface quasi-geostrophic dynamics demonstrate that operator
inference with roll outs provides predictive models from training trajectories
even if data are sampled sparsely in time and polluted with noise of up to 10%.
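The two ingredients of the abstract can be sketched in a few lines: classic operator inference fits reduced-model operators by least squares on finite-difference time derivatives, while roll-out training refines them by penalizing multi-step forecast error. The following is a minimal toy sketch under our own assumptions (a hypothetical 2D quadratic system, forward-Euler roll outs, numerical gradients), not the authors' implementation:

```python
# Toy sketch of operator inference with roll outs (illustrative only).
# We learn a linear-quadratic model dx/dt = O @ features(x) for a 2D system:
# first by least squares on finite-difference derivatives (classic operator
# inference), then by descending a multi-step roll-out loss.
import numpy as np

rng = np.random.default_rng(0)
dt, n_steps = 0.01, 400

def true_rhs(x):
    # hypothetical ground-truth quadratic dynamics for the demo
    return np.array([-x[0] + 0.5 * x[0] * x[1], -2.0 * x[1] - 0.5 * x[0] ** 2])

# generate a noisy training trajectory with forward Euler
X = np.empty((n_steps + 1, 2))
X[0] = [1.0, 0.5]
for k in range(n_steps):
    X[k + 1] = X[k] + dt * true_rhs(X[k])
X_noisy = X + 0.01 * rng.standard_normal(X.shape)

def features(x):
    # linear terms plus the unique quadratic monomials
    return np.array([x[0], x[1], x[0] ** 2, x[0] * x[1], x[1] ** 2])

# --- classic operator inference: least squares on dX/dt = O @ features(x) ---
dXdt = (X_noisy[1:] - X_noisy[:-1]) / dt
Phi = np.array([features(x) for x in X_noisy[:-1]])
O = np.linalg.lstsq(Phi, dXdt, rcond=None)[0].T  # operator matrix, shape (2, 5)

def rollout_loss(O, horizon=5):
    # mean squared error of multi-step forecasts started at data points
    loss, count = 0.0, 0
    for k in range(0, n_steps - horizon, horizon):
        x = X_noisy[k].copy()
        for j in range(1, horizon + 1):
            x = x + dt * O @ features(x)  # roll the learned model out
            loss += np.sum((x - X_noisy[k + j]) ** 2)
            count += 1
    return loss / count

# --- refine O with a few numerical-gradient steps on the roll-out loss ---
lr, eps = 1e-3, 1e-5
for _ in range(20):
    base = rollout_loss(O)
    grad = np.zeros_like(O)
    for idx in np.ndindex(O.shape):
        Op = O.copy()
        Op[idx] += eps
        grad[idx] = (rollout_loss(Op) - base) / eps
    O -= lr * grad

print("final roll-out loss:", rollout_loss(O))
```

The roll-out loss couples each operator entry to the model's behavior over several time steps, which is what gives the method its robustness to sparse, noisy samples; a real implementation would use a proper integrator and automatic differentiation instead of Euler steps and finite differences.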
Related papers
- Learning with Noisy Foundation Models [95.50968225050012]
This paper is the first work to comprehensively understand and analyze the nature of noise in pre-training datasets.
We propose a tuning method (NMTune) that applies an affine transformation to the feature space to mitigate the malignant effect of noise and improve generalization.

arXiv Detail & Related papers (2024-03-11T16:22:41Z)
- CoDBench: A Critical Evaluation of Data-driven Models for Continuous
Dynamical Systems [8.410938527671341]
We introduce CoDBench, an exhaustive benchmarking suite comprising 11 state-of-the-art data-driven models for solving differential equations.
Specifically, we evaluate 4 distinct categories of models, viz., feed forward neural networks, deep operator regression models, frequency-based neural operators, and transformer architectures.
We conduct extensive experiments, assessing the operators' capabilities in learning, zero-shot super-resolution, data efficiency, robustness to noise, and computational efficiency.
arXiv Detail & Related papers (2023-10-02T21:27:54Z)
- Understanding and Mitigating the Label Noise in Pre-training on
Downstream Tasks [91.15120211190519]
This paper aims to understand the nature of noise in pre-training datasets and to mitigate its impact on downstream tasks.
We propose a light-weight black-box tuning method (NMTune) that applies an affine transformation to the feature space to mitigate the malignant effect of noise.
arXiv Detail & Related papers (2023-09-29T06:18:15Z)
- Leveraging Neural Koopman Operators to Learn Continuous Representations
of Dynamical Systems from Scarce Data [0.0]
We propose a new deep Koopman framework that represents dynamics in an intrinsically continuous way.
This framework leads to better performance on limited training data.
arXiv Detail & Related papers (2023-03-13T10:16:19Z)
- Towards Robust Dataset Learning [90.2590325441068]
We propose a principled tri-level optimization formulation of the robust dataset learning problem.
Under an abstraction model that characterizes robust vs. non-robust features, the proposed method provably learns a robust dataset.
arXiv Detail & Related papers (2022-11-19T17:06:10Z)
- Stabilizing Machine Learning Prediction of Dynamics: Noise and
Noise-inspired Regularization [58.720142291102135]
Recent work has shown that machine learning (ML) models can be trained to accurately forecast the dynamics of chaotic dynamical systems.
In the absence of mitigating techniques, this approach can suffer from artificially rapid error growth, leading to inaccurate predictions and/or climate instability.
We introduce Linearized Multi-Noise Training (LMNT), a regularization technique that deterministically approximates the effect of many small, independent noise realizations added to the model input during training.
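The idea of replacing many sampled noise realizations with a deterministic surrogate has a classic closed form in the linear least-squares setting: averaging the loss over small Gaussian input perturbations equals the clean loss plus a Tikhonov penalty on the weights. The sketch below illustrates that equivalence on synthetic data (our own illustration of the general principle, not the LMNT algorithm itself):

```python
# Illustration of noise-inspired regularization: for a linear model with
# squared loss, training on inputs perturbed by small Gaussian noise is
# equivalent in expectation to adding a ridge penalty sigma^2 * ||W||_F^2,
# which can be applied deterministically instead of sampling noise.
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((200, 3))          # synthetic inputs
W = rng.standard_normal((3, 2))            # a fixed linear model
Y = X @ W + 0.1 * rng.standard_normal((200, 2))
sigma = 0.05                               # input-noise level

def mse(W_hat, X_in):
    return np.mean(np.sum((X_in @ W_hat - Y) ** 2, axis=1))

# Monte-Carlo estimate: average loss over many noisy copies of the input
mc = np.mean([mse(W, X + sigma * rng.standard_normal(X.shape))
              for _ in range(2000)])

# Deterministic equivalent: clean loss plus the Frobenius-norm penalty
det = mse(W, X) + sigma ** 2 * np.sum(W ** 2)
print(mc, det)  # the two agree up to Monte-Carlo error
```

LMNT generalizes this kind of deterministic approximation beyond the linear case by linearizing the model around the training inputs, avoiding the cost and stochasticity of sampling many noise realizations explicitly.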
arXiv Detail & Related papers (2022-11-09T23:40:52Z)
- Physics-Inspired Temporal Learning of Quadrotor Dynamics for Accurate
Model Predictive Trajectory Tracking [76.27433308688592]
Accurately modeling a quadrotor's system dynamics is critical for guaranteeing agile, safe, and stable navigation.
We present a novel Physics-Inspired Temporal Convolutional Network (PI-TCN) approach to learning a quadrotor's system dynamics purely from robot experience.
Our approach combines the expressive power of sparse temporal convolutions and dense feed-forward connections to make accurate system predictions.
arXiv Detail & Related papers (2022-06-07T13:51:35Z)
- Deep Active Learning with Noise Stability [24.54974925491753]
Uncertainty estimation for unlabeled data is crucial to active learning.
We propose a novel algorithm that leverages noise stability to estimate data uncertainty.
Our method is generally applicable in various tasks, including computer vision, natural language processing, and structured data analysis.
arXiv Detail & Related papers (2022-05-26T13:21:01Z)
- Active operator inference for learning low-dimensional dynamical-system
models from noisy data [0.0]
Noise poses a challenge for learning dynamical-system models because even small variations can distort the dynamics described by trajectory data.
This work builds on operator inference from scientific machine learning to infer low-dimensional models from high-dimensional state trajectories polluted with noise.
arXiv Detail & Related papers (2021-07-20T04:30:07Z)
- Using Data Assimilation to Train a Hybrid Forecast System that Combines
Machine-Learning and Knowledge-Based Components [52.77024349608834]
We consider the problem of data-assisted forecasting of chaotic dynamical systems when the available data is noisy partial measurements.
We show that by using partial measurements of the state of the dynamical system, we can train a machine learning model to improve predictions made by an imperfect knowledge-based model.
arXiv Detail & Related papers (2021-02-15T19:56:48Z)
- Operator Inference and Physics-Informed Learning of Low-Dimensional
Models for Incompressible Flows [5.756349331930218]
We suggest a new approach to learning structured low-order models for incompressible flow from data.
We show that learning dynamics of the velocity and pressure can be decoupled, thus leading to an efficient operator inference approach.
arXiv Detail & Related papers (2020-10-13T21:26:19Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.