Combining Machine Learning with Knowledge-Based Modeling for Scalable
Forecasting and Subgrid-Scale Closure of Large, Complex, Spatiotemporal
Systems
- URL: http://arxiv.org/abs/2002.05514v1
- Date: Mon, 10 Feb 2020 23:21:50 GMT
- Title: Combining Machine Learning with Knowledge-Based Modeling for Scalable
Forecasting and Subgrid-Scale Closure of Large, Complex, Spatiotemporal
Systems
- Authors: Alexander Wikner, Jaideep Pathak, Brian Hunt, Michelle Girvan, Troy
Arcomano, Istvan Szunyogh, Andrew Pomerance, and Edward Ott
- Abstract summary: We attempt to utilize machine learning as the essential tool for integrating past temporal data into predictions.
We propose combining two approaches: (i) a parallel machine learning prediction scheme; and (ii) a hybrid technique, for a composite prediction system composed of a knowledge-based component and a machine-learning-based component.
We demonstrate that not only can this method combining (i) and (ii) be scaled to give excellent performance for very large systems, but also that the length of time series data needed to train our multiple, parallel machine learning components is dramatically less than that necessary without parallelization.
- Score: 48.7576911714538
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We consider the commonly encountered situation (e.g., in weather forecasting)
where the goal is to predict the time evolution of a large, spatiotemporally
chaotic dynamical system when we have access to both time series data of
previous system states and an imperfect model of the full system dynamics.
Specifically, we attempt to utilize machine learning as the essential tool for
integrating the use of past data into predictions. In order to facilitate
scalability to the common scenario of interest where the spatiotemporally
chaotic system is very large and complex, we propose combining two
approaches: (i) a parallel machine learning prediction scheme; and (ii) a hybrid
technique, for a composite prediction system composed of a knowledge-based
component and a machine-learning-based component. We demonstrate that not only
can this method combining (i) and (ii) be scaled to give excellent performance
for very large systems, but also that the length of time series data needed to
train our multiple, parallel machine learning components is dramatically less
than that necessary without parallelization. Furthermore, considering cases
where computational realization of the knowledge-based component does not
resolve subgrid-scale processes, our scheme is able to use training data to
incorporate the effect of the unresolved short-scale dynamics upon the resolved
longer-scale dynamics ("subgrid-scale closure").
Related papers
- Reconstructing dynamics from sparse observations with no training on target system [0.0]
The power of the proposed hybrid machine-learning framework is demonstrated using a large number of prototypical nonlinear dynamical systems.
The framework provides a paradigm for reconstructing complex and nonlinear dynamics in the extreme situation where training data does not exist and the observations are random and sparse.
arXiv Detail & Related papers (2024-10-28T17:05:04Z) - Controlling dynamical systems to complex target states using machine
learning: next-generation vs. classical reservoir computing [68.8204255655161]
Controlling nonlinear dynamical systems with machine learning makes it possible to drive them not only into simple behavior such as periodicity but also into more complex, arbitrary dynamics.
We show first that classical reservoir computing excels at this task.
We then compare these results, obtained with different amounts of training data, to an alternative setup that uses next-generation reservoir computing instead (a minimal NG-RC sketch appears after this list).
It turns out that while the two approaches deliver comparable performance for typical amounts of training data, next-generation RC significantly outperforms classical RC when only very limited data is available.
arXiv Detail & Related papers (2023-07-14T07:05:17Z) - Leveraging the structure of dynamical systems for data-driven modeling [111.45324708884813]
We consider the impact of the training set and its structure on the quality of the long-term prediction.
We show how an informed design of the training set, based on invariants of the system and the structure of the underlying attractor, significantly improves the resulting models.
arXiv Detail & Related papers (2021-12-15T20:09:20Z) - A Meta-learning Approach to Reservoir Computing: Time Series Prediction
with Limited Data [0.0]
We present a data-driven approach to automatically extract an appropriate model structure from experimentally observed processes.
We demonstrate our approach on a simple benchmark problem, where it beats state-of-the-art meta-learning techniques.
arXiv Detail & Related papers (2021-10-07T18:23:14Z) - Supervised DKRC with Images for Offline System Identification [77.34726150561087]
Modern dynamical systems are becoming increasingly non-linear and complex.
There is a need for a framework to model these systems in a compact and comprehensive representation for prediction and control.
Our approach learns the basis functions of such a representation using supervised learning.
arXiv Detail & Related papers (2021-09-06T04:39:06Z) - Using Data Assimilation to Train a Hybrid Forecast System that Combines
Machine-Learning and Knowledge-Based Components [52.77024349608834]
We consider the problem of data-assisted forecasting of chaotic dynamical systems when the available data consists of noisy, partial measurements.
We show that by using partial measurements of the state of the dynamical system, we can train a machine learning model to improve predictions made by an imperfect knowledge-based model.
arXiv Detail & Related papers (2021-02-15T19:56:48Z) - Model-Based Deep Learning [155.063817656602]
Signal processing, communications, and control have traditionally relied on classical statistical modeling techniques.
Deep neural networks (DNNs) use generic architectures which learn to operate from data, and demonstrate excellent performance.
We are interested in hybrid techniques that combine principled mathematical models with data-driven systems to benefit from the advantages of both approaches.
arXiv Detail & Related papers (2020-12-15T16:29:49Z) - Multiscale Simulations of Complex Systems by Learning their Effective
Dynamics [10.52078600986485]
We present a systematic framework that bridges large-scale simulations and reduced-order models to Learn the Effective Dynamics (LED).
LED provides a novel, potent modality for the accurate prediction of complex systems.
LED is applicable to systems ranging from chemistry to fluid mechanics and reduces computational effort by up to two orders of magnitude.
arXiv Detail & Related papers (2020-06-24T02:35:51Z)
This list is automatically generated from the titles and abstracts of the papers on this site.