Learning to Generate Lumped Hydrological Models
- URL: http://arxiv.org/abs/2309.09904v2
- Date: Wed, 22 Nov 2023 08:33:23 GMT
- Title: Learning to Generate Lumped Hydrological Models
- Authors: Yang Yang and Ting Fong May Chui
- Abstract summary: In this study, a generative model was learned from data from over 3,000 catchments worldwide.
The model was then used to derive optimal modeling functions for over 700 different catchments.
Overall, this study demonstrates that the hydrological behavior of a catchment can be effectively described using a small number of latent variables.
- Score: 4.368211287521716
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: A lumped hydrological model structure can be considered a generative model
because, given a set of parameter values, it can generate a hydrological
modeling function that accurately predicts the behavior of a catchment under
external forcing. It is implicitly assumed that a small number of variables
(i.e., the model parameters) can sufficiently characterize variations in the
behavioral characteristics of different catchments. This study adopts this
assumption and uses a deep learning method to learn a generative model of
hydrological modeling functions directly from the forcing and runoff data of
multiple catchments. The learned generative model uses a small number of latent
variables to characterize a catchment's behavior, so that assigning values to
these latent variables produces a hydrological modeling function that resembles
a real-world catchment. The learned generative model can be used similarly to a
lumped model structure, i.e., the optimal hydrological modeling function of a
catchment can be derived by estimating optimal parameter values (or latent
variables) with a generic calibration algorithm. In this study, a generative
model was learned from data from over 3,000 catchments worldwide. The model was
then used to derive optimal modeling functions for over 700 different
catchments. The resulting modeling functions generally showed a quality that was
comparable to or better than that of 36 types of lumped model structures. Overall,
this study demonstrates that the hydrological behavior of a catchment can be
effectively described using a small number of latent variables, and that
well-fitting hydrologic model functions can be reconstructed from these
variables.
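The calibration idea in the abstract (a decoder maps a few latent variables to a runoff-generating function, and a generic algorithm searches the latent space against observed runoff) can be sketched roughly as follows. This is a minimal stand-in, not the paper's method: the `decoder` here is a hypothetical toy linear reservoir rather than the trained neural network, and plain random search stands in for a generic calibration algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

def decoder(z, forcing):
    """Hypothetical stand-in for the learned generative model: maps
    latent variables z to a runoff series for the given forcing.
    The real decoder in the paper is a trained deep network."""
    k = 1.0 / (1.0 + np.exp(-z[0]))   # recession coefficient in (0, 1)
    c = np.exp(z[1])                  # runoff ratio scale
    storage, runoff = 0.0, []
    for p in forcing:
        storage += c * p              # rainfall enters the store
        q = k * storage               # linear-reservoir release
        storage -= q
        runoff.append(q)
    return np.array(runoff)

# Synthetic "observed" catchment: forcing plus runoff from known latents.
forcing = rng.gamma(2.0, 1.5, size=200)
z_true = np.array([0.5, -0.3])
observed = decoder(z_true, forcing)

# Generic calibration: random search over the latent space, minimizing
# mean squared error (any global optimizer could be substituted here).
best_z, best_mse = None, np.inf
for _ in range(5000):
    z = rng.normal(0.0, 2.0, size=2)
    mse = np.mean((decoder(z, forcing) - observed) ** 2)
    if mse < best_mse:
        best_z, best_mse = z, mse

print("best MSE:", best_mse)
```

The point of the sketch is only the workflow: once a decoder exists, deriving a catchment's modeling function reduces to estimating a small number of latent values, exactly as parameters are estimated for a conventional lumped model structure.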
Related papers
- A statistical approach to latent dynamic modeling with differential equations [0.0]
Ordinary differential equations (ODEs) can provide mechanistic models of temporally local changes of processes.
We propose to use each observation in the course of time as the initial value to obtain multiple local ODE solutions.
We illustrate the proposed approach in an application with spinal muscular atrophy patients and a corresponding simulation study.
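The local-solutions idea above can be illustrated with a deliberately simple, assumed setup: each noisy observation serves as the initial value of a short local ODE solution, and a parameter is chosen to minimize the mismatch of all local solutions against the following observations. The exponential-decay model and grid search are hypothetical simplifications, not the paper's estimator.

```python
import numpy as np

rng = np.random.default_rng(1)

# Noisy observations of exponential decay: dx/dt = -theta * x.
theta_true = 0.7
t = np.linspace(0.0, 4.0, 21)
x_obs = np.exp(-theta_true * t) + rng.normal(0.0, 0.01, size=t.size)

def local_solutions(theta, t, x_obs, horizon=3):
    """Use each observation as the initial value of a short local ODE
    solution (closed-form here; a numerical solver in general) and
    collect predictions for the following `horizon` time points."""
    preds, targets = [], []
    for i in range(len(t) - 1):
        j = min(i + horizon, len(t) - 1)
        dt = t[i + 1:j + 1] - t[i]
        preds.append(x_obs[i] * np.exp(-theta * dt))
        targets.append(x_obs[i + 1:j + 1])
    return np.concatenate(preds), np.concatenate(targets)

# Estimate theta by minimizing the mismatch of all local solutions.
grid = np.linspace(0.1, 2.0, 400)
losses = []
for theta in grid:
    p, y = local_solutions(theta, t, x_obs)
    losses.append(np.mean((p - y) ** 2))
theta_hat = grid[int(np.argmin(losses))]
print("estimated theta:", theta_hat)
```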
arXiv Detail & Related papers (2023-11-27T20:02:55Z)
- Towards an Hybrid Hodgkin-Huxley Action Potential Generation Model [0.0]
We investigate the possibility of finding the Hodgkin-Huxley model's parametric functions using only two simple measurements.
Experiments were carried out using data generated from the original Hodgkin-Huxley model.
Results show that a simple two-layer artificial neural network architecture trained on a minimal amount of data can learn to model some of the fundamental properties of action potential generation.
arXiv Detail & Related papers (2023-03-15T22:39:23Z)
- Bayesian Learning of Coupled Biogeochemical-Physical Models [28.269731698116257]
Predictive models for marine ecosystems are used for a variety of needs.
Due to sparse measurements and limited understanding of the myriad of ocean processes, there is significant uncertainty.
We develop a Bayesian model learning methodology that allows exploration of the space of candidate models and the discovery of new models.
arXiv Detail & Related papers (2022-11-12T17:49:18Z)
- Differentiable, learnable, regionalized process-based models with physical outputs can approach state-of-the-art hydrologic prediction accuracy [1.181206257787103]
We show that differentiable, learnable, process-based models (called delta models here) can approach the performance level of LSTM for the intensively-observed variable (streamflow) with regionalized parameterization.
We use a simple hydrologic model HBV as the backbone and use embedded neural networks, which can only be trained in a differentiable programming framework.
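A toy sketch of this embedded-parameterization idea, with every specific assumed rather than taken from the paper: a single linear reservoir stands in for HBV, a one-neuron network maps a catchment attribute to the reservoir parameter (the regionalization), and finite-difference gradient descent stands in for training inside a differentiable programming framework.

```python
import numpy as np

rng = np.random.default_rng(2)

def bucket_model(k, forcing):
    """Toy process-based backbone (a hypothetical stand-in for HBV):
    a single linear reservoir with recession parameter k."""
    s, q = 0.0, []
    for p in forcing:
        s += p
        out = k * s
        s -= out
        q.append(out)
    return np.array(q)

def predicted_k(w, attr):
    """Embedded network (here a single neuron): maps a catchment
    attribute to the model parameter, regionalizing it."""
    return 1.0 / (1.0 + np.exp(-(w[0] * attr + w[1])))

# Synthetic region: each catchment's true k depends on an attribute.
attrs = rng.uniform(-1.0, 1.0, size=8)
k_true = 1.0 / (1.0 + np.exp(-(1.5 * attrs + 0.2)))
forcing = rng.gamma(2.0, 1.0, size=(8, 100))
obs = np.array([bucket_model(k, f) for k, f in zip(k_true, forcing)])

def loss(w):
    sim = np.array([bucket_model(predicted_k(w, a), f)
                    for a, f in zip(attrs, forcing)])
    return np.mean((sim - obs) ** 2)

# Train end to end on streamflow error. The real delta models rely on
# autodiff; central finite differences stand in for it in this sketch.
w = np.zeros(2)
steps = (np.array([1e-4, 0.0]), np.array([0.0, 1e-4]))
for _ in range(400):
    g = np.array([(loss(w + e) - loss(w - e)) / 2e-4 for e in steps])
    w -= 0.2 * g
print("learned regionalization weights:", w)
```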
arXiv Detail & Related papers (2022-03-28T15:06:53Z)
- Prediction of liquid fuel properties using machine learning models with Gaussian processes and probabilistic conditional generative learning [56.67751936864119]
The present work aims to construct cheap-to-compute machine learning (ML) models to act as closure equations for predicting the physical properties of alternative fuels.
Those models can be trained using the database from MD simulations and/or experimental measurements in a data-fusion-fidelity approach.
The results show that ML models can accurately predict fuel properties over a wide range of pressure and temperature conditions.
arXiv Detail & Related papers (2021-10-18T14:43:50Z)
- Model-agnostic multi-objective approach for the evolutionary discovery of mathematical models [55.41644538483948]
In modern data science, it is often more important to understand the properties of a model and which of its parts could be replaced to obtain better results.
We use multi-objective evolutionary optimization for composite data-driven model learning to obtain the algorithm's desired properties.
arXiv Detail & Related papers (2021-07-07T11:17:09Z)
- Leveraging Global Parameters for Flow-based Neural Posterior Estimation [90.21090932619695]
Inferring the parameters of a model based on experimental observations is central to the scientific method.
A particularly challenging setting is when the model is strongly indeterminate, i.e., when distinct sets of parameters yield identical observations.
We present a method for cracking such indeterminacy by exploiting additional information conveyed by an auxiliary set of observations sharing global parameters.
arXiv Detail & Related papers (2021-02-12T12:23:13Z)
- Distilling Interpretable Models into Human-Readable Code [71.11328360614479]
Human-readability is an important and desirable standard for machine-learned model interpretability.
We propose to train interpretable models using conventional methods, and then distill them into concise, human-readable code.
We describe a piecewise-linear curve-fitting algorithm that produces high-quality results efficiently and reliably across a broad range of use cases.
arXiv Detail & Related papers (2021-01-21T01:46:36Z)
- Gaussian Function On Response Surface Estimation [12.35564140065216]
We propose a new framework for interpreting (features and samples) black-box machine learning models via a metamodeling technique.
The metamodel can be estimated from data generated via a trained complex model by running the computer experiment on samples of data in the region of interest.
arXiv Detail & Related papers (2021-01-04T04:47:00Z)
- Improving the Reconstruction of Disentangled Representation Learners via Multi-Stage Modeling [55.28436972267793]
Current autoencoder-based disentangled representation learning methods achieve disentanglement by penalizing the (aggregate) posterior to encourage statistical independence of the latent factors.
We present a novel multi-stage modeling approach where the disentangled factors are first learned using a penalty-based disentangled representation learning method.
Then, the low-quality reconstruction is improved with another deep generative model that is trained to model the missing correlated latent variables.
arXiv Detail & Related papers (2020-10-25T18:51:15Z)
- Goal-directed Generation of Discrete Structures with Conditional Generative Models [85.51463588099556]
We introduce a novel approach to directly optimize a reinforcement learning objective, maximizing an expected reward.
We test our methodology on two tasks: generating molecules with user-defined properties and identifying short Python expressions which evaluate to a given target value.
arXiv Detail & Related papers (2020-10-05T20:03:13Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.