Learning to Generate Lumped Hydrological Models
- URL: http://arxiv.org/abs/2309.09904v2
- Date: Wed, 22 Nov 2023 08:33:23 GMT
- Title: Learning to Generate Lumped Hydrological Models
- Authors: Yang Yang and Ting Fong May Chui
- Abstract summary: In this study, a generative model was learned from data from over 3,000 catchments worldwide.
The model was then used to derive optimal modeling functions for over 700 different catchments.
Overall, this study demonstrates that the hydrological behavior of a catchment can be effectively described using a small number of latent variables.
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: A lumped hydrological model structure can be considered a generative model
because, given a set of parameter values, it can generate a hydrological
modeling function that accurately predicts the behavior of a catchment under
external forcing. It is implicitly assumed that a small number of variables
(i.e., the model parameters) can sufficiently characterize variations in the
behavioral characteristics of different catchments. This study adopts this
assumption and uses a deep learning method to learn a generative model of
hydrological modeling functions directly from the forcing and runoff data of
multiple catchments. The learned generative model uses a small number of latent
variables to characterize a catchment's behavior, so that assigning values to
these latent variables produces a hydrological modeling function that resembles
a real-world catchment. The learned generative model can be used similarly to a
lumped model structure, i.e., the optimal hydrological modeling function of a
catchment can be derived by estimating optimal parameter values (or latent
variables) with a generic calibration algorithm. In this study, a generative
model was learned from data from over 3,000 catchments worldwide. The model was
then used to derive optimal modeling functions for over 700 different
catchments. The resulting modeling functions generally showed a quality that
was comparable to or better than 36 types of lumped model structures. Overall,
this study demonstrates that the hydrological behavior of a catchment can be
effectively described using a small number of latent variables, and that
well-fitting hydrological modeling functions can be reconstructed from these
variables.
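The calibration procedure described above can be illustrated with a minimal sketch. The `decoder` function below is a hypothetical stand-in for the learned deep generative model (here a toy linear-reservoir function, not the network from the paper), and random search stands in for the "generic calibration algorithm"; the point is only to show how latent variables are estimated by minimizing the mismatch between simulated and observed runoff, exactly as parameters would be for a lumped model structure.

```python
import numpy as np

rng = np.random.default_rng(0)

def decoder(z):
    """Hypothetical stand-in for the learned generative model: maps a
    latent vector z to a hydrological modeling function (precip -> runoff).
    Here a toy linear reservoir; the paper uses a deep network trained on
    data from over 3,000 catchments."""
    k = np.exp(z[0])      # recession rate (positive via exp)
    c = np.abs(z[1])      # runoff coefficient (non-negative)
    def simulate(precip):
        q, storage = [], 0.0
        for p in precip:
            storage += c * p
            out = storage * (1.0 - np.exp(-k))
            storage -= out
            q.append(out)
        return np.array(q)
    return simulate

# Synthetic forcing and "observed" runoff for one catchment
precip = rng.gamma(2.0, 1.5, size=200)
q_obs = decoder(np.array([-1.0, 0.6]))(precip)

# Generic calibration: search the latent space for the vector whose
# generated modeling function best reproduces the observed runoff (MSE).
best_z, best_err = np.zeros(2), np.inf
for _ in range(2000):
    z = rng.normal(size=2)
    err = np.mean((decoder(z)(precip) - q_obs) ** 2)
    if err < best_err:
        best_z, best_err = z, err
```

In practice any gradient-based or evolutionary optimizer used for lumped-model calibration could replace the random search, since the learned generative model is used the same way as a conventional model structure.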
Related papers
- Consistent World Models via Foresight Diffusion [56.45012929930605]
We argue that a key bottleneck in learning consistent diffusion-based world models lies in suboptimal predictive ability.
We propose Foresight Diffusion (ForeDiff), a diffusion-based world modeling framework that enhances consistency by decoupling condition understanding from target denoising.
arXiv Detail & Related papers (2025-05-22T10:01:59Z)
- Guiding Time-Varying Generative Models with Natural Gradients on Exponential Family Manifold [5.000311680307273]
We show that the evolution of time-varying generative models can be projected onto an exponential family manifold.
We then train the generative model by moving its projection on the manifold according to the natural gradient descent scheme.
We propose particle versions of the algorithm, which feature closed-form update rules for any parametric model within the exponential family.
arXiv Detail & Related papers (2025-02-11T15:39:47Z)
- On conditional diffusion models for PDE simulations [53.01911265639582]
We study score-based diffusion models for forecasting and assimilation of sparse observations.
We propose an autoregressive sampling approach that significantly improves performance in forecasting.
We also propose a new training strategy for conditional score-based models that achieves stable performance over a range of history lengths.
arXiv Detail & Related papers (2024-10-21T18:31:04Z)
- Latent diffusion models for parameterization and data assimilation of facies-based geomodels [0.0]
Diffusion models are trained to generate new geological realizations from input fields characterized by random noise.
Latent diffusion models are shown to provide realizations that are visually consistent with samples from geomodeling software.
arXiv Detail & Related papers (2024-06-21T01:32:03Z)
- Modeling Randomly Observed Spatiotemporal Dynamical Systems [7.381752536547389]
Currently available neural network-based modeling approaches fall short when faced with data collected randomly over time and space.
In response, we developed a new method that effectively handles such randomly sampled data.
Our model integrates techniques from amortized variational inference, neural differential equations, neural point processes, and implicit neural representations to predict both the dynamics of the system and the timings and locations of future observations.
arXiv Detail & Related papers (2024-06-01T09:03:32Z)
- Efficient modeling of sub-kilometer surface wind with Gaussian processes and neural networks [0.0]
Wind represents a particularly challenging variable to model due to its high spatial and temporal variability.
This paper presents a novel approach that integrates Gaussian processes and neural networks to model surface wind gusts at sub-kilometer resolution.
arXiv Detail & Related papers (2024-05-21T09:07:47Z)
- Synthetic location trajectory generation using categorical diffusion models [50.809683239937584]
Diffusion models (DPMs) have rapidly evolved to be one of the predominant generative models for the simulation of synthetic data.
We propose using DPMs for the generation of synthetic individual location trajectories (ILTs) which are sequences of variables representing physical locations visited by individuals.
arXiv Detail & Related papers (2024-02-19T15:57:39Z)
- Towards Theoretical Understandings of Self-Consuming Generative Models [56.84592466204185]
This paper tackles the emerging challenge of training generative models within a self-consuming loop.
We construct a theoretical framework to rigorously evaluate how this training procedure impacts the data distributions learned by future models.
We present results for kernel density estimation, delivering nuanced insights such as the impact of mixed data training on error propagation.
arXiv Detail & Related papers (2024-02-19T02:08:09Z)
- Weather Prediction with Diffusion Guided by Realistic Forecast Processes [49.07556359513563]
We introduce a novel method that applies diffusion models (DM) for weather forecasting.
Our method can achieve both direct and iterative forecasting with the same modeling framework.
The flexibility and controllability of our model empowers a more trustworthy DL system for the general weather community.
arXiv Detail & Related papers (2024-02-06T21:28:42Z)
- Neural Likelihood Approximation for Integer Valued Time Series Data [0.0]
We construct a neural likelihood approximation that can be trained using unconditional simulation of the underlying model.
We demonstrate our method by performing inference on a number of ecological and epidemiological models.
arXiv Detail & Related papers (2023-10-19T07:51:39Z)
- Neural Superstatistics for Bayesian Estimation of Dynamic Cognitive Models [2.7391842773173334]
We develop a simulation-based deep learning method for Bayesian inference, which can recover both time-varying and time-invariant parameters.
Our results show that the deep learning approach is very efficient in capturing the temporal dynamics of the model.
arXiv Detail & Related papers (2022-11-23T17:42:53Z)
- A Spatial-temporal Graph Deep Learning Model for Urban Flood Nowcasting Leveraging Heterogeneous Community Features [1.2599533416395765]
The objective of this study is to develop and test a novel structured deep-learning modeling framework for urban flood nowcasting.
We present a new computational modeling framework including an attention-based spatial-temporal graph convolution network (ASTGCN) model.
Results indicate that the model provides superior performance for the nowcasting of urban flood inundation at the census tract level.
arXiv Detail & Related papers (2021-11-09T15:35:05Z)
- Leveraging Global Parameters for Flow-based Neural Posterior Estimation [90.21090932619695]
Inferring the parameters of a model based on experimental observations is central to the scientific method.
A particularly challenging setting is when the model is strongly indeterminate, i.e., when distinct sets of parameters yield identical observations.
We present a method for cracking such indeterminacy by exploiting additional information conveyed by an auxiliary set of observations sharing global parameters.
arXiv Detail & Related papers (2021-02-12T12:23:13Z)
- Generative Temporal Difference Learning for Infinite-Horizon Prediction [101.59882753763888]
We introduce the $\gamma$-model, a predictive model of environment dynamics with an infinite probabilistic horizon.
We discuss how its training reflects an inescapable tradeoff between training-time and testing-time compounding errors.
arXiv Detail & Related papers (2020-10-27T17:54:12Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.