Multi-gauge Hydrological Variational Data Assimilation: Regionalization
Learning with Spatial Gradients using Multilayer Perceptron and
Bayesian-Guided Multivariate Regression
- URL: http://arxiv.org/abs/2307.02497v1
- Date: Tue, 4 Jul 2023 08:27:52 GMT
- Title: Multi-gauge Hydrological Variational Data Assimilation: Regionalization
Learning with Spatial Gradients using Multilayer Perceptron and
Bayesian-Guided Multivariate Regression
- Authors: Ngo Nghi Truyen Huynh, Pierre-André Garambois, François
Colleoni, Benjamin Renard, Hélène Roux (IMFT)
- Abstract summary: This contribution presents a novel seamless regionalization technique for learning complex regional transfer functions designed for high-resolution hydrological models.
The approach involves incorporating the inferable regionalization mappings into a differentiable hydrological model and optimizing a cost function computed on multi-gauge data with accurate adjoint-based spatially distributed gradients.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Tackling the difficult problem of estimating spatially distributed
hydrological parameters, especially for floods on ungauged watercourses, this
contribution presents a novel seamless regionalization technique for learning
complex regional transfer functions designed for high-resolution hydrological
models. The transfer functions rely on: (i) a multilayer perceptron enabling a
seamless flow of gradient computation to employ machine learning optimization
algorithms, or (ii) a multivariate regression mapping optimized by variational
data assimilation algorithms and guided by Bayesian estimation, addressing the
equifinality issue of feasible solutions. The approach involves incorporating
the inferable regionalization mappings into a differentiable hydrological model
and optimizing a cost function computed on multi-gauge data with accurate
adjoint-based spatially distributed gradients.
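To make the core idea concrete, here is a minimal sketch (an illustration, not the authors' code) of how a regionalization MLP mapping per-cell physiographic descriptors to distributed parameters can be embedded in a toy differentiable rainfall-runoff model, with a multi-gauge cost differentiated end to end by JAX automatic differentiation standing in for the adjoint-based gradients of the paper; `regionalization_mlp`, `toy_hydro_model`, and all synthetic data are hypothetical.

```python
# Hypothetical sketch: MLP regionalization mapping + toy differentiable
# hydrological model + multi-gauge cost, differentiated end to end with JAX.
import jax
import jax.numpy as jnp

def init_mlp(key, sizes):
    """Random MLP weights; `sizes` = [n_descriptors, hidden..., n_parameters]."""
    keys = jax.random.split(key, len(sizes) - 1)
    return [(jax.random.normal(k, (m, n)) * 0.1, jnp.zeros(n))
            for k, m, n in zip(keys, sizes[:-1], sizes[1:])]

def regionalization_mlp(params, descriptors):
    """Map per-cell descriptors (n_cells, n_desc) to per-cell model parameters."""
    x = descriptors
    for w, b in params[:-1]:
        x = jnp.tanh(x @ w + b)
    w, b = params[-1]
    return jax.nn.sigmoid(x @ w + b)           # bounded conceptual parameters in (0, 1)

def toy_hydro_model(theta, rainfall):
    """Very simplified differentiable store model with one storage per cell."""
    capacity = 10.0 + 200.0 * theta[:, 0]      # storage capacity scale
    k_release = 0.05 + 0.5 * theta[:, 1]       # release coefficient

    def step(storage, p_t):
        storage = jnp.minimum(storage + p_t, capacity)
        q_t = k_release * storage
        return storage - q_t, q_t

    _, q = jax.lax.scan(step, jnp.zeros(theta.shape[0]), rainfall)
    return q                                    # (n_time, n_cells) cell discharge

def multi_gauge_cost(mlp_params, descriptors, rainfall, gauge_masks, q_obs):
    """Squared-error cost over several gauges, each aggregating a set of cells."""
    theta = regionalization_mlp(mlp_params, descriptors)
    q_cells = toy_hydro_model(theta, rainfall)
    q_sim = q_cells @ gauge_masks.T             # (n_time, n_gauges)
    return jnp.mean((q_sim - q_obs) ** 2)

# Synthetic setup: 50 cells, 3 descriptors, 2 parameters, 4 gauges, 100 time steps.
mlp_params = init_mlp(jax.random.PRNGKey(0), [3, 16, 2])
descriptors = jax.random.uniform(jax.random.PRNGKey(1), (50, 3))
rainfall = jax.random.uniform(jax.random.PRNGKey(2), (100, 50)) * 5.0
gauge_masks = (jax.random.uniform(jax.random.PRNGKey(3), (4, 50)) > 0.5).astype(jnp.float32)
q_obs = jax.random.uniform(jax.random.PRNGKey(4), (100, 4))

# Spatially distributed gradients flow through the model into the MLP weights,
# so any gradient-based machine learning optimizer can drive the regionalization.
loss, grads = jax.value_and_grad(multi_gauge_cost)(mlp_params, descriptors,
                                                   rainfall, gauge_masks, q_obs)
```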
Related papers
- Total Uncertainty Quantification in Inverse PDE Solutions Obtained with Reduced-Order Deep Learning Surrogate Models [50.90868087591973]
We propose an approximate Bayesian method for quantifying the total uncertainty in inverse PDE solutions obtained with machine learning surrogate models.
We test the proposed framework by comparing it with the iterative ensemble smoother and deep ensembling methods for a non-linear diffusion equation.
arXiv Detail & Related papers (2024-08-20T19:06:02Z) - Nonparametric Automatic Differentiation Variational Inference with
Spline Approximation [7.5620760132717795]
We develop a nonparametric approximation approach that enables flexible posterior approximation for distributions with complicated structures.
Compared with widely-used nonparametric inference methods, the proposed method is easy to implement and adaptive to various data structures.
Experiments demonstrate the efficiency of the proposed method in approximating complex posterior distributions and improving the performance of generative models with incomplete data.
arXiv Detail & Related papers (2024-03-10T20:22:06Z) - Distributed Markov Chain Monte Carlo Sampling based on the Alternating
Direction Method of Multipliers [143.6249073384419]
In this paper, we propose a distributed sampling scheme based on the alternating direction method of multipliers.
We provide both theoretical guarantees of our algorithm's convergence and experimental evidence of its superiority to the state-of-the-art.
In simulation, we deploy our algorithm on linear and logistic regression tasks and illustrate its fast convergence compared to existing gradient-based methods.
arXiv Detail & Related papers (2024-01-29T02:08:40Z) - Flow-based Distributionally Robust Optimization [23.232731771848883]
We present a framework, called FlowDRO, for solving flow-based distributionally robust optimization (DRO) problems with Wasserstein uncertainty sets.
We aim to find the continuous worst-case distribution (also called the Least Favorable Distribution, LFD) and sample from it.
We demonstrate its usage in adversarial learning, distributionally robust hypothesis testing, and a new mechanism for data-driven distribution perturbation differential privacy.
arXiv Detail & Related papers (2023-10-30T03:53:31Z) - Subsurface Characterization using Ensemble-based Approaches with Deep
Generative Models [2.184775414778289]
Inverse modeling is limited for ill-posed, high-dimensional applications due to computational costs and poor prediction accuracy with sparse datasets.
We combine a Wasserstein Generative Adversarial Network with Gradient Penalty (WGAN-GP) with an Ensemble Smoother with Multiple Data Assimilation (ES-MDA).
WGAN-GP is trained to generate high-dimensional K (hydraulic conductivity) fields from a low-dimensional latent space, and ES-MDA updates the latent variables by assimilating available measurements (a minimal sketch of this latent-space update is given after this list).
arXiv Detail & Related papers (2023-10-02T01:27:10Z) - Learning Regionalization using Accurate Spatial Cost Gradients within a Differentiable High-Resolution Hydrological Model: Application to the French Mediterranean Region [0.18139022013189662]
Estimating distributed hydrological parameters in ungauged catchments poses a challenging regionalization problem.
This paper introduces a Hybrid Data Assimilation and Parameter Regionalization (HDA-PR) approach incorporating learnable regionalization mappings.
Results highlight a strong regionalization performance of HDA-PR, especially in the most challenging upstream-to-downstream extrapolation scenario.
arXiv Detail & Related papers (2023-08-02T07:23:50Z) - Optimization of a Hydrodynamic Computational Reservoir through Evolution [58.720142291102135]
We interface with a model of a hydrodynamic system, under development by a startup, as a computational reservoir.
We optimized the readout times and how inputs are mapped to the wave amplitude or frequency using an evolutionary search algorithm.
Applying evolutionary methods to this reservoir system substantially improved separability on an XNOR task, in comparison to implementations with hand-selected parameters.
arXiv Detail & Related papers (2023-04-20T19:15:02Z) - VTAE: Variational Transformer Autoencoder with Manifolds Learning [144.0546653941249]
Deep generative models have demonstrated successful applications in learning non-linear data distributions through a number of latent variables.
The nonlinearity of the generator implies that the latent space shows an unsatisfactory projection of the data space, which results in poor representation learning.
We show that geodesics and accurate computation can substantially improve the performance of deep generative models.
arXiv Detail & Related papers (2023-04-03T13:13:19Z) - VI-DGP: A variational inference method with deep generative prior for
solving high-dimensional inverse problems [0.7734726150561089]
We propose a novel approximation method for estimating the high-dimensional posterior distribution.
This approach leverages a deep generative model to learn a prior model capable of generating spatially-varying parameters.
The proposed method can be fully implemented in an automatic differentiation manner.
arXiv Detail & Related papers (2023-02-22T06:48:10Z) - Harnessing Heterogeneity: Learning from Decomposed Feedback in Bayesian
Modeling [68.69431580852535]
We introduce a novel Gaussian process (GP) regression to incorporate the subgroup feedback.
Our modified regression has provably lower variance -- and thus a more accurate posterior -- compared to previous approaches.
We execute our algorithm on two disparate social problems.
arXiv Detail & Related papers (2021-07-07T03:57:22Z) - Spatially Adaptive Inference with Stochastic Feature Sampling and
Interpolation [72.40827239394565]
We propose to compute features only at sparsely sampled locations.
We then densely reconstruct the feature map with an efficient procedure.
The presented network is experimentally shown to save substantial computation while maintaining accuracy over a variety of computer vision tasks.
arXiv Detail & Related papers (2020-03-19T15:36:31Z)