Bayesian Learning of Coupled Biogeochemical-Physical Models
- URL: http://arxiv.org/abs/2211.06714v2
- Date: Sun, 4 Jun 2023 17:49:45 GMT
- Title: Bayesian Learning of Coupled Biogeochemical-Physical Models
- Authors: Abhinav Gupta and Pierre F. J. Lermusiaux
- Abstract summary: Predictive models for marine ecosystems are used for a variety of needs.
Due to sparse measurements and limited understanding of the myriad of ocean processes, there is significant uncertainty.
We develop a Bayesian model learning methodology that allows interpolation in the space of candidate models and discovery of new models.
- Score: 28.269731698116257
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Predictive dynamical models for marine ecosystems are used for a variety of
needs. Due to sparse measurements and limited understanding of the myriad of
ocean processes, there is however significant uncertainty. There is model
uncertainty in the parameter values, functional forms with diverse
parameterizations, level of complexity needed, and thus in the state fields. We
develop a Bayesian model learning methodology that allows interpolation in the
space of candidate models and discovery of new models from noisy, sparse, and
indirect observations, all while estimating state fields and parameter values,
as well as the joint PDFs of all learned quantities. We address the challenges
of high-dimensional and multidisciplinary dynamics governed by PDEs by using
state augmentation and the computationally efficient GMM-DO filter. Our
innovations include stochastic formulation and complexity parameters to unify
candidate models into a single general model as well as stochastic expansion
parameters within piecewise function approximations to generate dense candidate
model spaces. These innovations allow handling many compatible and embedded
candidate models, possibly none of which are accurate, and learning elusive
unknown functional forms. Our new methodology is generalizable, interpretable,
and extrapolates out of the space of models to discover new ones. We perform a
series of twin experiments based on flows past a ridge coupled with
three-to-five component ecosystem models, including flows with chaotic
advection. The probabilities of known, uncertain, and unknown model
formulations, and of state fields and parameters, are updated jointly using
Bayes' law. Non-Gaussian statistics, ambiguity, and biases are captured. The
parameter values and model formulations that best explain the data are
identified. When observations are sufficiently informative, model complexity
and functions are discovered.
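The core idea above, unifying candidate model formulations through stochastic complexity parameters and updating them jointly with physical parameters via Bayes' law, can be sketched with a minimal Monte Carlo toy example. This is an illustrative sketch only, not the paper's GMM-DO implementation; the uptake functions, the interpolation parameter `alpha`, and the half-saturation constant `K` are all hypothetical names chosen for this example.

```python
import math
import random

# Two hypothetical candidate functional forms for nutrient uptake.
def f_mm(n, k):
    # Michaelis-Menten uptake (one candidate model)
    return n / (k + n)

def f_lin(n, k):
    # Linear uptake (another candidate model)
    return n / k

def unified(n, k, alpha):
    # A stochastic complexity parameter alpha in [0, 1] embeds both
    # candidates in a single general model, as in the paper's idea of
    # interpolating in the space of candidate models.
    return alpha * f_mm(n, k) + (1.0 - alpha) * f_lin(n, k)

random.seed(0)
N_ENS, SIGMA, NUTRIENT = 5000, 0.05, 1.0

# Prior ensemble over the augmented state (alpha, K): model-form
# uncertainty and parameter uncertainty are carried jointly.
ensemble = [(random.random(), random.uniform(0.1, 2.0)) for _ in range(N_ENS)]

# Synthetic noisy observation from a "true" Michaelis-Menten model (K = 0.5).
y_obs = f_mm(NUTRIENT, 0.5) + 0.01

# Bayes' law: weight each member by its Gaussian observation likelihood,
# updating model-formulation and parameter probabilities together.
weights = [math.exp(-0.5 * ((unified(NUTRIENT, k, a) - y_obs) / SIGMA) ** 2)
           for a, k in ensemble]
total = sum(weights)
post_alpha = sum(w * a for w, (a, _) in zip(weights, ensemble)) / total
post_k = sum(w * k for w, (_, k) in zip(weights, ensemble)) / total
print(f"posterior mean alpha = {post_alpha:.2f}, K = {post_k:.2f}")
```

Note that a single scalar observation may leave `alpha` only weakly identified, since a linear model with a suitable `K` can fit the same data point: a toy version of the ambiguity between compatible candidate models that the abstract says the methodology captures.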
Related papers
- SMILE: Zero-Shot Sparse Mixture of Low-Rank Experts Construction From Pre-Trained Foundation Models [85.67096251281191]
We present an innovative approach to model fusion called zero-shot Sparse MIxture of Low-rank Experts (SMILE) construction.
SMILE allows for the upscaling of source models into an MoE model without extra data or further training.
We conduct extensive experiments across diverse scenarios, such as image classification and text generation tasks, using full fine-tuning and LoRA fine-tuning.
arXiv Detail & Related papers (2024-08-19T17:32:15Z)
- Latent diffusion models for parameterization and data assimilation of facies-based geomodels [0.0]
Diffusion models are trained to generate new geological realizations from input fields characterized by random noise.
Latent diffusion models are shown to provide realizations that are visually consistent with samples from geomodeling software.
arXiv Detail & Related papers (2024-06-21T01:32:03Z)
- Towards Learning Stochastic Population Models by Gradient Descent [0.0]
We show that simultaneous estimation of parameters and structure poses major challenges for optimization procedures.
We demonstrate accurate estimation of models but find that enforcing the inference of parsimonious, interpretable models drastically increases the difficulty.
arXiv Detail & Related papers (2024-04-10T14:38:58Z)
- Geometric Neural Diffusion Processes [55.891428654434634]
We extend the framework of diffusion models to incorporate a series of geometric priors in infinite-dimension modelling.
We show that with these conditions, the generative functional model admits the same symmetry.
arXiv Detail & Related papers (2023-07-11T16:51:38Z)
- Neural parameter calibration for large-scale multi-agent models [0.7734726150561089]
We present a method to retrieve accurate probability densities for parameters using neural networks and differential equations.
The two combined create a powerful tool that can quickly estimate densities on model parameters, even for very large systems.
arXiv Detail & Related papers (2022-09-27T17:36:26Z)
- On the Influence of Enforcing Model Identifiability on Learning dynamics of Gaussian Mixture Models [14.759688428864159]
We propose a technique for extracting submodels from singular models.
Our method enforces model identifiability during training.
We show how the method can be applied to more complex models like deep neural networks.
arXiv Detail & Related papers (2022-06-17T07:50:22Z)
- Inverting brain grey matter models with likelihood-free inference: a tool for trustable cytoarchitecture measurements [62.997667081978825]
Characterisation of the brain grey matter cytoarchitecture with quantitative sensitivity to soma density and volume remains an unsolved challenge in dMRI.
We propose a new forward model, specifically a new system of equations, requiring a few relatively sparse b-shells.
We then apply modern tools from Bayesian analysis known as likelihood-free inference (LFI) to invert our proposed model.
arXiv Detail & Related papers (2021-11-15T09:08:27Z)
- Closed-form Continuous-Depth Models [99.40335716948101]
Continuous-depth neural models rely on advanced numerical differential equation solvers.
We present a new family of models, termed Closed-form Continuous-depth (CfC) networks, that are simple to describe and at least one order of magnitude faster.
arXiv Detail & Related papers (2021-06-25T22:08:51Z)
- Post-mortem on a deep learning contest: a Simpson's paradox and the complementary roles of scale metrics versus shape metrics [61.49826776409194]
We analyze a corpus of models made publicly-available for a contest to predict the generalization accuracy of neural network (NN) models.
We identify what amounts to a Simpson's paradox: "scale" metrics perform well overall but poorly on sub-partitions of the data.
We present two novel shape metrics, one data-independent, and the other data-dependent, which can predict trends in the test accuracy of a series of NNs.
arXiv Detail & Related papers (2021-06-01T19:19:49Z)
- Integrating Domain Knowledge in Data-driven Earth Observation with Process Convolutions [13.13700072257046]
We argue that hybrid learning schemes that combine both approaches can address all these issues efficiently.
We specifically propose the use of a class of GP convolution models called latent force models (LFMs) for time series modelling.
We consider time series of soil moisture from active (ASCAT) and passive (SMOS, AMSR2) microwave satellites.
arXiv Detail & Related papers (2021-04-16T14:30:40Z)
- Leveraging Global Parameters for Flow-based Neural Posterior Estimation [90.21090932619695]
Inferring the parameters of a model based on experimental observations is central to the scientific method.
A particularly challenging setting is when the model is strongly indeterminate, i.e., when distinct sets of parameters yield identical observations.
We present a method for cracking such indeterminacy by exploiting additional information conveyed by an auxiliary set of observations sharing global parameters.
arXiv Detail & Related papers (2021-02-12T12:23:13Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.