Simultaneous identification of models and parameters of scientific simulators
- URL: http://arxiv.org/abs/2305.15174v3
- Date: Thu, 30 May 2024 14:15:22 GMT
- Title: Simultaneous identification of models and parameters of scientific simulators
- Authors: Cornelius Schröder, Jakob H. Macke
- Abstract summary: We develop a simulation-based inference framework to identify essential model components.
It can be applied to any compositional simulator without requiring likelihood evaluations.
It reveals non-identifiable model components and parameters.
- Score: 7.473394133229206
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: Many scientific models are composed of multiple discrete components, and scientists often make heuristic decisions about which components to include. Bayesian inference provides a mathematical framework for systematically selecting model components, but defining prior distributions over model components and developing associated inference schemes has been challenging. We approach this problem in a simulation-based inference framework: We define model priors over candidate components and, from model simulations, train neural networks to infer joint probability distributions over both model components and associated parameters. Our method, simulation-based model inference (SBMI), represents distributions over model components as a conditional mixture of multivariate binary distributions in the Grassmann formalism. SBMI can be applied to any compositional stochastic simulator without requiring likelihood evaluations. We evaluate SBMI on a simple time series model and on two scientific models from neuroscience, and show that it can discover multiple data-consistent model configurations, and that it reveals non-identifiable model components and parameters. SBMI provides a powerful tool for data-driven scientific inquiry which will allow scientists to identify essential model components and make uncertainty-informed modelling decisions.
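The pipeline the abstract describes, a prior over binary component masks, a parameter prior, and forward simulations used to train an amortized posterior, can be sketched in miniature. The toy additive simulator, its two candidate components, and all names below are hypothetical illustrations, not the paper's actual models; the neural-network training step is omitted, and only the generation of (model, parameter, data) tuples that would form the training set is shown:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical compositional simulator: a signal assembled from the
# components selected by a binary mask (trend, oscillation).
def simulate(mask, params, t):
    """Compose a signal from the components selected by the binary mask."""
    trend, osc, noise_amp = params
    y = np.zeros_like(t)
    if mask[0]:
        y += trend * t                      # linear trend component
    if mask[1]:
        y += np.sin(osc * t)                # oscillatory component
    y += noise_amp * rng.standard_normal(t.shape)  # observation noise
    return y

# Draw (model, parameters, data) tuples from the joint prior; in SBMI-style
# inference these tuples would train a neural posterior over both the
# component mask and the associated parameters.
def sample_training_set(n, t):
    masks, thetas, xs = [], [], []
    for _ in range(n):
        mask = rng.integers(0, 2, size=2)      # model prior: iid Bernoulli(0.5)
        theta = rng.uniform(0.1, 1.0, size=3)  # parameter prior
        masks.append(mask)
        thetas.append(theta)
        xs.append(simulate(mask, theta, t))
    return np.array(masks), np.array(thetas), np.array(xs)

t = np.linspace(0, 10, 50)
masks, thetas, xs = sample_training_set(1000, t)
print(masks.shape, thetas.shape, xs.shape)  # (1000, 2) (1000, 3) (1000, 50)
```

Because the training set pairs each simulated dataset with the mask that produced it, a network trained on these tuples can amortize inference over both model structure and parameters, which is what lets the method flag non-identifiable components.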
Related papers
- Quantifying and Attributing Submodel Uncertainty in Stochastic Simulation Models and Digital Twins [0.1234398109349733]
This paper investigates how submodel uncertainty affects the estimation of system performance metrics. We develop a framework for quantifying submodel uncertainty in simulation models and extend the framework to digital-twin settings.
arXiv Detail & Related papers (2026-02-18T00:06:39Z)
- Diffusion Models in Simulation-Based Inference: A Tutorial Review [9.572470603492077]
Diffusion models have emerged as powerful learners for simulation-based inference (SBI). In this tutorial review, we synthesize recent developments on diffusion models for SBI. We highlight opportunities created by various concepts such as guidance, score composition, flow matching, consistency models, and joint modeling.
arXiv Detail & Related papers (2025-12-22T15:10:35Z)
- Revisit Mixture Models for Multi-Agent Simulation: Experimental Study within a Unified Framework [19.558523263211942]
In multi-agent simulation, the primary challenges include behavioral multimodality and closed-loop distributional shifts.
In this study, we revisit mixture models for generating multimodal agent behaviors, which can cover the mainstream methods.
We introduce a closed-loop sample generation approach tailored for mixture models to mitigate distributional shifts.
arXiv Detail & Related papers (2025-01-28T15:26:25Z)
- Amortized Bayesian Mixture Models [1.3976439685325095]
This paper introduces a novel extension of Amortized Bayesian Inference (ABI) tailored to mixture models.
We factorize the posterior into a distribution of the parameters and a distribution of (categorical) mixture indicators, which allows us to use a combination of generative neural networks.
The proposed framework accommodates both independent and dependent mixture models, enabling filtering and smoothing.
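The factorization this entry describes, a categorical distribution over mixture indicators paired with a conditional distribution over parameters, supports simple ancestral sampling. In the sketch below, the two "networks" are stood in by hypothetical closures (the real method trains generative neural networks); only the factorized sampling scheme q(z|x) q(θ|z, x) is illustrated:

```python
import numpy as np

rng = np.random.default_rng(1)

# Placeholder for a trained classifier: maps an observation to
# categorical probabilities over mixture indicators.
def indicator_net(x):
    scores = np.array([x.mean(), -x.mean()])
    e = np.exp(scores - scores.max())       # numerically stable softmax
    return e / e.sum()

# Placeholder for a trained conditional network: maps (observation,
# indicator) to the mean and scale of a Gaussian over the parameters.
def parameter_net(x, z):
    mu = x.mean() + (1.0 if z == 0 else -1.0)
    return mu, 0.5

def sample_posterior(x, n):
    """Ancestral sampling from the factorized posterior q(z|x) q(theta|z, x)."""
    probs = indicator_net(x)
    zs = rng.choice(len(probs), size=n, p=probs)
    thetas = np.array([rng.normal(*parameter_net(x, z)) for z in zs])
    return zs, thetas

x = rng.standard_normal(20)
zs, thetas = sample_posterior(x, 500)
print(zs.shape, thetas.shape)  # (500,) (500,)
```

Sampling the indicator first and the parameters conditional on it is what lets the framework combine different generative networks for the discrete and continuous parts of the posterior.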
arXiv Detail & Related papers (2025-01-17T14:51:03Z)
- Synthetic location trajectory generation using categorical diffusion models [50.809683239937584]
Diffusion models (DPMs) have rapidly evolved to be one of the predominant generative models for the simulation of synthetic data.
We propose using DPMs for the generation of synthetic individual location trajectories (ILTs) which are sequences of variables representing physical locations visited by individuals.
arXiv Detail & Related papers (2024-02-19T15:57:39Z)
- On generative models as the basis for digital twins [0.0]
A framework is proposed for generative models as a basis for digital twins or mirrors of structures.
The proposal is based on the premise that deterministic models cannot account for the uncertainty present in most structural modelling applications.
arXiv Detail & Related papers (2022-03-08T20:34:56Z)
- Mixed Effects Neural ODE: A Variational Approximation for Analyzing the Dynamics of Panel Data [50.23363975709122]
We propose a probabilistic model called ME-NODE to incorporate (fixed + random) mixed effects for analyzing panel data.
We show that our model can be derived using smooth approximations of SDEs provided by the Wong-Zakai theorem.
We then derive Evidence Based Lower Bounds for ME-NODE, and develop (efficient) training algorithms.
arXiv Detail & Related papers (2022-02-18T22:41:51Z)
- Surrogate Modeling for Physical Systems with Preserved Properties and Adjustable Tradeoffs [0.0]
We present a model-based and a data-driven strategy to generate surrogate models.
The latter generates interpretable surrogate models by fitting artificial relations to a presupposed topological structure.
Our framework is compatible with various spatial discretization schemes for distributed parameter models.
arXiv Detail & Related papers (2022-02-02T17:07:02Z)
- Model-agnostic multi-objective approach for the evolutionary discovery of mathematical models [55.41644538483948]
In modern data science, it is often more valuable to understand a model's properties and to know which of its parts could be replaced to obtain better results.
We use multi-objective evolutionary optimization for composite data-driven model learning to obtain the algorithm's desired properties.
arXiv Detail & Related papers (2021-07-07T11:17:09Z)
- Closed-form Continuous-Depth Models [99.40335716948101]
Continuous-depth neural models rely on advanced numerical differential equation solvers.
We present a new family of models, termed Closed-form Continuous-depth (CfC) networks, that are simple to describe and at least one order of magnitude faster.
arXiv Detail & Related papers (2021-06-25T22:08:51Z)
- Robust Finite Mixture Regression for Heterogeneous Targets [70.19798470463378]
We propose a finite mixture regression (FMR) model that finds sample clusters and jointly models multiple incomplete mixed-type targets.
We provide non-asymptotic oracle performance bounds for our model under a high-dimensional learning framework.
The results show that our model can achieve state-of-the-art performance.
arXiv Detail & Related papers (2020-10-12T03:27:07Z)
- Amortized Bayesian model comparison with evidential deep learning [0.12314765641075436]
We propose a novel method for performing Bayesian model comparison using specialized deep learning architectures.
Our method is purely simulation-based and circumvents the step of explicitly fitting all alternative models under consideration to each observed dataset.
We show that our method achieves excellent results in terms of accuracy, calibration, and efficiency across the examples considered in this work.
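A common evidential formulation (one plausible reading of this entry, not necessarily the paper's exact parameterization) has a network output Dirichlet concentration parameters over the K candidate models, from which posterior model probabilities and an uncertainty measure follow in closed form, with no per-dataset refitting of the models themselves. The network itself is stood in here by fixed concentrations:

```python
import numpy as np

# Sketch of evidential model comparison: given Dirichlet concentrations
# alpha over K candidate models (in practice emitted by a trained network),
# compute expected model probabilities and a simple evidence-based
# uncertainty measure in closed form.
def model_posterior(alpha):
    alpha = np.asarray(alpha, dtype=float)
    total = alpha.sum()
    probs = alpha / total             # expected model probabilities
    uncertainty = len(alpha) / total  # high when total evidence is low
    return probs, uncertainty

probs, u = model_posterior([8.0, 1.0, 1.0])  # strong evidence for model 1
print(np.round(probs, 2), round(u, 2))       # [0.8 0.1 0.1] 0.3
```

Because the posterior over models is available in closed form from the network's single forward pass, comparison is amortized across datasets, which is the efficiency gain the entry highlights.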
arXiv Detail & Related papers (2020-04-22T15:15:46Z)
- Hybrid modeling: Applications in real-time diagnosis [64.5040763067757]
We outline a novel hybrid modeling approach that combines machine learning inspired models and physics-based models.
We use such models for real-time diagnosis applications.
arXiv Detail & Related papers (2020-03-04T00:44:57Z)
- Struct-MMSB: Mixed Membership Stochastic Blockmodels with Interpretable Structured Priors [13.712395104755783]
The mixed membership stochastic blockmodel (MMSB) is a popular framework for community detection and network generation.
We present a flexible MMSB model, Struct-MMSB, that uses a recently developed statistical relational learning model, hinge-loss Markov random fields (HL-MRFs).
Our model is capable of learning latent characteristics in real-world networks via meaningful latent variables encoded as a complex combination of observed features and membership distributions.
arXiv Detail & Related papers (2020-02-21T19:32:32Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.