Applications of ML-Based Surrogates in Bayesian Approaches to Inverse Problems
- URL: http://arxiv.org/abs/2310.12046v2
- Date: Mon, 23 Oct 2023 21:43:52 GMT
- Title: Applications of ML-Based Surrogates in Bayesian Approaches to Inverse Problems
- Authors: Pelin Ersin, Emma Hayes, Peter Matthews, Paramjyoti Mohapatra, Elisa Negrini and Karl Schulz
- Abstract summary: We consider the inverse problem of inferring the location of a wave source on a square domain, given a noisy solution to the 2-D acoustic wave equation.
Using a standard neural network as a surrogate model makes it computationally feasible to evaluate this likelihood many times.
We demonstrate that this method can accurately infer source locations from noisy data.
- Score: 0.41942958779358674
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Neural networks have become a powerful tool as surrogate models to provide
numerical solutions for scientific problems with increased computational
efficiency. This efficiency can be advantageous for numerically challenging
problems where time to solution is important or when evaluation of many similar
analysis scenarios is required. One particular area of scientific interest is
the setting of inverse problems, where one knows the forward dynamics of a
system are described by a partial differential equation and the task is to
infer properties of the system given (potentially noisy) observations of these
dynamics. We consider the inverse problem of inferring the location of a wave
source on a square domain, given a noisy solution to the 2-D acoustic wave
equation. Under the assumption of Gaussian noise, a likelihood function for
source location can be formulated, which requires one forward simulation of the
system per evaluation. Using a standard neural network as a surrogate model
makes it computationally feasible to evaluate this likelihood many times,
and so Markov Chain Monte Carlo methods can be used to sample from the posterior
distribution of the source location. We demonstrate that this method can
accurately infer source locations from noisy data.
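Under the Gaussian-noise assumption, the log-likelihood of a candidate source location is, up to a constant, the negative scaled squared residual between the observed field and the forward prediction. A minimal Metropolis-Hastings sketch of this pipeline follows; the analytic `surrogate_forward` below is a hypothetical stand-in for the trained network, and all parameter values are illustrative, not the paper's.

```python
import numpy as np

# Hypothetical surrogate: maps a source location (x, y) in the unit square to
# the predicted wave field sampled at 50 observation points. A trained neural
# network would replace this analytic placeholder.
def surrogate_forward(theta):
    x, y = theta
    grid = np.linspace(0.0, 1.0, 50)
    return np.sin(np.pi * x * grid) + np.cos(np.pi * y * grid)

def log_likelihood(theta, observed, sigma=0.1):
    # Gaussian noise: log p(d | theta) = -||d - G(theta)||^2 / (2 sigma^2) + const
    residual = observed - surrogate_forward(theta)
    return -0.5 * np.sum(residual**2) / sigma**2

def metropolis_hastings(observed, n_steps=5000, step=0.05, seed=0):
    rng = np.random.default_rng(seed)
    theta = np.array([0.5, 0.5])          # start at the domain centre
    logp = log_likelihood(theta, observed)
    samples = []
    for _ in range(n_steps):
        proposal = theta + step * rng.standard_normal(2)
        # uniform prior on the unit square: reject proposals outside it
        if np.all((proposal >= 0.0) & (proposal <= 1.0)):
            logp_new = log_likelihood(proposal, observed)
            if np.log(rng.uniform()) < logp_new - logp:
                theta, logp = proposal, logp_new
        samples.append(theta.copy())
    return np.array(samples)

# synthetic experiment: noisy observation of a source at (0.3, 0.7)
rng = np.random.default_rng(1)
true_theta = np.array([0.3, 0.7])
data = surrogate_forward(true_theta) + 0.1 * rng.standard_normal(50)
posterior = metropolis_hastings(data)
print(posterior[len(posterior) // 2:].mean(axis=0))  # posterior-mean estimate
```

Because the surrogate is cheap to evaluate, the thousands of likelihood calls inside the sampling loop are the inexpensive part; each would otherwise require a full PDE solve.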
Related papers
- Accelerated zero-order SGD under high-order smoothness and overparameterized regime [79.85163929026146]
We present a novel gradient-free algorithm to solve convex optimization problems.
Such problems are encountered in medicine, physics, and machine learning.
We provide convergence guarantees for the proposed algorithm under both types of noise.
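The gradient-free idea can be illustrated with a basic two-point zeroth-order estimator, which replaces the gradient with a finite difference along a random direction; this is a generic sketch of the technique, not the paper's accelerated scheme.

```python
import numpy as np

# Two-point zeroth-order gradient estimate: only function values are used,
# g ~= d * (f(x + h u) - f(x - h u)) / (2 h) * u, with u a random unit vector.
def zero_order_grad(f, x, h=1e-4, rng=None):
    if rng is None:
        rng = np.random.default_rng()
    u = rng.standard_normal(x.shape)
    u /= np.linalg.norm(u)
    return x.size * (f(x + h * u) - f(x - h * u)) / (2 * h) * u

def zo_sgd(f, x0, lr=0.1, n_steps=2000, seed=0):
    rng = np.random.default_rng(seed)
    x = x0.astype(float)
    for _ in range(n_steps):
        x -= lr * zero_order_grad(f, x, rng=rng)
    return x

# minimise a simple convex quadratic without ever calling its gradient
quadratic = lambda x: np.sum((x - 1.0) ** 2)
x_star = zo_sgd(quadratic, np.zeros(5))
print(x_star)  # approaches the minimiser (1, 1, 1, 1, 1)
```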
arXiv Detail & Related papers (2024-11-21T10:26:17Z)
- Conditional score-based diffusion models for solving inverse problems in mechanics [6.319616423658121]
We propose a framework to perform Bayesian inference using conditional score-based diffusion models.
Conditional score-based diffusion models are generative models that learn to approximate the score function of a conditional distribution.
We demonstrate the efficacy of the proposed approach on a suite of high-dimensional inverse problems in mechanics.
arXiv Detail & Related papers (2024-06-19T02:09:15Z)
- An information field theory approach to Bayesian state and parameter estimation in dynamical systems [0.0]
This paper develops a scalable Bayesian approach to state and parameter estimation suitable for continuous-time, deterministic dynamical systems.
We construct a physics-informed prior probability measure on the function space of system responses so that functions that satisfy the physics are more likely.
arXiv Detail & Related papers (2023-06-03T16:36:43Z)
- Solving High-Dimensional Inverse Problems with Auxiliary Uncertainty via Operator Learning with Limited Data [0.35880734696551125]
Identification of sources from observations of system state is vital for attribution and prediction.
Data challenges arise from high dimensionality of the state and source, limited ensembles of costly model simulations to train a surrogate model, and few and potentially noisy state observations for inversion.
We introduce a framework based on (1) calibrating deep neural network surrogates to the flow maps provided by an ensemble of simulations, and (2) using these surrogates in a Bayesian framework to identify sources from observations via optimization.
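Step (2) above can be illustrated with a deliberately simple stand-in: a linear surrogate with Gaussian likelihood and prior, for which minimising the negative log-posterior by gradient descent recovers the ridge-regression MAP estimate. All names and values below are hypothetical, not from the paper.

```python
import numpy as np

def surrogate(s, A):
    # linear stand-in for a trained neural-network flow-map surrogate
    return A @ s

def map_estimate(y, A, sigma=1.0, tau=1.0, lr=0.1, n_steps=500):
    # Gaussian likelihood + Gaussian prior: minimise
    # ||y - A s||^2 / (2 sigma^2) + ||s||^2 / (2 tau^2) by gradient descent
    s = np.zeros(A.shape[1])
    for _ in range(n_steps):
        grad = A.T @ (surrogate(s, A) - y) / sigma**2 + s / tau**2
        s -= lr * grad
    return s

rng = np.random.default_rng(0)
A = np.array([[1.0, 0.5], [0.2, 1.0], [0.3, 0.3]])
s_true = np.array([0.8, -0.4])
y = surrogate(s_true, A) + 0.05 * rng.standard_normal(3)
print(map_estimate(y, A))  # converges to the ridge solution (A^T A + I)^{-1} A^T y
```

With a differentiable (here linear) surrogate, the inversion reduces to a smooth optimisation; a deep-network surrogate would be dropped in for `surrogate` with the same outer loop.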
arXiv Detail & Related papers (2023-03-20T18:29:23Z)
- Score-based Diffusion Models in Function Space [140.792362459734]
Diffusion models have recently emerged as a powerful framework for generative modeling.
We introduce a mathematically rigorous framework called Denoising Diffusion Operators (DDOs) for training diffusion models in function space.
We show that the corresponding discretized algorithm generates accurate samples at a fixed cost independent of the data resolution.
arXiv Detail & Related papers (2023-02-14T23:50:53Z)
- Optimal Algorithms for the Inhomogeneous Spiked Wigner Model [89.1371983413931]
We derive an approximate message-passing algorithm (AMP) for the inhomogeneous problem.
We identify in particular the existence of a statistical-to-computational gap, where known algorithms require a signal-to-noise ratio larger than the information-theoretic threshold to perform better than random.
arXiv Detail & Related papers (2023-02-13T19:57:17Z)
- Inverting brain grey matter models with likelihood-free inference: a tool for trustable cytoarchitecture measurements [62.997667081978825]
Characterisation of the brain grey matter cytoarchitecture with quantitative sensitivity to soma density and volume remains an unsolved challenge in diffusion MRI (dMRI).
We propose a new forward model, specifically a new system of equations, requiring a few relatively sparse b-shells.
We then apply modern tools from Bayesian analysis known as likelihood-free inference (LFI) to invert our proposed model.
arXiv Detail & Related papers (2021-11-15T09:08:27Z)
- System identification using Bayesian neural networks with nonparametric noise models [0.0]
We propose a nonparametric approach for system identification in discrete time nonlinear random dynamical systems.
A Gibbs sampler for posterior inference is proposed and its effectiveness is illustrated in simulated and real time series.
arXiv Detail & Related papers (2021-04-25T09:49:50Z)
- Leveraging Global Parameters for Flow-based Neural Posterior Estimation [90.21090932619695]
Inferring the parameters of a model based on experimental observations is central to the scientific method.
A particularly challenging setting is when the model is strongly indeterminate, i.e., when distinct sets of parameters yield identical observations.
We present a method for cracking such indeterminacy by exploiting additional information conveyed by an auxiliary set of observations sharing global parameters.
arXiv Detail & Related papers (2021-02-12T12:23:13Z)
- Parameterizing uncertainty by deep invertible networks, an application to reservoir characterization [0.9176056742068814]
Uncertainty quantification for full-waveform inversion provides a probabilistic characterization of the ill-conditioning of the problem.
We propose an approach characterized by training a deep network that "pushes forward" Gaussian random inputs into the model space as if they were sampled from the actual posterior distribution.
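The push-forward idea can be shown in miniature with an invertible affine map in place of a deep network: fit T(z) = mu + L z so that standard Gaussian inputs acquire the empirical mean and covariance of a given sample set. This is a moment-matching sketch under simplifying assumptions, not the paper's training procedure.

```python
import numpy as np

# Fit an affine push-forward T(z) = mu + L z so that z ~ N(0, I) is mapped to
# samples matching the empirical mean and covariance of `samples`. The paper
# trains a deep invertible network instead of this closed-form affine map.
def fit_affine_pushforward(samples):
    mu = samples.mean(axis=0)
    L = np.linalg.cholesky(np.cov(samples.T))
    return lambda z: mu + z @ L.T

rng = np.random.default_rng(0)
# hypothetical "posterior" samples standing in for the actual posterior
target = rng.multivariate_normal([2.0, -1.0], [[1.0, 0.6], [0.6, 0.5]], size=20000)
push = fit_affine_pushforward(target)

z = rng.standard_normal((20000, 2))   # cheap Gaussian inputs
pushed = push(z)                      # pushed forward into the model space
print(pushed.mean(axis=0))            # close to the target mean
```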
arXiv Detail & Related papers (2020-04-16T18:37:56Z)
- Active Model Estimation in Markov Decision Processes [108.46146218973189]
We study the problem of efficient exploration in order to learn an accurate model of an environment, modeled as a Markov decision process (MDP).
We show that our Markov-based algorithm outperforms both our original algorithm and the maximum entropy algorithm in the small sample regime.
arXiv Detail & Related papers (2020-03-06T16:17:24Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.