Bayesian Experimental Design for Symbolic Discovery
- URL: http://arxiv.org/abs/2211.15860v1
- Date: Tue, 29 Nov 2022 01:25:29 GMT
- Title: Bayesian Experimental Design for Symbolic Discovery
- Authors: Kenneth L. Clarkson and Cristina Cornelio and Sanjeeb Dash and Joao Goncalves and Lior Horesh and Nimrod Megiddo
- Abstract summary: We apply constrained first-order methods to optimize an appropriate selection criterion, using Hamiltonian Monte Carlo to sample from the prior.
A step in computing the predictive distribution, which involves a convolution, is carried out either by numerical integration or by fast transform methods.
- Score: 12.855710007840479
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This study concerns the formulation and application of Bayesian optimal
experimental design to symbolic discovery, which is the inference from
observational data of predictive models taking general functional forms. We
apply constrained first-order methods to optimize an appropriate selection
criterion, using Hamiltonian Monte Carlo to sample from the prior. A step in
computing the predictive distribution, which involves a convolution, is carried
out either by numerical integration or by fast transform methods.
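The convolution step mentioned above can be illustrated with a minimal sketch (not the paper's code; the grid, toy densities, and variable names are illustrative assumptions): when observations take the form y = f(x) + noise, the predictive density of y is the convolution of the density of f(x) with the noise density, and on a uniform grid that convolution can be computed with an FFT instead of direct quadrature.

```python
import numpy as np

rng = np.random.default_rng(0)

# Uniform grid on which both densities are tabulated.
grid = np.linspace(-10.0, 10.0, 1024)
dy = grid[1] - grid[0]

# Density of model outputs f(x), approximated here by a histogram of samples
# (standing in for draws from the prior over functional forms).
samples = rng.normal(loc=1.0, scale=2.0, size=50_000)
p_f, _ = np.histogram(samples, bins=len(grid),
                      range=(grid[0], grid[-1]), density=True)

# Gaussian observation-noise density, centred at 0 on the same grid.
sigma = 0.5
noise = np.exp(-0.5 * (grid / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Fast-transform convolution: O(n log n) instead of O(n^2) quadrature.
# fftshift realigns the circular convolution, since `noise` is centred mid-grid.
p_y = np.fft.fftshift(np.fft.irfft(np.fft.rfft(p_f) * np.fft.rfft(noise))) * dy

# The result should integrate to ~1 (it is itself a density).
print(round(p_y.sum() * dy, 2))
```

The same quantity could be computed by direct numerical integration of the convolution integral at each grid point; the transform route simply trades that quadratic cost for two FFTs.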
Related papers
- Fusion of Gaussian Processes Predictions with Monte Carlo Sampling [61.31380086717422]
In science and engineering, we often work with models designed for accurate prediction of variables of interest.
Recognizing that these models are approximations of reality, it becomes desirable to apply multiple models to the same data and integrate their outcomes.
arXiv Detail & Related papers (2024-03-03T04:21:21Z) - Generalizing Backpropagation for Gradient-Based Interpretability [103.2998254573497]
We show that the gradient of a model is a special case of a more general formulation using semirings.
This observation allows us to generalize the backpropagation algorithm to efficiently compute other interpretable statistics.
arXiv Detail & Related papers (2023-07-06T15:19:53Z) - Scalable method for Bayesian experimental design without integrating over posterior distribution [0.0]
We address the computational efficiency in solving the A-optimal Bayesian design of experiments problems.
A-optimality is a widely used and easy-to-interpret criterion for Bayesian experimental design.
This study presents a novel likelihood-free approach to the A-optimal experimental design.
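A-optimality has a simple closed form in the linear Gaussian case, which makes the criterion easy to illustrate (a hedged sketch of the standard criterion, not the paper's likelihood-free method; the designs below are toy examples): for y = A @ theta + noise with noise variance sigma^2 and prior covariance Sigma0, the posterior covariance is (A^T A / sigma^2 + Sigma0^{-1})^{-1}, and A-optimality minimizes its trace.

```python
import numpy as np

def a_optimality(A, sigma2, Sigma0):
    """Trace of the posterior covariance for design matrix A (smaller is better)."""
    precision = A.T @ A / sigma2 + np.linalg.inv(Sigma0)
    return np.trace(np.linalg.inv(precision))

Sigma0 = np.eye(3)   # prior covariance of the 3 parameters
sigma2 = 0.1         # observation-noise variance

# Two hypothetical candidate designs with the same measurement budget:
# one spreads measurements across all parameters, one repeats a single probe.
design_spread = np.eye(3)
design_repeat = np.array([[1.0, 0.0, 0.0],
                          [1.0, 0.0, 0.0],
                          [1.0, 0.0, 0.0]])

print(a_optimality(design_spread, sigma2, Sigma0))  # informs every parameter
print(a_optimality(design_repeat, sigma2, Sigma0))  # leaves two at prior uncertainty
```

The spread design attains the lower trace, matching the intuition that A-optimality rewards reducing the summed posterior variance over all parameters.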
arXiv Detail & Related papers (2023-06-30T12:40:43Z) - Adaptive Sampling Quasi-Newton Methods for Zeroth-Order Stochastic Optimization [1.7513645771137178]
We consider unconstrained optimization problems with no available gradient information.
We propose an adaptive sampling quasi-Newton method where we estimate the gradients of a simulation function using finite differences within a common random number framework.
We develop modified versions of a norm test and an inner product quasi-Newton test to control the sample sizes used in the approximations and provide global convergence results to the neighborhood of the optimal solution.
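The common-random-number idea behind those finite-difference gradient estimates can be sketched in a few lines (illustrative names and toy objective, not the paper's method): evaluating the noisy simulation with the same random seed on both sides of a central difference cancels much of the simulation noise.

```python
import numpy as np

def simulate(x, seed):
    """Noisy evaluation of f(x) = ||x||^2 with additive simulation noise."""
    rng = np.random.default_rng(seed)
    return float(x @ x + rng.normal(scale=0.1))

def fd_gradient(x, seed, h=1e-3):
    """Central finite-difference gradient under common random numbers."""
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = h
        # Same seed on both sides: the additive noise term cancels exactly here.
        g[i] = (simulate(x + e, seed) - simulate(x - e, seed)) / (2 * h)
    return g

x = np.array([1.0, -2.0])
g = fd_gradient(x, seed=42)
print(g)  # close to the true gradient 2*x = [2, -4]
```

Without the shared seed, the noise difference would be amplified by the 1/(2h) factor, which is why common random numbers matter for small step sizes.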
arXiv Detail & Related papers (2021-09-24T21:49:25Z) - Stochastic Learning Approach to Binary Optimization for Optimal Design of Experiments [0.0]
We present a novel approach to binary optimization for optimal experimental design (OED) for Bayesian inverse problems governed by mathematical models such as partial differential equations.
The OED utility function, namely, the regularized optimality gradient, is cast into an objective function in the form of an expectation over a Bernoulli distribution.
The objective is then solved by using a probabilistic optimization routine to find an optimal observational policy.
arXiv Detail & Related papers (2021-01-15T03:54:12Z) - Pathwise Conditioning of Gaussian Processes [72.61885354624604]
Conventional approaches for simulating Gaussian process posteriors view samples as draws from marginal distributions of process values at finite sets of input locations.
This distribution-centric characterization leads to generative strategies that scale cubically in the size of the desired random vector.
We show how this pathwise interpretation of conditioning gives rise to a general family of approximations that lend themselves to efficiently sampling Gaussian process posteriors.
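The pathwise interpretation rests on Matheron's rule: a posterior sample is a prior sample corrected by the kernel-weighted residuals at the observations, so no new factorization is needed per query set. A minimal sketch (toy kernel and data, not the paper's implementation):

```python
import numpy as np

def rbf(a, b, ls=0.5):
    """Squared-exponential kernel between two 1-D input arrays."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ls) ** 2)

rng = np.random.default_rng(3)

X = np.array([-1.0, 0.0, 1.0])      # training inputs
y = np.sin(3 * X)                    # noiseless observations
Xs = np.linspace(-1, 1, 50)          # query points

# One joint prior draw over training and query locations.
Z = np.concatenate([X, Xs])
Kzz = rbf(Z, Z) + 1e-8 * np.eye(len(Z))
prior = np.linalg.cholesky(Kzz) @ rng.normal(size=len(Z))
f_X, f_Xs = prior[:len(X)], prior[len(X):]

# Matheron update: posterior draw = prior draw + kernel-weighted residual.
Kxx = rbf(X, X) + 1e-8 * np.eye(len(X))
posterior = f_Xs + rbf(Xs, X) @ np.linalg.solve(Kxx, y - f_X)

# With noiseless data, the posterior sample interpolates the observations.
print(abs(posterior[0] - y[0]))  # Xs[0] coincides with X[0]
```

The cubic-cost solve here involves only the training set; in the decoupled schemes the prior draw itself is also approximated cheaply (e.g. with random features), which is what makes sampling at many query points inexpensive.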
arXiv Detail & Related papers (2020-11-08T17:09:37Z) - Optimal Bayesian experimental design for subsurface flow problems [77.34726150561087]
We propose a novel approach for developing a polynomial chaos expansion (PCE) surrogate model for the design utility function.
This novel technique enables the derivation of a reasonable quality response surface for the targeted objective function with a computational budget comparable to several single-point evaluations.
arXiv Detail & Related papers (2020-08-10T09:42:59Z) - Unbiased MLMC stochastic gradient-based optimization of Bayesian experimental designs [4.112293524466434]
The gradient of the expected information gain with respect to experimental design parameters is given by a nested expectation.
We introduce an unbiased Monte Carlo estimator for the gradient of the expected information gain with finite expected squared $\ell$-norm and finite expected computational cost per sample.
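The nested-expectation structure is visible already in a plain nested Monte Carlo estimate of the expected information gain itself (a hedged sketch with a toy linear Gaussian model; it does not reproduce the paper's unbiased MLMC gradient estimator): the inner expectation is the model evidence p(y|d), which must be estimated for every outer sample.

```python
import numpy as np

def log_lik(y, theta, d, sigma):
    """Log density of y = d*theta + noise, noise ~ N(0, sigma^2)."""
    return -0.5 * ((y - d * theta) / sigma) ** 2 - np.log(sigma * np.sqrt(2 * np.pi))

def nested_mc_eig(d, sigma=0.5, n_outer=2000, n_inner=2000, seed=0):
    """EIG(d) = E_{theta,y}[ log p(y|theta,d) - log E_{theta'}[p(y|theta',d)] ]."""
    rng = np.random.default_rng(seed)
    theta = rng.normal(size=n_outer)                   # outer prior draws
    y = d * theta + sigma * rng.normal(size=n_outer)   # simulated observations
    inner = rng.normal(size=n_inner)                   # inner prior draws
    # Log of the inner expectation (evidence), via a stable log-sum-exp.
    ll = log_lik(y[:, None], inner[None, :], d, sigma)
    m = ll.max(axis=1, keepdims=True)
    log_evidence = m[:, 0] + np.log(np.exp(ll - m).mean(axis=1))
    return (log_lik(y, theta, d, sigma) - log_evidence).mean()

d = 1.0
exact = 0.5 * np.log(1 + d**2 / 0.5**2)  # closed form for this conjugate model
print(nested_mc_eig(d), exact)
```

The inner average introduces an O(1/n_inner) bias, and differentiating through it with respect to d compounds the problem; removing that bias at finite expected cost is precisely what the multilevel construction addresses.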
arXiv Detail & Related papers (2020-05-18T01:02:31Z) - Adversarial Monte Carlo Meta-Learning of Optimal Prediction Procedures [0.0]
We frame the meta-learning of prediction procedures as a search for an optimal strategy in a two-player game.
In this game, Nature selects a prior over parametric distributions that generate labeled data consisting of features and an associated outcome.
The Predictor's objective is to learn a function that maps from a new feature to an estimate of the associated outcome.
arXiv Detail & Related papers (2020-02-26T03:16:05Z) - Efficiently Sampling Functions from Gaussian Process Posteriors [76.94808614373609]
We propose an easy-to-use and general-purpose approach for fast posterior sampling.
We demonstrate how decoupled sample paths accurately represent Gaussian process posteriors at a fraction of the usual cost.
arXiv Detail & Related papers (2020-02-21T14:03:16Z) - Distributed Averaging Methods for Randomized Second Order Optimization [54.51566432934556]
We consider distributed optimization problems where forming the Hessian is computationally challenging and communication is a bottleneck.
We develop unbiased parameter averaging methods for randomized second order optimization that employ sampling and sketching of the Hessian.
We also extend the framework of second order averaging methods to introduce an unbiased distributed optimization framework for heterogeneous computing systems.
arXiv Detail & Related papers (2020-02-16T09:01:18Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.