ProbNum: Probabilistic Numerics in Python
- URL: http://arxiv.org/abs/2112.02100v1
- Date: Fri, 3 Dec 2021 07:20:50 GMT
- Title: ProbNum: Probabilistic Numerics in Python
- Authors: Jonathan Wenger, Nicholas Krämer, Marvin Pförtner, Jonathan
Schmidt, Nathanael Bosch, Nina Effenberger, Johannes Zenn, Alexandra Gessner,
Toni Karvonen, François-Xavier Briol, Maren Mahsereci, Philipp Hennig
- Abstract summary: Probabilistic numerical methods (PNMs) solve numerical problems via probabilistic inference.
We present ProbNum: a Python library providing state-of-the-art PNMs.
- Score: 62.52335490524408
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Probabilistic numerical methods (PNMs) solve numerical problems via
probabilistic inference. They have been developed for linear algebra,
optimization, integration and differential equation simulation. PNMs naturally
incorporate prior information about a problem and quantify uncertainty due to
finite computational resources as well as stochastic input. In this paper, we
present ProbNum: a Python library providing state-of-the-art probabilistic
numerical solvers. ProbNum enables custom composition of PNMs for specific
problem classes via a modular design as well as wrappers for off-the-shelf use.
Tutorials, documentation, developer guides and benchmarks are available online
at www.probnum.org.
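The inference view of a numerical task can be illustrated with a minimal NumPy sketch (this is an illustrative toy, not ProbNum's API): Bayesian quadrature for the integral of f over [0, 1] under a Brownian-motion prior, whose kernel integrals are available in closed form.

```python
import numpy as np

# Bayesian quadrature for I = integral of f over [0, 1] with a
# Brownian-motion prior f ~ GP(0, k), k(x, x') = min(x, x').
# Closed-form kernel integrals:
#   z(x) = integral of min(x, t) dt over [0, 1] = x - x**2 / 2
#   v0   = double integral of min(s, t)        = 1/3
def bayesian_quadrature(f, nodes):
    nodes = np.asarray(nodes, dtype=float)
    K = np.minimum.outer(nodes, nodes)   # prior covariance at the nodes
    z = nodes - nodes**2 / 2             # kernel mean embedding
    weights = np.linalg.solve(K, z)      # BQ weights K^{-1} z
    mean = weights @ f(nodes)            # posterior mean of the integral
    var = 1.0 / 3.0 - z @ weights        # posterior variance
    return mean, var

# f(x) = x lies in the prior's function space, so the estimate is exact (0.5),
# while the posterior variance quantifies remaining uncertainty about the path
mean, var = bayesian_quadrature(lambda x: x, [0.25, 0.5, 0.75, 1.0])
```

The posterior variance depends only on the node locations, which is what lets probabilistic solvers report uncertainty due to spending only finite computation.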
Related papers
- Bayesian Quantum State Tomography with Python's PyMC [0.0]
We show how to use Python 3's open-source PyMC probabilistic programming package to transform an otherwise complicated QST optimization problem into a simple form.
arXiv Detail & Related papers (2022-12-20T21:16:28Z) - Linear Algorithms for Nonparametric Multiclass Probability Estimation [0.0]
Weighted Support Vector Machines (wSVMs) have been developed to estimate class probabilities through ensemble learning.
We propose two new learning schemes, the baseline learning and the One-vs-All (OVA) learning, to further improve wSVMs in terms of computational efficiency and estimation accuracy.
arXiv Detail & Related papers (2022-05-25T03:15:22Z) - Test Set Sizing Via Random Matrix Theory [91.3755431537592]
This paper uses techniques from Random Matrix Theory to find the ideal training-testing data split for a simple linear regression.
It defines "ideal" as satisfying the integrity metric, i.e., the empirical model error equals the actual measurement noise.
This paper is the first to solve for the training and test size for any model in a way that is truly optimal.
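The "integrity metric" notion can be illustrated with a simple check (my sketch, not the paper's random-matrix derivation): for a well-specified linear model with ample data, the held-out error approaches the noise floor, so a split is "ideal" when the two match.

```python
import numpy as np

rng = np.random.default_rng(0)
n_train, n_test, d, sigma = 2000, 2000, 5, 0.5
w = rng.normal(size=d)

# linear data with known measurement noise sigma
X = rng.normal(size=(n_train + n_test, d))
y = X @ w + sigma * rng.normal(size=n_train + n_test)
X_tr, y_tr = X[:n_train], y[:n_train]
X_te, y_te = X[n_train:], y[n_train:]

# ordinary least squares fit on the training split
w_hat, *_ = np.linalg.lstsq(X_tr, y_tr, rcond=None)
test_mse = np.mean((X_te @ w_hat - y_te) ** 2)
# with ample data the empirical test error approaches sigma**2 = 0.25
```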
arXiv Detail & Related papers (2021-12-11T13:18:33Z) - Solving Probability and Statistics Problems by Program Synthesis [1.0937094979510211]
We solve university level probability and statistics questions by program synthesis using OpenAI's Codex.
Our work is the first to introduce a new dataset of university-level probability and statistics problems.
arXiv Detail & Related papers (2021-11-16T07:34:25Z) - Black Box Probabilistic Numerics [7.6034684297555]
This paper proposes to construct probabilistic numerical methods based only on the final output from a traditional method.
A convergent sequence of approximations to the quantity of interest constitutes a dataset.
This black box approach massively expands the range of tasks to which probabilistic numerics can be applied.
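One simple instantiation of this idea (my sketch, not the paper's method): treat successive outputs of a classical solver as data, exploit the known convergence rate to extrapolate, and use the spread of extrapolants as a crude error estimate.

```python
import math

def trapezoid(f, a, b, n):
    # composite trapezoid rule with n subintervals
    h = (b - a) / n
    return h * (0.5 * f(a) + sum(f(a + i * h) for i in range(1, n)) + 0.5 * f(b))

# dataset: a convergent sequence of approximations to the integral of e^x over [0, 1]
approx = [trapezoid(math.exp, 0.0, 1.0, n) for n in (4, 8, 16, 32)]

# trapezoid error shrinks like h^2, so Richardson extrapolation
# R = (4*I_{2n} - I_n) / 3 cancels the leading error term
richardson = [(4 * b - a) / 3 for a, b in zip(approx, approx[1:])]
estimate = richardson[-1]
uncertainty = abs(richardson[-1] - richardson[-2])  # crude error proxy
```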
arXiv Detail & Related papers (2021-06-15T11:21:10Z) - Probabilistic Gradient Boosting Machines for Large-Scale Probabilistic
Regression [51.770998056563094]
Probabilistic Gradient Boosting Machines (PGBM) is a method to create probabilistic predictions with a single ensemble of decision trees.
We empirically demonstrate the advantages of PGBM compared to existing state-of-the-art methods.
arXiv Detail & Related papers (2021-06-03T08:32:13Z) - Bayesian Quadrature on Riemannian Data Manifolds [79.71142807798284]
Riemannian manifolds provide a principled way to model nonlinear geometric structure inherent in data.
However, the geometric operations involved are typically computationally demanding.
In particular, we focus on Bayesian quadrature (BQ) to numerically compute integrals over normal laws.
We show that by leveraging both prior knowledge and an active exploration scheme, BQ significantly reduces the number of required evaluations.
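The "active exploration" idea can be sketched in its simplest form (an illustration, not the paper's manifold setting): for Bayesian quadrature on [0, 1] with a Brownian-motion prior, the posterior variance of the integral depends only on node locations, so nodes can be chosen greedily to shrink it before any function values are seen.

```python
import numpy as np

def posterior_variance(nodes):
    # BQ posterior variance of the integral under k(x, x') = min(x, x')
    nodes = np.asarray(nodes, dtype=float)
    K = np.minimum.outer(nodes, nodes)
    z = nodes - nodes**2 / 2
    return 1.0 / 3.0 - z @ np.linalg.solve(K, z)

candidates = np.linspace(0.05, 1.0, 20)
chosen, variances = [], []
for _ in range(5):
    # pick the candidate node that shrinks the posterior variance the most
    best = min((c for c in candidates if c not in chosen),
               key=lambda c: posterior_variance(chosen + [c]))
    chosen.append(best)
    variances.append(posterior_variance(chosen))
```

Each added evaluation strictly reduces the remaining uncertainty, which is the mechanism behind "fewer required evaluations".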
arXiv Detail & Related papers (2021-02-12T17:38:04Z) - Stochastic Saddle-Point Optimization for Wasserstein Barycenters [69.68068088508505]
We consider the population Wasserstein barycenter problem for random probability measures supported on a finite set of points and generated by an online stream of data.
We employ the structure of the problem and obtain a convex-concave saddle-point reformulation of this problem.
In the setting when the distribution of random probability measures is discrete, we propose an optimization algorithm and estimate its complexity.
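For intuition about the object being optimized (not the paper's saddle-point algorithm): in one dimension the 2-Wasserstein barycenter has a closed form, obtained by pointwise averaging of quantile functions.

```python
import numpy as np

def barycenter_1d(samples_list, levels=np.linspace(0.01, 0.99, 99)):
    # W2 barycenter of 1-D empirical measures = pointwise average of
    # their quantile functions (a fact specific to the real line)
    quantiles = [np.quantile(np.asarray(s, float), levels) for s in samples_list]
    return np.mean(quantiles, axis=0)

# barycenter of a point mass at 0 and a point mass at 2 is the point mass at 1
bary = barycenter_1d([[0.0, 0.0, 0.0], [2.0, 2.0, 2.0]])
```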
arXiv Detail & Related papers (2020-06-11T19:40:38Z) - Marginal likelihood computation for model selection and hypothesis
testing: an extensive review [66.37504201165159]
This article provides a comprehensive study of the state-of-the-art of the topic.
We highlight limitations, benefits, connections and differences among the different techniques.
Problems and possible solutions with the use of improper priors are also described.
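A minimal example of the quantity under review, using the naive prior-sampling estimator (real problems need the more robust techniques such reviews compare): for x | mu ~ N(mu, 1) with prior mu ~ N(0, 1), the marginal likelihood is available in closed form, p(x) = N(x; 0, 2).

```python
import numpy as np

rng = np.random.default_rng(1)
x = 1.0

# closed form: p(x) = N(x; 0, 2)
exact = np.exp(-x**2 / 4) / np.sqrt(4 * np.pi)

# naive Monte Carlo: p(x) is the prior expectation of the likelihood,
# estimated by averaging the likelihood over draws from the prior
mu = rng.normal(size=200_000)
lik = np.exp(-((x - mu) ** 2) / 2) / np.sqrt(2 * np.pi)
mc = lik.mean()
```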
arXiv Detail & Related papers (2020-05-17T18:31:58Z) - Sum-product networks: A survey [0.0]
A sum-product network (SPN) is a probabilistic model, based on a rooted acyclic directed graph.
This paper offers a survey of SPNs, including their definition, the main algorithms for inference and learning from data, the main applications, a brief review of software libraries, and a comparison with related models.
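The definition can be made concrete with a toy SPN (a hypothetical structure for illustration, not taken from the survey): one sum node over two product nodes, each a factorized distribution over two binary variables. Normalized sum weights and disjoint product scopes guarantee a valid joint distribution.

```python
# leaf: a Bernoulli distribution over one binary variable
def bern(p):
    return lambda x: p if x == 1 else 1.0 - p

# product nodes: factorized distributions over (x1, x2) with disjoint scopes
prod1 = lambda x1, x2: bern(0.8)(x1) * bern(0.3)(x2)
prod2 = lambda x1, x2: bern(0.1)(x1) * bern(0.9)(x2)

# root sum node: mixture of the product nodes with normalized weights
w1, w2 = 0.6, 0.4
spn = lambda x1, x2: w1 * prod1(x1, x2) + w2 * prod2(x1, x2)

# exact inference by enumeration: the joint sums to 1
total = sum(spn(a, b) for a in (0, 1) for b in (0, 1))
```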
arXiv Detail & Related papers (2020-04-02T17:46:29Z)
This list is automatically generated from the titles and abstracts of the papers in this site.