Weighted Programming
- URL: http://arxiv.org/abs/2202.07577v1
- Date: Tue, 15 Feb 2022 17:06:43 GMT
- Title: Weighted Programming
- Authors: Kevin Batz, Adrian Gallus, Benjamin Lucien Kaminski, Joost-Pieter
Katoen, Tobias Winkler
- Abstract summary: We study weighted programming, a programming paradigm for specifying mathematical models.
We argue that weighted programming as a paradigm can be used to specify mathematical models beyond probability distributions.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We study weighted programming, a programming paradigm for specifying
mathematical models. More specifically, the weighted programs we investigate
are like usual imperative programs with two additional features: (1)
nondeterministic branching and (2) weighting execution traces. Weights can be
numbers but also other objects like words from an alphabet, polynomials, formal
power series, or cardinal numbers. We argue that weighted programming as a
paradigm can be used to specify mathematical models beyond probability
distributions (as is done in probabilistic programming).
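To make the two extra features concrete, here is a minimal, hypothetical sketch (not taken from the paper): a tiny statement language with a weigh statement and nondeterministic branching, interpreted by enumerating execution traces and accumulating weights in a user-supplied semiring. The encoding, the Semiring class, and all names below are invented for illustration; swapping the semiring (counting vs. tropical) changes which mathematical model the same program describes.

```python
from dataclasses import dataclass
from functools import reduce
from typing import Any, Callable

@dataclass
class Semiring:
    plus: Callable[[Any, Any], Any]   # combines the weights of alternative traces
    times: Callable[[Any, Any], Any]  # accumulates weight along a single trace
    one: Any                          # weight of the empty trace

# (+, *): weights behave like counts or unnormalized probabilities.
COUNTING = Semiring(plus=lambda a, b: a + b, times=lambda a, b: a * b, one=1)
# (min, +): combining all traces yields the cost of the cheapest one,
# so the same program reads as an optimization problem.
TROPICAL = Semiring(plus=min, times=lambda a, b: a + b, one=0)

def traces(stmt, state, sr):
    """Enumerate the execution traces of `stmt` as (final_state, weight) pairs.

    Statements are nested tuples:
      ("skip",)          no effect
      ("weigh", w)       scale the current trace's weight by w
      ("assign", f)      deterministic state update, f: state -> state
      ("choice", p, q)   nondeterministic branching: take p *or* q
      ("seq", p, q)      sequential composition
    """
    kind = stmt[0]
    if kind == "skip":
        return [(state, sr.one)]
    if kind == "weigh":
        return [(state, stmt[1])]
    if kind == "assign":
        return [(stmt[1](state), sr.one)]
    if kind == "choice":
        return traces(stmt[1], state, sr) + traces(stmt[2], state, sr)
    if kind == "seq":
        return [(end, sr.times(w1, w2))
                for mid, w1 in traces(stmt[1], state, sr)
                for end, w2 in traces(stmt[2], mid, sr)]
    raise ValueError(f"unknown statement: {stmt!r}")

# Two weighted alternatives reached by nondeterministic branching.
prog = ("choice",
        ("seq", ("weigh", 3), ("assign", lambda s: s + 1)),
        ("seq", ("weigh", 5), ("assign", lambda s: s + 2)))

print(traces(prog, 0, COUNTING))  # [(1, 3), (2, 5)]
# Folding all trace weights with `plus` of the tropical semiring gives the
# cheapest trace: min(3 + 0, 5 + 0) = 3.
print(reduce(TROPICAL.plus, [w for _, w in traces(prog, 0, TROPICAL)]))
```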
We develop weakest-precondition- and weakest-liberal-precondition-style
calculi à la Dijkstra for reasoning about mathematical models specified by
weighted programs. We present several case studies. For instance, we use
weighted programming to model the ski rental problem - an optimization problem.
We model not only the optimization problem itself, but also the best
deterministic online algorithm for solving this problem as weighted programs.
By means of weakest-precondition-style reasoning, we can determine the
competitive ratio of the online algorithm at the source code level.
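As a back-of-the-envelope companion to the ski rental case study, the sketch below checks the well-known result that the best deterministic online strategy (rent for B - 1 days at cost 1 per day, then buy at price B) is (2 - 1/B)-competitive. This is plain enumeration, not the paper's weakest-precondition-style derivation on source code; the function names are invented.

```python
# Ski rental: renting costs 1 per day, buying costs B; the season length n is
# unknown to the online algorithm. Brute-force check of the classic result
# that the "rent for B - 1 days, then buy" strategy is (2 - 1/B)-competitive.

def online_cost(n, B):
    """Cost of the deterministic online strategy for a season of n days."""
    return n if n < B else (B - 1) + B          # rent B-1 days, then buy once

def offline_cost(n, B):
    """Cost of the optimal offline solution that knows n in advance."""
    return min(n, B)                            # either rent every day or buy

B = 10
ratio = max(online_cost(n, B) / offline_cost(n, B) for n in range(1, 10_000))
print(ratio, 2 - 1 / B)                         # both print 1.9
```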
Related papers
- Sum-of-Squares inspired Quantum Metaheuristic for Polynomial Optimization with the Hadamard Test and Approximate Amplitude Constraints [76.53316706600717]
The recently proposed quantum algorithm arXiv:2206.14999 is based on semidefinite programming (SDP).
We generalize the SDP-inspired quantum algorithm to sum-of-squares.
Our results show that our algorithm is suitable for large problems and approximates the best known classical results.
arXiv Detail & Related papers (2024-08-14T19:04:13Z) - Statistical investigations into the geometry and homology of random programs [0.2302001830524133]
We show how relations between random Python programs generated by ChatGPT can be described geometrically and topologically.
We compare two publicly available models, ChatGPT-4 and TinyLlama, on a simple problem related to image processing.
We speculate that our approach may in the future give new insights into the structure of programming languages.
arXiv Detail & Related papers (2024-07-05T20:25:02Z) - The Elements of Differentiable Programming [14.197724178748176]
Differentiable programming enables end-to-end differentiation of complex computer programs.
Differentiable programming builds upon several areas of computer science and applied mathematics.
arXiv Detail & Related papers (2024-03-21T17:55:16Z) - MathScale: Scaling Instruction Tuning for Mathematical Reasoning [70.89605383298331]
Large language models (LLMs) have demonstrated remarkable capabilities in problem-solving.
However, their proficiency in solving mathematical problems remains inadequate.
We propose MathScale, a simple and scalable method to create high-quality mathematical reasoning data.
arXiv Detail & Related papers (2024-03-05T11:42:59Z) - Computability of Optimizers [71.84486326350338]
We will show that in various situations the optimizer is unattainable on Turing machines and consequently on digital computers.
We prove such results for a variety of well-known problems from very different areas, including artificial intelligence, financial mathematics, and information theory.
arXiv Detail & Related papers (2023-01-15T17:41:41Z) - JiuZhang: A Chinese Pre-trained Language Model for Mathematical Problem
Understanding [74.12405417718054]
This paper aims to advance the mathematical intelligence of machines by presenting the first Chinese mathematical pre-trained language model (PLM).
Unlike other standard NLP tasks, mathematical texts are difficult to understand, since they involve mathematical terminology, symbols and formulas in the problem statement.
We design a novel curriculum pre-training approach for improving the learning of mathematical PLMs, consisting of both basic and advanced courses.
arXiv Detail & Related papers (2022-06-13T17:03:52Z) - Program Analysis of Probabilistic Programs [3.299672391663527]
This dissertation presents three novel techniques to improve probabilistic programming using program analysis.
The techniques analyse a probabilistic program and adapt it to make inference more efficient, sometimes in a way that would have been tedious or impossible to do by hand.
arXiv Detail & Related papers (2022-04-14T10:40:54Z) - Searching for More Efficient Dynamic Programs [61.79535031840558]
We describe a set of program transformations, a simple metric for assessing the efficiency of a transformed program, and a search procedure to improve this metric.
We show that in practice, automated search can find substantial improvements to the initial program.
arXiv Detail & Related papers (2021-09-14T20:52:55Z) - Discrete Math with Programming: A Principled Approach [0.0]
It has long been argued that discrete math is better taught with programming.
This paper introduces an approach that supports all central concepts of discrete math.
Math and logical statements can be expressed precisely at a high level and be executed on a computer.
arXiv Detail & Related papers (2020-11-28T03:41:27Z) - From Checking to Inference: Actual Causality Computations as
Optimization Problems [79.87179017975235]
We present a novel approach to formulate different notions of causal reasoning, over binary acyclic models, as optimization problems.
We show that both notions can be efficiently automated. Using models with more than $8000$ variables, checking is done in a matter of seconds, with MaxSAT outperforming ILP in many cases.
arXiv Detail & Related papers (2020-06-05T10:56:52Z) - Generating Random Logic Programs Using Constraint Programming [12.47276164048813]
We present a novel approach to generating random logic programs using constraint programming.
We show how the model scales with parameter values, and use the model to compare probabilistic inference algorithms across a range of synthetic problems.
arXiv Detail & Related papers (2020-06-02T19:12:53Z)
This list is automatically generated from the titles and abstracts of the papers on this site.