The Elements of Differentiable Programming
- URL: http://arxiv.org/abs/2403.14606v2
- Date: Wed, 24 Jul 2024 16:56:17 GMT
- Title: The Elements of Differentiable Programming
- Authors: Mathieu Blondel, Vincent Roulet
- Abstract summary: Differentiable programming enables end-to-end differentiation of complex computer programs.
Differentiable programming builds upon several areas of computer science and applied mathematics.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Artificial intelligence has recently experienced remarkable advances, fueled by large models, vast datasets, accelerated hardware, and, last but not least, the transformative power of differentiable programming. This new programming paradigm enables end-to-end differentiation of complex computer programs (including those with control flows and data structures), making gradient-based optimization of program parameters possible. As an emerging paradigm, differentiable programming builds upon several areas of computer science and applied mathematics, including automatic differentiation, graphical models, optimization and statistics. This book presents a comprehensive review of the fundamental concepts useful for differentiable programming. We adopt two main perspectives, that of optimization and that of probability, with clear analogies between the two. Differentiable programming is not merely the differentiation of programs, but also the thoughtful design of programs intended for differentiation. By making programs differentiable, we inherently introduce probability distributions over their execution, providing a means to quantify the uncertainty associated with program outputs.
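The abstract's central idea, end-to-end differentiation of programs that contain control flow, can be illustrated with a minimal sketch. The code below is not the book's framework; it is a small, self-contained forward-mode automatic differentiation built on dual numbers, where each value carries its derivative so ordinary Python branches and loops are differentiated along the execution path actually taken.

```python
# Minimal forward-mode automatic differentiation via dual numbers.
# Each value carries its derivative, so ordinary control flow
# (if/else, loops) is differentiated along the branch actually taken.

class Dual:
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__

def f(x):
    # A program with control flow: a branch followed by a loop.
    if x.val > 0:
        y = x * x        # taken for positive inputs
    else:
        y = 3 * x
    for _ in range(2):   # y <- y + x, twice
        y = y + x
    return y

def grad(f, x):
    out = f(Dual(x, 1.0))  # seed with dx/dx = 1
    return out.dot

# For x > 0, f(x) = x^2 + 2x, so f'(3) = 2*3 + 2 = 8
print(grad(f, 3.0))
```

Reverse-mode differentiation (backpropagation), which the book also covers, evaluates the same derivatives more efficiently for many-parameter programs, but the dual-number view is the shortest path to seeing why control flow poses no obstacle.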
Related papers
- Probabilistic Programming with Programmable Variational Inference [45.593974530502095]
We propose a more modular approach to supporting variational inference in PPLs, based on compositional program transformation.
Our design enables modular reasoning about many interacting concerns, including automatic differentiation, density, tracing, and the application of unbiased gradient estimation strategies.
We implement our approach in an extension to the Gen probabilistic programming system (genjax.vi), implemented in JAX, and evaluate on several deep generative modeling tasks.
arXiv Detail & Related papers (2024-06-22T05:49:37Z)
- Branches of a Tree: Taking Derivatives of Programs with Discrete and Branching Randomness in High Energy Physics [1.0587959762260988]
We discuss several possible gradient estimation strategies, including the recent AD method, and compare them in simplified detector design experiments.
In doing so we develop, to the best of our knowledge, the first fully differentiable branching program.
arXiv Detail & Related papers (2023-08-31T12:32:34Z)
- $\omega$PAP Spaces: Reasoning Denotationally About Higher-Order, Recursive Probabilistic and Differentiable Programs [64.25762042361839]
$\omega$PAP spaces support denotational reasoning about expressive differentiable and probabilistic programming languages.
Our semantics is general enough to assign meanings to most practical probabilistic and differentiable programs.
We establish the almost-everywhere differentiability of probabilistic programs' trace density functions.
arXiv Detail & Related papers (2023-02-21T12:50:05Z)
- Differentiable programming: Generalization, characterization and limitations of deep learning [0.47791962198275073]
We define differentiable programming, as well as specify some program characteristics that allow us to incorporate the structure of the problem in a differentiable program.
We analyze different types of differentiable programs, from more general to more specific, and evaluate them on a specific problem with a graph dataset.
arXiv Detail & Related papers (2022-05-13T21:23:57Z)
- Differentiable Spline Approximations [48.10988598845873]
Differentiable programming has significantly enhanced the scope of machine learning.
Standard differentiable programming methods (such as autodiff) typically require that the machine learning models be differentiable.
We show that leveraging this redesigned Jacobian in the form of a differentiable "layer" in predictive models leads to improved performance in diverse applications.
arXiv Detail & Related papers (2021-10-04T16:04:46Z)
- Efficient and Modular Implicit Differentiation [68.74748174316989]
We propose a unified, efficient and modular approach for implicit differentiation of optimization problems.
We show that seemingly simple principles allow one to recover many recently proposed implicit differentiation methods and to create new ones easily.
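The core principle behind implicit differentiation can be sketched in a few lines; the code below is an illustrative scalar example, not the paper's library. To differentiate the solution x*(θ) of an optimality condition f(x, θ) = 0, one never unrolls the solver: the implicit function theorem gives dx*/dθ = −(∂f/∂x)⁻¹ (∂f/∂θ) evaluated at the solution.

```python
# Illustrative sketch of implicit differentiation on a scalar root-finding
# problem: differentiate x*(theta) solving f(x, theta) = 0 without
# differentiating through the solver's iterations.

def f(x, theta):
    return x**3 + x - theta        # monotone in x, so a unique root

def solve(theta, iters=50):
    # Newton's method; treated as a black box by the differentiation step.
    x = 0.0
    for _ in range(iters):
        x -= f(x, theta) / (3 * x**2 + 1)
    return x

def dsolve_dtheta(theta):
    x = solve(theta)
    df_dx = 3 * x**2 + 1           # partial of f w.r.t. x at the root
    df_dtheta = -1.0               # partial of f w.r.t. theta
    return -df_dtheta / df_dx      # implicit function theorem

theta = 10.0
# Cross-check against a finite-difference estimate of the same derivative.
eps = 1e-6
fd = (solve(theta + eps) - solve(theta - eps)) / (2 * eps)
print(solve(theta), dsolve_dtheta(theta), fd)
```

The same recipe scales to vector-valued problems (the inverse becomes a linear solve against the Jacobian) and underlies differentiating through optimization layers generally.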
arXiv Detail & Related papers (2021-05-31T17:45:58Z)
- Differentiable Programming à la Moreau [4.289574109162585]
We define a compositional calculus adapted to Moreau envelopes and show how to integrate it within differentiable programming.
The proposed framework casts in a mathematical optimization framework several variants of gradient back-propagation related to the idea of the propagation of virtual targets.
arXiv Detail & Related papers (2020-12-31T05:56:51Z)
- Learning Differentiable Programs with Admissible Neural Heuristics [43.54820901841979]
We study the problem of learning differentiable functions expressed as programs in a domain-specific language.
We frame this optimization problem as a search in a weighted graph whose paths encode top-down derivations of program syntax.
Our key innovation is to view various classes of neural networks as continuous relaxations over the space of programs.
arXiv Detail & Related papers (2020-07-23T16:07:39Z)
- Efficient Learning of Generative Models via Finite-Difference Score Matching [111.55998083406134]
We present a generic strategy to efficiently approximate any-order directional derivative with finite difference.
Our approximation only involves function evaluations, which can be executed in parallel, and no gradient computations.
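The observation can be made concrete with a generic sketch (illustrative code, not the paper's implementation): a directional derivative ⟨∇f(x), v⟩ is approximated by a central difference along v, using two function evaluations that are independent of each other and hence parallelizable, with no gradient computation at all.

```python
import math

# Central-difference approximation of the directional derivative
# <grad f(x), v> using only two function evaluations, which are
# independent and could run in parallel.

def directional_derivative(f, x, v, eps=1e-5):
    x_plus = [xi + eps * vi for xi, vi in zip(x, v)]
    x_minus = [xi - eps * vi for xi, vi in zip(x, v)]
    return (f(x_plus) - f(x_minus)) / (2 * eps)

def f(x):
    return x[0]**2 + math.sin(x[1])

x = [1.0, 0.0]
v = [1.0, 1.0]
# Exact value: grad f(x) = (2*x0, cos(x1)) = (2, 1), so <grad, v> = 3
print(directional_derivative(f, x, v))
```

The central scheme has O(eps²) truncation error; higher-order directional derivatives follow from analogous multi-point stencils along the same direction.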
arXiv Detail & Related papers (2020-07-07T10:05:01Z)
This list is automatically generated from the titles and abstracts of the papers on this site. The site does not guarantee the quality of this information and is not responsible for any consequences of its use.