Efficient and Sound Differentiable Programming in a Functional
Array-Processing Language
- URL: http://arxiv.org/abs/2212.10307v1
- Date: Tue, 20 Dec 2022 14:54:47 GMT
- Authors: Amir Shaikhha, Mathieu Huot, Shabnam Ghasemirad, Andrew Fitzgibbon,
Simon Peyton Jones, Dimitrios Vytiniotis
- Abstract summary: Automatic differentiation (AD) is a technique for computing the derivative of a function represented by a program.
We present an AD system for a higher-order functional array-processing language.
In combination, gradient computation with forward-mode AD can be as efficient as reverse mode.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Automatic differentiation (AD) is a technique for computing the derivative of
a function represented by a program. It is the de facto standard for
computing derivatives in many machine learning and optimisation software
tools. Despite its practicality, the performance of the differentiated
programs is suboptimal, especially for functional languages and in the
presence of vectors. We present an AD
system for a higher-order functional array-processing language. The core
functional language underlying this system simultaneously supports both
source-to-source forward-mode AD and global optimisations such as loop
transformations. In combination, gradient computation with forward-mode AD can
be as efficient as reverse mode, and the Jacobian matrices required for
numerical algorithms such as Gauss-Newton and Levenberg-Marquardt can be
efficiently computed.
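The core idea of forward-mode AD can be illustrated with dual numbers: each value carries a tangent that is propagated by the chain rule, and a gradient is obtained by running one forward pass per input. This is a minimal Python sketch of the general technique, not the paper's actual system or language; the paper's contribution is that loop transformations can fuse these passes so the overall gradient cost matches reverse mode.

```python
class Dual:
    """A primal value paired with its derivative (tangent)."""

    def __init__(self, primal, tangent=0.0):
        self.primal = primal
        self.tangent = tangent

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Sum rule: (u + v)' = u' + v'
        return Dual(self.primal + other.primal, self.tangent + other.tangent)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (u * v)' = u' * v + u * v'
        return Dual(self.primal * other.primal,
                    self.tangent * other.primal + self.primal * other.tangent)

    __rmul__ = __mul__


def grad(f, xs):
    """Gradient of f at xs via n forward-mode passes, one per input:
    pass i seeds input i with tangent 1.0 and all others with 0.0."""
    gradient = []
    for i in range(len(xs)):
        duals = [Dual(x, 1.0 if j == i else 0.0) for j, x in enumerate(xs)]
        gradient.append(f(duals).tangent)
    return gradient


# f(x, y) = x*y + x*x, so df/dx = y + 2x and df/dy = x.
f = lambda v: v[0] * v[1] + v[0] * v[0]
print(grad(f, [3.0, 2.0]))  # [8.0, 3.0]
```

Naively this costs n forward passes for n inputs, which is why reverse mode is usually preferred for gradients; the paper's optimisations close that gap for its array language.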
Related papers
- Sample-efficient Bayesian Optimisation Using Known Invariances [56.34916328814857]
We show that vanilla and constrained BO algorithms are inefficient when optimising invariant objectives.
We derive a bound on the maximum information gain of these invariant kernels.
We use our method to design a current drive system for a nuclear fusion reactor, finding a high-performance solution.
arXiv Detail & Related papers (2024-10-22T12:51:46Z)
- A Method for Efficient Heterogeneous Parallel Compilation: A Cryptography Case Study [8.06660833012594]
This paper introduces a novel MLIR-based dialect, named hyper, designed to optimize data management and parallel computation across diverse hardware architectures.
We present HETOCompiler, a cryptography-focused compiler prototype that implements multiple hash algorithms and enables their execution on heterogeneous systems.
arXiv Detail & Related papers (2024-07-12T15:12:51Z)
- AxOMaP: Designing FPGA-based Approximate Arithmetic Operators using Mathematical Programming [2.898055875927704]
We propose a data analysis-driven mathematical programming-based approach to synthesizing approximate operators for FPGAs.
Specifically, we formulate mixed integer quadratically constrained programs based on the results of correlation analysis of the characterization data.
Compared to traditional evolutionary algorithms-based optimization, we report up to 21% improvement in the hypervolume, for joint optimization of PPA and BEHAV.
arXiv Detail & Related papers (2023-09-23T18:23:54Z)
- Source-to-Source Automatic Differentiation of OpenMP Parallel Loops [0.0]
This paper presents our work toward correct and efficient automatic differentiation of OpenMP parallel worksharing loops in forward and reverse mode.
We propose a framework to reason about the correctness of the generated derivative code, from which we justify our OpenMP extension to the differentiation model.
The generated derivative programs in forward and reverse mode outperform sequential execution, although the reverse-mode code often scales worse than the input programs.
arXiv Detail & Related papers (2021-11-02T19:40:59Z)
- Geometry-aware Bayesian Optimization in Robotics using Riemannian Matérn Kernels [64.62221198500467]
We show how to implement geometry-aware kernels for Bayesian optimization.
This technique can be used for control parameter tuning, parametric policy adaptation, and structure design in robotics.
arXiv Detail & Related papers (2021-11-02T09:47:22Z)
- Efficient and Modular Implicit Differentiation [68.74748174316989]
We propose a unified, efficient and modular approach for implicit differentiation of optimization problems.
We show that seemingly simple principles recover many recently proposed implicit differentiation methods and make it easy to create new ones.
arXiv Detail & Related papers (2021-05-31T17:45:58Z)
- Automatic differentiation for Riemannian optimization on low-rank matrix and tensor-train manifolds [71.94111815357064]
In scientific computing and machine learning applications, matrices and more general multidimensional arrays (tensors) can often be approximated with the help of low-rank decompositions.
One popular tool for finding low-rank approximations is Riemannian optimization.
arXiv Detail & Related papers (2021-03-27T19:56:00Z)
- Efficient Learning of Generative Models via Finite-Difference Score Matching [111.55998083406134]
We present a generic strategy to efficiently approximate any-order directional derivative with finite difference.
Our approximation only involves function evaluations, which can be executed in parallel, and no gradient computations.
arXiv Detail & Related papers (2020-07-07T10:05:01Z)
- Automatic Differentiation in ROOT [62.997667081978825]
In mathematics and computer algebra, automatic differentiation (AD) is a set of techniques to evaluate the derivative of a function specified by a computer program.
This paper presents AD techniques available in ROOT, supported by Cling, to produce derivatives of arbitrary C/C++ functions.
arXiv Detail & Related papers (2020-04-09T09:18:50Z)
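The finite-difference strategy described in the "Efficient Learning of Generative Models via Finite-Difference Score Matching" entry above can be sketched in a few lines: a directional derivative is approximated with a central difference, using only function evaluations (which parallelise trivially) and no gradient computations. This is a generic illustration of the idea, not that paper's implementation; the function names below are hypothetical.

```python
def directional_derivative(f, x, v, eps=1e-5):
    """Central-difference approximation of the derivative of f at x
    along direction v: (f(x + eps*v) - f(x - eps*v)) / (2*eps).
    Truncation error is O(eps**2)."""
    f_plus = f([xi + eps * vi for xi, vi in zip(x, v)])
    f_minus = f([xi - eps * vi for xi, vi in zip(x, v)])
    return (f_plus - f_minus) / (2.0 * eps)


# f(x) = x0**2 + 3*x1, so the exact directional derivative at x along v
# is grad(f) . v = 2*x0*v0 + 3*v1.
f = lambda x: x[0] ** 2 + 3.0 * x[1]
x, v = [1.0, 2.0], [1.0, 1.0]
approx = directional_derivative(f, x, v)
exact = 2.0 * x[0] * v[0] + 3.0 * v[1]
```

The two evaluations are independent, so any-order directional derivatives built by nesting this scheme can be computed in parallel, which is the efficiency argument made in that entry.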
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.