Progress Report: A Deep Learning Guided Exploration of Affine Unimodular
Loop Transformations
- URL: http://arxiv.org/abs/2206.03684v1
- Date: Wed, 8 Jun 2022 05:47:42 GMT
- Title: Progress Report: A Deep Learning Guided Exploration of Affine Unimodular
Loop Transformations
- Authors: Massinissa Merouani, Khaled Afif Boudaoud, Iheb Nassim Aouadj, Nassim
Tchoulak, Fatima Benbouzid-Sitayeb, Karima Benatchba, Hugh Leather, and
Riyadh Baghdadi
- Abstract summary: We present a work in progress on a deep-learning-based approach for automatic code optimization in polyhedral compilers.
The proposed technique explores combinations of affine and non-affine loop transformations to find the sequence of transformations that minimizes the execution time of a given program.
Preliminary results show that the proposed technique achieves a 2.35x geometric mean speedup over state-of-the-art polyhedral compilers.
- Score: 1.5699353548228476
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: In this paper, we present a work in progress on a deep-learning-based
approach for automatic code optimization in polyhedral compilers. The proposed
technique explores combinations of affine and non-affine loop transformations
to find the sequence of transformations that minimizes the execution time of a
given program. This exploration is guided by a deep-learning-based cost model
that evaluates the speedup that each sequence of transformations would yield.
Preliminary results show that the proposed technique achieves a 2.35x geometric
mean speedup over state-of-the-art polyhedral compilers (Pluto).
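The abstract describes a search over sequences of loop transformations, ranked by a learned cost model rather than by measuring real execution times. The sketch below illustrates that idea with a beam search; the transformation names, the toy scoring function standing in for the learned cost model, and the search parameters are all illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of a cost-model-guided search over loop-transformation
# sequences. The cost model here is a toy heuristic standing in for a trained
# deep-learning model that predicts the speedup of a transformation sequence.

# Candidate transformations the search may apply at each step (assumed names).
TRANSFORMATIONS = ["interchange", "skew", "reverse", "tile", "unroll"]

def predicted_speedup(sequence):
    """Stand-in for a learned cost model: returns the predicted speedup
    of applying `sequence` to the program."""
    # Toy heuristic: reward sequences that start with an interchange,
    # and mildly penalize longer sequences.
    score = 1.0
    for i, t in enumerate(sequence):
        score *= 1.5 if (t == "interchange" and i == 0) else 1.1
    return score / (1 + 0.05 * len(sequence))

def beam_search(depth=3, beam_width=2):
    """Keep the `beam_width` best sequences at each depth, as ranked by the
    cost model, instead of benchmarking every candidate program."""
    beam = [()]  # start from the empty transformation sequence
    best = ((), predicted_speedup(()))
    for _ in range(depth):
        candidates = [seq + (t,) for seq in beam for t in TRANSFORMATIONS]
        candidates.sort(key=predicted_speedup, reverse=True)
        beam = candidates[:beam_width]
        if predicted_speedup(beam[0]) > best[1]:
            best = (beam[0], predicted_speedup(beam[0]))
    return best

sequence, speedup = beam_search()
print(sequence, round(speedup, 3))
```

In a real autoscheduler the cost model would be queried on a featurized program representation, and the best-predicted sequence would finally be validated by compiling and timing the transformed code.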
Related papers
- Automatic Differentiation of Optimization Algorithms with Time-Varying Updates [4.389150156866014]
We apply unrolled or automatic differentiation to a time-varying iterative process and provide convergence guarantees for the resulting derivative iterates.
We adapt these convergence results and apply them to proximal gradient descent with variable step size and FISTA when solving partly smooth problems.
Our theoretical and numerical results show that the convergence rate of the algorithm is reflected in its derivative iterates.
arXiv Detail & Related papers (2024-10-21T11:53:49Z) - LOOPer: A Learned Automatic Code Optimizer For Polyhedral Compilers [1.7529897611426233]
We introduce LOOPer, the first polyhedral autoscheduler that uses a deep-learning-based cost model.
It supports the exploration of a large set of affine transformations, allowing the application of complex sequences of polyhedral transformations.
It also supports the optimization of programs with multiple loop nests and with rectangular and non-rectangular iteration domains.
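The affine unimodular transformations named in this paper's title (and in the summary above) are integer matrices with determinant +/-1 applied to the loop iteration vector. The matrices below are the standard textbook examples (interchange, skewing, reversal), shown for illustration; they are not code from LOOPer.

```python
# Affine unimodular loop transformations as 2x2 integer matrices acting on
# the iteration vector (i, j). Unimodularity (determinant +/-1) guarantees a
# one-to-one mapping of integer iteration points.

# Loop interchange: swap i and j -> permutation matrix.
INTERCHANGE = [[0, 1],
               [1, 0]]

# Loop skewing: j' = i + j -> unit lower-triangular matrix.
SKEW = [[1, 0],
        [1, 1]]

# Loop reversal of the inner loop: j' = -j.
REVERSE = [[1, 0],
           [0, -1]]

def det2(m):
    """Determinant of a 2x2 integer matrix; the matrix is unimodular iff
    the result is +1 or -1."""
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

def apply(m, point):
    """Map an iteration-space point through the transformation matrix."""
    i, j = point
    return (m[0][0] * i + m[0][1] * j,
            m[1][0] * i + m[1][1] * j)

for name, m in [("interchange", INTERCHANGE), ("skew", SKEW), ("reverse", REVERSE)]:
    assert det2(m) in (1, -1)  # all three are unimodular
    print(name, apply(m, (2, 3)))
```

Composing such matrices yields complex transformation sequences while preserving the lattice of iteration points, which is what makes this class attractive for polyhedral schedulers.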
arXiv Detail & Related papers (2024-03-18T07:22:31Z) - Fast Nonlinear Two-Time-Scale Stochastic Approximation: Achieving $O(1/k)$ Finite-Sample Complexity [2.5382095320488665]
This paper develops a new variant of two-time-scale stochastic approximation to find the roots of two coupled nonlinear operators.
Our key idea is to leverage the classic Ruppert-Polyak averaging technique to dynamically estimate the operators through their samples.
The estimated values of these averaging steps will then be used in the two-time-scale approximation updates to find the desired solution.
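The summary describes averaging noisy operator samples (Ruppert-Polyak style) and feeding the averaged estimate into a slower update. The toy below illustrates that two-time-scale structure on a single scalar operator; the operator F, the step-size schedules, and the iteration count are illustrative assumptions, not the paper's setting.

```python
import random

# Toy two-time-scale sketch: a fast averaging step estimates the operator
# from noisy samples, and a slower step uses that estimate to move toward
# the operator's root. F(x) = x - 2 is a stand-in with known root x* = 2.

random.seed(0)

def noisy_F(x):
    """Noisy sample of the operator F(x) = x - 2."""
    return (x - 2.0) + random.gauss(0.0, 0.5)

x = 0.0       # slow iterate: current root estimate
f_avg = 0.0   # fast iterate: running average of operator samples

for k in range(1, 5001):
    beta = k ** -0.5   # faster time scale: averaging step size
    alpha = 1.0 / k    # slower time scale: root-update step size
    f_avg += beta * (noisy_F(x) - f_avg)  # average samples of F at x
    x -= alpha * f_avg                    # move toward the root using the average

print(round(x, 3))  # converges toward the root x* = 2
```

The point of the averaging step is that the slow update sees a low-variance estimate of the operator rather than a single noisy sample, which is what enables the improved finite-sample rates the paper targets.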
arXiv Detail & Related papers (2024-01-23T13:44:15Z) - Uncovering mesa-optimization algorithms in Transformers [61.06055590704677]
Some autoregressive models can learn as an input sequence is processed, without undergoing any parameter changes, and without being explicitly trained to do so.
We show that standard next-token prediction error minimization gives rise to a subsidiary learning algorithm that adjusts the model as new inputs are revealed.
Our findings explain in-context learning as a product of autoregressive loss minimization and inform the design of new optimization-based Transformer layers.
arXiv Detail & Related papers (2023-09-11T22:42:50Z) - Recurrent Generic Contour-based Instance Segmentation with Progressive
Learning [111.31166268300817]
We propose a novel deep network architecture, i.e., PolySnake, for generic contour-based instance segmentation.
Motivated by the classic Snake algorithm, the proposed PolySnake achieves superior and robust segmentation performance.
arXiv Detail & Related papers (2023-01-21T05:34:29Z) - Nesterov Accelerated ADMM for Fast Diffeomorphic Image Registration [63.15453821022452]
Recent developments in approaches based on deep learning have achieved sub-second runtimes for DiffIR.
We propose a simple iterative scheme that functionally composes intermediate non-stationary velocity fields.
We then propose a convex optimisation model that uses a regularisation term of arbitrary order to impose smoothness on these velocity fields.
arXiv Detail & Related papers (2021-09-26T19:56:45Z) - A Discrete Variational Derivation of Accelerated Methods in Optimization [68.8204255655161]
We introduce a variational framework which allows us to derive different methods for optimization.
We derive two families of optimization methods in one-to-one correspondence.
The preservation of symplecticity of autonomous systems occurs here solely on the fibers.
arXiv Detail & Related papers (2021-06-04T20:21:53Z) - A Reinforcement Learning Environment for Polyhedral Optimizations [68.8204255655161]
We propose a shape-agnostic formulation for the space of legal transformations in the polyhedral model as a Markov Decision Process (MDP).
Instead of using transformations, the formulation is based on an abstract space of possible schedules.
Our generic MDP formulation enables using reinforcement learning to learn optimization policies over a wide range of loops.
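The summary frames schedule construction as an MDP over abstract schedule decisions rather than concrete transformations. The sketch below shows what such an environment interface might look like; the class name, action set, and simulated reward are hypothetical illustrations, not the paper's actual environment.

```python
import random

# Hypothetical MDP environment for schedule construction: states are partial
# schedules, actions are abstract schedule decisions, and the terminal reward
# stands in for benchmarking the scheduled program.

class PolyhedralScheduleEnv:
    """Toy MDP: build a schedule as a sequence of abstract decisions."""

    ACTIONS = ["fuse", "tile", "parallelize", "stop"]

    def __init__(self, seed=0):
        self.rng = random.Random(seed)
        self.schedule = []

    def reset(self):
        """Start a new episode with an empty schedule."""
        self.schedule = []
        return tuple(self.schedule)

    def step(self, action):
        """Apply one schedule decision; the episode ends on 'stop'."""
        done = action == "stop"
        if not done:
            self.schedule.append(action)
        # Simulated terminal reward: a real environment would compile and
        # time the program described by the finished schedule.
        reward = self.rng.random() if done else 0.0
        return tuple(self.schedule), reward, done

env = PolyhedralScheduleEnv()
state = env.reset()
state, reward, done = env.step("tile")
state, reward, done = env.step("stop")
print(state, done)
```

Because the state is a schedule rather than a transformation sequence, the same policy can be applied to loops of different shapes, which is the generality the summary emphasizes.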
arXiv Detail & Related papers (2021-04-28T12:41:52Z) - Optimization of Graph Total Variation via Active-Set-based Combinatorial
Reconditioning [48.42916680063503]
We propose a novel adaptive preconditioning strategy for proximal algorithms on this problem class.
We show that nested-forest decomposition of the inactive edges yields a guaranteed local linear convergence rate.
Our results suggest that local convergence analysis can serve as a guideline for selecting variable metrics in proximal algorithms.
arXiv Detail & Related papers (2020-02-27T16:33:09Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.