JaxDecompiler: Redefining Gradient-Informed Software Design
- URL: http://arxiv.org/abs/2403.10571v1
- Date: Thu, 14 Mar 2024 20:32:31 GMT
- Title: JaxDecompiler: Redefining Gradient-Informed Software Design
- Authors: Pierrick Pochelu
- Abstract summary: JaxDecompiler is a tool that transforms any JAX function into editable Python code.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Among numerical libraries capable of computing gradient descent optimization, JAX stands out by offering more features, accelerated by an intermediate representation known as the Jaxpr language. However, editing Jaxpr code is not directly possible. This article introduces JaxDecompiler, a tool that transforms any JAX function into editable Python code, which is especially useful for editing the JAX function generated by the gradient function. JaxDecompiler simplifies reverse engineering, understanding, customizing, and interoperating with software developed in JAX. We highlight its capabilities, emphasize its practical applications in deep learning and, more generally, gradient-informed software, and demonstrate that the decompiled code's runtime performance is similar to that of the original.
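The Jaxpr intermediate representation mentioned in the abstract can be inspected with standard JAX tooling. The sketch below, using only the public `jax.make_jaxpr` API (JaxDecompiler's own interface is not shown here), prints the IR of a gradient function; this is the normally non-editable representation that a decompiler would turn back into plain Python:

```python
import jax
import jax.numpy as jnp

def loss(w, x):
    # A simple quadratic loss whose gradient JAX derives automatically.
    return jnp.sum((w * x - 1.0) ** 2)

grad_loss = jax.grad(loss)

# Print the Jaxpr intermediate representation of the gradient function.
print(jax.make_jaxpr(grad_loss)(jnp.ones(3), jnp.ones(3)))
```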
Related papers
- NExT: Teaching Large Language Models to Reason about Code Execution [50.93581376646064]
Large language models (LLMs) of code are typically trained on the surface textual form of programs.
We propose NExT, a method to teach LLMs to inspect the execution traces of programs and reason about their run-time behavior.
arXiv Detail & Related papers (2024-04-23T01:46:32Z)
- JaxUED: A simple and useable UED library in Jax [1.5821811088000381]
We present JaxUED, an open-source library providing minimal dependency implementations of modern Unsupervised Environment Design (UED) algorithms in Jax.
Inspired by CleanRL, we provide fast, clear, understandable, and easily modifiable implementations, with the aim of accelerating research into UED.
arXiv Detail & Related papers (2024-03-19T18:40:50Z)
- JAXbind: Bind any function to JAX [0.0]
JAXbind provides an easy-to-use Python interface for defining custom, so-called JAX primitives.
JAXbind allows a user to interface the JAX function engine with custom derivatives and rules, enabling all JAX transformations for the custom primitive.
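The idea of attaching custom derivatives to a function can be illustrated with JAX's built-in `jax.custom_vjp` mechanism. This is a sketch of the general technique, not JAXbind's own interface, which wraps a similar idea for externally defined primitives:

```python
import jax

# Attach a hand-written derivative rule to a function using the
# standard jax.custom_vjp decorator from JAX itself.
@jax.custom_vjp
def square(x):
    return x * x

def square_fwd(x):
    # Forward pass: return the output plus residuals for the backward pass.
    return square(x), x

def square_bwd(x, g):
    # Backward pass: d(x*x)/dx = 2x, scaled by the incoming cotangent g.
    return (2.0 * x * g,)

square.defvjp(square_fwd, square_bwd)

print(jax.grad(square)(3.0))  # 6.0
```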
arXiv Detail & Related papers (2024-03-13T16:50:04Z)
- DrJAX: Scalable and Differentiable MapReduce Primitives in JAX [9.676195490442367]
DrJAX is a library designed to support large-scale distributed and parallel machine learning algorithms.
DrJAX embeds building blocks for MapReduce computations as primitives in JAX.
DrJAX computations can be translated directly to XLA HLO, enabling flexible integration with a wide array of ML training platforms.
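The translation to XLA IR can be seen through JAX's standard ahead-of-time lowering path. The sketch below uses only the public `jax.jit(...).lower(...)` API (DrJAX's own MapReduce primitives are not shown) to print the compiler IR of a small function:

```python
import jax
import jax.numpy as jnp

def step(x):
    return jnp.tanh(x) * 2.0

# Lower a jitted function to its compiler IR (StableHLO text in recent
# JAX versions). This is the standard JAX route into the XLA toolchain
# that libraries such as DrJAX build on.
lowered = jax.jit(step).lower(jnp.ones((4,)))
print(lowered.as_text()[:200])
```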
arXiv Detail & Related papers (2024-03-11T19:51:01Z)
- Automatic Functional Differentiation in JAX [8.536145202129827]
We extend JAX with the capability to automatically differentiate higher-order functions (functionals and operators).
By representing functions as a generalization of arrays, we seamlessly use JAX's existing primitive system to implement higher-order functions.
arXiv Detail & Related papers (2023-11-30T17:23:40Z)
- JaxMARL: Multi-Agent RL Environments and Algorithms in JAX [105.343918678781]
We present JaxMARL, the first open-source, Python-based library that combines GPU-enabled efficiency with support for a large number of commonly used MARL environments.
Our experiments show that, in terms of wall clock time, our JAX-based training pipeline is around 14 times faster than existing approaches.
We also introduce and benchmark SMAX, a JAX-based approximate reimplementation of the popular StarCraft Multi-Agent Challenge.
arXiv Detail & Related papers (2023-11-16T18:58:43Z)
- JaxPruner: A concise library for sparsity research [46.153423603424]
JaxPruner is an open-source library for sparse neural network research.
It implements popular pruning and sparse training algorithms with minimal memory and latency overhead.
arXiv Detail & Related papers (2023-04-27T10:45:30Z)
- A modular software framework for the design and implementation of ptychography algorithms [55.41644538483948]
We present SciCom, a new ptychography software framework aimed at simulating ptychography datasets and testing state-of-the-art reconstruction algorithms.
Despite its simplicity, the software leverages accelerated processing through the PyTorch interface.
Results are shown on both synthetic and real datasets.
arXiv Detail & Related papers (2022-05-06T16:32:37Z)
- ReACC: A Retrieval-Augmented Code Completion Framework [53.49707123661763]
We propose a retrieval-augmented code completion framework that leverages both lexical copying and retrieval of semantically similar code.
We evaluate our approach in the code completion task in Python and Java programming languages, achieving a state-of-the-art performance on CodeXGLUE benchmark.
arXiv Detail & Related papers (2022-03-15T08:25:08Z)
- Extending C++ for Heterogeneous Quantum-Classical Computing [56.782064931823015]
qcor is a C++ language extension and compiler implementation that enables heterogeneous quantum-classical programming, compilation, and execution in a single-source context.
Our work provides a first-of-its-kind C++ compiler enabling high-level expression of quantum kernels (functions) in a language-oriented manner.
arXiv Detail & Related papers (2020-10-08T12:49:07Z)
- SymJAX: symbolic CPU/GPU/TPU programming [9.868558660605995]
SymJAX is a symbolic programming version of JAX simplifying graph input/output/updates and providing additional functionalities for general machine learning and deep learning applications.
From a user perspective, SymJAX provides a Theano-like experience with fast graph optimization/compilation and broad hardware support, along with Lasagne-like deep learning functionalities.
arXiv Detail & Related papers (2020-05-21T13:37:25Z)
This list is automatically generated from the titles and abstracts of the papers in this site.