SymJAX: symbolic CPU/GPU/TPU programming
- URL: http://arxiv.org/abs/2005.10635v1
- Date: Thu, 21 May 2020 13:37:25 GMT
- Title: SymJAX: symbolic CPU/GPU/TPU programming
- Authors: Randall Balestriero
- Score: 9.868558660605995
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: SymJAX is a symbolic programming version of JAX that simplifies graph
input/output/updates and provides additional functionality for general
machine learning and deep learning applications. From a user perspective,
SymJAX offers a Theano-like experience with fast graph optimization/compilation
and broad hardware support, along with Lasagne-like deep learning
functionalities.
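The symbolic, build-then-compile style described in the abstract can be illustrated with a toy graph in plain Python. This is a sketch of the general Theano-style idea only, not SymJAX's actual API: expressions first build a graph of placeholder inputs and operations, and a separate "function" step turns that graph into a callable.

```python
# Toy symbolic-graph sketch (NOT the SymJAX API): expressions build a
# graph first; a separate compile step produces a callable, as in Theano.

class Node:
    def __add__(self, other):
        return Op("add", self, other)

    def __mul__(self, other):
        return Op("mul", self, other)


class Placeholder(Node):
    """A symbolic input: holds no value until the compiled function is called."""


class Op(Node):
    def __init__(self, kind, left, right):
        self.kind, self.left, self.right = kind, left, right


def evaluate(node, env):
    # Recursively evaluate the graph, reading placeholder values from env.
    if isinstance(node, Placeholder):
        return env[node]
    a, b = evaluate(node.left, env), evaluate(node.right, env)
    return a + b if node.kind == "add" else a * b


def function(inputs, output):
    # "Compile" the graph into a plain callable over concrete values.
    return lambda *args: evaluate(output, dict(zip(inputs, args)))


x, y = Placeholder(), Placeholder()
f = function([x, y], x * y + x)   # graph built symbolically, then compiled
print(f(3.0, 4.0))                # 15.0
```

A real symbolic framework adds graph optimization before compilation and lowers to fast backend code; this sketch only shows the deferred-evaluation contract that separates graph construction from execution.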
Related papers
- Benchmarking Predictive Coding Networks -- Made Simple [48.652114040426625]
We first propose a library called PCX, whose focus lies on performance and simplicity.
We use PCX to implement a large set of benchmarks for the community to use for their experiments.
arXiv Detail & Related papers (2024-07-01T10:33:44Z)
- JaxDecompiler: Redefining Gradient-Informed Software Design [0.0]
This article introduces JaxDecompiler, a tool that transforms any JAX function into editable Python code.
arXiv Detail & Related papers (2024-03-14T20:32:31Z)
- DrJAX: Scalable and Differentiable MapReduce Primitives in JAX [9.676195490442367]
DrJAX is a library designed to support large-scale distributed and parallel machine learning algorithms.
DrJAX embeds building blocks for MapReduce computations as primitives in JAX.
DrJAX computations can be translated directly to XLA HLO, enabling flexible integration with a wide array of ML training platforms.
arXiv Detail & Related papers (2024-03-11T19:51:01Z)
- BlackJAX: Composable Bayesian inference in JAX [8.834500692867671]
BlackJAX is a library implementing sampling and variational inference algorithms.
It is written in Python, using JAX to compile and run NumPy-like samplers and variational methods on CPUs, GPUs, and TPUs.
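The family of samplers such a library implements can be sketched with a minimal random-walk Metropolis loop in plain Python. This is an illustration of the underlying MCMC idea only, not BlackJAX's API; the function and parameter names here are hypothetical.

```python
# Minimal random-walk Metropolis sketch (stdlib only; NOT BlackJAX's API).
import math
import random


def metropolis(logpdf, init, n_samples, step=1.0, seed=0):
    rng = random.Random(seed)
    x, samples = init, []
    for _ in range(n_samples):
        proposal = x + rng.gauss(0.0, step)  # symmetric random-walk proposal
        # Accept with probability min(1, p(proposal)/p(x)), done in log space.
        if math.log(rng.random()) < logpdf(proposal) - logpdf(x):
            x = proposal
        samples.append(x)
    return samples


# Target: standard normal density, up to an additive log-constant.
draws = metropolis(lambda z: -0.5 * z * z, init=0.0, n_samples=5000)
print(sum(draws) / len(draws))  # sample mean, close to 0
```

A JAX-based implementation would express the same kernel as a pure function over an explicit PRNG key so it can be JIT-compiled and vectorized across chains.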
arXiv Detail & Related papers (2024-02-16T16:21:02Z)
- LLaGA: Large Language and Graph Assistant [73.71990472543027]
Large Language and Graph Assistant (LLaGA) is an innovative model to handle the complexities of graph-structured data.
LLaGA excels in versatility, generalizability and interpretability, allowing it to perform consistently well across different datasets and tasks.
Our experiments show that LLaGA delivers outstanding performance across four datasets and three tasks using one single model.
arXiv Detail & Related papers (2024-02-13T02:03:26Z)
- JaxMARL: Multi-Agent RL Environments and Algorithms in JAX [105.343918678781]
We present JaxMARL, the first open-source, Python-based library that combines GPU-enabled efficiency with support for a large number of commonly used MARL environments.
Our experiments show that, in terms of wall clock time, our JAX-based training pipeline is around 14 times faster than existing approaches.
We also introduce and benchmark SMAX, a JAX-based approximate reimplementation of the popular StarCraft Multi-Agent Challenge.
arXiv Detail & Related papers (2023-11-16T18:58:43Z)
- TpuGraphs: A Performance Prediction Dataset on Large Tensor Computational Graphs [24.790481918123103]
This paper introduces TpuGraphs, a performance prediction dataset on full tensor programs.
Each graph in the dataset represents the main computation of a machine learning workload.
TpuGraphs provides 25x more graphs than the largest graph property prediction dataset.
arXiv Detail & Related papers (2023-08-25T17:04:35Z)
- JaxPruner: A concise library for sparsity research [46.153423603424]
JaxPruner is an open-source library for sparse neural network research.
It implements popular pruning and sparse training algorithms with minimal memory and latency overhead.
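The core of the pruning algorithms mentioned can be sketched with a plain-Python magnitude-pruning routine. This illustrates the general technique only, not JaxPruner's API; the function name and its signature are hypothetical.

```python
# Minimal magnitude-pruning sketch (stdlib only; NOT JaxPruner's API):
# zero out the smallest-magnitude weights to reach a target sparsity.

def magnitude_prune(weights, sparsity):
    """Return weights with the smallest-|w| fraction `sparsity` set to 0."""
    k = int(len(weights) * sparsity)           # number of weights to drop
    if k == 0:
        return list(weights)
    # Threshold at the k-th smallest magnitude; ties at the threshold
    # may prune slightly more than k weights in this simple sketch.
    threshold = sorted(abs(w) for w in weights)[k - 1]
    return [0.0 if abs(w) <= threshold else w for w in weights]


pruned = magnitude_prune([0.9, -0.05, 0.4, 0.01, -0.7], sparsity=0.4)
print(pruned)  # [0.9, 0.0, 0.4, 0.0, -0.7]
```

A real sparsity library applies this kind of mask to whole parameter trees and updates it on a schedule during training rather than once after the fact.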
arXiv Detail & Related papers (2023-04-27T10:45:30Z)
- Graph Contrastive Learning Automated [94.41860307845812]
Graph contrastive learning (GraphCL) has emerged with promising representation learning performance.
The effectiveness of GraphCL hinges on ad-hoc data augmentations, which have to be manually picked per dataset.
This paper proposes a unified bi-level optimization framework to automatically, adaptively and dynamically select data augmentations when performing GraphCL on specific graph data.
arXiv Detail & Related papers (2021-06-10T16:35:27Z)
- CommPOOL: An Interpretable Graph Pooling Framework for Hierarchical Graph Representation Learning [74.90535111881358]
We propose a new interpretable graph pooling framework - CommPOOL.
It can capture and preserve the hierarchical community structure of graphs in the graph representation learning process.
CommPOOL is a general and flexible framework for hierarchical graph representation learning.
arXiv Detail & Related papers (2020-12-10T21:14:18Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.