JAXFit: Trust Region Method for Nonlinear Least-Squares Curve Fitting on
the GPU
- URL: http://arxiv.org/abs/2208.12187v1
- Date: Thu, 25 Aug 2022 16:13:29 GMT
- Title: JAXFit: Trust Region Method for Nonlinear Least-Squares Curve Fitting on
the GPU
- Authors: Lucas R. Hofer, Milan Krstajić, Robert P. Smith
- Abstract summary: We implement a trust region method on the GPU for nonlinear least squares curve fitting problems using a new deep learning Python library called JAX.
Our open source package, JAXFit, works for both unconstrained and constrained curve fitting problems.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We implement a trust region method on the GPU for nonlinear least squares
curve fitting problems using a new deep learning Python library called JAX. Our
open source package, JAXFit, works for both unconstrained and constrained curve
fitting problems and allows the fit functions to be defined in Python alone --
without any specialized knowledge of either the GPU or CUDA programming. Since
JAXFit runs on the GPU, it is much faster than CPU-based libraries and even
other GPU-based libraries, despite being very easy to use. Additionally, due to
JAX's deep learning foundations, the Jacobian in JAXFit's trust region
algorithm is calculated with automatic differentiation, rather than using
derivative approximations or requiring the user to define the fit function's
partial derivatives.
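As a minimal sketch of the autodiff approach the abstract describes, JAX's `jax.jacfwd` can compute a fit function's Jacobian with respect to its parameters automatically. The exponential-decay model and function names below are hypothetical illustrations, not JAXFit's actual API; only `jax.jacfwd` and `jax.numpy` are standard JAX.

```python
import jax
import jax.numpy as jnp

# Hypothetical fit model: exponential decay y = a * exp(-b * x)
def model(params, x):
    a, b = params
    return a * jnp.exp(-b * x)

# Residuals between the model prediction and the observed data
def residuals(params, x, y):
    return model(params, x) - y

x = jnp.linspace(0.0, 1.0, 5)
y = 2.0 * jnp.exp(-3.0 * x)  # synthetic data from true params (2, 3)

# jax.jacfwd differentiates residuals w.r.t. params (argument 0),
# producing the Jacobian a trust region solver needs -- no hand-coded
# partial derivatives or finite-difference approximations required.
jacobian = jax.jacfwd(residuals)(jnp.array([2.0, 3.0]), x, y)
print(jacobian.shape)  # one row per data point, one column per parameter
```

Because the Jacobian is derived by automatic differentiation, it is exact to floating-point precision, unlike the finite-difference approximations many CPU-based fitting libraries fall back on.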
Related papers
- GeoCalib: Learning Single-image Calibration with Geometric Optimization [89.84142934465685]
From a single image, visual cues can help deduce intrinsic and extrinsic camera parameters like the focal length and the gravity direction.
Current approaches to this problem are based on either classical geometry with lines and vanishing points or on deep neural networks trained end-to-end.
We introduce GeoCalib, a deep neural network that leverages universal rules of 3D geometry through an optimization process.
arXiv Detail & Related papers (2024-09-10T17:59:55Z)
- DrJAX: Scalable and Differentiable MapReduce Primitives in JAX [9.676195490442367]
DrJAX is a library designed to support large-scale distributed and parallel machine learning algorithms.
DrJAX embeds building blocks for MapReduce computations as primitives in JAX.
DrJAX computations can be translated directly to XLA HLO, enabling flexible integration with a wide array of ML training platforms.
arXiv Detail & Related papers (2024-03-11T19:51:01Z)
- BlackJAX: Composable Bayesian inference in JAX [8.834500692867671]
BlackJAX is a library implementing sampling and variational inference algorithms.
It is written in Python, using JAX to compile and run NumPy-like samplers and variational methods on CPUs, GPUs, and TPUs.
arXiv Detail & Related papers (2024-02-16T16:21:02Z)
- XLB: A differentiable massively parallel lattice Boltzmann library in Python [0.0]
We introduce XLB, a Python-based differentiable lattice Boltzmann method (LBM) library built on the JAX platform.
XLB's differentiability and data structures are compatible with the extensive JAX-based machine learning ecosystem.
XLB has been successfully scaled to handle simulations with billions of cells, achieving giga-scale lattice updates per second.
arXiv Detail & Related papers (2023-11-27T18:50:37Z)
- JaxMARL: Multi-Agent RL Environments and Algorithms in JAX [105.343918678781]
We present JaxMARL, the first open-source, Python-based library that combines GPU-enabled efficiency with support for a large number of commonly used MARL environments.
Our experiments show that, in terms of wall clock time, our JAX-based training pipeline is around 14 times faster than existing approaches.
We also introduce and benchmark SMAX, a JAX-based approximate reimplementation of the popular StarCraft Multi-Agent Challenge.
arXiv Detail & Related papers (2023-11-16T18:58:43Z)
- JaxPruner: A concise library for sparsity research [46.153423603424]
JaxPruner is an open-source library for sparse neural network research.
It implements popular pruning and sparse training algorithms with minimal memory and latency overhead.
arXiv Detail & Related papers (2023-04-27T10:45:30Z)
- SequeL: A Continual Learning Library in PyTorch and JAX [50.33956216274694]
SequeL is a library for Continual Learning that supports both PyTorch and JAX frameworks.
It provides a unified interface for a wide range of Continual Learning algorithms, including regularization-based approaches, replay-based approaches, and hybrid approaches.
We release SequeL as an open-source library, enabling researchers and developers to easily experiment and extend the library for their own purposes.
arXiv Detail & Related papers (2023-04-21T10:00:22Z)
- Going faster to see further: GPU-accelerated value iteration and simulation for perishable inventory control using JAX [5.856836693166898]
We use the Python library JAX to implement value iteration and simulators of the underlying Markov decision processes in a high-level API.
Our method extends the use of value iteration to settings that were previously considered infeasible or impractical.
We compare the performance of the optimal replenishment policies to policies fitted using simulation optimization in JAX, which allows parallel evaluation of multiple candidate policy parameters.
arXiv Detail & Related papers (2023-03-19T14:20:44Z)
- Data-Efficient Instance Segmentation with a Single GPU [88.31338435907304]
We introduce a data-efficient segmentation method we used in the 2021 VIPriors Instance Challenge.
Our solution is a modified version of Swin Transformer, built on MMDetection, a powerful object detection toolbox.
Our method achieved an AP@0.50:0.95 (medium) of 0.592, ranking second among all contestants.
arXiv Detail & Related papers (2021-10-01T07:36:20Z)
- Hybrid Models for Learning to Branch [81.93868699246214]
We propose a new hybrid architecture for efficient branching on CPU machines.
The proposed architecture combines the expressive power of GNNs with computationally inexpensive multi-layer perceptrons (MLP) for branching.
arXiv Detail & Related papers (2020-06-26T21:03:45Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented (including all listed content) and is not responsible for any consequences of its use.