Julia Language in Machine Learning: Algorithms, Applications, and Open Issues
- URL: http://arxiv.org/abs/2003.10146v2
- Date: Sun, 17 May 2020 10:52:22 GMT
- Title: Julia Language in Machine Learning: Algorithms, Applications, and Open Issues
- Authors: Kaifeng Gao, Gang Mei, Francesco Piccialli, Salvatore Cuomo, Jingzhi Tu, Zenan Huo
- Abstract summary: Machine learning is driving development across many fields in science and engineering.
Currently, the programming languages most commonly used to develop machine learning algorithms include Python, MATLAB, and C/C++.
This paper summarizes the related research work and developments in the application of the Julia language in machine learning.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Machine learning is driving development across many fields in science and
engineering. A simple and efficient programming language could accelerate
applications of machine learning in various fields. Currently, the programming
languages most commonly used to develop machine learning algorithms include
Python, MATLAB, and C/C++. However, none of these languages balances
efficiency and simplicity well. The Julia language is a fast, easy-to-use,
open-source programming language originally designed for high-performance
computing, and it balances efficiency and simplicity well. This paper
summarizes the related research work and developments in
the application of the Julia language in machine learning. It first surveys the
popular machine learning algorithms that are developed in the Julia language.
Then, it investigates applications of the machine learning algorithms
implemented with the Julia language. Finally, it discusses the open issues and
the potential future directions that arise in the use of the Julia language in
machine learning.
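To make the abstract's efficiency-versus-simplicity claim concrete, the following is a minimal sketch, not taken from the paper, of gradient-descent linear regression in plain Julia; all function and variable names are invented for this illustration:

```julia
# Minimal gradient-descent least-squares regression in plain Julia.
# Illustrative sketch only; names and parameters are not from the paper.
function fit_linear(X, y; lr=0.01, epochs=1000)
    n, d = size(X)
    w = zeros(d)                                  # weight vector
    for _ in 1:epochs
        grad = (2 / n) * X' * (X * w .- y)        # gradient of mean squared error
        w .-= lr .* grad                          # in-place, loop-free update
    end
    return w
end

# Toy usage on small, noiseless data
X = [1.0 0.0; 0.0 1.0; 1.0 1.0; 2.0 -1.0]
w_true = [2.0, -3.0]
y = X * w_true
w_hat = fit_linear(X, y; lr=0.1, epochs=2000)
```

The dot-broadcast syntax (`.-`, `.*`) keeps the code as terse as a scripting language, while Julia's JIT compilation gives the loop C-like speed, which is the balance the abstract refers to.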
Related papers
- The State of Julia for Scientific Machine Learning (arXiv, 2024-10-14)
  We take a modern look at Julia's features and ecosystem, assess the current state of the language, and discuss its viability and pitfalls.
  We call for the community to address Julia's language-level issues that are preventing further adoption.
- A Comprehensive Guide to Combining R and Python Code for Data Science, Machine Learning and Reinforcement Learning (arXiv, 2024-07-19)
  We show how to run Python's scikit-learn, PyTorch, and OpenAI Gym libraries for building Machine Learning, Deep Learning, and Reinforcement Learning projects easily.
- A Declarative Query Language for Scientific Machine Learning (arXiv, 2024-05-25)
  We introduce a new declarative machine learning query language, called MQL, for naive users.
  We discuss two materials science experiments implemented using MQL on a materials science workflow system called MatFlow.
- CoLA: Exploiting Compositional Structure for Automatic and Efficient Numerical Linear Algebra (arXiv, 2023-09-06)
  We propose a simple but general framework for large-scale linear algebra problems in machine learning, named CoLA.
  By combining a linear operator abstraction with compositional dispatch rules, CoLA automatically constructs memory- and runtime-efficient numerical algorithms.
  We showcase its efficacy across a broad range of applications, including partial differential equations, Gaussian processes, equivariant model construction, and unsupervised learning.
- Toolformer: Language Models Can Teach Themselves to Use Tools (arXiv, 2023-02-09)
  Language models (LMs) exhibit remarkable abilities to solve new tasks from just a few examples or textual instructions, especially at scale.
  We show that LMs can teach themselves to use external tools via simple APIs and achieve the best of both worlds.
  We introduce Toolformer, a model trained to decide which APIs to call, when to call them, what arguments to pass, and how to best incorporate the results into future token prediction.
- AI2: The Next Leap Toward Native Language Based and Explainable Machine Learning Framework (arXiv, 2023-01-09)
  The proposed framework, named AI2, uses a natural language interface that allows a non-specialist to benefit from machine learning algorithms.
  The primary contribution of the AI2 framework is that it allows a user to call the machine learning algorithms in English, making its interface easier to use.
  Another contribution is a preprocessing module that helps to describe and load data properly.
- Flashlight: Enabling Innovation in Tools for Machine Learning (arXiv, 2022-01-29)
  We introduce Flashlight, an open-source library built to spur innovation in machine learning tools and systems.
  We see Flashlight as a tool enabling research that can benefit widely used libraries downstream and bring machine learning and systems researchers closer together.
- Searching for More Efficient Dynamic Programs (arXiv, 2021-09-14)
  We describe a set of program transformations, a simple metric for assessing the efficiency of a transformed program, and a search procedure to improve this metric.
  We show that in practice, automated search can find substantial improvements to the initial program.
- AVATAR: A Parallel Corpus for Java-Python Program Translation (arXiv, 2021-08-26)
  Program translation refers to migrating source code from one language to another.
  We present AVATAR, a collection of 9,515 programming problems and their solutions written in two popular languages, Java and Python.
- Leveraging Language to Learn Program Abstractions and Search Heuristics (arXiv, 2021-06-18)
  We introduce LAPS (Language for Abstraction and Program Search), a technique for using natural language annotations to guide joint learning of libraries and neurally guided search models for synthesis.
  When integrated into a state-of-the-art library learning system (DreamCoder), LAPS produces higher-quality libraries and improves search efficiency and generalization.
- Machine Learning in Python: Main Developments and Technology Trends in Data Science, Machine Learning, and Artificial Intelligence (arXiv, 2020-02-12)
  Python continues to be the most preferred language for scientific computing, data science, and machine learning.
  This survey offers insight into the field of machine learning with Python, taking a tour through important topics to identify some of the core hardware and software paradigms that have enabled it.
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.