MLJ: A Julia package for composable machine learning
- URL: http://arxiv.org/abs/2007.12285v2
- Date: Tue, 3 Nov 2020 23:06:45 GMT
- Title: MLJ: A Julia package for composable machine learning
- Authors: Anthony D. Blaom, Franz Kiraly, Thibaut Lienart, Yiannis Simillides,
Diego Arenas, Sebastian J. Vollmer
- Abstract summary: MLJ is an open source software package for interacting with machine learning models written in Julia and other languages.
It provides tools and meta-algorithms for selecting, tuning, evaluating, composing and comparing those models.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: MLJ (Machine Learning in Julia) is an open source software package providing a
common interface for interacting with machine learning models written in Julia
and other languages. It provides tools and meta-algorithms for selecting,
tuning, evaluating, composing and comparing those models, with a focus on
flexible model composition. In this design overview we detail chief novelties
of the framework, together with the clear benefits of Julia over the dominant
multi-language alternatives.
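As a minimal sketch of the common interface the abstract describes (assuming MLJ.jl and the MLJDecisionTreeInterface glue package are installed; names follow MLJ's documented API, but consult the current documentation):

```julia
using MLJ

# Load a model type from an external package via MLJ's model registry
Tree = @load DecisionTreeClassifier pkg=DecisionTree

X, y = @load_iris            # built-in demo dataset
tree = Tree(max_depth=3)     # instantiate with hyperparameters

mach = machine(tree, X, y)   # bind the model to data
evaluate!(mach,              # 5-fold cross-validated accuracy
          resampling=CV(nfolds=5, shuffle=true),
          measure=accuracy)
```

The same `machine`/`evaluate!` pattern applies unchanged to models wrapped from other packages, which is the "common interface" the paper emphasizes.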
Related papers
- Towards Completeness-Oriented Tool Retrieval for Large Language Models
Real-world systems often incorporate a wide array of tools, making it impractical to input all tools into Large Language Models.
Existing tool retrieval methods primarily focus on semantic matching between user queries and tool descriptions.
We propose a novel model-agnostic COllaborative Learning-based Tool Retrieval approach, COLT, which captures not only the semantic similarities between user queries and tool descriptions but also takes into account the collaborative information of tools.
arXiv Detail & Related papers (2024-05-25T06:41:23Z) - ALICE: Combining Feature Selection and Inter-Rater Agreeability for Machine Learning Insights
This paper presents a new Python library called Automated Learning for Insightful Comparison and Evaluation (ALICE)
It merges conventional feature selection and the concept of inter-rater agreeability in a simple, user-friendly manner to seek insights into black box Machine Learning models.
The framework is proposed following an overview of the key concepts of interpretability in ML.
arXiv Detail & Related papers (2024-04-13T17:34:58Z) - OmniFusion Technical Report
We propose the OmniFusion model, based on a pretrained large language model (LLM).
We evaluate and compare several architecture design principles for better text and visual data coupling.
Experiments on 8 visual-language benchmarks show the top score for the best OmniFusion setup.
arXiv Detail & Related papers (2024-04-09T11:00:19Z) - CMULAB: An Open-Source Framework for Training and Deployment of Natural Language Processing Models
This paper introduces the CMU Linguistic Annotation Backend, an open-source framework that simplifies model deployment and continuous human-in-the-loop fine-tuning of NLP models.
CMULAB enables users to leverage the power of multilingual models to quickly adapt and extend existing tools for speech recognition, OCR, translation, and syntactic analysis to new languages.
arXiv Detail & Related papers (2024-04-03T02:21:46Z) - ToolEyes: Fine-Grained Evaluation for Tool Learning Capabilities of Large Language Models in Real-world Scenarios
We propose ToolEyes, a fine-grained system tailored for the evaluation of large language models' tool learning capabilities in authentic scenarios.
The system meticulously examines seven real-world scenarios, analyzing five dimensions crucial to LLMs in tool learning.
ToolEyes incorporates a tool library boasting approximately 600 tools, serving as an intermediary between LLMs and the physical world.
arXiv Detail & Related papers (2024-01-01T12:49:36Z) - CoLLiE: Collaborative Training of Large Language Models in an Efficient Way
CoLLiE is an efficient library that facilitates collaborative training of large language models.
With its modular design and comprehensive functionality, CoLLiE offers a balanced blend of efficiency, ease of use, and customization.
arXiv Detail & Related papers (2023-12-01T08:02:16Z) - Reformulating Vision-Language Foundation Models and Datasets Towards Universal Multimodal Assistants
The Muffin framework employs pre-trained vision-language models to act as providers of visual signals.
UniMM-Chat dataset explores the complementarities of datasets to generate 1.1M high-quality and diverse multimodal instructions.
arXiv Detail & Related papers (2023-10-01T12:35:18Z) - Leveraging Language to Learn Program Abstractions and Search Heuristics
We introduce LAPS (Language for Abstraction and Program Search), a technique for using natural language annotations to guide joint learning of libraries and neurally-guided search models for synthesis.
When integrated into a state-of-the-art library learning system (DreamCoder), LAPS produces higher-quality libraries and improves search efficiency and generalization.
arXiv Detail & Related papers (2021-06-18T15:08:47Z) - Physics-Informed Machine Learning Simulator for Wildfire Propagation
This work evaluates the feasibility of re-implementing some key parts of the widely used Weather Research and Forecasting (WRF-SFIRE) simulator.
The main programming language used is Julia, a compiled language that offers better performance than interpreted ones.
arXiv Detail & Related papers (2020-12-12T14:13:26Z) - The JuliaConnectoR: a functionally oriented interface for integrating
Julia in R [0.0]
We develop the R package JuliaConnectoR, available from the CRAN repository and GitHub.
For maintainability and stability, we base communication between R and Julia on TCP.
This makes it easy to develop R extensions with Julia or to simply call functionality from Julia packages in R.
arXiv Detail & Related papers (2020-05-13T14:18:34Z) - Julia Language in Machine Learning: Algorithms, Applications, and Open
Issues [5.666843255747851]
Machine learning is driving development across many fields in science and engineering.
Currently, the programming languages most commonly used to develop machine learning algorithms include Python and C/C++.
This paper summarizes the related research work and developments in the application of the Julia language in machine learning.
arXiv Detail & Related papers (2020-03-23T09:31:02Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.