From Things' Modeling Language (ThingML) to Things' Machine Learning (ThingML2)
- URL: http://arxiv.org/abs/2009.10632v1
- Date: Tue, 22 Sep 2020 15:44:57 GMT
- Title: From Things' Modeling Language (ThingML) to Things' Machine Learning (ThingML2)
- Authors: Armin Moin, Stephan Rössler, Marouane Sayih, Stephan Günnemann
- Abstract summary: We enhance ThingML to support machine learning at the modeling level.
Our DSL allows one to define things, which are in charge of carrying out data analytics.
Our code generators can automatically produce the complete implementation in Java and Python.
- Score: 4.014524824655106
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we illustrate how to enhance an existing state-of-the-art
modeling language and tool for the Internet of Things (IoT), called ThingML, to
support machine learning at the modeling level. To this end, we extend the
Domain-Specific Language (DSL) of ThingML, as well as its code generation
framework. Our DSL allows one to define things that are in charge of carrying
out data analytics. Further, our code generators can automatically produce the
complete implementation in Java and Python. The generated Python code is
responsible for data analytics and employs the APIs of machine learning
libraries such as Keras, TensorFlow, and scikit-learn. Our prototype is
available as open-source software on GitHub.
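The abstract does not reproduce the generated code itself; the following is a minimal, hypothetical sketch of the kind of Python data-analytics script such a generator might emit for a thing in charge of training a predictive model. The function name, model architecture, and synthetic sensor data are illustrative assumptions, not actual ThingML2 output; only the use of Keras/TensorFlow and scikit-learn APIs is taken from the abstract.

```python
# Hypothetical sketch of generator-style Python analytics code (not actual
# ThingML2 output): trains a small Keras regressor on sensor-like features.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from tensorflow import keras


def train_predictor(features: np.ndarray, targets: np.ndarray) -> keras.Model:
    """Fit a small feed-forward network on scaled sensor readings."""
    x_train, x_val, y_train, y_val = train_test_split(
        features, targets, test_size=0.2, random_state=0)

    scaler = StandardScaler().fit(x_train)
    x_train, x_val = scaler.transform(x_train), scaler.transform(x_val)

    model = keras.Sequential([
        keras.layers.Input(shape=(features.shape[1],)),
        keras.layers.Dense(32, activation="relu"),
        keras.layers.Dense(1),  # single regression output, e.g. a forecast value
    ])
    model.compile(optimizer="adam", loss="mse")
    model.fit(x_train, y_train, validation_data=(x_val, y_val),
              epochs=10, batch_size=32, verbose=0)
    return model


if __name__ == "__main__":
    # Synthetic stand-in for data that the Java side of the thing would collect.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 4))
    y = X @ np.array([0.5, -1.0, 0.3, 2.0]) + rng.normal(scale=0.1, size=200)
    model = train_predictor(X, y)
    print("MSE on synthetic data:", model.evaluate(X, y, verbose=0))
```

Per the abstract, the generated Java code would cover the rest of the thing's behavior, while a script along these lines handles the data-analytics responsibilities.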
Related papers
- Deep Fast Machine Learning Utils: A Python Library for Streamlined Machine Learning Prototyping [0.0]
The Deep Fast Machine Learning Utils (DFMLU) library provides tools designed to automate and enhance aspects of machine learning processes.
DFMLU offers functionalities that support model development and data handling.
This manuscript presents an overview of DFMLU's functionalities, providing Python examples for each tool.
arXiv Detail & Related papers (2024-09-14T21:39:17Z)
- DistML.js: Installation-free Distributed Deep Learning Framework for Web Browsers [40.48978035180545]
"DistML.js" is a library designed for training and inference of machine learning models within web browsers.
We provide a comprehensive explanation of DistML.js's design, API, and implementation, alongside practical applications.
arXiv Detail & Related papers (2024-07-01T07:13:14Z)
- CMULAB: An Open-Source Framework for Training and Deployment of Natural Language Processing Models [59.91221728187576]
This paper introduces the CMU Linguistic Annotation Backend (CMULAB), an open-source framework that simplifies model deployment and continuous human-in-the-loop fine-tuning of NLP models.
CMULAB enables users to leverage the power of multilingual models to quickly adapt and extend existing tools for speech recognition, OCR, translation, and syntactic analysis to new languages.
arXiv Detail & Related papers (2024-04-03T02:21:46Z)
- Language Models are Universal Embedders [48.12992614723464]
We show that pre-trained transformer decoders can embed universally when finetuned on limited English data.
Our models achieve competitive performance on different embedding tasks with minimal training data.
These results provide evidence of a promising path towards building powerful unified embedders.
arXiv Detail & Related papers (2023-10-12T11:25:46Z)
- COMEX: A Tool for Generating Customized Source Code Representations [7.151800146054561]
COMEX is a framework that allows researchers and developers to create and combine multiple code-views.
It can analyze both method-level and program-level snippets using both intra-procedural and inter-procedural analyses.
It is built on tree-sitter, a widely used incremental parsing tool that supports over 40 languages.
arXiv Detail & Related papers (2023-07-10T16:46:34Z)
- CodeTF: One-stop Transformer Library for State-of-the-art Code LLM [72.1638273937025]
We present CodeTF, an open-source Transformer-based library for state-of-the-art Code LLMs and code intelligence.
Our library supports a collection of pretrained Code LLM models and popular code benchmarks.
We hope CodeTF is able to bridge the gap between machine learning/generative AI and software engineering.
arXiv Detail & Related papers (2023-05-31T05:24:48Z)
- Toolformer: Language Models Can Teach Themselves to Use Tools [62.04867424598204]
Language models (LMs) exhibit remarkable abilities to solve new tasks from just a few examples or textual instructions, especially at scale.
We show that LMs can teach themselves to use external tools via simple APIs and achieve the best of both worlds.
We introduce Toolformer, a model trained to decide which APIs to call, when to call them, what arguments to pass, and how to best incorporate the results into future token prediction.
arXiv Detail & Related papers (2023-02-09T16:49:57Z)
- OmniXAI: A Library for Explainable AI [98.07381528393245]
We introduce OmniXAI, an open-source Python library for eXplainable AI (XAI).
It offers omni-way explainable AI capabilities and various interpretable machine learning techniques.
For practitioners, the library provides an easy-to-use unified interface to generate the explanations for their applications.
arXiv Detail & Related papers (2022-06-01T11:35:37Z)
- A Systematic Evaluation of Large Language Models of Code [88.34057460577957]
Large language models (LMs) of code have recently shown tremendous promise in completing code and synthesizing code from natural language descriptions.
The current state-of-the-art code LMs are not publicly available, leaving many questions about their model and data design decisions.
Although Codex is not open-source, we find that existing open-source models do achieve close results in some programming languages.
We release a new model, PolyCoder, with 2.7B parameters based on the GPT-2 architecture, which was trained on 249GB of code across 12 programming languages on a single machine.
arXiv Detail & Related papers (2022-02-26T15:53:55Z)
- Data Analytics and Machine Learning Methods, Techniques and Tool for Model-Driven Engineering of Smart IoT Services [0.0]
This dissertation proposes a novel approach to enhance the development of smart services for the Internet of Things (IoT) and smart Cyber-Physical Systems (CPS).
The proposed approach brings abstraction and automation to the software engineering process, as well as to Data Analytics (DA) and Machine Learning (ML) practices.
We implement and validate the proposed approach by extending an open source modeling tool, called ThingML.
arXiv Detail & Related papers (2021-02-12T11:09:54Z)
- ThingML+ Augmenting Model-Driven Software Engineering for the Internet of Things with Machine Learning [4.511923587827301]
We present the current position of the research project ML-Quadrat, which aims to extend the methodology, modeling language and tool support of ThingML.
We argue that in many cases IoT/CPS services involve system components and physical processes whose behaviors are not understood well enough to be modeled using state machines.
We plan to support two target platforms for code generation, Apache SAMOA for Stream Processing and Apama for Complex Event Processing.
arXiv Detail & Related papers (2020-09-22T15:45:45Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.