PredictChain: Empowering Collaboration and Data Accessibility for AI in
a Decentralized Blockchain-based Marketplace
- URL: http://arxiv.org/abs/2307.15168v1
- Date: Thu, 27 Jul 2023 19:56:18 GMT
- Title: PredictChain: Empowering Collaboration and Data Accessibility for AI in
a Decentralized Blockchain-based Marketplace
- Authors: Matthew T. Pisano and Connor J. Patterson and Oshani Seneviratne
- Abstract summary: We propose a blockchain-based marketplace called "PredictChain" for predictive machine-learning models.
This marketplace enables users to upload datasets for training predictive machine learning models, request model training on previously uploaded datasets, or submit queries to trained models.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Limited access to computing resources and training data poses significant
challenges for individuals and groups aiming to train and utilize predictive
machine learning models. Although numerous publicly available machine learning
models exist, they are often unhosted, requiring end-users to set up their own
computational infrastructure. Alternatively, these models may only be
accessible through paid cloud-based mechanisms, which can prove costly for
general public utilization. Moreover, model and data providers require a more
streamlined approach to track resource usage and capitalize on subsequent usage
by others, both financially and otherwise. An effective mechanism is also
lacking to contribute high-quality data for improving model performance. We
propose a blockchain-based marketplace called "PredictChain" for predictive
machine-learning models to address these issues. This marketplace enables users
to upload datasets for training predictive machine learning models, request
model training on previously uploaded datasets, or submit queries to trained
models. Nodes within the blockchain network, equipped with available computing
resources, will operate these models, offering a range of archetype machine
learning models with varying characteristics, such as cost, speed, simplicity,
power, and cost-effectiveness. This decentralized approach empowers users to
develop improved models accessible to the public, promotes data sharing, and
reduces reliance on centralized cloud providers.
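The marketplace interactions described in the abstract (dataset upload, paid model training on contributed datasets, and paid queries whose fees reward both the operating node and the data uploader) can be sketched as follows. This is a minimal illustrative sketch only; all class, method, and account names are assumptions, not the paper's actual API or smart-contract logic, and the fee split is a placeholder.

```python
# Illustrative sketch of the PredictChain-style marketplace flow described above.
# All names (Marketplace, ModelArchetype, the 50/50 fee split) are hypothetical.
from dataclasses import dataclass, field

@dataclass
class ModelArchetype:
    name: str            # e.g. a simple, fast model vs. a powerful, costly one
    cost_per_query: int  # fee (in tokens) charged per query
    training_cost: int   # fee (in tokens) charged to train this archetype

@dataclass
class Marketplace:
    datasets: dict = field(default_factory=dict)  # dataset_id -> uploader account
    models: dict = field(default_factory=dict)    # model_id -> (archetype, dataset_id)
    balances: dict = field(default_factory=dict)  # account -> token balance

    def upload_dataset(self, uploader: str, dataset_id: str) -> None:
        """A user contributes a dataset that others can later train on."""
        self.datasets[dataset_id] = uploader

    def request_training(self, requester: str, dataset_id: str,
                         archetype: ModelArchetype, node: str) -> str:
        """Pay a node with spare compute to train an archetype on a dataset."""
        assert dataset_id in self.datasets, "dataset must be uploaded first"
        self._pay(requester, node, archetype.training_cost)
        model_id = f"{archetype.name}:{dataset_id}"
        self.models[model_id] = (archetype, dataset_id)
        return model_id

    def query_model(self, user: str, model_id: str, node: str) -> None:
        """Submit a query; the fee rewards the node and the data uploader."""
        archetype, dataset_id = self.models[model_id]
        fee = archetype.cost_per_query
        self._pay(user, node, fee // 2)                             # node's share
        self._pay(user, self.datasets[dataset_id], fee - fee // 2)  # uploader's share

    def _pay(self, src: str, dst: str, amount: int) -> None:
        """Naive token transfer standing in for an on-chain transaction."""
        self.balances[src] = self.balances.get(src, 0) - amount
        self.balances[dst] = self.balances.get(dst, 0) + amount
```

A short usage example under the same assumptions: a user uploads a dataset, another pays a node to train a cheap, fast archetype on it, and a third pays to query the resulting model, with the fee split between the node and the original uploader.

```python
market = Marketplace()
market.upload_dataset("alice", "housing")
fast = ModelArchetype("fast_linear", cost_per_query=2, training_cost=10)
model_id = market.request_training("bob", "housing", fast, node="node1")
market.query_model("carol", model_id, node="node1")
```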
Related papers
- Exploring Quantum Neural Networks for Demand Forecasting [0.25128687379089687]
This paper presents an approach for training demand prediction models using quantum neural networks.
A classical recurrent neural network was used as a baseline for comparison.
The results show similar predictive capacity between the classical and quantum models.
arXiv Detail & Related papers (2024-10-19T13:01:31Z)
- Update Selective Parameters: Federated Machine Unlearning Based on Model Explanation [46.86767774669831]
We propose a more effective and efficient federated unlearning scheme based on the concept of model explanation.
We select the most influential channels within an already-trained model for the data that need to be unlearned.
arXiv Detail & Related papers (2024-06-18T11:43:20Z)
- Fantastic Gains and Where to Find Them: On the Existence and Prospect of General Knowledge Transfer between Any Pretrained Model [74.62272538148245]
We show that for arbitrary pairings of pretrained models, one model extracts significant data context unavailable in the other.
We investigate if it is possible to transfer such "complementary" knowledge from one model to another without performance degradation.
arXiv Detail & Related papers (2023-10-26T17:59:46Z)
- How Can We Train Deep Learning Models Across Clouds and Continents? An Experimental Study [57.97785297481162]
We evaluate the cost and throughput implications of training in different zones, continents, and clouds for representative CV, NLP, and ASR models.
We show how leveraging spot pricing enables a new cost-efficient way to train models with multiple cheap instances, outperforming both centralized, more powerful hardware and on-demand cloud offerings at competitive prices.
arXiv Detail & Related papers (2023-06-05T18:17:37Z)
- iDML: Incentivized Decentralized Machine Learning [16.31868012716559]
We propose a novel blockchain-based incentive mechanism for completely decentralized and opportunistic learning architectures.
We leverage a smart contract not only for providing explicit incentives to end devices to participate but also to create a fully decentralized mechanism to inspect and reflect on the behavior of the learning architecture.
arXiv Detail & Related papers (2023-04-10T17:28:51Z)
- Synthetic Model Combination: An Instance-wise Approach to Unsupervised Ensemble Learning [92.89846887298852]
Consider making a prediction over new test data without any opportunity to learn from a training set of labelled data, given access only to a set of expert models and their predictions, alongside some limited information about the datasets used to train them.
arXiv Detail & Related papers (2022-10-11T10:20:31Z)
- Learnware: Small Models Do Big [69.88234743773113]
The prevailing big-model paradigm, which has achieved impressive results in natural language processing and computer vision applications, has not yet addressed those issues and has become a serious source of carbon emissions.
This article offers an overview of the learnware paradigm, which aims to spare users from building machine learning models from scratch, with the hope of reusing small models for tasks even beyond their original purposes.
arXiv Detail & Related papers (2022-10-07T15:55:52Z)
- Integration of a machine learning model into a decision support tool to predict absenteeism at work of prospective employees [0.0]
Productivity losses caused by absenteeism at work cost U.S. employers billions of dollars each year.
This study develops a decision support tool to predict absenteeism among prospective employees.
arXiv Detail & Related papers (2022-02-02T03:49:01Z)
- A Marketplace for Trading AI Models based on Blockchain and Incentives for IoT Data [24.847898465750667]
An emerging paradigm in Machine Learning (ML) is a federated approach where the learning model is delivered to a group of heterogeneous agents partially, allowing agents to train the model locally with their own data.
The problem of valuation of models, as well as the questions of incentives for collaborative training and trading of data/models, have received limited treatment in the literature.
In this paper, a new ecosystem of ML model trading over a trusted ML-based network is proposed. Buyers can acquire the model of interest from the ML market, and interested sellers spend local computations on their data to enhance that model's quality.
arXiv Detail & Related papers (2021-12-06T08:52:42Z)
- Decentralized Federated Learning Preserves Model and Data Privacy [77.454688257702]
We propose a fully decentralized approach that allows knowledge to be shared between trained models.
Students are trained on the output of their teachers via synthetically generated input data.
The results show that an untrained student model, trained on the teacher's output, reaches F1-scores comparable to those of the teacher.
arXiv Detail & Related papers (2021-02-01T14:38:54Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.