Combined Learning of Neural Network Weights for Privacy in Collaborative
Tasks
- URL: http://arxiv.org/abs/2205.00361v1
- Date: Sat, 30 Apr 2022 22:40:56 GMT
- Title: Combined Learning of Neural Network Weights for Privacy in Collaborative
Tasks
- Authors: Aline R. Ioste, Alan M. Durham, Marcelo Finger
- Abstract summary: CoLN, Combined Learning of Neural network weights, is a novel method to securely combine Machine Learning models over sensitive data with no sharing of data.
CoLN can contribute to secure collaborative research, as required in the medical area, where privacy issues preclude data sharing.
- Score: 1.1172382217477126
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We introduce CoLN, Combined Learning of Neural network weights, a novel
method to securely combine Machine Learning models over sensitive data with no
sharing of data. With CoLN, local hosts use the same Neural Network
architecture and base parameters to train a model using only locally available
data. Locally trained models are then submitted to a combining agent, which
produces a combined model. The new model's parameters can be sent back to
hosts, and can then be used as initial parameters for a new training iteration.
CoLN is capable of combining several distributed neural networks of the same
kind but is not restricted to any single neural architecture. In this paper we
detail the combination algorithm and present experiments with feed-forward,
convolutional, and recurrent Neural Network architectures, showing that the
CoLN combined model approximates the performance of a hypothetical ideal
centralized model, trained using the combination of the local datasets. CoLN
can contribute to secure collaborative research, as required in the medical
area, where privacy issues preclude data sharing, but where the limitations of
local data demand information derived from larger datasets.
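The abstract describes CoLN as a round-based protocol: hosts that share an architecture and base parameters train locally, a combining agent merges the submitted weights, and the combined parameters seed the next local training iteration. The sketch below is a minimal illustration of that loop, not the paper's method: the combining step here is a size-weighted parameter average, which is an assumption standing in for CoLN's actual combination algorithm, and the model, data, and hyperparameters are synthetic.
```python
# Minimal sketch of a CoLN-style training round. NOT the paper's combination
# algorithm: the size-weighted average in combine() is an assumed stand-in.
import copy
import torch
import torch.nn as nn

def make_model() -> nn.Module:
    # Every host starts from the same architecture and base parameters.
    torch.manual_seed(0)
    return nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 2))

def local_train(model: nn.Module, x, y, epochs: int = 5) -> nn.Module:
    # Each host trains only on its locally available data; raw data never leaves.
    opt = torch.optim.SGD(model.parameters(), lr=0.1)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss_fn(model(x), y).backward()
        opt.step()
    return model

def combine(models, sizes):
    # Hypothetical combining agent: average each parameter tensor, weighted by
    # local dataset size (an assumption; CoLN defines its own combination rule).
    total = float(sum(sizes))
    combined = copy.deepcopy(models[0])
    with torch.no_grad():
        for name, tensor in combined.state_dict().items():
            tensor.copy_(sum(m.state_dict()[name] * (n / total)
                             for m, n in zip(models, sizes)))
    return combined

# Two simulated hosts with private synthetic datasets of different sizes.
datasets = [(torch.randn(60, 10), torch.randint(0, 2, (60,))),
            (torch.randn(40, 10), torch.randint(0, 2, (40,)))]

global_model = make_model()
for _ in range(3):  # combine, send parameters back, retrain, repeat
    local_models = [local_train(copy.deepcopy(global_model), x, y) for x, y in datasets]
    global_model = combine(local_models, [len(x) for x, _ in datasets])
```
In a deployment along the lines the abstract describes, only the serialized parameters would travel to the combining agent; the local datasets never leave their hosts.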
Related papers
- Federated Learning with Projected Trajectory Regularization [65.6266768678291]
Federated learning enables joint training of machine learning models from distributed clients without sharing their local data.
One key challenge in federated learning is to handle non-identically distributed data across the clients.
We propose a novel federated learning framework with projected trajectory regularization (FedPTR) to tackle this data heterogeneity issue.
arXiv Detail & Related papers (2023-12-22T02:12:08Z)
- Set-based Neural Network Encoding Without Weight Tying [91.37161634310819]
We propose a neural network weight encoding method for network property prediction.
Our approach is capable of encoding neural networks in a model zoo of mixed architectures.
We introduce two new tasks for neural network property prediction: cross-dataset and cross-architecture.
arXiv Detail & Related papers (2023-05-26T04:34:28Z)
- Privacy-Preserving Ensemble Infused Enhanced Deep Neural Network Framework for Edge Cloud Convergence [18.570317928688606]
In this paper we propose a privacy-preserving, ensemble-infused, enhanced Deep Neural Network (DNN) learning framework.
In this edge-cloud convergence, the edge server both stores the IoT-produced bioimages and hosts the algorithm for local model training.
We conduct several experiments to evaluate the performance of our proposed framework.
arXiv Detail & Related papers (2023-05-16T07:01:44Z)
- Neural Attentive Circuits [93.95502541529115]
We introduce a general-purpose yet modular neural architecture called Neural Attentive Circuits (NACs).
NACs learn the parameterization and a sparse connectivity of neural modules without using domain knowledge.
NACs achieve an 8x speedup at inference time while losing less than 3% performance.
arXiv Detail & Related papers (2022-10-14T18:00:07Z)
- Canoe: A System for Collaborative Learning for Neural Nets [4.547883122787855]
Canoe is a framework that facilitates knowledge transfer for neural networks.
Canoe provides new system support for dynamically extracting significant parameters from a helper node's neural network.
The evaluation of Canoe with different PyTorch neural network models demonstrates that the knowledge transfer mechanism improves the model's adaptiveness by up to 3.5x compared to learning in isolation.
arXiv Detail & Related papers (2021-08-27T05:30:15Z)
- Clustered Federated Learning via Generalized Total Variation Minimization [83.26141667853057]
We study optimization methods to train local (or personalized) models for local datasets with a decentralized network structure.
Our main conceptual contribution is to formulate federated learning as generalized total variation (GTV) minimization.
Our main algorithmic contribution is a fully decentralized federated learning algorithm.
arXiv Detail & Related papers (2021-05-26T18:07:19Z)
- MLDS: A Dataset for Weight-Space Analysis of Neural Networks [0.0]
We present MLDS, a new dataset consisting of thousands of trained neural networks with carefully controlled parameters.
This dataset enables new insights into both model-to-model and model-to-training-data relationships.
arXiv Detail & Related papers (2021-04-21T14:24:26Z)
- Probabilistic Federated Learning of Neural Networks Incorporated with Global Posterior Information [4.067903810030317]
In federated learning, models trained on local clients are distilled into a global model.
We propose a new method which extends the Probabilistic Federated Neural Matching.
Our new method outperforms popular state-of-the-art federated learning methods in both the single-communication-round and multiple-communication-round settings.
arXiv Detail & Related papers (2020-12-06T03:54:58Z)
- Exploiting Heterogeneity in Operational Neural Networks by Synaptic Plasticity [87.32169414230822]
The recently proposed Operational Neural Networks (ONNs) generalize conventional Convolutional Neural Networks (CNNs).
This study focuses on searching for the best-possible operator set(s) for the network's hidden neurons, based on the Synaptic Plasticity paradigm that underlies learning in biological neurons.
Experimental results over highly challenging problems demonstrate that the elite ONNs, even with few neurons and layers, can achieve learning performance superior to that of GIS-based ONNs.
arXiv Detail & Related papers (2020-08-21T19:03:23Z)
- Consensus Driven Learning [0.0]
We propose a new method of distributed, decentralized learning that allows a network of nodes to coordinate their training using asynchronous updates over an unreliable network.
This is achieved by taking inspiration from Distributed Averaging Consensus algorithms to coordinate the various nodes.
We show that our coordination method allows models to be learned on highly biased datasets, and in the presence of intermittent communication failure.
arXiv Detail & Related papers (2020-05-20T18:24:19Z)
- Model Fusion via Optimal Transport [64.13185244219353]
We present a layer-wise model fusion algorithm for neural networks.
We show that this can successfully yield "one-shot" knowledge transfer between neural networks trained on heterogeneous non-i.i.d. data (a simplified alignment-and-averaging sketch appears after this list).
arXiv Detail & Related papers (2019-10-12T22:07:15Z)
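Several entries above merge independently trained weights; "Model Fusion via Optimal Transport", for instance, fuses models layer-wise after aligning their neurons. The toy sketch below illustrates only the general alignment-then-average idea: it substitutes a hard one-to-one assignment (scipy's linear_sum_assignment) for the paper's optimal-transport couplings, and the two-layer MLP, the cost function, and the helper names (random_mlp, fuse) are invented for illustration.
```python
# Toy sketch of layer-wise fusion with neuron alignment. Simplification of the
# optimal-transport approach: a hard assignment replaces soft OT couplings.
import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(0)
d_in, d_hidden, d_out = 4, 6, 3

def random_mlp():
    # One hidden layer: y = W2 @ relu(W1 @ x + b1) + b2
    return {"W1": rng.normal(size=(d_hidden, d_in)),
            "b1": rng.normal(size=d_hidden),
            "W2": rng.normal(size=(d_out, d_hidden)),
            "b2": rng.normal(size=d_out)}

def fuse(a, b):
    # Match each hidden unit of model b to a hidden unit of model a by the
    # squared distance between their incoming weights (plus bias), then average.
    feat_a = np.hstack([a["W1"], a["b1"][:, None]])
    feat_b = np.hstack([b["W1"], b["b1"][:, None]])
    cost = ((feat_a[:, None, :] - feat_b[None, :, :]) ** 2).sum(axis=-1)
    _, perm = linear_sum_assignment(cost)      # perm[i] = b-unit matched to a-unit i
    b_aligned = {"W1": b["W1"][perm], "b1": b["b1"][perm],
                 "W2": b["W2"][:, perm], "b2": b["b2"]}
    # Averaging without this permutation would blend unrelated hidden units.
    return {k: 0.5 * (a[k] + b_aligned[k]) for k in a}

fused = fuse(random_mlp(), random_mlp())
print({k: v.shape for k, v in fused.items()})
```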
This list is automatically generated from the titles and abstracts of the papers in this site.