Federated Graph-based Networks with Shared Embedding
- URL: http://arxiv.org/abs/2210.01803v1
- Date: Mon, 3 Oct 2022 12:51:15 GMT
- Title: Federated Graph-based Networks with Shared Embedding
- Authors: Tianyi Yu, Pei Lai, Fei Teng
- Abstract summary: We propose Federated Graph-based Networks with Shared Embedding (Feras), which uses shared embedding data to train the network and avoids the direct sharing of original data.
Feras enables the training of current graph-based models in the federated learning framework to address privacy concerns.
- Score: 1.323497585762675
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Nowadays, user privacy is becoming an issue that system developers cannot bypass, especially in web applications, where data can be easily transferred over the internet. Thankfully, federated learning offers an innovative way to train models on distributed devices while the data remain in local storage. However, unlike general neural networks, graph-based networks owe their great success in classification tasks and advanced recommendation systems to the rich context provided by the graph structure, which is vulnerable when data attributes are incomplete. Incomplete attributes therefore become a realistic problem when implementing federated learning for graph-based networks. Knowing that a data embedding is a representation of the data in a different space, we propose Federated Graph-based Networks with Shared Embedding (Feras), which uses shared embedding data to train the network and avoids direct sharing of the original data. A solid theoretical proof of the convergence of Feras is given in this work. Experiments on different datasets (PPI, Flickr, Reddit) are conducted to show the efficiency of Feras compared with centralized learning. Finally, Feras enables the training of current graph-based models in the federated learning framework to address privacy concerns.
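To make the idea concrete, here is a minimal sketch of the embedding-sharing pattern (in PyTorch, with invented names such as LocalEncoder and share_embeddings; this is not the paper's implementation): each client encodes its private node features locally, and only the resulting embeddings leave the device.

```python
import torch
import torch.nn as nn

class LocalEncoder(nn.Module):
    """Maps a client's private node features to a shared embedding space."""
    def __init__(self, in_dim: int, emb_dim: int):
        super().__init__()
        self.proj = nn.Linear(in_dim, emb_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.relu(self.proj(x))

def share_embeddings(client_features, encoders):
    """Each client encodes locally; only embeddings are transmitted."""
    shared = []
    for x, enc in zip(client_features, encoders):
        with torch.no_grad():           # raw features never leave the client
            shared.append(enc(x))
    return torch.cat(shared, dim=0)     # server-side pool of embeddings

# Toy usage: three clients, each holding a private 5x16 feature matrix.
feats = [torch.randn(5, 16) for _ in range(3)]
encoders = [LocalEncoder(16, 8) for _ in range(3)]
pool = share_embeddings(feats, encoders)
print(pool.shape)  # torch.Size([15, 8])
```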
Related papers
- Privacy-preserving design of graph neural networks with applications to vertical federated learning [56.74455367682945]
We present an end-to-end graph representation learning framework called VESPER.
VESPER is capable of training high-performance GNN models over both sparse and dense graphs under reasonable privacy budgets.
arXiv Detail & Related papers (2023-10-31T15:34:59Z)
- Distribution Shift Matters for Knowledge Distillation with Webly Collected Images [91.66661969598755]
We propose a novel method dubbed "Knowledge Distillation between Different Distributions" (KD$^3$).
We first dynamically select useful training instances from the webly collected data according to the combined predictions of the teacher and student networks.
We also build a new contrastive learning block called MixDistribution to generate perturbed data with a new distribution for instance alignment.
arXiv Detail & Related papers (2023-07-21T10:08:58Z)
- Benchmarking FedAvg and FedCurv for Image Classification Tasks [1.376408511310322]
This paper focuses on the problem of statistical heterogeneity of the data in the same federated network.
Several Federated Learning algorithms, such as FedAvg, FedProx and Federated Curvature (FedCurv), have already been proposed.
As a side product of this work, we release the non-IID versions of the datasets we used, so as to facilitate further comparisons from the FL community.
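For reference, the FedAvg server step is a data-size-weighted average of the clients' parameters; the sketch below shows that aggregation step only (a simplified NumPy illustration, not code from the paper).

```python
import numpy as np

def fedavg(client_weights, client_sizes):
    """FedAvg aggregation: average client parameter vectors,
    weighted by each client's number of local training samples."""
    coeffs = np.array(client_sizes) / sum(client_sizes)
    return coeffs @ np.stack(client_weights)  # (n_clients,) @ (n_clients, dim)

# Toy usage: three clients with unequal amounts of local data.
weights = [np.full(4, w) for w in (1.0, 2.0, 3.0)]
sizes = [100, 300, 600]
print(fedavg(weights, sizes))  # [2.5 2.5 2.5]: pulled toward the largest client
```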
arXiv Detail & Related papers (2023-03-31T10:13:01Z)
- SMARTQUERY: An Active Learning Framework for Graph Neural Networks through Hybrid Uncertainty Reduction [25.77052028238513]
We propose a framework to learn a graph neural network with very few labeled nodes using a hybrid uncertainty reduction function.
We demonstrate the competitive performance of our method against state-of-the-art methods with very few labeled data.
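SMARTQUERY's hybrid uncertainty reduction function is its own contribution; as a generic illustration of the uncertainty-driven selection it builds on, the sketch below queries the nodes with the highest predictive entropy (mock predictions and invented names, not the paper's method).

```python
import torch

def select_uncertain_nodes(probs: torch.Tensor, k: int) -> torch.Tensor:
    """Active-learning query: return indices of the k nodes whose
    predicted class distribution has the highest entropy."""
    entropy = -(probs * probs.clamp_min(1e-12).log()).sum(dim=1)
    return entropy.topk(k).indices

probs = torch.softmax(torch.randn(10, 3), dim=1)  # mock GNN class predictions
print(select_uncertain_nodes(probs, k=2))  # the 2 most uncertain node indices
```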
arXiv Detail & Related papers (2022-12-02T20:49:38Z)
- Federated Learning on Non-IID Graphs via Structural Knowledge Sharing [47.140441784462794]
Federated graph learning (FGL) enables clients to train strong GNN models in a distributed manner without sharing their private data.
We propose FedStar, an FGL framework that extracts and shares the common underlying structure information for inter-graph learning tasks.
We perform extensive experiments over both cross-dataset and cross-domain non-IID FGL settings, demonstrating FedStar's superiority.
arXiv Detail & Related papers (2022-11-23T15:12:16Z)
- Scalable Collaborative Learning via Representation Sharing [53.047460465980144]
Federated learning (FL) and Split Learning (SL) are two frameworks that enable collaborative learning while keeping the data private (on device).
In FL, each data holder trains a model locally and releases it to a central server for aggregation.
In SL, the clients must release individual cut-layer activations (smashed data) to the server and wait for its response (during both inference and backpropagation).
In this work, we present a novel approach for privacy-preserving machine learning, where the clients collaborate via online knowledge distillation using a contrastive loss.
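The FL/SL contrast can be made concrete: in SL the client runs only the layers up to the cut and transmits the cut-layer activations, while the raw inputs stay on device. A minimal PyTorch illustration follows (the layer sizes and split point are arbitrary assumptions).

```python
import torch
import torch.nn as nn

# Illustrative split-learning forward pass: the client computes activations
# up to the cut layer and sends only those (the "smashed data") to the server.
client_part = nn.Sequential(nn.Linear(32, 16), nn.ReLU())            # on device
server_part = nn.Sequential(nn.Linear(16, 8), nn.ReLU(), nn.Linear(8, 2))

x_private = torch.randn(4, 32)      # raw data never leaves the client
smashed = client_part(x_private)    # cut-layer activations to transmit
logits = server_part(smashed)       # server completes the forward pass
print(smashed.shape, logits.shape)  # torch.Size([4, 16]) torch.Size([4, 2])
```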
arXiv Detail & Related papers (2022-11-20T10:49:22Z)
- FedEgo: Privacy-preserving Personalized Federated Graph Learning with Ego-graphs [22.649780281947837]
In some practical scenarios, graph data are stored separately across multiple distributed parties and may not be directly shared due to conflicts of interest.
We propose FedEgo, a federated graph learning framework based on ego-graphs to tackle the challenges above.
arXiv Detail & Related papers (2022-08-29T15:47:36Z)
- Privatized Graph Federated Learning [57.14673504239551]
We introduce graph federated learning, which consists of multiple units connected by a graph.
We show how graph homomorphic perturbations can be used to ensure the algorithm is differentially private.
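Graph homomorphic perturbation is specific to this paper's construction; as a generic stand-in for the broader pattern, the sketch below clips a shared update and adds Gaussian noise before release, the standard Gaussian-mechanism recipe in differentially private learning (illustrative parameters, not the paper's mechanism).

```python
import numpy as np

rng = np.random.default_rng(0)

def privatize(update, clip_norm=1.0, noise_std=0.5):
    """Clip an update to bound its sensitivity, then add Gaussian noise
    before it is shared -- the standard Gaussian-mechanism pattern."""
    scale = min(1.0, clip_norm / max(np.linalg.norm(update), 1e-12))
    return update * scale + rng.normal(0.0, noise_std, size=update.shape)

print(privatize(np.array([3.0, 4.0])))  # clipped to unit norm, then noised
```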
arXiv Detail & Related papers (2022-03-14T13:48:23Z)
- SpreadGNN: Serverless Multi-task Federated Learning for Graph Neural Networks [13.965982814292971]
Graph Neural Networks (GNNs) are the first-choice methods for graph machine learning problems.
Centralizing a massive amount of real-world graph data for GNN training is prohibitive due to user-side privacy concerns.
This work proposes SpreadGNN, a novel multi-task federated training framework.
arXiv Detail & Related papers (2021-06-04T22:20:47Z)
- Graph-Based Neural Network Models with Multiple Self-Supervised Auxiliary Tasks [79.28094304325116]
Graph Convolutional Networks are among the most promising approaches for capturing relationships among structured data points.
We propose three novel self-supervised auxiliary tasks to train graph-based neural network models in a multi-task fashion.
arXiv Detail & Related papers (2020-11-14T11:09:51Z)
This list is automatically generated from the titles and abstracts of the papers in this site.