Secure Decentralized Pliable Index Coding for Target Data Size
- URL: http://arxiv.org/abs/2602.03579v1
- Date: Tue, 03 Feb 2026 14:28:19 GMT
- Title: Secure Decentralized Pliable Index Coding for Target Data Size
- Authors: Anjali Padmanabhan, Danya Arun Bindhu, Nujoom Sageer Karat, Shanuja Sasi
- Abstract summary: We propose a transmission scheme that coordinates client broadcasts to maximize coding efficiency. We impose a strict security constraint that no client acquires more than the target number $T$ of messages. We analyze the communication cost incurred by the proposed scheme under this security constraint.
- Score: 0.6999740786886535
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The Decentralized Pliable Index Coding (DPIC) problem addresses efficient information exchange in distributed systems where clients communicate among themselves without a central server. An important consideration in DPIC is the heterogeneity of side-information and demand sizes. Many prior works assume homogeneous settings with identical side-information cardinalities and single-message demands, but these assumptions limit real-world applicability, since clients typically possess unequal amounts of prior information. In this paper, we study the DPIC problem under heterogeneous side-information cardinalities. We propose a transmission scheme that coordinates client broadcasts to maximize coding efficiency while ensuring that each client reaches a common target level $T$. In addition, we impose a strict security constraint that no client acquires more than $T$ messages, guaranteeing that each client ends up with exactly $T$ messages. We analyze the communication cost incurred by the proposed scheme under this security constraint.
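To make the setting concrete, here is a minimal toy sketch in Python. It illustrates the problem constraints only, not the paper's transmission scheme: clients with unequal side information receive broadcasts until each holds exactly $T$ messages, and no client is ever allowed to exceed $T$. The message set, the side-information assignment, and the greedy broadcast choice are all invented for illustration.

```python
# Toy model of the secure DPIC constraints (NOT the paper's scheme):
# clients broadcast uncoded messages greedily, and a client stores a
# received message only while it holds fewer than the target T messages.
# Messages, side-information sets, and the schedule are assumptions.

T = 3  # common target: every client must end with exactly T messages

# Heterogeneous side information: client -> set of message indices it holds.
side_info = {
    "c1": {0},          # holds 1 message
    "c2": {0, 1},       # holds 2 messages
    "c3": {1, 2, 3},    # already at the target T = 3
}
all_messages = {0, 1, 2, 3, 4}

transmissions = 0
while any(len(s) < T for s in side_info.values()):
    # Pick a message that some under-target client is missing and that
    # some client holds (the toy data guarantees one always exists).
    needed = [m for m in all_messages
              if any(len(s) < T and m not in s for s in side_info.values())
              and any(m in s for s in side_info.values())]
    m = needed[0]
    transmissions += 1
    for s in side_info.values():
        # Security constraint: a saturated client gains nothing, so no
        # client ever exceeds T messages.
        if len(s) < T and m not in s:
            s.add(m)

assert all(len(s) == T for s in side_info.values())
print(f"{transmissions} broadcasts; final holdings: {side_info}")
```

In the actual problem the security guarantee must come from the code design itself (a saturated client must be unable to decode anything new); the sketch only models the required end state, namely that every client finishes with exactly $T$ messages.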
Related papers
- DP-CSGP: Differentially Private Stochastic Gradient Push with Compressed Communication [71.60998478544028]
We propose Differentially Private Stochastic Gradient Push with Compressed communication (DP-CSGP) for decentralized learning over graphs. For general non-convex and smooth objective functions, we show that our algorithm maintains high accuracy while keeping communication efficient.
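As a rough illustration of the ingredients named in the title (differential privacy plus compressed communication), here is a hedged Python sketch of one client's update: clip the local gradient, add Gaussian noise, then top-k compress before transmission. The clipping rule, noise scale, compressor, and the noise-before-compression ordering are assumptions, not details taken from the DP-CSGP paper.

```python
import numpy as np

# Hypothetical single step of a noisy, compressed gradient update on one
# node. Hyperparameters and the ordering of noise vs. compression are
# illustrative assumptions, not the DP-CSGP algorithm itself.

rng = np.random.default_rng(0)

def top_k(v, k):
    """Keep the k largest-magnitude entries, zero the rest (compression)."""
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]
    out[idx] = v[idx]
    return out

def dp_compressed_gradient(grad, clip=1.0, sigma=0.5, k=2):
    # Clip so the Gaussian mechanism yields a differential-privacy guarantee.
    g = grad * min(1.0, clip / max(np.linalg.norm(grad), 1e-12))
    # Gaussian noise for privacy, then compression for cheap links.
    g = g + rng.normal(0.0, sigma * clip, size=g.shape)
    return top_k(g, k)

local_grad = np.array([0.9, -0.1, 0.4, 0.03])
print(dp_compressed_gradient(local_grad))
```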
arXiv Detail & Related papers (2025-12-15T17:37:02Z) - Towards Federated Clustering: A Client-wise Private Graph Aggregation Framework [57.04850867402913]
Federated clustering addresses the challenge of extracting patterns from decentralized, unlabeled data. We propose Structural Privacy-Preserving Federated Graph Clustering (SPP-FGC), a novel algorithm that innovatively leverages local structural graphs as the primary medium for privacy-preserving knowledge sharing. Our framework achieves state-of-the-art performance, improving clustering accuracy by up to 10% (NMI) over federated baselines while maintaining provable privacy guarantees.
arXiv Detail & Related papers (2025-11-14T03:05:22Z) - STT-GS: Sample-Then-Transmit Edge Gaussian Splatting with Joint Client Selection and Power Control [77.56170394100022]
Edge Gaussian splatting (EGS) aggregates data from distributed clients and trains a global GS model at the edge server. This paper formulates a novel GS-oriented objective function that distinguishes the view contributions of different clients. It is found that the GS-oriented objective can be accurately predicted with low sampling ratios.
arXiv Detail & Related papers (2025-10-15T06:20:47Z) - Information-Theoretic Decentralized Secure Aggregation with Collusion Resilience [95.33295072401832]
We study the problem of decentralized secure aggregation (DSA) from an information-theoretic perspective. We characterize the optimal rate region, which specifies the minimum achievable communication and secret key rates for DSA. Our results establish the fundamental performance limits of DSA, providing insights for the design of provably secure and communication-efficient protocols.
arXiv Detail & Related papers (2025-08-01T12:51:37Z) - Optimizing Cross-Client Domain Coverage for Federated Instruction Tuning of Large Language Models [87.49293964617128]
Federated domain-specific instruction tuning (FedDIT) for large language models (LLMs) aims to enhance performance in specialized domains using distributed private and limited data. We empirically establish that cross-client domain coverage, rather than data heterogeneity, is the pivotal factor. We introduce FedDCA, an algorithm that explicitly maximizes this coverage through diversity-oriented client center selection and retrieval-based augmentation.
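Coverage maximization of this kind is commonly built on a greedy max-coverage routine. The Python sketch below shows that generic idea on an invented client-to-domain map; it is not FedDCA's actual selection or retrieval procedure.

```python
# Greedy max-coverage sketch of diversity-oriented client center selection.
# The client -> domains map is an invented example, and FedDCA's actual
# selection and retrieval-based augmentation are not shown.

client_domains = {
    "c1": {"law", "finance"},
    "c2": {"medicine"},
    "c3": {"finance", "medicine", "code"},
    "c4": {"law"},
}

def greedy_centers(clients, budget):
    covered, chosen = set(), []
    for _ in range(budget):
        # Take the client that adds the most not-yet-covered domains.
        best = max(clients, key=lambda c: len(clients[c] - covered))
        if not clients[best] - covered:
            break  # nothing new left to cover
        chosen.append(best)
        covered |= clients[best]
    return chosen, covered

print(greedy_centers(client_domains, budget=2))  # picks c3, then a law client
```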
arXiv Detail & Related papers (2024-09-30T09:34:31Z) - Communication-Efficient Federated Knowledge Graph Embedding with Entity-Wise Top-K Sparsification [49.66272783945571]
Federated Knowledge Graph Embedding learning (FKGE) encounters challenges in communication efficiency stemming from the considerable size of parameters and extensive communication rounds.
We propose FedS, a bidirectional communication-efficient method based on an Entity-Wise Top-K Sparsification strategy.
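The core idea of entity-wise top-K sparsification can be sketched in a few lines: upload only the K entity embedding rows that changed the most during local training, rather than the whole table. The shapes, the value of K, and the row-wise L2 change metric below are assumptions for illustration.

```python
import numpy as np

# Illustrative entity-wise top-K sparsification for federated KG embedding:
# send only the K entity rows whose embeddings moved the most locally.

rng = np.random.default_rng(0)
num_entities, dim, K = 6, 4, 2

prev = rng.normal(size=(num_entities, dim))        # embeddings before training
curr = prev + rng.normal(scale=0.1, size=prev.shape)
curr[1] += 1.0                                      # entities 1 and 4 change a lot
curr[4] -= 1.0

delta = curr - prev
row_change = np.linalg.norm(delta, axis=1)          # per-entity update magnitude
top = np.argsort(row_change)[-K:]                   # K most-updated entities
payload = {int(e): delta[e] for e in top}           # sparse upload: id -> row

print("uploading entities:", sorted(payload))       # expect [1, 4]
```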
arXiv Detail & Related papers (2024-06-19T05:26:02Z) - Federated Multi-Target Domain Adaptation [99.93375364579484]
Federated learning methods enable us to train machine learning models on distributed user data while preserving its privacy.
We consider a more practical scenario where the distributed client data is unlabeled, and a centralized labeled dataset is available on the server.
We propose an effective DualAdapt method to address the new challenges.
arXiv Detail & Related papers (2021-08-17T17:53:05Z) - DeFed: A Principled Decentralized and Privacy-Preserving Federated Learning Algorithm [10.487593244018933]
Federated learning enables a large number of clients to participate in learning a shared model while keeping the training data stored at each client.
Here we propose a principled decentralized federated learning algorithm (DeFed), which removes the central client in the classical Federated Averaging (FedAvg) setting.
The proposed algorithm is proven to reach the global minimum with a convergence rate of $O(1/T)$ when the loss function is smooth and strongly convex, where $T$ is the number of iterations in gradient descent.
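For reference, a generic $O(1/T)$ guarantee for an $L$-smooth, $\mu$-strongly convex objective $f$ with minimizer $x^\ast$ has the following shape; the constant in DeFed's actual bound depends on quantities (graph topology, step sizes) not shown here.

```latex
% Generic shape of an O(1/T) rate; C is a problem-dependent constant,
% and DeFed's own constant is not reproduced here.
f(x_T) - f(x^\ast) \le \frac{C(\mu, L, x_0)}{T}
```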
arXiv Detail & Related papers (2021-07-15T07:39:19Z) - Decentralized Federated Averaging [17.63112147669365]
Federated averaging (FedAvg) is a communication-efficient algorithm for distributed training with an enormous number of clients.
We study the decentralized FedAvg with momentum (DFedAvgM), which is implemented on clients that are connected by an undirected graph.
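A hedged Python sketch of one such round: each client mixes its model with its graph neighbors' models via a doubly stochastic matrix, then takes a local momentum-SGD step. The ring topology, uniform mixing weights, and quadratic local losses are invented for illustration and are not the paper's setup.

```python
import numpy as np

# Illustrative rounds of decentralized FedAvg with momentum: gossip mixing
# over an undirected ring, then a local heavy-ball SGD step per client.

rng = np.random.default_rng(0)
n, dim = 4, 3
x = rng.normal(size=(n, dim))        # one model per client
m = np.zeros_like(x)                 # momentum buffers
targets = rng.normal(size=(n, dim))  # client i minimizes ||x_i - target_i||^2
lr, beta = 0.1, 0.9

# Ring graph: client i averages with itself and neighbors i-1, i+1.
W = np.zeros((n, n))
for i in range(n):
    for j in (i - 1, i, i + 1):
        W[i, j % n] = 1.0 / 3.0      # doubly stochastic mixing weights

for _ in range(50):
    x = W @ x                        # mixing step over the graph
    grad = x - targets               # gradient of each local quadratic loss
    m = beta * m + grad              # momentum update
    x = x - lr * m                   # local model update

print("consensus spread:", np.linalg.norm(x - x.mean(axis=0)))
```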
arXiv Detail & Related papers (2021-04-23T02:01:30Z) - Shuffled Model of Federated Learning: Privacy, Communication and Accuracy Trade-offs [30.58690911428577]
We consider a distributed empirical risk minimization (ERM) optimization problem with communication efficiency and privacy requirements.
We develop (optimal) communication-efficient schemes for private mean estimation for several $\ell_p$ spaces.
We demonstrate that one can achieve the same privacy and optimization-performance operating point as recent methods that use full-precision communication.
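For contrast, the Python sketch below shows the naive local-DP baseline for mean estimation: each client adds Laplace noise to its bounded value before reporting, and the analyst averages the noisy reports. Shuffled-model protocols such as those in the paper obtain better privacy-accuracy-communication trade-offs than this baseline; the epsilon and value bounds here are assumptions.

```python
import numpy as np

# Naive local-DP mean estimation (baseline, NOT the paper's shuffled-model
# scheme): per-client Laplace noise, then a plain average at the analyst.

rng = np.random.default_rng(0)
n, eps = 10_000, 1.0
data = rng.uniform(0.0, 1.0, size=n)   # each client's value lies in [0, 1]

sensitivity = 1.0                       # values bounded in [0, 1]
noisy = data + rng.laplace(scale=sensitivity / eps, size=n)

print("true mean:", data.mean(), "private estimate:", noisy.mean())
```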
arXiv Detail & Related papers (2020-08-17T09:41:04Z)