Roughness-Informed Federated Learning
- URL: http://arxiv.org/abs/2602.10595v1
- Date: Wed, 11 Feb 2026 07:35:45 GMT
- Title: Roughness-Informed Federated Learning
- Authors: Mohammad Partohaghighi, Roummel Marcia, Bruce J. West, YangQuan Chen
- Abstract summary: Federated Learning (FL) enables collaborative model training across distributed clients. FL faces challenges in non-independent and identically distributed (non-IID) settings due to client drift. We propose RI-FedAvg, a novel FL algorithm that mitigates client drift by incorporating a Roughness Index (RI)-based regularization term.
- Score: 3.8218584696400484
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Federated Learning (FL) enables collaborative model training across distributed clients while preserving data privacy, yet faces challenges in non-independent and identically distributed (non-IID) settings due to client drift, which impairs convergence. We propose RI-FedAvg, a novel FL algorithm that mitigates client drift by incorporating a Roughness Index (RI)-based regularization term into the local objective, adaptively penalizing updates based on the fluctuations of local loss landscapes. The method leverages the RI to quantify the roughness of high-dimensional loss functions, ensuring robust optimization in heterogeneous settings. We provide a rigorous convergence analysis for non-convex objectives, establishing that RI-FedAvg converges to a stationary point under standard assumptions. Extensive experiments on MNIST, CIFAR-10, and CIFAR-100 demonstrate that RI-FedAvg outperforms state-of-the-art baselines, including FedAvg, FedProx, FedDyn, SCAFFOLD, and DP-FedAvg, achieving higher accuracy and faster convergence in non-IID scenarios. Our results highlight RI-FedAvg's potential to enhance the robustness and efficiency of federated learning in practical, heterogeneous environments.
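The abstract describes an RI-weighted regularization term in the local objective but does not spell out its form. A minimal sketch, assuming the RI is approximated from fluctuations of recent local losses and enters as an adaptive proximal weight (both assumptions; `roughness_index` and `local_step` are hypothetical names, not the paper's notation):

```python
import numpy as np

def roughness_index(loss_history):
    """Proxy roughness measure: normalized total variation of recent local
    losses. (Illustrative stand-in; the paper's exact RI definition may differ.)"""
    diffs = np.abs(np.diff(loss_history))
    return diffs.sum() / (np.abs(loss_history).sum() + 1e-12)

def local_step(w, w_global, grad_fn, loss_history, lr=0.1, mu=0.5):
    """One regularized local SGD step, sketching an assumed local objective
    F_i(w) + (mu/2) * RI_i * ||w - w_global||^2: the rougher the local loss
    landscape, the more strongly the update is pulled toward the global model."""
    ri = roughness_index(loss_history)
    g = grad_fn(w) + mu * ri * (w - w_global)
    return w - lr * g
```

The intended behavior is that clients whose loss curves fluctuate heavily (a symptom of drift-prone, heterogeneous data) take more conservative steps anchored to the global model, while smooth clients optimize their local objective nearly unconstrained.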
Related papers
- FedZMG: Efficient Client-Side Optimization in Federated Learning [0.19116784879310023]
Federated Zero Mean Gradients (FedZMG) is a parameter-free, client-side optimization algorithm designed to tackle client drift. FedZMG projects local gradients onto a zero-mean hyperplane, effectively neutralizing the "intensity" or "bias" shifts inherent in heterogeneous data distributions.
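The projection described above is simple to state concretely. A minimal NumPy sketch (the function name is ours, and we assume the hyperplane is taken coordinate-wise over the flattened gradient, which the abstract does not specify):

```python
import numpy as np

def zero_mean_project(g):
    """Project a flattened local gradient onto the zero-mean hyperplane
    {v : mean(v) = 0} by subtracting its mean, removing the uniform shift
    component attributed to heterogeneous data."""
    g = np.asarray(g, dtype=float)
    return g - g.mean()
```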
arXiv Detail & Related papers (2026-02-20T17:45:28Z)
- Fractional-Order Federated Learning [4.1751058176413105]
Federated learning (FL) allows remote clients to train a global model collaboratively while protecting client privacy. Despite its privacy-preserving benefits, FL has significant drawbacks, including slow convergence, high communication cost, and non-independent-and-identically-distributed (non-IID) data.
arXiv Detail & Related papers (2026-02-17T06:25:23Z)
- Adaptive Dual-Weighting Framework for Federated Learning via Out-of-Distribution Detection [53.45696787935487]
Federated Learning (FL) enables collaborative model training across large-scale distributed service nodes. In real-world service-oriented deployments, data generated by heterogeneous users, devices, and application scenarios are inherently non-IID. We propose FLood, a novel FL framework inspired by out-of-distribution (OOD) detection.
arXiv Detail & Related papers (2026-02-01T05:54:59Z)
- ETR: Outcome-Guided Elastic Trust Regions for Policy Optimization [6.716883192613149]
We propose Elastic Trust Regions (ETR), a dynamic mechanism that aligns optimization constraints with signal quality. ETR consistently outperforms GRPO, achieving superior accuracy while effectively mitigating policy entropy degradation.
arXiv Detail & Related papers (2026-01-07T09:19:53Z)
- Knowledge-Informed Neural Network for Complex-Valued SAR Image Recognition [51.03674130115878]
We introduce the Knowledge-Informed Neural Network (KINN), a lightweight framework built upon a novel "compression-aggregation-compression" architecture. KINN establishes a state-of-the-art in parameter-efficient recognition, offering exceptional generalization in data-scarce and out-of-distribution scenarios.
arXiv Detail & Related papers (2025-10-23T07:12:26Z)
- FedGPS: Statistical Rectification Against Data Heterogeneity in Federated Learning [103.45987800174724]
Federated Learning (FL) confronts a significant challenge known as data heterogeneity, which impairs model performance and convergence. We propose FedGPS, a novel framework that seamlessly integrates statistical distribution and gradient information from others.
arXiv Detail & Related papers (2025-10-23T06:10:11Z)
- Distributionally Robust Federated Learning with Outlier Resilience [8.69285602685459]
We study distributionally robust federated learning with explicit outlier resilience. We reformulate the problem as a tractable Lagrangian penalty optimization, which admits robustness certificates. Building on this reformulation, we propose the distributionally outlier-robust federated learning algorithm and establish its convergence guarantees.
arXiv Detail & Related papers (2025-09-29T08:42:12Z)
- NDCG-Consistent Softmax Approximation with Accelerated Convergence [67.10365329542365]
We propose novel loss formulations that align directly with ranking metrics. We integrate the proposed RG losses with the highly efficient Alternating Least Squares (ALS) optimization method. Empirical evaluations on real-world datasets demonstrate that our approach achieves comparable or superior ranking performance.
arXiv Detail & Related papers (2025-06-11T06:59:17Z)
- Mobilizing Personalized Federated Learning in Infrastructure-Less and Heterogeneous Environments via Random Walk Stochastic ADMM [0.14597673707346284]
This paper explores the challenges of implementing Federated Learning (FL) in practical scenarios featuring isolated nodes with data heterogeneity.
To overcome these challenges, we propose a novel mobilizing personalized FL approach, which aims to facilitate mobility and resilience.
We develop a novel optimization algorithm called Random Walk Stochastic Alternating Direction Method of Multipliers (RWSADMM).
arXiv Detail & Related papers (2023-04-25T03:00:18Z)
- Uncertainty-Aware Source-Free Adaptive Image Super-Resolution with Wavelet Augmentation Transformer [60.31021888394358]
Unsupervised Domain Adaptation (UDA) can effectively address domain gap issues in real-world image Super-Resolution (SR).
We propose a SOurce-free Domain Adaptation framework for image SR (SODA-SR) to address this issue, i.e., adapt a source-trained model to a target domain with only unlabeled target data.
arXiv Detail & Related papers (2023-03-31T03:14:44Z)
- Adaptive Federated Learning via New Entropy Approach [14.595709494370372]
Federated Learning (FL) has emerged as a prominent distributed machine learning framework.
In this paper, we propose an adaptive FEDerated learning algorithm based on ENTropy theory (FedEnt) to alleviate the parameter deviation among heterogeneous clients.
arXiv Detail & Related papers (2023-03-27T07:57:04Z)
- FedSkip: Combatting Statistical Heterogeneity with Federated Skip Aggregation [95.85026305874824]
We introduce a data-driven approach called FedSkip to improve the client optima by periodically skipping federated averaging and scattering local models to the cross devices.
We conduct extensive experiments on a range of datasets to demonstrate that FedSkip achieves much higher accuracy, better aggregation efficiency, and competitive communication efficiency.
arXiv Detail & Related papers (2022-12-14T13:57:01Z)
- Stochastic Optimization of Areas Under Precision-Recall Curves with Provable Convergence [66.83161885378192]
Area under the ROC curve (AUROC) and area under the precision-recall curve (AUPRC) are common metrics for evaluating classification performance on imbalanced problems.
We propose a technical method to optimize AUPRC for deep learning.
arXiv Detail & Related papers (2021-04-18T06:22:21Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.