On the Effects of Data Heterogeneity on the Convergence Rates of
Distributed Linear System Solvers
- URL: http://arxiv.org/abs/2304.10640v2
- Date: Fri, 16 Feb 2024 00:02:49 GMT
- Title: On the Effects of Data Heterogeneity on the Convergence Rates of
Distributed Linear System Solvers
- Authors: Boris Velasevic, Rohit Parasnis, Christopher G. Brinton, Navid Azizan
- Abstract summary: We consider the problem of solving a large-scale system of linear equations with the help of a set of machines.
We compare two classes of algorithms with a particular focus on the most efficient method from each class.
Our analysis yields a number of novel insights, in addition to showing that APC is the most efficient method in realistic scenarios.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We consider the fundamental problem of solving a large-scale system of linear
equations. In particular, we consider the setting where a taskmaster intends to
solve the system in a distributed/federated fashion with the help of a set of
machines, each of which has a subset of the equations. Although there exist several
approaches for solving this problem, missing is a rigorous comparison between
the convergence rates of the projection-based methods and those of the
optimization-based ones. In this paper, we analyze and compare these two
classes of algorithms with a particular focus on the most efficient method from
each class, namely, the recently proposed Accelerated Projection-Based
Consensus (APC) and the Distributed Heavy-Ball Method (D-HBM). To this end, we
first propose a geometric notion of data heterogeneity called angular
heterogeneity and discuss its generality. Using this notion, we bound and
compare the convergence rates of the studied algorithms and capture the effects
of both cross-machine and local data heterogeneity on these quantities. Our
analysis yields a number of novel insights, in addition to showing that APC is the
most efficient method in realistic scenarios with large data heterogeneity. Our
numerical analyses validate our theoretical results.
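To make the setting concrete, here is a small NumPy sketch contrasting the two classes of algorithms on a partitioned system Ax = b, where machine i holds the block (A_i, b_i). The plain (unaccelerated) projection-consensus loop and the heavy-ball loop below are simplified illustrations of the two classes, not the paper's APC or D-HBM implementations; the step sizes and the four-machine split are arbitrary choices for the example.

```python
import numpy as np

def local_projection(x, A_i, b_i):
    """Project x onto machine i's local solution set {z : A_i z = b_i}."""
    return x + np.linalg.pinv(A_i) @ (b_i - A_i @ x)

def projection_consensus(A_blocks, b_blocks, iters=500):
    """Plain (unaccelerated) projection-based consensus: each machine projects
    the taskmaster's iterate onto its own equations; the taskmaster averages."""
    x = np.zeros(A_blocks[0].shape[1])
    for _ in range(iters):
        x = np.mean([local_projection(x, A, b)
                     for A, b in zip(A_blocks, b_blocks)], axis=0)
    return x

def distributed_heavy_ball(A_blocks, b_blocks, lr=1e-3, momentum=0.9, iters=2000):
    """Optimization-based alternative: heavy-ball iterations on the global
    least-squares objective f(x) = sum_i 0.5 * ||A_i x - b_i||^2, where each
    machine contributes its local gradient A_i^T (A_i x - b_i)."""
    x = np.zeros(A_blocks[0].shape[1])
    v = np.zeros_like(x)
    for _ in range(iters):
        grad = sum(A.T @ (A @ x - b) for A, b in zip(A_blocks, b_blocks))
        v = momentum * v - lr * grad
        x = x + v
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((40, 20))
    x_star = rng.standard_normal(20)
    b = A @ x_star                       # consistent system with solution x_star
    A_blocks, b_blocks = np.split(A, 4), np.split(b, 4)   # 4 machines, 10 rows each
    print(np.linalg.norm(projection_consensus(A_blocks, b_blocks) - x_star))
    print(np.linalg.norm(distributed_heavy_ball(A_blocks, b_blocks) - x_star))
```

In the projection-based class, each round projects the taskmaster's iterate onto every machine's local solution set and averages the results; in the optimization-based class, the machines return local gradients and the taskmaster takes a momentum step. How fast each loop converges depends on the geometry of the local equation blocks, which is what the paper's notion of angular heterogeneity captures.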
Related papers
- Exact and Heuristic Algorithms for Constrained Biclustering [0.0]
Biclustering, also known as co-clustering or two-way clustering, simultaneously partitions the rows and columns of a data matrix to reveal submatrices with coherent patterns.
We study constrained biclustering with pairwise constraints, namely must-link and cannot-link constraints, which specify whether objects should belong to the same or different biclusters.
arXiv Detail & Related papers (2025-08-07T15:29:22Z) - Similarity-based fuzzy clustering scientific articles: potentials and challenges from mathematical and computational perspectives [0.0]
Fuzzy clustering plays a vital role in analyzing publication data.
This problem can be formulated as a constrained optimization model, where the goal is to minimize the discrepancy between the similarity observed from data and the similarity derived from a predicted distribution.
We analyze potentials and challenges of the approach from both mathematical and computational perspectives.
arXiv Detail & Related papers (2025-06-04T15:10:31Z) - Quantized Hierarchical Federated Learning: A Robust Approach to
Statistical Heterogeneity [3.8798345704175534]
We present a novel hierarchical federated learning algorithm that incorporates quantization for communication-efficiency.
We offer a comprehensive analytical framework to evaluate its optimality gap and convergence rate.
Our findings reveal that our algorithm consistently achieves high learning accuracy over a range of parameters.
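The summary does not specify the quantizer, so as a purely assumed illustration, the sketch below shows a standard unbiased stochastic quantizer of the kind commonly used to compress model updates in communication-efficient federated learning; it is not the paper's actual scheme.

```python
import numpy as np

def stochastic_quantize(v, num_levels=16, rng=None):
    """Unbiased stochastic quantization of a vector onto `num_levels` uniform
    levels per coordinate (QSGD-style); only the norm, signs, and integer
    level indices would need to be communicated."""
    rng = rng if rng is not None else np.random.default_rng()
    norm = np.linalg.norm(v)
    if norm == 0.0:
        return np.zeros_like(v)
    scaled = np.abs(v) / norm * num_levels          # values in [0, num_levels]
    lower = np.floor(scaled)
    # Round up with probability equal to the fractional part, so E[q] = v.
    levels = lower + (rng.random(v.shape) < (scaled - lower))
    return np.sign(v) * levels / num_levels * norm

# Sanity check: averaging many quantized copies recovers the original vector.
rng = np.random.default_rng(0)
v = rng.standard_normal(5)
print(v)
print(np.mean([stochastic_quantize(v, rng=rng) for _ in range(10000)], axis=0))
```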
arXiv Detail & Related papers (2024-03-03T15:40:24Z) - Distributed Markov Chain Monte Carlo Sampling based on the Alternating
Direction Method of Multipliers [143.6249073384419]
In this paper, we propose a distributed sampling scheme based on the alternating direction method of multipliers.
We provide both theoretical guarantees of our algorithm's convergence and experimental evidence of its superiority to the state-of-the-art.
In simulation, we deploy our algorithm on linear and logistic regression tasks and illustrate its fast convergence compared to existing gradient-based methods.
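As background on the alternating direction method of multipliers in a distributed setting, the sketch below applies standard consensus ADMM to a distributed least-squares problem. It only illustrates the splitting structure (local proximal steps, global averaging, dual updates); the cited paper applies this kind of splitting to MCMC sampling, not to the deterministic solve shown here.

```python
import numpy as np

def consensus_admm_lstsq(A_blocks, b_blocks, rho=1.0, iters=200):
    """Consensus ADMM for min_x sum_i 0.5 * ||A_i x - b_i||^2, rewritten as
    min sum_i f_i(x_i) subject to x_i = z for all workers i (scaled dual form)."""
    n = A_blocks[0].shape[1]
    m = len(A_blocks)
    z = np.zeros(n)                       # global (master) variable
    xs = [np.zeros(n) for _ in range(m)]  # local copies
    us = [np.zeros(n) for _ in range(m)]  # scaled dual variables
    for _ in range(iters):
        # Local proximal steps, available in closed form for least squares.
        for i, (A, b) in enumerate(zip(A_blocks, b_blocks)):
            xs[i] = np.linalg.solve(A.T @ A + rho * np.eye(n),
                                    A.T @ b + rho * (z - us[i]))
        # Global update: average of local primal plus dual variables.
        z = np.mean([x + u for x, u in zip(xs, us)], axis=0)
        # Dual ascent on the consensus constraints x_i = z.
        for i in range(m):
            us[i] = us[i] + xs[i] - z
    return z

rng = np.random.default_rng(0)
A = rng.standard_normal((60, 10))
b = A @ rng.standard_normal(10)
print(consensus_admm_lstsq(np.split(A, 3), np.split(b, 3)))
```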
arXiv Detail & Related papers (2024-01-29T02:08:40Z) - Distributed Linear Regression with Compositional Covariates [5.085889377571319]
We focus on the distributed sparse penalized linear log-contrast model in massive compositional data.
Two distributed optimization techniques are proposed for solving the two different constrained convex optimization problems.
In the decentralized topology, we introduce a distributed coordinate-wise descent algorithm for obtaining a communication-efficient regularized estimation.
arXiv Detail & Related papers (2023-10-21T11:09:37Z) - Comparison of Single- and Multi- Objective Optimization Quality for
Evolutionary Equation Discovery [77.34726150561087]
Evolutionary differential equation discovery has proved to be a tool for obtaining equations with fewer a priori assumptions.
The proposed comparison approach is demonstrated on classical model examples: the Burgers equation, the wave equation, and the Korteweg-de Vries equation.
arXiv Detail & Related papers (2023-06-29T15:37:19Z) - Linearization Algorithms for Fully Composite Optimization [61.20539085730636]
This paper studies first-order algorithms for solving fully composite optimization problems over convex compact sets.
We leverage the structure of the objective by handling the differentiable and non-differentiable parts separately, linearizing only the smooth parts.
arXiv Detail & Related papers (2023-02-24T18:41:48Z) - Amortized Implicit Differentiation for Stochastic Bilevel Optimization [53.12363770169761]
We study a class of algorithms for solving bilevel optimization problems in both deterministic and stochastic settings.
We exploit a warm-start strategy to amortize the estimation of the exact gradient.
By using this framework, our analysis shows these algorithms to match the computational complexity of methods that have access to an unbiased estimate of the gradient.
arXiv Detail & Related papers (2021-11-29T15:10:09Z) - Joint Network Topology Inference via Structured Fusion Regularization [70.30364652829164]
Joint network topology inference represents a canonical problem of learning multiple graph Laplacian matrices from heterogeneous graph signals.
We propose a general graph estimator based on a novel structured fusion regularization.
We show that the proposed graph estimator enjoys both high computational efficiency and rigorous theoretical guarantee.
arXiv Detail & Related papers (2021-03-05T04:42:32Z) - Communication-efficient distributed eigenspace estimation [31.69089186688224]
We develop a communication-efficient distributed algorithm for computing the leading invariant subspace of a data matrix.
Our algorithm uses a novel alignment scheme that minimizes the Procrustean distance between local solutions and a reference solution.
We show that our algorithm achieves a similar error rate to that of a centralized estimator.
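For context on the alignment step, the orthogonal Procrustes problem has a classical closed-form solution via an SVD. The `procrustes_align` helper below is a generic sketch of such an alignment, assumed here for illustration rather than taken from the paper.

```python
import numpy as np

def procrustes_align(V_local, V_ref):
    """Solve min_{R orthogonal} ||V_local @ R - V_ref||_F for an n x k local basis
    and a reference basis: R = U @ Wt, where U, _, Wt = svd(V_local.T @ V_ref)."""
    U, _, Wt = np.linalg.svd(V_local.T @ V_ref)
    return V_local @ (U @ Wt)

# Toy example: a local basis spanning the same subspace as the reference, but in
# a rotated coordinate frame, is mapped back onto the reference basis.
rng = np.random.default_rng(0)
V_ref, _ = np.linalg.qr(rng.standard_normal((50, 3)))   # reference basis
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))        # arbitrary basis rotation
V_local = V_ref @ Q
print(np.linalg.norm(procrustes_align(V_local, V_ref) - V_ref))  # ~0
```

Aligning each machine's local basis to a common reference before averaging removes the rotation and sign ambiguities that otherwise make naive averaging of subspace estimates meaningless.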
arXiv Detail & Related papers (2020-09-05T02:11:22Z) - Asymptotic Analysis of an Ensemble of Randomly Projected Linear
Discriminants [94.46276668068327]
In [1], an ensemble of randomly projected linear discriminants is used to classify datasets.
We develop a consistent estimator of the misclassification probability as an alternative to the computationally-costly cross-validation estimator.
We also demonstrate the use of our estimator for tuning the projection dimension on both real and synthetic data.
arXiv Detail & Related papers (2020-04-17T12:47:04Z)
This list is automatically generated from the titles and abstracts of the papers on this site.