Distributed Gaussian Mean Estimation under Communication Constraints:
Optimal Rates and Communication-Efficient Algorithms
- URL: http://arxiv.org/abs/2001.08877v1
- Date: Fri, 24 Jan 2020 04:19:47 GMT
- Title: Distributed Gaussian Mean Estimation under Communication Constraints:
Optimal Rates and Communication-Efficient Algorithms
- Authors: T. Tony Cai, Hongji Wei
- Abstract summary: Minimax rates of convergence, which characterize the tradeoff between the communication costs and statistical accuracy, are established.
Communication-efficient and statistically optimal procedures are developed.
- Score: 2.842794675894731
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We study distributed estimation of a Gaussian mean under communication
constraints in a decision theoretical framework. Minimax rates of convergence,
which characterize the tradeoff between the communication costs and statistical
accuracy, are established in both the univariate and multivariate settings.
Communication-efficient and statistically optimal procedures are developed. In
the univariate case, the optimal rate depends only on the total communication
budget, so long as each local machine has at least one bit. However, in the
multivariate case, the minimax rate depends on the specific allocations of the
communication budgets among the local machines.
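
To make the one-bit-per-machine regime concrete, the following is a minimal sketch in Python of a classical sign-based protocol (an illustrative assumption, not the authors' procedure): each machine transmits the single bit 1{X_i >= 0}, and the center inverts the Gaussian CDF, assuming the standard deviation sigma is known.

import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
theta, sigma, m = 0.3, 1.0, 100_000   # true mean, known sd, number of machines

# Local step: each machine observes X_i ~ N(theta, sigma^2) and sends one
# bit, so the total communication budget is m bits.
bits = (rng.normal(theta, sigma, size=m) >= 0.0).astype(int)

# Center step: the fraction of ones estimates P(X >= 0) = Phi(theta/sigma),
# so theta_hat = sigma * Phi^{-1}(p_hat). Clipping keeps the inverse finite.
p_hat = np.clip(bits.mean(), 1e-6, 1 - 1e-6)
theta_hat = sigma * norm.ppf(p_hat)
print(f"theta_hat = {theta_hat:.4f} (true theta = {theta})")

This toy scheme attains the parametric rate up to constants when theta lies within a few sigma of the threshold; it is meant only to illustrate why one bit per machine can suffice in the univariate case.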
Although optimal estimation of a Gaussian mean is relatively simple in the
conventional setting, it is quite involved under the communication constraints,
both in terms of the optimal procedure design and lower bound argument. The
techniques developed in this paper can be of independent interest. An essential
step is the decomposition of the minimax estimation problem into two stages,
localization and refinement. This critical decomposition provides a framework
for both the lower bound analysis and optimal procedure design.
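
As an illustration of the localization-refinement decomposition, here is a hedged two-stage sketch in Python; the one-bit localizer, the 4-sigma refinement window, and the stochastic-rounding quantizer are illustrative assumptions rather than the estimator constructed in the paper.

import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
theta, sigma = 0.3, 1.0
m1, m2, b = 20_000, 80_000, 4   # machines per stage, bits per stage-2 machine

# Stage 1 (localization): one sign bit per machine yields a coarse estimate,
# accurate to O(sigma / sqrt(m1)) when theta is within a few sigma of 0.
x1 = rng.normal(theta, sigma, size=m1)
p_hat = np.clip((x1 >= 0).mean(), 1e-6, 1 - 1e-6)
theta0 = sigma * norm.ppf(p_hat)

# Stage 2 (refinement): quantize each sample to b bits on a grid centered at
# theta0; stochastic rounding keeps each message unbiased for the clipped sample.
lo, hi = theta0 - 4 * sigma, theta0 + 4 * sigma
levels, step = 2 ** b, (8 * sigma) / (2 ** b - 1)
x2 = np.clip(rng.normal(theta, sigma, size=m2), lo, hi)
frac = (x2 - lo) / step
low = np.floor(frac)
up = low + (rng.random(m2) < (frac - low))         # round up w.p. the fractional part
messages = np.minimum(up, levels - 1).astype(int)  # b-bit payloads

# Center: dequantize and average the refined reports.
theta_hat = (lo + messages * step).mean()
print(f"theta_hat = {theta_hat:.4f} (true theta = {theta})")

The sketch mirrors the abstract's framework: stage 1 pins theta down to an interval of width O(sigma), after which each refinement bit roughly halves the quantization error, so the remaining budget is spent where it is statistically useful.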
Related papers
- Optimal Data Splitting in Distributed Optimization for Machine Learning [85.99744701008802]
This study derives the optimal ratio for splitting data between the server and local machines under arbitrary communication and local computation costs.
The running times of the network are compared under uniform and optimal data distributions.
The superior theoretical performance of our solutions is experimentally validated.
arXiv Detail & Related papers (2024-01-15T16:30:12Z) - Quantization for decentralized learning under subspace constraints [61.59416703323886]
We consider decentralized optimization problems where agents have individual cost functions to minimize subject to subspace constraints.
We propose and study an adaptive decentralized strategy where the agents employ differential randomized quantizers to compress their estimates.
The analysis shows that, under some general conditions on the quantization noise, the strategy is stable both in terms of mean-square error and average bit rate.
arXiv Detail & Related papers (2022-09-16T09:38:38Z) - Distributed Nonparametric Estimation under Communication Constraints [0.0]
We provide a general framework for understanding the behavior of distributed estimation under communication constraints.
We derive minimax lower and matching upper bounds in the distributed regression, density estimation, classification, Poisson regression and volatility estimation models.
arXiv Detail & Related papers (2022-04-21T19:04:50Z) - Escaping Saddle Points with Bias-Variance Reduced Local Perturbed SGD
for Communication Efficient Nonconvex Distributed Learning [58.79085525115987]
Local methods are a promising approach to reducing communication time.
We show that their communication complexity is better than that of non-local methods when the heterogeneity of the local datasets is smaller than the smoothness of the local loss.
arXiv Detail & Related papers (2022-02-12T15:12:17Z) - Acceleration in Distributed Optimization Under Similarity [72.54787082152278]
We study distributed (strongly convex) optimization problems over a network of agents, with no centralized nodes.
An $\varepsilon$-solution is achieved in $\tilde{\mathcal{O}}\big(\sqrt{\frac{\beta/\mu}{1-\rho}}\,\log(1/\varepsilon)\big)$ communication steps.
This rate matches, for the first time and up to poly-log factors, the lower communication complexity bounds of distributed gossip algorithms applied to the class of problems of interest.
arXiv Detail & Related papers (2021-10-24T04:03:00Z) - DESTRESS: Computation-Optimal and Communication-Efficient Decentralized
Nonconvex Finite-Sum Optimization [43.31016937305845]
Internet-of-things, networked sensing, autonomous systems and federated learning call for decentralized algorithms for finite-sum optimization.
We develop DEcentralized STochastic REcurSive gradient methodS (DESTRESS) for nonconvex finite-sum optimization.
Detailed theoretical and numerical comparisons show that DESTRESS improves upon prior decentralized algorithms.
arXiv Detail & Related papers (2021-10-04T03:17:41Z) - Distributed Nonparametric Function Estimation: Optimal Rate of
Convergence and Cost of Adaptation [1.332560004325655]
Distributed minimax estimation and distributed adaptive estimation under communication constraints are studied.
We quantify the exact communication cost for adaptation and construct an optimally adaptive procedure for distributed estimation.
arXiv Detail & Related papers (2021-07-01T02:16:16Z) - Log-Likelihood Ratio Minimizing Flows: Towards Robust and Quantifiable
Neural Distribution Alignment [52.02794488304448]
We propose a new distribution alignment method based on a log-likelihood ratio statistic and normalizing flows.
We experimentally verify that minimizing the resulting objective results in domain alignment that preserves the local structure of input domains.
arXiv Detail & Related papers (2020-03-26T22:10:04Z) - A Unified Theory of Decentralized SGD with Changing Topology and Local
Updates [70.9701218475002]
We introduce a unified convergence analysis of decentralized communication methods.
We derive universal convergence rates for several applications.
Our proofs rely on weak assumptions.
arXiv Detail & Related papers (2020-03-23T17:49:15Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the content above (including all information) and is not responsible for any consequences of its use.