Compression and Data Similarity: Combination of Two Techniques for
Communication-Efficient Solving of Distributed Variational Inequalities
- URL: http://arxiv.org/abs/2206.09446v1
- Date: Sun, 19 Jun 2022 16:38:56 GMT
- Title: Compression and Data Similarity: Combination of Two Techniques for
Communication-Efficient Solving of Distributed Variational Inequalities
- Authors: Aleksandr Beznosikov, Alexander Gasnikov
- Abstract summary: In this paper we consider a combination of two popular approaches: compression and data similarity.
We show that this synergy can be more effective than each approach alone in solving distributed smooth strongly monotone variational inequalities.
- Score: 137.6408511310322
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Variational inequalities are an important tool that encompasses minimization,
saddle-point, game, and fixed-point problems. Modern large-scale and computationally
expensive practical applications have made distributed methods for solving these
problems popular. Meanwhile, most distributed systems share a fundamental issue: the
communication bottleneck. There are various techniques to deal with it. In
particular, in this paper we consider a combination of two popular approaches:
compression and data similarity. We show that this synergy can be more
effective than each of the approaches separately in solving distributed smooth
strongly monotone variational inequalities. Experiments confirm the
theoretical conclusions.
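The abstract does not specify which compression operator the method uses, but a standard unbiased choice in this literature is rand-k sparsification: each client transmits only k randomly chosen coordinates of its message, rescaled so the compressed vector is correct in expectation. The sketch below is illustrative only (the function name `rand_k` and the parameters are my own, not from the paper):

```python
import numpy as np

def rand_k(x, k, rng):
    """Unbiased rand-k sparsification: keep k random coordinates of x
    and scale them by d/k, so that E[rand_k(x)] = x."""
    d = x.size
    out = np.zeros_like(x, dtype=float)
    idx = rng.choice(d, size=k, replace=False)  # coordinates to transmit
    out[idx] = x[idx] * (d / k)                 # rescale for unbiasedness
    return out

# Averaging many independent compressed copies recovers the original
# vector, which is the property the convergence analyses rely on.
rng = np.random.default_rng(0)
x = np.arange(1.0, 9.0)  # d = 8
avg = np.mean([rand_k(x, 2, rng) for _ in range(20000)], axis=0)
print(np.max(np.abs(avg - x)))  # small deviation: the estimator is unbiased
```

Sending k of d coordinates cuts per-round communication by roughly a factor of d/k, at the price of extra variance; combining such compression with data similarity across clients is what the paper argues reduces the total communication further.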
Related papers
- Accelerated Stochastic ExtraGradient: Mixing Hessian and Gradient Similarity to Reduce Communication in Distributed and Federated Learning [50.382793324572845]
Distributed computing involves communication between devices, which requires solving two key problems: efficiency and privacy.
In this paper, we analyze a new method that incorporates data similarity and client sampling.
To address privacy concerns, we apply the technique of additional noise and analyze its impact on the convergence of the proposed method.
arXiv Detail & Related papers (2024-09-22T00:49:10Z) - Similarity, Compression and Local Steps: Three Pillars of Efficient Communications for Distributed Variational Inequalities [91.12425544503395]
Variational inequalities are used in various applications ranging from equilibrium search to adversarial learning.
Most distributed approaches have a bottleneck - the cost of communications.
The three main techniques to reduce the total number of communication rounds and the cost of one such round are the similarity of local functions, compression of transmitted information, and local updates.
The methods presented in this paper have the best known theoretical guarantees on communication complexity and significantly outperform other methods for distributed variational inequalities.
arXiv Detail & Related papers (2023-02-15T12:11:27Z) - SARAH-based Variance-reduced Algorithm for Stochastic Finite-sum
Cocoercive Variational Inequalities [137.6408511310322]
We consider finite-sum cocoercive variational inequalities.
For strongly monotone problems, the proposed SARAH-based method achieves linear convergence to a solution.
arXiv Detail & Related papers (2022-10-12T08:04:48Z) - Distributed Methods with Compressed Communication for Solving
Variational Inequalities, with Theoretical Guarantees [115.08148491584997]
We present the first theoretically grounded distributed methods for solving variational inequalities and saddle point problems using compressed communication: MASHA1 and MASHA2.
The new algorithms support bidirectional compression and can also be adapted to settings with batches and to federated learning with partial client participation.
arXiv Detail & Related papers (2021-10-07T10:04:32Z) - Joint Wasserstein Distribution Matching [89.86721884036021]
The joint distribution matching (JDM) problem, which aims to learn bidirectional mappings that match the joint distributions of two domains, arises in many machine learning and computer vision applications.
We propose to address the JDM problem by minimizing the Wasserstein distance between the joint distributions of the two domains.
We then prove a theorem that reduces this intractable problem to a simple optimization problem, and develop a novel method to solve it.
arXiv Detail & Related papers (2020-03-01T03:39:00Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.