arxiv:2501.05737

Efficient Gradient Tracking Algorithms for Distributed Optimization Problems with Inexact Communication

Published on Jan 10, 2025

Abstract

Distributed optimization problems usually face inexact communication issues induced by communication quantization, differential privacy protection, or channel noise. Most existing algorithms require a two-timescale setting of the gradient-descent stepsize and the noise-suppression parameter to ensure convergence to the optimal solution. In this paper, we propose two single-timescale algorithms, VRA-DGT and VRA-DSGT, for distributed deterministic and stochastic optimization problems with inexact communication, respectively. VRA-DGT integrates the Variance-Reduced Aggregation (VRA) mechanism with the distributed gradient tracking framework and achieves a convergence rate of O(k^{-1}) in the mean-square sense when the objective function is strongly convex and smooth. For the distributed stochastic optimization problem, VRA-DSGT, which introduces a hybrid variance reduction technique into VRA-DGT, maintains the O(k^{-1}) convergence rate for strongly convex and smooth objective functions. Simulation experiments on a logistic regression problem with real-world data verify the effectiveness of the proposed algorithms.
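
To make the setting concrete, below is a minimal sketch of a standard distributed gradient-tracking loop in which additive noise corrupts every transmitted message, mimicking quantization, privacy noise, or a noisy channel. This is not the paper's VRA-DGT update; the quadratic local objectives, ring topology, agent count, stepsize `alpha`, and noise level `noise_std` are all illustrative assumptions.

```python
# Minimal sketch (assumed setup, not the paper's VRA-DGT algorithm):
# standard distributed gradient tracking over a ring graph, with additive
# noise injected into every transmitted message to model inexact communication.
import numpy as np

rng = np.random.default_rng(0)
n_agents, dim, noise_std, alpha, iters = 5, 3, 1e-2, 0.05, 500

# Each agent i holds a private strongly convex quadratic f_i(x) = 0.5*||A_i x - b_i||^2.
A = [rng.normal(size=(10, dim)) for _ in range(n_agents)]
b = [rng.normal(size=10) for _ in range(n_agents)]
grad = lambda i, x: A[i].T @ (A[i] @ x - b[i])

# Doubly stochastic mixing matrix for a ring graph (self plus two neighbors).
W = np.zeros((n_agents, n_agents))
for i in range(n_agents):
    W[i, i] = 1 / 3
    W[i, (i - 1) % n_agents] = 1 / 3
    W[i, (i + 1) % n_agents] = 1 / 3

x = np.zeros((n_agents, dim))                            # local decision variables
y = np.array([grad(i, x[i]) for i in range(n_agents)])   # gradient trackers

for k in range(iters):
    # Messages are perturbed before mixing: neighbors only see noisy copies.
    x_msg = x + noise_std * rng.normal(size=x.shape)
    y_msg = y + noise_std * rng.normal(size=y.shape)
    x_new = W @ x_msg - alpha * y
    g_old = np.array([grad(i, x[i]) for i in range(n_agents)])
    g_new = np.array([grad(i, x_new[i]) for i in range(n_agents)])
    y = W @ y_msg + g_new - g_old                         # track the average gradient
    x = x_new

# Minimizer of the aggregate problem sum_i f_i, for reference.
x_star = np.linalg.solve(sum(a.T @ a for a in A),
                         sum(a.T @ bb for a, bb in zip(A, b)))
print("mean distance to optimum:", np.mean(np.linalg.norm(x - x_star, axis=1)))
```

With a constant stepsize and persistent communication noise, this plain gradient-tracking scheme only reaches a noise-dominated neighborhood of the optimum, which is the gap the proposed single-timescale VRA mechanism is designed to close without resorting to a two-timescale (decaying) parameter schedule.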
