WIAS Preprint No. 2690, (2020)

On primal and dual approaches for distributed stochastic convex optimization over networks



Authors

  • Dvinskikh, Darina
  • Gorbunov, Eduard
  • Gasnikov, Alexander
  • Dvurechensky, Pavel
  • Uribe, César A.

2010 Mathematics Subject Classification

  • 90C25 90C06 90C90

Keywords

  • Convex and non-convex optimization, stochastic optimization, first-order method, adaptive method, gradient descent, complexity bounds, mini-batch

DOI

10.20347/WIAS.PREPRINT.2690

Abstract

We introduce a primal-dual stochastic gradient oracle method for distributed convex optimization problems over networks. We show that the proposed method is optimal in terms of communication steps. Additionally, we propose a new analysis of the rate of convergence in terms of the duality gap and the probability of large deviations. This analysis is based on a new technique that allows us to bound the distance between the iteration sequence and the optimal point. With a proper choice of batch size, we can guarantee that this distance is equal (up to a constant factor) to the distance between the starting point and the solution.
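The abstract describes distributed stochastic optimization over a network, where each agent holds a local objective, queries a noisy (stochastic) gradient oracle, mini-batches oracle calls to reduce variance, and communicates with neighbours. The following toy sketch illustrates that general setting — it is not the paper's primal-dual method; the ring topology, quadratic local objectives, gossip weights, and step size are all illustrative assumptions:

```python
import random

# Toy sketch (not the paper's algorithm): distributed mini-batch stochastic
# gradient descent with gossip averaging over a ring network. Agent i holds
# f_i(x) = 0.5 * (x - a_i)^2; its gradient is observed through a noisy
# stochastic oracle, and mini-batching divides the oracle variance by the
# batch size.

def noisy_grad(x, a_i, rng, noise=1.0):
    """One stochastic oracle call: gradient of 0.5*(x - a_i)^2 plus noise."""
    return (x - a_i) + rng.gauss(0.0, noise)

def minibatch_grad(x, a_i, rng, batch_size):
    """Average batch_size oracle calls; variance shrinks by 1/batch_size."""
    return sum(noisy_grad(x, a_i, rng) for _ in range(batch_size)) / batch_size

def distributed_sgd(a, steps=200, lr=0.1, batch_size=64, seed=0):
    """Gossip averaging on a ring, then a local mini-batch gradient step."""
    rng = random.Random(seed)
    n = len(a)
    x = [0.0] * n  # all agents share a common starting point
    for _ in range(steps):
        # gossip on a ring: each agent averages with its two neighbours
        mixed = [(x[(i - 1) % n] + x[i] + x[(i + 1) % n]) / 3.0
                 for i in range(n)]
        # local mini-batch stochastic gradient step
        x = [mixed[i] - lr * minibatch_grad(mixed[i], a[i], rng, batch_size)
             for i in range(n)]
    return x

if __name__ == "__main__":
    a = [1.0, 2.0, 3.0, 4.0]  # global minimiser of the averaged objective: 2.5
    x = distributed_sgd(a)
    print([round(v, 2) for v in x])  # all agents end up near 2.5
```

With batch size 64 the per-step gradient noise is small, so the iterates stay in a neighbourhood of the consensus optimum — a simple instance of the abstract's point that the batch size controls how far the iteration sequence drifts from the solution.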
