WIAS Preprint No. 2690, (2020)
On primal and dual approaches for distributed stochastic convex optimization over networks
Authors
- Dvinskikh, Darina
- Gorbunov, Eduard
- Gasnikov, Alexander
- Dvurechensky, Pavel
  ORCID: 0000-0003-1201-2343
- Uribe, César A.
2010 Mathematics Subject Classification
- 90C25 90C06 90C90
Keywords
- Convex and non-convex optimization, stochastic optimization, first-order method, adaptive method, gradient descent, complexity bounds, mini-batch
DOI
- 10.20347/WIAS.PREPRINT.2690
Abstract
We introduce a primal-dual stochastic gradient oracle method for distributed convex optimization problems over networks. We show that the proposed method is optimal in terms of communication steps. Additionally, we propose a new analysis method for the rate of convergence in terms of the duality gap and the probability of large deviations. This analysis is based on a new technique that allows one to bound the distance between the iteration sequence and the optimal point. By a proper choice of the batch size, we can guarantee that this distance equals (up to a constant factor) the distance between the starting point and the solution.
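The abstract refers to distributed stochastic convex optimization over a network, where each node holds a local stochastic objective and alternates local mini-batch gradient steps with communication rounds. As a rough illustration of that setting only, and not of the method proposed in the paper, the sketch below runs decentralized mini-batch stochastic gradient descent with gossip averaging on a four-node ring; the quadratic local losses, the Metropolis mixing matrix W, the step size, and the batch size are assumptions made purely for this example.

```python
# Minimal illustrative sketch (not the authors' method): decentralized
# mini-batch stochastic gradient descent with gossip averaging on a ring.
# Losses, mixing matrix, step size, and batch size are assumed for the example.
import numpy as np

rng = np.random.default_rng(0)

n, d = 4, 3                      # number of nodes, problem dimension
A = rng.standard_normal((n, d))  # node i holds the local target A[i]

# Doubly stochastic mixing matrix for a ring graph (Metropolis-style weights).
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = 0.5
    W[i, (i + 1) % n] = 0.25
    W[i, (i - 1) % n] = 0.25

x = np.zeros((n, d))             # local iterates, one row per node
step, batch = 0.1, 8

for _ in range(200):
    # Stochastic gradient of f_i(x) = 0.5 * ||x - A[i]||^2 with additive noise,
    # averaged over a mini-batch to reduce its variance.
    noise = rng.standard_normal((n, batch, d)).mean(axis=1)
    grad = (x - A) + 0.1 * noise
    # One communication round (consensus averaging) plus a local gradient step.
    x = W @ x - step * grad

print("consensus iterate:", x.mean(axis=0))
print("true minimizer   :", A.mean(axis=0))  # argmin of the average objective
```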
Appeared in
- 2019 IEEE 58th Conference on Decision and Control (CDC), IEEE Xplore, 2019, pp. 7435--7440, DOI 10.1109/CDC40024.2019.9029798.