WIAS Preprint No. 2695 (2020)

On accelerated alternating minimization



Authors

  • Guminov, Sergey
  • Dvurechensky, Pavel
    ORCID: 0000-0003-1201-2343
  • Gasnikov, Alexander

2010 Mathematics Subject Classification

  • 90C30, 90C25, 68Q25

Keywords

  • Convex optimization, acceleration, non-convex optimization, alternating minimization

DOI

10.20347/WIAS.PREPRINT.2695

Abstract

Alternating minimization (AM) optimization algorithms have been known for a long time and are important in machine learning problems, among which our main motivation is the approximation of optimal transport distances. AM algorithms assume that the decision variable is divided into several blocks and that minimization over each block can be done explicitly or cheaply with high accuracy. The ubiquitous Sinkhorn's algorithm can be seen as an alternating minimization algorithm for the dual of the entropy-regularized optimal transport problem. We introduce an accelerated alternating minimization method with a $1/k^2$ convergence rate, where $k$ is the iteration counter. This improves over the known $1/k$ bound for general AM methods and for Sinkhorn's algorithm. Moreover, our algorithm converges faster than gradient-type methods in practice, as it is free of step-size choice and adaptive to the local smoothness of the problem. We show that the proposed method is primal-dual, meaning that if we apply it to a dual problem, we can reconstruct the solution of the primal problem with the same convergence rate. We apply our method to the entropy-regularized optimal transport problem and show experimentally that it outperforms Sinkhorn's algorithm.
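
As a concrete illustration of the block structure described above, the following is a minimal sketch of Sinkhorn's algorithm viewed as alternating minimization over the two dual scaling vectors of the entropy-regularized optimal transport problem. It assumes NumPy; the names (sinkhorn, gamma, n_iter) are illustrative rather than the paper's notation.

    import numpy as np

    def sinkhorn(C, a, b, gamma, n_iter=1000):
        """Sketch of Sinkhorn's algorithm as two-block alternating
        minimization on the dual of entropy-regularized OT.
        C: cost matrix (n, m); a, b: marginal distributions;
        gamma: entropic regularization strength."""
        K = np.exp(-C / gamma)      # Gibbs kernel
        u = np.ones_like(a, dtype=float)
        v = np.ones_like(b, dtype=float)
        for _ in range(n_iter):
            u = a / (K @ v)         # exact minimization over the first dual block
            v = b / (K.T @ u)       # exact minimization over the second dual block
        # Transport plan diag(u) K diag(v)
        return u[:, None] * K * v[None, :]

Each update solves its dual block exactly in closed form; this cheap per-block minimization is exactly the structure that AM methods, including the accelerated variant proposed in the paper, exploit.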
