WIAS Preprint No. 2820 (2021)

Tensor methods for strongly convex strongly concave saddle point problems and strongly monotone variational inequalities



Authors

  • Ostroukhov, Petr
  • Kamalov, Rinat
  • Dvurechensky, Pavel
    ORCID: 0000-0003-1201-2343
  • Gasnikov, Alexander

2020 Mathematics Subject Classification

  • 90C30, 90C25, 68Q25

Keywords

  • Variational inequality, saddle point problem, high-order smoothness, tensor methods, gradient norm minimization

DOI

10.20347/WIAS.PREPRINT.2820

Abstract

In this paper we propose three tensor methods for strongly-convex-strongly-concave saddle point problems (SPP). The first method is based on an assumption of higher-order smoothness (the derivative of order higher than two is Lipschitz continuous) and achieves a linear convergence rate. Under additional assumptions of first- and second-order smoothness of the objective, we connect the first method with a locally superlinearly convergent algorithm from the literature and develop a second method with global convergence and local superlinear convergence. The third method is a modified version of the second method, but with a focus on making the gradient of the objective small. Since we treat SPP as a particular case of variational inequalities, we also propose two methods for strongly monotone variational inequalities with the same complexity as the methods described above.
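To make the abstract's terminology concrete, the following is a minimal LaTeX sketch of the standard definitions involved; the notation (f, F, L_p, mu) is generic and the paper's exact assumptions and constants may differ.

% Saddle point problem (SPP): f strongly convex in x, strongly concave in y.
\[
  \min_{x \in \mathbb{R}^{n}} \; \max_{y \in \mathbb{R}^{m}} \; f(x, y).
\]

% Higher-order (p-th order) smoothness: the p-th derivative of f is
% Lipschitz continuous with some constant L_p, i.e. for all z, z',
\[
  \| \nabla^{p} f(z) - \nabla^{p} f(z') \| \le L_{p} \, \| z - z' \|.
\]

% The SPP is a particular case of a variational inequality (VI):
% writing z = (x, y) and defining the operator
\[
  F(z) = \begin{pmatrix} \nabla_{x} f(x, y) \\ -\nabla_{y} f(x, y) \end{pmatrix},
\]
% the goal is to find z^* such that, for all z in the feasible set,
\[
  \langle F(z^{*}), \, z - z^{*} \rangle \ge 0.
\]

% Strong monotonicity of F with parameter mu > 0 (implied here by the
% strong convexity-concavity of f):
\[
  \langle F(z) - F(z'), \, z - z' \rangle \ge \mu \, \| z - z' \|^{2}.
\]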
