Pedro Pérez-Aros (Universidad de Chile)

This presentation introduces the Boosted Double-Proximal Subgradient Algorithm (BDSA), a new splitting method for general structured nonsmooth and nonconvex optimization problems expressed as sums and differences of composite functions. BDSA combines subgradients taken from the problem data with proximal steps, and incorporates a linesearch procedure to enhance its performance. We establish the convergence of BDSA under the Kurdyka–Łojasiewicz property and analyze its convergence rate. The talk concludes with numerical experiments illustrating the algorithm’s effectiveness and robustness across a variety of test problems.
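To make the abstract's description concrete, the following is a minimal, illustrative sketch of one iteration of a proximal-subgradient scheme with a boosting linesearch, in the spirit described above. It is not the authors' BDSA: the model below is a simplified composite form min F(x) = f(x) + g(x) − h(x), with f smooth, g prox-friendly, and h convex with computable subgradients; the single proximal step, the step sizes, and all function and parameter names (bdsa_sketch, gamma, lam0, rho, alpha) are assumptions made for illustration, and the "double-proximal" structure in the algorithm's name suggests proximal steps on more than one nonsmooth block, which this sketch collapses into one.

```python
# Schematic boosted proximal-subgradient iteration (NOT the authors' BDSA;
# model, parameters, and stopping rule are simplifying assumptions).
# We minimize F(x) = f(x) + g(x) - h(x) with f smooth, g prox-friendly,
# and h convex (a subgradient of h is available at every point).
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (our assumed prox-friendly g)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def bdsa_sketch(x0, grad_f, prox_g, subgrad_h, F,
                gamma=0.1, lam0=1.0, rho=0.5, alpha=1e-4,
                max_iter=500, tol=1e-8):
    """Hypothetical boosted proximal-subgradient loop."""
    x = x0.copy()
    for _ in range(max_iter):
        u = subgrad_h(x)                      # subgradient of the concave part
        # forward step on f - h followed by a proximal step on g
        x_hat = prox_g(x - gamma * (grad_f(x) - u), gamma)
        d = x_hat - x                         # displacement direction
        if np.linalg.norm(d) <= tol:
            break
        # backtracking linesearch along d starting from x_hat (the "boost")
        lam = lam0
        while lam > 1e-12 and F(x_hat + lam * d) > F(x_hat) - alpha * lam * d @ d:
            lam *= rho
        x = x_hat + lam * d if lam > 1e-12 else x_hat
    return x

# Toy instance: f(x) = 0.5||Ax - b||^2, g = 0.3||x||_1, h = 0.1||x||_1.
rng = np.random.default_rng(0)
A, b = rng.standard_normal((20, 10)), rng.standard_normal(20)
grad_f = lambda x: A.T @ (A @ x - b)
prox_g = lambda v, t: soft_threshold(v, 0.3 * t)
subgrad_h = lambda x: 0.1 * np.sign(x)        # a valid subgradient of h
F = lambda x: 0.5 * np.linalg.norm(A @ x - b) ** 2 + 0.2 * np.abs(x).sum()
x_star = bdsa_sketch(np.zeros(10), grad_f, prox_g, subgrad_h, F)
```

The linesearch continues past the proximal point x_hat along the displacement d = x_hat − x whenever doing so yields sufficient decrease, which is the standard mechanism behind "boosted" variants of splitting methods; the actual acceptance test and parameter choices in BDSA may differ.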