LeAP-SSN: A semismooth Newton method with global convergence rates
Authors
- Alphonse, Amal (ORCID: 0000-0001-7616-3293)
- Dvurechensky, Pavel (ORCID: 0000-0003-1201-2343)
- Papadopoulos, Ioannis (ORCID: 0000-0003-3522-8761)
- Sirotenko, Clemens (ORCID: 0009-0002-7514-1011)
2020 Mathematics Subject Classification
- 49J52 49M15 65K10 90C26
Keywords
- Semismooth Newton method, nonsmooth analysis, global convergence, superlinear convergence
DOI
Abstract
We propose LeAP-SSN (Levenberg–Marquardt Adaptive Proximal Semismooth Newton method), a semismooth Newton-type method with a simple, parameter-free globalisation strategy that guarantees convergence from arbitrary starting points in Hilbert spaces: to stationary points in nonconvex settings and, under a Polyak–Łojasiewicz condition, to a global minimum. The method employs an adaptive Levenberg–Marquardt regularisation for the Newton steps, combined with backtracking, and does not require knowledge of problem-specific constants. We establish global nonasymptotic rates: O(1/k) for convex problems in terms of objective values, O(1/√k) under nonconvexity in terms of subgradients, and linear convergence under a Polyak–Łojasiewicz condition. The algorithm achieves superlinear convergence under mild semismoothness and Dennis–Moré or partial smoothness conditions, even for non-isolated minimisers. By combining strong global guarantees with superlinear local rates in a fully parameter-agnostic framework, LeAP-SSN bridges the gap between globally convergent algorithms and the fast asymptotics of Newton's method. The practical efficiency of the method is illustrated on representative problems from imaging, contact mechanics, and machine learning.
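To make the ingredients of the abstract concrete, the following is a minimal generic sketch of a semismooth Newton iteration with Levenberg–Marquardt damping and residual-based backtracking. It is not the paper's LeAP-SSN algorithm: the test residual F, the choice of generalized-Jacobian element, and the damping rule mu = ||F(x)|| are illustrative assumptions.

```python
import numpy as np

def F(x):
    # Illustrative nonsmooth residual (assumption, not from the paper):
    # componentwise max(x, 0) + 0.5*x - 1, with unique root x = 2/3.
    return np.maximum(x, 0.0) + 0.5 * x - 1.0

def jac(x):
    # One element of the Clarke generalized Jacobian of F (diagonal here).
    return np.diag((x > 0.0).astype(float) + 0.5)

def lm_semismooth_newton(x, tol=1e-10, max_iter=50):
    for _ in range(max_iter):
        r = F(x)
        nr = np.linalg.norm(r)
        if nr < tol:
            break
        mu = nr  # Levenberg-Marquardt damping tied to the residual norm
        J = jac(x)
        # Damped (regularised) Newton step: (J^T J + mu I) d = -J^T r
        d = np.linalg.solve(J.T @ J + mu * np.eye(len(x)), -J.T @ r)
        # Backtracking: halve the step until the residual norm decreases.
        t = 1.0
        while np.linalg.norm(F(x + t * d)) >= nr and t > 1e-12:
            t *= 0.5
        x = x + t * d
    return x

x = lm_semismooth_newton(np.array([3.0, -2.0]))
```

Far from a root the damping mu = ||F(x)|| keeps steps short and the backtracking line search enforces residual decrease (the global phase); near a root mu vanishes and the iteration reduces to an undamped semismooth Newton step (the fast local phase).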