Research Group "Stochastic Algorithms and Nonparametric Statistics"
Seminar "Modern Methods in Applied Stochastics and Nonparametric Statistics" Winter Semester 2024/25
10.09.2024 |
17.09.2024 |
24.09.2024 |
01.10.2024 |
08.10.2024 |
15.10.2024 | Goncalo dos Reis (University of Edinburgh) |
HVP 11 a, R. 313 | Simulation of mean-field SDEs: Some recent results

We review two results on the simulation of SDEs of McKean-Vlasov type (MV-SDEs). The first block of results addresses the simulation of MV-SDEs with super-linear growth in the spatial and interaction components of the drift and a non-constant Lipschitz diffusion coefficient. The second block addresses the weak convergence behaviour of the Leimkuhler-Matthews method, a non-Markovian Euler-type scheme with the same computational cost as the Euler scheme, for approximating the stationary distribution of a one-dimensional MV-SDE. The particular class under study is known as mean-field (overdamped) Langevin equations (MFL). We provide weak and strong error results for the scheme in both finite and infinite time, working under a strong convexity assumption. Based on a careful analysis of the variation processes and the Kolmogorov backward equation for the particle system associated with the MV-SDE, we show that the method attains a higher-order approximation accuracy in the long-time limit (weak convergence rate 3/2) than the standard Euler method (weak order 1). While we use an interacting particle system (IPS) to approximate the MV-SDE, we show that the convergence rate is independent of the dimension of the IPS; this includes establishing uniform-in-time decay estimates for the moments of the IPS, the Kolmogorov backward equation, and their derivatives. The theoretical findings are supported by numerical tests. This presentation is (loosely) based on the joint works [1], [2].

References:
[1] X. Chen, G. dos Reis, W. Stockinger, and Z. Wilde. "Improved weak convergence for the long time simulation of mean-field Langevin equations." arXiv preprint arXiv:2405.01346 (2024).
[2] X. Chen and G. dos Reis. "Euler simulation of interacting particle systems and McKean-Vlasov SDEs with fully superlinear growth drifts in space and interaction." IMA Journal of Numerical Analysis, Vol. 42, No. 1, 2022. |
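The Leimkuhler-Matthews scheme differs from Euler-Maruyama only in replacing each Gaussian increment by the average of two consecutive ones, at the same cost per step. A minimal sketch on a toy mean-field Langevin particle system (the quadratic confinement and interaction terms, and all parameter values, are illustrative assumptions, not taken from the talk):

```python
import numpy as np

def simulate(N=200, T=10.0, h=0.01, sigma=1.0, seed=0):
    """Toy mean-field Langevin IPS: dX = -(X + (X - mean(X))) dt + sigma dW.
    Compares Euler-Maruyama with the Leimkuhler-Matthews (LM) scheme,
    which reuses the previous Gaussian increment (hence non-Markovian)."""
    rng = np.random.default_rng(seed)
    steps = int(T / h)
    x_em = rng.normal(size=N)      # Euler-Maruyama particles
    x_lm = x_em.copy()             # LM particles, same initial condition
    xi_prev = rng.normal(size=N)   # LM keeps the previous noise around
    for _ in range(steps):
        xi = rng.normal(size=N)
        drift_em = -(x_em + (x_em - x_em.mean()))
        drift_lm = -(x_lm + (x_lm - x_lm.mean()))
        x_em = x_em + h * drift_em + sigma * np.sqrt(h) * xi
        # LM update: average of consecutive increments, same cost as Euler
        x_lm = x_lm + h * drift_lm + sigma * np.sqrt(h) * 0.5 * (xi_prev + xi)
        xi_prev = xi
    return x_em, x_lm

x_em, x_lm = simulate()
print(x_em.std(), x_lm.std())
```

Both empirical clouds should settle near the same stationary spread; the talk's result concerns how accurately that stationary law is captured in the weak sense.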
22.10.2024 |
29.10.2024 | Prof. Dr. Walter Schachermayer (University of Vienna) |
Optimal Martingale Transport on R^d |
05.11.2024 |
12.11.2024 | Fabio Bugini (TU Berlin) |
Rough SDEs: Connections to rough PDEs and Malliavin calculus |
19.11.2024 | Peter K. Friz (TU & WIAS Berlin) |
Simulation and weak error bounds for local stochastic volatility models

Local stochastic volatility refers to a popular model class in applied mathematical finance that allows for "calibration on the fly", typically via a particle method derived from a formal McKean-Vlasov equation. Well-posedness of this limit is a well-known problem in the field; the general case is largely open, despite recent progress in Markovian situations. Our approach is to start with a well-defined Euler approximation to the formal McKean-Vlasov equation, followed by a newly established half-step scheme that allows for good approximations of conditional expectations. In a sense, we do Euler first and particles second, in contrast to previous works that start with the particle approximation. We show weak rate one, plus error terms that account for the said approximations. The particle approximation is discussed in detail, and the error rate is given in dependence on all parameters. Joint work with B. Jourdain (Paris) and T. Wagenhofer (Berlin). |
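The conditional expectations at the heart of such particle calibration are typically estimated by regressing one particle coordinate on another. A generic Nadaraya-Watson kernel sketch of that building block, on illustrative toy data (this is not the half-step scheme of the talk, and the bandwidth value is an assumption):

```python
import numpy as np

def cond_exp_kernel(s_particles, v_particles, s_query, bandwidth=0.1):
    """Nadaraya-Watson estimate of E[V | S = s_query] from particle pairs
    (S_i, V_i), using a Gaussian kernel; bandwidth is a tuning parameter."""
    w = np.exp(-0.5 * ((s_particles[None, :] - s_query[:, None]) / bandwidth) ** 2)
    return (w * v_particles[None, :]).sum(axis=1) / w.sum(axis=1)

# toy check: V = S^2 + noise, so E[V | S=s] should be close to s^2
rng = np.random.default_rng(1)
s = rng.uniform(-1, 1, size=20000)
v = s**2 + 0.05 * rng.normal(size=s.shape)
est = cond_exp_kernel(s, v, np.array([0.0, 0.5]), bandwidth=0.05)
print(est)
```

The quality of such estimators, as a function of particle number and bandwidth, is exactly the kind of error term the weak-rate analysis in the talk has to control.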
26.11.2024 |
03.12.2024 | Nikita Doikov (EPFL, Switzerland) |
First talk | Stochastic second-order optimization with momentum

In this talk, we discuss stochastic second-order algorithms for solving general non-convex optimization problems. We propose using a special version of momentum to stabilize the stochastic gradient and Hessian estimates in Newton's method. We show that momentum provably reduces the variance of the stochastic estimates and allows the method to converge for any noise level. Using the cubic regularization technique, we prove a global convergence rate for our method on general non-convex problems to a second-order stationary point, even when using only a single stochastic data sample per iteration. This starkly contrasts with all existing stochastic second-order methods for non-convex problems, which typically require large batches. Joint work with El Mahdi Chayti and Martin Jaggi. |
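The core idea of momentum-averaged gradient and Hessian estimates can be sketched generically. Below, the quadratic test problem, the noise model, and the damped-Newton step (a stand-in for cubic regularization) are all illustrative assumptions, not the method of the talk:

```python
import numpy as np

def stochastic_newton_momentum(grad_sample, hess_sample, x0, steps=200,
                               beta=0.9, damping=1e-2, seed=0):
    """Damped Newton iteration driven by momentum-averaged stochastic
    gradient and Hessian estimates; the exponential moving averages
    reduce the variance of single-sample estimates."""
    rng = np.random.default_rng(seed)
    x = x0.copy()
    g_est = grad_sample(x, rng)
    h_est = hess_sample(x, rng)
    for _ in range(steps):
        g_est = beta * g_est + (1 - beta) * grad_sample(x, rng)  # momentum on gradient
        h_est = beta * h_est + (1 - beta) * hess_sample(x, rng)  # momentum on Hessian
        reg = h_est + damping * np.eye(len(x))  # crude regularization, not cubic
        x = x - np.linalg.solve(reg, g_est)
    return x

# toy quadratic f(x) = 0.5 x^T A x, one noisy sample per iteration
A = np.diag([1.0, 10.0])
grad_sample = lambda x, rng: A @ x + 0.1 * rng.normal(size=2)
hess_sample = lambda x, rng: A + 0.1 * rng.normal(size=(2, 2))
x = stochastic_newton_momentum(grad_sample, hess_sample, np.array([5.0, 5.0]))
print(np.linalg.norm(x))
```

Even with a single noisy sample per iteration, the averaged estimates keep the iterates in a small neighbourhood of the minimizer, which is the stabilization effect the abstract describes.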
03.12.2024 | Anton Rodomanov (CISPA Helmholtz Center) |
Second talk | Optimizing (L_0, L_1)-smooth functions by gradient methods

We study gradient methods for solving an optimization problem with an (L_0, L_1)-smooth objective function. This problem class generalizes that of Lipschitz-smooth problems and has gained interest recently, as it captures a broader range of machine learning applications. We provide novel insights into the properties of this function class and develop a general framework for analyzing optimization methods for (L_0, L_1)-smooth functions in a principled manner. While our convergence rate estimates recover existing results for minimizing the gradient norm for nonconvex problems, our approach allows us to significantly improve the current state-of-the-art complexity results in the case of convex problems. We show that both the gradient method with Polyak stepsizes and the normalized gradient method, without any knowledge of the parameters L_0 and L_1, achieve the same complexity bounds as the method with the knowledge of these constants. In addition to that, we show that a carefully chosen accelerated gradient method can be applied to (L_0, L_1)-smooth functions, further improving previously known results. |
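One of the parameter-free methods mentioned, gradient descent with Polyak stepsizes, needs the optimal value f* but neither L_0 nor L_1. A minimal sketch on cosh-type objectives, a standard example of functions whose Hessian norm grows with the gradient norm and is therefore (L_0, L_1)-smooth but not globally Lipschitz-smooth (the test problem and iteration budget are illustrative assumptions):

```python
import numpy as np

def polyak_gd(f, grad, x0, f_star, steps=100):
    """Gradient descent with Polyak stepsizes h_k = (f(x_k) - f*) / ||g_k||^2.
    Requires the optimal value f_star but no smoothness constants."""
    x = x0.copy()
    for _ in range(steps):
        g = grad(x)
        gap = f(x) - f_star
        if gap <= 1e-12:   # essentially optimal, stop
            break
        x = x - (gap / (g @ g)) * g
    return x

# illustrative objective: f(x) = sum(cosh(x_i)), minimized at x = 0 with f* = dim
f = lambda x: np.sum(np.cosh(x))
grad = lambda x: np.sinh(x)
x = polyak_gd(f, grad, np.array([3.0, -2.0]), f_star=2.0)
print(f(x))
```

Despite the exponentially growing curvature away from the origin, the Polyak stepsize automatically takes short steps where the gradient is large and longer ones near the optimum, matching the parameter-free behaviour claimed in the abstract.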
10.12.2024 | Wilfried Kenmoe Nzali (WIAS Berlin) |
|
17.12.2024 | Alexey Kroshnin (WIAS Berlin) |
On concentration inequalities for unbounded matrix super-martingales |
07.01.2025 |
last reviewed: November 14, 2024 by Christine Schneider