Research Group "Stochastic Algorithms and Nonparametric Statistics"

Research Seminar "Mathematical Statistics" Winter Semester 19/20

  • Place: Weierstrass Institute for Applied Analysis and Stochastics, Erhard-Schmidt-Hörsaal, Mohrenstraße 39, 10117 Berlin
  • Time: Wednesdays, 10.00 a.m. - 12.30 p.m.
16.10.19 Alexey Onatskiy (Cambridge University)
Cointegration the modern way
23.10.19 Prof. Vladimir Spokoiny (WIAS Berlin)
Bayesian inference for nonlinear inverse problems
We discuss the properties of the posterior for a wide class of statistical models, including nonlinear generalised regression and deep neural networks, nonlinear inverse problems, nonparametric diffusion, error-in-operator and IV models. The new "calming" approach helps to treat all such problems in a unified manner and to obtain tight finite-sample results on the Gaussian approximation of the posterior, with an explicit error bound in terms of the so-called effective dimension.
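For orientation only (a definition commonly used in this penalised-likelihood line of work; notation may differ from the talk): with Fisher information operator $D^2$ and prior precision $G^2$, the effective dimension is typically defined as

\[ \mathtt{p}_G \;=\; \operatorname{tr}\bigl( (D^2 + G^2)^{-1} D^2 \bigr), \]

and the Gaussian approximation error of the posterior is controlled in terms of $\mathtt{p}_G$ relative to the sample size.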
30.10.19 N. N.

06.11.19 Charles Manski (Northwestern University, USA)
Patient care under uncertainty (Hermann Otto Hirschfeld Lecture 2019)
13.11.19 Merle Behr (University of California, Berkeley)
Learning compositional structures (Please note: this seminar takes place in room 406, 4th floor!)
20.11.19 Nikita Zhivotovskiy (Google Zurich, Switzerland)
Robust covariance estimation for vectors with bounded kurtosis
Let X be a centered random vector whose covariance matrix we want to estimate. In this talk I will discuss the following result: if X satisfies a bounded-kurtosis assumption, there is a covariance matrix estimator that, given a sample of n independent random vectors distributed as X, exhibits the optimal performance one would expect had X been a Gaussian vector. The procedure also improves the current state of the art for high-probability bounds in the sub-Gaussian case (sharp results were previously known only in expectation or with constant probability). In both scenarios the new bound does not depend explicitly on the dimension, but rather on the effective rank of the covariance matrix of X. The talk is based on joint work with S. Mendelson, "Robust covariance estimation under L4-L2 moment equivalence", to appear in the Annals of Statistics.
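For orientation only (standard notation, not part of the abstract): the effective rank of a covariance matrix $\Sigma$ is $r(\Sigma) = \operatorname{tr}(\Sigma) / \|\Sigma\|$, where $\|\cdot\|$ denotes the operator norm, and the Gaussian benchmark referred to above is a bound of the form

\[ \|\hat\Sigma - \Sigma\| \;\lesssim\; \|\Sigma\| \Bigl( \sqrt{\tfrac{r(\Sigma)}{n}} + \tfrac{r(\Sigma)}{n} + \sqrt{\tfrac{\log(1/\delta)}{n}} + \tfrac{\log(1/\delta)}{n} \Bigr), \]

holding with probability at least $1 - \delta$.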
27.11.19 Prof. Alain Celisse (Université Lille, France)
Kernelized change-point detection procedure
04.12.19 Nils Bertschinger (Goethe-Universität Frankfurt a. M.)
Systemic Greeks: Measuring risk in financial networks (Please note: this seminar takes place in room 406, 4th floor!)
Since the latest financial crisis, the idea of systemic risk has received considerable interest. In particular, contagion effects arising from cross-holdings between interconnected financial firms have been studied extensively. Drawing inspiration from the field of complex networks, these attempts have largely ignored models and theories of credit risk for individual firms. Here, we note that recent network valuation models extend the seminal structural risk model of Merton (1974). Furthermore, we formally compute sensitivities to various risk factors -- commonly known as Greeks -- in a network context. Finally, we present some numerical illustrations and discuss possible implications for measuring systemic risk as well as for insurance pricing.
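For orientation, a minimal single-firm sketch of the Merton (1974) structural model mentioned above, included only to make concrete what a "Greek" (sensitivity) is in this context; the network extension discussed in the talk is not reproduced here, and all parameter values are hypothetical.

# Minimal sketch of the Merton (1974) structural model: equity is a call
# option on the firm's assets, and its sensitivity to the asset value
# ("delta") is the simplest example of a Greek. Illustrative only.
from math import log, sqrt, exp
from scipy.stats import norm

def merton_equity_and_delta(V, D, r, sigma, T):
    """Equity value and asset-value delta under Merton (1974).

    V     : current asset value
    D     : face value of debt due at maturity T
    r     : risk-free rate
    sigma : asset volatility
    T     : time to maturity (years)
    """
    d1 = (log(V / D) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    equity = V * norm.cdf(d1) - D * exp(-r * T) * norm.cdf(d2)
    delta = norm.cdf(d1)  # dE/dV, the sensitivity of equity to asset value
    return equity, delta

if __name__ == "__main__":
    E, dEdV = merton_equity_and_delta(V=120.0, D=100.0, r=0.02, sigma=0.25, T=1.0)
    print(f"equity value: {E:.2f}, delta: {dEdV:.3f}")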
11.12.19 N.N.

18.12.19 N.N.
08.01.20 Dominik Liebl (Universität Bonn)
Fast and fair simultaneous confidence bands for functional parameters
Quantifying uncertainty using confidence regions is a central goal of statistical inference. Despite this, methodologies for confidence bands in Functional Data Analysis are underdeveloped compared to estimation and hypothesis testing. This work represents a major leap forward in this area by presenting a new methodology for constructing simultaneous confidence bands for functional parameter estimates. These bands possess a number of striking qualities: (1) they have a nearly closed-form expression, (2) they give nearly exact coverage, (3) they have a finite sample correction, (4) they do not require an estimate of the full covariance of the parameter estimate, and (5) they can be constructed adaptively according to a desired criterion. One option for choosing bands that we find especially interesting is the concept of fair bands, which allows us to do fair (or equitable) inference over subintervals and could be especially useful in longitudinal studies over long time scales. Our bands are constructed by integrating and extending tools from Random Field Theory, an area that has yet to overlap with Functional Data Analysis. Authors: Dominik Liebl (University of Bonn) and Matthew Reimherr (Penn State University).
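For orientation, a standard ingredient of Random Field Theory, stated here under simplifying assumptions and not as the authors' exact construction: for a smooth, zero-mean, unit-variance Gaussian process $Z$ on $[0,1]$, the Kac-Rice bound

\[ \mathbb{P}\Bigl( \sup_{t \in [0,1]} |Z(t)| > u \Bigr) \;\le\; 2\bigl(1 - \Phi(u)\bigr) + \frac{e^{-u^2/2}}{\pi} \int_0^1 \sqrt{\operatorname{Var}\bigl(Z'(t)\bigr)}\, dt \]

can be inverted in $u$ to calibrate the width of a simultaneous band without estimating the full covariance of the estimator.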
15.01.20 Sven Wang (University of Cambridge)
Convergence rates for penalised least squares estimators in PDE-constrained regression problems
22.01.20 Jorge Mateu (Universitat Jaume I, Castellon)
Complex spatial and spatio-temporal point process dependencies: Linear ANOVA-type models, metrics and barycenters and predictive stochastic models of crime
29.01.20 Nadja Klein (Humboldt-Universität zu Berlin)
Bayesian regression copulas (Please note: this seminar takes place in room 406, 4th floor!)
05.02.20 Tim Sullivan (FU Berlin)
A rigorous theory of conditional mean embeddings
Conditional mean embeddings (CMEs) have proven themselves to be a powerful tool in many machine learning applications. They allow the efficient conditioning of probability distributions within the corresponding reproducing kernel Hilbert spaces (RKHSs) by providing a linear-algebraic relation for the kernel mean embeddings of the respective probability distributions. Both centered and uncentered covariance operators have been used to define CMEs in the existing literature. In this talk, we develop a mathematically rigorous theory for both variants, discuss the merits and problems of either, and significantly weaken the conditions for applicability of CMEs. In the course of this, we demonstrate a beautiful connection to Gaussian conditioning in Hilbert spaces.
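For orientation, a minimal numerical sketch of the standard empirical CME estimator, included only to make the "linear-algebraic relation" above concrete; the rigorous centered/uncentered operator theory of the talk is not reproduced here, and the kernel choice, bandwidth and regularisation parameter lam are illustrative assumptions.

# Standard empirical conditional mean embedding: mu_{Y|X=x} is approximated
# by sum_i beta_i(x) k_Y(y_i, .) with beta(x) = (K_X + n*lam*I)^{-1} k_X(., x).
import numpy as np

def gaussian_kernel(a, b, bandwidth=1.0):
    """Gram matrix of the Gaussian (RBF) kernel between 1-d samples a and b."""
    diff = a[:, None] - b[None, :]
    return np.exp(-0.5 * (diff / bandwidth) ** 2)

def cme_weights(x_train, x_query, lam=1e-2, bandwidth=1.0):
    """CME weights beta(x) for each query point (columns of the result)."""
    n = len(x_train)
    K = gaussian_kernel(x_train, x_train, bandwidth)
    k_x = gaussian_kernel(x_train, x_query, bandwidth)
    return np.linalg.solve(K + n * lam * np.eye(n), k_x)  # shape (n, n_query)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.uniform(-3, 3, size=200)
    y = np.sin(x) + 0.1 * rng.standard_normal(200)
    beta = cme_weights(x, np.array([0.0, 1.5]))
    # E[g(Y) | X = x] is approximated by g(y_train) @ beta for any g;
    # with g(y) = y this recovers a kernel-ridge estimate of E[Y | X = x].
    print(y @ beta)  # approximately sin(0.0) and sin(1.5)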
12.02.20 Alexandra Carpentier (Universität Magdeburg)
Adaptive inference and its relations to sequential decision making
Adaptive inference - namely adaptive estimation and adaptive confidence statements - is particularly important in high- or infinite-dimensional models in statistics. Indeed, whenever the dimension becomes high or infinite, it is important to adapt to the underlying structure of the problem. While adaptive estimation is often possible, it is often the case that adaptive and honest confidence sets do not exist. This is known as the adaptive inference paradox, and it has consequences for sequential decision making. In this talk, I will present some classical results of adaptive inference and discuss how they impact sequential decision making. (Based on joint works with Andrea Locatelli, Matthias Loeffler, Olga Klopp and Richard Nickl.)
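A schematic statement of the paradox referred to above (hedged; the exact constants and norms depend on the setting): for nested smoothness classes $\Sigma(\beta_2) \subset \Sigma(\beta_1)$ with $\beta_2 > \beta_1$, any confidence band $C_n$ that is honest over the larger class $\Sigma(\beta_1)$ must satisfy

\[ \sup_{f \in \Sigma(\beta_2)} \mathbb{E}_f \bigl[ |C_n| \bigr] \;\gtrsim\; n^{-\beta_1/(2\beta_1+1)}, \]

so its width cannot shrink at the faster rate $n^{-\beta_2/(2\beta_2+1)}$ that an adaptive estimator attains on the smoother class.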


last reviewed: January 27, 2020 by Christine Schneider