Research Group "Stochastic Algorithms and Nonparametric Statistics"
Seminar "Modern Methods in Applied Stochastics and Nonparametric
Statistics" Winter semester 2010/2011
last reviewed: March 07, 2011, Christine Schneider, Karsten Tabelow
14.09.2010
Masashi Sugiyama and Hirotaka Hachiya (Tokyo Institute of Technology, Japan)
Density Ratio Estimation: A New Versatile Tool for Machine Learning
Abstract: Recently, we developed a new machine learning (ML) framework that allows us to
systematically avoid density estimation. The key idea is to directly
estimate the ratio of density functions, not densities themselves.
Our framework includes various ML tasks such as importance sampling
(e.g., covariate shift adaptation, transfer learning, multitask
learning), divergence estimation (e.g., two-sample test, outlier
detection, change detection in time-series), mutual information
estimation (e.g., independence test, independent component analysis,
feature selection, sufficient dimension reduction, causal inference),
and conditional probability estimation (e.g., probabilistic
classification, conditional density estimation).
In this talk, we introduce the density ratio framework, review methods
of density ratio estimation, and show various real-world applications
including brain-computer interface, speech recognition, image
recognition, and robot control.
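The direct-estimation idea can be sketched with a least-squares fit in the style of the authors' uLSIF method. The following minimal numpy version (function names, parameter choices, and the toy data are mine, not from the talk) fits the ratio p/q as a kernel expansion without estimating either density on its own:

```python
import numpy as np

def ulsif(x_num, x_den, n_centers=100, sigma=1.0, lam=1e-3):
    """Least-squares density-ratio estimation (uLSIF-style sketch).
    Fits r(x) ~ p_num(x)/p_den(x) as a sum of Gaussian kernels,
    without estimating either density separately."""
    centers = x_num[:n_centers]
    def design(x):
        # Gaussian basis functions centred at the numerator samples
        return np.exp(-(np.atleast_1d(x)[:, None] - centers[None, :]) ** 2
                      / (2 * sigma ** 2))
    Phi_den = design(x_den)
    H = Phi_den.T @ Phi_den / len(x_den)   # ~ E_den[phi phi^T]
    h = design(x_num).mean(axis=0)         # ~ E_num[phi]
    alpha = np.linalg.solve(H + lam * np.eye(len(centers)), h)
    return lambda x: np.maximum(design(x) @ alpha, 0.0)

rng = np.random.default_rng(0)
x_p = rng.normal(0.0, 1.0, 500)   # numerator samples ~ N(0, 1)
x_q = rng.normal(0.0, 2.0, 500)   # denominator samples ~ N(0, 4)
r = ulsif(x_p, x_q)
# the true ratio p/q peaks at 0 (value 2) and decays in the tails
```

The regularized linear system replaces two separate density estimates by a single convex fit, which is the point of the framework.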
22.09.2010
Roger J.A. Laeven (Tilburg University, CentER and Eurandom)
Non-parametric estimation for multivariate Lévy processes
Abstract: This paper proposes two non-parametric estimators for the dependence function of a
multivariate Lévy process, and derives the estimators' properties. In addition, an independence test is constructed.
The estimators and test are applicable despite the presence of a continuous Brownian component in the process.
Finally, the estimators and test are implemented on Monte Carlo simulations and on asset returns data.
26.10.2010
Speaker: tba
Title: tba
Abstract: tba
02.11.2010
Henryk Zähle (Saarbrücken)
Limit theorems and robustness for tail-dependent statistical functionals
Abstract: In the context of nonparametric statistics, we shall address central and
noncentral limit theorems, Marcinkiewicz-Zygmund LLNs
and qualitative robustness for tail-dependent statistical functionals. These
properties will be derived for plug-in estimates
based on strictly stationary time series exhibiting weak dependence (e.g.,
mixing) or strong dependence (long-memory). The key
tools are the new concepts of quasi-Hadamard differentiability and
quasi-Hölder continuity as well as results on weighted
empirical processes. The theoretical results will be illustrated by means of
fairly general L- and V-functionals as well as distribution-invariant risk measures.
09.11.2010
Robert Hable (Universität Bayreuth)
Asymptotic Normality of Support Vector Machines
Abstract: In nonparametric classification and regression problems,
support vector machines (SVMs) have recently attracted much attention
in theoretical and applied statistics.
In an abstract sense, SVMs can be seen as M-estimators for a parameter
in a (typically infinite dimensional) reproducing kernel Hilbert space.
After a short introduction to the theory and recent results on
SVMs, it is shown that the difference between the
empirical SVM $f_{L,\mathbf{D}_{n},\lambda}$
and the theoretical SVM $f_{L,P,\lambda}$
is asymptotically normal with rate $\sqrt{n}$. That is,
$\sqrt{n}(f_{L,\mathbf{D}_{n},\lambda}-f_{L,P,\lambda})$
converges weakly to a
Gaussian process in the reproducing kernel Hilbert space.
This is done by an application of the functional delta-method
and by showing that the SVM-functional
$P\mapsto f_{L,P,\lambda}$ is
suitably Hadamard-differentiable.
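The M-estimator viewpoint can be made concrete in the simplest case of the squared loss (a least-squares sketch of my own, not the hinge-loss SVM of the talk; function names and data are assumptions). By the representer theorem the empirical minimizer in the RKHS has a closed form:

```python
import numpy as np

def ls_svm_fit(X, y, lam=1e-3, sigma=1.0):
    """Empirical RKHS M-estimator for the squared loss (least-squares
    variant): minimize (1/n) sum_i (y_i - f(x_i))^2 + lam * ||f||_H^2
    over a Gaussian RKHS.  By the representer theorem the minimizer is
    f = sum_i alpha_i k(x_i, .) with (K + n*lam*I) alpha = y."""
    n = len(X)
    K = np.exp(-(X[:, None] - X[None, :]) ** 2 / (2 * sigma ** 2))
    alpha = np.linalg.solve(K + n * lam * np.eye(n), y)
    def f(x):
        Kx = np.exp(-(np.atleast_1d(x)[:, None] - X[None, :]) ** 2
                    / (2 * sigma ** 2))
        return Kx @ alpha
    return f

rng = np.random.default_rng(5)
X = rng.uniform(-3, 3, 200)
y = np.sin(X) + 0.1 * rng.normal(size=200)
f = ls_svm_fit(X, y)
# f approximates sin on [-3, 3]
```

Refitting on fresh draws of the data illustrates the random fluctuation of the empirical estimator around the population minimizer that the asymptotic normality result quantifies.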
16.11.2010
Vladimir Spokoiny (WIAS Berlin / HU Berlin)
Oracle Inequalities in Instrumental Variable Estimation with Shape Constraints
Abstract: The problem of
recovering the response function with shape constraints
is considered in the instrumental variables set-up.
The proposed approach reduces this problem to the problem of
convex optimization.
The shape constraints are represented as a family of linear inequalities.
The target function is recovered by optimizing its smoothness
(roughness) subject to the shape constraints and to the condition
of no systematic component in the specially constructed residuals.
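As a toy instance of shape constraints written as linear inequalities (my illustration, much simpler than the instrumental-variables setting of the talk), monotone regression imposes $f_1\le f_2\le\dots\le f_n$ and can be solved exactly by pool-adjacent-violators:

```python
import numpy as np

def pava(y):
    """Pool-adjacent-violators: least-squares fit of y subject to the
    monotonicity constraints f_1 <= f_2 <= ... <= f_n, a family of
    linear inequalities."""
    blocks = []  # stack of [block mean, block size]
    for v in map(float, y):
        blocks.append([v, 1])
        # merge blocks while the monotone constraint is violated
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            v2, c2 = blocks.pop()
            v1, c1 = blocks.pop()
            blocks.append([(v1 * c1 + v2 * c2) / (c1 + c2), c1 + c2])
    return np.concatenate([np.full(c, m) for m, c in blocks])
```

For example, `pava([1, 3, 2, 4])` merges the violating pair (3, 2) into their common mean 2.5, yielding a nondecreasing fit.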
23.11.2010
Alexander Gushchin (Steklov Mathematical Institute)
On superreplication prices in a general dynamic
market model
Abstract: We consider a general dynamic market model,
determined by a family $\mathcal {X}$ of nonnegative càdlàg
adapted processes $X$ on $[0,T]$ with $X_0=1$. It is meant that if
an investor has an initial wealth $x>0$, then the set of her/his
wealth processes corresponding to all admissible strategies is
$x\mathcal {X}$. One of the problems that we discuss is the
following: under which assumptions on $\mathcal {X}$ can the
superreplication price of any nonnegative contingent claim $f$
be calculated as the supremum of expectations of $f$ over all
supermartingale measures, i.e. all measures $Q$ such that $Q$ is
absolutely continuous with respect to the original probability $P$
and every $X \in \mathcal {X}$ is a supermartingale under $Q$. We
also discuss a connection between this problem and the dual problem
in utility maximization.
No prerequisites from mathematical finance are needed.
30.11.2010
Volker Krätschmer (WIAS Berlin)
Central limit theorems for coherent distribution-invariant risk measures
Abstract: During the last decade a new class of statistical functionals, the so-called coherent distribution-invariant
risk measures, has become popular in some applied fields. They are building blocks in quantitative risk management,
and they have been suggested as a systematic approach for calculations of insurance premia.
These functionals are often quite complicated which makes the estimation of their values difficult from a practical point of view.
The canonical approach is to replace the unknown distribution function with its empirical counterpart based on
observed data and then to plug this estimate into the risk measure to obtain its estimate.
In the talk the limit distributions of the resulting plug-in estimators based on
strongly mixing time series will be developed. The investigations are based on a new representation result for
coherent distribution-invariant risk measures which allows one to reduce the considerations to the convergence of
stochastic processes. At the end of the talk it will be discussed in which way the limit distributions might be utilized to construct
asymptotic confidence intervals.
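To illustrate the plug-in approach (my example, not from the talk): Expected Shortfall is a standard coherent distribution-invariant risk measure, and its plug-in estimator simply evaluates the functional at the empirical distribution of the data:

```python
import numpy as np

def expected_shortfall(losses, alpha=0.95):
    """Plug-in estimator of Expected Shortfall at level alpha: the
    unknown distribution is replaced by the empirical one, and the risk
    functional (mean of the worst (1 - alpha)-fraction of losses) is
    evaluated on it."""
    x = np.sort(np.asarray(losses))
    k = int(np.ceil(alpha * len(x)))
    return float(x[k:].mean()) if k < len(x) else float(x[-1])

rng = np.random.default_rng(1)
sample = rng.normal(0.0, 1.0, 100_000)
es = expected_shortfall(sample, 0.95)
# for N(0,1) losses the true ES at level 0.95 is about 2.06
```

The limit distribution of such estimators, for dependent data, is exactly what the talk develops; it is what turns the point estimate `es` into an asymptotic confidence interval.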
07.12.2010
Saskia Becker (WIAS Berlin)
Regularization of statistical inverse problems and the
Bakushinskii veto
Abstract: In the deterministic context Bakushinskii's theorem excludes
the existence of purely data driven convergent regularization for
ill-posed problems. In this talk, I will show that in the statistical
setting we can either construct a counterexample or develop an
equivalent formulation, depending on the considered class of probability
distributions. Hence, Bakushinskii's theorem does not generalize to the
statistical context, although this has often been assumed in the past.
To arrive at this conclusion, I will deduce from the classic theory new
concepts for a general study of statistical inverse problems and perform
a systematic clarification of the key ideas of statistical regularization.
04.01.2011
Mathias Becker (WIAS Berlin)
Exponential moments of self-intersection local times in subcritical dimensions
Abstract:
Fix $p>1$, not necessarily integer, with $p(d-2)<d$. We study the exponential
moments of the self-intersection local times $\|\ell_t\|_p$ of a random walk up to
time $t$, for scale parameters $\theta_t>0$ that are bounded from
above, possibly tending to zero. The speed is identified in terms of mixed
powers of $t$ and $\theta_t$, and the precise rate is characterized in terms
of a variational formula. As a corollary, we obtain a large-deviation principle
for $\|\ell_t\|_p/(t r_t)$ for deviation functions $r_t$ satisfying
$t r_t \gg \mathbb{E}[\|\ell_t\|_p]$.
11.01.2011
Felix Weidemann (HU Berlin)
A generalisation of the Hobson-Rogers model
Abstract: In this talk we consider a stock price model, originally
suggested by D.G. Hobson and L.C.G. Rogers,
in which the price is modelled as an SDE with delay driven by a Brownian motion. The kind of past dependence
in this model will be explained in detail, and methods for option pricing, similar to those
in the Black-Scholes model, will be derived. In a second part we will introduce jumps into the SDDE.
We will study completeness and the set of equivalent martingale measures in this newly created
jump-diffusion model.
12.01.2011 (additional talk, Wednesday! Erhard-Schmidt-Hörsaal, EG!)
Yavor Stoev
Fourier based calibration of a stochastic volatility interest rate model with application to risk management
Abstract: We discuss the application of a Fourier method for nonparametric volatility estimation in a stochastic volatility interest rate model setting. A Heath-Jarrow-Morton type of term-structure evolution with driving volatility of Hobson-Rogers parametric form is introduced, and its parameters are calibrated to the obtained Fourier volatility estimator. Further, by relaxing our model assumptions and using a dynamic approach for scenario generation, we simulate future distributions of risk factors and compute Value-at-Risk (VaR) estimates for the purpose of risk management.
25.01.2011
Ronnie L. Loeffen (WIAS Berlin)
Applications of splitting schemes to financial diffusion models
Abstract: A common problem in quantitative finance is to compute, accurately and
fast, the expectation of a multi-dimensional diffusion process at a
fixed time. In higher dimensions, discretization schemes in combination
with (Quasi) Monte Carlo form a good approach. One such scheme is the
splitting method of Ninomiya-Victoir (NV) which involves solving various
auxiliary ordinary differential equations. Consequently, the NV-scheme
is especially attractive when all underlying ODEs can be solved in
closed-form. By introducing a slight generalization of the NV-scheme,
we show that the class of diffusion models for which all ODE solutions
are explicit, can be significantly increased. We further construct a
particular example for which this applies and show some numerical
results that illustrate the savings in computation time.
Joint work with C. Bayer (University of Vienna) and P. Friz (TU
Berlin/WIAS).
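Since the abstract highlights the case where all auxiliary ODEs are solvable in closed form, a minimal illustration on geometric Brownian motion, where both vector-field flows are explicit exponentials, may help; this toy example and its parameter values are mine, not from the talk:

```python
import numpy as np

def nv_step(x, dt, dw, mu, sigma):
    """One Ninomiya-Victoir step for geometric Brownian motion,
    dX = mu*X dt + sigma*X dW.  In Stratonovich form the vector fields
    are V0(x) = (mu - sigma**2/2) x and V1(x) = sigma x; both ODE flows
    x' = V(x) are explicit exponentials, which is exactly the situation
    in which the NV scheme is attractive."""
    half_drift = np.exp(0.5 * dt * (mu - 0.5 * sigma ** 2))
    x = x * half_drift            # half step along the drift field V0
    x = x * np.exp(sigma * dw)    # full step along the diffusion field V1
    x = x * half_drift            # second half step along V0
    return x

rng = np.random.default_rng(2)
mu, sigma, T, n_steps, n_paths = 0.05, 0.2, 1.0, 10, 200_000
dt = T / n_steps
x = np.full(n_paths, 1.0)
for _ in range(n_steps):
    x = nv_step(x, dt, rng.normal(0.0, np.sqrt(dt), n_paths), mu, sigma)
# for GBM the composed flows are exact, so E[X_T] = exp(mu * T)
# up to Monte Carlo error
```

For GBM the splitting happens to be exact in distribution; for general diffusions the scheme gives high weak order while only requiring ODE solves, which is the saving the talk quantifies.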
08.02.2011
Christian Bayer
Nonlinear Expectation
Abstract: We give an overview of the Kusuoka-Lyons-Victoir method of cubature on
Wiener space and some of its variants (mostly the Ninomiya-Victoir
method) and extensions, for instance to processes with jumps and
stochastic partial differential equations. We also present some
applications in finance.
15.02.2011
Marcel Ladkau (WIAS Berlin)
Nonlinear Expectation
Abstract: In my talk I consider the optimal stopping problem for general
dynamic monetary utility functionals. Sufficient conditions for the Bellman principle and the
existence of optimal stopping times are provided.
As an example for general dynamic monetary utility functionals I focus on
discrete g-expectations as a special case of backward stochastic difference
equations. I present a simple but general model and develop the theory of
backward stochastic difference equations for that particular model.
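In the classical special case of ordinary (linear) expectation, the Bellman principle for optimal stopping reduces to backward induction on the Snell envelope. A standard textbook illustration (my sketch, not the g-expectation setting of the talk) is American option pricing on a binomial tree:

```python
import numpy as np

def american_put_binomial(s0, strike, r, sigma, T, n):
    """Snell-envelope backward induction on a CRR binomial tree: at
    each node the value is max(immediate payoff, discounted
    continuation value) -- the Bellman principle for optimal stopping
    under ordinary (linear) expectation."""
    dt = T / n
    u = np.exp(sigma * np.sqrt(dt))
    d = 1.0 / u
    q = (np.exp(r * dt) - d) / (u - d)     # risk-neutral up-probability
    disc = np.exp(-r * dt)
    s = s0 * u ** np.arange(n, -1, -1) * d ** np.arange(0, n + 1)
    v = np.maximum(strike - s, 0.0)        # payoff at maturity
    for step in range(n, 0, -1):
        s = s0 * u ** np.arange(step - 1, -1, -1) * d ** np.arange(0, step)
        cont = disc * (q * v[:-1] + (1 - q) * v[1:])
        v = np.maximum(strike - s, cont)   # stop vs. continue
    return float(v[0])

price = american_put_binomial(100.0, 100.0, 0.05, 0.2, 1.0, 200)
```

Replacing the linear conditional expectation in `cont` by a nonlinear one (a discrete g-expectation) is precisely the generalization the talk studies.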
08.03.2011
Clayton Scott (University of Michigan, Ann Arbor)
Robust Kernel Density Estimation
Abstract: I will discuss a method for
nonparametric, multivariate density
estimation that exhibits robustness to contamination of the training
sample. This method achieves robustness by combining a traditional
kernel density estimator (KDE) with ideas from robust M-estimation. The
KDE based on a Gaussian kernel is interpreted as a sample mean in the
reproducing kernel Hilbert space associated with the kernel. This mean
is robustly estimated through the use of a robust loss, yielding the
robust kernel density estimator (RKDE). The RKDE can be computed with a
"kernelized" iteratively re-weighted least squares algorithm. The
robustness of the RKDE will be quantified in terms of the influence
function, asymptotics, and experimental results in density estimation
and anomaly detection.
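The kernelized IRWLS idea can be sketched in a few lines of numpy (a sketch of my own; the cutoff heuristic, bandwidth, and toy data are assumptions, not the speaker's exact algorithm):

```python
import numpy as np

def rkde_weights(X, sigma=1.0, n_iter=30):
    """Robust KDE weights via kernelized IRWLS.  The Gaussian KDE is
    the sample mean of the feature maps k(x_i, .) in the RKHS; here
    that mean is replaced by a Huber-type M-estimate, so outlying
    samples receive small weights."""
    n = len(X)
    K = np.exp(-(X[:, None] - X[None, :]) ** 2 / (2 * sigma ** 2))
    w = np.full(n, 1.0 / n)
    for _ in range(n_iter):
        # RKHS distance of each phi(x_i) to the current weighted mean
        dist = np.sqrt(np.maximum(np.diag(K) - 2 * K @ w + w @ K @ w, 0.0))
        c = np.median(dist)              # Huber cutoff (simple heuristic)
        psi = np.minimum(dist, c)        # Huber: psi(d) = min(d, c)
        w = np.where(dist > 0, psi / dist, 1.0)
        w = w / w.sum()                  # normalized IRWLS weights
    return w

rng = np.random.default_rng(3)
# 95 inliers from N(0, 1) plus 5 contaminating points near 10
X = np.concatenate([rng.normal(0.0, 1.0, 95), rng.normal(10.0, 0.1, 5)])
w = rkde_weights(X)
# the robust estimate is then f(x) = sum_i w_i * N(x; x_i, sigma^2)
```

The contaminating points end up with smaller weights than the inliers, which is the robustness property the influence-function analysis makes precise.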
15.03.2011
Mikhail Malyutov (Northeastern University, Boston)
Search for active inputs
Abstract: The search for sparse active inputs and the related estimation problem are among the
most popular topics in recent statistical developments under the title 'compressed sensing'.
The number of papers published in the last 5 years has exceeded a thousand.
My first joint paper on the search for active inputs was published 39 years ago.
Since then I have published several dozen papers and found the limit rate for successful search
under the optimal design of experiments for IID noise, related to the capacity bounds for multiple-access communication channels.
Recently these results were generalized to noise with memory. A crucial role here is played by
simplified homogeneity tests based on universal compression and their exponential tails.
In my talk I'll concentrate mostly on these recent developments.
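The recovery of sparse active inputs can be illustrated by orthogonal matching pursuit, a standard greedy algorithm from the compressed-sensing literature (my illustration; it is not the speaker's capacity-based method):

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily recover a k-sparse vector
    x from y = A x by repeatedly picking the column most correlated
    with the residual and refitting on the chosen support."""
    support, residual = [], y.copy()
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x

rng = np.random.default_rng(4)
A = rng.normal(size=(80, 120)) / np.sqrt(80)   # random sensing matrix
x_true = np.zeros(120)
x_true[[5, 17, 42]] = [1.0, -2.0, 1.5]         # 3 "active inputs"
x_hat = omp(A, A @ x_true, k=3)
```

With 80 random measurements of a 3-sparse signal in 120 dimensions, the greedy search typically identifies the active coordinates exactly.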