WIAS Preprint No. 196, (1995)

On Large Deviation Efficiency in Statistical Inference



Authors

  • Puhalskii, Anatolii
  • Spokoiny, Vladimir
    ORCID: 0000-0002-2040-3427

2010 Mathematics Subject Classification

  • 62G20, 60F10, 62G05, 62G10

Keywords

  • Large deviation principle, statistical experiments, large deviation efficiency, minimax risks, Bahadur efficiency, Chernoff function

Abstract

This paper presents a general approach to statistical problems with criteria based on probabilities of large deviations. The underlying idea, which originates in the similarity between the definitions of the large deviation principle and weak convergence, is to develop a large deviation analogue of asymptotic decision theory. We consider a sequence of statistical experiments over an arbitrary parameter set and introduce for it the concept of the large deviation principle (LDP), which parallels the concept of weak convergence of experiments. Our main result, in analogy with Le Cam's minimax theorem, states that the LDP provides an asymptotic lower bound for the sequence of appropriately defined minimax risks. We then show that the bound is tight and give a method of constructing decisions whose asymptotic risk is arbitrarily close to the bound. The construction is further specified for hypothesis testing and estimation problems. We apply the results to a number of standard statistical models: an i.i.d. sample, regression, the change-point model and others. For each model, we verify the LDP; then, considering first a hypothesis testing problem and next an estimation problem, we calculate asymptotic minimax risks and indicate the corresponding decisions.
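For orientation, the classical large deviation principle for a sequence of probability measures, which the experiment-level notion introduced in the paper parallels, can be stated as follows (this is the standard textbook definition, not the paper's own formulation):

```latex
% Classical LDP: a sequence (P_n) of probability measures on a metric
% space S satisfies the LDP with rate function I and speed n if
\begin{align*}
  \liminf_{n\to\infty} \frac{1}{n}\log P_n(G) &\ge -\inf_{x\in G} I(x)
      && \text{for every open } G \subseteq S,\\
  \limsup_{n\to\infty} \frac{1}{n}\log P_n(F) &\le -\inf_{x\in F} I(x)
      && \text{for every closed } F \subseteq S,
\end{align*}
% where I : S -> [0, +infinity] is lower semicontinuous; I is called a
% good rate function when its level sets {x : I(x) <= c} are compact.
```

The formal parallel with weak convergence (upper bounds on closed sets, lower bounds on open sets, as in the portmanteau theorem) is what motivates transplanting asymptotic decision theory to the large deviation scale.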

Appeared in

  • Bernoulli, 4 (1998) No. 2, pp. 203-272.
