A Minimax Result for the Kullback-Leibler Bayes Risk

It is of basic interest to assess the quality of the decisions of a statistician, based on the data obtained from a statistical experiment, in the context of a given model class P of probability distributions. The statistician picks a particular distribution P, suffering a loss by not picking the 'true' distribution P'. There are several relevant loss functions, one being based on the relative entropy, also known as the Kullback-Leibler information distance. In this paper we prove a general 'minimax risk equals maximin (Bayes) risk' theorem for the Kullback-Leibler loss under the hypothesis of a dominated and compact family of distributions over a Polish observation space with suitably integrable densities. We also show that there is always an optimal Bayes strategy (i.e. a suitable prior) achieving the minimax value. Further, we show that every such minimax optimal strategy leads to the same distribution P in the convex closure of the model class. Finally, we give some examples to illustrate the results and to indicate how the minimax result is reflected in the structure of least favorable priors. This paper is mainly based on parts of the author's doctoral thesis.
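For orientation, the display below sketches the Kullback-Leibler loss and the minimax identity in question. The notation (model class \mathcal{P}, priors \pi) is ours and is not taken from the paper itself; the precise hypotheses are those stated in the abstract above.

```latex
% Kullback-Leibler loss for deciding on P when P' is the true distribution
% (finite only when P' is absolutely continuous with respect to P):
D(P' \,\|\, P) \;=\; \int \log \frac{dP'}{dP} \, dP'

% Minimax risk equals maximin (Bayes) risk: under the paper's hypotheses
% (dominated, compact model class \mathcal{P} over a Polish observation
% space, suitably integrable densities),
\inf_{P} \, \sup_{P' \in \mathcal{P}} D(P' \,\|\, P)
  \;=\; \sup_{\pi} \, \inf_{P} \int_{\mathcal{P}} D(P' \,\|\, P) \, \pi(dP') ,
% where \pi ranges over the priors on \mathcal{P}, and the supremum is
% attained by a least favorable prior.
```

A standard decomposition (again not quoted from the paper) indicates why the optimal decision lies in the convex closure of the model class: for any fixed prior \pi, the Bayes risk satisfies \int D(P' \| P) \, \pi(dP') = \int D(P' \| P_\pi) \, \pi(dP') + D(P_\pi \| P), where P_\pi = \int P' \, \pi(dP') is the mixture distribution, so the inner infimum is attained exactly at P = P_\pi.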

Metadata
Author: Jürgen Krob, Holger Scholl
URN: urn:nbn:de:hbz:386-kluedo-7315
Document Type: Preprint
Language of Publication: English
Year of Completion: 1997
Year of First Publication: 1997
Publishing Institution: Technische Universität Kaiserslautern
Date of Publication (Server): 2000/04/03
Faculties / Organisational Entities: Kaiserslautern - Fachbereich Mathematik
DDC Classification: 5 Natural Sciences and Mathematics / 510 Mathematics
Licence (German): Standard according to the KLUEDO guidelines in force before 27.05.2011