A Minimax Result for the Kullback-Leibler Bayes Risk

It is of basic interest to assess the quality of a statistician's decisions, based on the data produced by a statistical experiment, in the context of a given model class P of probability distributions. The statistician picks a particular distribution P, suffering a loss whenever P is not the 'true' distribution P'. Several loss functions are relevant here, one of them based on the relative entropy, or Kullback-Leibler information distance. In this paper we prove a general 'minimax risk equals maximin (Bayes) risk' theorem for the Kullback-Leibler loss, under the hypothesis of a dominated and compact family of distributions over a Polish observation space with suitably integrable densities. We also show that there is always an optimal Bayes strategy (i.e. a suitable prior) achieving the minimax value. Further, every such minimax optimal strategy leads to the same distribution P in the convex closure of the model class. Finally, we give some examples to illustrate the results and to indicate how the minimax result is reflected in the structure of least favorable priors. This paper is mainly based on parts of this author's doctoral thesis.
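For orientation, a schematic version of the quantities involved, in notation chosen here rather than taken from the paper ($p$, $p'$ denote densities with respect to a dominating measure $\mu$, $\delta$ a decision rule mapping observations $X$ to distributions, and $\pi$ a prior on the model class $\mathcal{P}$): the Kullback-Leibler loss and the claimed minimax identity read

\[
D(P' \,\|\, P) = \int p' \log\frac{p'}{p}\, d\mu,
\qquad
\inf_{\delta}\,\sup_{P' \in \mathcal{P}} \mathbb{E}_{P'}\, D\bigl(P' \,\|\, \delta(X)\bigr)
\;=\; \sup_{\pi}\,\inf_{\delta} \int \mathbb{E}_{P'}\, D\bigl(P' \,\|\, \delta(X)\bigr)\, \pi(dP').
\]

The left-hand side is the minimax risk, the right-hand side the maximin (Bayes) risk; the paper's hypotheses (dominated and compact family, Polish observation space, suitably integrable densities) are what make the two values coincide. A known fact consistent with the uniqueness statement above: for the Kullback-Leibler loss, the expected loss $\int D(P' \,\|\, Q)\, \pi(dP')$ is minimized in $Q$ by the mixture $\int P'\, \pi(dP')$, which always lies in the convex closure of the model class.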

Metadata

Author information: Jürgen Krob, Holger Scholl
URN: urn:nbn:de:hbz:386-kluedo-7315
Document type: Preprint
Language of publication: English
Year of completion: 1997
Year of first publication: 1997
Publishing institution: Technische Universität Kaiserslautern
Date of publication (server): 03.04.2000
Departments / organizational units: Kaiserslautern - Fachbereich Mathematik
DDC subject groups: 5 Natural sciences and mathematics / 510 Mathematics
License (German): Standard according to the KLUEDO guidelines before 27.05.2011