Bootstrapping neural networks

  • Knowledge about the distribution of a statistical estimator is important for various purposes, for example the construction of confidence intervals for model parameters or the determination of critical values of tests. A widely used method for estimating this distribution is the so-called bootstrap, which imitates the probabilistic structure of the data-generating process on the basis of the information provided by a given set of random observations. In this paper we investigate this classical method in the context of artificial neural networks used for estimating a mapping from input to output space. We establish consistency results for bootstrap estimates of the distribution of parameter estimates.
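
  The abstract describes the bootstrap idea only in outline. The following Python sketch illustrates one common variant of it: a residual bootstrap for a one-hidden-layer regression network, with a percentile confidence interval for the fitted value at a fixed input point. The simulated data, the network architecture, the resampling scheme and the chosen statistic are illustrative assumptions, not the construction analysed in the paper.

    # Minimal residual-bootstrap sketch for a feed-forward regression network.
    # Illustrative assumptions throughout; not the paper's exact setup.
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)

    # Simulated observations from an unknown input-output mapping plus noise.
    X = rng.uniform(-2, 2, size=(200, 1))
    y = np.sin(2 * X[:, 0]) + 0.2 * rng.standard_normal(200)

    def fit_net(X, y):
        # One hidden layer with 3 units; architecture is an assumption.
        return MLPRegressor(hidden_layer_sizes=(3,), max_iter=2000,
                            random_state=0).fit(X, y)

    # 1. Fit the network to the observed data and compute centred residuals.
    net_hat = fit_net(X, y)
    fitted = net_hat.predict(X)
    resid = y - fitted
    resid = resid - resid.mean()

    # 2. Imitate the data-generating process: resample residuals, build
    #    pseudo-responses, refit, and record the statistic of interest
    #    (here: the fitted regression value at a fixed input x0).
    x0 = np.array([[0.5]])
    B = 100
    boot_stats = np.empty(B)
    for b in range(B):
        y_star = fitted + rng.choice(resid, size=resid.size, replace=True)
        boot_stats[b] = fit_net(X, y_star).predict(x0)[0]

    # 3. Percentile bootstrap confidence interval for the value at x0.
    lo, hi = np.percentile(boot_stats, [2.5, 97.5])
    print(f"estimate at x0: {net_hat.predict(x0)[0]:.3f}, "
          f"95% bootstrap CI: [{lo:.3f}, {hi:.3f}]")

  The same loop could collect network weights instead of fitted values, which is closer to the parameter-distribution question treated in the paper, but weight vectors of neural networks are only identified up to symmetries, so a functional of the fit is used here for the illustration.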

Metadata
Authors: Jürgen Franke, Michael Neumann
URN (permanent link): urn:nbn:de:hbz:386-kluedo-4804
Series (series number): Report in Wirtschaftsmathematik (WIMA Report) (38)
Document Type: Preprint
Language of publication: English
Year of Completion: 1998
Year of Publication: 1998
Publishing Institute: Technische Universität Kaiserslautern
Faculties / Organisational entities: Fachbereich Mathematik
DDC Classification: 510 Mathematics
