Bootstrapping neural networks

Knowledge about the distribution of a statistical estimator is important for various purposes, for example the construction of confidence intervals for model parameters or the determination of critical values of tests. A widely used method for estimating this distribution is the bootstrap, which imitates the probabilistic structure of the data-generating process on the basis of the information provided by a given set of random observations. In this paper we investigate this classical method in the context of artificial neural networks used for estimating a mapping from input to output space. We establish consistency results for bootstrap estimates of the distribution of parameter estimates.
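To illustrate the mechanics described in the abstract (resampling the observations to imitate the data-generating process, refitting, and reading off the distribution of the parameter estimates), here is a minimal sketch in Python. It is not the authors' construction or theory: the pairs-bootstrap scheme, the tiny one-hidden-layer tanh network, and all function names, sizes, and settings are illustrative assumptions made for this example only.

import numpy as np

rng = np.random.default_rng(0)


def fit_network(x, y, n_hidden=3, lr=0.05, n_iter=2000, seed=0):
    # Fit y ~ f(x) with a one-hidden-layer tanh network by full-batch
    # gradient descent on 0.5 * mean squared error; returns the flattened
    # parameter vector. Using the same seed for every call keeps all
    # bootstrap refits at the same initialisation.
    r = np.random.default_rng(seed)
    w1 = r.normal(scale=0.5, size=(1, n_hidden))   # input-to-hidden weights
    b1 = np.zeros(n_hidden)                        # hidden biases
    w2 = r.normal(scale=0.5, size=n_hidden)        # hidden-to-output weights
    b2 = 0.0                                       # output bias
    X = x.reshape(-1, 1)
    for _ in range(n_iter):
        h = np.tanh(X @ w1 + b1)                   # hidden activations, shape (n, n_hidden)
        err = h @ w2 + b2 - y                      # residuals
        g2 = h.T @ err / len(y)                    # gradient w.r.t. w2
        gb2 = err.mean()                           # gradient w.r.t. b2
        dh = np.outer(err, w2) * (1.0 - h ** 2)    # back-propagated signal at hidden layer
        g1 = X.T @ dh / len(y)                     # gradient w.r.t. w1
        gb1 = dh.mean(axis=0)                      # gradient w.r.t. b1
        w1 -= lr * g1
        b1 -= lr * gb1
        w2 -= lr * g2
        b2 -= lr * gb2
    return np.concatenate([w1.ravel(), b1, w2, [b2]])


# Simulated data: noisy observations of a smooth input-output mapping.
n = 200
x = rng.uniform(-2.0, 2.0, size=n)
y = np.sin(1.5 * x) + rng.normal(scale=0.2, size=n)

theta_hat = fit_network(x, y)

# Pairs bootstrap: resample observation pairs (x_i, y_i) with replacement,
# refit the network, and collect the parameter estimates to approximate
# their sampling distribution.
B = 200
boot = np.empty((B, theta_hat.size))
for b in range(B):
    idx = rng.integers(0, n, size=n)
    boot[b] = fit_network(x[idx], y[idx])

# Percentile confidence intervals for each network parameter.
lower, upper = np.percentile(boot, [2.5, 97.5], axis=0)
for j in range(theta_hat.size):
    print(f"theta[{j}]: estimate {theta_hat[j]: .3f}, "
          f"95% CI [{lower[j]: .3f}, {upper[j]: .3f}]")

Refitting every resample from a common initialisation is only a crude way of keeping the bootstrap fits near the same local optimum of the non-convex loss; whether such bootstrap distributions are consistent for the distribution of the parameter estimator is the question the paper addresses.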

Metadata
Author: Jürgen Franke, Michael Neumann
URN: urn:nbn:de:hbz:386-kluedo-4804
Series (Serial Number): Report in Wirtschaftsmathematik (WIMA Report) (38)
Document Type: Preprint
Language of publication: English
Year of Completion: 1998
Year of first Publication: 1998
Publishing Institution: Technische Universität Kaiserslautern
Date of the Publication (Server): 2000/04/03
Faculties / Organisational entities: Kaiserslautern - Department of Mathematics
DDC-Classification: 5 Natural sciences and mathematics / 510 Mathematics
Licence (German): Standard according to the KLUEDO guidelines in force before 27.05.2011