Model Selection and Model Averaging for Neural Networks

Herbert Lee

Abstract:

Neural networks are a useful statistical tool for nonparametric regression. In this thesis, I develop a methodology for nonparametric regression within the Bayesian framework. I address the problems of model selection and model averaging, including estimation of normalizing constants and searching of the model space over both the number of hidden nodes in the network and the subset of explanatory variables. I demonstrate how to use a noninformative prior for a neural network, which is useful because the parameters of a neural network are difficult to interpret. I also prove the asymptotic consistency of the posterior for neural networks.

Keywords: Nonparametric regression, Bayesian statistics, Noninformative prior, Asymptotic consistency, Normalizing constants, Bayesian random searching, BARS
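The model-averaging idea in the abstract can be illustrated with a minimal sketch: fit candidate one-hidden-layer networks with different numbers of hidden nodes, approximate each model's posterior probability (here via a BIC approximation to the marginal likelihood, a common stand-in for the normalizing-constant estimation discussed in the thesis; the data, network sizes, and training loop below are illustrative assumptions, not the thesis's method), and average predictions with those weights.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data (illustrative, not from the thesis)
X = np.linspace(-2, 2, 80).reshape(-1, 1)
y = np.sin(2 * X[:, 0]) + 0.1 * rng.standard_normal(80)

def fit_nn(X, y, k, steps=2000, lr=0.05):
    """Train a one-hidden-layer tanh network with k hidden nodes
    by plain gradient descent; return (BIC, prediction function)."""
    n = X.shape[0]
    W1 = 0.5 * rng.standard_normal((1, k)); b1 = np.zeros(k)
    W2 = 0.5 * rng.standard_normal(k);      b2 = 0.0
    for _ in range(steps):
        H = np.tanh(X @ W1 + b1)            # hidden activations, (n, k)
        err = H @ W2 + b2 - y               # residuals, (n,)
        gW2 = H.T @ err / n; gb2 = err.mean()
        dH = np.outer(err, W2) * (1 - H**2)  # backprop through tanh
        gW1 = X.T @ dH / n;  gb1 = dH.mean(axis=0)
        W2 -= lr * gW2; b2 -= lr * gb2
        W1 -= lr * gW1; b1 -= lr * gb1
    rss = np.sum((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2)
    p = 3 * k + 1                            # parameters: W1, b1, W2 (k each) + b2
    bic = n * np.log(rss / n) + p * np.log(n)
    predict = lambda Xn: np.tanh(Xn @ W1 + b1) @ W2 + b2
    return bic, predict

# Candidate models: 1, 2, or 3 hidden nodes
models = [fit_nn(X, y, k) for k in (1, 2, 3)]
bics = np.array([bic for bic, _ in models])

# BIC-based approximate posterior model weights
w = np.exp(-0.5 * (bics - bics.min()))
w /= w.sum()

# Model-averaged prediction at new inputs
Xnew = np.array([[0.0], [1.0]])
y_avg = sum(wi * predict(Xnew) for wi, (_, predict) in zip(w, models))
```

The weights concentrate on the hidden-node count best supported by the data, while averaging hedges against choosing a single wrong architecture.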
