
Journal Article

Citation

Morgan PH. Expert Syst. 2008; 25(4): 394-413.

Copyright

(Copyright © 2008, John Wiley and Sons)

DOI

10.1111/j.1468-0394.2008.00466.x

PMID

unavailable

Abstract

The aim of this work is to avoid overfitting by seeking parsimonious neural network models and hence to provide better out-of-sample predictions. The resulting sparse networks are easier to interpret as simple rules which, in turn, could give greater insight into the structure of the data. Fully connected feedforward neural networks are pruned through optimization of an estimated Schwarz model selection criterion using differential evolution to produce a sparse network. A quantity, α, which indicates how close a parameter is to zero, is used to estimate the number of model parameters which are being pruned out. The value of α is incorporated into a function of the Schwarz information criterion to form an objective function whose maxima, as α tends to zero, define parsimonious neural network models for a given data set. Since there is a multiplicity of maxima, differential evolution, with its greater capacity for global optimization, is used to optimize this objective function. The value of α is progressively reduced during the evolution of the population of models in the manner of a sequential unconstrained optimization technique. The method is illustrated by results on four sets of data.
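The sketch below illustrates the idea described in the abstract, not the paper's exact method: differential evolution searches over the weights of a small feedforward network, each candidate is scored by a Schwarz (BIC-style) criterion, and the parameter count in that criterion is estimated smoothly from α, which is annealed toward zero so that near-zero weights stop counting and can be pruned. The soft count k(α) = Σ w²/(w² + α²), the annealing schedule, the DE/rand/1/bin variant, the network size, and the final thresholding step are all illustrative assumptions.

```python
# Hypothetical sketch: DE-based pruning of a small feedforward network
# scored by a BIC-style (Schwarz) objective with an alpha-smoothed
# parameter count. Details are assumptions, not the paper's formulation.
import numpy as np

rng = np.random.default_rng(0)

def predict(w, X, n_hidden=3):
    """Tiny one-hidden-layer tanh network; w is a flat parameter vector."""
    d = X.shape[1]
    W1 = w[: d * n_hidden].reshape(d, n_hidden)
    b1 = w[d * n_hidden : d * n_hidden + n_hidden]
    W2 = w[d * n_hidden + n_hidden : d * n_hidden + 2 * n_hidden]
    b2 = w[-1]
    return np.tanh(X @ W1 + b1) @ W2 + b2

def soft_count(w, alpha):
    """Smooth estimate of the effective number of non-zero parameters."""
    return np.sum(w**2 / (w**2 + alpha**2))

def neg_bic(w, X, y, alpha):
    """Objective to maximize: negative Schwarz criterion with soft k(alpha)."""
    n = len(y)
    rss = np.sum((predict(w, X) - y) ** 2)
    k = soft_count(w, alpha)
    return -(n * np.log(rss / n + 1e-12) + k * np.log(n))

def de_prune(X, y, n_hidden=3, pop=40, gens=300, F=0.6, CR=0.9):
    dim = X.shape[1] * n_hidden + 2 * n_hidden + 1
    P = rng.normal(0.0, 1.0, (pop, dim))
    alpha = 1.0
    for _ in range(gens):
        alpha = max(alpha * 0.98, 1e-3)            # anneal alpha toward zero
        fit = np.array([neg_bic(w, X, y, alpha) for w in P])
        for i in range(pop):
            idxs = rng.choice(np.delete(np.arange(pop), i), 3, replace=False)
            a, b, c = P[idxs]
            mutant = a + F * (b - c)               # DE/rand/1 mutation
            cross = rng.random(dim) < CR           # binomial crossover
            trial = np.where(cross, mutant, P[i])
            if neg_bic(trial, X, y, alpha) > fit[i]:
                P[i] = trial                       # greedy selection
    best = P[np.argmax([neg_bic(w, X, y, alpha) for w in P])]
    best[np.abs(best) < alpha] = 0.0               # prune near-zero weights
    return best

# Toy usage: y depends only on the first input, so a sparse net suffices.
X = rng.normal(size=(200, 4))
y = np.sin(X[:, 0])
w = de_prune(X, y)
print(f"non-zero weights: {np.count_nonzero(w)} of {w.size}")
```

Because the fitness landscape has many maxima, the population-based global search of differential evolution is doing the work here that a gradient method could not be trusted to do on its own; annealing α inside the DE loop mirrors the sequential unconstrained optimization strategy the abstract describes.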
