
Details

Author(s) / Contributors
Title
Neural-network design for small training sets of high dimension
Is part of
  • IEEE transactions on neural networks, 1998-03, Vol.9 (2), p.266-280
Place / Publisher
New York, NY: IEEE
Year of publication
1998
Source
IEEE/IET Electronic Library
Descriptions / Notes
  • We introduce a statistically based methodology for the design of neural networks when the dimension d of the network input is comparable to the size n of the training set. If one proceeds straightforwardly, then one is committed to a network of complexity exceeding n. The result will be good performance on the training set but poor generalization performance when the network is presented with new data. To avoid this we need to carefully select the network architecture, including control over the input variables. Our approach to selecting a network architecture first selects a subset of input variables (features) using the nonparametric statistical process of difference-based variance estimation, and then selects a simple network architecture using projection pursuit regression (PPR) ideas combined with the statistical idea of sliced inverse regression (SIR). The resulting network, which is then retrained without regard to the PPR/SIR-determined parameters, is one of moderate complexity (number of parameters significantly less than n) whose performance on the training set can be expected to generalize well. The application of this methodology is illustrated in detail in the context of short-term forecasting of the demand for electric power from an electric utility.

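The abstract above outlines a dimension-reduction-then-retrain pipeline. The following is a minimal, illustrative Python sketch of that idea only: it uses sliced inverse regression (SIR) to find a few effective directions and then fits a deliberately small network on the projected inputs. The difference-based variance estimation and PPR steps of the paper are omitted, and the data, slice count, and network size are assumptions for demonstration, not the authors' settings.

```python
# Minimal sketch (not the paper's exact procedure): sliced inverse regression
# to find a few effective directions, then a small MLP retrained on the
# projected inputs so the parameter count stays well below n.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Synthetic "n comparable to d" regression problem (illustrative only).
n, d = 200, 50
X = rng.standard_normal((n, d))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.1 * rng.standard_normal(n)

def sir_directions(X, y, n_slices=10, n_components=2):
    """Return the top SIR direction vectors (columns) in the original X coordinates."""
    n, d = X.shape
    # Whiten X using the (ridge-stabilized) sample covariance.
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False) + 1e-6 * np.eye(d)
    L = np.linalg.cholesky(np.linalg.inv(cov))
    Z = Xc @ L
    # Slice the response into roughly equal-sized groups.
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)
    # Weighted covariance of the slice means of the whitened inputs.
    M = np.zeros((d, d))
    for idx in slices:
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    # Leading eigenvectors of M are the effective directions in whitened space;
    # map them back to the original coordinates via L.
    eigvals, eigvecs = np.linalg.eigh(M)
    top = eigvecs[:, np.argsort(eigvals)[::-1][:n_components]]
    return L @ top

B = sir_directions(X, y, n_slices=10, n_components=2)
X_low = X @ B  # project the d-dimensional inputs onto a few SIR directions

# Retrain a small network on the reduced inputs, ignoring the SIR parameters
# themselves (analogous to the paper's retraining step).
net = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
net.fit(X_low, y)
print("training R^2:", round(net.score(X_low, y), 3))
```

Holding the hidden layer small after projection is what keeps the total number of trained weights far below n, which is the generalization argument the abstract makes.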