We present a result applicable to classification learning algorithms that generate decision trees or rules using the information entropy minimization heuristic for discretizing continuous-valued attributes. The result serves to give a better understanding of the entropy measure, to point out that the behavior of the information entropy heuristic possesses desirable properties that justify its usage in a formal sense, and to improve the efficiency of evaluating continuous-valued attributes for cut value selection. Along with the formal proof, we present empirical results that demonstrate the theoretically expected reduction in evaluation effort for training data sets from real-world domains.
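As a concrete illustration of the heuristic the abstract refers to, the following Python sketch selects a binary cut value for a continuous-valued attribute by minimizing the weighted class-information entropy of the induced partition, and it evaluates only midpoints between adjacent examples of different classes (the boundary points whose optimality is the paper's central result). The function names and the simplified treatment of tied attribute values are illustrative assumptions, not part of this record.

```python
import math
from collections import Counter

def class_entropy(labels):
    """Shannon entropy (in bits) of a collection of class labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def best_cut(values, labels):
    """Return (entropy, cut) minimizing the weighted class entropy of the
    binary partition induced by 'attribute <= cut'.

    Following the paper's result, only boundary points (midpoints between
    adjacent examples of different classes) are evaluated; subtleties around
    tied attribute values are ignored in this sketch.
    """
    pairs = sorted(zip(values, labels), key=lambda p: p[0])
    n = len(pairs)
    best = (float("inf"), None)
    for i in range(1, n):
        # Skip non-boundary candidates: same class on both sides, or no room
        # to cut because the adjacent attribute values are equal.
        if pairs[i - 1][1] == pairs[i][1] or pairs[i - 1][0] == pairs[i][0]:
            continue
        left = [lab for _, lab in pairs[:i]]
        right = [lab for _, lab in pairs[i:]]
        # Weighted average class entropy of the two induced subsets.
        e = (len(left) * class_entropy(left) + len(right) * class_entropy(right)) / n
        cut = (pairs[i - 1][0] + pairs[i][0]) / 2
        best = min(best, (e, cut))
    return best
```

Restricting the candidate set to boundary points is what yields the reduction in evaluation effort mentioned in the abstract: same-class runs of sorted examples contribute no candidates, so far fewer partitions need their entropy computed.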
Language
English
Identifiers
ISSN: 0885-6125
eISSN: 1573-0565
DOI: 10.1007/BF00994007
Title ID: cdi_proquest_miscellaneous_25775711