Machine learning, 2015-01, Vol.98 (1-2), p.157-180

Details

Author(s) / Contributors
Title
Unsupervised feature selection with ensemble learning
Is part of
  • Machine learning, 2015-01, Vol.98 (1-2), p.157-180
Place / Publisher
New York: Springer US
Year of publication
2015
Source
SpringerLink (Online service)
Descriptions/Notes
  • In this paper, we show that the way internal estimates are used to measure variable importance in Random Forests is also applicable to feature selection in unsupervised learning. We propose a new method called Random Cluster Ensemble (RCE for short), which estimates the out-of-bag feature importance from an ensemble of partitions. Each partition is constructed using a different bootstrap sample and a random subset of the features. We provide empirical results on nineteen benchmark data sets indicating that RCE, boosted with a recursive feature elimination scheme (RFE) (Guyon and Elisseeff, Journal of Machine Learning Research, 3:1157–1182, 2003), can lead to significant improvement in terms of clustering accuracy, over several state-of-the-art supervised and unsupervised algorithms, with a very limited subset of features. The method shows promise for dealing with very large domains. All results, datasets and algorithms are available online ( http://perso.univ-lyon1.fr/haytham.elghazel/RCE.zip ).
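The abstract's core idea (cluster bootstrap samples on random feature subsets, then score features by how much out-of-bag assignments change under permutation) can be sketched as follows. This is an illustrative approximation, not the authors' reference implementation (that is available at the URL above): the choice of plain k-means, k=2, and the permutation-based stability score are simplifying assumptions.

```python
import random

def nearest(p, centroids):
    """Index of the centroid closest to point p (squared Euclidean distance)."""
    return min(range(len(centroids)),
               key=lambda j: sum((a - b) ** 2 for a, b in zip(p, centroids[j])))

def kmeans(points, k=2, iters=10, rng=random):
    """Plain k-means on a list of feature vectors; returns the centroids."""
    centroids = [list(p) for p in rng.sample(points, k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            clusters[nearest(p, centroids)].append(p)
        for j, c in enumerate(clusters):
            if c:  # keep the old centroid if a cluster empties out
                centroids[j] = [sum(col) / len(c) for col in zip(*c)]
    return centroids

def rce_importance(X, n_estimators=30, subset_size=2, seed=0):
    """Accumulate a permutation-based out-of-bag importance score per feature."""
    rng = random.Random(seed)
    d = len(X[0])
    importance = [0.0] * d
    for _ in range(n_estimators):
        # Each ensemble member sees a bootstrap sample and a random feature subset.
        boot = [rng.randrange(len(X)) for _ in range(len(X))]
        oob = [i for i in range(len(X)) if i not in set(boot)]
        feats = rng.sample(range(d), subset_size)
        proj = lambda p: [p[f] for f in feats]
        cents = kmeans([proj(X[i]) for i in boot], rng=rng)
        if not oob:
            continue
        base = [nearest(proj(X[i]), cents) for i in oob]
        for pos, f in enumerate(feats):
            # Permute feature f among OOB points; assignments that flip
            # indicate the clustering structure relies on that feature.
            perm = [X[i][f] for i in oob]
            rng.shuffle(perm)
            changed = 0
            for row, i in enumerate(oob):
                q = proj(X[i])
                q[pos] = perm[row]
                if nearest(q, cents) != base[row]:
                    changed += 1
            importance[f] += changed / len(oob)
    return importance
```

On toy data with one cluster-separating feature and one noise feature, the separating feature should accumulate the larger score, which is the signal the abstract's RFE step would then use to discard weak features.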
Language
English
Identifiers
ISSN: 0885-6125
eISSN: 1573-0565
DOI: 10.1007/s10994-013-5337-8
Title ID: cdi_hal_primary_oai_HAL_hal_01339161v1
