
Details

Author(s) / Contributors
Title
Evaluating prediction systems in software project estimation
Is part of
  • Information and software technology, 2012-08, Vol.54 (8), p.820-827
Place / Publisher
Amsterdam: Elsevier B.V
Year of publication
2012
Source
ScienceDirect Journals (5 years ago - present)
Descriptions/Notes
  • Context: Software engineering has a problem in that when we empirically evaluate competing prediction systems we obtain conflicting results. Objective: To reduce the inconsistency amongst validation study results and provide a more formal foundation to interpret results, with a particular focus on continuous prediction systems. Method: A new framework is proposed for evaluating competing prediction systems based upon (1) an unbiased statistic, Standardised Accuracy, (2) testing the result likelihood relative to the baseline technique of random ‘predictions’, that is guessing, and (3) calculation of effect sizes. Results: Previously published empirical evaluations of prediction systems are re-examined and the original conclusions shown to be unsafe. Additionally, even the strongest results are shown to have no more than a medium effect size relative to random guessing. Conclusion: Biased accuracy statistics such as MMRE are deprecated. By contrast, this new empirical validation framework leads to meaningful results. Such steps will assist in performing future meta-analyses and in providing more robust and usable recommendations to practitioners.
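The framework summarised above can be sketched in Python. This is a minimal illustration, not the paper's reference implementation: the function names are assumptions, and it follows the general description of Standardised Accuracy as the percentage improvement in mean absolute residual (MAR) over a baseline of random guessing, where each case is "predicted" by the actual value of another randomly drawn case.

```python
import random
import statistics

def mar(predicted, actual):
    """Mean absolute residual of a set of predictions."""
    return statistics.mean(abs(p - a) for p, a in zip(predicted, actual))

def mar_p0(actual, runs=1000, seed=0):
    """Baseline MAR of random guessing: each case is 'predicted' by the
    actual value of a different, randomly chosen case, averaged over
    many runs to stabilise the estimate."""
    rng = random.Random(seed)
    run_means = []
    for _ in range(runs):
        residuals = []
        for i, a in enumerate(actual):
            others = actual[:i] + actual[i + 1:]
            residuals.append(abs(rng.choice(others) - a))
        run_means.append(statistics.mean(residuals))
    return statistics.mean(run_means)

def standardised_accuracy(predicted, actual):
    """SA = (1 - MAR / MAR_P0) * 100. Higher is better; a value near 0
    means the prediction system is no better than random guessing."""
    return (1 - mar(predicted, actual) / mar_p0(actual)) * 100
```

A perfect prediction system (predicted equals actual) yields SA of 100, while guessing-level predictions yield SA near 0, which is what lets SA serve as an unbiased, interpretable alternative to MMRE.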
Language
English
Identifiers
ISSN: 0950-5849
eISSN: 1873-6025
DOI: 10.1016/j.infsof.2011.12.008
Title ID: cdi_proquest_miscellaneous_1031306036
