
Details

Author(s) / Contributors
Title
Examining the Precision of Cut Scores Within a Generalizability Theory Framework: A Closer Look at the Item Effect
Is part of
  • Journal of Educational Measurement, 2020-06, Vol. 57 (2), pp. 216-229
Place / Publisher
Madison: Wiley-Blackwell
Year of publication
2020
Link to full text
Source
Applied Social Sciences Index & Abstracts (ASSIA)
Descriptions/Notes
  • An Angoff standard setting study generally yields judgments on a number of items by a number of judges (who may or may not be nested in panels). Variability associated with judges (and possibly panels) contributes error to the resulting cut score. The variability associated with items plays a more complicated role. To the extent that the mean item judgments directly reflect empirical item difficulties, the variability in Angoff judgments over items would not add error to the cut score, but to the extent that the mean item judgments do not correspond to the empirical item difficulties, variability in mean judgments over items would add error to the cut score. In this article, we present two generalizability‐theory–based analyses of the proportion of the item variance that contributes to error in the cut score. For one approach, variance components are estimated on the probability (or proportion‐correct) scale of the Angoff judgments, and for the other, the judgments are transformed to the theta scale of an item response theory model before estimating the variance components. The two analyses yield somewhat different results but both indicate that it is not appropriate to simply ignore the item variance component in estimating the error variance.
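As a rough illustration of the decomposition described in the abstract, the following LaTeX sketch writes the error variance of an Angoff cut score for a fully crossed judges × items design; the notation, and in particular the proportion lambda, are illustrative assumptions rather than the article's own formulation.

  % Illustrative sketch (not the article's notation): error variance of the
  % cut score \hat{c}, estimated as the mean Angoff judgment over n_j judges
  % and n_i items. \sigma^2_j, \sigma^2_i and \sigma^2_{ji,e} are the judge,
  % item and residual variance components from the generalizability analysis;
  % \lambda is the (assumed) proportion of the item variance that does not
  % reflect empirical item difficulty and therefore acts as error.
  \[
    \sigma^2_{\mathrm{error}}(\hat{c})
      = \frac{\sigma^2_{j}}{n_j}
      + \lambda \, \frac{\sigma^2_{i}}{n_i}
      + \frac{\sigma^2_{ji,e}}{n_j \, n_i},
    \qquad 0 \le \lambda \le 1 .
  \]
  % \lambda = 0 reproduces the common practice of ignoring the item component;
  % \lambda = 1 treats all item variance as error. The article's two analyses
  % (on the probability scale and on the theta scale) amount to estimating
  % what proportion of the item variance contributes to error.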
