Random Item Response Model Approaches to Evaluating Item Format Effects
Abstract
The PISA 2006 science assessment comprises open-response, multiple-choice, and constructed multiple-choice items. The current study introduced random item response models to investigate item format effects on item difficulties; these models include the linear logistic test model with random item effects (the LLTM-R) and the hierarchical item response model (the hierarchical IRM). These models were applied to the PISA 2006 science data set to explore the relationship between item format and item difficulty. The empirical results first show that the LLTM-R and the hierarchical IRM provide item difficulty estimates equivalent to those from the Rasch model and the LLTM, and also clearly show that item difficulties are substantially affected by item format. This result implies that item difficulties may differ depending on item format even when the items deal with the same content.
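The LLTM-R structure described in the abstract can be illustrated with a small simulation: each item's Rasch difficulty is decomposed into a fixed effect of its format plus a random item residual. This is a minimal sketch, not the authors' analysis; the format labels, effect sizes, and variance components below are illustrative assumptions, not estimates from the PISA 2006 data.

```python
import numpy as np

rng = np.random.default_rng(0)

n_items, n_persons = 30, 500

# Assign each item one of three formats (0 = open-response,
# 1 = multiple-choice, 2 = constructed multiple-choice).
formats = rng.integers(0, 3, size=n_items)
Q = np.eye(3)[formats]  # item-by-format design matrix (dummy coding)

# LLTM-R: item difficulty = fixed format effect + random item effect.
# The effect values and residual SD are purely illustrative.
format_effects = np.array([0.8, -0.4, 0.2])
epsilon = rng.normal(0.0, 0.3, size=n_items)  # random item residual
beta = Q @ format_effects + epsilon           # item difficulties

theta = rng.normal(0.0, 1.0, size=n_persons)  # person abilities

# Rasch model: P(X=1) = logistic(theta - beta); simulate responses.
p = 1.0 / (1.0 + np.exp(-(theta[:, None] - beta[None, :])))
X = (rng.random((n_persons, n_items)) < p).astype(int)

# Mean proportion correct by item format: formats with lower fixed
# effects (easier items) should tend to show higher means.
mean_by_format = [X[:, formats == f].mean() for f in range(3)]
print(mean_by_format)
```

In practice such models are typically fit with cross-classified mixed logistic regression (persons and items both random), which is where the LLTM-R differs from the standard LLTM: the random item residual absorbs difficulty variation that the format predictors leave unexplained.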
Full Text: PDF
DOI: https://doi.org/10.5296/jse.v8i3.13387
Copyright (c) 2018 Journal of Studies in Education
Journal of Studies in Education ISSN 2162-6952
Email: jse@macrothink.org
Copyright © Macrothink Institute