Equivalence of Reading and Listening Comprehension Across Test Media
Whether an ability test delivered on paper or on computer provides the same information is an important question in applied psychometrics. Not only the validity but also the fairness of a measure is at stake if the test medium affects performance. This study provides a comprehensive review of existing equivalence research on reading and listening comprehension in English as a foreign language and identifies factors that are likely to affect equivalence. Taking these factors into account, comprehension measures were developed and administered to N = 442 high school students. Using multigroup confirmatory factor analysis, both the reading and the listening comprehension measures were shown to be measurement invariant across test media. Nevertheless, it is argued that the equivalence of data gathered on paper and on computer depends on the specific measure or construct, the participants or recruitment mechanisms, and the software and hardware realizations. Equivalence research is therefore required for each specific instantiation unless generalizable knowledge about the factors affecting equivalence is available. Multigroup confirmatory factor analysis is an appropriate and effective tool for assessing the comparability of test scores across test media.