Impact of Test Item Precision on Student Performance and Item Performance

thesis
posted on 2025-12-02, 10:50 authored by Muhammad Afzal
<p dir="ltr">At the beginning of 2020, many nations around the globe decided to move schooling partially or entirely online and to close schools in order to limit the spread of coronavirus (COVID-19). Pakistan was no exception. This major shift in the schooling paradigm brought substantial changes in teaching, learning, and evaluation practices at the school level. The phenomenon opened doors to many inquiries and explorations, including access to internet services, the well-being of young students, and the usefulness of online teaching. The efficacy of e-assessment has been one of the main issues. Assessment is a broad term that covers several topics, such as styles of evaluation, design of evaluation, evaluation data, and the effect of evaluation. For almost all school systems in Pakistan, remote online delivery and evaluation are novel experiences that present many difficulties, especially the development of valid and reliable e-assessments to evaluate the educational outcomes of young children. The present work describes the effect of e-assessment item design on item performance and student performance in a private school system in Karachi, Pakistan. Five e-assessment test items were developed in compliance with e-assessment item writing guidelines, and five were designed to be imprecise and not in accordance with those guidelines. The difference in item difficulty between the two sets of items was assessed, and the effect of standard (precise) versus non-standard (imprecise) e-assessment items on student performance was compared. Non-standard or imprecise e-assessment items were 0.37 points more difficult than standard or precise items. An MCQ-style item with an ambiguous stem and one containing a double-negative statement emerged as the most affected items on the item difficulty scale, at 0.20 and 0.10, respectively. The mean student performance score was compared in both settings, and a major difference was observed: students scored a mean of 17 on standard or precise e-assessment items versus 9 on imprecise or non-standard items. The study therefore concludes that non-standard or imprecise e-assessment items were more difficult and adversely affected student performance. Consequently, for legitimate and accurate e-assessment at schools, e-assessment item writing guidelines and principles should be followed.</p>
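The item difficulty figures above (0.37, 0.20, 0.10) suggest the classical test theory difficulty index, i.e. the proportion of examinees answering an item correctly. As a minimal sketch of how such an index is computed — with hypothetical response data, not values from the thesis:

```python
# Classical item difficulty index: the fraction of examinees who
# answered the item correctly (closer to 0 = harder, closer to 1 = easier).
def difficulty_index(responses):
    """responses: list of 1 (correct) / 0 (incorrect), one entry per examinee."""
    return sum(responses) / len(responses)

# Hypothetical response patterns for one precise and one imprecise item.
precise_item = [1, 1, 1, 0, 1, 1, 0, 1, 1, 1]    # 8 of 10 correct
imprecise_item = [1, 0, 0, 1, 0, 0, 1, 0, 0, 1]  # 4 of 10 correct

print(difficulty_index(precise_item))                              # 0.8
print(difficulty_index(imprecise_item))                            # 0.4
print(difficulty_index(precise_item) - difficulty_index(imprecise_item))  # 0.4 gap
```

A larger gap between the two indices indicates that imprecise wording, rather than the underlying content, is driving examinees' errors.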
