UEWScholar Repository

Using reliability and item analysis to evaluate a teacher-developed test in educational measurement and evaluation


dc.contributor.author Quaigrain K.
dc.contributor.author Arhin A.K.
dc.date.accessioned 2022-10-31T15:05:40Z
dc.date.available 2022-10-31T15:05:40Z
dc.date.issued 2017
dc.identifier.issn 2331186X
dc.identifier.other 10.1080/2331186X.2017.1301013
dc.identifier.uri http://41.74.91.244:8080/handle/123456789/506
dc.description Quaigrain, K., Department of Educational Studies, University of Education Winneba, Kasoa, Ghana; Arhin, A.K., Department of Interdisciplinary Studies, University of Education Winneba - Kumasi Campus, College of Technology Education, P.O. Box 1277, Kumasi, Ghana en_US
dc.description.abstract Item analysis is essential for improving items that will be used again in later tests; it can also be used to eliminate misleading items in a test. The study focused on item and test quality and explored the relationship of the difficulty index (p-value) and discrimination index (DI) with distractor efficiency (DE). The study was conducted among 247 first-year students pursuing a Diploma in Education at Cape Coast Polytechnic. Fifty multiple-choice questions were administered as an end-of-semester examination in an Educational Measurement course. Internal consistency reliability of the test was 0.77 using the Kuder–Richardson 20 coefficient (KR-20). The mean score was 29.23 with a standard deviation of 6.36. The mean difficulty index (p-value) and DI were 58.46% (SD 21.23%) and 0.22 (SD 0.17), respectively. DI was highest for items with p-values between 40 and 60%. Mean DE was 55.04% (SD 24.09%). Items with average difficulty and high discriminating power with functional distractors should be integrated into future tests to improve the quality of the assessment. Using DI, it was observed that 30 (60%) of the test items fell into the reasonably good or acceptable value ranges. © 2017 The Author(s). This open access article is distributed under a Creative Commons Attribution (CC-BY) 4.0 license. en_US
dc.publisher Taylor and Francis Ltd. en_US
dc.subject difficulty index en_US
dc.subject discrimination index en_US
dc.subject distractor efficiency en_US
dc.subject item analysis en_US
dc.subject non-functional distractor (NFD) en_US
dc.title Using reliability and item analysis to evaluate a teacher-developed test in educational measurement and evaluation en_US
dc.type Article en_US
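
The abstract reports a Kuder–Richardson 20 (KR-20) reliability of 0.77 together with per-item difficulty (p-value) and discrimination (DI) statistics. The sketch below shows, under stated assumptions, how these classical indices are commonly computed from a binary-scored response matrix; it is not the authors' code, the variable names and simulated data are illustrative only, and distractor efficiency (DE) is omitted because it requires option-level responses rather than right/wrong scores.

```python
import numpy as np

# Minimal sketch (not the authors' code): classical item analysis on a
# binary-scored response matrix X of shape (n_examinees, n_items),
# where X[i, j] = 1 if examinee i answered item j correctly, else 0.

def difficulty_index(X):
    """Difficulty index: proportion of examinees answering each item correctly."""
    return X.mean(axis=0)

def discrimination_index(X, group_fraction=0.27):
    """DI = p(upper group) - p(lower group), using the top/bottom 27%
    of examinees ranked by total score (a common convention)."""
    totals = X.sum(axis=1)
    order = np.argsort(totals)
    n_group = max(1, int(round(group_fraction * X.shape[0])))
    lower = X[order[:n_group]]
    upper = X[order[-n_group:]]
    return upper.mean(axis=0) - lower.mean(axis=0)

def kr20(X):
    """Kuder-Richardson 20 internal-consistency reliability."""
    k = X.shape[1]
    p = X.mean(axis=0)
    q = 1.0 - p
    total_var = X.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - np.sum(p * q) / total_var)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Simulated responses only; the article analysed 247 examinees and 50 items.
    X = (rng.random((247, 50)) < rng.uniform(0.3, 0.9, size=50)).astype(int)
    print("mean p:", difficulty_index(X).mean())
    print("mean DI:", discrimination_index(X).mean())
    print("KR-20:", kr20(X))
```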

