Item analysis of multiple-choice questions of a genetics term exam in an Egyptian veterinary college for a viable question bank

Document Type: Research paper

Authors

1 Mansoura University, Egypt

2 Department of Biological Sciences, Purdue University Fort Wayne, IN 46805, USA

Abstract

Multiple-choice questions (MCQs) are useful for assessing student performance because they objectively cover a wide range of topics. Their reliability and validity, however, depend on how well the items are constructed, so defective items detected by item analysis should be reviewed for item-writing flaws and revised. This study used item analysis to evaluate test items for difficulty level, discriminating power, and distractor functionality. A total of 623 students sat a summative examination in Physiology comprising 60 single-response MCQs and 20 true/false (T/F) questions, administered as part of a 2-hour paper worth 25 marks. Items were categorized by difficulty index (DIFI), discrimination index (DI), and distractor efficiency (DE). Among the 60 MCQs, 32 had no non-functioning distractors (NFDs), 19 had one, 8 had two, and 1 had three. DIFI and DI values were within the acceptable range; among the T/F items, however, 15% (n=3) showed fair, 10% (n=2) poor, and 5% (n=1) negative discrimination. Regarding NFDs, 53.3% (n=32) of the MCQs had no NFDs (DE of 100%), 31.7% (n=19) had a DE of 66.6%, 13.3% (n=8) had a DE of 33.3%, and 1.7% (n=1) had a DE of 0%. Item analysis is therefore a valuable tool for identifying poorly constructed test items, and optimizing such items is an essential step in producing a high-quality question bank.
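The three item-analysis statistics named in the abstract can be sketched with their conventional textbook formulas; the study's exact computation is not given here, so the functions, the 27% upper/lower split, and the 5% non-functioning-distractor cut-off below are assumptions, not the authors' code.

```python
# Minimal sketch of standard item-analysis statistics (assumed formulas,
# not the study's own implementation).

def difficulty_index(scores):
    """DIFI: fraction of examinees answering the item correctly.
    scores: list of 0/1 per examinee for one item."""
    return sum(scores) / len(scores)

def discrimination_index(scores, totals, frac=0.27):
    """DI: proportion correct in the upper group minus the lower group,
    using the conventional upper/lower 27% split on total exam score."""
    order = sorted(range(len(scores)), key=lambda i: totals[i])
    k = max(1, int(frac * len(scores)))
    lower = [scores[i] for i in order[:k]]    # weakest examinees
    upper = [scores[i] for i in order[-k:]]   # strongest examinees
    return sum(upper) / k - sum(lower) / k

def distractor_efficiency(choice_counts, key, threshold=0.05):
    """DE: share of distractors chosen by >= 5% of examinees.
    A distractor below the cut-off is non-functioning (NFD)."""
    n = sum(choice_counts.values())
    distractors = [c for c in choice_counts if c != key]
    functioning = [c for c in distractors if choice_counts[c] / n >= threshold]
    return len(functioning) / len(distractors)

# Hypothetical 4-option item keyed "A", answered by 10 examinees:
counts = {"A": 6, "B": 2, "C": 1, "D": 1}
print(distractor_efficiency(counts, "A"))  # all 3 distractors >= 5% -> 1.0
```

With these definitions, an item with DE of 66.6% has one NFD out of three distractors, matching the categories reported in the abstract.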
