

By Izah, SC; Odubo, TC; Ajumobi, VE; Osinowo, O (2022).

Greener Journal of Biological Sciences

Vol. 12(1), pp. 11-22, 2022

ISSN: 2276-7762

Copyright ©2022, the copyright of this article is retained by the author(s)

http://gjournals.org/GJBS

Item Analysis of Objective Structured Practical Examination (OSPE) Used as an Assessment Tool for First-Year Microbiology Students.

 

 

Sylvester Chibueze Izah1*, Tamaraukepreye Catherine Odubo1, Victor Emeka Ajumobi1 and Olugbenro Osinowo2

 

 

1Department of Microbiology, Faculty of Science, Bayelsa Medical University, Yenagoa, Bayelsa State, Nigeria.

2Department of Surgery, Faculty of Clinical Sciences, Bayelsa Medical University, Yenagoa, Bayelsa State, Nigeria.

 

 

 

ARTICLE INFO

Article No.: 112621137
Type: Research
Full Text: PDF, HTML, PHP, EPUB
Accepted: 29/11/2021
Published: 31/03/2022

*Corresponding Author: Sylvester Izah
E-mail: chivestizah@gmail.com
Phone: +2347030192466

Keywords: Assessment tool, Internal Consistency, Item analysis, Microbiology items, OSPE.

ABSTRACT

Item analysis examines students' responses to test items in order to judge the quality of an assessment tool. This study assessed the quality of an objective structured practical examination (OSPE) in Introductory Microbiology. Forty-one first-year Microbiology major students of Bayelsa Medical University, Nigeria, took the OSPE, which contained 40 items. As decided by the department, each item carried one mark, no marks were deducted for wrong answers, and the pass mark was 40%. The items were analyzed for difficulty index, discrimination index, distractor efficiency, and reliability (Cronbach’s alpha); the distribution of answer keys and the number and percentage of students who passed were also determined. Based on the discrimination index, 15 (37.5%), 2 (5.0%), and 3 (7.5%) of the items were rated excellent, good, and acceptable, respectively. The difficulty index showed that 4 (10.0%) of the items were ideal, while the remaining 36 (90.0%) were difficult. The 40 items carried 160 distractors, of which 71 (44.4%) were functional and 89 (55.6%) were non-functional. The difficulty index correlated positively and significantly with the discrimination index (r = 1.000) and with distractor efficiency (r = 0.408) at p = 0.01. The overall reliability of the test was 0.754, indicating that it is suitable for classroom assessment, although some items need improvement. Items that were too easy and poorly performing distractors may explain the poor difficulty indices. Item flaws and technical pitfalls should therefore be identified and corrected in subsequent assessments. Based on these findings, it is recommended that OSPE be adopted more widely across science-based departments and that item analysis become standard practice in the departments of every university.
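
For readers unfamiliar with the indices reported above, the sketch below illustrates how they are commonly computed under classical test theory. It is a minimal Python illustration, not the authors' actual analysis: it assumes dichotomously scored (0/1) items, five options per item (hence four distractors each, consistent with the 40 items and 160 distractors reported), a 27% upper/lower group split for the discrimination index, and the usual 5% response threshold for counting a distractor as functional. The study does not state which software or group split was used.

import numpy as np

def item_analysis(scores, upper_fraction=0.27):
    """Difficulty index, discrimination index and Cronbach's alpha for a
    binary score matrix of shape (n_students, n_items), where 1 = correct."""
    scores = np.asarray(scores, dtype=float)
    n_students, n_items = scores.shape
    totals = scores.sum(axis=1)

    # Difficulty index (p): proportion of students answering each item correctly.
    difficulty = scores.mean(axis=0)

    # Discrimination index (D): difference in item pass rate between the top
    # and bottom groups, ranked by total score (27% split assumed here).
    k = max(1, int(round(upper_fraction * n_students)))
    order = np.argsort(totals)
    discrimination = scores[order[-k:]].mean(axis=0) - scores[order[:k]].mean(axis=0)

    # Cronbach's alpha: internal consistency (reliability) of the whole test.
    alpha = (n_items / (n_items - 1)) * (1 - scores.var(axis=0, ddof=1).sum() / totals.var(ddof=1))

    return difficulty, discrimination, alpha

def distractor_efficiency(choices, key, options=("A", "B", "C", "D", "E"), threshold=0.05):
    """Proportion of functional distractors per item; a distractor counts as
    functional if at least `threshold` of examinees selected it. `choices`
    holds the option picked by each student for each item; `key` holds the
    correct option for each item."""
    choices = np.asarray(choices)
    efficiency = []
    for j, correct in enumerate(key):
        distractors = [o for o in options if o != correct]
        functional = sum((choices[:, j] == o).mean() >= threshold for o in distractors)
        efficiency.append(functional / len(distractors))
    return np.array(efficiency)

Applied to a 41 x 40 score matrix such as the one analyzed here, item_analysis returns per-item difficulty and discrimination values together with a single alpha comparable to the 0.754 reported above.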




Cite this Article: Izah, SC; Odubo, TC; Ajumobi, VE; Osinowo, O (2022). Item Analysis of Objective Structured Practical Examination (OSPE) Used as an Assessment Tool for First-Year Microbiology Students. Greener Journal of Biological Sciences, 12(1): 11-22.

 

