Izah, S. C.; Odubo, T. C.; Ajumobi, V. E.; Osinowo, O. (2022). Greener Journal of Educational Research, 12(1), pp. 01-12. ISSN: 2276-7789. Copyright ©2022; the copyright of this article is retained by the author(s).
Attitudes
to and evaluation of Objective Structured Practical Examination by first-year
Microbiology Students of a Tertiary Educational Institution in Nigeria.
Sylvester Chibueze Izah1*; Tamaraukepreye Catherine Odubo1; Victor Emeka Ajumobi1; Olugbenro Osinowo2

1Department of Microbiology, Faculty of Science, Bayelsa Medical University, Yenagoa, Bayelsa State, Nigeria.
2Department of Surgery, Faculty of Clinical Sciences, Bayelsa Medical University, Yenagoa, Bayelsa State, Nigeria.
1.0 INTRODUCTION
Assessment is one of the vital measures of learning in any discipline, and it is essential for determining students' competencies (Relwani et al., 2016). Assessment is carried out in all aspects of the educational sector. Still, in most settings it is done with the sole aim of determining which students, or how many students, passed or failed a particular examination. However, assessment needs to be done holistically to support evaluation and decision-making.
Different types of assessment are used in formative and summative examinations. The common ones include essays or theory questions (short and long), which mainly assess the cognitive domain and students' writing (composition) abilities; multiple-choice questions (MCQ); the Objective Structured Clinical Examination (OSCE); and the Objective Structured Practical Examination (OSPE). The OSCE and OSPE assess all three domains of learning (cognitive, psychomotor, and affective). Of these, essays and MCQs are commonly used in all disciplines. In the health professions, however, OSPE and OSCE are also common, where they are used to accurately assess students' performance and the clinical knowledge and skills needed to make informed decisions (Frantz et al., 2013).
The Objective Structured Clinical Examination, developed in the mid-1970s, was used to assess clinical proficiency in an objective and structured way (Harden and Gleeson, 1979; Frantz et al., 2013). It entails students moving through several stations or benches where they are assessed separately against specific defined standards in the form of a worksheet (Frantz et al., 2013).
According to Harden and Cairncross (1980), Frantz et al. (2013), and Mard and Ghafouri (2020), the OSPE was adapted from the OSCE for use in practical examinations in 1975. The main advantage of OSPE is that it assesses an individual's practical ability in both summative and formative assessments outside the clinical context, with students obtaining feedback as an integral component of their learning process (Frantz et al., 2013). OSPE creates objectivity in assessment, because all students are exposed to the same questions and time interval (standardization), and it covers a wider scope than traditional assessments, which are unstructured and often unreliable because different students are assessed by different examiners, sometimes with additional questions or the same questions approached differently (Frantz et al., 2013). OSPE is used to evaluate individuals' practical skills (Sai et al., 2020) and to identify students' strengths and weaknesses so that interventions can be focused (Faldessai et al., 2014). Feedback from students has shown increased satisfaction with a course, validity, and reliability under OSPE compared with the traditional approach (Rahman et al., 2007; Ananthakrishnan, 1993; Mard and Ghafouri, 2020).
The shortcomings of the traditional practical examination are characterized by subjectivity. To a large extent, these have been addressed by the objectivity of OSPE, which has taken the assessment of practical skills to a different dimension. Like the OSCE, the OSPE is common in medical disciplines, but it can also be applied in related fields, especially Microbiology.
Microbes are found in diverse environments, where they play both beneficial and detrimental roles for living organisms and non-living things. Thus, their study should be approached holistically by incorporating practical exercises into all the sub-disciplines, topics, or courses under the microbiology curriculum. Conventional microbiological processes are time-consuming; hence, using traditional practical examinations may introduce broad subjectivity. OSPE minimizes the challenges associated with microbial processes during the examination and the subjectivity created by conventional assessment techniques. To evaluate the effectiveness of a particular assessment tool, students' attitudes and perceptions towards the pattern and conduct of the examination must be taken into consideration, which is why the present study is essential.
Information on students' attitudes towards OSPE as an assessment tool for first-year microbiology students is scarce in the literature, particularly at Bayelsa Medical University, Nigeria. Due to its numerous advantages, OSPE was adopted in the Department of Microbiology at the Bayelsa Medical University (BMU), Bayelsa State, Nigeria. As a formative end-of-semester assessment, the practical examination was conducted for first-year microbiology students as part of their first-semester practical examination (Introductory Microbiology). At the end of the examination, feedback was obtained from the students using a structured questionnaire to determine their attitudes towards the use of OSPE as an assessment tool for practical examinations. Therefore, this study focuses on the attitudes of first-year Microbiology major students to the use of OSPE for practical examinations. The findings will be helpful for decision-making in the scientific community, especially in the Microbiology discipline.
2.0 MATERIALS AND METHODS
2.1 Design, setting, and sample
This study's qualitative data were obtained from an open-ended questionnaire administered to all forty-one (41) first-year microbiology major students immediately after the first OSPE in the Department of Microbiology at the Bayelsa Medical University in December 2020. The students were briefed on the essence of OSPE by the course lecturers before the examination.
2.2 OSPE Organization
The OSPE was conducted for Introductory Microbiology, a core course for first-year students in the microbiology programme. Twelve (12) stations were set up, consisting of 10 workbenches and two rest stations. The stations were arranged so that students could move quickly from one to the next after completing a practical technique and answering related theoretical questions (Frantz et al., 2013). Students spent 5 minutes at each station. The timing was regulated with the assistance of a laboratory technologist who served as timekeeper. The students were divided into four groups of 12 students each, except the last group, which had five students. Each group was supervised/invigilated by the course lecturers and laboratory technologists.
The exercise was carried out in two batches using three laboratories. Two laboratories were used simultaneously for the first 24 students, while the remaining 17 were sequestered in the third laboratory. At the end of the OSPE for the first batch, the answer scripts were retrieved and those students moved to the third laboratory, while the last set of 17 students moved to the laboratories used by the first 24. The movement was arranged so that the two groups did not meet. At the end of the OSPE, the students were given a structured questionnaire to assess their attitudes and perceptions towards OSPE as an effective evaluation method for all practical examinations. The examiners (course lecturers) reviewed the OSPE instructions before the assessment period.
2.3 Data collection and procedure
The two open-ended questionnaires used for data collection focused on effectiveness, degree of difficulty, adequacy of time, preference for practical examinations, suitability of the format for assessing knowledge, previous knowledge, the opportunity to answer all questions, coverage of a broader scope, level of difficulty, and the emotional stress associated with OSPE. The eleven items in the questionnaires were designed using Likert scales (5 - 0) of different categories. Students' responses were kept anonymous.
2.4 Data analysis
The data were entered into Microsoft Excel files and cross-checked independently by two staff members of the Department. The students' responses were analyzed using SPSS version 20. The data were summarized as means, modes, medians, standard deviations, and simple percentages. Pearson's correlation, analysis of variance, and reliability analysis (Cronbach's alpha) were also carried out to show the statistical significance of the items. The results were summarized in bar charts plotted using SPSS (Figures 1-11).
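The per-item summaries described above (mean, median, mode, standard deviation) can be sketched in plain Python. This is a minimal illustration only: the response matrix below is made up, not the study's actual data, and the study itself used SPSS.

```python
# Sketch of per-item summary statistics for Likert-type responses.
# The response matrix is illustrative, NOT the study's data.
from statistics import mean, median, mode, stdev

# rows = students, columns = questionnaire items (illustrative values)
responses = [
    [4, 5, 3, 5, 4],
    [4, 4, 4, 4, 5],
    [5, 4, 4, 5, 4],
    [3, 4, 5, 4, 4],
]

def item_summary(matrix):
    """Summarise each column (item) of a students-by-items matrix."""
    items = list(zip(*matrix))  # transpose: one tuple per item
    return [
        {
            "mean": round(mean(col), 2),
            "median": median(col),
            "mode": mode(col),
            "sd": round(stdev(col), 2),
        }
        for col in items
    ]

for i, s in enumerate(item_summary(responses), start=1):
    print(f"Q{i}: {s}")
```

Run on the full 41-by-11 response matrix, this yields the quantities reported in Table 1.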
3.0 RESULTS AND DISCUSSION
The attitudes of the first-year Microbiology major students to the OSPE are shown in Table 1 and Figures 1 - 11. Table 1 shows the descriptive statistics of the students' responses on attitudes to the use of OSPE as an assessment tool. Table 2 shows Pearson's correlation matrix of all the items in the students' responses. Across the items tested, the arithmetic means ranged from 3.56±1.23 to 4.22±0.97, medians from 4.00 to 5.00, and modes from 4.00 to 5.00, except for items 6 and 7, which had a low value of 1.00 for both median and mode and arithmetic means of 1.54±0.98 and 1.85±1.24, respectively.
Table 1: Descriptive statistics of the students' responses on attitudes to the use of OSPE as an assessment tool

| Statistics | Q1 | Q2 | Q3 | Q4 | Q5 | Q6 | Q7 | Q8 | Q9 | Q10 | Q11 |
|---|---|---|---|---|---|---|---|---|---|---|---|
| N | 41 | 41 | 41 | 41 | 41 | 41 | 41 | 41 | 41 | 41 | 41 |
| Mean | 3.88 | 4.20 | 3.56 | 4.05 | 4.00 | 1.54 | 1.85 | 3.83 | 4.22 | 3.98 | 4.10 |
| Median | 4.00 | 4.00 | 4.00 | 5.00 | 4.00 | 1.00 | 1.00 | 4.00 | 4.00 | 4.00 | 4.00 |
| Mode | 4.00 | 4.00 | 5.00 | 5.00 | 5.00 | 1.00 | 1.00 | 4.00 | 5.00 | 5.00 | 5.00 |
| Std. Deviation | 1.03 | 0.64 | 1.23 | 1.16 | 0.97 | 0.98 | 1.24 | 1.14 | 0.96 | 0.99 | 1.07 |
Table 2: Pearson's correlation matrix of all the items in the students' responses on the effectiveness of OSPE as an assessment tool for first-year microbiology students

| Questions | Q1 | Q2 | Q3 | Q4 | Q5 | Q6 | Q7 | Q8 | Q9 | Q10 | Q11 |
|---|---|---|---|---|---|---|---|---|---|---|---|
| Q1 | 1.000 | | | | | | | | | | |
| Q2 | -0.115 | 1.000 | | | | | | | | | |
| Q3 | 0.392* | -0.016 | 1.000 | | | | | | | | |
| Q4 | 0.612** | -0.147 | 0.314* | 1.000 | | | | | | | |
| Q5 | 0.199 | 0.040 | 0.000 | 0.464** | 1.000 | | | | | | |
| Q6 | -0.033 | 0.068 | 0.222 | 0.020 | -0.052 | 1.000 | | | | | |
| Q7 | 0.182 | -0.121 | 0.287 | 0.075 | 0.083 | 0.688** | 1.000 | | | | |
| Q8 | 0.366* | 0.013 | 0.232 | 0.347* | 0.406** | -0.005 | 0.195 | 1.000 | | | |
| Q9 | 0.583** | 0.010 | 0.444** | 0.662** | 0.373* | 0.058 | -0.014 | 0.400** | 1.000 | | |
| Q10 | 0.218 | 0.245 | 0.156 | 0.437** | 0.546** | 0.040 | -0.044 | 0.330* | 0.480** | 1.000 | |
| Q11 | 0.170 | 0.264 | 0.225 | 0.117 | 0.000 | 0.164 | 0.144 | 0.302 | 0.295 | 0.239 | 1.000 |

Source: Authors
*. Correlation is significant at the 0.05 level (2-tailed).
**. Correlation is significant at the 0.01 level (2-tailed).
N = 41
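A coefficient matrix of this kind can be sketched in plain Python. The significance stars use the approximate two-tailed critical values of r for N = 41 (about 0.308 at p < 0.05 and 0.398 at p < 0.01), which is consistent with the starring in Table 2 (e.g. 0.392 earns one star while 0.400 earns two); the item scores below are illustrative, not the study's data.

```python
# Pearson r with the starring convention used in Table 2 (N = 41).
import math

def pearson_r(x, y):
    """Pearson product-moment correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

def star(r, crit05=0.308, crit01=0.398):
    """Flag a coefficient: ** for p<0.01, * for p<0.05 (approx., N=41)."""
    return "**" if abs(r) >= crit01 else "*" if abs(r) >= crit05 else ""

# Illustrative item scores (not the study's data)
q1 = [4, 5, 3, 4, 5, 4, 2, 5, 4, 3]
q4 = [5, 5, 2, 4, 5, 4, 3, 5, 4, 3]
r = pearson_r(q1, q4)
print(f"Q4-Q1: r={r:.3f}{star(r)}")
```

Applying `pearson_r` to every pair of item columns and keeping the lower triangle reproduces the layout of Table 2.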
The responses on the effectiveness of OSPE as an assessment tool indicate that 29.27%, 41.46%, 21.95%, 2.44%, and 4.88% of the first-year Microbiology major students rated it an excellent, good, average, poor, and very poor tool, respectively, for assessing practicals in the microbiology programme (Figure 1). Approximately 92.68% of the students therefore indicated that OSPE is effective in evaluating their knowledge. The rating of OSPE as an effective tool for assessing microbiology practicals correlates positively with time adequacy (r=0.392) and the opportunity for all students to answer the same questions (r=0.366) at p<0.05, and with preference for the Department to adopt OSPE for all practical examinations (r=0.612) and its performance compared with the traditional practical examination (r=0.583) at p<0.01 (Table 2).
Figure 1: Responses on the effectiveness of
OSPE as an assessment tool by first-year microbiology students (Source:
Authors)
On the degree of difficulty, 31.71%, 56.10%, 12.20%, 0.00%, and 0.00% of respondents opined that the OSPE was not difficult, fairly difficult, moderately difficult, very difficult, and extremely difficult, respectively (Figure 2). Overall, 31.71% indicated that it was not difficult, suggesting that they may score high if OSPE is used as an assessment tool, while about 68.30% found it fairly or moderately difficult, meaning they may score average to good. Since no student rated the OSPE very or extremely difficult, none of the students appeared to expect to fail it.
Figure 2: Responses on the degree of
difficulty of OSPE as an assessment tool by first-year microbiology students
(Source: Authors)
On time adequacy, 29.27%, 24.39%, 24.39%, 17.07%, and 4.88% of responses indicated that the time given was very adequate, adequate, moderately adequate, average, and grossly inadequate, respectively (Figure 3). Fewer than 5% of the students indicated that the time was unacceptable; thus, about 95% agreed that there was sufficient time for the OSPE. Time adequacy correlates with performance compared with the traditional practical examination (r=0.314) at p<0.05, and with the notion that OSPE is easier to pass and score high in than the conventional practical examination (r=0.444) at p<0.01 (Table 2).
Figure 3: Responses on the adequacy of time
in an OSPE by first-year microbiology students (Source: Authors).
On the preference for the Department to adopt OSPE for all practical examinations, 51.22%, 17.07%, 19.51%, 9.76%, and 2.44% strongly agree, agree, moderately agree, fairly disagree, and strongly disagree, respectively (Figure 4). Thus, 87.80% opined that they prefer OSPE for all practical examinations in the Department, albeit with varying degrees of acceptance; over 50% of the students accepted OSPE firmly. The choice of OSPE for all practical examinations in the Department correlates with OSPE being a better format for evaluating students' knowledge (r=0.464), being easier to pass and score high in (r=0.662), and covering a wider scope of knowledge (r=0.437) at p<0.01, and with the opportunity for all students to answer the same questions (r=0.347) at p<0.05 (Table 2).
Figure 4: Responses on the preference of OSPE
for all practical examinations in the Department of Microbiology (Source:
Authors)
On the suitability of OSPE for assessing knowledge, 39.02%, 29.27%, 24.39%, 7.32%, and 0.00% strongly agree, agree, moderately agree, fairly disagree, and strongly disagree, respectively (Figure 5). Thus, 92.68% opined that OSPE is suitable for their practical examination, with varying degrees of acceptance; over 39% of the students accepted OSPE firmly. OSPE as a better format for assessing students' knowledge correlates with the opportunity for all students to answer the same questions (r=0.406) and coverage of a wider scope (r=0.546) at p<0.01, and with being easier to pass and score better in than the traditional practical examination (r=0.373) at p<0.05 (Table 2).
Figure 5: Responses on the suitability of
OSPE for assessing first-year microbiology students in a practical examination
(Source: Authors)
On students' previous experience of OSPE, 68.29%, 19.51%, 4.88%, 4.88%, and 2.44% responded that they had never, rarely, occasionally, frequently, and very frequently taken part in an OSPE, respectively. This suggests that approximately 88% of the students had not used OSPE before the examination (Figure 6). Students' previous experience of OSPE correlates with their knowledge about it (r=0.688) at p<0.01 (Table 2).
On knowledge of OSPE as an assessment tool, 58.54%, 17.07%, 9.76%, 9.76%, and 4.88% responded never, rarely, occasionally, frequently, and very often, respectively. This suggests that approximately 75.61% of the students had not heard about OSPE before (Figure 7).
Figure 6: Responses on the use of OSPE before
use in the assessment of introductory microbiology course for first-year
microbiology students (Source: Authors)
Figure 7: Responses on the knowledge of OSPE
as an assessment tool by first-year microbiology students (Source: Authors)
On the view that OSPE allows all students to answer the same questions, 31.71%, 39.02%, 14.63%, 9.76%, and 4.88% strongly agree, agree, moderately agree, fairly disagree, and strongly disagree, respectively (Figure 8). The view that OSPE gives all students an equal opportunity to answer the same questions correlates with ease of scoring better compared with the traditional practical examination (r=0.400; p<0.01) and with coverage of a wider range of knowledge (r=0.330; p<0.05) (Table 2).
Figure 8: Responses on the merit (all
students answering the same question) of OSPE as an assessment tool by
first-year microbiology students (Source: Authors)
On the ease of passing and scoring better, 48.78%, 31.71%, 14.63%, 2.44%, and 2.44% strongly agree, agree, moderately agree, fairly disagree, and strongly disagree, respectively (Figure 9). The ease of passing and scoring a high grade in OSPE compared with other forms of practical examination showed a strong significant relationship with the wider range of knowledge covered compared with the traditional practical examination (r=0.480; p<0.01) (Table 2).
Figure 9: Responses on the easiness to score
better and pass compared to the traditional methods by first-year microbiology
students (Source: Authors)
On the view that OSPE covers a wider range of knowledge than the traditional practical examination, 36.59%, 34.15%, 19.51%, 9.76%, and 0.00% strongly agree, agree, moderately agree, fairly disagree, and strongly disagree, respectively (Figure 10).
On the degree of emotional stress (the feeling of psychological strain and uneasiness that one may fail the examination) experienced during the OSPE compared with the traditional practical examination, 46.34%, 29.27%, 14.63%, 7.32%, and 2.44% of first-year microbiology students strongly agree, agree, moderately agree, fairly disagree, and strongly disagree, respectively (Figure 11).
Figure 10: Responses on the merit (broader
scope) of OSPE as an assessment tool by first-year microbiology students
(Source: Authors)
Figure 11: Responses on the degree of
emotional stress during OSPE practical examination compared to the traditional
practical examination by first-year microbiology students (Source: Authors)
The findings of this study show some similarities with previous work. For instance, Jena et al. (2015) reported, among undergraduate pathology students in a tertiary institution in India, that the time given for OSPE was sufficient (78%), that students score higher in OSPE than in the conventional examination (66%), that OSPE should be included in students' assessment (66%), and that students were apprised of the OSPE pattern (83%). Sai et al. (2020) reported, among first-year allied health sciences students in basic medical sciences, that OSPE is relevant (88%), that the time provided was adequate (90%), that it is easier to pass than the conventional practical examination (70%), that it should be adopted as part of the Department's assessment methods (90%), that it provides a better tendency to score high grades (87%), that it is less stressful (84%), and that it improves students' practical skills (91%). Relwani et al. (2016) reported that a large number of MB,BS students in the Department of Community Medicine of MGM Medical College, India, agreed that OSPE is fairer than the conventional practical examination (95.30%), covers a wider range of knowledge (100%), is easier to pass and score better in (82.20%), should be adopted in the Department (95.70%), and reduces emotional stress (85.60%). Faldessai et al. (2014) reported, among undergraduate students in Biochemistry, that OSPE is an effective tool for examination and learning (90.60%), stressful (22.60%), less valuable than the traditional practical examination (12.00%), and inadequate in time (37.30%). Mard and Ghafouri (2020) compared the traditional practical examination and OSPE among medical students in experimental physiology and reported that OSPE is preferable in question relevance (75.00%), time adequacy (64.00%), fairness (61.00%), ease of passing (57.00%), being a better method of assessment (59.00%), the chance to score a high grade (48.00%), and being less stressful (51.00%). The similarity in attitudes suggests the objectivity and reliability of OSPE as an effective assessment tool.
The significant positive correlations among most of the items suggest that the items influence one another. Since the effectiveness of OSPE as an assessment tool is influenced by the time given, there is a need for the Department to adopt it as an assessment tool because it allows all students to answer the same questions (thus eliminating variability and standardizing the examination) and makes it easier to score high grades (thus reducing the failure rate). The preference for the Department to adopt it suggests that it will be a better way of assessing the students' knowledge, enabling them to answer all questions and score very high. Again, the opportunity for all students to answer all questions also increased the scope of knowledge coverage, making it easier for the students to pass and score very high grades.
The reliability (Cronbach's alpha) of the students' responses to the items on the effectiveness of OSPE as an assessment tool for the first-year microbiology programme was 0.754, which indicates that the instrument is suitable for assessment and decision-making, although a few items could be improved. The Cronbach's alpha supports the validity of the students' responses towards OSPE as an assessment tool and reinforces the students' indication that OSPE is a suitable method of assessing a practical examination (Natu and Singh, 1994; Shenoy et al., 2017). The Cronbach's alpha of the students' responses (questionnaire) is similar to the value of 0.790 reported for a Pharmacology assessment using OSPE for second-year MB,BS students (Jain et al., 2021) and the 0.800 reported for carbohydrate reactions (biochemistry) assessed using OSPE for first-year MB,BS students (Krishna et al., 2011).
The Cronbach's alpha recorded is in accordance with values recorded for OSPE examinations in different parts of the world. Hosseini et al. (2013) reported a Cronbach's alpha of 0.907 in assessing Biochemistry laboratory skills, and Shenoy et al. (2017) reported Cronbach's alpha values of 0.724 - 0.845 in pharmacology for second-year MB,BS students. Cronbach's alpha values of ≥ 0.724 have been reported as indicating high internal consistency in assessment (Shenoy et al., 2017).
Deleting any of the items would change the Cronbach's alpha value. A range of 0.70 - 0.80 is suitable for classroom assessment, as specified by Patel (2017). Apart from items 2, 6, and 7, deletion of any item would lower the Cronbach's alpha, indicating that items 1, 3, 4, 5, 8, 9, 10, and 11 play a significant role in the reliability of the effectiveness of OSPE as an assessment tool among first-year Microbiology students (Table 3). Thus, the internal consistency is good and shows a high correlation among the items. It also suggests that students' responses are likely to reflect their performance. Based on the internal-consistency measure for the 41 first-year Microbiology major students, removing the few problematic items would increase the overall reliability of the responses. Overall, there was a significant difference (p < 0.001) between the items to which the students responded (Table 4), which further shows variability in perceptions of the items on the evaluation score sheet.
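Cronbach's alpha and the "alpha if item deleted" check discussed above can be computed directly from the variances of the item scores and of the total scores. The following is a plain-Python sketch; the score matrix is illustrative, not the study's data (the study used SPSS).

```python
# Cronbach's alpha and alpha-if-item-deleted (cf. Table 3).
# The score matrix is illustrative, NOT the study's data.
def variance(xs):
    """Sample variance (n - 1 denominator)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(matrix):
    """matrix: list of rows (students), each a list of item scores."""
    k = len(matrix[0])                           # number of items
    item_vars = [variance(col) for col in zip(*matrix)]
    total_var = variance([sum(row) for row in matrix])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

def alpha_if_deleted(matrix):
    """Recompute alpha with each item removed in turn."""
    k = len(matrix[0])
    return [
        cronbach_alpha([[v for j, v in enumerate(row) if j != i] for row in matrix])
        for i in range(k)
    ]

scores = [  # illustrative 5-point Likert responses
    [4, 5, 4, 5],
    [4, 4, 3, 4],
    [5, 5, 4, 5],
    [3, 4, 2, 4],
    [4, 4, 4, 5],
]
print(round(cronbach_alpha(scores), 3))
print([round(a, 3) for a in alpha_if_deleted(scores)])
```

Items whose removal raises the recomputed alpha above the overall value are the "problematic" items; in the study's data these were items 2, 6, and 7.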
Table 3: Reliability analysis (Cronbach's alpha) of the students' responses if an item is deleted, on the effectiveness of OSPE as an assessment tool for the first-year microbiology programme (Source: Authors)

| Questions | Scale Mean if Item Deleted | Scale Variance if Item Deleted | Corrected Item-Total Correlation | Squared Multiple Correlation | Cronbach's Alpha if Item Deleted |
|---|---|---|---|---|---|
| Q1 | 35.3171 | 31.472 | 0.513 | 0.549 | 0.722 |
| Q2 | 35.0000 | 37.800 | 0.032 | 0.229 | 0.768 |
| Q3 | 35.6341 | 31.038 | 0.434 | 0.341 | 0.732 |
| Q4 | 35.1463 | 29.978 | 0.561 | 0.603 | 0.713 |
| Q5 | 35.1951 | 33.261 | 0.378 | 0.510 | 0.739 |
| Q6 | 37.6585 | 34.830 | 0.232 | 0.605 | 0.756 |
| Q7 | 37.3415 | 32.930 | 0.282 | 0.662 | 0.755 |
| Q8 | 35.3659 | 30.938 | 0.492 | 0.354 | 0.723 |
| Q9 | 34.9756 | 30.674 | 0.644 | 0.649 | 0.707 |
| Q10 | 35.2195 | 32.176 | 0.474 | 0.473 | 0.727 |
| Q11 | 35.0976 | 33.140 | 0.340 | 0.271 | 0.744 |
Table 4: Analysis of variance of items and students' responses

| Source | Sum of Squares | df | Mean Square | F | Sig. |
|---|---|---|---|---|---|
| Between People | 139.858 | 40 | 3.496 | | |
| Within People: Between Items | 365.388 | 10 | 36.539 | 42.524 | .000 |
| Within People: Residual | 343.703 | 400 | 0.859 | | |
| Within People: Total | 709.091 | 410 | 1.729 | | |
| Total | 848.949 | 450 | 1.887 | | |

Grand mean = 3.5632
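The items-by-subjects ANOVA in Table 4 partitions the total variation into between-people, between-items, and residual components. The following is a plain-Python sketch of that decomposition for a complete, balanced score matrix; the data shown are illustrative, not the study's.

```python
# Items-by-subjects ANOVA decomposition (cf. Table 4).
# The score matrix is illustrative, NOT the study's data.
def anova_items_by_subjects(matrix):
    """matrix: rows = students, columns = items (complete, balanced data)."""
    n, k = len(matrix), len(matrix[0])
    grand = sum(sum(row) for row in matrix) / (n * k)
    ss_total = sum((x - grand) ** 2 for row in matrix for x in row)
    ss_people = k * sum((sum(row) / k - grand) ** 2 for row in matrix)
    ss_items = n * sum((sum(col) / n - grand) ** 2 for col in zip(*matrix))
    ss_resid = ss_total - ss_people - ss_items
    # F for the between-items effect, tested against the residual
    f_items = (ss_items / (k - 1)) / (ss_resid / ((n - 1) * (k - 1)))
    return {"SS_people": ss_people, "SS_items": ss_items,
            "SS_resid": ss_resid, "F": f_items, "grand_mean": grand}

result = anova_items_by_subjects([[1, 2], [2, 3], [1, 3]])
print(result)
```

With the study's 41-by-11 response matrix, this decomposition yields the sums of squares, F value, and grand mean reported in Table 4.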
4.0 CONCLUSIONS
This study examined the attitudes of first-year microbiology major students to the use of OSPE for their practical examination. The findings show that OSPE is an effective tool for assessing the students' knowledge, giving broader coverage of knowledge, the opportunity to answer all questions, and an easier path to high grades. The study suggests its adoption in the Department of Microbiology as the tool for assessing all practical examinations. The study also revealed that OSPE was better accepted by the students than traditional examination patterns, and it highlights the importance of the students' role in developing new assessment tools. No assessment tool is perfect, but this marks a considerable step towards more objective and reliable examinations. Based on these findings, the Department of Microbiology should adopt this assessment tool for all practical examinations. In addition, other disciplines, particularly non-clinical fields, should consider adopting OSPE as a reliable, objective, and standardized tool for assessing practical skills. The Department also needs to repeat this exercise over the next four years and compare the results.
Acknowledgment
The authors would like to thank the following Laboratory Technologists of Bayelsa Medical University who participated in the invigilation of the OSPE: Ms. Timipre Grace Tuaboboh, Ms. Biembele Virtuous Temple, Ms. Sebhaziba Benjamin Ezem, Ms. Blessing Muji Olagoke, Ms. Ann Tugwell Ototo, Mrs Christy Koroye, Mr Henry Ebiowei Alpha, and Mr Samuel Philemon Bokene.
Conflicts of interest/Competing interests
The authors declare no competing interests.
Ethical considerations
Ethical approval was obtained from the Research and Ethics Committee of the Bayelsa Medical University, Nigeria, with ethical approval number REC/2021/0008.
REFERENCES
1. Ananthakrishnan, N. (1993). Objective structured clinical/practical examination (OSCE/OSPE). Journal of Postgraduate Medicine 39(2), 82-84.
2. Faldessai, N., Dharwadkar, A., and Mohanty, S. (2014). Objective Structured Practical Examination: A tool to gauge perception and performance of students in Biochemistry. Asian Journal of Multidisciplinary Studies 2(8), 32-38.
3. Frantz, J. M., Rowe, M., Hess, D. A., Rhoda, A. J., Sauls, B. L., and Wegner, L. (2013). Student and staff perceptions and experiences of the introduction of Objective Structured Practical Examinations: A pilot study. African Journal of Health Professions Education 5(2), 72-74.
4. Harden, R. M., and Cairncross, R. G. (1980). Assessment of practical skills: The objective structured practical examination (OSPE). Studies in Higher Education 5(2), 187-196.
5. Harden, R. M., and Gleeson, F. A. (1979). Assessment of clinical competence using an Objective Structured Clinical Examination (OSCE). ASME Medical Education Booklet 13(1), 39-54.
6. Jain, S., Kalra, R., Mani, P., and Goswami, P. S. (2021). Introduction and evaluation of Objective Structured Practical Examination as an assessment tool in Pharmacology for second year medical students. Journal of Clinical and Diagnostic Research 15(3), FC01-FC04.
7. Jena, M., Bhat, V., and Indrani, K. (2015). Relevance of objective structured practical examination reinforced with multiple choice questions as a tool of learning and retention of knowledge in pathology among the undergraduate students. Journal of Medical Sciences and Health 1(2), 30-33.
8. Krishna, M. N., Ashakiran, S., Deena, M., Mamatha, K., Shyamali, C., Ganesh, G., and Nandini, T. (2011). OSPE as a learning and evaluation tool for biochemistry: first experience. Journal of Clinical and Biomedical Sciences 1(2), 64-69.
9. Mard, S. A., and Ghafouri, S. (2020). Objective Structured Practical Examination in experimental physiology increased satisfaction of medical students. Advances in Medical Education and Practice 11, 651-659.
10. Natu, M. V., and Singh, T. (1994). Medical education objective structured practical examination (OSPE) in pharmacology - Students' point of view. Indian Journal of Pharmacology 26, 188-189.
11. Patel, R. M. (2017). Use of item analysis to improve quality of multiple-choice questions in II MBBS. Journal of Education Technology in Health Sciences 4(1), 22-29.
12. Rahman, N., Ferdousi, S., Hoq, N., Amin, R., and Kabir, J. (2007). Evaluation of objective structured practical examination and traditional practical examination. Mymensingh Medical Journal 16(1), 7-11.
13. Relwani, N. R., Wadke, R. A., Anjenaya, S., and Sawardekar, P. N. (2016). Effectiveness of objective structured practical examination as a formative assessment tool as compared to traditional method for M.B.B.S students. International Journal of Community Medicine and Public Health 3(12), 3526-3532.
14. Hosseini, S., Hosseini, F., Vartanoosian, J., and Farzinfard, F. (2013). Validity and reliability of OSCE/OSPE in assessing Biochemistry laboratory skills of freshman Nursing students of Shahid Beheshti University of Medical Sciences. ICERI2013 Proceedings, pp. 6062-6069.
15. Sai, S. K. G., Padmanabha, B. V., and Mukkadan, J. K. (2020). Objective structured practical examination: Perceptions of the 1st year allied health sciences students in basic medical sciences. National Journal of Physiology, Pharmacy and Pharmacology 10(7), 530-532.
16. Shenoy, P. J., Kamath, P., Sayeli, V., and Pai, S. (2017). Standardization and validation of objective structured practical examination in pharmacology: Our experience and lessons learned. Indian Journal of Pharmacology 49(4), 270-274.