Education Research International
Volume 2016, Article ID 4806398, 4 pages
http://dx.doi.org/10.1155/2016/4806398
Research Article

An Audit of the Medical Students’ Perceptions regarding Objective Structured Clinical Examination

1Medical Unit A, KTH Peshawar, Peshawar 25000, Pakistan
2Medical Unit A, LRH Peshawar, Peshawar, Pakistan

Received 26 April 2016; Revised 18 June 2016; Accepted 29 June 2016

Academic Editor: Eddie Denessen

Copyright © 2016 Abidullah Khan et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

Objective. To record the perceptions of the final year MBBS students of Khyber Medical College (KMC) Peshawar regarding the Objective Structured Clinical Examination (OSCE) conducted in 2016. Materials and Methods. This study, conducted in April 2016, is a reaudit of a similar survey we performed in 2015. A total of 250 final year MBBS students participated by filling in a validated and pretested questionnaire, previously used by Russel et al. and Khan et al. in similar but separate studies, containing questions on exam content, quality of performance testing, OSCE validity and reliability, and so forth. The data were analyzed using SPSS version 20. Results. The study group comprised 160 (64%) males and 90 (36%) females. Two hundred and twenty students (88%) stated that the exam was fair and comprehensive; 94% believed the OSCE was more stressful and mentally tougher. 96% of the students considered the OSCE valid and reliable, and 87% were happy with its use in clinical competence assessment. Conclusion. The majority of students declared their final year OSCE a fair, comprehensive, standardized, less biased, and reliable examination format but believed it was more stressful and mentally tougher than traditional examination methods.

1. Background

The objective structured clinical examination (OSCE) was introduced by Harden and colleagues in 1975. Since its origin in the 1970s, it has received worldwide acceptance and appreciation as a fair and standardized format for assessing the clinical competences of medical students and residents [1].

Medical educationists have long been trying to devise a valid and reliable assessment method in medicine and surgery. After long-standing efforts, the OSCE became the cornerstone of medical assessment throughout the world. The OSCE is an approach to student assessment in which different aspects of clinical competence are evaluated in a comprehensive, consistent, and structured manner with close attention to the objectivity of the process [2, 3]. To refine this system of clinical examination, it is vital to understand how students taking the OSCE feel and think about it.

The history of medical education in Pakistan shows that long and short cases, essay writing, multiple choice questions (MCQs), instrument- and specimen-based oral interviews, and so forth have been the most popular forms of clinical competence assessment for decades, despite their questionable validity and reliability. The body awarding postgraduate medical degrees in Pakistan, the College of Physicians and Surgeons of Pakistan (CPSP), introduced the OSCE as a method of clinical assessment in the 1990s, and it was later adopted by the Pakistan Medical and Dental Council (PMDC) at the undergraduate level as well. Khyber Medical University took a step forward by substituting the traditional viva examination with the OSCE in 2010 in the province of Khyber Pakhtunkhwa (KPK), Pakistan. Under this initiative, all medical and dental schools in KPK embraced the OSCE as part of the final exam for assessing students' clinical competencies [4].

To gauge the quality of the OSCE and improve it further, it must be monitored continuously so that any shortcomings can be identified and corrected. Recording examinees' reflective thinking, along with both their positive and negative criticism, is one of several ways to evaluate this examination format. Students' perceptions of the OSCE help identify areas of strength and weakness, and their feedback drives reforms such as redesigning the curriculum and learning objectives, training the faculty in the conduct of the OSCE, involving more external examiners, and establishing a skills lab to improve this assessment tool [5]. We conducted this student survey in April 2016 as a repetition of our April 2015 survey, to evaluate the current OSCE system of KMC Peshawar and address any shortcomings in light of the perceptions recorded on a structured questionnaire by final professional MBBS students.

2. Materials and Methods

This cross-sectional observational study included 250 final year MBBS students of Khyber Medical College (KMC) Peshawar who took part in the annual clinical evaluation in the subject of General Medicine, conducted by Khyber Medical University (KMU) in the department of medicine of the Khyber Teaching Hospital (KTH) Peshawar in April 2016. The study was approved by the Ethics Committee of KMC/KTH, and informed written consent was obtained from every participant. Data were collected on a structured questionnaire used by Russel et al. and Khan et al. in similar but separate studies in the past. The questionnaire had closed-ended questions on aspects of the OSCE such as syllabus coverage, fairness, the stress factor, the impact of gender, ethnicity, and personality on individual and overall results, OSCE administration, quality of performance testing, validity and reliability, students' rating of different assessment formats, and recommendations for future use. The last section of the questionnaire invited open comments from the candidates about the OSCE. The questionnaire was pilot-tested by a group of ten house officers who had recently passed their exam at KMC, both to check its quality and content and to rectify any errors in light of their suggestions.

The exam lasted one week. All 250 participating students were divided into seven groups of roughly 36 students, with one group examined each day. For convenience, each group of 36 was split into two subgroups of roughly 18 students, examined in two parallel OSCE circuits. Each circuit comprised 18 stations: 2 rest stations, 4 short cases, 2 interactive stations for counseling and communication skills, 2 data interpretation stations, and 2 stations each for clinical scenario, CT scan, chest X-ray, and ECG interpretation. On the day of the OSCE, an orientation class was arranged for each group of candidates in the morning.
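The scheduling and station arithmetic above can be checked with a minimal sketch (our illustration only; the grouping logic and station labels are paraphrased from the text, not taken from the authors' materials):

```python
# Illustrative check of the exam logistics described above.
total_students = 250
exam_days = 7
parallel_circuits = 2

# Stations per OSCE circuit, as listed in the text.
stations = {
    "rest": 2,
    "short case": 4,
    "counseling/communication": 2,
    "data interpretation": 2,
    "clinical scenario": 2,
    "CT scan": 2,
    "chest X-ray": 2,
    "ECG interpretation": 2,
}

per_day = -(-total_students // exam_days)   # ceiling division: ~36 students/day
per_circuit = per_day // parallel_circuits  # ~18 students per circuit
total_stations = sum(stations.values())     # 18 stations per circuit

print(per_day, per_circuit, total_stations)  # → 36 18 18
```

The counts are internally consistent: seven daily groups of roughly 36 cover all 250 candidates, and the listed station types sum to the stated 18 per circuit.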

The content of each OSCE station was decided by senior faculty members of the department of medicine of KMC/KTH according to the KMU syllabus for final year MBBS students. An answer key was prepared for each station before the exam began, and all examiners were briefed about the nature and content of the OSCE beforehand, with their queries addressed.

Each station lasted 4 minutes, with two bells rung: one at the start and one 30 seconds before the close of each station, to alert candidates to complete their remaining tasks. Students moved anticlockwise through the OSCE circuit. On completing a circuit, students were segregated to prevent crossover and leakage of exam material to the unexamined students waiting their turn in the demonstration room of the department of medicine.

The questionnaire was distributed to every student at the completion of each OSCE circuit. Students filled in the form in the presence of an investigator, who could help if they did not understand any term or question. As all 250 final year medical students completed the questionnaire, the response rate was 100%. The data were analyzed using SPSS version 20.

3. Results

The study comprised 160 (64%) male and 90 (36%) female students. Data were obtained on a questionnaire previously used by Russel et al. and Khan et al. in similar but separate surveys. Results were broadly grouped into overall fairness of the OSCE, curriculum coverage, time prioritization and management skills, administration, the stress factor, level of prior education regarding the OSCE, legibility of instructions, validity and reliability, and recommendations for future use. Finally, the students' rating of different assessment formats, including the OSCE, and the quality of performance testing in the OSCE were recorded (Tables 1 and 2).

Table 1: Students' rating of different assessment formats.
Table 2: Quality of performance testing in OSCE.

Two hundred and twenty students (88%) stated that the examination was fair and comprehensive; the remaining thirty students (12%) either disagreed or remained neutral. When asked about the overall administration of the OSCE, all 250 (100%) candidates declared the exam well structured and well governed. Two hundred and thirty-five (94%) of the examinees found the OSCE more stressful than conventional exam formats, 5% believed they were more relaxed than usual, and the remaining 1% stayed neutral. Compared with traditional exam formats such as short essay questions (SEQs), long questions, and viva voce, 240 (96%) of the students said that the OSCE gave them an opportunity to compensate for areas of weakness, and 243 (97%) were of the opinion that the exam was less biased. All candidates (100%) said the exam instructions were clear and legible and that they had had enough prior knowledge of what would happen in the OSCE. In terms of overall validity and reliability, 96% of the students were satisfied; the remainder had reservations, believing that factors like gender, ethnicity, and personality did affect OSCE results. Of all 250 students surveyed, 87% said they would recommend the OSCE as a more practical, fair, and reliable means of clinical assessment in future, while the remaining 13% disagreed.
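The percentages above follow directly from the raw counts out of 250 respondents; a quick illustrative recomputation (our sketch, not the authors' SPSS output):

```python
# Recompute the reported percentages from the raw counts (illustrative only).
n = 250  # total respondents

counts = {
    "male": 160,
    "female": 90,
    "fair and comprehensive": 220,
    "more stressful": 235,
    "chance to compensate": 240,
    "less biased": 243,
}

# Percentage of respondents for each item, rounded to whole percent.
pct = {item: round(100 * c / n) for item, c in counts.items()}
print(pct)  # e.g. "male": 64, "fair and comprehensive": 88, ...
```

Each rounded value matches the figure quoted in the text (64%, 36%, 88%, 94%, 96%, 97%).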

Forty percent of the candidates believed that the OSCE is more demanding of examinee-related personal skills, such as time prioritization and effective time management during the exam, compared with 30% who believed this of essay writing. 87% of the participants stated that their communication and counseling skills were challenged and tested more in the OSCE than in any other examination format they had experienced. In the open comment section of the questionnaire, 68% of the candidates recommended that future candidates preparing for the OSCE alter their learning style and skills by concentrating not only on theory but also on refining practical attributes such as patient counseling, communication skills, and time prioritization and management during the OSCE.

4. Discussion

The OSCE is an important part of clinical assessment around the world. In our study, 88% of the candidates declared the OSCE a very fair and comprehensive way of examining medical students, a finding consistent with the observations of Shitu et al. [6–8].

Our study found that 96% of the students were happy with the validity and reliability of the OSCE. These results are in accordance with those obtained by Pierre et al. [9]. This survey also showed that only 4% of the candidates believed that gender, ethnicity, and personality affected the individual and overall assessment score in the OSCE, which is consistent with studies done in the recent past [6, 9].

A vast majority of the students (97%) said that the exam was less biased than conventional assessment formats, including MCQs, SEQs, long questions, and viva, that it covered a wide area of knowledge, and that it provided an opportunity to compensate for areas of weakness. These results are consistent with nurses' perceptions of the OSCE in a 2012 study by Selim et al. [10].

Of all the examinees, 94% found the OSCE more stressful and mentally tougher than traditional exam formats, despite having been briefed and educated well about the nature and format of the exam they were going to take. A study by Yedidia et al. concluded that preexam briefing of students about the OSCE positively influenced performance; however, the stress factor was found to be high in other studies as well [11, 12]. Allen et al. reported that anxiety escalated and remained constantly high throughout the OSCE stations. It was also noted that fear of the exam, a tense OSCE environment without any rest station, and rude or apathetic examiner behavior during the exam adversely affected student performance [13].

The current study showed that most of the students were satisfied with the content and sequencing of the stations and believed the exam was fairly logical. A similar student attitude was observed in a study by El-Nemer and Kandeel [14]. Moreover, students claimed that this exam format helped them identify their clinical areas of weakness and provided an opportunity to modify and improve their learning style and skills [15]. All the examinees in our study stated that the OSCE instructions were clear and legible, in contrast to a Pakistani study in which students had difficulty understanding the questions and instructions [16]. Although traditional exam formats such as MCQs, SEQs, long questions, and viva have been in practice for decades, most of the candidates welcomed the introduction of the OSCE as a means of clinical assessment and said they would prefer and recommend this exam format for future use. A similar result was reported by Pierre et al. in their 2004 student survey in the West Indies [9].

It is worth mentioning that all the results were in accordance with what the authors expected; notably, however, only a few of the candidates (4%) believed that OSCE scores were affected by factors like ethnicity, gender, or personality, in sharp contrast to our first audit in April 2015, when the majority perceived a role for these factors.

5. Conclusion

The examinees believed that the OSCE is a fair, valid, reliable, easier, and more comprehensive way of conducting clinical assessment in medicine, although mentally tougher and more intimidating than other exam formats such as MCQs, SEQs, and the long viva. Factors like ethnicity, gender, and personality were perceived not to affect the outcome of the OSCE, making it a less biased, practical, and standardized way of assessing clinical competence. Moreover, because the OSCE demands excellent communication, counseling, time management, and organizational skills, it is recommended that, before appearing in an OSCE, candidates identify and improve their areas of clinical weakness and adapt their learning style and skills accordingly, so as to score high in any future OSCE.

Competing Interests

The authors declare that they have no competing interests.

References

  1. R. M. Harden, M. Stevenson, W. W. Downie, and G. M. Wilson, “Assessment of clinical competence using objective structured examination,” British Medical Journal, vol. 1, no. 5955, pp. 447–451, 1975.
  2. M. Khan, S. M. Noor, and M. Siraj, “Students' perceptions of OSCE in Dentistry,” Advances in Health Sciences Education, vol. 1, pp. 30–36, 2015.
  3. A. A. Nasir, A. S. Yusuf, L. O. Abdur-Rahman et al., “Medical students' perception of objective structured clinical examination: a feedback for process improvement,” Journal of Surgical Education, vol. 71, no. 5, pp. 701–706, 2014.
  4. Achievements of KMU, 2012, http://ahpe.kmu.edu.pk/article/view/27.
  5. F. G. Siddiqui, “Final year MBBS students' perception for observed structured clinical examination,” Journal of the College of Physicians and Surgeons Pakistan, vol. 23, no. 1, pp. 20–24, 2013.
  6. B. Shitu and T. Girma, “Objective structured clinical examination (OSCE): examinee's perception at department of pediatrics and child health, Jimma university,” Ethiopian Journal of Health Sciences, vol. 18, pp. 47–52, 2008.
  7. A. Al Omari and Z. M. Shawagfa, “New experience with objective structured clinical examination in Jordan,” Rawal Medical Journal, vol. 35, no. 1, pp. 78–81, 2010.
  8. K. E. Duffield and J. A. Spencer, “A survey of medical students' views about the purposes and fairness of assessment,” Medical Education, vol. 36, no. 9, pp. 879–886, 2002.
  9. R. B. Pierre, A. Wierenga, M. Barton, J. M. Branday, and C. D. C. Christie, “Student evaluation of an OSCE in paediatrics at the University of the West Indies, Jamaica,” BMC Medical Education, vol. 4, article 22, 2004.
  10. A. A. Selim, F. H. Ramadan, M. M. El-Gueneidy, and M. M. Gaafer, “Using Objective Structured Clinical Examination (OSCE) in undergraduate psychiatric nursing education: is it reliable and valid?” Nurse Education Today, vol. 32, no. 3, pp. 283–288, 2012.
  11. M. J. Yedidia, C. C. Gillespie, E. Kachur et al., “Effect of communications training on medical student performance,” The Journal of the American Medical Association, vol. 290, no. 9, pp. 1157–1165, 2003.
  12. M. Mani and M. T. Hosseini, “Is OSCE successful in pediatrics?” Journal of Medical Education, vol. 6, pp. 153–158, 2005.
  13. R. Allen, J. Heard, M. Savidge, J. Bittengle, M. Cantrell, and T. Huffmaster, “Surveying students' attitudes during the OSCE,” Advances in Health Sciences Education, vol. 3, no. 3, pp. 197–206, 1998.
  14. A. El-Nemer and N. Kandeel, “Using OSCE as an assessment tool for clinical skills: nursing students' feedback,” Australian Journal of Basic and Applied Sciences, vol. 3, no. 3, pp. 2465–2472, 2009.
  15. R. A. Lindemann and J. Jedrychowski, “Self-assessed clinical competence: a comparison between students in an advanced dental education elective and in the general clinic,” European Journal of Dental Education, vol. 6, no. 1, pp. 16–21, 2002.
  16. M. Iqbal, B. Khizar, and Z. Zaidi, “Revising an OSCE in a resource limited Pakistani Medical school,” Education for Health (Abingdon, England), vol. 22, no. 1, pp. 209–212, 2009.