Education Research International


Research Article | Open Access


L. Benkirane, M. Hamza, W. Sbihi, S. El Arabi, "Perception of Learning Assessment Methods by Students at the End of Their Initial Training at the Faculty of Dentistry of Casablanca", Education Research International, vol. 2019, Article ID 8463169, 5 pages, 2019.

Perception of Learning Assessment Methods by Students at the End of Their Initial Training at the Faculty of Dentistry of Casablanca

Academic Editor: Connie M. Wiskin
Received: 07 Dec 2018
Accepted: 22 Jul 2019
Published: 03 Sep 2019


Aim. To explore students’ perception of theoretical, preclinical, and clinical assessment methods and to analyze their level of satisfaction, with the final goal of producing recommendations to address the weaknesses identified. Material and Methods. A descriptive, cross-sectional survey on students’ perception, at the end of their initial training, of learning assessment methods was carried out by a doctoral student at the Faculty of Dentistry of Casablanca. Results. 51.8% of surveyed students said they were not informed of the criteria for passing the exams, 35.7% felt that adding continuous assessment to the final test would be beneficial for them, and 45.1% of the latter proposed a frequency of one assessment per month. According to them, such a system would allow them to stay up to date, to manage time and knowledge better, and to receive feedback for checking and improving their skills. Practical activities assessment systems were considered adequate by 92% of the surveyed students. For the majority of students, clinical internship assessment focused on the number of procedures. Conclusion. Assessment methods influence students’ learning. They allow teachers to monitor students’ productivity, attitude, and quality of work, so that gaps can be identified in a timely manner at every level of the learning process. According to the students’ perception, theoretical evaluation should be adapted to the learning objectives, practical work assessment is quite satisfactory, and clinical evaluation is based mainly on quantitative criteria.

1. Introduction

Learning evaluation is a fundamental step in any university’s educational process because it makes it possible to assess a student’s acquired knowledge before validating his or her final certification.

It is also the concern of any teacher wishing to ensure that the expected skills have been acquired.

For students, it is the best way to identify their misconceptions, correct them, and guide their learning strategies [1].

The challenges of evaluating learning are numerous, as are its tools and procedures [2].

Effective assessment tools should be able to judge students’ progress fairly and objectively, whatever the field [3, 4].

In medical sciences in general, and dentistry in particular, this evaluation mainly covers three aspects: theoretical, practical (or preclinical), and clinical. Several parameters come into play in this evaluation, notably the diversity of trainers and training environments, the nature and complexity of the skills targeted at the end of the program, and the high societal expectations of the profession. These factors impose specific requirements on evaluators [5].

The dental medicine course in Casablanca consists of five years of study organized in two cycles. The first, two-year cycle is designed to provide students with training in basic medical sciences and preclinical odontological sciences, in addition to an introduction to odontotechnics. The second cycle, of three years, is dedicated to odontological training proper and leads to the award of the Doctorate of Dentistry.

The curriculum is delivered in the form of lectures and guided and practical instruction. Clinical internships start in the 4th year and take place in the consultation and treatment dental centers.

The evaluation covers theoretical, practical, and clinical education [6].

Theoretical instruction is assessed through a written exam each semester. Different assessment tools, depending on the teacher’s free choice, are used such as short answer questions, clinical cases, multiple-choice questions, and synthesis questions.

The evaluation of practical teaching is based on assessing the student’s mastery of the requested procedures; it is both formative and summative.

As for clinical teaching, the evaluation focuses on quantitative (number of procedures performed) and qualitative criteria (such as hygiene and asepsis, work field organization, theoretical knowledge, and sign language skills). The tools used can include clinical procedure quotas, clinical case presentations, clinical interviews, and objective and structured clinical examinations.

Unlike the literature on teaching quality assessment, the literature on students’ perception of learning assessment methods, particularly in dentistry, remains scarce.

It is in this context that a descriptive study was carried out within the Faculty of Dentistry of Casablanca. Its aim is to explore students’ perception of theoretical, practical, and clinical assessment methods and to analyze their level of satisfaction, with the final goal of producing recommendations to address the weaknesses identified.

2. Methods and Tools

Between December 2016 and January 2017, a descriptive, cross-sectional survey was carried out at the Faculty of Dentistry of Casablanca on students’ perception, at the end of their initial training, of learning assessment methods.

A questionnaire was designed, comprising three major parts:

The first deals with the overall perception of theoretical assessment: evaluation methods in general, the most appreciated method, and the need to complement the final exam with continuous assessment.

The second part focuses on the perception of practical work assessment: the values measured by the evaluation system, the perception of continuous assessments, and the usefulness of practical work for the preclinical internship.

The last part concerns the perception of clinical assessment: the values measured by the evaluation system and the perceived reliability of the evaluation methods in measuring students’ performance.

The questionnaires were distributed to all 5th year students and doctoral students of the 2016/2017 academic year; their list was provided by the Department of Student Affairs.

To reach all students, contact was established with 5th year students at their clinical internship sites and with doctoral students via an online questionnaire.

A focus group of around 15 students was organized. Its purpose was to refine the survey results by gathering students’ opinions on concepts likely to be misunderstood, collecting subjective data, and eliciting proposals from the students. Questions were open-ended, with no predefined list of answer choices.

Data were entered and analyzed using SPSS software.

3. Results

Among the 248 questionnaires distributed, 80.2% were filled out and returned: 55% of the respondent students were in 5th year, and the others were doctoral students.
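The response-rate figure above is straightforward to reproduce. A minimal sketch (illustrative only; the authors report using SPSS, not Python, and the returned-questionnaire count of 199 is an assumption inferred from the reported 80.2% of 248):

```python
# Illustrative sketch only: reproducing the descriptive percentages
# reported in this section from raw counts.

def pct(part: int, whole: int) -> float:
    """Percentage of `part` in `whole`, rounded to one decimal place."""
    return round(100 * part / whole, 1)

distributed = 248
returned = 199  # assumed count, inferred from the reported 80.2% response rate

response_rate = pct(returned, distributed)
print(response_rate)  # 80.2
```

The same helper reproduces the per-item percentages in Tables 1 to 4, each computed against the number of respondents to that question.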

3.1. Overall Perception of Theoretical Knowledge Assessment

Communication of Success Criteria. 51.8% of surveyed students said they were not informed of the criteria for passing the exams, while 48.2% responded that they were informed. The results of the students' overall perception of theoretical knowledge assessment are presented in Table 1.

Table 1: Students’ overall perception of theoretical knowledge assessment.

Students’ preferred evaluation method (N) (%)
 Clinical cases: 138 (69.3)
 Short answer questions: 46 (23.1)
 Synthesis question: 18 (9)

Implementation of continuous assessment (N) (%)

Proposed pace for continuous assessment (N) (%)
 Once a month: 32 (45.1)
 Twice a semester: 18 (25.4)
 Once every two weeks: 13 (18.3)
 Once a semester: 8 (11.3)

MCQ: multiple-choice questions.
3.2. Perception of Practical Work Assessment

Practical activities assessment systems are considered to be adequate by 92% of the surveyed students. Other parameters of the students’ perceptions of practical work are shown in Table 2.

Table 2: Students’ perceptions of practical work assessment.

Measured criteria (N) (%)
 How to be, how to do, and knowledge: 47 (23.6)
 How to do: 65 (32.7)
 How to be and how to do: 87 (43.7)

Students’ wish of having a communication of continuous assessment marks (N) (%)

Students’ wish of having argumentation of continuous assessment marks (N) (%)

3.3. Perception of Clinical Internship Assessment

For the majority of students, clinical internship assessment focused on the number of procedures (64.3% in conservative odontology, 71.4% in periodontology, and 71.4% in the emergency department) and also on knowledge, quality of procedures, or professionalism. According to our students, professionalism is mainly evaluated during the Dentofacial Orthodontics internship (40.2%).

The focus group believes that professionalism is measured by attendance, behavior towards supervisors, and dress that reflects the physician’s image.

The criteria measured through clinical assessment in each discipline are presented in Table 3.

Table 3: Criteria measured by discipline (N) (%).

Conservative dentistry
 Number of acts: 128 (64.3)
 How to do: 20 (10.1)
 Quality of acts: 24 (12.1)

Fixed prosthodontics
 Number of acts: 112 (56.3)
 How to do: 16 (8)
 Quality of acts: 8 (4)

Removable prosthodontics
 Number of acts: 94 (47.2)
 How to do: 24 (12.1)
 Quality of acts: 17 (8.5)

Dentofacial orthodontics
 Number of acts: 46 (23.1)
 How to do: 35 (17.6)
 Quality of acts: 7 (3.5)

Surgical dentistry
 Number of acts: 113 (56.8)
 How to do: 23 (11.6)
 Quality of acts: 24 (12.1)

 Number of acts: 84 (42.2)
 How to do: 55 (27.6)
 Quality of acts: 23 (11.6)

 Number of acts: 142 (71.4)
 How to do: 17 (8.5)
 Quality of acts: 5 (2.5)

 Number of acts: 142 (71.4)
 How to do: 18 (9)
 Quality of acts: 2 (1)

 Number of acts: 124 (62.3)
 How to do: 44 (22.1)
 Quality of acts: 2 (1)

The percentage of students who considered that the evaluation methods adopted in clinical placements provide reliable information on their performance is 47.2% in Surgical Odontology, 46.7% in Periodontology, and 37.7% in Pedodontics (Table 4).

Table 4: Students’ perceptions of the evaluation methods adopted and the reliability of the information they provide on actual performance (N) (%).

Conservative dentistry: 50 (25.1)
Fixed prosthodontics: 4 (2)
Removable prosthodontics: 59 (29.6)
Dentofacial orthodontics: 62 (31.2)
Surgical dentistry: 94 (47.2)

4. Discussion

There has been a considerable evolution in evaluation practices throughout the world, in order to best meet the quality and professionalism requirements of the future dentist. At the Faculty of Dentistry of Casablanca, methods and criteria to pass the exams are defined by the institution; they have already been codified by law (articles no. 20, 21, and 23 of the decree of February 15, 1993).

4.1. Assessment of Theoretical Knowledge

Although this information is distributed by the administrative staff and by student representatives, directly or via the student guide and website, 51.8% of the surveyed students reported a lack of communication of the success criteria.

54.3% of students said they were not informed of the nature of the tests at the beginning of the teaching sequence.

These percentages can be partly explained by a high level of absenteeism, noted from the beginning of the year.

According to WHO standards [7], it remains the faculty’s duty to define, describe, and communicate the methods used to evaluate its students.

69.3% of the surveyed students preferred clinical case resolution.

This contrasts with the result of the faculty’s 2008 accreditation (conducted according to the standards of the World Federation for Medical Education (WFME)), in which the multiple-choice question system was the most appreciated. The difference can be explained by the fact that that study covered students at all levels.

Students at the end of the cycle mainly choose a method that evaluates their clinical reasoning. This type of test is a prime tool for exploring the reasoning process [8].

Although multiple-choice questions are one of the most frequently used assessment methods in dental schools, many of these assessments are still based on questions of low cognitive level (i.e., representing students’ ability to understand and remember) [9].

35.7% of students felt that integrating continuous assessment in addition to the final test would be beneficial for them, and 45.1% of them proposed a frequency of one assessment per month. According to them, this system would allow them to stay up to date, to manage time and knowledge better, and to receive feedback for checking and improving their skills.

This was corroborated by a survey conducted at a Faculty of Dentistry in India [10], where students who voluntarily enrolled in online formative assessments during the first and second semesters obtained higher scores, with statistically significant differences.

Another study, conducted at the Michigan School of Dentistry, showed that among the most common suggestions made by students was that professors design deeper evaluations together with their students than when they conceive them alone. Student co-designed questions proved to be an effective tool for building critical thinking and improved students’ results compared with questions designed by teachers alone [11].

In addition, a working group of 14 dental schools [12] addressed the issue of the ideal academic environment as seen by their students, and a number of improvements were suggested, including
(i) Establishing clear and carefully planned assessment objectives
(ii) Conducting regular, on-demand assessments that stimulate active learning and reduce final exam anxiety
(iii) Providing immediate feedback to students

4.2. Practical Work Assessment

The main shortcoming identified is that continuous assessment marks are not explained to students.

For some practical work sessions, teachers in fact have no evaluation sheets that can justify the marks given to students. This is why better elaborated grids are essential; they could rate each evaluation criterion as perfect, good, acceptable, or nil.

Diemer in 2005 [13] proposed the addition of photographs and/or videos to these validation criteria.

An evaluation grid for endodontic practical work, together with visual support, also seemed to help Toulouse students develop skills more quickly toward a professionally satisfactory act, as shown by a study carried out between 1999 and 2002 [14].

4.3. Clinical Internship Assessment

According to Diemer et al., clinical internships represent the ideal place for skills acquisition; they allow the transfer and, above all, the mobilization of knowledge, ultimately encouraging clinical and practical reasoning [13].

According to our students’ perception, almost all departments evaluate first the number of acts and only then knowledge, professionalism, or the quality of the acts. This may further increase the stress on students, as already shown in a 2015 study on the value of implementing mentoring for students at the Faculty of Dentistry of Casablanca [15]. Its results demonstrated that the transition to the 4th and 5th years is a major source of stress, with quota validation among the factors mentioned [15, 16].

A qualitative study based on interviews with final-year dental surgery students at the University of Toulouse in 2015-2016 confirms the negative effects of quantitative clinical evaluation, such as patient abandonment or modification of the treatment plan. Assessment “per treated patient” is considered more humane, less stressful, and more effective. New student- and patient-centred clinical assessment methods must be developed to support student competencies [17].

In the focus group, students felt that professionalism is limited to attendance, behavior, and dress that reflect the physician’s image.

According to Pelaccia [4], five attributes characterize caregivers who demonstrate professionalism:
(i) Development and maintenance of skills throughout the career
(ii) Relational abilities
(iii) Collaborative practice skills
(iv) Professional integrity and ethics
(v) Partnership with the patient

On the other hand, according to our students, assessment methods provide reliable information on actual performance in the surgical and pedodontic departments in only 47.2% and 37.7% of cases, respectively.

Assessing the quality of clinical procedures must be an integral part of the final mark given to the internship, alongside the quota imposed on students.

In order to reach this goal, the Faculty of Dentistry of Casablanca has implemented since 2010 [18] an online evaluation system inspired by the one applied at the University of Montreal [19]. It is an easy-to-use tool that addresses clinical evaluation issues and provides a quality strategy for evaluating competencies that are considered essential in addition to the number of procedures to be performed.

Students can also benefit from rapid and more effective feedback based on known and well-defined uniform criteria, and they can have better control over their clinical progress.

Without such feedback, however, students do not know how much progress they are making during the clinical internship, which can lead to a mismatch between students’ and teachers’ perceptions of competency levels. As a result, students may tend to overestimate their skill levels and believe strongly in a performance that does not always meet their supervisors’ expectations.

There are still improvements to be made to this system, which does not record, for example, the time taken to perform a given act. It also requires real involvement and availability from supervisors, who must be able to fill in the form in real time.

5. Conclusion

Evaluation methods of theoretical, preclinical, and clinical teaching adopted by the Faculty of Dentistry of Casablanca are numerous. No method or tool is good or bad in itself, but it must be fundamentally consistent with pre-established pedagogical choices and objectives.

Our study shows that the majority of end-of-course students believe that the methods of theoretical knowledge evaluation should be modified, that the system for evaluating practical work is perceived as satisfactory by almost all, and that the evaluation of clinical placements focuses mainly on the number of procedures.

We hope to have made useful data available to teachers to improve evaluation methods, in particular by integrating continuous assessment into theoretical evaluation, by explaining practical work marks, and by further evaluating know-how and life skills in clinical internship.

Improving assessment practices requires the commitment of both teachers and students; when students see the results, they will be more likely to commit.

Data Availability

The data used to support the findings of this study are included within the article.

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this paper.


References

  1. R. Valentine, D. Berthiaume, and A.-C. Allin-Pfister, “Comment évaluer les apprentissages dans l’enseignement supérieur professionnalisant?” in Guides Pratiques, De Boeck Supérieur, Brussels, Belgium, 1st edition, 2017. View at: Google Scholar
  2. A. Harouchi, La Pédagogie Des Compétences, LE FENNEC, Casablanca, Morocco, 2001.
  3. P. Jindal and G. Khurana, “The opinion of post graduate students on objective structured clinical examination in anaesthesiology: a preliminary report,” Indian Journal of Anaesthesia, vol. 60, no. 3, pp. 168–173, 2016. View at: Publisher Site | Google Scholar
  4. T. Pelaccia, “Comment (mieux) former et évaluer les étudiants en médecine et en sciences de la santé?” in Guides Pratiques, p. 480, De Boeck Supérieur, Brussels, Belgium, 1st edition, 2016. View at: Google Scholar
  5. J. Jouquan, “L’évaluation des apprentissages des étudiants en formation médicale initiale,” Pédagogie Médicale, vol. 3, no. 1, pp. 38–52, 2002. View at: Publisher Site | Google Scholar
  6. Le Programme Educationnel, Guide to the Establishment Self-Assessment Report of the School of Dentistry of Casablanca, 2008.
  7. WHO, Guidelines for Quality Assurance of Basic Medical Education in the Western Pacific Region, World Health Organization Regional Office for the Western Pacific Manila, Manila, Philippines, 2001.
  8. T. M. Gerzina, R. Worthington, S. Byrne, and C. McMahon, “Student use and perceptions of different learning aids in a problem-based learning (PBL) dentistry course,” Journal of Dental Education, vol. 67, no. 6, pp. 641–653, 2003. View at: Google Scholar
  9. J. E. Albino, S. K. Young, L. M. Neumann et al., “Assessing dental students’ competence: best practice recommendations in the performance assessment literature and investigation of current practices in predoctoral dental education,” Journal of Dental Education, vol. 72, no. 12, pp. 1405–1435, 2008. View at: Google Scholar
  10. B. L. Olson and J. L. McDonald, “Influence of online formative assessment upon student learning in biomedical science courses,” Journal of Dental Education, vol. 68, no. 6, pp. 656–659, 2004. View at: Google Scholar
  11. C. Gonzalez-Cabezas, O. S. Anderson, M. C. Wright, and M. Fontana, “Association between dental student-developed exam questions and learning at higher cognitive levels,” Journal of Dental Education, vol. 79, no. 11, pp. 1295–1304, 2015. View at: Google Scholar
  12. K. Divaris, P. J. Barlow, S. A. Chendea et al., “The academic environment: the students’ perspective,” European Journal of Dental Education, vol. 12, no. 1, pp. 120–130, 2008. View at: Publisher Site | Google Scholar
  13. F. Diemer, M.-E. Lauret, E. Prats, and P. Calas, “Evaluation préliminaire d’un dispositif de coévaluation en travaux pratiques d’endodontie,” Pédagogie Médicale, vol. 6, no. 2, pp. 79–87, 2005. View at: Publisher Site | Google Scholar
  14. S. Lauret and S. Negrini, “Mise en place d’un nouveau système d’évaluation des stages hospitaliers dans le cadre de la réforme du diplôme de formation approfondie en sciences médicales,” Thesis, Paris-Sud University, Orsay, France, 2013. View at: Google Scholar
  15. A. Chlyah, S. Boulagriss, and I. Aatafay, “Le mentorat à la faculté de médecine dentaire de casablanca,” School of Dentistry of Casablanca, Casablanca, Morocco, 2015, Doctoral thesis. View at: Google Scholar
  16. C. T. Preoteasa, A. Axante, A. D. Cristea, and E. Preoteasa, “The relationship between positive well-being and academic assessment: results from a prospective study on dental students,” Education Research International, vol. 2016, Article ID 9024687, 8 pages, 2016. View at: Publisher Site | Google Scholar
  17. C. Marty, B. Gendron, F. Vaysse, I. Alsina, and J.-N. Vergnes, “Impact of a “per treated patient” clinical assessment method among dental students,” Pédagogie Médicale, vol. 18, no. 3, pp. 121–128, 2017. View at: Publisher Site | Google Scholar
  18. S. El Arabi, M. Hamza, H. Dahiri, and S. Khallouki, “L’évaluation en ligne des compétences cliniques des étudiants: qu’en pensent nos enseignants et nos étudiants?” School of Dentistry of Casablanca, Casablanca, Morocco, 2016, Doctoral thesis no. 55/56. View at: Google Scholar
  19. A. Charbonneau, “Evaluation des compétences cliniques en médecine dentaire,” in Newsletter of the Centre for Studies and Training in Higher Education, University of Montréal, Montreal, Canada, May 2002. View at: Google Scholar

Copyright © 2019 L. Benkirane et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
