Anesthesiology Research and Practice
Volume 2015, Article ID 971406, 6 pages
Research Article

Web-Based Learning for Emergency Airway Management in Anesthesia Residency Training

1Department of Anesthesia, McMaster University, 1280 Main Street West, Hamilton, ON, Canada L8S 4K1
2Biostatistics Unit, St. Joseph’s Healthcare Hamilton, McMaster University, 50 Charlton East, Hamilton, ON, Canada L8N 4A6
3Department of Clinical Epidemiology and Biostatistics and Department of Pediatrics, McMaster University, 1280 Main Street West, Hamilton, ON, Canada L8S 4K1

Received 20 September 2015; Accepted 29 November 2015

Academic Editor: Daniel E. Lee

Copyright © 2015 Ada Hindle et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


Introduction. Web-based learning (WBL) is increasingly used in medical education; however, residency training programs often lack guidance on its implementation. We describe how the use of feasibility studies can guide the use of WBL in anesthesia residency training. Methods. Two case-based WBL emergency airway management modules were developed for self-directed use by anesthesia residents. The feasibility of using this educational modality was assessed using a single cohort pretest/posttest design. Outcome measures included user recruitment and retention rate, perceptions of educational value, and knowledge improvement. The differences between pre- and postmodule test scores and survey Likert scores were analysed using the paired t-test. Results. Recruitment and retention rates were 90% and 65%, respectively. User-friendliness of the modules was rated highly. There was a significant improvement in perceptions of the value of WBL in the postsurvey. There was a significant knowledge improvement of 29% in the postmodule test. Conclusions. Feasibility studies can help guide appropriate use of WBL in curricula. While our study supported the potential feasibility of emergency airway management modules for training, collaboration with other anesthesia residency programs may enable more efficient development, implementation, and evaluation of this resource-intensive modality in anesthesia education and practice.

1. Introduction

Web-based learning (WBL) can be defined as the “usage of computers and networks in education,” including learning management systems, online tutorials, discussion forums, and simulation [1, 2]. WBL has become increasingly used in medical education; however, it is important to understand how to best design and implement it as an educational modality [3–20].

Difficult and emergency airway management is an essential skill to acquire during anesthesia residency training. WBL may complement preexisting curricula and address gaps in clinical training in this area. Advantages of WBL include flexibility and access, enhancement of other educational modalities, ease for content updating, and appeal for the current “millennial learners” [21, 22].

While learning theories can guide the pedagogical use of WBL, major barriers to implementation in residency include time, cost, and technical expertise requirements as well as inadequate learner and faculty uptake [20, 21]. Therefore, WBL is effective only if successfully integrated in curricula and consistently used by learners [23]. Given the comparable educational outcomes of WBL and traditional methods [18, 19, 21], it is crucial to assess the feasibility [24] of WBL implementation in order to balance expense and educational benefits, especially for departments with limited resources.

The WBL literature provides little guidance on how to assess the feasibility of implementing a new teaching modality. Therefore, the purpose of this paper is to describe how we developed WBL modules for emergency airway training and explored their feasibility for implementation into our anesthesia residency curriculum at McMaster University, Canada. Feasibility is assessed by examining the recruitment and retention rate, user perceptions, and knowledge improvement. We hope our experience will inform other training programs that are considering incorporating WBL into their curricula.

2. Materials and Methods

2.1. Module Design and Development

We conducted a literature review and consulted an educational technology instructional designer in order to develop the modules according to principles of effective instructional web design [21–23]. Based on active learning theories, aspects of WBL design that improve its educational efficacy include interactivity, practice exercises, feedback, and repetition [2, 18, 21, 23, 25]. Embedded case-based, self-assessment questions have also been shown to improve learning outcomes [26].

We selected two common emergency airway topics (burn and facial trauma) and developed them into two separate case-based airway modules for self-directed online use. Background information was provided in each patient scenario along with relevant images and diagrams. Embedded multiple choice questions allowed the participant to interactively make clinical management decisions throughout the modules. After selecting an answer to a question, the user received immediate feedback with a detailed explanation.

We developed the study objectives and content based on information from the Airway Evaluation and Management Guidelines of the National Curriculum for Canadian Anesthesia Residency [27], anesthesia textbooks [28–30], and expert opinion papers [31–35]. The content of the modules was peer-reviewed by several faculty anesthesiologists and revised for clarity and appropriateness.

The study instruments included pre- and postmodule surveys and knowledge tests. The pre- and postmodule surveys (see the supplemental content in the Supplementary Material available online) assessed the participants’ perceptions of the value and user-friendliness of WBL and other educational modalities. The pre- and postmodule knowledge tests assessed the module content. Together, the modules, surveys, and tests formed the Training for Emergency Airway Management (TEAM) course, which was launched on Avenue to Learn (a computer-based McMaster learning management system).

After logging into Avenue to Learn, residents were directed to complete TEAM in the following order: premodule knowledge test and survey, the two airway modules, and finally the postmodule knowledge test and survey. Residents had five weeks to complete TEAM. Further details of how the modules and study instruments were created are outlined in the supplemental content. The total time for the development, design, testing, and launch of TEAM was 165 hours.

2.2. Feasibility Study Procedures

This study was approved by the McMaster University Research Ethics Board. We used a single cohort pretest/posttest design where participants served as their own control [36]. Because TEAM was developed for junior residents, only the current year’s cohort of anesthesia residents in postgraduate years (PGY) 1–4 (total study population = 29) were invited to participate.

2.3. Data Collection

Data from the participant responses on Avenue to Learn were exported to Excel data files, deidentified, and assigned individual codes by our Research Coordinator before analysis by the investigators.

2.4. Outcome Measures

The feasibility of incorporating WBL into the curriculum was assessed by the recruitment rate (% of participants/total study population) and retention rate (% of participants who successfully completed entire TEAM/total participants), user perceptions (reported as mean Likert scores), and knowledge improvement (measured by comparing pre- and postmodule knowledge test scores).

This WBL program would be considered feasible [24] if the following predefined criteria were met: (1) “definitely feasible” if recruitment rate and retention rate were at least 80%, (2) “possibly feasible” if recruitment rate and retention rate were between 60 and 79%, (3) mean user perception ratings of at least 5 on a 7-point Likert scale, (4) increased postmodule Likert scores of perceptions of the educational value of WBL, and (5) significant knowledge improvement between the pre- and postmodule test scores.
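As an illustration, the rate-based criteria (1) and (2) can be sketched as a small decision rule. This is a hypothetical reading of the thresholds (classifying the program by the lower of the two rates), not code used in the study; the counts passed in the example are taken from the Results section.

```python
# Hypothetical sketch of feasibility criteria (1)-(2): classify a WBL
# program by its recruitment and retention rates.

def classify_feasibility(recruited, completed, eligible):
    """Return (recruitment %, retention %, verdict) under one reading
    of the predefined thresholds (both rates must clear a band)."""
    recruitment_rate = recruited / eligible * 100
    retention_rate = completed / recruited * 100
    if recruitment_rate >= 80 and retention_rate >= 80:
        verdict = "definitely feasible"
    elif recruitment_rate >= 60 and retention_rate >= 60:
        verdict = "possibly feasible"
    else:
        verdict = "not feasible"
    return recruitment_rate, retention_rate, verdict

# Counts reported in this study: 26 of 29 recruited, 17 completed everything.
rec, ret, verdict = classify_feasibility(recruited=26, completed=17, eligible=29)
print(f"recruitment {rec:.0f}%, retention {ret:.0f}% -> {verdict}")
# -> recruitment 90%, retention 65% -> possibly feasible
```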

2.5. Statistical Analysis

Pre- and postmodule test scores and survey Likert scores were summarized as mean and standard deviation (SD). The differences between post- and premodule test scores and survey Likert scores were assessed using two-sided paired t-tests. The results were reported as mean differences, with corresponding 95% confidence intervals and associated p values. We set the level of significance at alpha = 0.05 and did not adjust this for multiple comparisons since these were primarily exploratory. Further exploratory analyses using regression were performed to assess associations between residency year and outcomes. All analyses were conducted using Stata 10.1 (StataCorp, College Station, TX).
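The paired analysis described above can be sketched as follows. The scores here are invented placeholders (not the study data), and `scipy` is an assumed substitute for the Stata routines actually used:

```python
# Illustrative two-sided paired t-test with mean difference and 95% CI,
# mirroring the analysis plan described above. Scores are hypothetical.
import numpy as np
from scipy import stats

pre = np.array([7, 8, 6, 9, 7, 8, 5, 10])     # hypothetical premodule scores (/14)
post = np.array([11, 12, 9, 13, 12, 11, 9, 14])  # hypothetical postmodule scores

diff = post - pre
t_stat, p_value = stats.ttest_rel(post, pre)  # two-sided paired t-test

# 95% confidence interval for the mean paired difference
n = len(diff)
se = diff.std(ddof=1) / np.sqrt(n)
ci = stats.t.interval(0.95, df=n - 1, loc=diff.mean(), scale=se)

print(f"mean difference {diff.mean():.2f}, "
      f"95% CI ({ci[0]:.2f} to {ci[1]:.2f}), p = {p_value:.4f}")
```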

3. Results

3.1. Recruitment and Retention

Twenty-six out of a total of 29 eligible anesthesia residents consented to the study, for a recruitment rate of 90% (Table 1). Eighteen of the 26 residents completed both the pre- and posttests, but one of these residents failed to complete the postsurvey (Figure 1). Therefore, the retention rate was 65% based on those 17 residents who completed all the pre- and postmodule tests and surveys. Incomplete data from the other participants were not included in the analysis.

Table 1: Participants’ demographics (n = 17).
Figure 1: Flow diagram of residents’ participation.
3.2. User Perceptions

User perceptions were assessed by questions relating to user-friendliness and educational value of WBL, using a 7-point Likert scale. User-friendliness was perceived to be high, as measured by “ease of use” (mean score: 6.76, SD: 1.15), “interpretability” (mean score: 6.5, SD: 1.26), and “visual aid” (mean score: 5.35, SD: 2).

With respect to perceptions of educational value, in the pre- and postmodule surveys, residents ranked “simulation” and “on call experience with faculty member” as the first and second preferred learning methods, respectively, for learning to manage emergency airways (Figure 2). WBL was ranked in the middle, with a mean premodule value of 4.82 (SD: 1.55); self-directed learning was ranked the lowest at 3.59 (SD: 1.37). After the modules, both of these modalities increased significantly in perceived value: self-directed learning to a mean of 4.24 (SD: 1.25) (95% CI 0.14 to 1.16) and WBL to 5.76 (SD: 1.39) (95% CI 0.41 to 1.47), suggesting a positive impact of using WBL on these perceptions.

Figure 2: Participant perception of teaching modality value in premodule versus postmodule survey comparison.
3.3. Knowledge Improvement

Knowledge improvement was measured by comparing the pre- and postmodule knowledge test scores. The mean knowledge test scores were calculated out of 14 questions, instead of the original 15, because of a reported broken image link in one of the questions; a comparison analysis revealed no statistical difference between pre- and postmodule test scores based on 14 questions versus 15. On average, residents scored a mean of 7.39 (SD: 1.97) out of 14 on the premodule test (Table 2). The postmodule test score was significantly higher, with a mean of 11.44 (SD: 1.72). This represented a 29% improvement on the postmodule knowledge test.
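One plausible reading of the reported 29% figure is the gain in mean score expressed as a fraction of the 14 scored questions; a quick arithmetic check under that assumption:

```python
# Gain in mean score as a percentage of the 14 scored questions
# (one plausible interpretation of the reported 29% improvement).
pre_mean, post_mean, n_questions = 7.39, 11.44, 14
improvement = (post_mean - pre_mean) / n_questions * 100
print(f"{improvement:.0f}%")  # -> 29%
```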

Table 2: Pre- and postmodule comparisons of tests and surveys.

For each individual test question, a higher percentage of participants answered the posttest version correctly than the pretest version, with the exception of two questions, possibly because of ambiguous wording or content (Figure 3). Overall, premodule test scores tended to increase slightly with level of training, although there was no statistically significant difference in scores between junior and more senior residents.

Figure 3: Individual question breakdown in pretest versus posttest comparison for percentage of participants who answered correctly.

4. Discussion

WBL is increasingly being used in anesthesia training. Bello et al. [9] demonstrated that the combination of online lecture slides and videos of difficult airway procedures resulted in an increase in postknowledge test scores and positive satisfaction scores. Soto et al. [17] paired didactic online lectures on anesthesia drug costs with “in-person” presentations to improve knowledge scores. Kopp and Smith [14] showed improved test scores in regional anesthesia with both case-based and traditional textbook style modules.

In their meta-analysis and systematic review, Cook et al. [18, 19] concluded that WBL is effective compared with no intervention and is at least as effective as traditional educational interventions. Therefore, given these findings, research should focus on examining the appropriate conditions for effective WBL implementation and use [18, 19]. In order to examine the usefulness of WBL in enhancing emergency airway management training in our curriculum and guide further development, we assessed the feasibility of implementing WBL modules into the anesthesia curriculum with respect to user uptake, perceptions, and educational value.

According to our feasibility criteria, the findings suggest that implementing interactive WBL emergency airway management modules in our anesthesia curriculum is “possibly feasible” as demonstrated by the recruitment and retention rates, user satisfaction, and evidence of educational value. The high recruitment rate of 90% is promising, suggesting that almost all of the residents were interested in WBL. However, the lower retention rate of 65% is problematic, pointing to a need to further examine contributing factors, such as technical issues, complexity or length of the modules, and/or the study measurement tools.

The user-friendliness of the modules, in terms of ease of use and interpretability, was highly rated. The lower rating on the use of visual aids might have been improved with more images or other multimedia. With respect to perceptions of educational value, resident “buy-in” for the use of WBL is further affirmed by a significantly increased mean Likert score and ranking in the postmodule survey compared with the premodule survey (4.82, SD: 1.55, versus 5.76, SD: 1.39). The educational value and construct validity of these modules are also supported by a significant overall improvement in postmodule knowledge test scores.

Of all the educational modalities, “self-directed learning” received the lowest rating of 3.59 (SD: 1.37) and although it did significantly improve to 4.24 (SD: 1.25), it still remained low. The low ratings could have resulted from varying interpretations of what the term meant (as we did not clearly define it). Furthermore, it was asked separately from the WBL question; therefore residents may not have made the connection between self-directed learning and WBL. It is also possible that, in the critical area of emergency airway management, residents preferred to receive more instructor guidance than would be provided by self-directed learning. This finding warrants further investigation.

On call experience with faculty and simulation training were valued the highest in both the pre- and postmodule surveys. This finding is consistent with studies showing that learners often seek experiential knowledge in the context of patient encounters [37, 38]. Similar to other studies [39–42], our findings suggest that WBL is regarded as a valued supplement to, but not a replacement for, current experiential educational modalities.

The results of our feasibility study as well as our systematic evidence-informed approach [43–45] to the development of these modules help inform our decision to implement WBL as a potentially efficacious modality that will enhance anesthesia residency training. Based on these outcomes, future development would include improving the technical aspects of WBL design, addressing the limitations of the learning management system, examining issues associated with retention, and incorporating facilitated online discussions.

There were several outcomes that we were unable to report because of limitations in the learning management system. The software did not allow us to collect comments from participants, report the length of time spent on the different components, or determine whether “incorrect” questions were truly answered incorrectly or simply not attempted. While knowledge improved after the modules, longer term knowledge retention still needs to be assessed. We did not enable any online forums in order to maintain participant anonymity; however, forums should be considered as a venue for feedback to improve the modules. We cannot generalize our specific results to other programs; nevertheless, our approach to developing WBL modules and to assessing the feasibility of incorporating WBL can inform other programs that are considering incorporating WBL into their curricula.

5. Conclusion

In conclusion, our study shows how we can assess whether WBL is a feasible and potentially efficacious educational modality for implementation into an anesthesia training curriculum. Although the initial time and resource commitments were substantial, our findings suggest the possible feasibility of using WBL modules to enhance the clinical training in difficult and emergency airway management. It would be important to consider collaborating with other anesthesia training programs in order to pool resources and more efficiently develop, implement, and further evaluate this educational modality for anesthesia education and practice.


Abbreviations

WBL: Web-based learning
TEAM: Training for Emergency Airway Management
PGY: Postgraduate year
SD: Standard deviation

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.


Ada Hindle is the first author and Principal Investigator. Ada has recently completed her anesthesia residency in the Department of Anesthesia, McMaster University, Hamilton, Ontario, Canada. Ji Cheng is a coauthor. She is a Research Associate with the Department of Anesthesia, McMaster University, and the Biostatistics Unit at St. Joseph’s Healthcare Hamilton. Lehana Thabane is a coauthor. He is a Research Methodologist, Professor, and Associate Chair, Department of Clinical Epidemiology and Biostatistics, and Director of the Biostatistics Unit, St. Joseph’s Healthcare Hamilton. Anne Wong is a coinvestigator, corresponding author, and supervisor of the resident research project. She is a Professor in the Department of Anesthesia, McMaster University.


Acknowledgments

Thanks are due to Toni Tidy for research coordination; to Devon Mordell for technology support and for webpage design supervision and development; and to Drs. G. Peachey, L. Olivieri, M. Hollidge, and G. Rosenblood for reviewing the content of the modules.


  1. M. Sajeva, “E-learning: web-based education,” Current Opinion in Anaesthesiology, vol. 19, no. 6, pp. 645–649, 2006. View at Publisher · View at Google Scholar · View at Scopus
  2. D. A. Cook, A. J. Levinson, S. Garside, D. M. Dupras, P. J. Erwin, and V. M. Montori, “Instructional design variations in internet-based learning for health professions education: a systematic review and meta-analysis,” Academic Medicine, vol. 85, no. 5, pp. 909–922, 2010. View at Publisher · View at Google Scholar · View at Scopus
  3. R. Gutmark, M. J. Halsted, L. Perry, and G. Gold, “Use of computer databases to reduce radiograph reading errors,” Journal of the American College of Radiology, vol. 4, no. 1, pp. 65–68, 2007. View at Publisher · View at Google Scholar · View at Scopus
  4. K.-M. Fung, L. A. Hassell, M. L. Talbert, A. F. Wiechmann, B. E. Chaser, and J. Ramey, “Whole slide images and digital media in pathology education, testing, and practice: the Oklahoma experience,” Analytical Cellular Pathology, vol. 35, no. 1, pp. 37–40, 2012. View at Publisher · View at Google Scholar · View at Scopus
  5. R. Friedl, H. Höppler, K. Ecard, W. Scholz, A. Hannekum, and S. Stracke, “Development and prospective evaluation of a multimedia teaching course on aortic valve replacement,” The Thoracic and Cardiovascular Surgeon, vol. 54, no. 1, pp. 1–9, 2006. View at Publisher · View at Google Scholar · View at Scopus
  6. N. Fearing, S. Bachman, M. Holzman, D. Scott, and M. Brunt, “Evaluation of a video-based curriculum for laparoscopic biliary surgery: a pilot study from the SAGES MIS Web Learning Center,” Surgical Endoscopy, vol. 24, no. 12, pp. 3141–3143, 2010. View at Publisher · View at Google Scholar · View at Scopus
  7. T. Satterwhite, J. Son, J. Carey et al., “Microsurgery education in residency training: validating an online curriculum,” Annals of Plastic Surgery, vol. 68, no. 4, pp. 410–414, 2012. View at Publisher · View at Google Scholar · View at Scopus
  8. R. Kulier, J. Hadley, S. Weinbrenner et al., “Harmonising evidence-based medicine teaching: a study of the outcomes of e-learning in five European countries,” BMC Medical Education, vol. 8, article 27, 10 pages, 2008. View at Publisher · View at Google Scholar · View at Scopus
  9. G. Bello, M. A. Pennisi, R. Maviglia et al., “Online vs live methods for teaching difficult airway management to anesthesiology residents,” Intensive Care Medicine, vol. 31, no. 4, pp. 547–552, 2005. View at Publisher · View at Google Scholar · View at Scopus
  10. L. F. Chu, C. Young, A. Zamora, V. Kurup, and A. Macario, “Anesthesia 2.0: internet-based information resources and web 2.0 applications in anesthesia education,” Current Opinion in Anaesthesiology, vol. 23, no. 2, pp. 218–227, 2010. View at Publisher · View at Google Scholar · View at Scopus
  11. S. Giglioli, S. Boet, A. R. De Gaudio et al., “Self-directed deliberate practice with virtual fiberoptic intubation improves initial skills for anesthesia residents,” Minerva Anestesiologica, vol. 78, no. 4, pp. 456–461, 2012. View at Google Scholar · View at Scopus
  12. J. M. Garfield, S. Paskin, and J. H. Philip, “An evaluation of the effectiveness of a computer simulation of anaesthetic uptake and distribution as a teaching tool,” Medical Education, vol. 23, no. 5, pp. 457–462, 1989. View at Publisher · View at Google Scholar · View at Scopus
  13. A. Jerath, A. Vegas, M. Meineri et al., “An interactive online 3D model of the heart assists in learning standard transesophageal echocardiography views,” Canadian Journal of Anesthesia, vol. 58, no. 1, pp. 14–21, 2011. View at Publisher · View at Google Scholar · View at Scopus
  14. S. L. Kopp and H. M. Smith, “Developing effective web-based regional anesthesia education: a randomized study evaluating case-based versus non-case-based module design,” Regional Anesthesia and Pain Medicine, vol. 36, no. 4, pp. 336–342, 2011. View at Publisher · View at Google Scholar · View at Scopus
  15. S. Lambden and B. Martin, “The use of computers for perioperative simulation in anesthesia, critical care, and pain medicine,” Anesthesiology Clinics, vol. 29, no. 3, pp. 521–531, 2011. View at Publisher · View at Google Scholar · View at Scopus
  16. H. A. Schwid, G. A. Rooke, P. Michalowski, and B. K. Ross, “Screen-based anesthesia simulation with debriefing improves performance in a mannequin-based anesthesia simulator,” Teaching and Learning in Medicine, vol. 13, no. 2, pp. 92–96, 2001. View at Publisher · View at Google Scholar · View at Scopus
  17. R. G. Soto, D. S. Cormican, C. J. Gallagher, and P. A. Seidman, “Teaching systems-based competency in anesthesiology residency: development of an education and assessment tool,” Journal of Graduate Medical Education, vol. 2, no. 2, pp. 250–259, 2010. View at Publisher · View at Google Scholar
  18. D. A. Cook, A. J. Levinson, S. Garside, D. M. Dupras, P. J. Erwin, and V. M. Montori, “Internet-based learning in the health professions: a meta-analysis,” The Journal of the American Medical Association, vol. 300, no. 10, pp. 1181–1196, 2008. View at Publisher · View at Google Scholar · View at Scopus
  19. D. A. Cook, S. Garside, A. J. Levinson, D. M. Dupras, and V. M. Montori, “What do we mean by web-based learning? A systematic review of the variability of interventions,” Medical Education, vol. 44, no. 8, pp. 765–774, 2010. View at Publisher · View at Google Scholar · View at Scopus
  20. A. K. Wong, “Educational technologies in anesthesia training,” Anaesthesia International, vol. 7, no. 1, pp. 10–13, 2013. View at Google Scholar
  21. J. G. Ruiz, M. J. Mintzer, and R. M. Leipzig, “The impact of e-learning in medical education,” Academic Medicine, vol. 81, no. 3, pp. 207–212, 2006. View at Publisher · View at Google Scholar · View at Scopus
  22. L. F. Chu, M. J. Erlendson, J. S. Sun, A. M. P. Clemenson, P. Martin, and R. L. Eng, “Information technology and its role in anaesthesia training and continuing medical education,” Best Practice and Research: Clinical Anaesthesiology, vol. 26, no. 1, pp. 33–53, 2012. View at Publisher · View at Google Scholar · View at Scopus
  23. D. A. Cook and D. M. Dupras, “A practical guide to developing effective web-based learning,” Journal of General Internal Medicine, vol. 19, no. 6, pp. 698–707, 2004. View at Publisher · View at Google Scholar · View at Scopus
  24. L. Thabane, J. Ma, R. Chu et al., “A tutorial on pilot studies: the what, why and how,” BMC Medical Research Methodology, vol. 10, article 1, 10 pages, 2010. View at Publisher · View at Google Scholar · View at Scopus
  25. US Department of Health and Human Services, “Research-Based Web Design and Usability Guidelines,” 2006,
  26. D. A. Cook, W. G. Thompson, K. G. Thomas, M. R. Thomas, and V. S. Pankratz, “Impact of self-assessment questions and learning styles in web-based learning: a randomized, controlled, crossover trial,” Academic Medicine, vol. 81, no. 3, pp. 231–238, 2006. View at Publisher · View at Google Scholar · View at Scopus
  27. M. Levine and P. Murphy, National Curriculum for Canadian Anesthesia Residency, 1st edition, 2013,
  28. P. G. Barash, Clinical Anesthesia, Lippincott Williams & Wilkins, Philadelphia, Pa, USA, 6th edition, 2009.
  29. O. Hung and M. F. Murphy, Management of the Difficult and Failed Airway, McGraw-Hill, New York, NY, USA, 2nd edition, 2012.
  30. R. D. Miller, Miller's Anesthesia, Churchill Livingstone Elsevier, Philadelphia, Pa, USA, 7th edition, 2010.
  31. J. E. McCall and T. J. Cahill, “Respiratory care of the burn patient,” Journal of Burn Care & Rehabilitation, vol. 26, no. 3, pp. 200–206, 2005. View at Publisher · View at Google Scholar · View at Scopus
  32. S. Rehberg, M. O. Maybauer, P. Enkhbaatar, D. M. Maybauer, Y. Yamamoto, and D. L. Traber, “Pathophysiology, management and treatment of smoke inhalation injury,” Expert Review of Respiratory Medicine, vol. 3, no. 3, pp. 283–297, 2009. View at Publisher · View at Google Scholar · View at Scopus
  33. E. T. Crosby, “Airway management in adults after cervical spine trauma,” Anesthesiology, vol. 104, no. 6, pp. 1293–1318, 2006. View at Publisher · View at Google Scholar · View at Scopus
  34. S. J. Mercer, S. E. Lewis, S. J. Wilson, P. Groom, and P. F. Mahoney, “Creating airway management guidelines for casualties with penetrating airway injuries,” Journal of the Royal Army Medical Corps, vol. 156, no. 4, supplement 1, pp. 355–360, 2010. View at Publisher · View at Google Scholar · View at Scopus
  35. R. Mohan, R. Iyer, and S. Thaller, “Airway management in patients with facial trauma,” Journal of Craniofacial Surgery, vol. 20, no. 1, pp. 21–23, 2009. View at Publisher · View at Google Scholar · View at Scopus
  36. G. R. Norman, C. P. M. van der Vleuten, and D. I. Newble, International Handbook of Research in Medical Education, Kluwer Academic Publishers, Dordrecht, The Netherlands, 2002.
  37. R. S. Edson, T. J. Beckman, C. P. West et al., “A multi-institutional survey of internal medicine residents' learning habits,” Medical Teacher, vol. 32, no. 9, pp. 773–775, 2010. View at Publisher · View at Google Scholar · View at Scopus
  38. S. F. Monaghan, A. M. Blakely, P. J. Richardson, T. J. Miner, W. G. Cioffi, and D. T. Harrington, “The reflective statement: a new tool to assess resident learning,” Journal of Surgical Research, vol. 178, no. 2, pp. 618–622, 2012. View at Publisher · View at Google Scholar · View at Scopus
  39. L. A. Braeckman, A. M. Fieuw, and H. J. Van Bogaert, “A web- and case-based learning program for postgraduate students in occupational medicine,” International Journal of Occupational and Environmental Health, vol. 14, no. 1, pp. 51–56, 2008. View at Publisher · View at Google Scholar · View at Scopus
  40. C. M. Ferguson and A. L. Warshaw, “Failure of a Web-based educational tool to improve residents' scores on the American Board of Surgery in-training examination,” Archives of Surgery, vol. 141, no. 4, pp. 414–416, 2006. View at Publisher · View at Google Scholar · View at Scopus
  41. L. Dyrbye, A. Cumyn, H. Day, and M. Heflin, “A qualitative study of physicians' experiences with online learning in a masters degree program: benefits, challenges, and proposed solutions,” Medical Teacher, vol. 31, no. 2, pp. e40–e46, 2009. View at Publisher · View at Google Scholar · View at Scopus
  42. H. L. Smith and D. K. Menon, “Teaching difficult airway management: is virtual reality real enough?” Intensive Care Medicine, vol. 31, no. 4, pp. 504–505, 2005. View at Publisher · View at Google Scholar · View at Scopus
  43. G. Wong, T. Greenhalgh, and R. Pawson, “Internet-based medical education: a realist review of what works, for whom and in what circumstances,” BMC Medical Education, vol. 10, pp. 1–10, 2010. View at Publisher · View at Google Scholar · View at Scopus
  44. K. A. Ericsson, “Deliberate practice and the acquisition and maintenance of expert performance in medicine and related domains,” Academic Medicine, vol. 79, no. 10, supplement, pp. S70–S81, 2004. View at Google Scholar · View at Scopus
  45. W. C. McGaghie, S. B. Issenberg, E. R. Petrusa, and R. J. Scalese, “A critical review of simulation-based medical education research: 2003–2009,” Medical Education, vol. 44, no. 1, pp. 50–63, 2010. View at Publisher · View at Google Scholar · View at Scopus