Research Article | Open Access
Artemio M. Jongco, Sheila Bina, Robert J. Sporter, Marie A. Cavuoto Petrizzo, Blanka Kaplan, Myriam Kline, Susan J. Schuval, "A Simple Allergist-Led Intervention Improves Resident Training in Anaphylaxis", Journal of Allergy, vol. 2016, Article ID 9040319, 7 pages, 2016. https://doi.org/10.1155/2016/9040319
A Simple Allergist-Led Intervention Improves Resident Training in Anaphylaxis
Physicians underrecognize and undertreat anaphylaxis. Effective interventions are needed to improve physician knowledge and competency regarding evidence-based anaphylaxis diagnosis and management (ADAM). We designed and evaluated an educational program to improve ADAM among pediatrics, internal medicine, and emergency medicine residents at two academic medical centers. Anonymous questionnaires queried participants’ demographics, prior ADAM clinical experience, competency, and comfort. A pretest assessing baseline knowledge preceded a 45-minute allergist-led evidence-based presentation, including practice with epinephrine autoinjectors, immediately followed by a posttest. A follow-up test assessed long-term knowledge retention twelve weeks later. A total of 159 residents completed the pretest, 152 the posttest, and 86 the follow-up test. There were no significant differences by specialty or site. With a possible score of 10, the mean pretest score (7.31 ± 1.50) was significantly lower than both the posttest score (8.79 ± 1.29) and the follow-up score (8.17 ± 1.72). Although participants’ perceived confidence in diagnosing and managing anaphylaxis improved significantly from baseline to follow-up, participants’ self-reported clinical experience with ADAM or autoinjector use was unchanged. Allergist-led face-to-face educational intervention improves residents’ short-term knowledge and perceived confidence in ADAM. Limited clinical experience and lack of reinforcement may contribute to the observed decline in knowledge over time.
Teaching physicians effectively about low-probability, high-consequence medical conditions, such as anaphylaxis, is challenging. Medical education curricula emphasize more common high-stakes conditions (e.g., stroke) in which misdiagnosis or mismanagement leads to poor outcomes. Physicians may lack opportunities to gain firsthand clinical experience with infrequent conditions or to reinforce their limited learning about them. Effective interventions are needed.
Clinicians face several challenges when dealing with anaphylaxis, a potentially life-threatening allergic reaction requiring immediate identification and treatment. First, there are no universally accepted diagnostic criteria for anaphylaxis [1, 2]. A comprehensive clinical definition of anaphylaxis from an NIH expert panel has not achieved widespread acceptance among physicians despite high reported sensitivity and negative predictive value [1, 3, 4]. Second, there are no pathognomonic anaphylaxis signs or symptoms. Physicians may overlook the diagnosis because the clinical presentation may vary among patients, even in the same patient with a history of multiple episodes [1, 5]. Third, physicians may not consider anaphylaxis when patients do not present stereotypically (e.g., laryngeal edema after bee sting). Additionally, diagnostic tests are rarely useful during an acute episode. Furthermore, the usual probability-based methods of clinical reasoning and decision-making are difficult to apply to anaphylaxis [6, 7]. Estimates of anaphylaxis prevalence and incidence are unclear due to the lack of symptom recognition, poor physician awareness of diagnostic criteria [1, 2], and paucity of robust, validated methods for identifying anaphylaxis diagnoses using currently available administrative claims [8, 9]. Thus, it is unsurprising that medical personnel underrecognize and undertreat anaphylaxis [1, 2].
Anaphylaxis diagnosis and management (ADAM) by physicians need improvement, regardless of the stage of training [10–22]. Despite the establishment and dissemination of treatment guidelines, medical providers consistently underutilize or incorrectly administer and dose epinephrine, which is accepted as first-line treatment [10–27]. Instead of prompt epinephrine administration, practitioners continue to utilize second-line agents, such as antihistamines and glucocorticoids, contrary to evidence-based recommendations [10–29].
Improving provider education regarding ADAM is an unmet but modifiable deficiency [30–33]. Allergists are particularly well suited to address this gap, as evidenced by the few published studies reporting successful allergist-led interventions [34–36]. We developed, implemented, and evaluated an educational program consisting of a face-to-face didactic session and hands-on training in the proper use of epinephrine autoinjectors, conducted by allergy trainees or attending physicians. We hypothesized that this allergist-led intervention would improve residents’ knowledge, competence, and perceived confidence in anaphylaxis diagnosis and management. Allergists, and likely nonallergist providers as well, can adapt our simple, non-resource-intensive intervention in a variety of settings to educate other providers about evidence-based ADAM.
2.1. Study Description and Eligibility
This longitudinal study examined full-time resident physicians at all training levels who were enrolled in an Accreditation Council for Graduate Medical Education (ACGME) accredited training program in internal medicine (n = 204), pediatrics (n = 153), or emergency medicine (n = 40), from July 2010 to June 2013, at tertiary care university hospitals in two health systems. Residents were recruited during an hour-long educational conference at which they received an explanatory letter about the study and were asked to participate. To maximize participation and account for differences in academic schedules, residents were recruited during two distinct department-specific conferences in different academic blocks. The institutional review boards at both institutions approved this study and waived the need to obtain written informed consent from participants.
2.2. Intervention, Quizzes, and Questionnaire
At the recruitment session, residents completed an anonymous questionnaire querying participants’ demographics, prior clinical experience, and perceived competency and comfort with ADAM, as well as a 10-item multiple choice pretest assessing baseline knowledge of anaphylaxis. Attending physicians (Artemio M. Jongco and Susan J. Schuval) and/or trainees (Sheila Bina and Robert J. Sporter) from the Division of Allergy and Immunology at the respective institutions presented a 45-minute evidence-based didactic lecture using PowerPoint (Microsoft, Redmond, WA), followed by hands-on practice with needleless epinephrine autoinjector trainers (EpiPen® trainer). Study personnel observed participants’ technique, provided constructive feedback when appropriate, and answered participants’ questions related to the educational content or autoinjector use. Immediately after the presentation, the residents completed a similar 10-item posttest to evaluate knowledge acquisition. Approximately 12 weeks later, during another nonanaphylaxis conference, residents completed a similar 10-item follow-up quiz and questionnaire. To foster a safe and nonpunitive learning environment, only residents and study personnel were present at conference sessions. No identifying information was collected from the participants, nor were identifiers recorded on quizzes or questionnaires. Hence, it was not possible to link an individual’s responses across time points or to connect an individual’s performance to his/her identity. Only residents who participated in both sessions were included in the analysis. Examples of the questionnaires and quizzes are provided in Supplementary Material available online at http://dx.doi.org/10.1155/2016/9040319.
The authors developed the quizzes, consisting of clinical scenarios that evaluated knowledge of evidence-based anaphylaxis diagnosis and management. To ensure that quiz questions were roughly equivalent in complexity, the scientific content was identical from one quiz to another, with minor modifications (e.g., clinical parameters, order of questions, and answer choices). Quizzes were graded numerically on a scale from 0 to 10. Before the study, the authors reviewed the scientific content of the educational intervention, quizzes, and answers. The authors pilot-tested the quiz questions on a small group of rotating residents in the Division of Allergy & Immunology at Hofstra Northwell School of Medicine. Participants did not have access to quiz answers at any point during the study. Achieving 8 correct answers on the quiz was considered to be the minimum level of competence for medical knowledge.
2.3. Statistical Analysis
All statistical analyses were conducted using SAS 9.3 (SAS Institute Inc., Cary, NC). Graphs were generated using Prism 6 (GraphPad Software, San Diego, CA). Descriptive results are presented as mean ± standard deviation with 95% confidence intervals and as percentages. The chi-square test was used to measure association between categorical variables, and the Wilcoxon rank sum test or the Kruskal-Wallis test was used to compare groups on continuous variables. A two-tailed p value below 0.05 was considered significant. Bonferroni adjustment was applied for multiple comparisons.
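The analysis plan above can be sketched in Python with SciPy in place of SAS. The scores, group sizes, and contingency table below are invented for illustration only and are not study data; the point is the shape of each test, not the numbers.

```python
import numpy as np
from scipy import stats

# Hypothetical quiz scores (0-10 scale) for the three time points.
rng = np.random.default_rng(0)
pretest = rng.integers(5, 10, size=159)    # 159 pretest participants
posttest = rng.integers(7, 11, size=152)   # 152 posttest participants
followup = rng.integers(6, 11, size=86)    # 86 follow-up participants

# Wilcoxon rank sum (Mann-Whitney U) test for two independent groups.
u_stat, p_pre_post = stats.mannwhitneyu(pretest, posttest, alternative="two-sided")

# Kruskal-Wallis test comparing all three time points at once.
h_stat, p_overall = stats.kruskal(pretest, posttest, followup)

# Chi-square test of association for categorical variables,
# e.g. site (rows) by participation at each time point (columns).
table = np.array([[80, 75, 45],
                  [79, 77, 41]])
chi2, p_site, dof, expected = stats.chi2_contingency(table)

# Bonferroni adjustment: with k pairwise comparisons, each p value is
# tested against alpha / k (equivalently, each p is multiplied by k).
k, alpha = 3, 0.05
significant = p_pre_post < alpha / k
```

Nonparametric tests are the appropriate choice here because, as noted in the Results, the quiz scores were not normally distributed.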
The two academic health centers employed a total of 397 residents who were eligible to participate (204 internal medicine, 153 pediatrics, and 40 emergency medicine residents). A total of 159 residents completed the pretest (response rate of 40.05%). One hundred and fifty-two of the original 159 residents (95.60%) completed the posttest, and 86 residents (54.09%) were available for the follow-up test. Table 1 provides the distribution of sample characteristics by site and specialty. Since chi-square analysis failed to reveal a significant difference by site or by specialty over time, data were combined in subsequent analyses.
Table 2 summarizes participant characteristics at baseline and follow-up. Chi-square analysis revealed that the proportion of participants who had demonstrated epinephrine autoinjector use increased significantly from baseline to follow-up. The proportions of participants who had previously diagnosed or managed anaphylaxis, or who had used an epinephrine autoinjector, did not differ significantly from baseline to follow-up. Of the participants who reported having managed anaphylaxis before the study, the most common venue was the emergency department (42.35%), followed by the intensive care unit (24.71%), the general ward (20.59%), and the allergy office (2.35%) (data not shown). The proportion of participants who self-reported being confident in their ability to diagnose or manage anaphylaxis also increased from baseline to follow-up. Table 3 summarizes participants’ self-reported behaviors and attitudes at follow-up. Despite their increased self-reported confidence in ADAM, residents appear to have had few opportunities in the interval since the intervention to apply what they had learned about anaphylaxis or to refer patients with anaphylaxis to allergists.
Figure 1(a) illustrates the nonnormal distribution of quiz scores. The mean pretest score of 7.31 ± 1.50 (95% CI 7.08–7.54) was lower than both the posttest score of 8.79 ± 1.29 (95% CI 8.58–9.00) and the follow-up score of 8.17 ± 1.72 (95% CI 7.80–8.54). Figure 1(b) shows that the distribution of pretest scores is significantly lower than that of posttest and follow-up scores, and that the distribution of follow-up scores is in turn significantly lower than that of posttest scores. These results remained significant after Bonferroni adjustment for multiple comparisons. Furthermore, our intervention appears to have helped participants achieve the medical knowledge competency threshold of 8 correct answers.
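As a consistency check, the reported confidence intervals can be reproduced from the published means, standard deviations, and sample sizes; they match a t-based interval, mean ± t(0.975, n−1)·SD/√n. This sketch is illustrative and is not part of the original analysis.

```python
import math
from scipy.stats import t

def ci95(mean, sd, n):
    """Two-sided 95% CI for a mean using the t critical value."""
    half = t.ppf(0.975, n - 1) * sd / math.sqrt(n)
    return round(mean - half, 2), round(mean + half, 2)

print(ci95(7.31, 1.50, 159))  # pretest:   (7.08, 7.54)
print(ci95(8.79, 1.29, 152))  # posttest:  (8.58, 9.0)
print(ci95(8.17, 1.72, 86))   # follow-up: (7.8, 8.54)
```

All three intervals agree with those reported in the text, to the two decimal places published.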
Table 4 lists possible covariates of quiz score. Quiz scores at the three time points did not differ significantly by specialty or by training level. Nor did quiz scores vary with participants’ self-reported experience of past anaphylaxis diagnosis or management, past use or demonstration of an epinephrine autoinjector, or past referral to an allergist. However, quiz scores did differ significantly with residents’ reported confidence in their ability to diagnose or manage anaphylaxis.
In this study, we demonstrate that an allergist-led didactic lecture with hands-on epinephrine autoinjector practice is an effective educational intervention that enhances residents’ short-term knowledge of evidence-based ADAM. These findings corroborate the literature showing that continuing opportunities to apply knowledge and to practice skills are essential for maintaining knowledge and competency [34, 35]. Indeed, the majority of participants reported having limited opportunity to apply their new knowledge or skills in the 12-week interval between intervention and follow-up. We suspect that resident performance would have continued to decline in the absence of educational reinforcement had we reevaluated beyond 12 weeks.
Several findings that failed to reach statistical significance further underscore the importance of continuing medical education and ongoing opportunities to practice clinical skills in maintaining ADAM proficiency. There were trends suggesting an increase from baseline to follow-up in the proportion of participants who had diagnosed or managed anaphylaxis or used an epinephrine autoinjector. Moreover, past anaphylaxis diagnosis and management are likely covariates of quiz score. Further research is needed to identify the optimal frequency and modality of continuing medical education to maximize retention of knowledge and competency.
Interestingly, study participants reported high levels of confidence in diagnosing and managing anaphylaxis at both baseline and follow-up, despite limited clinical experience; indeed, self-reported confidence increased from baseline to follow-up. This observed disconnect between physician self-assessment and objective measures of competence is unsurprising, since physicians have a limited ability to assess themselves accurately [37]. Physicians’ overestimation of their own competence may compromise patient safety and clinical outcomes, and it may be beneficial to help physicians at all training levels become more cognizant of this disconnect. Moreover, training programs should consider restructuring current educational endeavors to allocate more time and resources to educating trainees about low-probability, high-consequence conditions like anaphylaxis, since simple, non-resource-intensive interventions, such as the one described in this paper, can lead to measurable improvements in resident knowledge and possibly clinical competence.
We acknowledge that our medical knowledge competency threshold score of 8 is somewhat arbitrary and may not necessarily reflect clinical competence. Clinical competence, which relies upon a foundation of basic clinical skills, scientific knowledge, and moral development, includes a cognitive function (i.e., acquiring and using knowledge to solve problems); an integrative function (i.e., using biomedical and psychosocial data in clinical reasoning); a relational function (i.e., communicating effectively with patients and colleagues); and an affective/moral function (i.e., the willingness, patience, and emotional awareness to use these skills judiciously and humanely) [38]. Although evaluating medical knowledge through a quiz is at best an incomplete assessment of clinical competence, it is reasonable to hypothesize that medical knowledge correlates with clinical competence to some degree and that lower levels of medical knowledge may negatively affect the quality and efficacy of the care a provider delivers. Thus, the observation that follow-up quiz scores trended back down toward baseline raises concern about a possible concomitant decline in the quality of care delivered by the residents. Whether the residents changed their clinical practice after the intervention is unknown and outside the scope of the current study. More research is needed to determine the effect of educational interventions such as this one on real-life clinical practice.
This study has several limitations. First, only 40% of all eligible participants were included in the study, and only about half of these participated in the 12-week follow-up evaluation. This is likely due to scheduling difficulties and competing demands on resident time, although we cannot exclude the possibility of participation bias. Notably, our participation rate is similar to other studies of physicians and residents [39–41].
Second, since we did not collect identifying information, we could not ensure that recruited participants completed the entire study, track individual performance over time, or give participants personalized feedback on their quiz performance.
Third, extensively validated quiz questions (e.g., questions from previous certification examinations) were not used because we lacked access to them. The content of each quiz question was directly linked to one of our evidence-based learning goals, serving as a measure of face validity. Further, there was consensus among the board certified allergists/content experts who developed, verified, and honed the quiz questions, providing a measure of content validity. Finally, because the quizzes were administered at more than one site, in more than one clinical department, and on a modest sample size, we believe the instrument achieved a reasonable degree of generalizability.
Finally, this study only utilized traditional educational modalities of didactic lecture and hands-on practice. More research is needed to evaluate the efficacy of various educational interventions, especially with regard to long-term knowledge retention, improved performance on objective measures of clinical competence, and actual patient outcomes. Simulation-based education may hold promise in this regard [30–32, 35].
This study had several strengths. It was multicenter and included participants from multiple specialties. Also, the larger sample size of this study and the longer follow-up interval distinguish this study from other published studies of educational interventions. Furthermore, since the intervention is relatively simple and not resource-intensive, it can be adapted and implemented in a variety of educational settings.
Physicians, regardless of the stage of training, underdiagnose and undertreat anaphylaxis. Teaching providers about evidence-based ADAM is challenging. The allergist-led face-to-face educational intervention described above improves residents’ short-term knowledge, competence, and perceived confidence in ADAM. Lack of clinical experience and/or educational reinforcement may contribute to knowledge and competence decline over time. Thus, continuing medical education, coupled with ongoing opportunities to apply knowledge and practice skills, is necessary. Innovative educational interventions are needed to improve and maintain resident knowledge and clinical competence regarding evidence-based ADAM. More research is also needed to determine the impact of such interventions on patient outcomes.
The institutional review boards of Northwell Health System and Stony Brook Children’s Hospital approved this study.
This paper was presented as a poster at the 2014 annual meeting of the American Academy of Allergy, Asthma, and Immunology, San Diego, CA, February 2014.
Conflict of Interests
The authors declare that there is no conflict of interests regarding the publication of this paper.
The educational materials generated for this study are included as Supplementary Material. Appendix 1 contains the slides presented by the authors during the 45-minute evidence-based didactic lecture. Appendices 2 and 3 are sample quizzes developed by the authors: Appendix 2 is a pretest, and Appendix 3 is a posttest/follow-up test. Appendices 4 and 5 are sample questionnaires developed by the authors to query demographics and participant experience with anaphylaxis diagnosis and management: Appendix 4 is a pretest questionnaire, and Appendix 5 is a posttest/follow-up questionnaire.
- P. Lieberman, R. A. Nicklas, J. Oppenheimer et al., “The diagnosis and management of anaphylaxis practice parameter: 2010 update,” The Journal of Allergy and Clinical Immunology, vol. 126, no. 3, pp. 477.e42–480.e42, 2010.
- D. A. Sclar and P. L. Lieberman, “Anaphylaxis: underdiagnosed, underreported, and undertreated,” American Journal of Medicine, vol. 127, no. 1, pp. S1–S5, 2014.
- H. A. Sampson, A. Muñoz-Furlong, R. L. Campbell et al., “Second symposium on the definition and management of anaphylaxis: summary report—Second National Institute of Allergy and Infectious Disease/Food Allergy and Anaphylaxis Network symposium,” Journal of Allergy and Clinical Immunology, vol. 117, no. 2, pp. 391–397, 2006.
- R. L. Campbell, J. B. Hagan, V. Manivannan et al., “Evaluation of national institute of allergy and infectious diseases/food allergy and anaphylaxis network criteria for the diagnosis of anaphylaxis in emergency department patients,” Journal of Allergy and Clinical Immunology, vol. 129, no. 3, pp. 748–752, 2012.
- R. A. Wood, C. A. Camargo Jr., P. Lieberman et al., “Anaphylaxis in America: the prevalence and characteristics of anaphylaxis in the United States,” Journal of Allergy and Clinical Immunology, vol. 133, no. 2, pp. 461–467, 2014.
- J. P. Kassirer, “The principles of clinical decision making: an introduction to decision analysis,” Yale Journal of Biology and Medicine, vol. 49, no. 2, pp. 149–164, 1976.
- A. Cahan, D. Gilon, O. Manor et al., “Probabilistic reasoning and clinical decision-making: do doctors overestimate diagnostic probabilities?” QJM: An International Journal of Medicine, vol. 96, no. 10, pp. 763–769, 2003.
- G. Schneider, S. Kachroo, N. Jones et al., “A systematic review of validated methods for identifying anaphylaxis, including anaphylactic shock and angioneurotic edema, using administrative and claims data,” Pharmacoepidemiology and Drug Safety, vol. 21, no. 1, pp. 240–247, 2012.
- K. E. Walsh, S. L. Cutrona, S. Foy et al., “Validation of anaphylaxis in the Food and Drug Administration's Mini-Sentinel,” Pharmacoepidemiology and Drug Safety, vol. 22, no. 11, pp. 1205–1213, 2013.
- J. S. Gandhi, “Use of adrenaline by junior doctors,” Postgraduate Medical Journal, vol. 78, no. 926, article 763, 2002.
- L. L. Gompels, C. Bethune, S. L. Johnston, and M. M. Gompels, “Proposed use of adrenaline (epinephrine) in anaphylaxis and related conditions: a study of senior house officers starting accident and emergency posts,” Postgraduate Medical Journal, vol. 78, no. 921, pp. 416–418, 2002.
- J. Wang, S. H. Sicherer, and A. Nowak-Wegrzyn, “Primary care physicians' approach to food-induced anaphylaxis: a survey,” Journal of Allergy and Clinical Immunology, vol. 114, no. 3, pp. 689–691, 2004.
- B. R. Haymore, W. W. Carr, and W. T. Frank, “Anaphylaxis and epinephrine prescribing patterns in a military hospital: underutilization of the intramuscular route,” Allergy and Asthma Proceedings, vol. 26, no. 5, pp. 361–365, 2005.
- S. D. Krugman, D. R. Chiaramonte, and E. C. Matsui, “Diagnosis and management of food-induced anaphylaxis: a national survey of pediatricians,” Pediatrics, vol. 118, no. 3, pp. e554–e560, 2006.
- R. Jose and G. J. Clesham, “Survey of the use of epinephrine (adrenaline) for anaphylaxis by junior hospital doctors,” Postgraduate Medical Journal, vol. 83, no. 983, pp. 610–611, 2007.
- S. Thain and J. Rubython, “Treatment of anaphylaxis in adults: results of a survey of doctors at Dunedin Hospital, New Zealand,” The New Zealand Medical Journal, vol. 120, no. 1252, Article ID U2492, 2007.
- J. Droste and N. Narayan, “Hospital doctors' knowledge of adrenaline (epinephrine) administration in anaphylaxis in adults is deficient,” Resuscitation, vol. 81, no. 8, pp. 1057–1058, 2010.
- R. J. José and P. T. Fiandeiro, “Knowledge of adrenaline (epinephrine) administration in anaphylaxis in adults is still deficient. Has there been any improvement?” Resuscitation, vol. 81, no. 12, p. 1743, 2010.
- M. Kastner, L. Harada, and S. Waserman, “Gaps in anaphylaxis management at the level of physicians, patients, and the community: a systematic review of the literature,” Allergy, vol. 65, no. 4, pp. 435–444, 2010.
- G. Lowe, E. Kirkwood, and S. Harkness, “Survey of anaphylaxis management by general practitioners in Scotland,” Scottish Medical Journal, vol. 55, no. 3, pp. 11–14, 2010.
- J. Droste and N. Narayan, “Anaphylaxis: lack of hospital doctors' knowledge of adrenaline (epinephrine) administration in adults could endanger patients' safety,” European Annals of Allergy and Clinical Immunology, vol. 44, no. 3, pp. 122–127, 2012.
- S. L. Grossman, B. M. Baumann, B. M. Garcia Peña, M. Y.-R. Linares, B. Greenberg, and V. P. Hernandez-Trujillo, “Anaphylaxis knowledge and practice preferences of pediatric emergency medicine physicians: a national survey,” Journal of Pediatrics, vol. 163, no. 3, pp. 841–846, 2013.
- F. E. R. Simons and A. Sheikh, “Evidence-based management of anaphylaxis,” Allergy, vol. 62, no. 8, pp. 827–829, 2007.
- A. Sheikh, Y. A. Shehata, S. G. A. Brown, and F. E. R. Simons, “Adrenaline for the treatment of anaphylaxis: cochrane systematic review,” Allergy, vol. 64, no. 2, pp. 204–212, 2009.
- K. J. Simons and F. E. R. Simons, “Epinephrine and its use in anaphylaxis: current issues,” Current Opinion in Allergy and Clinical Immunology, vol. 10, no. 4, pp. 354–361, 2010.
- C. R. Simpson and A. Sheikh, “Adrenaline is first line treatment for the emergency treatment of anaphylaxis,” Resuscitation, vol. 81, no. 6, pp. 641–642, 2010.
- A. Worth, J. Soar, and A. Sheikh, “Management of anaphylaxis in the emergency setting,” Expert Review of Clinical Immunology, vol. 6, no. 1, pp. 89–100, 2010.
- A. Sheikh, V. Ten Broek, S. G. A. Brown, and F. E. R. Simons, “H1-antihistamines for the treatment of anaphylaxis: cochrane systematic review,” Allergy, vol. 62, no. 8, pp. 830–837, 2007.
- K. J. L. Choo, E. Simons, and A. Sheikh, “Glucocorticoids for the treatment of anaphylaxis: cochrane systematic review,” Allergy, vol. 65, no. 10, pp. 1205–1211, 2010.
- J. Jacobsen, A. L. Lindekær, H. T. Østergaard et al., “Management of anaphylactic shock evaluated using a full-scale anaesthesia simulator,” Acta Anaesthesiologica Scandinavica, vol. 45, no. 3, pp. 315–319, 2001.
- H. Owen, B. Mugford, V. Follows, and J. L. Plummer, “Comparison of three simulation-based training methods for management of medical emergencies,” Resuscitation, vol. 71, no. 2, pp. 204–211, 2006.
- A. M. Gaca, D. P. Frush, S. M. Hohenhaus et al., “Enhancing pediatric safety: using simulation to assess radiology resident preparedness for anaphylaxis from intravenous contrast media,” Radiology, vol. 245, no. 1, pp. 236–244, 2007.
- M. Arga, A. Bakirtas, F. Catal et al., “Training of trainers on epinephrine autoinjector use,” Pediatric Allergy and Immunology, vol. 22, no. 6, pp. 590–593, 2011.
- V. Hernandez-Trujillo and F. E. R. Simons, “Prospective evaluation of an anaphylaxis education mini-handout: the AAAAI anaphylaxis wallet card,” The Journal of Allergy and Clinical Immunology: In Practice, vol. 1, no. 2, pp. 181–185, 2013.
- J. L. Kennedy, S. M. Jones, N. Porter et al., “High-fidelity hybrid simulation of allergic emergencies demonstrates improved preparedness for office emergencies in pediatric allergy clinics,” The Journal of Allergy and Clinical Immunology: In Practice, vol. 1, no. 6, pp. 608–617, 2013.
- J. E. Yu, A. Kumar, C. Bruhn, S. S. Teuber, and S. H. Sicherer, “Development of a food allergy education resource for primary care physicians,” BMC Medical Education, vol. 8, article 45, 2008.
- D. A. Davis, P. E. Mazmanian, M. Fordis, R. Van Harrison, K. E. Thorpe, and L. Perrier, “Accuracy of physician self-assessment compared with observed measures of competence: a systematic review,” The Journal of the American Medical Association, vol. 296, no. 9, pp. 1094–1102, 2006.
- R. M. Epstein and E. M. Hundert, “Defining and assessing professional competence,” The Journal of the American Medical Association, vol. 287, no. 2, pp. 226–235, 2002.
- S. E. Kellerman and J. Herold, “Physician response to surveys. A review of the literature,” American Journal of Preventive Medicine, vol. 20, no. 1, pp. 61–67, 2001.
- J. B. VanGeest, T. P. Johnson, and V. L. Welch, “Methodologies for improving response rates in surveys of physicians: a systematic review,” Evaluation and the Health Professions, vol. 30, no. 4, pp. 303–321, 2007.
- I. Grava-Gubins and S. Scott, “Effects of various methodologic strategies: survey response rates among Canadian physicians and physicians-in-training,” Canadian Family Physician, vol. 54, no. 10, pp. 1424–1430, 2008.
Copyright © 2016 Artemio M. Jongco et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.