Journal of Biomedical Education

Volume 2014, Article ID 286081, 5 pages

http://dx.doi.org/10.1155/2014/286081
Research Article

Faculty Development Effectiveness: Insights from a Program Evaluation

University of Toronto, Hospital for Sick Children, 555 University Avenue, Toronto, ON, Canada M5G 1X8

Received 4 April 2014; Accepted 5 June 2014; Published 23 June 2014

Academic Editor: Saeed Farooq

Copyright © 2014 Anupma Wadhwa et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

Background. Faculty development programs are often time- and resource-intensive. To accommodate time-constrained clinicians, a limited-time-commitment faculty development program was developed and was shown to be effective in improving participants’ scholarly productivity. Objectives. The objective of this study was to assess participants’ perceptions of why the faculty development program was effective in promoting scholarship in education. Methods. In-depth semistructured interviews of course participants were conducted a year after they completed the faculty development program. The interviews were audiotaped and transcribed verbatim. The transcriptions were coded independently by the investigators for dominant themes. The investigators held coding meetings to refine the themes further, and discrepancies were resolved by referring to the transcripts and reaching consensus. Results. The participants’ satisfaction with the course as described in the interviews correlated with the early satisfaction surveys. Reasons offered for this impact fell into four broad categories: course content, course format, social networking during the course, and the course facilitator’s coaching strategies for achieving goals. Conclusions. Focusing course content on process, incorporating experiential learning, and situating the course facilitator in the role of a functional mentor or coach who helps participants complete projects can be effective in facilitating behaviour change after faculty development programs.

1. Background

Faculty development is seen as an imperative for every medical school [1]; however, there remains a paucity of evidence regarding its effectiveness [2]. Furthermore, there has been a call to evaluate faculty development effectiveness beyond outcomes, that is, to explore why an intervention is effective in order to better consider its applicability and transferability to other contexts [3]. Some “key features” to effective faculty development programs designed to improve teaching effectiveness have been identified [2]. There has been a call, however, for more qualitative studies to explore the complexities of these interventions and for evaluation to include faculty development interventions targeted at other faculty roles such as education scholarship [4].

The production and dissemination of education scholarship are critical to the education mission of Academic Health Science Centers (AHSCs). Such work ensures new knowledge is generated in the field of medical education, promotes the delivery of high-quality medical training through evidence-based practices, and facilitates the academic promotion and career satisfaction of their clinician educator faculty [5]. To sustain high standards in medical education, AHSCs must support faculty development that fosters the education scholarship productivity of their faculty [6].

While many clinician educators and other faculty members have voiced an interest in engaging in education scholarship, productivity has generally been suboptimal [7–9]. To address this, several institutions have developed faculty development programs aimed at fostering education scholarship skills in their clinician educators. A limited number of studies examining the outcomes of such programs have revealed promising results in terms of participant satisfaction and scholarly productivity [2, 10–12]. An understanding of why these programs were effective, however, is lacking.

The project planning and management course offered at the University of Toronto is a faculty development course designed to promote education scholarship and productivity in clinician educators. This course, which has been previously described, requires participants to attend one four-hour session per month for 11 months and to complete an education scholarship project. Each monthly session includes a two-hour structured curriculum session, followed by two hours of “lab” time. During the lab time, each participant is given time to work on his or her individual project. Evaluation of this course immediately upon its completion revealed high participant satisfaction and evidence of education scholarship productivity [13].

We sought to gain an understanding of what elements or key features contributed to the effectiveness of this education scholarship faculty development course by exploring participants’ perceptions.

2. Methods

Semistructured interviews of course participants were conducted a year after course completion. The study was approved by the institutional research ethics board and informed consent was obtained from all participants. The interviews were conducted by one of the study investigators (LD) who was not previously affiliated with the course and who was not previously known to the participants. Open-ended questions explored participants’ satisfaction with the course as well as the course’s perceived impact on their education scholarship skills and productivity (e.g., tell me what you thought about the course. Has taking the course affected your practice? How?). Finally, questions probed participants’ perceptions of what made the course effective (e.g., what do you think was good about the course? Why do you think it “worked” for you?).

The interviews were audiotaped and transcribed verbatim. The transcriptions were coded independently by two investigators (AW, a clinician educator and an outsider to the course, and LD). A conventional content analysis approach was used. Conventional content analysis is used to describe a phenomenon by interpreting meaning from the content of text data. Text data is interpreted through the systematic process of coding and identifying themes or patterns. All transcripts are read repeatedly to achieve immersion and obtain a sense of the whole. Data are then read to capture key thoughts and concepts and to derive codes. Codes are then sorted into categories based on how different codes are related and linked [14]. The investigators held coding meetings to further refine the themes and discrepancies were handled by referring to the transcripts and reaching consensus.

3. Results

Nine of the twelve course participants consented to be interviewed. All study participants felt strongly that the course had a positive impact on their practice as educators, as this participant described:

“It sort of got me on my path…it sort of launched me.” (P8)

After citing examples of her education scholarship work that she was currently engaged in, this participant commented:

“It’s amazing how much I am applying from what I got out of that course.” (P3)

Another participant explained how she proceeded with her project and educational scholarship:

“The project that I wrote out for the course, I submitted it for PSI funding…and I got scores of 8 and 9 out of 10 but did not get any funding. So subsequently, I broke up the project into the three parts, it had three phases…so the phase one, which was needs assessment, I conducted with some funds that I had and I have subsequently presented that as a poster at an international conference and I have just recently sent the manuscript for publication to a Canadian Journal. And then now that that phase is done, I am now focusing on phase 2, and I’m actually in the process of submitting that proposal for funding in order to implement that.” (P7)

The participants’ satisfaction with the course and their self-reported increased productivity as described in the interviews was congruent with the results of the satisfaction surveys and outcomes evaluation that had been completed immediately upon course completion [13]. During the interviews, perceptions of why the course was effective were explored further. Reasons offered for this effectiveness fell into four broad categories: course content, course format, development of a social network, and the course facilitator. These categories are described in more detail below.

3.1. Course Content

The participants repeatedly described that what they remembered from the course was not straight facts but rather the “how”:

“It’s not a matter of data or information…it’s on how to run a project, how to set it up.” (P5)

It was felt that the content contained practical information, and information about processes, that was relevant to their practices.

3.2. Course Format

As described above, the course was delivered in a longitudinal format, specifically one half-day per month for one year. In addition, the course was project-based. Each participant was required to work on a specific project throughout the year and was given 2 hours of lab time each month for this purpose. This format was felt to contribute to the success of the course. In addition to allowing for experiential learning, many commented on how this helped them have “enforced protected time.”

One participant explained, “Most of us have trouble setting aside time for (education scholarship) unless it is evenings and weekends. Lovely to have it in the middle of the day!” (P1)

Another participant specified that funding is not the real issue but time is. “Just the lack of time, funding actually is not an issue, it is not very expensive, it is lack of time, and priorities.” (P9)

This participant commented on the two hours of lab time: “Recognizing that everyone is busy - if you slotted off that time, you’re actually going to use that time”. (P6)

3.3. Development of a Social Network

The third broad category that was felt to contribute to the impact of the course related to the social network that developed from the peers in the class.

This participant described: “Some of what was helpful with the course was the unofficial linking that you made with other people…everyone’s kind of in the same boat.” (P8)

Participants also felt that the social network served as a source of energy.

“It’s a field where people feed off one another, which keeps the passion going.” (P2)

3.4. Course Facilitator

The fourth category to which all participants attributed the course success and impact was the course facilitator. Given the dominance of this category in the participant responses, perceptions of how the course facilitator contributed to their productivity in education scholarship were further explored.

Participants described the central role the course facilitator played as a mentor and coach. Examples of “functional mentorship” were presented. That is, the course facilitator was described as providing close guidance with the project work, providing relevant resources to participants, and linking participants to other experts as needed. The practice of “coaching” or constantly pushing the students was also described as a strong point.

“Constantly pushing…she sets high expectations…and without the follow-up that she does…I would have found excuses not to do it.” (P1)

“(Course facilitator) was so persistent, (she would say) ‘you’ve been so far, and you can keep going’. And that was very helpful. She kept a stick on our heads.” (P4)

Several participants described that the course facilitator’s encouragement increased their confidence:

“I thought I’d never be able to do (education scholarship). But going through the course and listening to (course facilitator’s) encouragement every month, I realized, you know what, I can…It kind of took (my practice) to the next level.” (P1)

3.5. Barriers

When asked about barriers to success of the course, all participants remarked on time.

“It was in addition to everything else I was doing with work. Time. I’m sure that’s a common answer.” (P1)

The issues of time included decreased amount of time, fragmentation of time (interruptions), and other responsibilities taking priority.

4. Discussion

We have evaluated a faculty development program perceived to be effective in changing participants’ behaviour. Several elements have emerged suggesting why it is effective: content (the “how”, not just the “what”); format (project-based, enforced protected time); social network (group work promoting group synergy); and the role of the course facilitator (functional mentoring, coaching). While the first three (content, format, and social network) have been addressed, with similar results found in other studies [2, 15–17], our study has revealed a less studied area: the course facilitator.

There has been a call to reexamine the place of the teacher in learner-centred education. With the focus on educational strategies, and structured curriculum, the key role of the teacher has been missed [18, 19]. In our study, the role of the course facilitator/teacher as a “functional mentor or coach” emerged as a strong contributor to changing the behaviour of the participants.

In this faculty development program, the course facilitator served as the functional mentor to all members of the course as they pursued their projects. If skills were needed to complete projects that were outside of the facilitator’s expertise, the facilitator’s role was to connect the mentee with the right adviser. The “functional mentor” is a concept described as a mentorship relationship which is focused on a project, and the mentor’s role is to “fill in” the skills or other needs that the learner is missing in order to complete the project. This concept has been previously described and has been offered as an effective faculty development technique outside of the course setting [20]. The practical realities of recruiting potential functional mentors for each participant and completing a mentored research project can pose challenges. In fact, at times programs are forced to abandon this important aspect of faculty development [12].

“…, in the initial discussions…a mentored research project was intended as a capstone experience. …At the time of the program’s development, the steering committee perceived that the task of recruiting potential mentors for each participant would be overwhelming. …Rather than delay the implementation…the steering committee decided to postpone requiring mentored research for…certification” [12].

Placing the course facilitator into the role of a functional mentor in project-based faculty development courses may be an efficient use of resources (one mentor serving several participants at the same time). The risk of overwhelming one facilitator with multiple projects, however, must also be considered. As such, a candidate’s knowledge, experience, and network are important factors when selecting a course facilitator.

This course appears to have contributed to the learners changing their previous pattern of behaviour of not accomplishing or completing education scholarship; many participants stated that they have participated in and/or completed more scholarship after they had taken the course—they “just have to do it” and “develop the discipline and get it done.” The participants described the course facilitator as using several techniques that fall under the category of “coaching.”

Peer mentoring and peer coaching have been shown to be effective in faculty development [21, 22]. Although many of the processes are similar, coaching and mentoring differ in several respects, including the nature of the relationship. A survey examining effective mentoring summarizes the distinction as follows: “A mentor is like a sounding board, they can give advice but the partner is free to pick and choose what they do. The context does not have specific performance objectives. A coach is trying to direct a person to some end result, the person may choose how to get there, but the coach is strategically assessing and monitoring the progress and giving advice for effectiveness and efficiency” [23].

Since a purpose of a faculty development program is to influence learners’ behaviour and change practice, the use of coaching techniques may contribute to this goal. Medical educators can examine the coaching literature to understand which techniques work and in which contexts. We can apply these principles to coaching clinician educators to change behaviour, that is, to find the time to start and complete scholarly work. Understanding the skills required in coaching can help further develop the skills of medical teachers.

This study is hypothesis generating, as expected from the research design, and does not claim generalizability to all settings. However, qualitative analysis of why and how a course can motivate faculty to improve academic productivity deepens our understanding of the complexities of the knowledge-to-action phenomenon and generates hypotheses that can inform improvements to continuing education and faculty development.

5. Conclusion

We studied a faculty development program that appeared to have a positive impact on learners’ behaviour. Effective aspects of the program included its content (the how, not just the what), format (project-based, enforced protected time), and social network (group work allowing for group synergy). An additional key feature that emerged was the role and influence of the course facilitator in effecting change in learners’ behaviour. Situating the course facilitator in the role of a functional mentor can be effective and brings to light the need for course facilitators to have experience, knowledge, and a network of resources from which to draw. In addition, a course facilitator can adopt “coaching techniques” to influence learners’ behaviours to achieve the desired outcome of the program.

Conflict of Interests

The authors declare no conflict of interests.

Acknowledgment

This work was supported by the David Fear Fellowship Grant for Continuing Education, University of Toronto, Canada.

References

  1. M. McLean, F. Cilliers, and J. van Wyk, “Faculty development: yesterday, today and tomorrow,” Medical Teacher, vol. 30, no. 6, pp. 555–584, 2008.
  2. Y. Steinert and P. J. McLeod, “From novice to informed educator: the teaching scholars program for educators in the health sciences,” Academic Medicine, vol. 81, no. 11, pp. 969–974, 2006.
  3. K. Parker, G. Burrows, H. Nash, and N. D. Rosenblum, “Going beyond Kirkpatrick in evaluating a clinician scientist program: it's not ‘if it works’ but ‘how it works’,” Academic Medicine, vol. 86, no. 11, pp. 1389–1396, 2011.
  4. Y. Steinert, K. Mann, A. Centeno et al., “A systematic review of faculty development initiatives designed to improve teaching effectiveness in medical education: BEME Guide No. 8,” Medical Teacher, vol. 28, no. 6, pp. 497–526, 2006.
  5. W. Levinson and A. Rubenstein, “Integrating clinician-educators into academic medical centers: challenges and potential solutions,” Academic Medicine, vol. 75, no. 9, pp. 906–912, 2000.
  6. W. Levinson and A. Rubenstein, “Mission critical—integrating clinician-educators into academic medical centers,” The New England Journal of Medicine, vol. 341, no. 11, pp. 840–843, 1999.
  7. M. E. Christiaanse, E. L. Russell, S. J. Crandall, A. Lambros, J. C. Manuel, and J. K. Kirk, “Development of an asset map of medical education research activity,” Journal of Continuing Education in the Health Professions, vol. 28, no. 3, pp. 186–193, 2008.
  8. S. Huwendiek, S. Mennin, P. Dern et al., “Expertise, needs and challenges of medical educators: results of an international web survey,” Medical Teacher, vol. 32, no. 11, pp. 912–918, 2010.
  9. M. A. Goldszmidt, E. M. Zibrowski, and W. W. Weston, “Education scholarship: it's not just a question of “degree”,” Medical Teacher, vol. 30, no. 1, pp. 34–39, 2008.
  10. D. C. Fidler, R. Khakoo, and L. A. Miller, “Teaching scholars programs: faculty development for educators in the health professions,” Academic Psychiatry, vol. 31, no. 6, pp. 472–478, 2007.
  11. M. A. Martimianakis, N. McNaughton, G. R. Tait et al., “The research innovation and scholarship in education program: an innovative way to nurture education,” Academic Psychiatry, vol. 33, no. 5, pp. 364–369, 2009.
  12. L. D. Gruppen, E. Yoder, A. Frye, L. C. Perkowski, and B. Mavis, “Supporting medical education research quality: the association of American medical colleges' medical education research certificate program,” Academic Medicine, vol. 86, no. 1, pp. 122–126, 2011.
  13. S. Ratnapalan, “Knowledge to action: scholarship for faculty and staff,” Journal of Continuing Education in the Health Professions, vol. 29, no. 1, pp. 32–38, 2009.
  14. H. F. Hsieh and S. E. Shannon, “Three approaches to qualitative content analysis,” Qualitative Health Research, vol. 15, no. 9, pp. 1277–1288, 2005.
  15. A. M. Alteen, P. Didham, and C. Stratton, “Reflecting, refueling, and reframing: a 10-year retrospective model for faculty development and its implications for nursing scholarship,” Journal of Continuing Education in Nursing, vol. 40, no. 6, pp. 267–272, 2009.
  16. J. Conklin, A. Kothari, P. Stolee, L. Chambers, D. Forbes, and K. Le Clair, “Knowledge-to-action processes in SHRTN collaborative communities of practice: a study protocol,” Implementation Science, vol. 6, no. 1, article 12, 2011.
  17. Y.-S. Ju, “Evaluation of a team-based learning tutor training workshop on research and publication ethics by faculty and staff participants,” Journal of Educational Evaluation for Health Professions, vol. 6, article 5, 2009.
  18. R. M. Harden and J. Crosby, “AMEE guide no 20: the good teacher is more than a lecturer—the twelve roles of the teacher,” Medical Teacher, vol. 22, no. 4, pp. 334–347, 2000.
  19. C. J. Hatem, N. S. Searle, R. Gunderman et al., “The educational attributes and responsibilities of effective medical educators,” Academic Medicine, vol. 86, no. 4, pp. 474–480, 2011.
  20. L. E. Thorndyke, M. E. Gusic, and R. J. Milner, “Functional mentoring: a practical approach with multilevel outcomes,” Journal of Continuing Education in the Health Professions, vol. 28, no. 3, pp. 157–164, 2008.
  21. J. A. Lord, E. Mourtzanos, K. McLaren, S. B. Murray, R. J. Kimmel, and D. S. Cowley, “A peer mentoring group for junior clinician educators: four years' experience,” Academic Medicine, vol. 87, no. 3, pp. 378–383, 2012.
  22. L. E. Sekerka and J. Chao, “Peer coaching as a technique to foster professional development in clinical ambulatory settings,” The Journal of Continuing Education in the Health Professions, vol. 23, no. 1, pp. 30–37, 2003.
  23. M. Matt and M. M. Starcevich, Coach, Mentor: Is There a Difference? NC, CEO Center for Coaching & Mentoring, 2014, http://www.coachingandmentoring.com/Articles/mentoring.html.