Nursing Research and Practice
Volume 2013 (2013), Article ID 909606, 15 pages
http://dx.doi.org/10.1155/2013/909606
Research Article

Dissemination and Implementation Research Funded by the US National Institutes of Health, 2005–2012

UNM College of Nursing, 1 University of New Mexico MSC 095350, Albuquerque, NM 87131-0001, USA

Received 29 November 2012; Accepted 22 January 2013

Academic Editor: Deborah Vincent

Copyright © 2013 Mindy Tinkle et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

Dissemination and implementation (D&I) research is a growing area of science focused on overcoming the science-practice gap by targeting the distribution of information and adoption of interventions to public health and clinical practice settings. This study examined D&I research projects funded under specific program announcements by the US National Institutes of Health (NIH) from 2005 to 2012. The authors described the projects’ D&I strategies, funding by NIH Institute, focus, characteristics of the principal investigators (PIs) and their organizations, and other aspects of study design and setting. Results showed 46 R01s, 6 R03s, and 24 R21s funded, totaling $79.2 million. The top funders were the National Cancer Institute and the National Institute of Mental Health, together providing 61% of funding. The majority of PIs were affiliated with Schools of Medicine or large, nonprofit research organizations and think tanks. Only 4% of the projects went to PIs with appointments at Schools of Nursing, accounting for 7% of the funding. The most commonly funded projects across all of the studies focused on cancer control and screening, substance abuse prevention and treatment, and mental health services. Typically implemented in community and organizational settings, D&I research provides an excellent opportunity for team science, including nurse scientists and interdisciplinary collaborators.

1. Introduction

The existence of a gap between science and practice is universally recognized. Clinical research findings and clinical practice guidelines that have promise to improve health move very slowly from the research setting into clinical practice, and many of these interventions never reach those who could benefit. It is estimated that it takes an average of 17 years to translate 14% of original research into benefit for patients and an average of 9 years for interventions recommended as evidence-based practices to be fully adopted [1, 2].

Dissemination and implementation (D&I) research is a growing area of science focused on overcoming this science-practice gap. Research on dissemination addresses the targeted distribution or spread of information and interventions to specific public health and clinical practice settings. Implementation science is the study of methods to promote the integration of evidence and change practice patterns and health care policy within real-world public health and clinical service settings [3]. Using the language from models depicting the continuum of translational science from bench to practice and health impact, D&I research is often depicted as “T3” and “T4” science [4, 5].

Although the research enterprise has generated a rich supply of evidence-based interventions, programs, and services, knowledge about how to best disseminate and implement these evidence-based practices has not kept pace. Evidence is needed for how these interventions can be “scaled up,” what contextual factors and conditions are pivotal to successful adoption, and how to give added attention to issues of external validity, fidelity, and sustainability. D&I nurse researchers and practitioners are important players in advancing the goals of this science to improve patient and system outcomes [6–8].

Evidence does suggest that passive approaches to dissemination, such as the publication of consensus statements in professional journals, mass mailings, and untargeted presentations to heterogeneous groups, are ineffective strategies to achieve significant uptake and practice change [9, 10]. Targeted and active dissemination strategies, such as hands-on technical assistance, replication guides, point-of-decision prompts, and training workshops with hands-on experience, are more promising [11]. Implementation strategies have been described as either “top down” or “bottom up” and include a range of approaches, such as stakeholder relationship building and communication, continuous quality management, audit and feedback, service delivery practices and training, and local consensus building [12]. These D&I strategies are often directed at multiple levels and in different combinations of levels, including patient, provider, organizational, and policy. Features of organizations (e.g., hospitals, clinics, workplaces, and schools), such as organizational leadership and climate, managerial relations, and absorptive capacity, are increasingly seen as key intervention targets to facilitate D&I efforts [13].

Although D&I research in health is a relatively young science, advances in both the rigor and ambitiousness of studies over the past decade reflect robust growth in the field [14]. Many conceptual frameworks for guiding D&I research, such as the Reach, Effectiveness, Adoption, Implementation, and Maintenance (RE-AIM) model [15], have been developed and are being tested. Common themes in these frameworks include a heavy emphasis on context, fidelity adaptation and quality of implementation, multilevel targets, and engagement of the target population in partnership research [16]. Research is also focused on developing and validating sound measures important for D&I research, such as measures for key organizational-level constructs [13]. In addition, comparative effectiveness studies of competing active D&I strategies are beginning to appear in the literature, including evaluation of cost [17]. Study designs best suited to answer questions in D&I research are also developing, including a focus on mixed-methods designs and system science approaches [18].

Federal funding for D&I research has traditionally been very small, particularly in relation to the funding available for discovery research. Although the portfolio of D&I research at the National Institutes of Health (NIH) is growing, funding for this science remains extremely small compared with the $30 billion each year that the NIH spends on basic and efficacy research [19]. In 2005, a trans-NIH committee (including the National Institute of Nursing Research [NINR]) issued the first of a set of multi-institute program announcements to stimulate research in this area. These program announcements have been continuously reissued over the past 8 years and include the Research Project Grant (R01) and Small Grant (R03 and R21) mechanisms. The purpose of this funding opportunity is to support innovative approaches to identifying, understanding, and overcoming barriers to the adaptation, adoption, and integration of evidence-based interventions and guidelines that previous research has shown to be efficacious and effective, but where uptake to date has been limited or significantly delayed [20].

A total of 76 D&I research projects have been funded by the NIH through these multi-institute program announcements from 2005 to 2012. Research findings from some of these NIH-funded projects have been presented at the NIH Annual Conference on the Science of Dissemination and Implementation and published in the literature. However, no summary of these funded projects examining the body of work in this research portfolio is available.

The purpose of this study was to examine the D&I research projects funded through these program announcements from 2005 to 2012 in terms of describing what has been funded by which NIH Institutes, the dollars invested, characteristics of the principal investigators (PIs) and their organizations, and an assessment of the focus of these projects, the D&I strategies employed, and other aspects of study design and setting. Although the studies funded through these program announcements (PARs) do not include all the possible D&I-related research projects funded by NIH over this period, this study was confined to these projects because they represent outcomes of a sustained, multi-institute initiative to stimulate development of D&I science. This paper presents a description of these funded projects and suggests opportunities for nurse scientists in D&I research.

2. Methods

To accomplish the aims of this study, the abstracts from all projects funded by the NIH under the multi-institute “Dissemination and Implementation Research in Health” program announcements (PAR-06-039, PAR-06-520, PAR-06-521, PAR-07-086, PAR-10-038, PAR-10-039, and PAR-10-040) were accessed through the NIH Research Portfolio Online Reporting Tools (RePORT) [21]. Projects funded under an earlier program announcement specific to mental health (PAR-02-131, “Dissemination and Implementation Research in Mental Health”) were excluded from this review. The review process is outlined in Figure 1.

Figure 1: Flowchart for the review process. PAR: program announcement; D&I: dissemination and implementation.

The narrative abstracts for all 76 projects were independently examined by two reviewers to extract information on the following areas: funding institute, award amount, project topic, and characteristics of the PI and awardee organization. The 46 R01 projects were further independently examined by three reviewers for conceptual frameworks, D&I strategies, level of measurement, and study design and setting. In cases of discrepancies in recorded findings, an iterative process of abstract review was used until 100% agreement among the reviewers was reached. The small grant mechanisms (i.e., R03s and R21s) were excluded from the analysis of the R01s in this examination because the smaller projects were mainly needs assessments, small pilots, and instrument development studies and were generally not intervention research.

Summary tables and figures were constructed, using type of funding mechanism (R01, R03, R21) for initial categorization of abstracts. The quantitative results (e.g., frequencies and means) and qualitative results (e.g., project topic and D&I strategy) provided a foundation for drawing conclusions about the funded D&I research and discussing the implications for nursing research.
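The tabulation step described above (counting projects and summing award dollars per funding mechanism) can be sketched in a few lines. The record fields (`mechanism`, `institute`, `award`) and the dollar figures below are illustrative stand-ins, not the actual RePORT export schema.

```python
from collections import defaultdict

# Hypothetical records distilled from funded-project abstracts; awards
# are in millions of dollars. Field names are illustrative only.
projects = [
    {"mechanism": "R01", "institute": "NCI",  "award": 2.4},
    {"mechanism": "R01", "institute": "NIMH", "award": 1.9},
    {"mechanism": "R21", "institute": "NINR", "award": 0.4},
    {"mechanism": "R03", "institute": "NCI",  "award": 0.1},
]

def tabulate(records):
    """Count projects and sum award dollars per funding mechanism."""
    counts = defaultdict(int)
    totals = defaultdict(float)
    for rec in records:
        counts[rec["mechanism"]] += 1
        totals[rec["mechanism"]] += rec["award"]
    return dict(counts), dict(totals)

counts, totals = tabulate(projects)
```

The same grouping, keyed on `institute` instead of `mechanism`, would yield the per-Institute breakdown reported in Table 1.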

3. Results

3.1. Overview of the Projects

A total of 76 project abstracts were reviewed, representing 46 R01s, 6 R03s, and 24 R21s funded by the NIH, totaling $79.2 million during the 2005 to 2012 period (Table 1). Nine NIH Institutes and Centers funded these awards, with the National Cancer Institute (NCI) and the National Institute of Mental Health (NIMH) funding 58% of the total number of projects, which accounted for 61% of the total funding. The NINR awarded 6% of the projects: three R01s, one R03, and one R21, totaling $4.9 million. The NIH Fogarty International Center joined the program announcements in 2009 and funded several grants focused on global health. Figure 2 depicts the funding for the R01 awards, again, with the majority of the larger research project grants funded by the NCI and the NIMH.

Table 1: Dissemination and implementation awards by NIH Institute and grant mechanism, 2005–2012 (out-year funding for projects recently awarded is not included).
Figure 2: R01 funding by NIH Institutes, 2005–2012.

The majority of PIs for these funded projects were affiliated with Schools of Medicine and large, nonprofit research organizations and think tanks, such as RAND and Kaiser, and these institutions received 58% of the total funding (Table 2). Schools of Public Health accounted for about 18% and 19% of the PI affiliations and the total funding, respectively. Only about 4% of the funded projects went to PIs at Schools of Nursing, making up about 7% of the total funding. The universities and research organizations funded for these projects were more heavily clustered in the West and East coastal states and the East North Central states (Figure 3). The institutions in the West and East coastal states received about two-thirds of the total funding for all of the projects.

Table 2: Projects by PI affiliation, mechanism, and total funding.
Figure 3: (a) Geographical distribution of institutions receiving project funding, (b) Legend for Figure 3(a).

The topic focus for the 76 projects funded over the past 8 years reflects a broad spectrum of areas, consistent with the missions of the sponsoring NIH Institutes and Centers. The projects included dissemination and implementation of evidence-based health behavior interventions and research-based guidelines, programs, and services for prevention, diagnosis, and clinical management in public health, clinical settings, and the public policy arena. As shown in Figure 4, the most commonly funded projects were cancer control and screening, substance abuse prevention and treatment, and mental health services.

Figure 4: NIH R01, R03, and R21 study topics. HIV/AIDS: human immunodeficiency virus/acquired immunodeficiency syndrome; HPV: human papilloma virus.

Many of the projects involved scaling up interventions found to be effective in smaller trials or adapting an evidence-based intervention for implementation in a new setting. For example, one R01 project [22] studied the dissemination and implementation of an evidence-based weight-management program for veterans called MOVE! by scaling up this program to reach a broad population of veterans across the Veterans Affairs national network of medical centers and community clinics. Another R01 project [23] examined the impact of adapting and delivering an evidence-based organizational implementation strategy called Availability, Responsiveness, and Continuity (ARC), originally developed in Tennessee, on improving mental health services for youth in community-based agencies in a Midwest community.

3.2. A Focus on the R01 Projects

A more in-depth analysis of the R01 projects was conducted to better understand the intervention projects that move the science beyond the pilot phase. This analysis included an examination of the theories and frameworks, D&I strategies, levels of measurement, and study design and setting.

3.2.1. Theories and Frameworks

Among the 46 R01 projects, use of a guiding theory or model varied widely (Table 3). The majority of the R01-funded projects made no mention of a theoretical framework or guiding model. Of the frameworks mentioned, the RE-AIM framework for evaluating interventions was most commonly utilized, closely followed by Rogers’ Diffusion of Innovations model [24]. One study combined Rogers’ Diffusion of Innovations model and the RE-AIM framework. Nine studies utilized frameworks that had specific relevance for their project, such as grants that proposed the use of behavioral interventions and had a related behavioral model supporting the intervention. For example, a project titled “Dissemination of a Theory-Based Bone Health Program in Online Communities” [25] utilized social cognitive theory in designing an online bone-health intervention targeting adults aged 50 years and older.

Table 3: Theories or models utilized in the R01 abstracts.
3.2.2. D&I Strategies

Of the 46 R01 projects, 29 had one D&I strategy and 15 had two D&I strategies, for a total of 59 D&I strategies used (Table 4). In two of the abstracts, the strategies were either unclear or were not stated. Examining the D&I strategies, a majority (78%) of studies utilized active dissemination approaches, whereas 10% used passive dissemination approaches, and the remaining 12% used an evaluative approach (i.e., evaluation of an existing program). Several studies combined active, passive, and/or evaluative strategies. The range of active approaches varied; for example, one study adapted patient navigation strategies to Chinese women in Chicago [26] in an intervention that modified tailored patient navigation to improve cancer screening rates in this low-income and underserved population. The dissemination approach proposed in this study incorporated patient navigators as providers of cancer control education and screening in active teaching roles within the Asian immigrant population and was categorized in our study as hands-on technical assistance and training, two active D&I strategies. An example of evaluation of dissemination was demonstrated in a study examining smoking cessation and knowledge integration with people who use tobacco control quitlines [27], in which participant social network analysis was conducted to provide insight about potential dissemination approaches in the future.

Table 4: Passive, active, and evaluative D&I strategies identified.
3.2.3. Levels of Measurement

These R01 projects used five levels of measurement in evaluating the intervention outcomes: policy level, patient level, provider level, organizational level, and multilevel outcome measures. Multilevel outcome measures involved different combinations of the other four levels, such as a combination of organizational and provider measures or an intervention study that measured both provider and patient outcomes. Multilevel outcome measures were the most common means of evaluation for these R01 projects (47%), whereas the most frequent individual outcome measure was at the organizational level (22%; see Figure 5).

Figure 5: Frequencies for the level of measurement for the evaluation of intervention study outcomes.

One example of a research study using a multilevel outcomes methodology [28] involved a colorectal screening intervention replicated by a community health services agency for Asian American patients. Intervention outcomes were measured using a combined multilevel evaluation at both the organizational (i.e., the agency) and individual levels by collecting data from individual providers at the intervention community agencies.

An example of program evaluation using a single level of outcome measure [29] involved a 5-year project that examined the transportation, implementation, and sustainability of a computer-assisted cognitive-behavioral treatment (CACBT) for young children’s anxiety in elementary schools. In this multisite study, each school provided at least four counselors, social workers, school psychologists, or teachers who could implement the intervention. The outcome of the project was measured in terms of whether or not the school personnel increased the use of the CACBT program and increased identification of elementary school students with distressing anxiety through use of the Behavioral Assessment System for Children, Second Edition, Teacher Rating Scale.

3.2.4. Study Design and Setting

Many of these 46 R01 projects had multiple aims and multiple stages as part of the study intervention. The project designs included randomized controlled trials, quasiexperimental designs, case studies, survey research, community-based participatory approaches, and system science designs, such as social network analysis.

Many projects utilized mixed-methods designs, using quantitative methods to measure outcomes and qualitative methods to describe processes or expand the depth of understanding. For example, one project [30] involved three separate stages, with the first stage using two in-depth case studies of model substance abuse treatment programs serving the substance abuser group in the community. The second stage used a quantitative telephone survey approach to collect data from the directors of all 480 substance abuse treatment programs serving substance abusers in their respective communities. The final stage involved a qualitative approach using in-depth case studies of 12 of these 480 substance abuse treatment programs from the stage-two telephone interviews. As a result, the researchers planned to collect, analyze, and report both quantitative and qualitative data gathered during the three stages of the project [30].

When these R01 projects were grouped by predominant design, 43% of the studies used a quasiexperimental design, and 24% used randomized controlled trials; together, these two designs accounted for two-thirds of the R01 studies (Table 5). The project settings were also diverse and included rural and urban primary care and specialty care practices, state government and international health settings, community health agencies, hospitals, online social networking, schools, large health care systems, churches, and worksites. See Table 6 for a list of the 76 abstracts used in the study.

Table 5: Predominant study designs and methods in the R01 projects.
Table 6: Abstracts evaluated in the study.

4. Discussion

This review of abstracts was conducted to describe the R01, R03, and R21 projects funded under the NIH program announcements for D&I research from 2005 to 2012. Further analysis was performed for the R01 studies, which were intended to move the science beyond the pilot phase.

Review of these abstracts demonstrated a robust set of projects that reflect a growing and evolving area of science. NCI and NIMH were the major funders for the projects, which is indicative of their long history of working to advance this field. Each of these institutes has designated organizational D&I units and program officers dedicated to broadly stimulating this science. The PIs for these projects represented an array of disciplines, and the topical focus of the projects was equally diverse, illustrating the interdisciplinary nature of the D&I research community. Only a small proportion of PIs for these projects were from Schools of Nursing. Likewise, as illustrated in Figure 3, rural and western states were underrepresented among the funded projects.

When a theory or framework was present, it referred to one that is commonly applied in D&I research in the context of health, particularly the RE-AIM [15] and Diffusion of Innovations [24] models. Other commonly used models included those that supported individually focused behavioral interventions, such as cognitive behavioral theory. Consistent with the literature that demonstrates that active D&I strategies are more effective [12], most of the projects relied on active approaches, such as training and technical assistance.

Outcomes and fidelity of the dissemination and implementation of new preventive practices, guidelines, or programs in these projects were most often measured at multiple levels, such as the individual patient, the provider, and the organization. This multimodal approach is characteristic of more recent D&I research and may reflect an evolving use of systems thinking for D&I in terms of understanding how actors and organizations influence each other within a whole [31, 32]. The predominant study designs used in these projects included quasiexperimental and randomized clinical trials. Many of the projects used mixed methods, with a third of them including both quantitative and qualitative components. Mixed methods are particularly useful in generating data from multiple levels and many stakeholders and may be particularly suited to answer the complex questions in D&I research [18, 33].

Many of these funded projects can be considered first-generation D&I research. What areas of knowledge development do these studies and other experts suggest should be promoted to move the field toward the next generation of D&I research? Developing a standard and consistent terminology for D&I research is one critical area requiring careful attention [33]. Additional theory development and testing are needed to better understand the relationships among the complex array of factors required for successful dissemination and implementation of health interventions in various settings [34]. Building a more robust set of common measures for D&I research is also a priority [13]. Glasgow and colleagues suggest that alternative study designs beyond the traditional randomized trial that emphasize the importance of external validity and that take advantage of existing social, environmental, and community data should be utilized [19]. A focus on D&I approaches with high-risk populations, including low-income, minority, and low-health literacy groups, and in low-resourced settings is also an imperative [19, 34]. Finally, new interdisciplinary collaborations with diverse partners, including key stakeholders, consumers, and clinicians, will be important to grow the science [19].

4.1. Implications for Nursing Research

D&I science moves beyond the individual as the unit of analysis to focus on groups, systems, the community, and beyond. Nursing research is conducted in all these areas, but D&I research is a way for nurses to influence health and health care on a larger scale. One example is the dissemination of evidence-based practice guidelines throughout a unit, hospital, and health care system. What are the best ways to have these guidelines widely used by nurses and other health care professionals? Which strategies are most cost-effective?

The D&I program announcements clearly present a potential funding opportunity for nurse scientists committed to translating evidence-based interventions to improve health (these program announcements were recently reissued on January 9, 2013, under PAR-13-055 (R01), PAR-13-054 (R21), and PAR-13-056 (R03)). Nursing has a rich history of work in research utilization to improve clinical care and promote practice-based inquiry. Nurse scientists are well prepared to lead and participate as members in interdisciplinary teams focused on disseminating and implementing evidence to practice. Symptom management and self-care in chronic disease and end-of-life care are just a few examples where nursing has made significant contributions to the science and should take a leadership role in translating this work to practice [35]. Many nurse researchers are skilled in both quantitative and qualitative designs often used in D&I research. A mixed-methods approach might be especially useful in testing different strategies for implementation across different populations.

There are opportunities to learn more about D&I science and receive assistance and feedback on a proposed grant application. The NIH hosts an annual conference on the science of dissemination and implementation in health, where attendees can hear state-of-the-science presentations, learn about research findings in the poster session, network with D&I scientists, and attend a technical assistance workshop led by NIH Program Officers and funded D&I researchers. Other research training opportunities, such as the annual NIH-sponsored Training Institute on Dissemination and Implementation Research in Health, are also offered periodically to the extramural community. The NIH Office of Behavioral and Social Sciences Research website includes information about these opportunities (http://obssr.od.nih.gov/scientific_areas/translation/dissemination_and_implementation/index.aspx).

Talking to the appropriate NIH Institute Program Officer identified on the PAR as the scientific contact person is also essential in planning an application submission. Considering the fit of an application with the mission of other institutes (e.g., NCI or NIMH), in addition to NINR, is also important. Grant applications for these PARs prior to 2010 were reviewed in Special Emphasis Panels, in which peer reviewers were appointed for each panel on a temporary basis. In 2010, the NIH Center for Scientific Review established a chartered study section to peer review these and other investigator-initiated applications in this science area, called the Dissemination and Implementation Research in Health Study Section (http://public.csr.nih.gov/StudySections/IntegratedReviewGroups/HDMIRG/DIRH/Pages/default.aspx). This website also provides a link to the study section’s membership roster. Whereas standing members of this study section are appointed for a specific term, other temporary reviewers are often appointed for a specific review cycle to augment the scientific expertise that may be needed, depending on the pool of applications. Another invaluable opportunity for nurse scientists who have expertise in a specific area of D&I science is to volunteer to serve as a peer reviewer for this study section by sending their curriculum vitae and letter of interest to the Scientific Review Officer assigned to this study section.

4.2. Limitations

The description and analysis in this review were based solely on the funded project abstracts published in the NIH RePORTER under the previously identified program announcements. By their nature, proposal abstracts present a limited amount of information and might not accurately represent the project once completed. Outcomes cannot be identified through review of proposal abstracts; this would require subsequent review of publications based on the funded projects. The level of analysis was limited by access to only those abstracts made publicly available through funding. Consequently, no conclusions can be made about funded projects compared with all proposals submitted in response to these PARs. No information is publicly available regarding the total number of applications submitted. Furthermore, it is not known how representative the funded projects are for any of the descriptors provided in this paper compared with submitted proposals. It is also not known whether different types of applications or specific topics were more or less likely to be funded in relation to the entire pool of applications.

As reported by others [11, 12], this abstract review was hindered by inconsistent terminology for design and strategy and even for what was meant by dissemination or implementation. Although some abstracts provided details of the proposed projects, others omitted relevant content (e.g., model used). Equally evident from this review was the lack of common measures with established validity and reliability, particularly for measuring D&I processes and outcomes. The imperative to develop these common measures as key to the successful advancement of D&I science has been widely advocated [13, 34].

Finally, it should be noted that D&I science has undergone much development since the first general program announcement was released in 2005. It is not known what impact this may have had on the number and quality of proposals submitted, especially during the latter part of the time selected for this review.

5. Conclusion

The overall goal of D&I science is to overcome the research-practice gap so that evidence-based health practices yield significant health benefits to all populations and across all health care settings [19]. The purpose of this paper was to enhance understanding of the body of work represented in the projects funded under the NIH dissemination and implementation program announcements over the past 8 years and suggest implications for nurse researchers. The projects in this portfolio demonstrated that D&I research is complex, often multiphase, and requires a collaborative, interdisciplinary approach. These projects make a highly significant contribution to the field, yet much work remains to be done, such as improving methods and measures, to move D&I science forward.

Although many nurse researchers and practitioners are engaged in D&I science, nurse scientists were underrepresented among the PIs for these projects. Nurse scientists are uniquely prepared to contribute to the advancement of D&I research in health. This NIH initiative represents an outstanding potential funding and leadership opportunity for nurse researchers committed to translational research and shortening the science-practice gap.

Conflict of Interests

The authors declare no conflict of interests.

References

  1. E. A. Balas and S. A. Boren, “Managing clinical knowledge for health care improvement,” in Yearbook of Medical Informatics 2000: Patient-Centered Systems, J. Bemmel and A. T. McCray, Eds., pp. 65–70, Schattauer, Stuttgart, Germany, 2000.
  2. L. W. Green, J. M. Ottoson, C. García, and R. A. Hiatt, “Diffusion theory and knowledge dissemination, utilization, and integration in public health,” Annual Review of Public Health, vol. 30, pp. 151–174, 2009.
  3. National Institutes of Health, Fogarty International Center. Frequently Asked Questions About Implementation Science, 2010, http://www.fic.nih.gov/News/Events/implementation-science/Pages/faqs.aspx.
  4. J. M. Westfall, J. Mold, and L. Fagnan, “Practice-based research—‘Blue Highways’ on the NIH roadmap,” JAMA, vol. 297, no. 4, pp. 403–406, 2007.
  5. M. J. Khoury, M. Gwinn, P. W. Yoon, N. Dowling, C. A. Moore, and L. Bradley, “The continuum of translation research in genomic medicine: how can we accelerate the appropriate integration of human genome discoveries into health care and disease prevention?” Genetics in Medicine, vol. 9, no. 10, pp. 665–674, 2007.
  6. N. E. Donaldson, D. N. Rutledge, and J. Ashley, “Outcomes of adoption: measuring evidence uptake by individuals and organizations,” Worldviews on Evidence-Based Nursing, vol. 1, pp. S41–S51, 2004.
  7. J. E. Squires, T. Reay, D. Moralejo, S. M. LeFort, A. M. Hutchinson, and C. A. Estabrooks, “Designing strategies to implement research-based policies and procedures: a set of recommendations for nurse leaders based on the PARiHS framework,” Journal of Nursing Administration, vol. 42, no. 5, pp. 293–297, 2012.
  8. M. G. Titler, “Translation science and context,” Research and Theory for Nursing Practice, vol. 24, no. 1, pp. 35–54, 2010.
  9. L. A. Bero, R. Grilli, J. M. Grimshaw, E. Harvey, A. D. Oxman, and M. A. Thomson, “Getting research findings into practice. Closing the gap between research and practice: an overview of systematic reviews of interventions to promote the implementation of research findings,” British Medical Journal, vol. 317, no. 7156, pp. 465–468, 1998.
  10. B. A. Rabin, R. C. Brownson, J. F. Kerner, and R. E. Glasgow, “Methodologic challenges in disseminating evidence-based interventions to promote physical activity,” American Journal of Preventive Medicine, vol. 31, no. 4, supplement, pp. 24–34, 2006.
  11. B. A. Rabin, R. E. Glasgow, J. F. Kerner, M. P. Klump, and R. C. Brownson, “Dissemination and implementation research on community based cancer prevention: a systematic review,” American Journal of Preventive Medicine, vol. 38, no. 4, pp. 443–456, 2010.
  12. E. K. Proctor and R. C. Brownson, “Measurement issues in dissemination and implementation research,” in Dissemination and Implementation Research in Health: Translating Science to Practice, R. C. Brownson, G. A. Colditz, and E. K. Proctor, Eds., pp. 261–280, Oxford University Press, New York, NY, USA, 2012.
  13. K. M. Emmons, B. Weiner, M. E. Fernandez, and S. P. Tu, “Systems antecedents for dissemination and implementation: a review and analysis of measures,” Health Education & Behavior, vol. 39, no. 1, pp. 87–105, 2012.
  14. D. Chambers, “Foreword,” in Dissemination and Implementation Research in Health: Translating Science to Practice, R. C. Brownson, G. A. Colditz, and E. K. Proctor, Eds., pp. vii–x, Oxford University Press, New York, NY, USA, 2012.
  15. R. E. Glasgow, T. M. Vogt, and S. M. Boles, “Evaluating the public health impact of health promotion interventions: the RE-AIM framework,” American Journal of Public Health, vol. 89, no. 9, pp. 1322–1327, 1999.
  16. B. Gaglio and R. E. Glasgow, “Evaluation approaches for dissemination and implementation research,” in Dissemination and Implementation Research in Health: Translating Science to Practice, R. C. Brownson, G. A. Colditz, and E. K. Proctor, Eds., pp. 327–356, Oxford University Press, New York, NY, USA, 2012.
  17. R. E. Glasgow and J. F. Steiner, “Comparative effectiveness research to accelerate translation: recommendations for an emerging field of science,” in Dissemination and Implementation Research in Health: Translating Science to Practice, R. C. Brownson, G. A. Colditz, and E. K. Proctor, Eds., pp. 72–93, Oxford University Press, New York, NY, USA, 2012.
  18. J. Landsverk, C. H. Brown, P. Chamberlain et al., “Design and analysis in dissemination and implementation research,” in Dissemination and Implementation Research in Health: Translating Science to Practice, R. C. Brownson, G. A. Colditz, and E. K. Proctor, Eds., pp. 225–260, Oxford University Press, New York, NY, USA, 2012.
  19. R. E. Glasgow, C. Vinson, D. Chambers, M. J. Khoury, R. M. Kaplan, and C. Hunter, “National Institutes of Health approaches to implementation science: current and future directions,” American Journal of Public Health, vol. 102, no. 7, pp. 1274–1281, 2012.
  20. National Institutes of Health. PAR-10-038: Dissemination and Implementation Research in Health, 2009, http://grants.nih.gov/grants/guide/pa-files/PAR-10-038.html.
  21. National Institutes of Health. Research Portfolio Online Reporting Tools (RePORT), 2012, http://report.nih.gov/.
  22. M. K. Campbell, 1R01CA124400-01: Dissemination of a Weight Management Program Among US Veterans, 2012, http://projectreporter.nih.gov/project_info_description.cfm?aid=7173197&icde=14075515&ddparam=&ddvalue=&ddsub=&cr=3&csb=default&cs=ASC.
  23. C. A. Glisson, 1R01MH084855-01A1: Testing an Organizational Implementation Strategy in Children's Mental Health, 2012, http://www.projectreporter.nih.gov/project_info_description.cfm?aid=8268517&icde=15737611&ddparam=&ddvalue=&ddsub=&cr=1&csb=default&cs=ASC.
  24. E. M. Rogers, Diffusion of Innovations, The Free Press, New York, NY, USA, 5th edition, 2003.
  25. E.-S. Nahm, 1R01NR011296-01: Dissemination of a Theory-Based Bone Health Program in Online Communities, 2012, http://projectreporter.nih.gov/project_info_description.cfm?aid=7691904&icde=14073702&ddparam=&ddvalue=&ddsub=&cr=9&csb=default&cs=ASC.
  26. M. A. Simon, 1R01CA163830-01: Adapting Patient Navigation to Promote Cancer Screening in Chicago’s Chinatown, 2012, http://projectreporter.nih.gov/project_info_description.cfm?aid=8223013&icde=14069793&ddparam=&ddvalue=&ddsub=&cr=23&csb=default&cs=AS.
  27. S. J. Leischow, 1R01CA128638-01A1: Knowledge Integration in Quitlines: Networks That Improve Cessation, 2012, http://projectreporter.nih.gov/project_info_description.cfm?aid=7431848&icde=14069557&ddparam=&ddvalue=&ddsub=&cr=2&csb=default&cs=ASC.
  28. S.-P. Tu, 1R21CA136460-01A1: Dissemination Through Community Health Centers Serving Diverse Populations, 2012, http://projectreporter.nih.gov/project_info_description.cfm?aid=7692474&icde=14072411&ddparam=&ddvalue=&ddsub=&cr=10&csb=default&cs=ASC.
  29. P. C. Kendall, 1R01MH086438-01A2: Disseminating Evidence-Based Practice to the Schools: CBT for Child Anxiety, 2012, http://projectreporter.nih.gov/project_info_description.cfm?aid=8041881&icde=14069793&ddparam=&ddvalue=&ddsub=&cr=13&csb=default&cs=ASC.
  30. D. K. Novins, 1R01DA022239-01A1: Evidence-Based Practices and Substance Abuse Treatment for Native Americans, 2008, http://projectreporter.nih.gov/project_info_description.cfm?aid=7431560&icde=14089831&ddparam=&ddvalue=&ddsub=&cr=10&csb=default&cs=ASC.
  31. B. J. Holmes, D. T. Finegood, B. L. Riley, and A. Best, “Systems thinking in dissemination and implementation research,” in Dissemination and Implementation Research in Health: Translating Science to Practice, R. C. Brownson, G. A. Colditz, and E. K. Proctor, Eds., pp. 175–191, Oxford University Press, New York, NY, USA, 2012.
  32. G. Aarons, 1R01MH092950-01A1: Interagency Collaborative Teams to Scale-Up Evidence Based Practice, 2011, http://projectreporter.nih.gov/project_info_details.cfm?aid=8138874&icde=14069686.
  33. B. Rabin and R. C. Brownson, “Developing the terminology for dissemination and implementation research,” in Dissemination and Implementation Research in Health: Translating Science to Practice, R. C. Brownson, G. A. Colditz, and E. K. Proctor, Eds., pp. 23–51, Oxford University Press, New York, NY, USA, 2012.
  34. R. C. Brownson, M. Dreisinger, G. A. Colditz, and E. K. Proctor, “The path forward in dissemination and implementation research,” in Dissemination and Implementation Research in Health: Translating Science to Practice, R. C. Brownson, G. A. Colditz, and E. K. Proctor, Eds., pp. 498–508, Oxford University Press, New York, NY, USA, 2012.
  35. N. F. Woods and D. L. Magyary, “Translational research: why nursing's interdisciplinary collaboration is essential,” Research and Theory for Nursing Practice, vol. 24, no. 1, pp. 9–24, 2010.