
Integrating Research, Practice, and Policy: What We See Depends on Where We Stand

Jon F. Kerner

In this special issue of the Journal of Public Health Management and Practice, the editors have taken on the important challenge of characterizing the current landscape of knowledge translation research and practice in public health. This includes the diffusion of scientific and program evaluation evidence into public health practice and policy, the dissemination and implementation of evidence-based interventions in public health practice, and the complex issues associated with the meaning and methods of dissemination and implementation research. Three of the most important challenges for moving the field of dissemination and implementation science and research dissemination and implementation practice forward are the confusion of terminology, the meaning of evidence, and partnerships across the research, practice, and policy divides. Because many in the research, practice, and policy-making sectors do not see their role in closing the gap among research, practice, and policy, new and expanded incentives need to be put in place to encourage these collaborations. Partnerships between research, practice, and policy can help inform decisions in all three sectors to help achieve a better balance between evidence based on science and evidence based on personal experience.

KEY WORDS: knowledge translation, research-practice-policy integration

The findings and conclusions in this article are those of the author and do not necessarily represent the views of the National Cancer Institute, the National Institutes of Health, and the US Department of Health and Human Services.

Corresponding Author: Jon F. Kerner, PhD, Division of Cancer Control & Population Sciences, National Cancer Institute, 6130 Executive Blvd, EPN 6142, Bethesda, MD 20892 (kernerj@mail.nih.gov).

Jon F. Kerner, PhD, is Deputy Director for Research Dissemination and Diffusion, Division of Cancer Control & Population Sciences, National Cancer Institute, National Institutes of Health, Bethesda, Maryland.

J Public Health Management Practice, 2008, 14(2), 193–198
Copyright © 2008 Wolters Kluwer Health | Lippincott Williams & Wilkins

In this special issue of the Journal of Public Health Management and Practice, the editors have taken on the important challenge of attempting to characterize the current landscape of knowledge translation research and practice in public health. This includes diffusion of scientific and program evaluation evidence into public health practice and policy, the dissemination and implementation of evidence-based interventions in public health practice, and the complex issues associated with the meaning and methods of dissemination and implementation research. Although the concepts of diffusion and dissemination date back more than a century,1 much of that work was done outside the context of health (eg, in agriculture). It is only in the past two decades that recognizing the gap between research discovery and program delivery and how to address it has been articulated in the context of health.2

A key reason why addressing the discovery-delivery gap in health has become so pressing of late is the growing recognition within the United States that our failure to address the slow and uneven spread of innovations from health research into practice and policy is a significant contributor to health disparities. Thus, it is those who have the most (ie, the 4 Ws—the Worried, White, Wealthy, and Well) rather than those who need the most who are usually first in line to benefit from the Nation's investment in health research.3

Of all the many challenges and opportunities that are addressed in this special issue, three of the most important for moving the field of dissemination and implementation science and research dissemination and implementation practice forward are the confusion of terminology, the meaning of evidence, and partnerships across the research, practice, and policy divides.

● What's in a Name?


A key challenge is one of language and meaning. The terms translational and translation research, knowledge translation and transfer, dissemination, diffusion and implementation (to name but a few) are used interchangeably to mean sometimes similar and sometimes different things in the literature.4,5 Within the US National Institutes of Health (NIH), there are some who hold that translational research not only applies to basic discovery-to-intervention development types of research but also extends to development-to-delivery research. Others, this author included, argue that it is important to distinguish translational research from dissemination and implementation research, because, in part, the context of the former is so different from the context of the latter. The context of translational research is relatively homogeneous (eg, academic medical centers) and relatively resource rich. In contrast, the contexts for translating intervention innovations, particularly in public health practice, are more heterogeneous in terms of both implementation resources and infrastructure to support dissemination and implementation of innovations.6

In this special issue, Rabin et al7 address the challenge of language and meaning and argue that a common language is essential. When different disciplines and organizations begin to work on the development of a field, that field is characterized by inconsistent terminology. They note that the same terms can mean different things and different terms can refer to the same thing, both within and across countries. They also acknowledge that, given the relative nascence of dissemination and implementation research, at least in the health context, it is premature to resolve all of the existing inconsistencies. Their glossary represents a starting point for resolving language differences in the field.

At a recent international think tank on implementation and translational research held outside Stockholm, Sweden,8 presenters from the health, social work, and education sectors demonstrated that inconsistencies of terminology and meaning exist not only within the health sector but also across and within these other research/practice contexts where translating evidence into practice is both a priority and a conundrum. From these presentations and subsequent discussions, an international team of conference participants representing multiple research and practice sectors is in the early stages of planning a multiphase effort to first develop lexicons of dissemination and implementation research terms within and across research/practice sectors and countries, and then use these lexicons to help set priorities for future international and multisector dissemination and implementation research and practice.

Rather than viewing this first step as creating an agreed-upon scientific "Esperanto" for dissemination and implementation, this effort will involve participants from several research/practice sectors and multiple countries to identify both common concepts and terms across sectors and unique concepts and terms within sectors. As such, the output of this first phase may be viewed as a "Rosetta Stone," allowing researchers and practitioners from multiple sectors and countries to communicate with each other about shared research and practice priorities for the dissemination and implementation of evidence-based practices and policies.

● Is Evidence, Like Beauty, in the Eye of the Beholder?

Different perspectives regarding the nature and quality of evidence, and a lack of agreement on what constitutes "best" evidence,9 suggest that for practitioners and policy makers alike, evidence, like beauty, is in the eye of the beholder. Such a person-specific and context-specific perspective implies that new models and strategies for integrating explicit knowledge from research with tacit and contextual knowledge from practice and policy experience (knowledge integration)10 may hold some promise over and above the relatively unidirectional approach of framing the translation challenge as always emphasizing the value of objective evidence gained from research over subjective evidence gained from practitioner and patient experience (knowledge transfer).

In a collaboration between the National Cancer Institute (NCI), the Centers for Disease Control and Prevention (CDC) Breast and Cervical Cancer Early Detection Program (BCCEDP), the American Cancer Society (ACS), and the US Department of Agriculture's (USDA's) County Cooperative Extension Service, these multisector research and practice partners initiated the Team-Up: Cancer Screening Saves Lives demonstration project. Team-Up focused on the dissemination and implementation of evidence-based breast and cervical cancer screening promotion programs in six states with multiple counties that had historically high rates of breast and cervical cancer mortality. A central research dissemination resource for this collaborative demonstration program was the Cancer Control PLANET (Plan, Link, Act, Network with Evidence-based Tools) Web portal.11 PLANET is the first comprehensive Web portal to provide access to evidence-based cancer control planning data and resources that can help cancer control planners, health educators, program staff, and researchers design, implement, and evaluate evidence-based cancer control programs.

TABLE 1 ● An informed decision-making model for selecting interventions to disseminate and/or implement

                                             Types of programs
Type of evidence review                      Research-tested            Evaluated                  Evidence-informed   Program based on personal
                                             intervention program^a     intervention program^b     program^c           experience/tacit knowledge^d
Systematic evidence reviews^e                1                          2                          NA                  NA
Individual efficacy/effectiveness study      2                          3                          4                   NA
Individual program evaluation/pilot study    NA                         5                          6                   8
Program description/process report           NA                         7                          8                   9

Abbreviation: NA, not applicable.
^a Tested in a peer-reviewed grant; outcome evaluation published in a peer-reviewed journal.
^b Outcome evaluation published in a peer-reviewed journal.
^c Based on the published literature.
^d No reference to published literature.
^e For example, Guide to Community Preventive Services, Guide to Clinical Preventive Services, Cochrane Reviews.

A key issue that emerged during the Cancer Control PLANET training of Team-Up cancer screening service promotion program teams was the concern expressed by CDC BCCEDP–funded program staff, USDA County Cooperative Extension agents, and ACS volunteers as to the contextual relevance of the intervention approaches and research-tested intervention programs available from the Cancer Control PLANET. This concern led the NCI to develop an evidence matrix to help program staff working with cancer control researchers make informed decisions about selecting the most evidence-based program that was also a "good fit" within the county service delivery context in which breast and cervical cancer screening was taking place. Table 1 displays this matrix.

As Table 1 suggests, there are many types of intervention programs and intervention approaches from which to choose, with different levels of evidence applied to each. Exemplar questions that program staff and policy makers often face when deciding how best to address a public health problem are as follows: (1) Which program or intervention approach has the strongest evidence of efficacy and effectiveness? (2) Which program or approach has the best fit for the service delivery context in which we are operating? (3) Which program or approach can most easily be adapted to improve the fit in our service delivery context or to meet the needs of our target populations? (4) How much flexibility do we have to adapt the program or the approach without seriously undermining the impact on outcomes? (5) Which program or intervention approach can we afford to implement within the resource base of our service delivery context? Researchers may be inclined to focus more on the first question, while practitioners may be more sensitive to the practical questions 2 to 5.

Because the answers to some of these questions may conflict with the answers to others, in every context a balance must be reached between, for example, implementing an intervention with fidelity to ensure positive outcomes and adapting it to increase contextual fit and implementation feasibility in real-world settings. At the Stockholm Think Tank, a lively debate took place about the issue of implementation fidelity versus program adaptation. One camp argued that implementing the interventions with fidelity was the best way, perhaps the only way, to ensure comparable outcomes; adapting the intervention to help with contextual fit could be considered later, as long as core intervention elements were preserved intact. The other camp pointed out that unless program implementers were able to adapt the intervention a priori to address contextual and population fit, and perhaps imbue a sense of program ownership of the evidence-based intervention being implemented, there would be a much lower likelihood of initiating implementation in the first place.

Moreover, identifying core elements from efficacy or effectiveness research study findings is rarely possible, because the size and expense of the factorial designs needed to sort out core program elements from those that could be changed or discarded usually mean that these types of research studies are neither proposed nor funded. Absent dissemination and implementation research data to test these competing hypotheses, this is a good example of one of many dissemination and implementation issues that can and should be addressed through D&I research. The NIH has, for the past several years, provided an important funding opportunity for researchers and practitioners working together to try to address issues like this example.12

To help Team-Up program staff sort through the matrix of program types and levels of evidence reflected in Table 1, the NCI focused on finding the range of program options available to them, assessing the contextual, population, and resource fit of the available programs, and selecting the program that had the most rigorous level of evidence supporting its efficacy or effectiveness and that also fit their needs.
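The selection logic just described (screen candidate programs for contextual, population, and resource fit, then prefer the most rigorous supporting evidence from Table 1) can be illustrated with a short sketch. The code below is purely illustrative and is not part of the Team-Up or PLANET tooling; the program names, Boolean fit flags, and the fallback rank are hypothetical stand-ins for judgments that program staff and researchers would make together.

# Illustrative sketch only: a toy encoding of the Table 1 rankings and the
# "fit first, then strongest evidence" heuristic described in the text.
# All program data below are hypothetical.

# Table 1 cells: lower rank = stronger evidence; NA cells are simply omitted.
EVIDENCE_RANK = {
    ("systematic review",        "research-tested"):     1,
    ("systematic review",        "evaluated"):           2,
    ("efficacy/effectiveness",   "research-tested"):     2,
    ("efficacy/effectiveness",   "evaluated"):           3,
    ("efficacy/effectiveness",   "evidence-informed"):   4,
    ("program evaluation/pilot", "evaluated"):           5,
    ("program evaluation/pilot", "evidence-informed"):   6,
    ("program evaluation/pilot", "personal experience"): 8,
    ("process report",           "evaluated"):           7,
    ("process report",           "evidence-informed"):   8,
    ("process report",           "personal experience"): 9,
}

def best_fitting_program(candidates):
    """Keep programs that fit the local context, population, and resources,
    then return the one with the most rigorous evidence (lowest rank)."""
    fitting = [c for c in candidates
               if c["contextual_fit"] and c["population_fit"] and c["affordable"]]
    if not fitting:
        return None
    return min(fitting,
               key=lambda c: EVIDENCE_RANK.get((c["evidence_type"], c["program_type"]), 10))

# Hypothetical candidate programs, for illustration only.
candidates = [
    {"name": "Program A", "evidence_type": "systematic review",
     "program_type": "research-tested", "contextual_fit": False,
     "population_fit": True, "affordable": True},
    {"name": "Program B", "evidence_type": "efficacy/effectiveness",
     "program_type": "evaluated", "contextual_fit": True,
     "population_fit": True, "affordable": True},
]

print(best_fitting_program(candidates)["name"])  # -> Program B

In practice, of course, questions 2 to 5 are matters of judgment rather than Boolean flags; the sketch only shows how the Table 1 rankings and the fit questions can be combined into a transparent choice.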

A key theme in this training and discussion was the idea that if research should influence practice, so should practice influence research.13 Thus, to the extent that programs are chosen from the lower right corner of the matrix (eg, cells 7–9), perhaps because of better contextual fit, it was suggested that program staff find and work with intervention researchers to conduct rigorous efficacy or effectiveness evaluations of those programs, so that lessons learned from these intervention efforts can further contribute to the evidence base from which future implementation choices can be made. Similarly, to the extent that interventions are selected from the upper left corner of the matrix (eg, cells 1–3) for implementation, it would be ideal if dissemination and implementation researchers would seek out and partner with program implementation staff to jointly study the factors that enhance implementation of evidence-based interventions. New and expanded resources are needed to create and support these research-practice partnerships and to develop communities of evidence-based practice.

● Partnerships: Who Will Work With Whom?

There are many levels at which partnerships to close the discovery-delivery gap, if expanded and supported, could greatly increase the pace and extent to which the lessons learned from science are integrated with the lessons learned from practice and policy. At the national level, the federal government spends tens of billions of dollars on research and hundreds of billions of dollars on service delivery but spends "decimal dust" on linking the lessons learned from each with the other.6 One key problem is that US government agencies are "siloed," making it difficult to overcome administrative and structural barriers to build bridges across the divide between research and practice.

For example, within the US Department of Health and Human Services (DHHS), the NIH has administrative and management procedures in place that make it relatively easy for one institute or center at the NIH to partner with another institute or center on a shared research funding initiative. However, if an NIH institute or center wants to partner with another DHHS agency more focused on delivering services than on research (eg, the CDC or the HRSA), pooling funds and expertise is much more challenging. This does not prevent collaboration, but it makes working collaboratively more time consuming and more expensive than working alone. Moreover, as budgets for both health research and the delivery of health services have recently flattened or been reduced, sharing resources and expertise in the face of doing more with less to meet agency-specific priorities makes collaboration that much more difficult.

On the policy side, how decisions are made about service delivery programs, and where to integrate the lessons learned from science into the decision-making process, is extremely complex. For example, Figure 1 provides a schematic of all the Legislative and Executive Branch units in the US government that were involved with child and family services as of 1993, based on a report issued in 1995.14

In the Legislative Branch alone, there were 10 committees and 20 subcommittees of the US House of Representatives and 9 committees and 13 subcommittees of the US Senate that exercised control over the programs and policies for children and family services. The elected members and the committee staff of all these committees and subcommittees would then interact with 10 departments within the Executive Branch, working with staff of multiple agencies within many of these 10 departments. Taken together, 88 separate US government entities had responsibility for 76 major federal programs for children and families. Thus, two key questions when considering how to integrate science with policy decision making are where, and with whom, to start.

On the legislative side, each committee and subcommittee is made up of elected officials (and their staff) who, in addition to holding a substantive interest in the service context that is the purview of the committee/subcommittee of which they are a member, are equally and perhaps more concerned about playing a leadership role in policy legislation that emerges from that committee/subcommittee, as well as about how that role will be viewed in the home district they were elected to represent. With respect to the executive branch agencies affected by this policy legislation, both their funding levels and their program priorities can be determined in part or in full by the policy decisions made through legislation. So, to the extent that they have program and/or policy discretion, they may be quite sensitive to how the members of the different legislative committees and subcommittees view their decisions. Finally, members of both the legislative committees and the executive branch agencies may be as much or more concerned about "self-promotion" as they are about promoting the health and social welfare of the constituents they were elected or appointed to serve. How best to integrate science into the program and policy decision making that takes place in this complex system is neither obvious nor easy to evaluate.

FIGURE 1 ● Executive Branch Departments and Congressional Committees That Control Programs for Children and Families and How They Relate to Each Other. Reproduced with permission from Dunkle.14

As at the federal level, research, practice, and policy partnerships at the state and local levels are complex: state and local health departments, state governors and legislators, county executives and boards of health, and city mayors and city councils are all potential partners for researchers and research institutions working to make sure that the lessons learned from health research benefit all populations and communities. Moreover, given that government, nongovernment organizations, and the private sector all contribute resources to support public health, and often compete with one another at every level, a key question is what incentives can be put in place to overcome turf issues, competition for "brand" recognition, and administrative and organizational barriers to research, practice, and policy partnerships.

The recognition by a number of national government and nongovernment organizations that, by working together, the whole could be greater than the sum of its parts led to a national effort to support the development and implementation of comprehensive cancer control plans in all 50 states and among tribes and territories within the United States. This national effort led to 50 state and several tribal and territorial comprehensive cancer control plans being successfully developed. In addition, a growing number of successful efforts to find the resources to implement these plans were initiated. This extensive array of collaborations serves as a call to action to public, private, and nonprofit organizations, governments at all levels, and individuals to renew their commitments to reducing the burden of cancer15 as well as other chronic diseases.

● Summary

In this special issue, the challenges and opportunities for learning from dissemination and implementation research, and for translating these findings into new research dissemination and implementation practices, are reviewed. This issue is of growing concern for the research, practice, and policy communities, because all Americans are stakeholders in ensuring that the return on our public sector investment in research is high and equitably distributed with respect to improved public health, clinical care services, and positive health outcomes.

Given that many in the research, practice, and policy sectors do not see themselves as having a central role in closing the gap among research, practice, and policy, new and expanded incentives need to be put in place to encourage these collaborations.

As long as the majority of scientists and scientific funding agencies do not see that they are not only stakeholders in the dissemination and implementation of evidence-based interventions and policy approaches but are also able to increase the number of research applications and the funding of science to inform research dissemination, the slow pace of translating research into practice will continue to suppress the return on our Nation's investment in health research. Similarly, unless the practice community comes to value the important role it must play in helping shape intervention science to achieve a greater balance between internal and external validity,16 many intervention innovations emerging from health research will continue to be viewed by the practice community as impractical and irrelevant. Consequently, public health and clinical care will continue to be delivered with suboptimal or, at best, inconsistent quality. Finally, many competing political pressures influence policy makers in deciding where to target resources and how to evaluate program success. To the extent that new and expanded partnerships between research, practice, and policy can help inform these important policy decisions, a better balance between evidence based on science and evidence based on personal experience can be achieved.

REFERENCES

1. Dearing JW. Evolution of diffusion and dissemination theory. J Public Health Manag Pract. 2008;14(2):99–108.
2. Lomas J. Diffusion, dissemination, and implementation: who should do what? Ann N Y Acad Sci. 1993;703:226–235.
3. Kerner JF, Dusenbury L, Mandelblatt JS. Poverty and cultural diversity: challenges for health promotion among the medically underserved. Annu Rev Public Health. 1993;14:355–377.
4. Ellis P, Ciliska D, Sussman J, et al. A systematic review of studies evaluating diffusion and dissemination of selected cancer control interventions. Health Psychol. 2005;24(5):488–500.
5. Venkat Narayan KM, Benjamin E, Gregg EW, Norris SL, Engelgau MM. Diabetes translation research: where are we and where do we want to be? Ann Intern Med. 2004;140:958–963.
6. Kerner JF. Knowledge translation versus knowledge integration: a "funder's" perspective. J Contin Educ Health Prof. 2006;26:72–80.
7. Rabin BA, Brownson RC, Haire-Joshu D, Kreuter MW, Weaver NL. A glossary for dissemination and implementation research in health. J Public Health Manag Pract. 2008;14(2):117–123.
8. Stockholm Conference on Implementation and Translational Research at Lejondal Castle, Stockholm, Sweden. Preconference papers and PowerPoint presentations. http://www.socialstyrelsen.se/IMS/implementeringskonferens.htm. Accessed October 15–16, 2007.
9. Haynes RB. What kind of evidence is it that evidence-based medicine advocates want health care providers and consumers to pay attention to? BMC Health Serv Res. 2002;2:3–10.
10. Glasgow R. Disseminating behavioral medicine research: making the translational leap. Presentation at: 26th Annual SBM Meeting, Symposium #22; 2005.
11. Kerner JF, Guirguis-Blake J, Hennessy KD, et al. Translating research into improved outcomes in comprehensive cancer control. Cancer Causes Control. 2005;16(suppl 1):27–40.
12. NIH Dissemination and Implementation Funding Opportunity Announcements. http://cancercontrol.cancer.gov/funding apply.html#dd. Accessed December 19, 2007.
13. Green LW, Glasgow RE. Evaluating the relevance, generalization, and applicability of research: issues in external validation and translation methodology. Eval Health Prof. 2006;29(1):126–153.
14. Dunkle MC. Who Controls Major Federal Programs for Children and Families: Rube Goldberg Revisited. Special Report #3. The Policy Exchange. Washington, DC: The Institute for Educational Leadership; 1995.
15. Given LS, Black B, Lowry G, Huang P, Kerner JF. Collaborating to conquer cancer: a comprehensive approach to cancer control. Cancer Causes Control. 2005;16(suppl 1):3–14.
16. Glasgow RE, Green LW, Klesges LM, et al. External validity: we need to do more. Ann Behav Med. 2006;31(2):105–108.
