Evaluation and Practice

Process evaluation is used to monitor and document program implementation and can aid in understanding the relationship between specific program elements and program outcomes. The scope and implementation of process evaluation have grown in complexity as its importance and utility have become more widely recognized. Several practical frameworks and models are available to practitioners to guide the development of a comprehensive evaluation plan, including process evaluation for collaborative community initiatives. However, frameworks for developing a comprehensive process-evaluation plan for targeted programs are less common. Building from previous frameworks, the authors present a comprehensive and systematic approach for developing a process-evaluation plan to assess the implementation of a targeted health promotion intervention. Suggested elements for process-evaluation plans include fidelity, dose (delivered and received), reach, recruitment, and context. The purpose of this article is to describe and illustrate the steps involved in developing a process-evaluation plan for any health-promotion program.

Keywords: process evaluation; measuring program implementation

Much emphasis is placed on outcome evaluation to determine whether a health-promotion program was successful. Process evaluation, which helps us understand why a program was or was not successful, is equally important (Bartholomew, Parcel, Kok, & Gottlieb, 2001; Steckler & Linnan, 2002a). In fact, process evaluation may be used to confirm that the intervention was indeed implemented before using resources to assess its effectiveness (Scheirer, Shediac, & Cassady, 1995). A program's lack of success could be attributed to any number of program-related reasons, including poor program design, poor or incomplete program implementation, and/or failure to reach sufficient numbers of the target audience. Process evaluation looks inside the so-called black box to see what happened in the program and how that could affect program impacts or outcomes (Bouffard, Taxman, & Silverman, 2003; Harachi, Abbott, Catalano, Haggerty, & Fleming, 1999).

In recent years, an increasing emphasis has been placed on measuring program implementation, in part because of great variability in program implementation and policy adoption in school and community settings (Dusenbury, Brannigan, Falco, & Hansen, 2003; Harachi et al., 1999; Helitzer, Yoon, Wallerstein, & Garcia-Velarde, 2000; McGraw et al., 2000; Scheirer et al., 1995;

The Authors

Ruth P. Saunders, PhD, is an associate professor in the Department of Health Promotion, Education, and Behavior in the Arnold School of Public Health at the University of South Carolina in Columbia, South Carolina.

Martin H. Evans, MS, is a doctoral candidate in the Department of Health Promotion, Education, and Behavior in the Arnold School of Public Health at the University of South Carolina in Columbia, South Carolina.

Praphul Joshi, PhD, MPH, is a research and planning administrator in the Division of Cardiovascular Health at the South Carolina Department of Environmental Control in Columbia, South Carolina.

Authors' Note: Correspondence concerning this article should be addressed to Ruth P. Saunders; Department of Health Promotion, Education, and Behavior; Arnold School of Public Health; University of South Carolina; 800 Sumter Street; Columbia, South Carolina 29208. Phone: (803) 777-2871; e-mail: rsaunders@sc.edu.

Health Promotion Practice, April 2005, Vol. 6, No. 2, 134-147. DOI: 10.1177/1524839904273387. ©2005 Society for Public Health Education.
[Figure: Media Matters logic model. Inputs → Immediate Impacts → Short-Term Impacts → Behavioral Impacts → Health Outcomes]
Providing training, materials, and consultation to teachers will result in the development of a Media Matters team and cross-disciplinary implementation of the Media Matters curriculum, which will result in changes in the school media environment, social norms, and development of students' self-efficacy and skills for abstaining from alcohol and tobacco use, which will result in reduced use of alcohol and tobacco and, ultimately, improved health in students.
1997; Zapka et al., 2004), existing structures of the organizations and groups, organizational social system characteristics (e.g., interorganizational linkages, community agency partnerships; Scheirer et al., 1995; Zapka et al., 2004), and factors in the external environment (e.g., competing events, controversy about the program, external political factors, and history and events that happen concurrently with the program; Scheirer et al., 1995; Viadro et al., 1997; Zapka et al., 2004).

The characteristics of the program are also an important influence on program implementation, including the program's age (new or old), size (large or small), coverage (single- or multisite, local, state, or national), and complexity (standardized or tailored intervention, single or multiple treatments; Viadro et al., 1997). Program characteristics and context affect process evaluation in at least two ways. First, important contextual factors should be identified and measured in process evaluation. Second, as program size and complexity increase, the resources required to monitor and measure implementation will increase.

> STEPS FOR DEVELOPING A PROCESS-EVALUATION PLAN

Six steps to developing a process-evaluation plan for an intervention, adapted from Steckler and Linnan (2002a), are illustrated in Figure 1. As reflected in Figure 1, process-evaluation planning is an iterative rather than linear process, particularly Steps 3 to 5. Each step of the process is described in more detail and illustrated with a case study in the following section. The steps presented in Figure 1 incorporate a detailed understanding of the program and its characteristics, contextual factors, and the purposes of the process evaluation.

Step 1: Describe the Program

In Step 1, the previously planned program is described fully, including its purpose, underlying theory, objectives, strategies, and the expected impacts and outcomes of the intervention. Ideally, this should
Saunders et al. / DEVELOPING A PROCESS-EVALUATION PLAN 137
COMPLETE AND ACCEPTABLE DELIVERY OF MEDIA MATTERS
The ideally implemented Media Matters program will consist of four essential components: an environmental component
focusing on creating a Media Matters intervention team; two curriculum modules, Deconstructing Media Messages and Youth-
Produced Media; and a teacher training component.
FIGURE 3 Media Matters Case Study—Step 2: Describe Complete and Acceptable Delivery of the Program
TABLE 1
Elements of a Process-Evaluation Plan, With Formative and Summative Applications

Fidelity (quality)
  Definition: Extent to which the intervention was implemented as planned.
  Formative application: Monitor and adjust program implementation as needed to ensure theoretical integrity and program quality.
  Summative application: Describe and/or quantify fidelity of intervention implementation.

Dose delivered (completeness)
  Definition: Amount or number of intended units of each intervention or component delivered or provided by interventionists.
  Formative application: Monitor and adjust program implementation to ensure all components of intervention are delivered.
  Summative application: Describe and/or quantify the dose of the intervention delivered.

Dose received (exposure)
  Definition: Extent to which participants actively engage with, interact with, are receptive to, and/or use materials or recommended resources; can include "initial use" and "continued use."
  Formative application: Monitor and take corrective action to ensure participants are receiving and/or using materials/resources.
  Summative application: Describe and/or quantify how much of the intervention was received.

Dose received (satisfaction)
  Definition: Participant (primary and secondary audiences) satisfaction with program, interactions with staff and/or investigators.
  Formative application: Obtain regular feedback from primary and secondary targets and use feedback as needed for corrective action.
  Summative application: Describe and/or rate participant satisfaction and how feedback was used.

Reach (participation rate)
  Definition: Proportion of the intended priority audience that participates in the intervention; often measured by attendance; includes documentation of barriers to participation.
  Formative application: Monitor numbers and characteristics of participants; ensure sufficient numbers of target population are being reached.
  Summative application: Quantify how much of the intended target audience participated in the intervention; describe those who participated and those who did not.

Recruitment
  Definition: Procedures used to approach and attract participants at individual or organizational levels; includes maintenance of participant involvement in intervention and measurement components of study.
  Formative application: Monitor and document recruitment procedures to ensure protocol is followed; adjust as needed to ensure reach.
  Summative application: Describe recruitment procedures.

Context
  Definition: Aspects of the environment that may influence intervention implementation or study outcomes; includes contamination or the extent to which the control group was exposed to the program.
  Formative application: Monitor aspects of the physical, social, and political environment and how they impact the implementation and needed corrective action.
  Summative application: Describe and/or quantify aspects of the environment that affected program implementation and/or program impacts or outcomes.

NOTE: Adapted from Steckler and Linnan (2002a) and Baranowski and Stables (2000).
be conveyed in a logic model that specifies the theoretical constructs of interest, those expected to change, and mediators of the change process (Scheirer et al., 1995; Steckler & Linnan, 2002a). A description of Media Matters, the fictional case example we will use to illustrate process-evaluation planning, is provided in Figure 2. Media Matters is a school-based program with environmental and curricula components designed to reduce alcohol and tobacco use among youth.

Step 2: Describe Complete and Acceptable Delivery of the Program

The elements that comprise the program are described in more detail in the second step of process-evaluation planning; this includes specific strategies, activities, media products, and staff behaviors (Scheirer et al., 1995). The goal of this step is to state what would be entailed in complete and acceptable delivery of the program (Bartholomew et al., 2001). A description of complete and acceptable delivery of the program should be based on the details of the program (e.g., program components, theory, and elements in logic model) and guided by an external framework such as the rec-
TABLE 2
Potential Process-Evaluation Questions for a Generic School-Based Program and the Information Needed to Answer Each

Fidelity
  1. Question: To what extent was the intervention implemented consistently with the underlying theory and philosophy?
     Information needed: What constitutes high-quality delivery for each component of the intervention? What specific behaviors of staff reflect the theory and philosophy?
  2. Question: To what extent was training provided as planned (consistent with the underlying theory and/or philosophy)?
     Information needed: What behaviors of trainers convey the underlying theory and philosophy?

Dose delivered
  3. Question: To what extent were all of the intended units or components of the intervention or program provided to program participants?
     Information needed: How many units/components (and subcomponents as applicable) are in the intervention?
  4. Question: To what extent were all materials (written and audiovisual) designed for use in the intervention used?
     Information needed: What specific materials are supposed to be used and when should they be used?
  5. Question: To what extent was all of the intended content covered?
     Information needed: What specific content should be included and when should it be covered? What is the minimum and maximum time to spend on the content?
  6. Question: To what extent were all of the intended methods, strategies, and/or activities used?
     Information needed: What specific methods, strategies, and/or activities should be used in what sessions?

Dose received
  7. Question: To what extent were participants present at intervention activities engaged in the activities?
     Information needed: What participant behaviors indicate being engaged?
  8. Question: How did participants react to specific aspects of the intervention?
     Information needed: With what specific aspects of the intervention (e.g., activities, materials, training, etc.) do we want to assess participant reaction or satisfaction?
  9. Question: To what extent did participants engage in recommended follow-up behavior?
     Information needed: What are the expected follow-up behaviors: reading materials, engaging in recommended activities, or using resources?

Reach
  10. Question: What proportion of the priority target audience participated in (attended) each program session? How many participated in at least one half of possible sessions?
      Information needed: What is the total number of people in the priority population?

Recruitment
  11. Question: What planned and actual recruitment procedures were used to attract individuals, groups, and/or organizations?
      Information needed: What mechanisms should be in place to document recruitment procedures?
  12. Question: What were the barriers to recruiting individuals, groups, and organizations?
      Information needed: How will we systematically identify and document barriers to participation?
  13. Question: What planned and actual procedures were used to encourage continued involvement of individuals, groups, and organizations?
      Information needed: How will we document efforts for encouraging continued involvement in intervention?
  14. Question: What were the barriers to maintaining involvement of individuals, groups, and organizations?
      Information needed: What mechanisms should be in place to identify and document barriers encountered in maintaining involvement of participants?

Context
  15. Question: What factors in the organization, community, social/political context, or other situational issues could potentially affect either intervention implementation or the intervention outcome?
      Information needed: What approaches will be used to identify and systematically assess organizational, community, social/political, and other contextual factors that could affect the intervention? Once identified, how will these be monitored?
Design
  Description: Timing of data collection: when and how often data are to be collected.
  Example: Observe classroom activities at least twice per semester with at least 2 weeks between observations. Conduct focus groups with participants in the last month of the program.

Data sources
  Description: Source of information (e.g., who will be surveyed, observed, interviewed, etc.).
  Example: For both quantitative and qualitative approaches, data sources include participants, teachers/staff delivering the program, records, the environment, etc.

Data collection tools or measures
  Description: Instruments, tools, and guides used for gathering process-evaluation data.
  Example: For both quantitative and qualitative approaches, tools include surveys, checklists, observation forms, interview guides, etc.

Data collection procedures
  Description: Protocols for how the data collection tool will be administered.
  Example: Detailed description of how to do quantitative and/or qualitative classroom observation, face-to-face or phone interview, mailed surveys, etc.

Data management
  Description: Procedures for getting data from field and entered; quality checks on raw data forms and data entry.
  Example: Staff turn in participant sheets weekly; process-evaluation coordinator collects and checks surveys and gives them to data entry staff. Interviewers transcribe information and turn in tapes and complete transcripts at the end of the month.

Data analysis/synthesis
  Description: Statistical and/or qualitative methods used to analyze, synthesize, and/or summarize data.
  Example: Statistical analysis and software that will be used to analyze quantitative data (e.g., frequencies and chi-squares in SAS). Type of qualitative analysis and/or software that will be used (e.g., NUD*IST to summarize themes).
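A methods grid of this kind can also be kept as structured data so that each evaluation element carries its own design, sources, tools, procedures, and analysis plan. The sketch below is purely illustrative and is not from the article; every field name and value is invented:

```python
# Hypothetical sketch: one element of a process-evaluation methods grid
# captured as structured data. All names and values here are invented.
methods_plan = {
    "fidelity": {
        "design": "observe each class at least twice per semester",
        "data_sources": ["teachers", "program staff"],
        "tools": ["self-report checklist", "observation checklist"],
        "procedures": "teachers report weekly; observers follow written protocol",
        "analysis": "percentage of intended characteristics included",
    },
}

# Completeness check: every element must specify each planning field.
REQUIRED_FIELDS = {"design", "data_sources", "tools", "procedures", "analysis"}
for element, spec in methods_plan.items():
    missing = REQUIRED_FIELDS - spec.keys()
    assert not missing, f"{element} is missing fields: {sorted(missing)}"
print("methods plan is complete")
```

One advantage of this representation is that the completeness check mirrors the planning discipline the grid itself imposes: no element can be added without stating how it will be designed, collected, and analyzed.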
Step 3: Develop a List of Potential Process-Evaluation Questions

In Step 3, the initial wish list of possible process-evaluation questions based on the program (without full consideration of resources needed) is drafted. (Alternatively, these can be stated as process objectives.) This initial draft of process-evaluation questions can be organized by intervention component and guided by the elements of a process-evaluation plan. The elements (or components) of a process-evaluation plan are provided in Table 1 and include fidelity, dose delivered, dose received, reach, recruitment, and context (developed from Baranowski & Stables, 2000; Steckler & Linnan, 2002a). Each component can be used for both formative and summative purposes, as shown in the table.

To illustrate, a sampling of potential process-evaluation questions for a so-called generic school-based program is presented in Table 2; also noted is the information needed to answer each process-evaluation question, which should also be identified in this step. As noted previously, defining fidelity can be challenging and should be based on the underlying theory and philosophy of the program. Specifying dose delivered is fairly straightforward after the components of the intervention are defined because this pertains to the parts of the intervention that staff deliver. Dose received, the expected participant reaction or involvement, is somewhat more challenging in that it requires thoughtful consideration of the expected reactions and/or behaviors in participants. Note that dose received is different from reach, discussed below. Although the quality of training is presented as a fidelity question in Table 2, it can also be considered a separate intervention component in which dose, reach, and fidelity are assessed, as in the case example (see Figure 3).

Process evaluation for reach, recruitment, and context often involves documentation and record keeping. Part of the planning for these elements involves specifying the specific elements to document or monitor. Attendance or records of participation often assess the reach of the intervention into the priority population. However, attendance records must be compared with the number of potential program participants (the number of people in the priority population) to be meaningful. Note that reach can be reported
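The comparison behind reach is a simple proportion: participants attending divided by the size of the priority population. As a purely illustrative sketch (not part of the original article, and with invented numbers), it can be expressed as:

```python
# Hypothetical illustration: reach as the proportion of the priority
# population that participated in the intervention.
def reach(participants_attending: int, priority_population: int) -> float:
    """Proportion of the intended priority audience reached."""
    if priority_population <= 0:
        raise ValueError("priority population must be positive")
    return participants_attending / priority_population

# Invented example: 412 of 520 eighth graders attended at least one session.
print(f"Reach: {reach(412, 520):.0%}")  # Reach: 79%
```

The point of the sketch is the denominator: attendance counts alone say nothing about reach until they are divided by the number of people who could have participated.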
FIGURE 5 Media Matters Case Study—Step 4: Determine Methods for Process Evaluation

Fidelity
  Process-evaluation question: 1. To what extent was the curriculum implemented as planned?
  Data sources: Teachers and Media Matters staff
  Tools/procedures: Self-reported checklist and observation with checklist
  Timing of data collection: Teachers report weekly; at least two observations per teacher scheduled
  Data analysis or synthesis: Calculate score based on percentage of intended characteristics included
  Reporting: Formative—informal feedback to staff weekly; summative—summarized by module, school, and overall

Dose delivered
  Process-evaluation question: 2. To what extent were all modules and units within the curriculum implemented?
  Data sources: Teachers and Media Matters staff
  Tools/procedures: Self-reported checklist and observation with checklist
  Timing of data collection: Teachers report weekly; at least two observations per teacher scheduled
  Data analysis or synthesis: Calculate score based on percentage of intended modules and units included
  Reporting: Formative—informal feedback to staff weekly; summative—summarized by module, school, and overall

Dose received
  Process-evaluation questions: 3. Did students enjoy the Media Matters curriculum and activities? 4. Were Media Matters instructors satisfied with the curriculum?
  Data sources: Teachers and students
  Tools/procedures: Focus groups with open-ended questions for teachers; brief satisfaction scales administered to students
  Timing of data collection: Focus groups and scales administered at end of semester in which curriculum implemented
  Data analysis or synthesis: Teachers—themes identified through qualitative analysis; students—response frequencies summarized
  Reporting: Summative—reported after curriculum implementation is completed

Reach
  Process-evaluation question: 5. Was the intervention delivered to at least 80% of the eighth-grade students?
  Data sources: Teachers
  Tools/procedures: Classroom rosters
  Timing of data collection: Taken for each class in which Media Matters is taught
  Data analysis or synthesis: Look at number of students participating in at least 80% of class sessions/total number of students
  Reporting: Formative—report weekly by class; summative—report by module, school, and overall

Recruitment
  Process-evaluation question: 6. What procedures were followed to recruit teachers to Media Matters training?
  Data sources: Media Matters staff
  Tools/procedures: Media Matters staff document all recruitment activities
  Timing of data collection: Daily
  Data analysis or synthesis: Narrative description of procedures
  Reporting: Formative—examined weekly to prevent/solve problems; summative—described for project overall

Context
  Process-evaluation question: 7. What were barriers and facilitators to implementing the Media Matters curriculum?
  Data sources: Teachers
  Tools/procedures: Focus groups with open-ended questions for teachers
  Timing of data collection: Administered at end of semester in which curriculum was implemented
  Data analysis or synthesis: Themes identified through qualitative analysis
  Reporting: Summative—reported after curriculum implementation is completed
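The fidelity and dose-delivered analyses in Figure 5 both reduce to the same tabulation: the percentage of intended characteristics (or modules and units) actually included, summarized by module, school, and overall. The sketch below is an invented illustration of that tabulation, not code from the article; the record layout and all numbers are hypothetical:

```python
from collections import defaultdict

# Invented example records: each checklist/observation notes how many of the
# intended characteristics (or modules/units) were actually included.
records = [
    {"school": "A", "module": 1, "included": 9, "intended": 10},
    {"school": "A", "module": 2, "included": 7, "intended": 10},
    {"school": "B", "module": 1, "included": 10, "intended": 10},
]

def percent_included(records, by):
    """Percentage of intended characteristics included, grouped by a key."""
    included, intended = defaultdict(int), defaultdict(int)
    for r in records:
        included[r[by]] += r["included"]
        intended[r[by]] += r["intended"]
    return {key: 100 * included[key] / intended[key] for key in intended}

print(percent_included(records, by="school"))  # {'A': 80.0, 'B': 100.0}
print(percent_included(records, by="module"))  # {1: 95.0, 2: 70.0}
```

Grouping by different keys gives the formative view (weekly feedback per school or teacher) and the summative view (scores by module and overall) from one set of records.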
McGraw, S. A., Stone, E. J., Osganian, S. K., Elder, J. P., Perry, C. L., Johnson, C. C., et al. (1994). Design of process evaluation within the child and adolescent trial for cardiovascular health (CATCH). Health Education Quarterly, Suppl. 2, S5-S26.

Millstein, B., Wetterhall, S., & CDC Evaluation Working Group. (2000). A framework featuring steps and standards for program evaluation. Health Promotion Practice, 1(3), 221-228.

Resnicow, K., Davis, M., Smith, M., Lazarus-Yaroch, A., Baranowski, T., Baranowski, J., et al. (1998). How best to measure implementation of school health curricula: A comparison of three measures. Health Education Research, 13(2), 239-250.

Scheirer, M. A., Shediac, M. C., & Cassady, C. E. (1995). Measuring the implementation of health promotion programs: The case of the breast and cervical-cancer program in Maryland. Health Education Research, 10(1), 11-25.

Steckler, A., & Linnan, L. (2002a). In A. Steckler & L. Linnan (Eds.), Process evaluation for public health interventions and research (pp. 1-24). San Francisco: Jossey-Bass.

Steckler, A., & Linnan, L. (Eds.). (2002b). Process evaluation for public health interventions and research. San Francisco: Jossey-Bass.

Viadro, C., Earp, J. A. L., & Altpeter, M. (1997). Designing a process evaluation for a comprehensive breast cancer screening intervention: Challenges and opportunities. Evaluation and Program Planning, 20(3), 237-249.

Zapka, J., Goins, K. V., Pbert, L., & Ockene, J. K. (2004). Translating efficacy research to effectiveness studies in practice: Lessons from research to promote smoking cessation in community health care centers. Health Promotion Practice, 5(3), 245-255.