Developing a Process-Evaluation Plan for Assessing Health Promotion Program Implementation: A How-To Guide

Ruth P. Saunders, PhD
Martin H. Evans, MS
Praphul Joshi, PhD, MPH

Process evaluation is used to monitor and document program implementation and can aid in understanding the relationship between specific program elements and program outcomes. The scope and implementation of process evaluation have grown in complexity as its importance and utility have become more widely recognized. Several practical frameworks and models are available to practitioners to guide the development of a comprehensive evaluation plan, including process evaluation for collaborative community initiatives. However, frameworks for developing a comprehensive process-evaluation plan for targeted programs are less common. Building from previous frameworks, the authors present a comprehensive and systematic approach for developing a process-evaluation plan to assess the implementation of a targeted health-promotion intervention. Suggested elements for process-evaluation plans include fidelity, dose (delivered and received), reach, recruitment, and context. The purpose of this article is to describe and illustrate the steps involved in developing a process-evaluation plan for any health-promotion program.

Keywords: process evaluation; measuring program implementation

The Authors

Ruth P. Saunders, PhD, is an associate professor in the Department of Health Promotion, Education, and Behavior in the Arnold School of Public Health at the University of South Carolina in Columbia, South Carolina.

Martin H. Evans, MS, is a doctoral candidate in the Department of Health Promotion, Education, and Behavior in the Arnold School of Public Health at the University of South Carolina in Columbia, South Carolina.

Praphul Joshi, PhD, MPH, is a research and planning administrator in the Division of Cardiovascular Health at the South Carolina Department of Health and Environmental Control in Columbia, South Carolina.

Authors' Note: Correspondence concerning this article should be addressed to Ruth P. Saunders; Department of Health Promotion, Education, and Behavior; Arnold School of Public Health; University of South Carolina; 800 Sumter Street; Columbia, South Carolina 29208. Phone: (803) 777-2871; e-mail: rsaunders@sc.edu.

Health Promotion Practice, April 2005, Vol. 6, No. 2, 134-147. DOI: 10.1177/1524839904273387. ©2005 Society for Public Health Education.

> FOREWORD FROM THE EDITORS

We are pleased to highlight Saunders, Evans, and Joshi's article, which offers a comprehensive, systematic approach for developing a process-evaluation plan. In this article, the authors describe the main elements of process-evaluation planning (fidelity, dose, reach, recruitment, and context) and its six steps (as adapted from Steckler & Linnan, 2002b). The real contribution of the work is that they then provide the reader with a step-by-step evaluation plan for a hypothetical school-based media training program, Media Matters.

Many health-promotion and disease-prevention programs are being implemented across the country with limited resources, often provided primarily by volunteers. We know that many of you who work in school and community settings are expected to document your programs and their effects, not to perform expensive, complex program evaluations. Effective process evaluation can be used to monitor program efforts so that programs are implemented as planned, use resources where needed, and change when appropriate. Process-assessment data help provide accountability for administrators and funders and let planners know why the program worked or did not work. Process evaluation will help staff assess whether the program is reaching its intended audience and provide a clear description of the program that was implemented so that it can be disseminated and shared with others. As always, one of the main goals of program evaluation is to make sure it is useful—to you, your administration, your funders, and your program participants. The following article is offered in the spirit of providing another practical, useful tool for our readers.

Frances Butterfoss and Vincent T. Francisco

The Editors

Frances D. Butterfoss, PhD, is an associate professor and directs the Health Promotion and Disease Prevention Section at the Center for Pediatric Research, Norfolk, VA.

Vince Francisco, PhD, is an associate professor of Public Health Education at the University of North Carolina, Greensboro.

Much emphasis is placed on outcome evaluation to determine whether a health-promotion program was successful. Process evaluation, which helps us understand why a program was or was not successful, is equally important (Bartholomew, Parcel, Kok, & Gottlieb, 2001; Steckler & Linnan, 2002a). In fact, process evaluation may be used to confirm that the intervention was indeed implemented before resources are used to assess its effectiveness (Scheirer, Shediac, & Cassady, 1995). A program's lack of success could be attributed to any number of program-related reasons, including poor program design, poor or incomplete program implementation, and/or failure to reach sufficient numbers of the target audience. Process evaluation looks inside the so-called black box to see what happened in the program and how that could affect program impacts or outcomes (Bouffard, Taxman, & Silverman, 2003; Harachi, Abbott, Catalano, Haggerty, & Fleming, 1999).

In recent years, an increasing emphasis has been placed on measuring program implementation, in part because of great variability in program implementation and policy adoption in school and community settings (Dusenbury, Brannigan, Falco, & Hansen, 2003; Harachi et al., 1999; Helitzer, Yoon, Wallerstein, & Garcia-Velarde, 2000; McGraw et al., 2000; Scheirer et al., 1995; Zapka, Goins, Pbert, & Ockene, 2004). Several practical frameworks and models to guide the development of comprehensive evaluation plans, including process evaluation for collaborative community initiatives, have been developed. Included among these are Prevention Plus III (Linney & Wandersman, 1991), Community Coalition Action Theory (Butterfoss & Kegler, 2002), Getting to Outcomes (Chinman et al., 2001), and the CDC Framework (Millstein, Wetterhall, & CDC Evaluation Working Group, 2000). Although comprehensive evaluation plans such as these are available to practitioners, frameworks for developing a comprehensive process-evaluation plan for targeted programs are less common (Steckler & Linnan, 2002a). Recent advances have occurred in identifying and clarifying the components of process evaluation (Baranowski & Stables, 2000; Steckler & Linnan, 2002b). Steckler and Linnan (2002a) also provided an overview of the key steps in planning process evaluation as well as examples of process evaluation from studies in a variety of settings (Steckler & Linnan, 2002b).

In this article, we work from these previously identified process-evaluation components, definitions, and steps to present a comprehensive, systematic, and user-friendly approach to planning process evaluation in public health interventions, with an emphasis on assessing targeted program implementation. Specifically, we provide an in-depth guide for developing a plan to assess program implementation, which is one component of a comprehensive program-evaluation plan (e.g., Step 6, program plan, and Step 7, quality of implementation, of the Getting to Outcomes framework; Chinman et al., 2001). The process presented in this article is especially suited for planning process evaluation for targeted programs that are implemented as part of a coalition activity (e.g., Level 2, coalition evaluation; Butterfoss & Francisco, 2004) or as a stand-alone program in a specific setting (e.g., schools, work sites). The development of the process-evaluation plan is illustrated with a proposed school-based case study, Media Matters.

> INTRODUCTION TO PROCESS-EVALUATION PLANNING

Ideally, process-evaluation planning is conducted with a collaborative planning team that includes key stakeholders with a multidisciplinary professional perspective and an understanding of the iterative nature of process-evaluation planning (Bartholomew et al., 2001; Butterfoss & Francisco, 2004; Steckler & Linnan, 2002a). Important issues that must be addressed when developing a process-evaluation plan include (a) understanding the health-promotion program and how it is supposed to work, (b) defining the purposes for the process evaluation, and (c) considering program characteristics and context and how these may affect implementation. Having a well-planned and theory-based health-promotion program is the beginning point for planning process evaluation.
The approach to process-evaluation planning described in this article assumes that the intervention has been planned in detail with guidance from appropriate theories and/or a conceptual model.

Broadly speaking, process-evaluation data can be used for both formative and summative purposes. Formative uses of process evaluation involve using process-evaluation data to fine-tune the program (e.g., to keep the program on track; Devaney & Rossi, 1997; Helitzer et al., 2000; Viadro, Earp, & Altpeter, 1997). Summative uses of process evaluation involve making a judgment about the extent to which the intervention was implemented as planned and reached intended participants (Devaney & Rossi, 1997; Helitzer et al., 2000). This information, in turn, can be used to interpret and explain program outcomes, analyze how a program works, and provide input for future planning (Baranowski & Stables, 2000; Dehar, Casswell, & Duignan, 1993; McGraw et al., 1994, 1996). Beyond these broad purposes, a process-evaluation plan will also have specific purposes that are unique to the program for which it is being designed, as described in the next section.

FIGURE 1 Steps in the Process-Evaluation Process:
1. Describe the program.
2. Describe complete and acceptable program delivery.
3. Develop a potential list of questions.
4. Determine methods.
5. Consider program resources, context, and characteristics.
   (Steps 3 through 5 are considered iteratively.)
6. Finalize the process-evaluation plan.

Health-behavior change is made in ongoing social systems that typically involve participating agencies (e.g., schools), program implementers (e.g., teachers), a proximal target person (e.g., a student), and a distal target person (e.g., a parent; Baranowski & Stables, 2000). Assessment of program implementation requires taking into account the surrounding social systems, including characteristics of the organization in which the program is being implemented, characteristics of the persons delivering the program (Viadro et al., 1997; Zapka et al., 2004), existing structures of the organizations and groups, organizational social-system characteristics (e.g., interorganizational linkages, community agency partnerships; Scheirer et al., 1995; Zapka et al., 2004), and factors in the external environment (e.g., competing events, controversy about the program, external political factors, and history and events that happen concurrently with the program; Scheirer et al., 1995; Viadro et al., 1997; Zapka et al., 2004).
PROGRAM DESCRIPTION FOR MEDIA MATTERS


Media Matters is a school-based program designed to decrease adolescent risk behaviors by increasing individual and group
empowerment among the participants. The overarching concept of the Media Matters program is that by developing an under-
standing of how media work—how they construct reality, create meaning, and influence behavior—students can increase their
self-efficacy regarding their ability to deconstruct media messages and, thereby, become more critically aware of and resistant to
the unhealthy behaviors that media sometimes reinforce. Because this curriculum includes a component in which students pro-
duce their own media, they also have an opportunity to put into practice the concepts and skills that they learn such as coopera-
tion, team building, and responsibility. A venue is also provided for the students to create their own messages to influence the
social norms regarding behaviors and issues with which they are directly involved. The highly participatory classroom strate-
gies include small group work, demonstration and feedback, problem solving, and role playing.
The theoretical framework of the program is based largely on Bandura’s (1986) Social Cognitive Theory (SCT). The
primary constructs of SCT, on which the Media Matters curriculum focuses, are self-efficacy, behavioral capability, situ-
ation, observational learning, knowledge, and attitudes. The primary objectives of the Media Matters intervention are to
increase adolescents’ self-efficacy for abstaining from alcohol and tobacco consumption and to influence the social
norms within the school environment so that alcohol and tobacco consumption are viewed as unhealthy choices. The
Media Matters logic model is provided below:

Inputs -> Immediate Impacts -> Short-Term Impacts -> Behavioral Impacts -> Health Outcomes

Providing training, materials, and consultation to teachers will result in the development of a Media Matters team and cross-disciplinary implementation of the Media Matters curriculum, which will result in changes in the school media environment, social norms, and the development of students' self-efficacy and skills for abstaining from alcohol and tobacco use, which will result in reduced use of alcohol and tobacco and, ultimately, improved health in students.

FIGURE 2 Media Matters Case Study—Step 1: Describe the Program

The characteristics of the program are also an important influence on program implementation, including the program's age (new or old), size (large or small), coverage (single- or multisite; local, state, or national), and complexity (standardized or tailored intervention, single or multiple treatments; Viadro et al., 1997). Program characteristics and context affect process evaluation in at least two ways. First, important contextual factors should be identified and measured in process evaluation. Second, as program size and complexity increase, the resources required to monitor and measure implementation will increase.

> STEPS FOR DEVELOPING A PROCESS-EVALUATION PLAN

Six steps to developing a process-evaluation plan for an intervention, adapted from Steckler and Linnan (2002a), are illustrated in Figure 1. As reflected in Figure 1, process-evaluation planning is an iterative rather than linear process, particularly Steps 3 to 5. Each step of the process is described in more detail and illustrated with a case study in the following section. The steps presented in Figure 1 incorporate a detailed understanding of the program and its characteristics, contextual factors, and the purposes of the process evaluation.

Step 1: Describe the Program

In Step 1, the previously planned program is described fully, including its purpose, underlying theory, objectives, strategies, and the expected impacts and outcomes of the intervention. Ideally, this should be conveyed in a logic model that specifies the theoretical constructs of interest, those expected to change, and mediators of the change process (Scheirer et al., 1995; Steckler & Linnan, 2002a). A description of Media Matters, the fictional case example we will use to illustrate process-evaluation planning, is provided in Figure 2. Media Matters is a school-based program with environmental and curricula components designed to reduce alcohol and tobacco use among youth.
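
For teams that keep their logic model alongside evaluation records, it can help to encode the model's causal chain in a simple, ordered structure before drawing it. The minimal Python sketch below is our illustration only; the stage labels and wording are condensed from Figure 2, and nothing in the program itself prescribes this encoding:

    # Hypothetical encoding of the Media Matters logic model (condensed from Figure 2).
    # Each entry is (stage, expected result); list order implies "leads to".
    logic_model = [
        ("Inputs", "training, materials, and consultation provided to teachers"),
        ("Immediate impacts", "Media Matters team formed; curriculum implemented "
                              "across disciplines"),
        ("Short-term impacts", "school media environment and social norms change; "
                               "students gain self-efficacy and refusal skills"),
        ("Behavioral impacts", "reduced alcohol and tobacco use"),
        ("Health outcomes", "improved student health"),
    ]

    for stage, result in logic_model:
        print(f"{stage}: {result}")
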
COMPLETE AND ACCEPTABLE DELIVERY OF MEDIA MATTERS
The ideally implemented Media Matters program will consist of four essential components: an environmental component
focusing on creating a Media Matters intervention team; two curriculum modules, Deconstructing Media Messages and Youth-
Produced Media; and a teacher training component.

The Environmental Component


A Media Matters team will be formed at each school to develop a strategic plan for delivering the program across several dis-
ciplines within the eighth-grade curriculum as documented by a written plan. Team membership will include at least one
administrator, two teachers, and appropriate staff (media specialist/librarian). Additionally, the school environment will be
supportive of media literacy/education by (a) strongly limiting the children’s exposure to commercial media such as advertise-
ments within the school (i.e., Channel 1 closed circuit TV, etc.) and (b) providing exhibition space and/or presentation time for
counteradvertising media messages produced by eighth-grade students.

The Curriculum Component


The Media Matters curriculum comprises two primary modules—Deconstructing Media Messages and Youth-Produced Media—and is designed to be delivered during an 8-week period with an average time of 1.5 to 2 hours per week spent
directly on the curriculum. The curriculum is designed to be highly participatory. Implementers are encouraged to engage the
student participants in a manner that promotes critical thinking and discussion. These qualities will become more vital in the
latter portion of the intervention when students are required to perform some role-playing activities and work in teams.
For implementation to be considered ideal, the curriculum should be delivered across multiple disciplines within the
eighth-grade curriculum, one of which must include health. A likely combination of disciplines might include social studies,
language arts, and health promotion/prevention. The curriculum components are designed to be delivered in sequential order
because the knowledge and skills taught in the second require mastering those taught in the first.
The Deconstructing Media Messages module will contain, at a minimum, the following:
• Instructors will deliver Media Messages curriculum in which students view media examples with in-class discussion.
• Instructors will assign students a self-reflection assignment regarding media influence on personal behavior.
• Instructors will engage students in role-playing activities in which they view a scene from a movie depicting a particular
risk behavior and then act out how the character could have better handled the situation. They will then enact the conse-
quences of the behaviors.
The Youth-Produced Media module will be characterized by:
• Students working in teams to produce counteradvertising messages designed to promote healthy decisions related to alco-
hol and tobacco consumption.
• Students participating in the entire production process—brainstorming, scripting, storyboarding, and production.
• Students analyzing and critiquing each group’s final product based on its use of the various techniques to increase its
effectiveness.
• Counteradvertising messages will be publicly showcased.
The Training Component
Media Matters teachers will obtain needed skills (i.e., effective use of media in the classroom and teaching media production
skills) through training sessions delivered by Media Matters program developers. Training will include 1 week during summer
and booster training (a half day) in January. Additional consultation and support will be provided as needed. All necessary pro-
gram materials will be provided during training.
The training will be characterized by:
• Having the appropriate teachers present (reflecting multiple disciplines).
• Coverage of primary content.
• Demonstration of interactive classroom strategies.
• Involvement of teachers through small group discussion, demonstration and feedback, role play, and developing counter
messages.
The training will also address the Media Matters team, including:
• Appropriate team membership.
• Effective team functioning.
• The Media Matters team’s role and activities in the school.
• Involving teachers, administrators, and staff outside of the Media Matters team.

FIGURE 3 Media Matters Case Study—Step 2: Describe Complete and Acceptable Delivery of the Program

TABLE 1
Elements of a Process-Evaluation Plan, With Formative and Summative Applications

Fidelity (quality)
Purpose: Extent to which the intervention was implemented as planned.
Formative uses: Monitor and adjust program implementation as needed to ensure theoretical integrity and program quality.
Summative uses: Describe and/or quantify fidelity of intervention implementation.

Dose delivered (completeness)
Purpose: Amount or number of intended units of each intervention or component delivered or provided by interventionists.
Formative uses: Monitor and adjust program implementation to ensure all components of the intervention are delivered.
Summative uses: Describe and/or quantify the dose of the intervention delivered.

Dose received (exposure)
Purpose: Extent to which participants actively engage with, interact with, are receptive to, and/or use materials or recommended resources; can include "initial use" and "continued use."
Formative uses: Monitor and take corrective action to ensure participants are receiving and/or using materials/resources.
Summative uses: Describe and/or quantify how much of the intervention was received.

Dose received (satisfaction)
Purpose: Participant (primary and secondary audiences) satisfaction with the program and with interactions with staff and/or investigators.
Formative uses: Obtain regular feedback from primary and secondary targets and use feedback as needed for corrective action.
Summative uses: Describe and/or rate participant satisfaction and how feedback was used.

Reach (participation rate)
Purpose: Proportion of the intended priority audience that participates in the intervention; often measured by attendance; includes documentation of barriers to participation.
Formative uses: Monitor numbers and characteristics of participants; ensure sufficient numbers of the target population are being reached.
Summative uses: Quantify how much of the intended target audience participated in the intervention; describe those who participated and those who did not.

Recruitment
Purpose: Procedures used to approach and attract participants at individual or organizational levels; includes maintenance of participant involvement in the intervention and measurement components of the study.
Formative uses: Monitor and document recruitment procedures to ensure the protocol is followed; adjust as needed to ensure reach.
Summative uses: Describe recruitment procedures.

Context
Purpose: Aspects of the environment that may influence intervention implementation or study outcomes; includes contamination, or the extent to which the control group was exposed to the program.
Formative uses: Monitor aspects of the physical, social, and political environment and how they impact implementation, and take needed corrective action.
Summative uses: Describe and/or quantify aspects of the environment that affected program implementation and/or program impacts or outcomes.

NOTE: Adapted from Steckler and Linnan (2002a) and Baranowski and Stables (2000).

Step 2: Describe Complete and Acceptable Delivery of the Program

The elements that comprise the program are described in more detail in the second step of process-evaluation planning; this includes specific strategies, activities, media products, and staff behaviors (Scheirer et al., 1995). The goal of this step is to state what would be entailed in complete and acceptable delivery of the program (Bartholomew et al., 2001). A description of complete and acceptable delivery of the program should be based on the details of the program (e.g., program components, theory, and elements in the logic model) and guided by an external framework such as the recommended elements of a process-evaluation plan. These elements include fidelity (quality of implementation); dose (dose delivered—the amount of program delivered by implementers—and dose received—the extent to which participants receive and use materials or other resources); and reach (the degree to which the intended priority audience participates in the intervention; Steckler & Linnan, 2002a). These four elements, plus two additional elements, are described in more detail in Step 3 below.

Describing fidelity, or what constitutes high-quality implementation, is often a challenging task (Steckler & Linnan, 2002a). Theory can provide a possible guide for defining fidelity. To illustrate, based on theory, two possible sources of efficacy information are direct experience (experiencing success) and vicarious experience (observing models similar to oneself experiencing success; Bandura, 1986). This suggests two intervention strategies: modeling (participants observing persons similar to themselves practicing a skill) and guided practice (opportunity for participants to practice the skill with constructive feedback and in such a way that they experience success). Fidelity pertains to how well the implementation of these strategies reflects the spirit of the theory. For example, the models demonstrating the skills should be people the participants can identify with, and the verbal and nonverbal behavior of staff leading the guided practice should be emotionally positive, providing gentle redirection only as needed and carefully reinforcing the aspects of the skills practice that are done correctly. In Figure 3, the expected characteristics of the four components of Media Matters are described to illustrate Step 2.
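
One way to make "complete and acceptable delivery" operational is to restate the program description as an explicit checklist that implementers and observers can score. A minimal sketch follows, with checklist items condensed from the Deconstructing Media Messages description in Figure 3; the function name and scoring rule are hypothetical illustrations, not part of the published plan:

    # Hypothetical delivery checklist for the Deconstructing Media Messages module,
    # condensed from the "complete and acceptable delivery" description in Figure 3.
    REQUIRED_ITEMS = (
        "media examples viewed with in-class discussion",
        "self-reflection assignment on media influence given",
        "role play of risk-behavior scene and its consequences",
    )

    def delivery_score(observed):
        """Fraction of required items observed; 1.0 indicates complete delivery."""
        return sum(1 for item in REQUIRED_ITEMS if item in observed) / len(REQUIRED_ITEMS)

    # Example: an observer checked off two of the three required items.
    observed = {
        "media examples viewed with in-class discussion",
        "self-reflection assignment on media influence given",
    }
    print(f"delivery score: {delivery_score(observed):.0%}")  # 67%
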


TABLE 2
Sample Process-Evaluation Questions for Fidelity, Dose Delivered, Dose Received, Reach, Recruitment, and Context

Fidelity
1. To what extent was the intervention implemented consistently with the underlying theory and philosophy?
   Information needed: What constitutes high-quality delivery for each component of the intervention? What specific behaviors of staff reflect the theory and philosophy?
2. To what extent was training provided as planned (consistent with the underlying theory and/or philosophy)?
   Information needed: What behaviors of trainers convey the underlying theory and philosophy?

Dose delivered
3. To what extent were all of the intended units or components of the intervention or program provided to program participants?
   Information needed: How many units/components (and subcomponents, as applicable) are in the intervention?
4. To what extent were all materials (written and audiovisual) designed for use in the intervention used?
   Information needed: What specific materials are supposed to be used, and when should they be used?
5. To what extent was all of the intended content covered?
   Information needed: What specific content should be included, and when should it be covered? What is the minimum and maximum time to spend on the content?
6. To what extent were all of the intended methods, strategies, and/or activities used?
   Information needed: What specific methods, strategies, and/or activities should be used in what sessions?

Dose received
7. To what extent were participants present at and engaged in intervention activities?
   Information needed: What participant behaviors indicate being engaged?
8. How did participants react to specific aspects of the intervention?
   Information needed: With what specific aspects of the intervention (e.g., activities, materials, training) do we want to assess participant reaction or satisfaction?
9. To what extent did participants engage in recommended follow-up behavior?
   Information needed: What are the expected follow-up behaviors: reading materials, engaging in recommended activities, or using resources?

Reach
10. What proportion of the priority target audience participated in (attended) each program session? How many participated in at least one half of possible sessions?
    Information needed: What is the total number of people in the priority population?

Recruitment
11. What planned and actual recruitment procedures were used to attract individuals, groups, and/or organizations?
    Information needed: What mechanisms should be in place to document recruitment procedures?
12. What were the barriers to recruiting individuals, groups, and organizations?
    Information needed: How will we systematically identify and document barriers to participation?
13. What planned and actual procedures were used to encourage continued involvement of individuals, groups, and organizations?
    Information needed: How will we document efforts for encouraging continued involvement in the intervention?
14. What were the barriers to maintaining involvement of individuals, groups, and organizations?
    Information needed: What mechanisms should be in place to identify and document barriers encountered in maintaining involvement of participants?

Context
15. What factors in the organization, community, social/political context, or other situational issues could potentially affect either intervention implementation or the intervention outcome?
    Information needed: What approaches will be used to identify and systematically assess organizational, community, social/political, and other contextual factors that could affect the intervention? Once identified, how will these be monitored?

POTENTIAL PROCESS-EVALUATION QUESTIONS FOR MEDIA MATTERS

Through a series of meetings, program planners developed a list of potential process-evaluation questions that included the following:

Fidelity
• To what extent was each of the program elements implemented as planned (as described in "complete and acceptable delivery" in Figure 3)?

Dose Delivered
• Were all intervention components delivered (e.g., did teachers deliver all units of the curriculum)?

Dose Received
• To what extent did instructors participate on the Media Matters team (i.e., coordinate with other team members on a strategic plan for delivering the Media Matters program across several disciplines)?
• To what extent did instructors make changes in their own curriculum to incorporate Media Matters modules?
• Did the school remove commercial advertising/media from the school?
• Did students enjoy the Media Matters curriculum and activities?
• Were the Media Matters instructors in the intervention classes satisfied with the curriculum?

Reach
• Was the Media Matters curriculum delivered to at least 80% of the eighth-grade students?
• To what extent did Media Matters instructors participate in training?

Recruitment
• What procedures were followed to recruit teachers to training and to develop the Media Matters team?

Context
Organizational Factors
• Did the school allow common meeting time for Media Matters team planning?
• Did the school provide release time for implementers to attend trainings?
Other
• What other barriers and facilitators influenced delivery of the Media Matters program in the schools?

FIGURE 4 Media Matters Case Study—Step 3: Develop List of Potential Process-Evaluation Questions


TABLE 3
Issues to Consider When Planning Process-Evaluation Methods

Design
Definition: Timing of data collection: when and how often data are to be collected.
Examples: Observe classroom activities at least twice per semester with at least 2 weeks between observations; conduct focus groups with participants in the last month of the program.

Data sources
Definition: Source of information (e.g., who will be surveyed, observed, interviewed, etc.).
Examples: For both quantitative and qualitative approaches, data sources include participants, teachers/staff delivering the program, records, the environment, etc.

Data collection tools or measures
Definition: Instruments, tools, and guides used for gathering process-evaluation data.
Examples: For both quantitative and qualitative approaches, tools include surveys, checklists, observation forms, interview guides, etc.

Data collection procedures
Definition: Protocols for how the data collection tool will be administered.
Examples: Detailed description of how to do quantitative and/or qualitative classroom observation, face-to-face or phone interviews, mailed surveys, etc.

Data management
Definition: Procedures for getting data in from the field and entered; quality checks on raw data forms and data entry.
Examples: Staff turn in participant sheets weekly; the process-evaluation coordinator collects and checks surveys and gives them to data entry staff. Interviewers transcribe information and turn in tapes and complete transcripts at the end of the month.

Data analysis/synthesis
Definition: Statistical and/or qualitative methods used to analyze, synthesize, and/or summarize data.
Examples: Statistical analysis and software that will be used to analyze quantitative data (e.g., frequencies and chi-squares in SAS); type of qualitative analysis and/or software that will be used (e.g., NUD*IST to summarize themes).
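
The data analysis/synthesis row above names frequency counts as a typical quantitative summary. As a deliberately tiny, concrete illustration, the sketch below tallies hypothetical 1-to-5 satisfaction ratings; the data and scale are invented for this example, and any statistics package could produce the same summary:

    from collections import Counter

    # Hypothetical student responses to one satisfaction item (1 = low, 5 = high).
    responses = [5, 4, 4, 3, 5, 2, 4, 5, 3, 4]

    counts = Counter(responses)
    total = len(responses)
    for rating in sorted(counts):
        share = 100 * counts[rating] / total
        print(f"rating {rating}: {counts[rating]} responses ({share:.0f}%)")
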

Step 3: Develop a List of Potential Process-Evaluation Questions

In Step 3, an initial wish list of possible process-evaluation questions based on the program (without full consideration of the resources needed) is drafted. (Alternatively, these can be stated as process objectives.) This initial draft of process-evaluation questions can be organized by intervention component and guided by the elements of a process-evaluation plan. The elements (or components) of a process-evaluation plan are provided in Table 1 and include fidelity, dose delivered, dose received, reach, recruitment, and context (developed from Baranowski & Stables, 2000; Steckler & Linnan, 2002a). Each component can be used for both formative and summative purposes, as shown in the table.

To illustrate, a sampling of potential process-evaluation questions for a so-called generic school-based program is presented in Table 2; also noted is the information needed to answer each process-evaluation question, which should likewise be identified in this step. As noted previously, defining fidelity can be challenging and should be based on the underlying theory and philosophy of the program. Specifying dose delivered is fairly straightforward after the components of the intervention are defined, because this pertains to the parts of the intervention that staff deliver. Dose received, the expected participant reaction or involvement, is somewhat more challenging in that it requires thoughtful consideration of the expected reactions and/or behaviors in participants. Note that dose received is different from reach, discussed below! Although the quality of training is presented as a fidelity question in Table 2, training can also be considered a separate intervention component in which dose, reach, and fidelity are assessed, as in the case example (see Figure 3).

Process evaluation for reach, recruitment, and context often involves documentation and record keeping. Part of the planning for these elements involves specifying the particular items to document or monitor. Attendance or records of participation often assess the reach of the intervention into the priority population. However, attendance records must be compared with the number of potential program participants (the number of people in the priority population) to be meaningful. Note that reach can be reported for each session or activity within each component of the program as well as overall for the program. Effective process evaluation for recruitment involves a clear articulation of recruitment procedures as well as developing mechanisms to document recruitment activities and barriers to recruitment. Similarly, an effective assessment of contextual issues requires that the team identify potential factors in the organizational, community, and political/social environment that may affect program implementation or program outcome and that they develop effective mechanisms to monitor or assess these factors. See Figure 4 for the Media Matters process-evaluation questions from Step 3.
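
Because reach is a proportion, the arithmetic is easy to get wrong if the denominator (the size of the priority population) is not fixed before reporting. The sketch below computes per-session reach and the share of students attending at least one half of possible sessions (compare Table 2, question 10); the roster size and attendance figures are hypothetical:

    from collections import Counter

    PRIORITY_POPULATION = 300  # hypothetical number of eighth graders in the school
    # Hypothetical attendance: session name -> set of student IDs present.
    attendance = {
        "session_1": set(range(1, 271)),  # 270 students present
        "session_2": set(range(1, 241)),  # 240 students present
        "session_3": set(range(1, 261)),  # 260 students present
    }

    # Per-session reach: attendees divided by the priority population, not class size.
    for session, present in attendance.items():
        print(f"{session}: reach = {len(present) / PRIORITY_POPULATION:.0%}")

    # Overall reach: students attending at least one half of possible sessions.
    sessions_attended = Counter(sid for present in attendance.values() for sid in present)
    regulars = sum(1 for n in sessions_attended.values() if n >= len(attendance) / 2)
    print(f"attended at least half of sessions: {regulars / PRIORITY_POPULATION:.0%}")
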

> Process-Evaluation Methods


Because the process evaluation of the Media Matters program will serve both formative and summative purposes, one of the first issues to consider is the timing of data collection and reporting. This requires the planners to develop a data-collection, analysis, and reporting scheme that ensures the program planners receive formative feedback in a timely manner so that they can make adjustments to the program if necessary. Media Matters will also require a highly trained and coordinated data-collection and management staff. However, to maximize internal validity, evaluation staff will not perform both formative and summative assessments; an independent evaluation team that had no part in the intervention design or planning will be used for summative measures.
Potential process-evaluation methods (data sources, data-collection tools, and timing) for the Media Matters curriculum are
summarized below:
Implementation fidelity for Media Matters curriculum. Possible data sources and methods include reports from teachers imple-
menting the curriculum and Media Matters staff observation; both require developing a checklist of the expected characteristics
of implementation.
Dose delivered for Media Matters curriculum. Possible data sources and methods include reports from teachers implementing
the curriculum and Media Matters staff observation; both require developing a checklist of content to be covered and methods to
be used in the curriculum.
Dose received. Possible data sources include teachers, staff, administrators, and students in the school; methods and tools
include administering brief satisfaction scales and conducting interviews or focus groups with open-ended questions.
Reach. Data sources are the classes in which the Media Matters curriculum is taught; for each Media Matters session, the teacher
could write down and report the head count, have students sign in on a sign-in sheet, or check students’ names off on a class roll.
Recruitment. Media Matters staff document all activities involved in identifying and recruiting teachers for the Media Matters
curriculum training.
Context. Possible data sources include school teachers, staff, and administrators. The primary method and tool are interviews
with open-ended questions to assess barriers to implementation.

FIGURE 5 Media Matters Case Study—Step 4: Determine Methods for Process Evaluation

> Program Resources, Context, and Characteristics

The Media Matters program is somewhat complex in terms of size and researcher control of the intervention. The initial study will involve two treatment and two comparison middle schools, each having between 250 and 350 eighth graders. Some aspects of the program are standardized (e.g., the curriculum), whereas other aspects (e.g., the environmental component) are tailored to the specific school. One of the primary issues that program planners had to consider when prioritizing and developing the process-evaluation questions was project staff and respondent burden. Because the program is to be delivered within schools by instructors, training the instructors in the use of the curriculum is essential. Additionally, planners wanted to minimize the burden placed on these instructors in terms of collecting process-evaluation data or intruding on them a great deal during class times for observations. However, the planners wanted to ensure that the process data had a high level of reliability and validity. Finally, because of budget constraints, the project could devote only 1.5 FTE staff to process evaluation (coordination, data collection, management, entry, and analysis).

FIGURE 6 Media Matters Case Study—Step 5: Consider Program Resources, Context, and Characteristics

Step 4: Determine Methods for Process Evaluation

In Step 4, the team begins to consider the methods that will be used to answer each question in the wish list of process-evaluation questions. Considering methods without also considering the resources needed to carry out the process evaluation is difficult. Accordingly, Steps 4 and 5 are often considered simultaneously; however, for clarity of presentation, they are presented separately here.

Primary issues to consider in planning process-evaluation methods include design (when data are to be collected), data sources (where the information will come from), tools or measures needed to collect data, data-collection procedures, data-management strategies, and data-analysis or data-synthesis plans (see Table 3). Both qualitative and quantitative data-collection approaches are used in process evaluation. Common qualitative data-collection methods include open-ended questions in interviews and focus groups, logs, case studies, document review, open-ended surveys, and content analysis of videotapes; common quantitative methods include surveys, direct observation, checklists, attendance logs, self-administered forms and questionnaires, and project archives (Bartholomew et al., 2001; Devaney & Rossi, 1997; McGraw et al., 2000; Steckler & Linnan, 2002a). See Baranowski and Stables (2000) for a more detailed presentation of qualitative and quantitative aspects of data collection for each component of process evaluation.

The selection of specific methods is based on the process-evaluation questions (or objectives) and the resources available, as well as the characteristics and context of the program. Using multiple methods to collect data is recommended because different data sources may yield different conclusions (e.g., teacher report of classroom activity vs. observation of classroom activity; Bouffard et al., 2003; Helitzer et al., 2000; Resnicow et al., 1998). How the information will be used (for formative, immediate purposes or summative, longer-term purposes) and the turnaround time between data collection and reporting are critical issues to consider. Data that cannot be collected in a timely manner for their intended purpose may not need to be collected! Another issue concerns the level of objectivity needed for data collection and whether to use internal and/or external data collectors (Helitzer & Yoon, 2002). For example, although the same instruments or tools may be used for both purposes, different staff and reporting systems may be needed for summative and formative process evaluations.

Because process evaluation can generate voluminous amounts of information, planning data management and analysis is essential. Where do data go after they are collected? Who enters the data? What is the protocol for data entry? Having an analysis and reporting plan is equally important. What type of data analysis will be conducted (e.g., frequency counts, qualitative analysis, means)? Who analyzes the data? How long will data analysis take? When will summary reports be generated? Who receives summary reports? When are the reports needed? See Figure 5 for the preliminary process-evaluation methods for Media Matters from Step 4.
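
The data-management questions above (where do data go, who enters them, what is the entry protocol) can be pinned down in a short written intake routine. The sketch below is one hypothetical answer, assuming teachers submit weekly self-report checklists as simple records; every field name and range here is our invention for illustration:

    # Hypothetical weekly intake check run before data entry.
    REQUIRED_FIELDS = ("teacher_id", "week", "units_delivered")

    def check_report(report):
        """Return a list of problems found in one weekly report (empty list = accept)."""
        problems = [f"missing field: {f}" for f in REQUIRED_FIELDS if f not in report]
        if "units_delivered" in report and not 0 <= report["units_delivered"] <= 8:
            problems.append("units_delivered out of expected range (0-8)")
        return problems

    submitted = [
        {"teacher_id": "T01", "week": 3, "units_delivered": 2},
        {"teacher_id": "T02", "week": 3},  # incomplete report; flag for follow-up
    ]
    for report in submitted:
        print(report.get("teacher_id", "?"), check_report(report) or "OK")
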


TABLE 4
Final Process-Evaluation Plan for Media Matters Curriculum Implementation

Fidelity
Question 1: To what extent was the curriculum implemented as planned?
Data sources: Teachers and Media Matters staff.
Tools/procedures: Self-reported checklist and observation with checklist.
Timing of data collection: Teachers report weekly; at least two observations per teacher scheduled.
Data analysis or synthesis: Calculate a score based on the percentage of intended characteristics included.
Reporting: Formative—informal feedback to staff weekly; summative—summarized by module, school, and overall.

Dose delivered
Question 2: To what extent were all modules and units within the curriculum implemented?
Data sources: Teachers and Media Matters staff.
Tools/procedures: Self-reported checklist and observation with checklist.
Timing of data collection: Teachers report weekly; at least two observations per teacher scheduled.
Data analysis or synthesis: Calculate a score based on the percentage of intended modules and units included.
Reporting: Formative—informal feedback to staff weekly; summative—summarized by module, school, and overall.

Dose received
Questions 3 and 4: Did students enjoy the Media Matters curriculum and activities? Were Media Matters instructors satisfied with the curriculum?
Data sources: Teachers and students.
Tools/procedures: Focus groups with open-ended questions for teachers; brief satisfaction scales administered to students.
Timing of data collection: Focus groups and scales administered at the end of the semester in which the curriculum is implemented.
Data analysis or synthesis: Teachers—themes identified through qualitative analysis; students—response frequencies summarized.
Reporting: Summative—reported after curriculum implementation is completed.

Reach
Question 5: Was the intervention delivered to at least 80% of the eighth-grade students?
Data sources: Teachers.
Tools/procedures: Classroom rosters.
Timing of data collection: Taken for each class in which Media Matters is taught.
Data analysis or synthesis: Number of students participating in at least 80% of class sessions divided by the total number of students.
Reporting: Formative—report weekly by class; summative—report by module, school, and overall.

Recruitment
Question 6: What procedures were followed to recruit teachers to Media Matters training?
Data sources: Media Matters staff.
Tools/procedures: Media Matters staff document all recruitment activities.
Timing of data collection: Daily.
Data analysis or synthesis: Narrative description of procedures.
Reporting: Formative—examined weekly to prevent/solve problems; summative—described for the project overall.

Context
Question 7: What were barriers and facilitators to implementing the Media Matters curriculum?
Data sources: Teachers.
Tools/procedures: Focus groups with open-ended questions for teachers.
Timing of data collection: Administered at the end of the semester in which the curriculum was implemented.
Data analysis or synthesis: Themes identified through qualitative analysis.
Reporting: Summative—reported after curriculum implementation is completed.
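
Table 4's analysis column calls for a score "based on the percentage of intended characteristics included." A minimal sketch of that calculation is shown below; the characteristic labels and the choice to average across observation visits are hypothetical illustrations rather than part of the published plan:

    # Hypothetical fidelity scoring from observer checklists (see Table 4, question 1).
    INTENDED_CHARACTERISTICS = (
        "participatory delivery",
        "critical-thinking discussion",
        "role playing",
        "small-group teamwork",
    )

    def fidelity(checked):
        """Fraction of intended characteristics an observer checked off in one visit."""
        found = sum(1 for c in INTENDED_CHARACTERISTICS if c in checked)
        return found / len(INTENDED_CHARACTERISTICS)

    visits = [
        {"participatory delivery", "role playing", "small-group teamwork"},
        {"participatory delivery", "critical-thinking discussion"},
    ]
    scores = [fidelity(v) for v in visits]
    print(f"mean fidelity across observations: {sum(scores) / len(scores):.0%}")  # mean of 0.75 and 0.50
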
> The Final Process-Evaluation Plan

After carefully weighing all the potential process questions that the program planners would ideally ask against the complexity of a program with limited resources, the team decided to focus on fidelity of implementation of the curriculum, reach of the curriculum into the student population, dose delivered (extent of curriculum implementation), dose received (satisfaction among students and teachers), and barriers to implementation (context).

The process-evaluation methods (data sources, tools/procedures, timing of data collection, data analysis/synthesis, and reporting) are summarized in Table 4. For each process-evaluation question, the most appropriate data sources have been identified, and the instruments and methods used to collect those data are outlined. For example, Media Matters program planners determined that the most appropriate method of assessing program fidelity (answering the question, "Was the program implemented as planned?") was to use a combination of periodic observations by evaluation staff during program implementation and reports collected from the instructors implementing the program. The decision to conduct periodic observations necessitated the development of an observation instrument (checklist) and a protocol for adequately evaluating the implementation sessions. To ensure the reliability and validity of the evaluation instruments, an evaluation protocol was developed and outlined during training sessions delivered to the evaluation staff. Additionally, the evaluation instruments were pilot-tested prior to actual implementation of the program.

FIGURE 7 Media Matters Case Study—Step 6: Finalize the Process-Evaluation Plan

Step 5: Consider Program Resources, Program Characteristics, and Context

In Step 5, the team considers the resources needed to answer the potential process-evaluation questions listed in Step 3 using the methods proposed in Step 4. More than likely, the team has already begun to consider resource limitations even as the wish list of questions is growing! In addition to resources such as skilled staff, the team needs to consider the characteristics of the program itself. Programs that are longer, more complex, meet frequently, and/or have large numbers of participants will require more resources, including time, to implement and to monitor. Certain settings (e.g., schools) may have scheduling and other constraints. If the intervention involves working collaboratively with a number of organizations or groups, complexity increases.

Resource considerations include the availability of qualified staff to develop and implement all aspects of the process evaluation as well as the time needed for planning, pilot-testing instruments and protocols, data collection, entry, analysis, and reporting. Program planners must also consider the feasibility of process data collection within the context of the intervention (e.g., disruptiveness to the intervention or the organization's regular operations) as well as staff and respondent burden. If new data-collection tools are needed, consider the time, staff, and level of expertise required. Most projects find it necessary to prioritize and reduce the number of process-evaluation questions to accommodate project resources. See Figure 6 for the Media Matters analysis of program resources, characteristics, and context in Step 5.

Step 6: Finalize the Process-Evaluation Plan

The final process-evaluation plan emerges from the iterative team-planning process described in Steps 3 to 5. Although priorities for process evaluation will vary in different projects, Steckler and Linnan (2002a) recommended a minimum of four elements of process evaluation: dose delivered, dose received, reach, and fidelity. They also recommended documenting recruitment procedures and describing the context of the intervention. In the final plan, the process-evaluation questions should be described for each component of the intervention and may vary somewhat by component. See Figure 7 and Table 4, which illustrate Step 6 for the Media Matters curriculum component. Table 4 provides a template for a final process-evaluation plan.

> CONCLUSIONS

The components of and steps involved in process-evaluation planning have been identified in previous studies. After considering issues concerning process-evaluation purposes, process-evaluation methods, and the context in which health-promotion programs are implemented, we present a six-step approach to planning process evaluation, illustrated with a fictional case study. This description of how to plan process evaluation can provide further guidance for practitioners who wish to develop comprehensive process-evaluation plans for health-promotion programs.

REFERENCES

Bandura, A. (1986). Social foundations of thought and action: A social cognitive theory. Englewood Cliffs, NJ: Prentice Hall.

Baranowski, T., & Stables, G. (2000). Process evaluations of the 5-a-day projects. Health Education & Behavior, 27, 157-166.

Bartholomew, L. K., Parcel, G. S., Kok, G., & Gottlieb, N. H. (2001). Intervention mapping: Designing theory and evidence-based health promotion programs. New York: McGraw-Hill.

Bouffard, J. A., Taxman, F. S., & Silverman, R. (2003). Improving process evaluations of correctional programs by using a comprehensive evaluation methodology. Evaluation and Program Planning, 26(2), 149-161.

Butterfoss, F., & Kegler, M. (2002). Toward a comprehensive understanding of community coalitions: Moving from practice to theory. In R. DiClemente, L. Crosby, & M. C. Kegler (Eds.), Emerging theories in health promotion practice and research (pp. 157-193). San Francisco: Jossey-Bass.

Butterfoss, F. D., & Francisco, V. T. (2004). Evaluating community partnerships and coalitions with practitioners in mind. Health Promotion Practice, 5(2), 108-114.

Chinman, M., Imm, P., Wandersman, A., Kaftarian, S., Neal, J., Pendleton, K. T., et al. (2001). Using the Getting To Outcomes (GTO) model in a statewide prevention initiative. Health Promotion Practice, 2(4), 302-309.

Dehar, M. A., Casswell, S., & Duignan, P. (1993). Formative and process evaluation of health promotion and disease prevention programs. Evaluation Review, 17(2), 204-220.

Devaney, B., & Rossi, P. (1997). Thinking through evaluation design options. Children and Youth Services Review, 19(7), 587-606.

Dusenbury, L., Brannigan, R., Falco, M., & Hansen, W. B. (2003). A review of research on fidelity of implementation: Implications for drug abuse prevention in school settings. Health Education Research, 18(2), 237-256.

Harachi, T. W., Abbott, R. D., Catalano, R. F., Haggerty, K. P., & Fleming, C. B. (1999). Opening the black box: Using process evaluation measures to assess implementation and theory building. American Journal of Community Psychology, 27(5), 711-731.

Helitzer, D., & Yoon, S. (2002). Process evaluation of the Adolescent Social Action Program in New Mexico. In A. Steckler & L. Linnan (Eds.), Process evaluation for public health interventions and research (pp. 83-113). San Francisco: Jossey-Bass.

Helitzer, D., Yoon, S., Wallerstein, N., & Garcia-Velarde, L. D. (2000). The role of process evaluation in the training of facilitators for an adolescent health education program. Journal of School Health, 70(4), 141-147.

Linney, J. A., & Wandersman, A. (1991). Prevention Plus III: Assessing alcohol and other drug prevention programs at the school and community level: A four-step guide to useful program assessment. Washington, DC: U.S. Department of Health and Human Services.

McGraw, S. A., Stone, E. J., Osganian, S. K., Elder, J. P., Perry, C. L., Johnson, C. C., et al. (1994). Design of process evaluation within the Child and Adolescent Trial for Cardiovascular Health (CATCH). Health Education Quarterly, Suppl. 2, S5-S26.

McGraw, S. A., Sellers, D. E., Stone, E. J., Bebchuk, J., Edmundson, E. W., Johnson, C. C., et al. (1996). Using process evaluation to explain outcomes: An illustration from the Child and Adolescent Trial for Cardiovascular Health (CATCH). Evaluation Review, 20(3), 291-312.

McGraw, S. A., Sellers, D. E., Stone, E. J., Resnicow, K., Kuester, S., Fridinger, F., et al. (2000). Measuring implementation of school programs and policies to promote healthy eating and physical activity among youth. Preventive Medicine, 31, S86-S97.

Millstein, B., Wetterhall, S., & CDC Evaluation Working Group. (2000). A framework featuring steps and standards for program evaluation. Health Promotion Practice, 1(3), 221-228.

Resnicow, K., Davis, M., Smith, M., Lazarus-Yaroch, A., Baranowski, T., Baranowski, J., et al. (1998). How best to measure implementation of school health curricula: A comparison of three measures. Health Education Research, 13(2), 239-250.

Scheirer, M. A., Shediac, M. C., & Cassady, C. E. (1995). Measuring the implementation of health promotion programs: The case of the breast and cervical cancer program in Maryland. Health Education Research, 10(1), 11-25.

Steckler, A., & Linnan, L. (2002a). In A. Steckler & L. Linnan (Eds.), Process evaluation for public health interventions and research (pp. 1-24). San Francisco: Jossey-Bass.

Steckler, A., & Linnan, L. (Eds.). (2002b). Process evaluation for public health interventions and research. San Francisco: Jossey-Bass.

Viadro, C., Earp, J. A. L., & Altpeter, M. (1997). Designing a process evaluation for a comprehensive breast cancer screening intervention: Challenges and opportunities. Evaluation and Program Planning, 20(3), 237-249.

Zapka, J., Goins, K. V., Pbert, L., & Ockene, J. K. (2004). Translating efficacy research to effectiveness studies in practice: Lessons from research to promote smoking cessation in community health care centers. Health Promotion Practice, 5(3), 245-255.