Guidance

TASK 4: Implement the Evaluation Plan

Your evaluation plan should address questions related to both process (i.e., program operations, implementation, and service delivery) and outcomes (the ultimate impact of your intervention).

Process Evaluation

A process evaluation monitors and measures your activities and operations. It addresses such issues as the consistency between your activities and your goals, whether activities reached the intended target audience(s), how effectively you managed and used program resources, and how well your group functioned.


Process evaluation questions may include the following:

·       Were you able to involve the members and sectors of the community that you intended to at each step of the way? In what ways were they involved?

·       Did you conduct an assessment of the situation in the way you planned? Did it give you the information you needed?

·       How successful was your group in selecting and implementing appropriate strategies? Were these the “right” strategies, given the intervening variables you identified?

·       Were staff and/or volunteers the right people for the jobs, and were they oriented and trained before they started?

·       Was your outreach successful in engaging those from the groups you intended to engage? Were you able to recruit the number and type of participants needed?

·       Did you structure the program as planned? Did you use the methods you intended? Did you arrange the amount and intensity of services, other activities, or conditions as intended?

·       Did you conduct the evaluation as planned?

·       Did you complete or start each element in the time you planned for it? Did you complete key milestones or accomplishments as planned?

Outcome Evaluation

An outcome evaluation looks at the intervention’s effect on the environmental conditions, events, or behaviors it aimed to change (whether to increase, decrease, or sustain them). Usually, an intervention seeks to influence one or more particular behaviors or conditions (e.g., risk or protective factors), on the assumption that this will lead to a longer-term change, such as a decrease in the use of a particular drug among youth. You may have followed your plan completely and still have had no impact on the conditions you were targeting, or you may have made multiple changes along the way and still reached your desired outcomes. The process evaluation will tell you how closely you followed your plan; the outcome evaluation will show whether your strategy produced the changes you intended.


An outcome evaluation can be done in various ways:

·       The “gold standard” involves two groups that are similar at baseline. One group is assigned to receive the intervention and the other group serves as the control group. After the intervention, the outcomes among the intervention group are compared with the outcomes among the control group. Ideally, data should continue to be collected after the intervention ends in order to estimate effects over time.

·       If it is not possible to include a control group (e.g., due to financial constraints), you can evaluate just the intervention group, collecting data at several points before, during, and after the intervention (e.g., at 3-, 6-, and/or 12-month intervals). This design allows the evaluator to analyze any trends before the intervention and to project what would have happened without it, so that the projection can be compared to the actual trend after the intervention. This type of outcome evaluation is less conclusive than one using a control group because it does not allow you to rule out other possible explanations for any changes you find. However, some supporting evidence is better than none.


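The single-group design described above rests on fitting a trend to the baseline data, projecting it forward, and comparing the projection to what was actually observed after the intervention. The sketch below illustrates that comparison with a simple least-squares trend line; the measure, time points, and all data values are invented for illustration only.

```python
# Minimal sketch of a single-group (pre/post trend projection) comparison.
# All figures are hypothetical, not drawn from any real evaluation.

def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept for a simple trend line."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    return slope, mean_y - slope * mean_x

# Baseline measurements (e.g., % of surveyed youth reporting past-30-day use),
# taken at several points before the intervention begins.
pre_months = [0, 3, 6, 9]
pre_values = [18.0, 18.6, 19.1, 19.9]

slope, intercept = linear_fit(pre_months, pre_values)

# Project the baseline trend into the post-intervention period and
# compare it with the measurements actually collected.
post_months = [12, 15, 18]
projected = [slope * m + intercept for m in post_months]
observed = [19.5, 18.8, 18.1]  # hypothetical post-intervention data

for m, p, o in zip(post_months, projected, observed):
    print(f"month {m}: projected {p:.1f}, observed {o:.1f}, "
          f"difference {o - p:.1f}")
```

If the observed values fall consistently below the projected baseline trend, that is supporting (though not conclusive) evidence of an effect, since, as noted above, this design cannot rule out other explanations for the change.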
If the intervention produced the outcomes you intended, then it achieved its goals. However, it is still important to consider how you could make the intervention even better and more effective. For instance:

·       Can you expand or strengthen parts of the intervention that worked particularly well?

·       Are there evidence-based methods or best practices out there that could make your work even more effective?

·       Would targeting more or different behaviors or intervening variables lead to greater success?

·       How can you reach people who dropped out early or who didn’t really benefit from your work?

·       How can you improve your outreach? Are there marginalized or other groups you are not reaching?

·       Can you add services—either directly aimed at intervention outcomes, or related services such as transportation—that would improve results for participants?

·       Can you improve the efficiency of your process, saving time and/or money without compromising your effectiveness or sacrificing important elements of your intervention?

Good interventions are dynamic; they keep changing and experimenting, always reaching for something better.

Tool
SAPC Planning Tool