Guidance

TASK 4: Implement the Evaluation Plan

Your evaluation plan should address questions related to both process and outcomes.

A process evaluation monitors and measures your implementation activities, program operations, and service delivery. It addresses the consistency between your activities and goals, whether your activities reached the appropriate target audience, the effectiveness of your management, your use of program resources, and how your group functioned.

Process evaluation questions may include the following:

  • Were you able to involve the members and sectors of the community that you intended to at each step of the way? In what ways were they involved?
  • Did you conduct an assessment of the situation in the way you planned? Did it give you the information you needed?
  • How successful was your group in selecting and implementing appropriate strategies? Were these the “right” strategies, given the intervening variables you identified?
  • Were staff and/or volunteers the right people for the jobs, and were they oriented and trained before they started?
  • Was your outreach successful in engaging those from the groups you intended to engage?
  • Were you able to recruit the number and type of participants needed?
  • Did you structure the program as planned? Did you use the methods you intended? Did you arrange the amount and intensity of services, other activities, or conditions as intended?
  • Did you conduct the evaluation as planned?
  • Did you complete or start each element in the time you planned for it? Did you complete key milestones or accomplishments as planned?

An outcome evaluation looks at the ultimate impact of your intervention—its effect on the environmental conditions, events, or behaviors it aimed to change (whether to increase, decrease, or sustain). An intervention generally seeks to influence one or more particular behaviors or conditions (e.g., risk or protective factors), assuming that this will then lead to a longer-term change, such as a decrease in the use of a particular drug among youth.

An outcome evaluation can be done in various ways:

  • The “gold standard” involves two groups that are similar at baseline. One group receives the intervention and the other group serves as the control group. After the intervention, the outcomes for each group are compared. Ideally, you’ll continue to collect data after the intervention ends to estimate its effects over time.
  • If it’s not possible to have a control group, collect data from the intervention group at several points before, during, and after the intervention (e.g., at 3-, 6-, and 12-month intervals). This allows you to analyze any trends before the intervention, project what would have happened without it, and compare that projection to the actual trend after the intervention.

Note: This type of impact evaluation is less conclusive than one using a control group comparison because it does not allow you to rule out other possible explanations for any changes you may find. However, having some supporting evidence is better than not having any.
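The trend-projection approach described above can be sketched numerically. The following is a minimal illustration using entirely hypothetical data (the rates, time points, and function names are not from the source): a linear trend is fitted to the pre-intervention measurements, extended forward, and compared against what was actually observed after the intervention.

```python
# Illustrative sketch of trend projection (hypothetical data only).
# Fit a linear trend to pre-intervention measurements, project it forward,
# and compare the projection with the observed post-intervention values.

def fit_linear_trend(xs, ys):
    """Ordinary least-squares fit; returns (slope, intercept)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    return slope, mean_y - slope * mean_x

# Hypothetical rates of the targeted behavior, measured every 3 months.
pre_months  = [0, 3, 6, 9]            # before the intervention
pre_rates   = [24.0, 25.1, 26.2, 27.0]
post_months = [12, 15, 18]            # after the intervention
post_rates  = [25.5, 24.2, 23.1]

slope, intercept = fit_linear_trend(pre_months, pre_rates)

for month, observed in zip(post_months, post_rates):
    projected = slope * month + intercept  # what the pre-trend predicts
    print(f"month {month}: projected {projected:.1f}, "
          f"observed {observed:.1f}, difference {observed - projected:+.1f}")
```

A widening gap between the projected and observed values suggests (but, as the note above cautions, does not prove) that the intervention contributed to the change, since other factors could also explain the departure from the trend.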

You may have followed your plan completely and still had no impact on the conditions you were targeting, or you may have made multiple changes to the program or strategy and still reached your desired outcomes. The process evaluation will tell you how closely you followed your plan, and the outcome evaluation will show whether your strategy produced the changes or results you intended.

If the intervention produced the outcomes you intended, then it achieved its goals. However, it’s still important to consider how you could make the intervention even better and more effective. For instance:

  • Can you expand or strengthen the parts of the intervention that worked particularly well?
  • Are there evidence-based methods or best practices that could make your work even more effective?
  • Would targeting more or different behaviors or intervening variables lead to greater success?
  • How can you reach people who dropped out early or who didn’t really benefit from your work?
  • How can you improve your outreach? Are there marginalized or other groups you are not reaching?
  • Can you add services—either directly aimed at intervention outcomes, or related services such as transportation—that would improve results for participants?
  • Can you improve the efficiency of your process, saving time and/or money without compromising your effectiveness or sacrificing important elements of your intervention?

Keep in mind that good interventions are dynamic; they keep changing and experimenting, always striving to improve.

Tool
MOAPC Planning Tool