By providing resources and technical assistance, the Wisconsin Partnership Program supports grantees’ evaluation efforts using methods that are most appropriate for each project.

The Partnership Program believes evaluation should: 

  • be based on a community-informed and evidence-based project theory that describes how the project is expected to work
  • engage communities in decision-making throughout the evaluation process
  • commit to measuring outcomes from the beginning of the program
  • provide timely feedback to stakeholders for process improvement
  • only collect data that will be used

Aligning around a theory to improve planning and evaluation

An initial step in determining whether an initiative can be evaluated is to assess whether the work is grounded in a thorough understanding of which activities are expected to lead to which outcomes. One approach adopted across the Partnership Program’s funding mechanisms is the Theory of Change: a visual diagram, created through a participatory process, that displays which conditions are needed, step by step, for goals to be achieved.

The key to creating a thorough Theory of Change is gathering the program’s key stakeholders to identify and discuss their assumptions about how the program will achieve its goals, so that everyone is aligned and working toward the same ends. Once the program is adequately theorized, an evaluation plan based on the Theory of Change can be completed. As initiatives or projects unfold and additional evaluative information emerges, the Theory of Change will need revision and redirection.

Validity demands cultural competence

Evaluation relies on perspective to assign meaning to data, and every perspective is shaped by experience and cultural norms. Projects should therefore develop evaluations in partnership with communities and people with lived experience at every step, engaging them in decision-making about evaluation design, data collection, analysis and reporting. This approach shares power with those who are too often left out and helps sustain change.

Commit to measuring outcomes

The Cambridge-Somerville Youth Study is one example of a project that aimed to measure outcomes from the beginning of the study. The program was intended to prevent crime by providing summer camp opportunities, counselors and other resources to at-risk boys. The evaluation found that the program actually had a negative effect on the boys’ health and life outcomes compared with boys who were not provided those opportunities.

This example demonstrates the importance of an evaluation design built to measure outcomes. The graphic below illustrates that the strongest evaluation designs include a comparison. While budget, time and resources may make an intensive, rigorous evaluation infeasible, projects should avoid conducting only a post-test with no comparison (No. 7). Any of designs Nos. 1-6 provides a comparison, allowing a project to demonstrate the degree to which it changed an outcome.

Illustration of more (No. 1) to less (No. 7) rigorous evaluation designs. Adapted from realworldevaluation.org
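
To see why a comparison matters, consider a hypothetical example; the numbers below are illustrative only and do not come from any Partnership-funded project. Suppose a project tracks the share of participants meeting a health goal before and after programming:

  • Program participants: 40% before, 60% after (a 20-point gain)
  • Comparison group: 40% before, 50% after (a 10-point gain)

A post-test alone (No. 7) would show only that 60% of participants met the goal, with no way to tell how much of that is due to the program. With the comparison, the program’s contribution is closer to 20 − 10 = 10 percentage points, because some improvement would have occurred even without the program.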

Provide timely feedback for process improvement

Health is determined by the interplay of many systems and factors, so solutions are necessarily complex. When grantee projects aim to solve complex health problems, they need an evaluation approach that allows project leaders to quickly correct course or change direction, so the program can realign and increase its effectiveness. Analysis of evaluation data should be timed so that findings are ready when decisions need to be made, and measurement strategies should be adjusted as the work unfolds rather than locked to pre-determined strategies or tools.

Only collect data that will be used

“If you don’t need to know, it’s gotta go.” Evaluation data should inform decision-making and the direction of the work, or otherwise be useful to project stakeholders. If the data is not being used for a purpose, it’s gotta go. In short, evaluation processes should produce useful information and avoid collecting information that will not be used.

Resources for individuals who are newer to evaluation

Evaluation is a systematic process of collecting data and evidence to inform program planning. There are many great resources for individuals who want to learn more about evaluation, and for more intensive help, the Partnership Program offers technical assistance as needed.

Share your resources or receive evaluation assistance

We’ve heard from our grantees that many want support in conducting evaluations. The Partnership Program provides technical assistance as requested. In addition, we want to hear about and learn from your evaluation successes. Please share tools or evaluation reports you think may be useful to us or to other grantees.

Contact Partnership Program evaluator Kate Westaby at kwestaby@wisc.edu.

If you’re looking for Wisconsin’s professional evaluation organization, see ¡Milwaukee Evaluation! Inc.