40 Conducting and Reporting the Evaluation

As you prepare to implement your program, there are a few critical steps to consider related to the evaluation plan, which are outlined in this section:

  • Finalizing data types and sources
  • Identifying and operationalizing variables

Finalizing Data Types and Sources

A strong evaluation will often include two types of data: primary and secondary.

Secondary data is data you already have access to. It is typically used to provide evidence of the community’s strengths and issues and to identify trends, and it provides context and an opportunity for comparison. It should never be relied upon to present the entire picture of a problem and point to a solution, although it supplies a very important part of that picture and helps illuminate the evidence for community change.

Primary data is new data, gathered by you to contribute to your assessment of the community and your understanding of an issue.

These types of data are described in more detail in the earlier chapter Assessing Community.

Since one of the important goals of the evaluation is to identify any impacts of the intervention that were not anticipated, it is important to include a data source that will uncover this information. This will usually mean some type of open-ended, qualitative data collection, since you cannot anticipate exactly what the responses will be. For example, you could include survey questions such as:

  • Please describe other ways that this program impacted you, either positively or negatively.
  • Do you have other ideas for how we should improve this program? If so, please share.

Common sources of primary data:

  • focus groups
  • questionnaires
  • interviews
  • pre-post tests

Uncommon but meaningful sources of primary data:

  • observations (see example below)
  • document review
  • stories

Common sources of secondary data:

  • case notes
  • ongoing data collection
  • goal progress reports

You are encouraged to be creative in your data collection and to identify the sources of data that will provide the most meaningful picture of the program experience.

Example observation written as a vignette, from YWCA Duluth’s Spirit Valley Young Mothers Program:

What we saw was remarkable. Six young women shared their insights about the Expansion project while demonstrating unconditional love and concern for each other. Nothing short of that.

The women hugged each other’s children, laughed at each other’s jokes, and spoke up when someone’s child needed to be redirected. Two women held infants on their laps, one of whom was not her son. He looked curiously at her shirt which read, “I Love Consensual Sex.”

One of the women who had minimal contact for many months interacted with the others as if they’d seen each other just yesterday. The other women were excited to see her children, commenting on how much they’ve grown since they last saw them. The bond she has developed with the other women has clearly passed the test of time.

In addition to answering questions, they shared some stories with us about things they have done in the last year and their plans to change the world.

These women are clearly empowered, they are young, they are brave, they are nurturing, and they are compassionate.[1]

Identifying and Operationalizing Your Variables

Each of the factors that you have identified in your outcome indicators is considered a variable. It is helpful to go through and underline each of these so that you are very clear about what you need to measure.

The next step is to operationalize each variable by defining how you will measure it and what you will look for. For example, if the indicator is “participants report increased confidence in parenting,” the variable “confidence in parenting” might be operationalized as a self-rating on a five-point scale collected at program entry and again at follow-up.
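
If you track your measures in a spreadsheet or database, it can also help to record these operational definitions as a simple “codebook.” Below is a minimal sketch in Python of what such a codebook might look like; every variable name, instrument, and scale here is hypothetical:

```python
# A minimal, hypothetical codebook: each evaluation variable is paired
# with an operational definition -- the instrument, the scale, and when
# it is collected -- so data collectors know exactly what to measure.
codebook = {
    "parenting_confidence": {
        "instrument": "participant self-rating survey",
        "measure": "1-5 scale (1 = not at all confident, 5 = very confident)",
        "collected": ["program entry", "six-month follow-up"],
    },
    "program_attendance": {
        "instrument": "staff attendance log",
        "measure": "sessions attended out of sessions offered",
        "collected": ["ongoing"],
    },
}

# Print a quick reference sheet for data collectors.
for variable, definition in codebook.items():
    print(f"{variable}: {definition['instrument']} -- {definition['measure']}")
```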

Considerations for Successful Data Collection

Evaluation questions and methods have to be meaningful, relevant, and inclusive

  • As you get ready to implement your evaluation plan, pause to ask yourself whether each aspect of the plan will have meaning to the stakeholders, and whether it is culturally relevant, inclusive, and accessible (for example, the language should be plain, and every perspective that is needed should be included).

    Example: Participant survey at an Indigenous Foods Expo[2]

    [Figure: A one-page participant survey from an Indigenous Foods Expo. The top box asks respondents to “Rank your excitement about Indigenous foods” by circling one of ten blueberries, numbered 1 (not excited) through 10 (very excited). The remaining boxes ask: “What is your age group?”, “What’s your home community?”, “How many vendors did you visit?”, “What activities did you participate in or go to?”, “How did you hear about this event?”, and “Has your interest in buying Indigenous foods grown after attending this event?”]

Secure commitments for gathering data

  • Communicate with those you are collecting data from before you begin your activities, particularly the program participants. They should be aware that evaluation is a part of the process.
  • Secure written commitments as necessary.

Create protocols for collecting and storing the data

  • Often, staff will be the ones to gather the evaluation data. They should be clear on the expectations for how this is done so that they do not unduly influence the results.
  • Consider the ethics of gathering data when you are asking people to be honest about how they have been impacted by your work (positively or negatively). This needs to be done with great sensitivity.
  • The protocols should outline:
    • the message to participants when the evaluation is conducted
    • the importance of the data
    • how the information will be stored
    • how the information will be used

Train your data collectors and pilot the process

  • Train your data collectors on the protocols, then pilot-test your collection processes to see whether the questions and methods are clear.

Be very careful about the anonymity and confidentiality of stakeholders

  • The need to protect the anonymity and confidentiality of stakeholders during an evaluation often depends on the number of people you serve and the number of staff you employ. People are often reluctant to be honest in their assessment of a program unless they are assured that their comments will be protected and cannot be traced back to them.
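
When responses are stored electronically, one practical way to honor that assurance is to separate identifying information from the responses before anyone analyzes them. The sketch below is one possible approach, not a prescribed method; the field names and file names are hypothetical. It assigns each participant a random code, writes the code-to-name link to a separate, access-restricted file, and keeps only the coded responses for analysis:

```python
import csv
import secrets

def deidentify(responses, link_path, data_path):
    """Split raw survey responses into a restricted linking file
    (code -> name) and a de-identified data file (code -> answers)."""
    with open(link_path, "w", newline="") as link_file, \
         open(data_path, "w", newline="") as data_file:
        link_writer = csv.writer(link_file)
        data_writer = csv.writer(data_file)
        link_writer.writerow(["code", "name"])
        data_writer.writerow(["code", "response"])
        for person in responses:
            code = secrets.token_hex(4)  # random 8-character code
            link_writer.writerow([code, person["name"]])
            data_writer.writerow([code, person["response"]])

# Hypothetical example: one participant's open-ended response.
deidentify(
    [{"name": "A. Participant", "response": "The program helped me a lot."}],
    link_path="linking_file_restricted.csv",  # stored separately, limited access
    data_path="responses_deidentified.csv",   # shared with whoever analyzes data
)
```

Destroying the linking file once the evaluation is complete makes the remaining data fully anonymous.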

Reporting and Sharing Evaluation Results

An evaluation is only valuable if you actually use the information that you have gathered. This may seem obvious, but organizations will occasionally conduct evaluations because they are a requirement and then fail to consider the recommendations or ideas that are shared for improving or changing their services. This undermines the feedback loop, is likely unethical, and runs counter to what stakeholders expect from an organization.

Prior to sharing the results, however, it is necessary to write them into a report so that you can easily share and communicate the findings with others. Here is a simple outline for report writing:

  • Background: A short history of the organization and the program that is being evaluated.
  • Summary of evaluation findings: This section mirrors the abstract of an article, with two to three paragraphs highlighting the evaluation methods, results, and conclusions. For some people, this is the only part they will read, so it should still be detailed enough to be informative.
  • Evaluation Methods: A lengthier description of the evaluation (people involved, types of data collected, timeline, etc.).
  • Evaluation Results: This section provides detailed tables and descriptions of the qualitative and quantitative results from the evaluation, mirroring the evaluation plan you put together. Methods for analyzing the data should be included (a minimal pre/post summary is sketched after this list). Make sure that you include results of both anticipated and unanticipated outcomes.
  • Evaluation Conclusions and Recommendations: This is one of the most important sections because it includes the conclusions from the result analysis and recommendations for program changes (if any). This should be relevant to all stakeholders of the program and organization (program participants, staff, volunteers, board of directors, funders, etc.).
  • Attachments: Attach copies of any instruments (surveys, interview questions, etc.) that were used in the evaluation, as well as data tables and summary notes from interviews or other primary data.
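
As one illustration of the analysis methods mentioned in the Evaluation Results item above, a pre- and post-test comparison can be summarized in a few lines. This sketch uses made-up scores on the hypothetical five-point confidence scale from earlier; it is an example of a simple analysis, not a required one:

```python
from statistics import mean

# Hypothetical pre- and post-test scores for the same six participants,
# e.g., 1-5 self-ratings of parenting confidence.
pre_scores = [2, 3, 2, 4, 3, 2]
post_scores = [4, 4, 3, 5, 4, 3]

# Per-participant change, then simple summary figures for the report.
changes = [post - pre for pre, post in zip(pre_scores, post_scores)]
print(f"Mean pre-test score:    {mean(pre_scores):.2f}")
print(f"Mean post-test score:   {mean(post_scores):.2f}")
print(f"Mean change:            {mean(changes):+.2f}")
print(f"Participants improving: {sum(c > 0 for c in changes)} of {len(changes)}")
```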

Sharing Evaluation Results

Three primary groups should receive the results of an Outcome Evaluation:

  1. Stakeholders who participated in the evaluation: This could be staff, program participants, community partners, volunteers, etc. Sharing the results communicates respect and honors their participation. Plus, if they agreed to participate, they are likely interested in the findings and invested in the outcome.
  2. Anyone involved in reviewing and implementing recommendations: Every outcome evaluation will result in some level of recommendations, and these should be reviewed by the staff, volunteers, board of directors, and advisory councils who would be involved in implementing them. Sharing the results will help communicate the rationale behind the recommendations and help secure their buy-in.
  3. People and organizations who funded or invested in your program: All funders of your work should receive a copy of the evaluation. They clearly care about the work you are doing, the problems, and the community conditions you are working on, so they will be curious about how the work is going. They will not expect you to be perfect in your work, but they will expect you to be transparent about what is working, what is not, and, most importantly, what you intend to do differently. Typically, conducting and sharing the evaluation results is a requirement for receiving funding.

The easiest way to share the outcome evaluation results is to send the report to all the relevant stakeholders listed above. I strongly encourage you to follow up with a conversation about the findings and recommendations so that it is clear to stakeholders that the results are being heard and considered.


  1. YWCA Duluth. Outcome Evaluation (2009). https://www.ywcaduluth.org/
  2. American Indian Community Housing Organization. www.aicho.org
