Toolkit for Community Organizers: Steps for Addressing Chlamydia in Your Community - Step 7: Evaluate the Processes - Minnesota Dept. of Health

Step 7: Evaluate the Processes


  • Evaluate your project in several areas to identify what was successful and what could be adjusted

Why evaluate? Good evaluation establishes clear links between past, present and future initiatives and results. It helps organizations extract relevant information from their activities to use as the basis for programmatic fine-tuning, reorientation and future planning, and it helps them judge whether the work is going in the right direction, whether progress and success can be claimed, and how future efforts might be improved.

Planning to evaluate

Planning for evaluation must start at the time of program or project design; ideally, the program and its evaluation are planned together. Proper planning includes clear articulation of intended results, which makes clear what should be tracked during the program (processes) and what should be measured at baseline, at specific intervals and/or at the end of the program (outcomes). Outcome evaluation is discussed in Step 9: Evaluate Outcomes. Good planning includes setting goals, developing strategies to achieve them, outlining the implementation arrangements for programming and evaluation, and allocating resources accordingly. The availability of clearly defined results and a theory of how to obtain them helps determine the ‘evaluability’ of the subject to be evaluated.

Process evaluation

Process evaluation is the real-time gathering of information to obtain regular feedback on progress toward the goals and objectives of the program. It answers questions such as “Are we taking the actions, at the level of participation, we said we would?” and “Are we making progress toward the results we said we wanted to achieve?” Process evaluation can also serve to validate the logic of a program. It may track such things as attendance, satisfaction and fidelity, with the aim of determining whether changes are needed to achieve intended results.

Determining what to evaluate

Most programs have more to evaluate than they have resources to evaluate it with. Thus, determining what to evaluate is a key step in evaluation planning. Defining an evaluation purpose helps set the direction.
Ask:

  • Why is the evaluation being conducted at that particular point in time?
  • Who will use the information?
  • What will the information be used for?

Then, setting the scope narrows the focus further by establishing boundaries for what the evaluation will and will not cover in meeting its purpose. It is better to answer fewer questions robustly than to answer more questions superficially. A clear and concise set of the most relevant evaluation questions increases the likelihood that an evaluation will be focused, manageable, cost-efficient and useful.

Who should perform evaluation?

The question of who should perform the evaluation depends on the evaluation question. Evaluations are commonly collaborative efforts. At times it is of utmost importance to have a lead evaluator who is seen as independent and objective; this is usually someone from outside the program or organization. However, because robust process evaluation can greatly improve the quality of outcome evaluations, staff members often share responsibility for data collection. Whoever performs the evaluation, staff entrusted with process or outcome evaluation duties should have the required technical expertise in the area.

Much of the language above is adapted from the Handbook on Planning, Monitoring and Evaluating for Development Results, United Nations Development Programme, 2009.

Types of process evaluation (1)

Each type of evaluation is listed with commonly used methods for data collection:

Program planning
  • Key informant interviews
  • Community forum
  • Rates under treatment
  • Social indicators
  • Surveys, reports and census data

Program monitoring: target population
  • Documentation of services offered/provided
  • Survey of program participants
  • Survey of community members (participants, drop-outs and eligible non-participants)

Program monitoring: delivery of services
  • Observational data
  • Documentation of services
  • Information about service providers
  • Participant information


Kandiyohi County & CHAS

The University of Minnesota’s Healthy Youth Development Prevention Research Center (PRC) was contracted to complete a program evaluation of the CHAS demonstration project. The partnership between Kandiyohi County Public Health (KCPH) and the PRC served to:

  • determine evaluation questions and develop evaluation plan
  • develop process and outcome tools
  • implement evaluation plan and tools
  • analyze process and outcome data
  • write a report
  • contribute to Community Organizer’s Toolkit

The goal of this evaluation was to assist CHAS in making timely project improvements, increasing project functioning and performance, planning future activities, and sharing lessons learned and emerging best practices.

Process evaluation questions, with the tools used to answer them:

  1. Did CHAS convene a strategy group to establish a work plan related to the Minnesota Chlamydia Strategy?
    1. Who participated? - Strategy Group attendance log (PDF)
    2. Was a work plan created?
    3. Were strategy group members satisfied with their involvement in the group? - Strategy Group post-test survey & survey summary report (from Evaluation Report Appendix B) (PDF)
  2. Did CHAS implement provider education? - Health Care Provider Survey
    1. Were provider packets created and distributed?
    2. Were participating providers satisfied? - CHAS Speaker Evaluation (from Evaluation Report Appendix B) (PDF)
  3. Did CHAS implement a media plan?
    1. What types of media were employed?
    2. How was media distributed or utilized? - Media Distribution Log (from Evaluation Report Appendix B) (PDF)
  4. Did CHAS coordinate a training of facilitators in It’s That Easy? - Training of Parent Educators (Word)
    1. Were participants satisfied?
    2. Did trained facilitators do presentations in the community?
    3. Were community members satisfied? - It’s That Easy - End of Training Survey (Word)
  5. Did CHAS target Ridgewater College with an awareness and education campaign?
    1. What were the components of the campaign? - Outreach Contact Form (Word) & Outreach Contact Form Report (from Evaluation Report Appendix C).
  6. Did CHAS engage ALC students in a chlamydia-related media project?
    1. How many students were involved in the project?
    2. Were students satisfied with their involvement? - ALC project completion Focus group questions (Word)
For more details about the PRC’s evaluation of CHAS’s work and its outcomes, access the full Evaluation Report (PDF).


Rockwood, T. (2015). Types of evaluation and their main data collection procedures [class handout]. Program Evaluation in Health and Mental Health Settings. University of Minnesota, Minneapolis, Minnesota.

Content Notice: This site contains HIV or STD prevention messages that may not be appropriate for all audiences. Since HIV and other STDs are spread primarily through sexual practices or by sharing needles, prevention messages and programs may address these topics. If you are not seeking such information or may be offended by such materials, please exit this web site.

Updated Friday, September 16, 2016 at 11:02AM