Listen Respectfully, Work Collaboratively.


Program Evaluation

HasNa considers monitoring and evaluation (M&E) essential components of every program and incorporates M&E into program design from the start. To determine how well a program is performing, HasNa involves an evaluator in each of its programs and allocates a portion of every program budget to evaluation activities. HasNa embraces the importance of M&E and aims to learn alongside program participants.

HasNa invests in program evaluation for two purposes:

  • To assess how well the program is functioning. Evaluation data is collected from participants and staff at various points while the program is in operation and shared with program staff, who use it to improve program design and operations.
  • To document program results for stakeholders. Stakeholder audiences interested in program effectiveness include program sponsors and contributors, potential sponsors, the HasNa Board of Directors, and top HasNa leadership. Most importantly, they include members of the communities in which programs are implemented.

HasNa program evaluation typically involves the following phases:

  1. Collection of background and context information about the program, its purpose, logic, design, and management.
  2. Identification of guiding questions, data collection methods and their feasibility, appropriate analyses, a schedule of evaluation events, and appropriate ways to communicate the evaluation plan to participants.
  3. Use of various data collection techniques such as observation, interview, and questionnaires followed by rapid data summary and reporting to program staff.
  4. Preparation of a summary report based on the data collected and analyzed over the course of the program, documenting the program for institutional memory and future review.

Sample Evaluations of Past Programs...

Evaluator Information...