Evaluation Guidebook

Program Self-Evaluation

Table of Contents

  1. Introduction

  2. Key Considerations for Evaluation: Dos, Don’ts, and Ethics

  3. Step 1: Planning and Preparation

  4. Step 2: Gathering Data

  5. Step 3: Working with the Data

  6. Step 4: Reporting Evaluation Data

  7. Ethical Considerations

  8. Acknowledgements

  9. Evaluation Resources

The Guidebook for Peer Support Program Self-Evaluation: Practical Steps and Tools can be used to document program operations and outcomes, and to build evidence for the efficacy of peer support programs.

In a world of limited resources, conducting evaluations can be a challenge. We created this guidebook in response to frequent requests from peer-run organizations for practical, low-cost or no-cost tools they could use to evaluate their programs.

We have included recommendations on best practices in self-evaluation and data monitoring, drawn from techniques used by other peer support organizations and from the broader field of program evaluation. The guidebook provides basic, practical guidance on developing a logic model, identifying outcomes, selecting measures/indicators, collecting and analyzing data, and reporting findings. Key terms used throughout the guidebook are defined below.

We hope that program staff, managers, and administrators will find this guidebook helpful.

Key Terms

Evaluation: A systematic and objective assessment of an ongoing or completed project, program, or policy. Evaluations are undertaken to (a) improve the performance of existing interventions or policies, (b) assess their effects and impacts, and (c) inform decisions about future programming. Evaluations are formal analytical endeavors involving the systematic collection and analysis of qualitative and quantitative information.

Self-Evaluation: An evaluation conducted by those who are entrusted with the design and implementation of a project or program.

Data Monitoring: The performance and analysis of routine measurements to detect changes in status. Monitoring is used to inform managers about the progress of an ongoing intervention or program, and to detect problems that may be addressed through corrective actions.

Logic Model: A logic model, often a visual representation, provides a road map showing the sequence of related events connecting the need for a planned program with the program's desired outcomes and results.

Outcome: A result or effect that is caused by or attributable to the program.

Indicator: A quantitative or qualitative variable that provides a reliable means to measure a particular phenomenon or attribute.

Data: Information collected in the process of evaluation. Data gathered during an evaluation are manipulated and analyzed to yield findings that serve as the basis for conclusions and recommendations.

Findings: Factual statements about a project or program that are based on empirical evidence. Findings include statements and visual representations of the data, but not interpretations, judgments, or conclusions about what the findings mean or imply.