Evaluation Quality Control Standards

Capra International Inc. [Capra] subscribes to and practices the Program Evaluation Standards of the Canadian Evaluation Society [CES]. Capra's internal quality control standards for evaluations and evaluation management systems are presented below in three sections: (1) planning, (2) data collection and analysis, and (3) reporting. Each numbered entry states one of Capra's Quality Control Standards, followed by the planned approach to its implementation. In addition, Capra asks clients to identify their expected quality criteria before an assignment is undertaken. All evaluations for the Federal Government of Canada reflect the Treasury Board of Canada's 2009 Evaluation Policy and related documents. In all other evaluations, the client's norms and standards are used, where they exist. In international development evaluations, where no specific norms and standards exist, the OECD DAC criteria are used.

1. Planning

1.1 The evaluators are qualified

Implementation: Ensure that only qualified evaluators are selected and assigned to an evaluation project: evaluators who have received proper guidance, who possess critical subject-matter expertise, and who understand their assigned responsibilities. The guidance to evaluators includes a review of, and acceptance by, potential evaluators of the CES Personnel Evaluation Standards and the CES Guidelines for Ethical Conduct. If an evaluation team is formed consisting of internal and external evaluators, the review of Capra's quality control standards regarding personnel and program evaluation precedes the review of the technical specifications of the given assignment and the development of the corresponding methodology and work plan. Critical subject-matter expertise is assured either through the professional background of the evaluators or through familiarization with relevant documents and literature and through interviews.

1.2 Data and process quality is assured

Implementation: Ensure that an Evaluation Plan exists, comprising at least: context, logic, objectives, criteria, scope, methodology, and planned quality assurance steps.

1.3 The scope is clearly defined

Implementation: Verify and ensure that the Terms of Reference for an assignment clearly spell out the objectives, purposes, and range of coverage. A checklist may be used, as in the sketch below.
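
As an illustration only, such a checklist can be scripted. The following minimal Python sketch flags required headings missing from a draft Terms of Reference; the heading names are assumptions for the example, not a prescribed Capra list.

    # Hypothetical sketch: flag required Terms of Reference sections that are
    # missing from a draft. The section names below are illustrative.
    REQUIRED_SECTIONS = [
        "objectives",
        "purpose",
        "range of coverage",
        "evaluation criteria",
        "timeline",
    ]

    def missing_sections(tor_text):
        """Return the required sections not mentioned in the draft ToR."""
        lowered = tor_text.lower()
        return [s for s in REQUIRED_SECTIONS if s not in lowered]

    draft = "Objectives: ... Purpose: ... Range of coverage: ..."
    gaps = missing_sections(draft)
    print("Missing sections:" if gaps else "ToR complete.", ", ".join(gaps))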

1.4 The evaluation criteria have been validated

Implementation: Ensure that the criteria to be used in the evaluation have been defined in advance and validated against an existing authoritative source, e.g., Results-based Management and Accountability Frameworks (RMAFs), including their indicators, or have been agreed upon with the client. The evaluation criteria should address: (1) relevance; (2) evaluation design; (3) data reliability; and (4) robust findings.

1.5 The evaluation process is phased

Implementation: Ensure that the evaluation is properly phased, in terms of: (1) planning; (2) diagnosis (data gathering and data analysis); and (3) reporting.

1.6 The evaluation is efficient

Implementation: Ensure that the evaluation is executed within budget.

1.7 The evaluation is timely

Implementation: Ensure that the evaluation is executed according to the timeline of the work plan, using project management tracking mechanisms such as MS Project.
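
Where specialised scheduling software is not to hand, the same tracking idea can be sketched in a few lines of Python; the milestones and dates below are invented for illustration.

    # Hypothetical sketch: compare each milestone's planned finish date with
    # its actual or forecast date and report slippage. Dates are illustrative.
    from datetime import date

    milestones = {
        "Evaluation plan approved": (date(2024, 2, 1), date(2024, 2, 1)),
        "Data collection complete": (date(2024, 4, 15), date(2024, 4, 22)),
        "Draft report issued": (date(2024, 5, 31), date(2024, 6, 5)),
    }

    for name, (planned, actual) in milestones.items():
        slip = (actual - planned).days
        status = "on time" if slip <= 0 else f"late by {slip} day(s)"
        print(f"{name}: {status}")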

1.8 The stakeholders are involved

Implementation: Ensure a participatory, yet independent, environment during planning and execution, using an advisory/steering committee and/or an expert panel consisting of stakeholders and outside subject-matter experts. The advisory/steering committee can meet face to face, by teleconference, or virtually, ideally at three stages: (1) at the end of the planning process, to review the proposed methodology and work plan; (2) prior to the release of the draft report; and (3) prior to the release of the final report.

2. Data Collection and Analysis

2.1 The collected data are accurate

Implementation: Ensure that statistically valid sampling processes are used to avoid selection bias, and apply standard processes to ensure data validity and reliability. Use multiple lines of evidence, including secondary sources (files, documents, literature) and primary sources (interviews, focus groups, surveys, case studies). Assure oversight and review by an advisory/steering committee, expert panel, or other quality reviewer. Assure careful data integration. Assure that the lines of evidence involve program staff, recipients, and stakeholders. Design evaluation themes and questions, and use these in the interview, focus group, and survey guides.
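
One common way to guard against selection bias is stratified random sampling with a fixed seed, so that the draw itself is reproducible and auditable. The Python sketch below is a minimal, hypothetical illustration; the stakeholder groups, sizes, and sampling fraction are assumptions.

    # Hypothetical sketch: draw the same fraction from each stakeholder
    # stratum, using a fixed seed so the selection is reproducible.
    import random

    population = (
        [("program staff", i) for i in range(40)]
        + [("recipients", i) for i in range(120)]
        + [("other stakeholders", i) for i in range(40)]
    )

    def stratified_sample(units, group_of, fraction, seed=42):
        rng = random.Random(seed)
        strata = {}
        for unit in units:
            strata.setdefault(group_of(unit), []).append(unit)
        sample = []
        for members in strata.values():
            k = max(1, round(len(members) * fraction))
            sample.extend(rng.sample(members, k))
        return sample

    chosen = stratified_sample(population, group_of=lambda u: u[0], fraction=0.1)
    print(len(chosen), "interview candidates selected across all strata")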

2.2 The data analysis is sound

Implementation: Ensure that the data analysis methodology is robust and evidence-based. Data elements will be classified by type, e.g., “fact” or “opinion”. “Facts” describe “what is” observed, and these “facts” will be compared with “what is expected” or ought to be found. Attempts will be made to detect and control evaluator bias. For secondary data, a common coding structure will be used.
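
The distinction between “fact” and “opinion”, and the comparison of “what is” with “what is expected”, can be represented in a simple coding structure. The Python sketch below is hypothetical; the indicator and values are invented for illustration.

    # Hypothetical sketch: tag each data element as "fact" or "opinion" and
    # compare coded facts against the expected value for their indicator.
    from dataclasses import dataclass

    @dataclass
    class DataElement:
        source: str     # file, interview, survey, etc.
        kind: str       # "fact" or "opinion"
        indicator: str  # the evaluation indicator addressed
        observed: str   # "what is"

    expected = {"participants trained": "200 per year"}  # "what is expected"

    elements = [
        DataElement("program file 12", "fact", "participants trained", "150 per year"),
        DataElement("interview 3", "opinion", "participants trained", "training felt rushed"),
    ]

    for e in elements:
        if e.kind == "fact" and e.indicator in expected:
            verdict = "matches" if e.observed == expected[e.indicator] else "diverges from"
            print(f"{e.source}: '{e.observed}' {verdict} expected '{expected[e.indicator]}'")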

2.3 The collected data are relevant to decision makers

Implementation: Ensure that the relevance of the data is clearly understood throughout the planning, data collection, and reporting stages.

2.4 The data analysis is transparent

Implementation: Ensure that data sources are verifiable and that there is an auditable trail.
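
An auditable trail can be as simple as a log that records a content hash and a timestamp for every source consulted. The Python sketch below is a minimal illustration; the file name and note are placeholders.

    # Hypothetical sketch: record a SHA-256 hash and a UTC timestamp for each
    # evidence file so reported figures can be traced to a verifiable source.
    import hashlib
    import json
    from datetime import datetime, timezone

    audit_log = []

    def register_source(path, note):
        with open(path, "rb") as f:
            digest = hashlib.sha256(f.read()).hexdigest()
        audit_log.append({
            "file": path,
            "sha256": digest,
            "registered": datetime.now(timezone.utc).isoformat(),
            "note": note,
        })

    # e.g. register_source("survey_results.csv", "evidence for question 2.1")
    print(json.dumps(audit_log, indent=2))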

3. Reporting

3.1 The findings/conclusions are substantiated, impartial, and objective

Implementation: Ensure that the findings and conclusions are supported by evidence, in terms of data and data analysis, and that they are reported impartially and objectively.

3.2 The final report is comprehensive

Implementation: Ensure that the report comprises statements of: objectives, criteria, methodology, findings, conclusions relating the findings to the objectives and interpreting their significance for the client, options for corrective action, formal recommendations, and the client's response indicating agreement or disagreement. The report may also include the evaluators' perceptions of lessons learned in the evaluation process.

3.3 A draft/preliminary report invites challenge to the findings

Implementation: Ensure that a draft report is issued and reviewed before the final report is prepared. This facilitates challenges to the findings, which enhance quality by fostering completeness and by identifying errors of interpretation. Reviewers must include program managers and other stakeholders; they will typically check the facts and produce contrary evidence where needed. Any disagreement with the conclusions or recommendations will be referenced in the final report.

3.4 The messages are clear

Implementation: Ensure that the reports are written using correct and appropriate structure, grammar, and diction.

3.5 The reports are action-oriented

Implementation: Ensure that the report specifies, as appropriate, critical success factors, good/best practices, lessons learned, and targeted recommendations.

3.6 Affected parties are notified about pertinent findings

Implementation: Ensure that the affected parties are notified about findings that concern them before the report is released.

Bibliography: