
PREPARING THE EVALUATION REPORT
19 March 2014

This checklist is meant primarily to instruct evaluation consultants in the requirements for formatting and finalizing an evaluation report for the ILO. ILO evaluation managers should be well informed of this checklist and hand it over to the consultant. The checklist provides specific requirements for each formal element of the report, in addition to providing specific details on how to present the conclusions, recommendations, lessons learned and emerging good practices in the report. An annex presents ILO definitions of lessons learned and emerging good practices, along with the required templates for the evaluation consultant to complete. [1]

[1] This checklist is meant as a summary check to be used to ensure all critical elements are present in the evaluation report. For more detailed information on each section and a quality control of the content, see Section 1 of Checklist 6: Rating the quality of the evaluation report.

1. TITLE PAGE

The following elements must be included in the title page (see also Checklist 7: Filling in the Title Page):
- The same title as cited in the TOR;
- ILO project (TC/SYMBOL) reference code;
- The type of evaluation (e.g. independent, internal review, Decent Work Country Programme internal review, etc.);
- Indication of the timing of the evaluation (mid-term or final);
- List of all countries covered by the evaluation;
- Date the evaluation was approved by EVAL;
- Date when the project officially ends;
- Name of the evaluation consultant(s);
- Name of the evaluation manager;
- ILO office administering the project;
- All UN agencies that participated in the evaluation;
- Donor and project budget in US$;
- Cost of the evaluation in US$; and
- Key words.

2. TABLE OF CONTENTS

The Table of Contents must contain the following elements:
- Accurate reflection of the contents of the report;
- List of tables, figures and charts;
- List of acronyms or abbreviations, as appropriate; and
- List of appendices, which must include the following items and any additional documents the consultant deems necessary:
  o A completed lessons learned template (one per lesson);
  o A completed emerging good practice template (one per practice);
  o Terms of reference;
  o Inception report;
  o List of persons interviewed;
  o Data collection instruments; and
  o Bibliography.

3. EXECUTIVE SUMMARY

The executive summary should be brief and concise (maximum 5-6 pages) and must include the following elements:
- Explanation of the project's purpose, logic, structure and objectives (project background);
- Overview of the purpose, scope, clients, time period, geographical coverage and targeted groups or beneficiaries of the evaluation (evaluation background);
- Concise description of the evaluation's methodology;
- Summary of the evaluation findings;
- Concise list of conclusions;
- List of all lessons learned and emerging good practices; and
- List of all recommendations (which must be aligned with the conclusions).

4. BODY OF THE REPORT

The body of the evaluation report should follow the organization of the Executive Summary and be in compliance with the TOR. The evaluator may add appropriate sections if needed; however, the following elements must be included:

Project background - Provide a brief summary of the project's purpose, logic, structure and objectives. It should specifically outline the intervention logic, strategy and main means of action; geographic coverage; and management structure. In addition, the present situation of the project should be discussed, explaining any key information about its current state of implementation and relevant context:
- If useful, include a brief outline of the economic, political, social, cultural and historical context of the country;
- Describe the project's objectives;
- Describe the context and intervention logic of the project;
- Describe the project's funding arrangements (including the donor);
- Describe the organizational arrangements for the project's implementation;
- Describe the contributions and roles of the ILO, project partners and other stakeholders; and
- Provide a brief description/review of the project's implementation (major events, milestones).

Evaluation background - This section should present a concise summary of the purpose and scope of the evaluation; the clients of the evaluation and/or who will use the evaluation findings; the evaluated time period; geographical coverage; and the targeted groups or beneficiaries of the evaluation.
- Purpose and primary use of the evaluation;
- Scope of the evaluation (for example, geographic coverage, information on the project's phases when the evaluation covers more than one project, special focus areas such as gender, collaboration, exit strategy, etc.);
- The clients of the evaluation and the main audience of the report (e.g. donors, constituents, implementing parties);
- The evaluation criteria and questions that clients want the evaluation to answer; and
- Dates, events and the operational sequence of the evaluation.

Methodology - Include a concise description of the evaluation's methodology for data collection and analysis, including the rationale for selecting the methodology and data sources, in addition to a description of all methodological limitations.
- The evaluation criteria and questions (as per the OECD/DAC criteria [2]);
- Description of the evaluation methods and data collection instruments to be used, including their justification;
- Description of the sources of information/data used;
- List of evaluation limitations and potential sources of bias (method selections, data sources, etc.);
- Description of and rationale for stakeholder participation in the evaluation process; and
- Indication of adherence to evaluation norms, standards and ethical safeguards.

[2] OECD/DAC criteria: relevance, effectiveness, efficiency, impact and sustainability.

Main findings - Include a brief, overall assessment of the project's performance, including its relevance, effectiveness, efficiency, impact and sustainability. Findings must be evidence-based and include the following elements:
- Findings are relevant to the purpose and scope of the evaluation;
- Findings are supported by evidence and are consistent with methods and data;
- All evaluation questions are addressed, and explanations are provided where they could not be answered;
- All data is disaggregated by sex, age, ethnic group or other relevant demographic categories, where feasible;
- Unintended and unexpected results are discussed;
- Factors contributing to the success or failure of the project are identified and discussed; and
- Cross-cutting issues, such as (i) gender issues, (ii) tripartite issues and (iii) international labour standards, are assessed. See below for more details.

GENDER ISSUES ASSESSMENT
There should be a summary assessment of gender issues in the evaluation conclusions, along with indications of any pertinent budgetary dedication regarding gender or sex-disaggregated data. For more guidance on presenting gender in evaluation conclusions, see Guidance Note 4: Considering gender in the monitoring and evaluation of projects.

TRIPARTITE ISSUES ASSESSMENT
There should be a summary assessment of tripartite issues in the evaluation conclusions. Further guidance on ILO tripartite constituents and their role in evaluation can be found in Guidance Note 7: Stakeholder Participation.

INTERNATIONAL LABOUR STANDARDS ASSESSMENT
There should be a summary assessment of any international labour standards (ILS) issues relevant to the project and its evaluation. Any pertinent budgetary dedication to ILS issues should also be presented.

5. CONCLUSIONS

Conclusions are formulated by synthesizing the main findings into statements of merit and worth. The evaluative reasoning and critical thinking used to formulate the conclusions must be clear, and special care should be given to their validity and reliability.
- Conclusions are formulated by synthesizing the main findings into summary judgments of merit and worth;
- The evaluative reasoning and critical thinking used to formulate the conclusions must be clear;
- Judgments are fair, impartial, and consistent with the findings; and
- There is a brief discussion of how the validity and reliability of the conclusions were determined.

6. LESSONS LEARNED AND EMERGING GOOD PRACTICES

Lessons learned

One of the purposes of evaluation in the ILO is to improve project or programme performance and promote organizational learning. Evaluations are expected to generate lessons that can be applied elsewhere to improve programme or project performance, outcomes or impact. The section on lessons learned should consider:
- Is the lesson significant? Does it deal with a non-trivial matter?
- Does the lesson concisely capture the context from which it was derived?
- Is the lesson applicable in different contexts? Is it clear in which situations the lesson could be reused in future?
- Does the lesson identify target users?
- Does the lesson specifically suggest what should be repeated or avoided in future contexts to guide future action?

Emerging good practices

ILO evaluation sees lessons learned and emerging good practices as part of a continuum, beginning with the objective of assessing what has been learned, and then identifying successful practices from those lessons which are worthy of replication. The section on emerging good practices should consider:
- Does it describe how the practice works?
- Does it concisely capture the context from which it was derived?
- Is it applicable in different contexts?
- Is it potentially replicable in different contexts?
- Does it identify target users?
- Does it demonstrate a link to specific impacts?

The sections on lessons learned and emerging good practices should be prepared using their respective templates, found in Annex 1. These templates provide helpful definitions and criteria.

7. RECOMMENDATIONS

The consultant is expected to follow the criteria below when drafting recommendations, as appropriate. Recommendations should:
- Follow logically from conclusions, lessons learned and good practices;
- Specify who is called upon to act:
  - ILO Country Office
  - Project Management
  - ILO HQ Administration
  - Tripartite Constituents
  - ILO HQ Technical Unit
  - ILO Regional Office
- Specify the action needed to remedy the situation;
- Distinguish priority or importance (high, medium, low);
- Specify the recommended time frame for follow-up; and
- Acknowledge whether there are resource implications.

The recommendations text is uploaded into separate data sets for a formal exercise on management follow-up. For this reason, recommendations must be expressed as complete, concise, stand-alone statements which do not include any unexplained acronyms, and which must follow certain criteria and formatting.

8. APPENDICES

The list of appendices must include the following items and any additional documents the consultant deems necessary:
- Lessons learned template;
- Emerging good practice template;
- Terms of reference;
- Inception report;
- List of persons interviewed;
- Data collection instruments; and
- Bibliography.

ANNEX 1. CITING LESSONS LEARNED AND EMERGING GOOD PRACTICES

The lessons learned and emerging good practices in ILO evaluations continue to be used, analyzed and disseminated after a report is finalized. They are entered into the i-track database as concise texts and coded with additional metadata. They are then compiled with all lessons learned and emerging good practices collected by the ILO and later used for management reporting, validation and technical discussions.

This Annex provides detailed definitions and criteria for identifying and presenting these as findings in an ILO evaluation and is targeted at evaluation managers and evaluators. Following the definitions and criteria here should prevent problems later during the approval of the draft report, as well as ensure a high level of quality and consistency across ILO evaluations, in line with UN evaluation standards. The ILO Evaluation Unit aims to track trends in evaluation findings and assist management to improve performance and achieve useful replication of good practice whenever possible.

SECTION 1. CITING LESSONS LEARNED

ILO evaluation aims to present only high-quality and useful lessons learned in its evaluations. This requires a framework of understanding and consistency which must be adhered to by ILO evaluators. This section presents the ILO basic definition, certain key elements, and a checklist for presenting the lesson learned both in the text of the report and in the required templates which are annexed to the report for each lesson learned.

BASIC DEFINITION OF A LESSON LEARNED

ILO Definition - Lessons learned
A lesson learned is an observation from project or programme experience which can be translated into relevant, beneficial knowledge by establishing clear causal factors and effects. It focuses on a specific design, activity, process or decision and may provide either positive or negative insights on operational effectiveness and efficiency, impact on the achievement of outcomes, or influence on sustainability. The lesson should indicate, where possible, how it contributes to 1) reducing or eliminating deficiencies; or 2) building successful and sustainable practice and performance. A lesson learned may become an emerging good practice when it additionally shows proven results or benefits and is determined to be worthwhile for replication or up-scaling.

The lessons collected during an evaluation are one of three key outputs included in EVAL's checklist for ensuring the quality of evaluation reports. Identifying and presenting lessons learned is an exercise to recognize and document the richest and most meaningful lessons gained from project events. The elements listed below provide a framework for identifying lessons learned.

KEY ELEMENTS OF A LESSON LEARNED:
- A lesson learned can refer to a positive experience, in the case of successful results, or to a negative experience, in the case of malfunctioning processes, weaknesses or undesirable influences.
- A lesson learned should specify the context from which it is derived, establish potential relevance beyond that context, and indicate where it might be applied.
- A lesson learned explains how or why something did or did not work by establishing clear causal factors and effects. Whether the lesson signals a decision or process to be repeated or avoided, the overall aim is to capture lessons that management can use in future contexts to improve projects and programmes.
- A lesson learned should indicate how well it contributes to the broader goals of the project or programme and establish, when possible, whether those goals align appropriately with the needs of beneficiaries or targeted groups.

Each of the following criteria should be considered, included and adequately explained to ensure that lessons learned are complete and useful. Once a lesson learned has been selected for inclusion in the evaluation, the evaluator must fill out the Lessons Learned Template for each one and annex it to this checklist. These templates are submitted as standard annexes to the evaluation report.

Lessons Learned Criteria Checklist
The evaluator should cite and explain the points below, when appropriate.
- Context: Explain the context from which the lesson has been derived (e.g. economic, social, political). If possible, point to any relevance to the broader ILO mandates or broader technical or regional activities.
- Challenges: Briefly cite any difficulties, problems or obstacles encountered and solutions found. Positive and negative aspects of the lesson should be described.
- Project goals: Point out any contribution to the broader goals of the project, if relevant.
- Causal factors: Present evidence for how or why something did or did not work.
- Beneficiaries: Targeted users or beneficiaries affected by the lesson learned should be cited.
- Success: The lesson learned should cite any decisions, tasks or processes that reduced or eliminated deficiencies, built successful and sustainable practice and performance, or have the potential for success.

Evaluators are encouraged to concentrate on the quality of lessons learned and are instructed to present them in the report only if they can adhere to the quality criteria listed above. Each lesson learned identified in the report should be accompanied by a completed template, annexed to this checklist. These templates are then included in a data roster, coded with additional metadata and re-used later in analysis of technical and administrative aspects of evaluation findings.

If the evaluator identifies a lesson learned with strong positive and replicable components, it should be considered for inclusion as an emerging good practice. After applying the further criteria listed below for emerging good practices, the finding would be moved from lessons learned into the emerging good practice section. A practice or observation should not appear in both the lessons learned list and the emerging good practices list.

SECTION 2. CITING EMERGING GOOD PRACTICES

ILO evaluation sees lessons learned and emerging good practices as part of a continuum, beginning with the objective of assessing what has been learned, and then identifying successful practices from those lessons which are worthy of replication. As defined in the previous section, a lesson learned may have negative or positive aspects. The key differences between a simple lesson learned and an emerging good practice are that an emerging good practice:
a) represents successful strategies or interventions that have performed well;
b) has achieved marked and measurable results or benefits through establishing a clear cause-effect relationship; and
c) involves related strategies that are determined to be specifically useful for replication or up-scaling.

Successful lessons adhering to these criteria, therefore, are presented as emerging good practices, and details of the substantiating criteria are entered into the templates by the evaluator. The Evaluation Unit undertakes some further coding of these good practices, and they are subsequently made available to ILO staff as data sub-sets for further validation by technical experts and discussion groups.

ILO Definition - Emerging Good Practice
A lesson learned may become an emerging good practice when it additionally shows proven, marked and sustainable results or benefits and is determined by the evaluator to be worth considering for replication or up-scaling to other ILO projects. An emerging good practice should demonstrate clear potential for substantiating a cause-effect relationship and may also show potential for replicability and broader application. It can derive from comparison and analysis of activities across multiple settings and policy sources, or emerge from a simple, technically specific intervention.

KEY ELEMENTS OF AN EMERGING GOOD PRACTICE:
- An emerging good practice is any successful working practice or strategy, in whole or in part, that has produced consistent, successful results and measurable impact.
- An emerging good practice implies a mapped logic indicating a clear cause-effect process through which it is possible to derive a model or methodology for replication.
- An emerging good practice can demonstrate evidence of sustainable benefit or process.
- An emerging good practice has an established and clear contribution to ILO policy goals and demonstrates how that policy or practice aligns, directly or indirectly, with the needs of relevant beneficiaries or targeted groups.

The following criteria for identifying an emerging good practice should be fully considered.

Emerging Good Practice Criteria Checklist
When possible, the evaluator should cite and explain any of the points below.
- The good practice: Cite as specifically as possible the task or practice which works and how it works, linking it to the project's relevance and purpose.
- Context and relevant preconditions: Indicate the circumstances under which the good practice took place and cite anything that might affect its application to other settings (e.g. problems and obstacles). Provide relevant preconditions or specific organizational aspects which contributed to the emerging good practice. Has it been used before?
- Causal factors: This is sometimes linked to context and beneficiaries, and should also indicate where project design and objectives, or the prevailing theory of change in the project, can be linked to the emerging good practice.
- Beneficiaries: Explain who the targeted beneficiaries or users of the good practice are and the impact on them.
- Measurable impact: Demonstrate where a specific impact was linked to the emerging good practice, providing specific figures or details when possible.
- Potential for replication: Cite specific reasons why the emerging good practice is considered potentially replicable in different contexts, and how.
- Link to ILO policy goals: When possible, indicate any relevant contribution or link to the broader ILO policy goals or country programme outcomes.

Utilizing emerging good practices
The Evaluation Unit enters data from the good practice template into its internal database, where additional coding is added. Data sets of the emerging good practices are then made available to generate management reports, and technical experts will be asked to conduct further validation on these. The good practice texts should therefore not contain any unexplained acronyms or references that would limit their usefulness as stand-alone pieces of knowledge. The data collected on emerging good practices will be clustered into thematic areas and presented to technical experts for further validation and analysis. For this reason, the evaluator is requested to keep these instructions in mind when drafting text in the templates. All information in the templates should correspond accurately to the findings in the full text of the report.

ILO Lesson Learned Template

Evaluation Title:
Name of Evaluator:
Project TC/SYMBOL:
Date:

The following lesson learned has been identified during the course of the evaluation. Further text can be found in the conclusions of the full evaluation report.

LL Element (provide text for each):
- Brief summary of the lesson learned (link to project goal or specific deliverable)
- Context and any related preconditions
- Targeted users / beneficiaries
- Challenges / negative lessons - causal factors
- Success / positive issues - causal factors
- ILO administrative issues (staff, resources, design, implementation)
- Other relevant comments

ILO Emerging Good Practice Template

Evaluation Title:
Name of Evaluator:
Project TC/SYMBOL:
Date:

The following emerging good practice has been identified during the course of the evaluation. Further text can be found in the full evaluation report.

GP Element (provide text for each):
- Brief summary of the good practice (link to project goal or specific deliverable, background, purpose, etc.)
- Relevant conditions and context: limitations or advice in terms of applicability and replicability
- Establish a clear cause-effect relationship
- Indicate measurable impact and targeted beneficiaries
- Potential for replication and by whom
- Upward links to higher ILO goals (DWCPs, Country Programme Outcomes or the ILO's Strategic Programme Framework)
- Other documents or relevant comments