
In Summer 2016, DRC Country Operations conducted the third MEL Compliance Self Check (CSC). The MEL CSC checks Operations' compliance with the Monitoring, Evaluation and Learning Minimum Operational Procedures (MELMOPs). In general, the MEL CSC showed a satisfactory level of compliance with the MELMOPs. Average scores show a steady improvement over time, from 61% in August 2015 to 66% in February 2016 and 71% in August 2016. As such, DRC is steadily progressing towards its target of 75% compliance with MELMOP standards by August 2016, although there is clearly room for improvement.

Formal development and use of a management response appears to be a weakness in DRC's management and use of evaluations. Regional and HQ leads should explore how this can be further mainstreamed into DRC evaluation practice. A partial solution may be to streamline the current practice of developing both lessons learnt notes and management responses, and to formally include regional and HQ M&E staff in sign-off processes. Similarly, registration of evaluations with HQ is an identified weakness. Registration would support the use of evaluations more broadly at the global level, and operations are encouraged to adopt this practice using the form provided.[1]

There are likely differing interpretations of the MEL focal point role, ranging from acting as a focal point for communication on MEL matters with HQ to leading on MEL matters within the operation. Further, a MEL focal point may function alongside other MEL officers, or may have a different primary function or technical expertise. A standard ToR for the role of the M&E Focal Point at the regional and country level will be developed and should be used for reference. In the meantime, country teams are encouraged to refer to existing MELMOP guidance.[2]

CSC scores, Aug 2016
Uganda (DRC/DDG) 97%
Yemen (DRC/DDG) 93%
South Sudan (DRC/DDG) 91%
Georgia (DRC) 89%
Jordan (DRC) 88%
Ukraine (DRC) –
Kosovo (DRC) –
Serbia (DRC) –
Kenya (DDG) 85%
Iran (DRC) –
Somalia (DRC) –
Libya (DDG) –
Lebanon (DRC) –
Vietnam (DDG) –
Turkey (DDG) 80%
Tajikistan (DRC) 78%
Sudan (DRC) –
Kenya (DRC) 75%
Somalia (DDG) 75%
Ivory Coast (DRC/DDG) 72%
Pakistan (DRC) 69%
Afghanistan (DDG) –
Liberia (DRC/DDG) 67%
Mali (DRC/DDG) 67%
Guinea (DRC) –
Tanzania (DRC) 63%
Burkina Faso (DRC/DDG) 62%
MENA 58%
DR Congo (DRC) 53%
CAR (DRC) 50%
Nigeria (DRC/DDG) –
Iraq (DDG) –
Niger (DRC/DDG) –
Syria (DDG) 36%
Afghanistan (DRC) 23%

[1] https://docs.google.com/a/drc.dk/forms/d/e/1faipqlscsyhxvfjxqy-khcink4m1iujw5ltcj1ycgymdygl0wgcftqa/viewform
[2] http://melmop.drc.dk/wp-content/uploads/mel-peo-7k-me-staff-tor-guideline.pdf

Under-budgeting for MEL activities and personnel appears to be an ongoing issue. DRC operations should continue to reference existing MELMOP guidance.[3] DRC HQ should consider whether this guidance can be developed further to help define what constitutes a sufficient MEL budget.

There is wide variation in scores across countries in each region. HQ and regional M&E advisors should consider these variations when planning annual support across DRC operations.

Region | Range (% points) | Average %
Standalone | 8 | 86
CA | 41 | 71
CASWA | 46 | 64
MENA | 45 | 67
WA | 28 | 60
HOAY | 41 | 76

At regional level, scores varied considerably from country to country. It was notable that standalone countries demonstrated the highest average score and the lowest range, despite not forming a coherent geographical region. Variation amongst formal DRC regions was high, with West African countries demonstrating the least variation.

An analysis of scoring across the 18 MELMOP indicators demonstrates consistently stronger scoring across system, people and some evaluation indicators. Scoring against learning-oriented indicators, and against those relating to the registration of evaluations and lesson learning notes, was noticeably lower.

DRC Libya has partially developed a MEL system for the entire programme. In West Africa, the regional office has developed a standard template for all projects in the region, comprising a workplan and a logframe with objectives, indicators and outputs. This is updated through an activity database for each country operation, which facilitates accurate monitoring of project progress. The logframe is developed by the country operation with contributions from the regional office and M&E specialists.

Data disaggregation appeared to be a common practice for most operations, with 79% responding positively and 12 operations confirming this with qualitative comments. A few operations responded more cautiously: one implied that only a third of programmes could produce disaggregated data (Afghanistan); one indicated that data was disaggregated by sex and age as standard, but that other vulnerabilities were not usually considered (Kenya); and one implied that there were government limitations on how data could be collected (Iran).

Overall, operations responded positively to having an M&E focal point in post. A few implied that the focal point had been in post for some time (Somaliland, Guinea), but many operations have also been experiencing turnover. In some cases a focal point has been recently recruited (Burkina Faso, CAR, Côte d'Ivoire and Mali), and in others M&E staff will be coming into post in the coming months (Afghanistan, Nigeria). A few operations mentioned that the M&E focal point role is carried out by staff with other duties or technical expertise (Afghanistan, DRC, Tanzania, Tajikistan), such as the Country Director, a grants officer or a protection officer. One operation indicated that budget was not currently available for an M&E focal point (DRC). Regional offices clearly play an important role in training focal points, with a comprehensive training programme having been conducted in West Africa earlier this year and a regional training planned in Nairobi during October 2016.
Broadly speaking, operations are confident that appropriate terms of reference have been developed for MEL focal points, and that focal points are involved in the development of proposals.

[3] See http://melmop.drc.dk/people/
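As a hypothetical illustration of how to read the regional summary table above, the short sketch below computes the two statistics it reports: "Range" is the gap in percentage points between a region's highest and lowest country score, and "Average %" is the unweighted mean. The scores used here are invented for illustration and are not the August 2016 figures.

```python
# Illustrative sketch (invented numbers, not the August 2016 data set) of
# how the regional summary figures can be derived.

regions = {
    "Standalone": [89, 88, 85, 84, 82, 81],  # sample country CSC scores (%)
    "WA":         [72, 67, 67, 62, 58, 44],
}

for name, scores in regions.items():
    score_range = max(scores) - min(scores)   # range in percentage points
    average = sum(scores) / len(scores)       # unweighted average %
    print(f"{name}: range {score_range} points, average {average:.0f}%")
```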

Operations are less confident that there is sufficient budget and personnel capacity in place to support the MEL function. This was the lowest scoring area in the people category, at 49%. Qualitative comments in this area varied widely. For some, there appeared to be strong policies and capacity in place (Jordan, Kenya DDG, Somaliland DDG, South Sudan). Others noted concerns with either financial (DRC Kenya, Lebanon) or personnel resources for MEL activities (DRC Afghanistan, Guinea, Pakistan, Tanzania).

"3 to 5% of every project budget is reserved to run M&E activities and support M&E staff." (DRC Jordan)

"There is an appropriate level of human resource capacity with some minor gaps. The budget to support monitoring and evaluation plans is adequate in each project." (DDG Somaliland)

"This has been recently scaled up with the addition of a MEL Officer and some data assistants, but will not be sufficient for longer-term programming and will need to be considered in future budget proposals." (DRC Tanzania)

"MEL involvement in budgeting for their own staff is improving with each grant proposal submitted to new donors, but there still remains a relative lack of human resources and budget in a large number of grants compared to the support requested from the teams." (DRC Lebanon)

Chart 2: Aug 2016 scores by question
M&E plan? Yes 57%; Partially 40%; No 3%
Indicator selection? Yes 89%; Partially 9%; No 3%
Data disaggregation? Yes 79%; Partially 20%
M&E focal point? Yes –; Partially 14%; No 3%
Focal point trained? Yes 66%; Partially 20%; No 9%
ToR for focal point? Yes –; Partially 11%; No 11%
Proposal development? Yes 63%; Partially 20%; No 17%
Budget & capacity? Yes 49%; Partially 43%; No 9%
Evaluations threshold followed? Yes 63%; Partially 20%; No 6%; N/A 11%
Evaluation per year? Yes 80%; Partially 9%; No 11%
Planned evaluations registered with HQ? Yes 29%; Partially 17%; No 51%
Completed evaluations uploaded? Yes 43%; Partially 14%; No 20%; N/A 23%
Management response? Yes 31%; Partially 26%; No 23%; N/A 20%
M&E informs AR? Yes 57%; Partially 14%; No 23%; N/A 6%
LL-note for Annual Review? Yes 43%; Partially 26%; No 17%; N/A 14%
AR LL-note shared (mel@drc.dk)? Yes 17%; Partially 9%; No 54%; N/A 20%
Evaluations inform proposals? Yes 57%; Partially 20%; No 11%; N/A 11%
Evaluation LL-notes prepared and shared? Yes 11%; Partially 17%; No 43%; N/A 29%

22 operations responded positively to planning evaluations according to DRC evaluation policy thresholds and to conducting at least one evaluation per year. For those that responded negatively, qualitative comments implied that this was because no current projects meet the threshold criteria (Niger, Turkey, DDG Kenya, Yemen). 16 operations indicated that plans are fully or partially registered with HQ, whilst 20 operations indicated that completed evaluations were uploaded to GMS; 16 operations affirmed in qualitative comments that this is either in process or already done. 8 operations registered details of current evaluations, due to take place between now and February next year, in the evaluation planning form. Of the 10 registered evaluations, 4 had a protection focus whilst 6 had a livelihoods focus, consistent with DRC's main areas of programming.

The development of a management response was one of the lowest scoring indicators for this CSC exercise, with only 39% of operations responding positively. This was confirmed in qualitative comments, with 10 operations indicating that whilst efforts are certainly being made to incorporate lessons learnt into programming and practice, this is predominantly done on an informal basis and a formal management response is not prepared.

Table 1: Evaluation plans registered by operations (September 2016)
Operation | Evaluation type | Sector | Timing
South Sudan | End of project evaluation | Protection and Livelihoods | Oct-Nov 2016
South Sudan | Mid-term evaluation | Food Security & Livelihoods | Jan-Mar 2017
Libya | End of project evaluation | HMA | Oct-Dec 2016
Georgia | End of project evaluation | Livelihoods; Community Development and Advocacy; Capacity Building | Sep-Oct 2016
Turkey | End of project evaluation | Livelihoods | May-Jun 2016 (completed)
Turkey | End of project evaluation | Protection and Emergency (cash) | Jan-Feb 2017
Kenya | End of project evaluation | Protection, Livelihoods | Aug-Sep 2016
Kosovo | End of project evaluation | Livelihoods | Sep 2016
HOAY region | End of project evaluation | Protection? | –
Uganda | TQA | Rural Infrastructure | Oct 2016

Operations were relatively confident that MEL data informed the annual review process, with 57% responding positively, although for some this was seen as an area for improvement. However, significantly fewer (43%) reported that an annual review lessons learnt note had been produced. In many cases this was because the annual review for this year had not yet taken place, or the lessons learnt note was still being prepared. This appears to have had a knock-on effect on the submission of lessons learnt notes to the HQ M&E unit, with only 17% responding positively.

The findings are almost identical for the use of evaluations to inform new proposals. 57% of operations were confident that evaluation findings do inform project development, but this is almost exclusively done on an informal basis and few formal lessons learnt notes are prepared. This was the lowest scoring indicator for the exercise, with only 11% responding positively. Some operations felt the use of evaluations for this purpose is problematic:

"Lessons learned notes are better maintained by programme and MEL staff in the operation. External evaluators are not appropriate for such a task, given the difference between the objectives of learning and evaluations." (Ukraine)

The purpose of the CSC is to assess compliance with the MELMOPs, which are minimum operating standards for MEL in DRC. Monitoring MELMOP compliance allows DRC managers in headquarters and the field to identify areas where additional effort might be needed to ensure quality MEL systems across the organisation. This report highlights the results and main issues from the analysis of the third MEL CSC.

It is the responsibility of each Country Operation Senior Management Team, together with the Country M&E Focal Point, to assess their MEL CSC and react to areas of non-compliance. This report also acts as a point of departure for dialogue, analysis and planning on rectifying action between field operations, M&E Focal Points, Support Units and the MEL Unit in OPSU. It is recommended that Country Operations with a low level of compliance ensure that rectifying actions are written into the Result Contract. The next MEL CSC is planned for the first quarter of 2017, which will allow a further opportunity to monitor progress towards MELMOP compliance.

For the third round of MELMOP self-checks, some adjustments were made to the CSC indicators. These are minor adjustments which seek to clarify existing CSC indicators and their links to the MELMOPs, emphasise the role of monitoring in MEL practice, and ensure alignment with relevant indicators of the Core Humanitarian Standard (CHS). Further revisions will be considered as appropriate, whilst also seeking to ensure continuity of existing indicators to the extent possible, so that self-check results remain comparable over successive years.[4] In particular, the CSC indicators have been adjusted to clarify the following:

A. Sufficient resources should be allocated to MEL needs, both human and financial. (M5)
B. It is now possible for operations to specify whether a focal point is in place AND has been trained appropriately. (M2 and M2i)
C. There is a clear distinction between two MELMOP requirements: at least one evaluation should be planned per year, and evaluations should also be planned where required for project grants according to the MELMOP thresholds.
D. The MELMOP evaluation indicators are applicable to all evaluations, not just those conducted to meet MELMOP requirements.
E. There is greater clarity on the mechanism for registering evaluations with HQ: planned evaluations should be registered through the specified online form, and completed evaluations should be uploaded to GMS.

Whilst MELMOP indicators are already well aligned with CHS requirements, particularly on the use of MEL for learning and decision-making, additional indicators have been introduced to explicitly ensure the use of indicators at all levels of a logframe and to ensure the collection of disaggregated data. A final innovation was to request that operations upload MEPs to GMS, as key documents related to grant management. This will allow a sample to be reviewed as part of the CSC feedback exercise.

There was a high response rate to the exercise. 35 CSCs were received from country operations representing DRC, DRC/DDG and DDG operations. In 3 countries, DDG and DRC submitted individual CSC assessments (Afghanistan, Kenya, Somalia). Operations in Libya, Turkey, Vietnam, Syria and Iraq submitted assessments of DDG operations only.

[4] For example, there is currently no CSC indicator which measures compliance with the first MELMOP standard, to have in place a country/regional MEL system. There are plans to further develop MELMOP guidance in this area.

The MENA region also submitted a regional aggregation of its CSC assessments. This represents a slight decrease from February 2016, when 36 CSC forms were returned, including a regional assessment of the Sahel operations.[5]

In contrast to previous CSC rounds, the 18 CSC indicators were weighted equally, which reduces comparability with previous results. Where a CSC indicator was rated "yes", full points were awarded; for "partially" ratings, half points were awarded; and no points were allocated for "no" ratings. Where an indicator was rated "not applicable", points from this indicator were not used in the percentage calculation. 27 country operations have now completed the exercise three times, allowing comparison over a one-year period for these operations.[6] The analysis of quantitative scores has been supplemented and cross-checked against qualitative feedback.

As mentioned, it is the responsibility of each Country Operation, Country Director and Senior Management Team, together with the Country M&E Focal Point, to assess their MEL CSC and react to issues of non-compliance. However, the MEL Unit in OPSU offers a number of support options for strengthening M&E at the country level:

A. Country M&E system reviews and support missions
B. DRC M&E training, conducted either remotely or face to face
C. Evaluations and reviews
D. Guidelines and tools, provided on melmop.drc.dk

If any of the above support options is relevant for your Country Operation in 2017, send an email to mel@drc.dk.

[5] At the time of writing, CSC assessments had not been received from DRC Turkey, Myanmar and Ethiopia/Djibouti.
[6] Due to revisions, only the current M1, M4, M6, M7 and M9 are comparable with previous CSC rounds. However, it should be noted that previous results were calculated differently, so even in these areas comparability is limited.
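To make the scoring rule described above concrete, here is a minimal sketch of an equal-weighted CSC percentage calculation. It is an illustration only, not DRC's actual calculation tool, and the sample ratings at the end are invented.

```python
# Minimal sketch of the CSC scoring rule: all 18 indicators weighted equally,
# "yes" earns full points, "partially" half points, "no" none, and "n/a"
# indicators excluded from the percentage calculation.

def csc_score(ratings):
    """Return a CSC compliance percentage for a list of indicator ratings."""
    points = {"yes": 1.0, "partially": 0.5, "no": 0.0}
    applicable = [r for r in ratings if r != "n/a"]
    if not applicable:
        return None  # nothing to score if every indicator is n/a
    return 100 * sum(points[r] for r in applicable) / len(applicable)

# Example: 12 "yes", 4 "partially", 1 "no" and 1 "n/a" across 18 indicators
# scores (12 + 4 * 0.5) / 17 = 82.4%.
ratings = ["yes"] * 12 + ["partially"] * 4 + ["no"] + ["n/a"]
print(f"{csc_score(ratings):.1f}%")  # -> 82.4%
```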

Chart 1: Standalone operations (Asia/Europe) - comparison between CSC scores (Aug 16; Feb 16; Aug 15). Operations shown: Georgia (DRC), Kosovo (DRC), Myanmar (DRC/DDG), Serbia (DRC), Ukraine (DRC), Vietnam (DDG). [Bar chart]

Chart 2: Central Africa - comparison between CSC scores (Aug 16; Feb 16; Aug 15). Operations shown: CAR (DRC), DR Congo (DRC), South Sudan (DRC/DDG), Tanzania (DRC), Uganda (DRC/DDG). [Bar chart]

Chart 3: CASWA region - comparison between CSC scores (Aug 16; Feb 16; Aug 15). Operations shown: Afghanistan (DDG), Afghanistan (DRC), Iran (DRC), Pakistan (DRC), Tajikistan (DRC). [Bar chart]

Chart 4: HOAY region - comparison between CSC scores (Aug 16; Feb 16; Aug 15). Operations shown: Ethiopia (DRC), Kenya (DDG), Kenya (DRC), Somalia (DDG), Somalia (DRC), Sudan (DRC), Yemen (DRC/DDG). [Bar chart]

Chart 5: MENA region - comparison between CSC scores (Aug 16; Feb 16; Aug 15). Operations shown: Iraq (DDG), Jordan (DRC), Lebanon (DRC), Libya (DDG), MENA region, Syria (DDG), Turkey (DDG), Turkey (DRC). [Bar chart]

Chart 6: West Africa region - comparison between CSC scores (Aug 16; Feb 16; Aug 15). Operations shown: Sahel Region (DDG), Burkina Faso (DRC/DDG), Guinea (DRC/DDG), Ivory Coast (DRC), Liberia (DRC), Mali (DRC/DDG), Niger (DRC/DDG), Nigeria (DRC/DDG). [Bar chart]