Module 5. Defining Evaluation Questions


Module Objectives

By the end of this module you will be able to:
1. Identify the three types of evaluation questions.
2. Identify what makes a good evaluation question.
3. Know the difference between typical performance evaluation and impact evaluation questions.
4. Break questions down into sub-questions.
5. Use the design matrix as an organizing tool.
6. Identify additional sources to tap for questions.
7. Use a results framework to develop questions.
8. Identify and apply criteria for prioritizing questions.

ADS References for this Module

ADS Reference | Topic or Issue
ADS 201.3.1.6 and 201.3.15 | PEs focus on descriptive and normative questions and others related to operational decision-making.
ADS 203.3.1 | Careful selection of evaluation questions to test fundamental assumptions underlying project design; thematic questions.
ADS 203.3.1 | Identify questions during program design.
ADS 203.3.1 | Link questions to required decisions.
ADS 203.3.16 | Consultation with in-country partners and beneficiaries is essential to ensure question relevance.
ADS 203.3.1.4 | Stakeholders consulted to assist in prioritizing questions.
ADS 203.3.1.4 | Identify a small number of key questions and specific issues answerable with empirical evidence.
ADS 203.3.1.5 | Evaluation reports to address all questions in the SOW.
ADS 203.3.1.5 | Statement of Work to include evaluation questions.

Types of Questions (R to R, Chapter 6)

1. Descriptive questions: represent what is.
2. Normative questions: compare what is to what should be.
3. Cause and effect questions: identify whether results have been achieved due to the intervention.

1. Descriptive Questions

- Represent what is, like a snapshot
- Seek to describe or understand a program or process, or attitudes towards it
- Are usually straightforward questions (who? what? where? when? how? how much/how many?)
- Often used to seek opinions from project participants or beneficiaries
- Performance evaluations typically contain descriptive questions

Examples: Descriptive Questions

- What do stakeholder groups see as the goals of the program?
- What are the primary activities of the program?
- Where has the program been implemented?
- Who received what services?

A Special Point

Questions about the extent of gains or changes over a period of time, whether concerning crop production, traffic flows, trade patterns, test scores, attitudes, or behaviors, are descriptive questions.

2. Normative (Comparative) Questions

- Ask: is what is what should be?
- Often use monitoring data
- Measure against previously established criteria (or norms)
- Often are questions about outputs or outcomes: Did the project meet the targets on specified indicators? Did the project conform to general standards?

Examples: Normative Questions

- Did the project spend as much as was budgeted?
- To what extent was the budget spent efficiently?
- To what extent was the target of vaccinating 80% of the nation's children met?
- To what extent was the program gender equitable?

Where to Find the Standards/Norms

- Program design documents
- Monitoring systems: indicators with targets
- Documented standards from USAID or other agencies
- Accreditation systems, blue-ribbon panels, professional organizations
- Should the evaluation team set them?

3. Cause and Effect Questions

- Identify whether results have been achieved due to the intervention
- Seek to determine what difference the intervention made
- Eliminate all other possible explanations
- Ask if the desired results have been achieved AND whether it is the intervention that caused those results
- Suggest before-and-after and with-and-without comparisons
- Impact evaluations focus on cause and effect questions

Examples: Cause and Effect Questions

- Did the three-country partnership strategy preserve the biodiversity of the affected area while sustaining livelihoods?
- As a result of the job training program, do participants have higher paying jobs than they otherwise would have?
- Did the increased tax on gasoline improve air quality?

What Do You Think?

Can performance evaluations address cause and effect questions?

Exercise 5-1

What type of question is it? If unclear, rewrite it to clarify.

Good Questions

- Are listed in priority order
- Link clearly to the evaluation purpose
- Are realistic in number and kind given the time and resources available
- Can be answered definitively
- Reflect stakeholder needs and interests, including those of women and minority groups
- Consider the timing relative to the program or policy cycle; they reflect the requirement for utility
- Are not two (or more) questions in one
- Are "to what extent", "how", or "what" questions rather than yes/no questions (unless normative)

Sub-questions Are Often Needed

How relevant was the intervention?
- To what extent did the project fit within USAID's country health strategy?
- To what extent was the intervention aligned with the host country government's priorities?
- Was the timing appropriate given the political context?

What were the characteristics of those served?
- What were the most prevalent age groups?
- To what extent was gender equity achieved?
- What were participants' most frequent health issues?
- Did we target the youth considered most at risk?

The Design Matrix (C-8: Evaluation Design Matrix, IPDET 2009)

The matrix states the evaluation purpose and general design, followed by one row per subquestion with these columns:

Page 1A: Question | Subquestions | Type of subquestion | Measure or Indicator | Target or Standard (if normative) | Baseline Data?

Page 1B: Data Source | Sample or Census | Data Collection Instrument | Data Analysis | Comments
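As a minimal sketch (not part of the IPDET materials; the example row content is hypothetical, and the field names simply mirror the matrix columns above), a design matrix row can be captured as a small data structure so each subquestion carries its full design trail:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class DesignMatrixRow:
    """One subquestion row of an evaluation design matrix (pages 1A and 1B)."""
    question: str
    subquestion: str
    subquestion_type: str                      # "descriptive", "normative", or "cause-and-effect"
    measure_or_indicator: str
    target_or_standard: Optional[str] = None   # filled in only for normative subquestions
    baseline_data: bool = False
    data_source: str = ""
    sample_or_census: str = ""
    collection_instrument: str = ""
    data_analysis: str = ""
    comments: str = ""

# Hypothetical example row, using a subquestion from the previous slide
row = DesignMatrixRow(
    question="How relevant was the intervention?",
    subquestion="To what extent did the project fit within USAID's country health strategy?",
    subquestion_type="descriptive",
    measure_or_indicator="Alignment of project objectives with strategy objectives",
    data_source="Project and strategy documents",
    sample_or_census="Census of project documents",
    collection_instrument="Document review checklist",
    data_analysis="Content analysis",
)
```

Keeping one record per subquestion makes it easy to spot gaps, for example a normative subquestion whose target_or_standard field is still empty.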

Some Sources of Evaluation Questions

What are some sources for evaluation questions?

OECD/DAC Principles: Evaluation Questions (R-4)

- Relevance
- Effectiveness
- Efficiency
- Sustainability
- Impact

(See note in handout.)

Logical Frameworks Generate Questions

Narrative Summary | Objectively Verifiable Indicator | Means of Verification | Assumptions

Increased standard of living in the eastern province | % of targeted population with higher standard of living 3 yrs after program completion | Sample survey | Economic and weather conditions remain stable

Literacy at the 6th grade level is improved | % of population with literacy rates at or above the 6th grade level | Annual standardized literacy test | Parents, teachers, and students believe literacy is important

Capacity for schools to teach reading and writing is improved | # of people trained; % of participants successfully completing the 2-week course | Sign-in sheets; course evaluations | Skills learned are applied in school settings

Recruit skilled instructors; train teachers; provide teaching materials | # of training sessions offered; % of teachers recruited with necessary skill sets | Competency assessments; program records | A pool of qualified instructors exists in-country

Identifying and Selecting Questions

Two phases:
- Divergent phase: develop a comprehensive list of questions
- Convergent phase: narrow down the list

Balance

A sound balance among evaluation questions, budget, and time is essential: the questions asked must fit the time and resources available to answer them.

Some Criteria for Prioritizing

- Does this fit the stated purpose of the evaluation?
- Will this accommodate key stakeholders?
- Who would use the information, and for what decision?
- Would the information possibly change the course of events?
- Does the question focus on a critical or major issue, or is it merely of passing interest?
- Is it feasible to adequately answer the question, given time and budget constraints?
- Would the evaluation be compromised if this question were dropped? Is it critical to the study's credibility?
- Others?

Matrix for Scoring and Ranking

Score each candidate question (columns: Question #1 through #5) against these criteria:
1. Of interest to key stakeholders?
2. Answerable?
3. Likely to yield information linked to a decision?
4. Critical to the study's credibility?
5. Likely to impact the course of events?
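The scoring-and-ranking step can be sketched in a few lines of code. This is an illustration only, not part of the module's materials: the 1-to-5 scale, the equal weighting of criteria, and all scores below are assumptions made up for the example.

```python
# The five prioritization criteria from the matrix above.
CRITERIA = [
    "Of interest to key stakeholders?",
    "Answerable?",
    "Likely to yield information linked to a decision?",
    "Critical to the study's credibility?",
    "Likely to impact the course of events?",
]

# Hypothetical scores (1 = weak, 5 = strong), one per criterion, in CRITERIA order.
scores = {
    "Q1": [5, 4, 5, 3, 4],
    "Q2": [2, 5, 2, 2, 1],
    "Q3": [4, 3, 4, 5, 3],
}

def rank_questions(scores):
    """Return question labels sorted by total score, highest priority first."""
    return sorted(scores, key=lambda q: sum(scores[q]), reverse=True)

print(rank_questions(scores))  # prints ['Q1', 'Q3', 'Q2']
```

A team might instead weight the criteria unequally (for example, doubling "answerable?" when the budget is tight); that is a one-line change to the key function.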

Which Would You Prefer?

- Evaluation SOW 1: Ethiopia
- Evaluation SOW 2: Kosovo (KCBS)

A Final Point

Defining evaluation questions should be a collaborative and iterative process that promotes later use of evaluation findings.

Review Questions

- What are the three types of evaluation questions?
- Which evaluation questions are typically found in performance evaluations? In impact evaluations?
- What are some good sources for evaluation questions?

Case Work

1. Using your project documents as a base, develop 3 major questions your group wants to ask in your evaluation, ensuring they are consistent with your evaluation purpose.
2. Identify the type of each question (D, N, or C&E).
3. Identify a few sub-questions as appropriate for each question.
4. Be ready to present your group's work.