PUBPOL 551: Measuring Social Impact: Advanced Program Design and Evaluation



Professor Mary Kay Gugerty
Office: Room 225, Parrington Hall
Phone:
Office Hours: Wednesday 1:30-3:30

This course covers advanced topics in evaluation for social sector organizations. The assumption is that students have completed previous work in organizational management and program evaluation. While the focus of the class will be on nonprofits and NGOs, the theory, tools, and techniques we will explore are applicable to any mission-based organization, regardless of tax status.

At the end of the course, students should be able to:
- Employ theory of change and design approaches to analyze and improve program design, as well as to develop evaluation designs.
- Understand the link between performance management, learning, and evaluation for social sector organizations.
- Understand and employ a variety of approaches to evaluating organization and program performance.
- Develop evaluation designs for complex social settings, including advocacy initiatives, collective impact, social impact funds, etc.
- Be attentive to and address issues of power, equity, and inclusion as they arise in evaluation efforts.

The class will rely on a mix of in-class case discussion, group exercises, and lecture/discussion. Each week we will have 1-2 readings that focus on theory or concepts and several readings that focus on applications. Please note that this course is deliberately constructed to avoid the topics covered in PUBPOL 529, Advanced Analysis. This includes a number of multivariate regression techniques (logit, probit) as well as quasi-experimental approaches to measuring impact (instrumental variables, propensity score matching, etc.).

Course materials

Most of the course readings will be available on the Canvas site. In addition, a coursepack will be available through the Harvard Business School; the link will be posted on the classroom Canvas website. There are three sources we will use throughout the course:

1. The Goldilocks Challenge, by Mary Kay Gugerty and Dean Karlan, forthcoming from Oxford University Press. Chapters will be posted online.
2. An online resource for evaluation approaches.
3. The Goldilocks Toolkit.

Missed Class Policy

If you need to miss class for a professional or personal reason, please let me know in advance. For the day you miss, you should submit a 2-3 page reflection paper on the week's readings/cases. This does not need to be a highly structured memo, but can be a basic reflection on the week's readings. What did you find useful about the readings? What new ideas did the readings stimulate? Do you agree with the authors' take on a particular issue? If you miss a class and don't let me know and complete this paper, the missed class will affect your participation grade. You may not do an Evaluation Approach memo the same week you miss class. Missed Class memos should be uploaded to the appropriate Canvas page under Assignments.

Academic Integrity

The Evans School and the University of Washington seek to uphold high standards of honesty, respect for others' viewpoints, and integrity in interactions and academic effort. The topics we cover in this course typically benefit greatly from collaboration and discussion, but for paper assignments the work you turn in should be your own. Students working on applied projects will turn in only one paper, but will evaluate each other through the peer evaluation process and are expected to contribute equally to the project. If you are participating in a group project, you may be using existing frameworks to undertake analysis in this class; use of frameworks should be acknowledged, but all analysis should be original.

Needs for Specific Accommodations

The university will provide reasonable accommodation for academically qualified students with disabilities so those students can participate fully in the university's educational programs. Any student requesting academic accommodation based on disability is required to register with Disability Resources for Students (DRS). Please inform me of your accommodation needs.

Evans School Conversation Norms

I want to remind us of the Evans School Community Conversation Norms. We will discuss these and build upon them for our own class. At the Evans School, we value the richness of our differences and how they can greatly enhance our conversations and learning. As a professional school, we also have a responsibility to communicate with each other inside and outside of the classroom in a manner consistent with conduct in today's increasingly diverse places of work. We hold ourselves individually and collectively responsible for our communication by:

- Listening carefully and respectfully
- Sharing and teaching each other generously
- Clarifying the intent and impact of our comments
- Giving and receiving feedback in a relationship-building manner
- Working together to expand our knowledge by using high standards for evidence and analysis

Assignments

1. Classroom participation (15%)

Participation includes contributing usefully in large class discussions, contributing in small breakout groups, contributing ideas or articles via the Canvas website, sharing resources with the group, and contributing on the day you complete your evaluation approach memo. Together we will construct norms for participation in the class. At the end of the quarter, students will also complete a self- and peer-evaluation form in which they will have an opportunity to reflect on their own and others' participation.

2. Brief memo on an evaluation approach of your choice, varied due dates (20%)

Throughout the early part of the quarter we'll be looking at a variety of approaches to evaluation; pick one that seems interesting to you and write a brief memo on it, no more than 2-3 pages single spaced. Describe the method, what it aims to accomplish, when you think it would be useful, an example of a situation in which you could imagine using it (or, if you can find an example of where it was used, you can briefly describe that evaluation), and a brief assessment of the advantages and disadvantages of the approach. You can start with the course readings, but for most methods you will need to do a little more reading. There are four possible dates to complete the memo; you need to complete only one memo. The memo itself is due at 3pm the Wednesday before class. The memo will be graded largely on analytic content rather than writing style, although clear and concise writing will facilitate deeper analysis. The four options are:

- Realist Evaluation & Qualitative Impact Assessment Protocol (QUIP), January
- Rapid Cycle Evaluation & A/B Testing, January
- Developmental Evaluation, January
- Empowerment Evaluation & Appreciative Inquiry, February 8

3. Evaluation Design, due February 26 (25%)

In this assignment you will develop an evaluation design for Moving to Progress, a hypothetical housing voucher program designed to help families move to higher-income neighborhoods in order to create better social, educational, and economic opportunities for children. The materials will be provided on the Assignments page. The assignment will be posted on February 19th and is due by midnight on February 26th on Canvas.

4. Final project, due March 15 (40%)

There are two ways to complete this project, which will be undertaken in small groups. We will discuss it in detail during Week 2, including interim dates for deliverables. For both projects, I will provide some time during some classes for groups to meet.

A. RFP Response for an Evaluation: Evaluation of Communities of Opportunity

I will provide you with an RFP for an evaluation bid issued by King County this past fall, along with associated materials. Your group will be responsible for developing a bid for the evaluation and making a short presentation to the committee awarding the contract during the final class. Materials for this assignment can be found on the Assignments page.

B. Evaluation consulting project

Students may work in small groups of up to four on a project for an organization identified by the students. This project could be a program design, evaluation design, or data-oriented performance management issue. The specific details of the project will be worked out between the students and the instructor. Please note that this should be a separate project from whatever you are working on as part of your capstone project.

PUBPOL 551 Class Outline

Week 1 (January 4): Perspectives on Evaluation, Evidence and Social Impact Measurement
Week 2 (January 11): Theory of Change; Contribution Analysis and Process Tracing as Evaluation Techniques; Realist Evaluation. Evaluation Approaches memo, Option 1
Week 3 (January 18): Design Thinking and Feedback Loops; Rapid Cycle Evaluation. Evaluation Approaches memo, Option 2
Week 4 (January 25): Process and Implementation Evaluation (Implementation Science); Evaluability Assessments
Week 5 (February 1): Developmental Evaluation; Evidence, Impact and Performance Management; Evaluating Quality of Evidence (I). Evaluation Approaches memo, Option 3
Week 6 (February 8): Participatory Approaches; Empowerment Evaluation & Appreciative Inquiry. Evaluation Approaches memo, Option 4
Week 7 (February 15): Measuring Social Impact & Social Impact Bonds; Data & Evidence Quality (II)
Week 8 (February 22): Grantmakers and Evaluation. Outcomes and Question Assignment due February 26
Week 9 (March 1): Evaluating Advocacy and Collective Impact
Week 10 (March 8): Presentations of projects
March 15: Final projects due

PUBPOL 551 Readings

PLEASE NOTE: The ultimate guide to what to read for each week is the weekly study guide posted in the folder for the week. I will likely make changes to the readings over the quarter.

January 4: Perspectives on Evaluation, Evidence and Social Impact Measurement

- Mary Kay Gugerty and Dean Karlan, The Goldilocks Challenge, Chapters 1-2.
- Jennifer Brooks, "Making the Case for Evidence-Based Decision-Making," Stanford Social Innovation Review.
- Carden, Fred and Marvin Alkin, "Evaluation Roots: An International Perspective," Journal of MultiDisciplinary Evaluation, 8(17). Just read and look at the Evaluation Tree.

Thinking about evaluations, position and power: In-class simulation, provided in class: Development in Dhankura. To prepare for the simulation, please read the following short PDFs, which provide a perspective for thinking about power and privilege. We'll consider how these concepts work in an evaluation setting in class through our simulation.

- Nieto, Leticia, Understanding Oppression: Strategies for Addressing Power & Privilege.

January 11: Theory of Change; Contribution Analysis and Process Tracing as Evaluation Techniques; Realist Evaluation & Qualitative Impact Assessment Protocol (QUIP)

Case: Casa Esperanza, Electronic Hallway

Theory of Change:
- Mary Kay Gugerty and Dean Karlan, The Goldilocks Challenge, Chapter 3.
- Goldilocks Toolkit: Guiding Your Program to Build a Theory of Change.
- Working with Assumptions in a Theory of Change Process.
- Irene Guijt (2013), TOC Reflections Notes 3, Theory of Change Portal.
- Theory of Change vs. Logical Frameworks.

Process Tracing, Contribution Analysis, Realist Evaluation & QUIP:
- Derek Beach, Process Tracing Methods: An Introduction.
- Befani, B. and J. Mayne, "Process Tracing and Contribution Analysis: A Combined Approach to Generative Causal Inference for Impact Evaluation," IDS Bulletin, November 2014.

- Read the Better Evaluation pages on Realist Evaluation and Qualitative Impact Assessment Protocol (QUIP).

Theory of Change Resources and Recommended Readings:
- Examples: Aspen Institute, Community Builder's Guide to Theory of Change. Read through the material on program theory.
- Carol Weiss, "Nothing So Practical as a Good Theory." A classic and the first articulation of key ideas about theory-based evaluation.

January 18: Design Thinking and Feedback Loops; Rapid Cycle Evaluation & A/B Testing

- Tim Brown & Jocelyn Wyatt, "Design Thinking for Social Innovation," Stanford Social Innovation Review, Winter 2010.
- The Field Guide to Human-Centered Design. Read the introduction and Mindsets sections, then look over the Methods section and select three tools you think you might use in your current work or in your class projects and read them in more detail. Read the entire Ideation section, but pay particular attention to the first four steps, plus Create Frameworks, Design Principles, and Storyboard.
- Christian, Brian, "Test Everything: Notes on the A/B Revolution," Wired, May 9.
- Scott Cody and Andrew Asher, "Proposal 14: Smarter, Better, Faster: The Potential for Predictive Analytics and Rapid-Cycle Evaluation to Improve Program Development and Outcomes," The Hamilton Project, Brookings Institution.
- Goldilocks Toolkit, Introduction to Rapid-Fire Operational Testing for Social Programs.

January 25: Process and Implementation Evaluation (Implementation Science); Evaluability Assessments

Cases:
- Protocol for the SPIRIT study, Implementation Science, and the SPIRIT study logic model.
- Block Watch Mini-Case (1.5 pages). (We'll use this in class to consider the design of an implementation evaluation.)

Readings:

- Vu Le, "Weaponized Data: How the Obsession with Data is Hurting Marginalized Communities."
- Review CDC, Types of Evaluation (2 pages).
- CDC Evaluation Brief: Evaluating Policy Implementation, No. 4, February (4 pages).
- Weiss, et al., "A Conceptual Framework for Studying the Sources of Variation in Program Effects," Journal of Policy Analysis and Management (JPAM).
- Diana Epstein and Jacob Alex Klerman, "When is a Program Ready for Rigorous Impact Evaluation? The Role of a Falsifiable Logic Model," Evaluation Review 36(5).
- Optional: Nilsen, Per, "Making sense of implementation theories, models and frameworks," Implementation Science, 10(15).

February 1: Evidence, Impact and Performance Management; Collecting High Quality Data; Developmental Evaluation

Case: Teen ACTION Evaluation, distributed in class

- Gugerty and Karlan, The Goldilocks Challenge: Right-Fit Evidence for the Social Sector:
  Chapter 4, The CART Principles (pages 61-77 in the PDF file)
  Chapter 5, CART for Monitoring
  Chapter 7, Collecting High Quality Data
- Le, Vu, "How the concept of effectiveness has screwed nonprofits and the people we serve." Blog post.
- Patrick Lester, "Defining Evidence Down," Stanford Social Innovation Review.
- Developmental Evaluation: A Primer. Read pages 11-24.

Recommended:
- Glennerster, Rachel and Kudzai Takavarasha, Running Randomized Evaluations, Chapter 3, Asking the Right Questions.
- Sasha Dichter, Tom Adams, & Alnoor Ebrahim, "The Power of Lean Data," Stanford Social Innovation Review, Winter.
- Dave Algoso, "Monitoring and Evaluation versus Management" and "Monitoring and Evaluation versus Feedback Loops," Praxis Blog.

February 8: Participatory Approaches & Feedback; Empowerment Evaluation & Appreciative Inquiry

Case: Teen ACTION Program Review, Teen ACTION Qualitative Evaluation

Readings:
- Michael Quinn Patton, Qualitative Methods for Evaluation, Chapter 3: Designing Qualitative Evaluations, Parts 1 and 2.
- SKIM: Robert Chambers, Whose Reality Counts?, Chapter 3, "Professional Realities."
- Community Toolkit, Participatory Evaluation.
- Ann Zukoski and Mia Luluquisen, "Participatory Evaluation: What is it? What are the challenges?"

Empowerment Evaluation & Appreciative Inquiry:
- Fetterman, David, "Steps of empowerment evaluation: From California to Cape Town," Evaluation and Program Planning. (Focus on the intro and the steps; don't worry about the case study info.)
- Read the Appreciative Inquiry page.
- Review the Developmental Evaluation reading from last week (pages 11-24).

February 15: Measuring Social Impact & Social Impact Bonds; Collecting High Quality Evidence

Case: Social Impact at the Robin Hood Foundation

1) Michael M. Weinstein, How the Robin Hood Foundation Estimates the Impact of Grants, November 12.
2) Ivy So & Alina S. Capanyola, "Impact Investing: How Impact Investors Actually Measure Impact," Stanford Social Innovation Review, May 16.
3) Srik Gopal & Lisbeth Schorr, "Getting Moneyball Right in the Social Sector," SSIR, June 2016. (You may want to review the reading from February 1 that responds to this article: Patrick Lester, "Defining Evidence Down," SSIR.)
4) Sample "evidence hierarchy" from the Edna McConnell Clark Foundation: Assessing an Organization's Evidence of Effectiveness (1-page handout).

Application: Asking Good Questions (Qualitative vs. Quantitative approaches)
- Asking Good Questions, handout, Mary Kay Gugerty.
- Asking Open-Ended Questions, handout, Mary Kay Gugerty.
- Review Chapter 7 of The Goldilocks Challenge, Collecting High Quality Data.

February 22: Evaluation and Foundations; More on Participation

- Vu Le, "Grantmakers, Funders: your grant application process may be perpetuating inequity."
- Julia Coffman and Tanya Beer, "How Do You Measure Up? Finding Fit Between Foundations and their Evaluation Functions."
- Packard Foundation, Grantee Perception Report. (Skim to see the kinds of questions asked; Packard is one of the few foundations that makes this report public.) If you are interested, here is the original foundation survey.
- Measuring Foundation Performance; just look at the selected pages.

Participation and Feedback (be sure to read the Moving to Progress case that is part of the assignment before class):
- Promoting the Participation, Learning and Action of Young People (UNICEF), tools starting on page 22.
- Just look at the Facilitator's Guide to Participatory Evaluation with Young People.
- Keystone Accountability: Constituent Voice.

Optional Resources on Foundations:
- Advancing Evaluation Practices in Philanthropy, Special Issue of SSIR. (Skim to get a sense of the different foundation approaches.)
- Coffman, Beer and Patrizi, "Benchmarking Evaluation in Foundations: Do We Know What We Are Doing?"
- Holley and Carr, "The Role of Evaluation in Strategic Philanthropy," Nonprofit Quarterly.
- Kellogg Foundation Evaluation Guide, Chapter 3, Levels of Evaluation.

March 1: Evaluating Advocacy & Collective Impact

Case: One America, Electronic Hallway

- Pathways for Change: 10 Theories to Inform Advocacy and Policy Change Efforts, Organizational Research Services (ORS).
- Julia Coffman and Ehren Reed, "Unique Methods in Advocacy Evaluation."
- Summary slides for the Collective Impact Forum's approach to Evaluating Collective Impact.

Optional:
- User Guide to Advocacy Evaluation (has helpful examples of outcomes and indicators).

Optional (these may be helpful for the final Communities of Opportunity project):
- FSG, Guide to Evaluating Collective Impact, Parts 1-3.
- Aspen Institute, Measuring Community Capacity (this is a pretty detailed guide).

March 8: Final Presentations