IMPACT ASSESSMENT & LEARNING FRAMEWORK

WHY DO WE ENGAGE IN IMPACT ASSESSMENT & LEARNING?

The Irvine Foundation's goals are to expand economic and political opportunity for families and young adults who are working but struggling with poverty. Impact assessment and learning (IA&L) helps the staff and board advance these goals in three ways:

- Guiding our strategy and grantmaking: It helps us decide where we invest and how, what the impact of our investments is, and when we need to shift to make sure that we are most effectively using our resources and talents.
- Supporting us in being accountable and transparent: It positions us to share, internally and externally, what we are doing and why, as well as what we are learning and how we are applying that learning.
- Enhancing our field impact: We share what we learn, successes and challenges alike, with the fields and sectors in which we work, especially our nonprofit, public sector, and philanthropic colleagues. We do so with the intent of informing their work and augmenting our impact (e.g., through cofunding, model replication, and scaling). We also look to others to guide our efforts and enhance our impact.

We define IMPACT ASSESSMENT & LEARNING as the processes we use to measure our progress and impact in meeting our goals. We collect a variety of quantitative and qualitative information through different methods, including pre-grant due diligence, grant monitoring, grantee site visits, grantee-funder convenings, targeted research, evaluation, community listening sessions, and grantee perception surveys. We regularly synthesize and reflect on the information we gather to inform decisions about our work and consider adjustments that will enhance our impact.

WHY AN IMPACT ASSESSMENT & LEARNING FRAMEWORK?

We developed the IA&L Framework to ensure that Irvine staff and board are clear about our approach to impact assessment and learning. This includes: (1) how we are holding ourselves accountable to assess our progress and impact against our portfolio goals and outcomes, and (2) how we will learn and apply that learning to inform decisions about our strategy on an ongoing basis.

This necessitates our engagement in two critical practices. First, strategy and impact assessment and learning must regularly inform each other. Strategy needs to inform how we assess our progress and learn; conversely, what we learn from assessing our progress needs to inform our strategy.

The second practice involves assessment activities feeding into an ongoing cycle of learning, reflecting, and adapting to facilitate continuous improvement. Staff are most deeply involved in this practice and, to different extents, collaboratively engage with grantees. In our learning approaches, we always aim to enhance grantees' effectiveness in reaching their goals in addition to informing our own efforts. In addition, the board and staff leadership periodically engage in learning in ways that build on this practice by staff.

[Diagram: an ongoing learning cycle involving grantees, staff, staff leadership, and the board, leading to enhanced impact.]

HOW ARE WE EVOLVING OUR IMPACT ASSESSMENT & LEARNING APPROACH?

Our IA&L Framework is influenced by best practices in evaluation and learning and builds on our own experiences. We are intentionally evolving our approach to match our new goals and grantmaking initiatives, as well as a desire to more closely integrate impact assessment and learning with strategy. The application of this framework is especially timely as we move into a new way of working with:

- Cross-functional initiative teams in which most portfolio staff work on more than one initiative
- Greater partnership with grantees to develop initiatives
- The launch of multi-year initiatives with specified time frames and outcomes
- A portfolio approach designed for greater impact than the sum of individual initiatives

With these changes, we have shifted our IA&L efforts in the following ways:

- Developing and implementing IA&L plans for all new initiatives at their launch
- Less emphasis on measuring past impact and greater emphasis on forward-looking information that informs the work of the Foundation and grantees
- Greater engagement of staff, leadership, the board, and grantees in continuous improvement processes
- Broadening and strengthening our listening and feedback practices with grantees and our ultimate beneficiaries

WHO IS ACCOUNTABLE FOR IMPACT ASSESSMENT & LEARNING?

The Foundation board, staff leadership, staff, and grantees all have a role in being accountable for impact assessment and learning.

- The Board has a significant accountability role: asking hard questions about what success looks like, ensuring we are most effectively using resources, and providing strategic advice, guidance, and decision making based on what we learn.
- Staff Leadership is accountable for ensuring that the organizational culture supports and incentivizes learning and continuous improvement, and that impact assessment and learning is integrated with strategy, adequately resourced, and prioritized.
- Staff are accountable for learning from initiative implementation, making adjustments that increase the odds of success, and ensuring that staff leadership and the board have the information they need for their accountability role.
- Grantees are accountable for laying out a plan of work and reporting on implementation, impact, and what they are learning (e.g., through grant proposals and reporting). We also ask grantees to be part of collective learning efforts with us.

HOW DO WE INCORPORATE FEEDBACK FROM GRANTEES AND BENEFICIARIES INTO OUR WORK?

We are accountable to our ultimate beneficiaries: Californians who are working but struggling with poverty. As a result, we are committed to broadening and strengthening our feedback practices: asking and listening, using what we hear to inform our work, and letting those we listen to know how we used what we learned.

This takes place in a number of ways, including feedback loops with grantees (A) through grantee perception surveys, engagement in strategy development, and grantee gatherings. We also support our grantees in their own feedback loops with those they serve (B) through our participation in the Fund for Shared Insight, support of Listen for Good grants, and other efforts. Lastly, we seek feedback from our ultimate beneficiaries (C) through efforts such as our recent Community Listening Sessions.

[Diagram: feedback loops. (A) grantee-to-foundation feedback; (B) beneficiary-to-grantee feedback; (C) beneficiary-to-foundation feedback.]

WHAT PRINCIPLES GUIDE IMPACT ASSESSMENT & LEARNING?

The following principles guide our IA&L efforts, both our approach and our partnerships. While these principles are relevant to IA&L efforts across the Foundation, they most frequently apply to our work within initiatives. These principles have been informed by discussions with the Irvine board, staff, and grantees, and they align with best practices from the field.

OUR APPROACH

- We are clear about the purpose and audience for the specific IA&L activity.
- We cultivate a learning culture that supports innovation, experimentation, reflection, flexibility, and adaptation.
- We use multiple data collection methods and gather information from diverse perspectives (e.g., grantees, beneficiaries, other key stakeholders). This allows us to learn about our own work, the relevant work of others, and the broader context that influences our impact.
- We prioritize gathering information that helps identify when adjustments need to be made in real time. This includes seeking to understand not only what is happening but why: the factors that are facilitating or impeding progress.
- We share what we learn with others, including grantees, partners, the field, and beneficiaries.

OUR PARTNERSHIPS

- We collaborate with our grantees and other partners to design and implement IA&L efforts. In doing so, we seek to: ensure that IA&L reflects the needs and priorities of grantees and the field; level or balance the power dynamic between Irvine as the funder and grantees to create strong and candid relationships; and increase our individual and collective capacity to gather and use data in the process. For all of our partnerships, we seek to use a diversity, equity, and inclusion approach.
- We make every effort to ensure that our grantee ask is appropriate given the amount and type of resources we provide and the usefulness of the information to be collected. We seek to avoid processes that are burdensome for staff, grantees, and beneficiaries. Where possible, we build on existing efforts (e.g., data grantees already collect) and streamline efforts (e.g., use information grantees provide to other funders).

HOW DO WE FOCUS OUR IMPACT ASSESSMENT & LEARNING EFFORTS?

We need to make decisions about when, where, and how to spend finite resources and staff time; this is especially the case when we make decisions about external evaluations. The following factors help us focus these efforts more effectively:

- Life cycle of an initiative. Information needs will vary and be heightened at key points of an initiative, such as when decisions need to be made to continue or exit an initiative, to renew or augment specific grants, or to increase resources for scaling. We will prioritize use of resources for evaluations that can inform critical decision points.
- Capacity. Our use of resources must also take into account the capacity of grantees and our partners to gather, use, and report data. We seek to take advantage of contexts in which quality data exist, while also recognizing that some situations may require strategic investments in capacity to yield evaluative information.
- Standard of proof. We recognize that not all data are created equal. Some types of evaluation yield rigorous assessment of the impact of program models, while others favor rapid feedback that can inform more adaptive initiatives. We prioritize evaluation approaches that are appropriate to the nature of specific interventions, their stage of development, and evidence of effectiveness from similar efforts.
- Measurability. Some things are harder to measure than others, and our efforts need to align with what can realistically be measured. For example, when assessing a career training program, discrete quantitative outcomes can be used to measure tangible impact on participants over time (e.g., change in hourly wage and career progression). In contrast, efforts to influence systems change, such as passage of a state policy, typically rely on proxy measures (e.g., framing and frequency of related discussions) to indicate progress toward the desired outcome (e.g., policy passage).

HOW DO WE PUT IMPACT ASSESSMENT & LEARNING INTO PRACTICE?

As part of impact assessment and learning, we collect and reflect on different types of information at the grant, initiative, and portfolio levels. This information is used in a variety of ways, not only within but across the three levels. For example, learning at the grant level informs learning at the initiative and portfolio levels (e.g., interim grant reports within a specific initiative can be reviewed to assess the extent to which outcomes are being achieved and the key challenges facing grantees' progress; a similar analysis can take place across all grants within the portfolio). We also collect and reflect on different elements of our internal operations to ensure that each one most effectively supports the implementation of our strategy and grantmaking.

GRANT
- Pre-grant due diligence of organizational capacity
- Grant goals, objectives, outcomes, and audience
- Grantee progress, changes in capacity, challenges, and lessons learned
- Extent to which objectives and outcomes are achieved
- Focus, type, and amount of grant; grantee organization type and size
- Learning insights from grants and key audiences for sharing

INITIATIVE
- Progress against initiative goals and outcomes
- Key initiative accomplishments, opportunities, and challenges
- Broader context and its implications for initiative strategy
- Lessons learned, reflections, and adjustments
- Summary of information about grantees, or a cluster of grantees, within a specific initiative

PORTFOLIO
- Portfolio accomplishments and impacts
- Reflections and adjustments for and across initiatives
- Feedback from those we seek to serve (community and grantees)
- Portfolio-level indicators that reflect the state of the state
- Summary of aggregate grantee-level data from Foundation Connect
- Broader context and its implications for portfolio strategy

INTERNAL OPERATIONS
- Grantmaking processes
- Finance
- Investments
- Communications
- IT
- Talent development and advancement
- Staff engagement
- Diversity, equity, and inclusion practices

WHAT DOES IMPACT ASSESSMENT & LEARNING LOOK LIKE ACROSS THE FOUNDATION?

From this point forward, all new initiatives will have an IA&L plan. This plan will be developed by the launch of each initiative and updated as each initiative evolves. At least annually, the plan will be reviewed for needed adjustments due to changing assessment and learning needs and opportunities.

Elements of IA&L Initiative Plans:
- Purpose and audience for IA&L
- Multiyear goals and outcomes of the initiative
- Assessment and learning questions/hypotheses
- Methods to answer assessment and learning questions/hypotheses: design/approach, including strengths and challenges; key relevant context; data collection tools and data sources
- Processes for regular data synthesis, sharing, reflection, and adaptation
- Key products and their use (e.g., for decision making, to inform the field, and to enhance grantee effectiveness)
- Grantee engagement (i.e., how grantees are involved in the IA&L design and implementation and how they and their beneficiaries will benefit from IA&L efforts)
- Capacity and resource needs: budget; staffing allocations and roles (program staff, IA&L staff, consultants); assessment of IA&L capacity for staff and grantees and needed supports
- Timeline

We also pursue IA&L efforts outside of new initiatives. We are currently examining how to strengthen existing IA&L processes at the grant, portfolio, and internal operations levels, and we have a variety of IA&L efforts underway for our current and culminating initiatives (e.g., Arts Engagement, Immigrant Integration, and Linked Learning). As we enter into our new way of working, we are also developing IA&L practices to reflect on the different phases that lead to the launch of new initiatives, including research and development, landscaping, and piloting.

[Diagram: Research & Development → Landscaping → Piloting → Initiative Launch]

WHAT ARE OUR IMMEDIATE PRIORITIES FOR IA&L?

Since it will take a few years to fully live into this framework, we have chosen some immediate priorities given our internal IA&L capacity, where we are in the launch of our new strategy, and the steps needed to operationalize the framework. These include:

- Cultivating a learning culture that supports innovation, experimentation, reflection, flexibility, and adaptation
- Building internal and external IA&L capacity on the part of the IA&L team, staff, the board, and grantees
- Harvesting and sharing internal and external learning and knowledge applicable to the work of Irvine, our grantees, and partners
- Applying a diversity, equity, and inclusion lens across IA&L policies, practices, and systems

As we move forward, as with other work within the Foundation, we expect to learn, reflect, and adjust our IA&L efforts. We look forward to collaborating and learning alongside our colleagues as they pursue similar efforts, and to being changed as a result. As always, we welcome input on our approach and practices as we seek for IA&L to contribute to improving the lives of Californians who are working but struggling with poverty.