BUILDING EVALUATION CAPACITY


EVALUATION REPORT: INITIAL TRAINING PERIOD
EXECUTIVE SUMMARY

Submitted To: HARTFORD FOUNDATION FOR PUBLIC GIVING, NONPROFIT SUPPORT PROGRAM
Submitted By: Anita M. Baker, Ed.D.
Anita Baker Consulting, 101 E. Blair Tr, Lambertville, NJ 08530
August 2007

THE BEC PILOT, 2006-2008

The Building Evaluation Capacity (BEC) project was initiated in the fall of 2006 by the Hartford Foundation for Public Giving's Nonprofit Support Program (NSP). BEC is a two-year project (2006-07, 2007-08) designed to provide comprehensive, long-term training and coaching to increase both evaluation capacity and organization-wide use of evaluative thinking for participating organizations. The program, adapted from the similar Bruner Foundation-sponsored Rochester Effectiveness Partnership (REP),[1] was developed and delivered by Anita Baker, Ed.D., an independent evaluation consultant, to 12 selected nonprofit organizations from the Hartford area. The Initial Training period for the pilot group included nine didactic sessions with opportunities to practice and apply new skills. It culminated with the development, by each participating organization, of a rigorous evaluation design for a selected program of their own, to be implemented during 2007-08, the second year of BEC.

NSP undertook the development of BEC because evaluation, as identified by a previous planning effort, was an area of organizational capacity the NSP was not yet addressing. Many organizations were requesting help with evaluation in response to requirements by their funders, be they government or foundations, to establish community outcomes for the work they were doing. Nonprofits were increasingly being asked to collect an array of data in response to numerous funders. It was felt that helping them not only collect better data, but also use those data to improve their performance, would be useful to them.

In its first year, BEC included comprehensive training about evaluation planning, data collection, analysis and use, as well as evaluative thinking assessment, for key staff from NSP grantee organizations. Each BEC organization sent two to four people to training, for a total of 36 hours of training per participant. Executive directors or other senior staff members were required to take part.[2] The purpose of the Initial Training for BEC was to inform participants about evaluation planning, data collection, and use, and to provide, through classroom learning, hands-on assignments, and project-based learning, opportunities for participants to apply what they learned.[3]

The BEC Pilot Group:
Amistad Center for Arts and Culture
Center City Churches
Community Renewal Team
Compass Youth Collaborative
Families in Crisis
Family Life Education
Governor's Prevention Partnership
Latino Community Services
MARC Inc., Manchester
Mercy Housing and Shelter Corp.
My Sisters' Place
Organized Parents Make A Difference

The first group of BEC participants was broadly representative of HFPG grantees. By design, it included both larger and smaller organizations (3 organizations had overall budgets less than $1,000,000, 4 had budgets between $1,000,000 and $2,000,000, and the rest had budgets greater than $2,000,000) that delivered different types of services (e.g., Afterschool Education, Health Services, Services to the Developmentally Disabled).

Notes:
[1] REP was a self-governing partnership of funders, nonprofit service provider organizations, and evaluation professionals committed to increasing knowledge and use of participatory program evaluation through comprehensive training and guided evaluation projects.
[2] Though team sizes varied from two to four members, every participant organization included at least one senior official capable of decision-making (eight of the twelve directly involved their Executive Directors in training). The organizations also involved individuals from various positions (e.g., Director of Programs, Grants Manager, Director of Operations, Special Project Coordinator) according to their own needs for training.
[3] In addition to the training sessions, the evaluation consultant/trainer also provided individual technical assistance for all participants as needed, via email or in person. This individual technical assistance was mostly conducted to help participants complete their homework or directly apply what they had learned in their own organizations (e.g., to revise an existing survey, assess existing data collection strategies, or review an evaluation design being proposed for one of their programs). In addition, a special training session was conducted for local consultants to clarify what BEC focused on and how training was delivered, to discuss evaluation basics and how to help build evaluation capacity in organizations, and to introduce them to the concepts of evaluative thinking.

The Initial Training took place from September 2006 through June 2007. It concluded with the successful development of evaluation designs and completion of evaluative thinking assessments by all participating organizations. The ultimate outcomes of BEC are expected to include enhanced knowledge about evaluation, enhanced skill in conducting evaluation and using evaluation findings (for decision-making and fund development), and enhanced ability to apply evaluation-related skills to multiple organizational tasks (i.e., increased knowledge about and use of evaluative thinking). During 2007-08 (Yr. 2), the pilot group will continue learning about evaluation and evaluative thinking, conduct their own evaluations, and develop action plans/responses to findings from their evaluations and their assessments of evaluative thinking at their organizations.

By all accounts, BEC's initial year was very productive. The participants from all 12 organizations attended regularly and demonstrated that they were learning about evaluation and thinking deeply about evaluative thinking. They gained or honed numerous evaluation-related skills, such as the ability to ask clear evaluation questions, design and select data collection methods, construct evaluation designs, and assess the presence or absence of evaluative thinking in numerous organizational areas. Most importantly, every organization conducted initial assessments of evaluative thinking, began to formulate action plans to enhance evaluative thinking, and developed evaluation designs for their selected programs. Plans are underway for those designs to be implemented by participants during 2007-08 (Yr. 2).

KEY FINDINGS: INITIAL TRAINING PERIOD

Participant feedback regarding BEC Training sessions was consistently positive. On a four-point scale ranging from Not So Good to Okay to Very Good to Excellent, most participants (75%-95%) rated each session as Excellent or Very Good. Similarly, most participants indicated that each session would help a lot with their BEC evaluation project, and about half to two-thirds of the participants reported that what they learned in the training would help a lot with their regular work. A substantial majority (88%) rated the training overall as Very Good or Excellent. Additionally, all participants indicated the Initial Training was at least somewhat worthwhile to them personally, and all indicated that BEC was worthwhile to their organizations. All but one participant described the trainer as Very Good or Excellent.

BEC Training Final Ratings
  Personally:            Not Worthwhile 0% | Somewhat Worthwhile 10% | Worthwhile 65% | Very Worthwhile 26%
  For the Organization:  Not Worthwhile 0% | Somewhat Worthwhile 7%  | Worthwhile 52% | Very Worthwhile 42%
  Trainer Overall:       Not So Good 0%    | Okay 3%                 | Very Good 39%  | Excellent 58%

NSP support was recognized and valued. Almost all BEC participants (97%) acknowledged NSP's role in BEC and described them as supportive (including 72% who described them as very supportive, and 25% who described them as supportive or somewhat supportive).

Participants developed important skills. As a result of the Initial Training, all participants demonstrated understanding of, or the ability to apply, key evaluation-related skills. By June 2007, all participants could do the following:
- Design evaluations: clarify the purpose, specify questions, select data collection methods, specify timelines and level of effort, and estimate the cost of evaluation
- Commission evaluation for their organizations
- Develop, assess, and use logic models
- Involve others in evaluation ("Ripple")
- Assess evaluative thinking in their organizations across 15 different capacity areas and think about responses/actions
- Document program implementation/service delivery (recruitment, retention, target populations, information tracking)
- Design surveys, identify and fix bad surveys, determine how many surveys are needed, and develop survey administration and analysis plans
- Design and conduct interviews, observations, and record reviews

While many participants came with knowledge of and experience with various BEC topics, they clearly enhanced and added to their knowledge through BEC. Most participants who were unfamiliar with training topics/skills reported that BEC helped them a little or a lot to develop important skills, especially specifying evaluation questions, developing evaluation designs, and selecting data collection methods. By the end of the Initial Training period, everyone was facile with a common language about evaluation, and every group demonstrated they could apply what they learned. Many of those entering with knowledge also reported learning more through BEC. The areas where participants indicated BEC had helped only a little will be reinforced throughout the second year of BEC.

Participants experienced BEC Initial Training as important. Through their responses and, more importantly, through their completion of assignments and ultimately development of evaluation designs, BEC Initial Training participants clearly indicated they were learning and developing skills. They also experienced BEC very positively and acknowledged its importance. All or almost all participants indicated that all key aspects of BEC were at least somewhat important. Specifically, 100 percent of the participants reported that opportunities to learn about evaluation were important (90% indicated they were Very Important), and 100 percent of participants indicated that opportunities for consultation were important (70% indicated they were Very Important). All but one participant indicated it was important to design an actual evaluation for a project, and to have opportunities to interact with colleagues in their own and other organizations.

Participants successfully developed evaluation designs. The final project for the Initial Training period was development of evaluation designs. These designs had to conform to standard professional evaluation practice, and they showed that BEC participants were able to apply what they learned. Each design described the subject program and why it was selected, specified evaluation questions, and specified which data collection strategies would be used to obtain data. The designs also included projections of level of effort (i.e., who would do each task and how much time, in days or hours, would be reserved for them), proposed timelines for evaluation activities (i.e., in which months/days/seasons evaluation activities would happen), and plans for use of the evaluation results. Each organization prepared specific plans to show how each data collection strategy would be used to answer each evaluation question. Participants were able to easily explain their design choices.

BEC participants understand and have begun to Ripple. In order for BEC to have the broadest impact, participants are required to extend, or "Ripple," what they learn through BEC. That was clear and desirable for NSP, the evaluation trainer, and all participant organizations at the outset. During the Initial Training period, pilot group participants were briefed about strategies for doing this. It is clear that BEC participants are thinking about Ripple. While they still need more help to accomplish this, all organizations have plans to approach Ripple through multiple strategies. Many have already begun to use their new skills for projects beyond their evaluations, and to involve others in evaluation processes. On the final survey, almost all participants indicated they had begun to address Ripple (61% reported they had done so a little, 32% reported they had done so some, and one participant indicated Ripple had been taking place a lot). Specifically, BEC participants are:
- Paying more attention to the need for systematic evaluation. "We have implemented a staff evaluation through a series of meetings. [This] had been a goal for quite some time but it was never realized."
- Modifying tools and data collection strategies to get better data. "We've reviewed and revised many forms, are using Excel for data collection and have provided staff training about evaluative thinking."
- Beginning to change their thinking and their evaluation-related practices. "Developing evaluations has caused us to think more purposefully about our program and what we want clients to get out of it."
- Reaching out and thinking about other people and other programs in their organizations. Participants were sharing information about the importance of evaluation, and were beginning to generate curiosity and interest about evaluation. "We've challenged workers to think about the programs they work within and to develop questions that [would help them know whether] clients are or are not benefiting from the program."

Evaluative Thinking is being enhanced through BEC. Evaluative thinking is a type of reflective practice that incorporates use of systematically collected data to inform organizational actions. As has been discussed several times during the BEC Initial Training period, evaluative thinking can be applied to many organizational functions (e.g., mission development, HR decision-making, communications/marketing) in addition to program

development and delivery. All BEC participant organizations have conducted initial and first follow-up assessments of Evaluative Thinking in their own organizations. On the Initial Training period final survey, 100 percent of participants indicated that participating in BEC had enhanced evaluative thinking in their organizations: 30 percent reported it had happened a little, and 70 percent reported that being in BEC had enhanced evaluative thinking in their organizations a lot. As one participant put it: "Evaluative thinking has allowed for more clearly focused attention in our organization. The importance of planning, strategies, and data management have been in the forefront as we seek to have answers to evaluation questions."

NEXT STEPS

During the second year of BEC, the participant organizations will be guided through the process of implementing the evaluation designs they developed during their Initial Training. This will provide the participants with an opportunity to genuinely use their skills and new knowledge to address questions/issues that they identify. Conducting an actual evaluation is the feature of this training that past participants in similar initiatives have acknowledged as the crucial element, and the component most likely to inspire continual learning and a view of evaluation that transcends external accountability. The second year will also provide ongoing opportunities to consider relationships between enhanced evaluation capacity and evaluative thinking, and to more seriously pursue strategies to Ripple BEC and enhance Evaluative Thinking.

As described above, training for 2007-08 (Yr. 2 for the pilot group) will mostly focus on helping the participants successfully complete the evaluations they designed. In addition, there will be an ongoing and intensified focus on evaluative thinking, especially how to develop action steps to enhance it, and more attention to extending (i.e., "Rippling") what participants have learned through BEC. The training format will be much different and will regularly include individual consultations with each group to go through the specifics of their projects, help them revise data collection strategies as needed, and, most importantly, help them conduct analyses and summarize findings. Year 2 also provides an opportunity for the participants to get exposure to some advanced topics and to focus on data analysis using data they have collected through their own evaluation efforts. By June 2008, the participants will be ready to present the findings from their own evaluations and to discuss evaluation challenges and accomplishments.

BEC is an evaluation capacity-building project that is also being evaluated. The BEC evaluation is a participatory evaluation commissioned by the NSP. Data are collected by participants and the evaluation trainer, as well as by other Foundation officials. All products developed during training sessions will be assessed, and the final evaluation reports will be graded by the evaluation trainer and an independent reader using a standardized point system. The BEC participants will also be required to present their evaluation designs and findings at the final conference (June 2008), and project assessments will be collected from selected conference attendees. The results of the BEC evaluations will inform decision-making regarding the future of BEC (e.g., whether to expand or discontinue it).
The results will also be included as part of a larger study being conducted by the Bruner Foundation to help compare and clarify productive strategies for building evaluation capacity and enhancing Evaluative Thinking.

ISSUES FOR FURTHER CONSIDERATION

The Initial Training period ended smoothly and everything is in order for 2007-08 (Yr. 2). The following, however, will deserve ongoing or initial attention as the project continues:
- Ensuring that participants get meaningful opportunities to analyze real data from their own organizations and successfully learn how to plan for and conduct evaluation data analyses.
- Developing productive strategies for supporting the needs and interests of participants with different skill levels.
- Helping participants stay focused on BEC evaluation learning, and especially their projects, while also managing other organizational demands.
- Helping participants deal with the rigor required to analyze evaluation data and summarize findings for external communication (i.e., develop evaluation reports).
- Pushing to embed/institutionalize evaluation capacity, and inspiring and supporting efforts to use multiple Ripple strategies (apply knowledge to other evaluation needs, involve others in evaluation, provide training to others about evaluation).
- Clarifying strategies for, and supporting efforts to, enhance Evaluative Thinking at BEC organizations.

With ongoing assistance and support from NSP, the evaluation trainer/consultant will continue to modify efforts and focus on the above to ensure BEC increases evaluation capacity and enhances evaluative thinking for participant organizations.

"Thanks to HFPG and [the trainer] for including us in the program. It is a tremendous resource and a valuable investment in our future ability to both design and collect accurate data, as well as use the data to improve services."