BUILDING EVALUATION CAPACITY
EVALUATION REPORT: INITIAL TRAINING PERIOD
EXECUTIVE SUMMARY

Submitted To: HARTFORD FOUNDATION FOR PUBLIC GIVING, NONPROFIT SUPPORT PROGRAM
Submitted By: Anita M. Baker, Ed.D.
August 2007

Anita Baker Consulting
101 E. Blair Tr
Lambertville, NJ 08530
THE BEC PILOT 2006-2008

The Building Evaluation Capacity (BEC) project was initiated in the fall of 2006 by the Hartford Foundation for Public Giving's Nonprofit Support Program (NSP). BEC is a two-year project (2006-07, 2007-08) designed to provide comprehensive, long-term training and coaching to increase both evaluation capacity and organization-wide use of evaluative thinking for participating organizations. The program, adapted from the similar Bruner Foundation-sponsored Rochester Effectiveness Partnership,1 was developed and delivered by Anita Baker, Ed.D., an independent evaluation consultant, to 12 selected nonprofit organizations from the Hartford area. The Initial Training period for the pilot group included nine didactic sessions with opportunities to practice and apply new skills. It culminated with the development, by each participating organization, of a rigorous evaluation design for a selected program of their own, to be implemented during 2007-08, the second year of BEC.

NSP undertook the development of BEC because a previous planning effort had identified evaluation as an area of organizational capacity that NSP was not yet addressing. Many organizations were requesting help with evaluation in response to requirements from their funders, whether government or foundations, to establish community outcomes for the work they were doing. Nonprofits were increasingly being asked to collect an array of data in response to numerous funders. It was felt that helping them not only collect better data, but also use those data to improve their performance, would be useful to them.
In its first year, BEC included comprehensive training about evaluation planning, data collection, analysis and use, as well as evaluative thinking assessment, for key staff from NSP grantee organizations.

The BEC Pilot Group: Amistad Center for Arts and Culture; Center City Churches; Community Renewal Team; Compass Youth Collaborative; Families in Crisis; Family Life Education; Governor's Prevention Partnership; Latino Community Services; MARC Inc., Manchester; Mercy Housing and Shelter Corp.; My Sisters' Place; Organized Parents Make A Difference

The first group of BEC participants was broadly representative of HFPG grantees. By design, it included both larger and smaller organizations (3 organizations had overall budgets less than $1,000,000, 4 had budgets between $1,000,000 and $2,000,000, and the rest had budgets greater than $2,000,000) that delivered different types of services (e.g., afterschool education, health services, services to the developmentally disabled). BEC organizations each sent two to four people to training, for 36 hours per person in total. Executive directors or other senior staff members were required to participate.2

The purpose of the Initial Training for BEC was to inform participants about evaluation planning, data collection, and use, and to provide, through classroom learning, hands-on assignments, and project-based learning, opportunities for participants to apply what they learned.3 This component of the BEC project took place from September 2006 through June 2007. It concluded with successful development of evaluation designs and completion of evaluative thinking assessments by all participating organizations.

The ultimate outcomes of BEC are expected to include enhanced knowledge about evaluation, enhanced skill in conducting evaluation and using evaluation findings (for decision-making and fund development), and application of evaluation-related skills to multiple organizational tasks (i.e., increased knowledge about and use of evaluative thinking). During 2007-08 (Yr. 2), the pilot group will continue learning about evaluation and evaluative thinking, conduct their own evaluations, and develop action plans/responses to findings from their evaluations and their assessments of evaluative thinking at their organizations.

By all accounts, BEC's initial year was very productive. The participants from all 12 organizations attended regularly and demonstrated that they were learning about evaluation and thinking deeply about evaluative thinking. They gained or honed numerous evaluation-related skills, such as the ability to ask clear evaluation questions, design and select data collection methods, construct evaluation designs, and assess the presence or absence of evaluative thinking in numerous organizational areas. Most importantly, every organization conducted initial assessments of evaluative thinking, began to formulate action plans to enhance evaluative thinking, and developed evaluation designs for their selected programs. Plans are underway for those designs to be implemented by participants during 2007-08 (Yr. 2).

KEY FINDINGS: INITIAL TRAINING PERIOD

Participant feedback regarding BEC Training sessions was consistently positive. On a four-point scale ranging from Not So Good, to Okay, to Very Good, to Excellent, most participants (75%-95%) rated each session as Excellent or Very Good. Similarly, most participants indicated that each session would help a lot with their BEC evaluation project, and about half to two-thirds of the participants reported that what they learned in the training would help a lot with their regular work. A substantial majority (88%) rated the training overall as Very Good or Excellent. Additionally, all the participants indicated the Initial Training was at least somewhat worthwhile to them personally, and all the participants indicated that BEC was worthwhile to their organizations. All but one participant described the trainer as Very Good or Excellent.

BEC Training Final Ratings
  Personally:            Not Worthwhile 0   Somewhat Worthwhile 10%   Worthwhile 65%   Very Worthwhile 26%
  For the Organization:  Not Worthwhile 0   Somewhat Worthwhile 7%    Worthwhile 52%   Very Worthwhile 42%
  Trainer Overall:       Not So Good 0      Okay 3%                   Very Good 39%    Excellent 58%

1 REP was a self-governing partnership of funders, nonprofit service provider organizations and evaluation professionals committed to increasing knowledge and use of participatory program evaluation through comprehensive training and guided evaluation projects.

2 Though team sizes varied from two to four members, every participant organization included at least one senior official capable of decision-making (eight of the twelve directly involved their Executive Directors in training). The organizations also involved individuals from various positions (e.g., Director of Programs, Grants Manager, Director of Operations, Special Project Coordinator) according to their own needs for training.

3 In addition to the training sessions, the evaluation consultant/trainer also provided individual technical assistance for all participants as needed, via email or in person. This individual technical assistance was mostly conducted to help participants complete their homework or directly apply what they had learned in their own organizations (e.g., to revise an existing survey, assess existing data collection strategies, or review an evaluation design being proposed for one of their programs). In addition, a special training session was conducted for local consultants to clarify what BEC focused on and how training was delivered, to discuss evaluation basics and how to help build evaluation capacity in organizations, and to introduce them to the concepts of evaluative thinking.
NSP support was recognized and valued. Almost all BEC participants (97%) acknowledged NSP's role in BEC and described them as supportive (including 72% who described them as very supportive, and 25% who described them as supportive or somewhat supportive).

Participants developed important skills. As a result of the Initial Training, all participants demonstrated understanding of, or ability to apply, key evaluation-related skills. By June 2007, all participants could do the following:
- Design evaluations: clarify the purpose, specify questions, select data collection methods, specify timelines and level of effort, and estimate the cost of evaluation
- Commission evaluations for their organizations
- Develop, assess and use logic models
- Involve others in evaluation (Ripple)
- Assess evaluative thinking in their organizations for 15 different capacity areas and think about responses/actions
- Document program implementation/service delivery (recruitment, retention, target populations, information tracking)
- Design surveys, identify/fix bad surveys, determine how many surveys are needed, and develop survey administration and analysis plans
- Design and conduct interviews, observations and record reviews

While many participants came with knowledge/experience about various BEC topics, they definitely enhanced and added to their knowledge through BEC. Most participants who were unfamiliar with training topics/skills reported that BEC helped them a little or a lot to develop important skills, especially specifying evaluation questions, developing evaluation designs and selecting data collection methods. By the end of the Initial Training period, everyone was facile with a common language about evaluation, and every group demonstrated they could apply what they learned. Many of those entering with knowledge also reported learning more through BEC. The areas where participants indicated BEC had helped only a little will be reinforced throughout the 2nd year of BEC.
Participants experienced BEC Initial Training as important. Through their responses and, more importantly, through their completion of assignments and ultimately the development of evaluation designs, BEC Initial Training participants clearly indicated they were learning and developing skills. They also experienced BEC very positively and acknowledged its importance. All or almost all participants indicated that all key aspects of BEC were at least somewhat important. Specifically, 100 percent of the participants reported that opportunities to learn about evaluation were important (90% indicated they were Very Important), and 100 percent of participants indicated that opportunities for consultation were important (70% indicated they were Very Important). All but one participant indicated it was important to design an actual evaluation for a project, and to have opportunities to interact with colleagues in their own and other organizations.
Participants successfully developed evaluation designs. The final project for the Initial Training period was the development of evaluation designs. These designs had to conform to standard professional evaluation practice, and they showed that BEC participants were able to apply what they learned. Each design described the subject program and why it was selected, specified evaluation questions, and specified which data collection strategies would be used to obtain data. The designs also included projections of level of effort (i.e., who would do each task and how much time, in days or hours, would be reserved for them), proposed timelines for evaluation activities (i.e., in which months/days/seasons evaluation activities would happen), and plans for use of the evaluation results. Each organization prepared specific plans to show how each data collection strategy would be used to answer each evaluation question. Participants were able to easily explain their design choices.

BEC participants understand and have begun to Ripple. In order for BEC to have the broadest impact, participants are required to extend, or "Ripple," what they learn through BEC. That was clear and desirable for NSP, the evaluation trainer, and all participant organizations at the outset. During the Initial Training period, pilot group participants were briefed about strategies for doing this. It is clear that BEC participants are thinking about Ripple. While they still need more help to accomplish this, all organizations have plans to approach Ripple through multiple strategies. Many have already begun to use their new skills for projects beyond their evaluations, and to involve others in evaluation processes. On the final survey, almost all participants indicated they had begun to address Ripple (61% reported they had done so a little, 32% reported they had done so some, and 1 participant indicated Ripple had been taking place a lot).
Specifically, BEC participants are:
- Paying more attention to the need for systematic evaluation. "We have implemented a staff evaluation through a series of meetings. [This] had been a goal for quite some time but it was never realized."
- Modifying tools and data collection strategies to get better data. "We've reviewed and revised many forms, are using Excel for data collection and have provided staff training about evaluative thinking."
- Beginning to change their thinking and their evaluation-related practices. "Developing evaluations has caused us to think more purposefully about our program and what we want clients to get out of it."
- Reaching out and thinking about other people and other programs in their organizations. Participants were sharing information about the importance of evaluation, and were beginning to generate curiosity and interest about evaluation. "We've challenged workers to think about the programs they work within and to develop questions that [would help them know whether] clients are or are not benefiting from the program."

Evaluative Thinking is being enhanced through BEC. Evaluative thinking is a type of reflective practice that incorporates use of systematically collected data to inform organizational actions. As has been discussed several times during the BEC Initial Training period, evaluative thinking can be applied to many organizational functions (e.g., mission development, HR decision-making, communications/marketing) in addition to program
development and delivery. All BEC participant organizations have conducted initial and first follow-up assessments of evaluative thinking in their own organizations. On the Initial Training period final survey, 100 percent of participants indicated that participating in BEC had enhanced evaluative thinking in their organizations: 30 percent reported it had happened a little, and 70 percent reported that being in BEC had enhanced evaluative thinking in their organizations a lot. "Evaluative thinking has allowed for more clearly focused attention in our organization. The importance of planning, strategies, and data management has been in the forefront as we seek to have answers to evaluation questions."

NEXT STEPS

During the 2nd year of BEC, the participant organizations will be guided through the process of implementing the evaluation designs they developed during their Initial Training. This will provide the participants with an opportunity to genuinely use their skills and new knowledge to address questions/issues that they identify. Conducting an actual evaluation is the feature of this training that past participants in similar initiatives have acknowledged as the crucial element, and the component most likely to inspire continual learning and a view of evaluation that transcends external accountability. The 2nd year will also provide ongoing opportunities to consider relationships between enhanced evaluation capacity and evaluative thinking, and to more seriously pursue strategies to Ripple BEC and enhance evaluative thinking.

As described above, training for 2007-08 (Yr. 2 for the pilot group) will mostly focus on helping the participants successfully complete the evaluations they designed. In addition, there will be an ongoing and intensified focus on evaluative thinking, especially how to develop action steps to enhance it, and more attention to extending (i.e., "Rippling") what participants have learned through BEC.
The training format will be quite different and will regularly include individual consultations with each group to go through the specifics of their projects, help them revise data collection strategies as needed, and, most importantly, help them conduct analyses and summarize findings. Year 2 also provides an opportunity for the participants to get exposure to some advanced topics and to focus on data analysis using data they have collected from their own evaluation efforts. By June 2008, the participants will be ready to present the findings from their own evaluations and to discuss evaluation challenges and accomplishments.

BEC is an evaluation capacity building project that is itself being evaluated. The BEC evaluation is a participatory evaluation commissioned by NSP. Data are collected by participants and the evaluation trainer, as well as by other Foundation officials. All products developed during training sessions will be assessed, and the final evaluation reports will be graded by the evaluation trainer and an independent reader using a standardized point system. The BEC participants will also present their evaluation designs and findings at the final conference (June 2008), and project assessments will be collected from selected conference attendees. The results of the BEC evaluations will inform decision-making regarding the future of BEC (e.g., whether to expand or discontinue it). The results will also be included as part of a larger study being conducted by the Bruner Foundation to help compare and clarify productive strategies for building evaluation capacity and enhancing evaluative thinking.
ISSUES FOR FURTHER CONSIDERATION

The Initial Training period ended smoothly and everything is in order for 2007-08 (Yr. 2). The following, however, will deserve ongoing or initial attention as the project continues:
- Ensuring that participants get meaningful opportunities to analyze real data from their own organizations and successfully learn how to plan for and conduct evaluation data analyses.
- Developing productive strategies for supporting the needs and interests of participants with different skill levels.
- Helping participants stay focused on BEC evaluation learning, and especially their projects, while also managing other organizational demands.
- Helping participants deal with the rigor required to analyze evaluation data and summarize findings for external communication (i.e., develop evaluation reports).
- Pushing to embed/institutionalize evaluation capacity, and inspiring and supporting efforts to use multiple Ripple strategies (apply knowledge to other evaluation needs, involve others in evaluation, provide training to others about evaluation).
- Clarifying strategies for, and supporting efforts to, enhance evaluative thinking at BEC organizations.

With ongoing assistance and support from NSP, the evaluation trainer/consultant will continue to modify efforts and focus on the above to ensure BEC increases evaluation capacity and enhances evaluative thinking for participant organizations.

"Thanks to HFPG and [the trainer] for including us in the program. It is a tremendous resource and valuable investment in our future ability to both design and collect accurate data, as well as use the data to improve services."