REACHING RESULTS
LEARNING FROM LOGIC MODELS: AN EXAMPLE OF A FAMILY/SCHOOL PARTNERSHIP PROGRAM
HARVARD FAMILY RESEARCH PROJECT


LOGIC MODEL DEFINED

A logic model provides the basic framework for an evaluation. It is a graphic that describes a program or organization in evaluation terms. It illustrates a program's theory of change, showing how day-to-day activities connect to the results or outcomes the program is trying to achieve. Similar to a flowchart, it lays out program activities and outcomes in boxes and uses arrows to connect the boxes, showing how the activities and outcomes relate to one another.

Developing a logic model should be one of the first steps in an evaluation. Once the model is completed, the evaluation can be designed to determine whether the program is working as shown in the logic model. The logic model can also become a tool for learning when evaluation data are applied directly to the model.

A logic model can be created by anyone with knowledge of the program or organization to be evaluated. It is helpful for program personnel and evaluators to work together to create a logic model: program personnel can offer the expertise needed to describe the program and its intended results accurately, and evaluators can help translate this knowledge into evaluation terms.

The advantages of graphically displaying a program's activities and outcomes in a logic model, as opposed to simply defining and listing them, are at least threefold. First and foremost, there is power in visual representation; visual displays and graphics are proven and effective learning instruments. Second, a logic model ensures that a program's process is not overlooked in an evaluation; the model makes the evaluator accountable for looking at both program process and outcomes. Third, a logic model can enhance the process of learning through evaluation.
As data are collected, the logic model can be used to put the data in perspective, examine the theory that underlies the program, and make midcourse corrections if needed.

USING THIS BRIEF

This brief offers a step-by-step approach for developing and using a logic model as a framework for a program or organization's evaluation. Its purpose is to provide a tool to guide evaluation processes and to facilitate practitioner and evaluator partnerships. The brief is written primarily for program practitioners, but it is also relevant and easily applied for evaluators.

The example of a hypothetical program, the Family Involvement Project (FIP), is used throughout the text to provide a realistic context and to help clarify each of the steps described. In addition, a completed logic model for the program is presented, from which examples throughout the text are taken. Before you begin the brief, examine the FIP logic model that follows carefully. Note whether the model adds to your understanding of the program and whether it helps you understand how the program's process connects to its outcomes. This logic model example should not only give you an idea of what a completed model looks like, but should also illustrate the value of using the logic model format.

FAMILY INVOLVEMENT PROJECT (FIP): HYPOTHETICAL PROGRAM EXAMPLE

Schools and community-based organizations are implementing family involvement programs throughout the country. Sponsors of these programs, as well as the implementing organizations themselves, share an interest in understanding and learning from the outcomes of these efforts so they can identify the most effective approaches or program designs. Implementing organizations are especially interested in ways to improve their program strategies or activities so they can increase the likelihood of achieving their desired outcomes.

The Family Involvement Project (FIP) is a national organization with a vision of improving outcomes for children by increasing family members' (primarily parents') involvement in their children's education. To achieve this, FIP recruits and trains parent leaders in communities through a series of workshops. The workshops teach parent leaders how to get involved in their children's education and how to train other parents in their community to get involved. FIP then supports ongoing training in communities by providing parent leaders with technical assistance on training and by disseminating materials on family involvement to all parents who go through the workshops.

FIP also works within communities to build a system of family involvement and training that is sustainable over time. FIP builds relationships with schools to ensure that family involvement is welcomed and supported. In addition, FIP builds coalitions of local organizations interested in sustaining and building a family involvement agenda within their community.
FAMILY INVOLVEMENT PROJECT (FIP) LOGIC MODEL

Inputs -> Activities -> Short-Term Outcomes -> Long-Term Outcomes

Resources (funding; national and local program staff; parents)
  Activities: recruit and train parent leaders; provide technical assistance to parent leaders
  Short-term outcome: increase the number of parents in communities who are trained by parent leaders on parent involvement
  Long-term outcome: increase involvement of parents associated with FIP in their children's education

Collaborations (school administrators and teachers; local organizations; national organizations)
  Activities: develop and disseminate materials on parent involvement to parents; build relationships with schools; develop coalitions of local parent involvement programs and organizations
  Short-term outcomes: increase parent knowledge of how to become involved in their children's education; build or strengthen local collaborations at each site to promote and sustain parent involvement
  Long-term outcome: increase parents' levels of involvement in their children's education (parents work closely with school systems and seek leadership positions)

Planning (evaluation; program sustainability)
  Activities: disseminate FIP materials to local and national funders and organizations; build relationships with potential local or national long-term funders
  Short-term outcome: increase awareness of and commitment to FIP and its sustainability
  Long-term outcome: establish FIP sustainability

STEP 1: DETERMINE THE APPROPRIATE SCOPE FOR THE LOGIC MODEL.

The first step in logic model construction is to determine the appropriate scope for your model. Decide whether your logic model should focus on a specific component of your work or broadly cover the entire program or organization. Your answer should be driven by your evaluation or information needs.

For example, consider an issue that FIP, our hypothetical organization, might experience. Program staff could find that parent leaders are not recruiting and training as many parents in their communities as originally estimated. This could stem from a number of issues, including a problem with the content of FIP's training for parent leaders, a problem with the recruitment criteria for parent leaders, or unrealistic expectations on the part of FIP staff about parents' capacity and availability for recruitment and training. To answer this question, staff can construct a logic model like the one below that is smaller in scope than the full FIP model and lays out the parent training component in detail. This model will help them design a system for collecting data to determine the source of their specific problem and identify possible solutions.

Inputs: secure funding to conduct training; hire and train FIP staff; develop training curricula; recruit parent leaders for training
Activities: conduct trainings for parent leaders; provide technical assistance to parent leaders
Short-term outcomes: develop the capacity of parent trainers to recruit other parents; develop the capacity of parent trainers to train other parents
Long-term outcome: develop a cadre of parent trainers who are actively training parents in their community

STEP 2: IDENTIFY YOUR MODEL'S COMPONENTS.

Working within your chosen scope, begin to construct your model's main components, that is, the information that will go in the boxes of your model.
This is the most time-intensive part of the process, but, as shown below, it can be done in stages by starting with basic components and adding more detail later. Use organizational documents you already have to help you construct your components. Refer to strategic planning documents, mission statements, grant proposals, work plans, recruitment announcements, marketing and public relations materials, training materials, or publications. Any document that describes the work you do will be helpful.

Start at a basic level by identifying your model's core components and their relationships. Starting with the four components described below (inputs, activities, short-term outcomes, and long-term outcomes) will help you clarify the overall structure of your model. Their corresponding parts from the FIP logic model serve as examples.

Inputs: What you do to make your program's implementation possible. These are the plans or resources you develop, or the steps you take, to prepare for or support your program's implementation.

Activities: What happens during the implementation of your program.

Short-term outcomes: The direct results of your program activities. They indicate a measurable change, and the language used often starts with "increase" or "decrease."

Long-term outcomes: Changes in individual or group behavior or in community conditions that a program hopes to achieve over time. Short-term outcomes contribute to the achievement of long-term outcomes, but other factors may contribute as well. It is important to remember, however, that programs typically are accountable for demonstrating success or progress in achieving long-term outcomes. As a result, long-term outcomes should be measurable and as specific as possible.
Inputs: resources; collaborations; planning

Activities: recruit and train parent leaders; provide technical assistance to parent leaders; develop and disseminate materials on family involvement to parents; build relationships with schools; develop coalitions of local family involvement programs and organizations; disseminate FIP materials to local and national funders and organizations; build relationships with potential local or national long-term funders

Short-term outcomes: increase the number of parents in communities who are trained by parent leaders on parent involvement; increase parent knowledge of how to become involved in their children's education; build or strengthen local collaborations at each site to promote and sustain parent involvement; increase awareness of and commitment to FIP and its sustainability

Long-term outcomes: increase involvement of parents associated with FIP in their children's education; increase parents' levels of involvement in their children's education (parents work closely with school systems and seek leadership positions); establish FIP sustainability

Choose the order for identifying the components of your model that works best for you. You can start with your inputs and move toward your outcomes, or vice versa. For example, if you have gone through a process like strategic planning in which you have already identified your long-term outcomes, you may want to start there and work back toward your activities and inputs.

Now that you have defined the model's basic components, consider adding more detail. If you are already satisfied that you have captured everything you need, move on to the next step. If you feel you need more detail to explain the program better, consider adding any of the following components to your model. Although these components are not included on the full FIP model, the examples given with each illustrate what they might be if added.

Outputs: These fall between activities and short-term outcomes. They are the goods and services generated by program activities and are the link between program activities and short-term outcomes. FIP example: the number of parent leaders trained.

Contextual variables: Factors that may or may not be under your control, but that could affect your program's implementation and/or the achievement of your outcomes. FIP examples: staff turnover, funding delays, transportation access.

Intermediate outcomes: These fall between short-term and long-term outcomes. You might have several intermediate outcomes that are important to achieve before your longer-term outcomes are even possible. FIP example: incremental changes in how parents build up their levels of involvement over time.

Impacts: These come after long-term outcomes and typically refer to even broader-level change. It is usually impossible to demonstrate or prove scientifically that your program caused the desired change; however, a plausible case can often be made that the program contributed to the desired impact. FIP example: better outcomes for children that result from increased family involvement.

STEP 3: DRAFT THE LOGIC MODEL.

Once you have identified the components of your model, the hard part is over. The next step is to put what you have done in graphic form, drawing boxes around the components and attaching arrows to show the relationships between them.
Consider these criteria about the look of your logic model as you develop it:1

- Do not use abbreviations.
- Make your model greater in length than height (about 50 percent wider than tall).
- Write words left to right, not top to bottom.
- Use a basic typeface and avoid italics.
- Use thin lines and do not vary their thickness.
- Use rectangles, not circles.

When you have finished a draft of the model, ask others to review it for accuracy and readability. Refine and revise it until both you and those who have provided feedback are satisfied.

1 Adapted from Tufte, Edward. (1983). The visual display of quantitative information. Cheshire, CT: Graphics Press.
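The box-and-arrow structure described above can also be captured in a simple data structure, which is handy for checking that a drafted model reads cleanly from left to right. The sketch below is a hypothetical illustration (it is not part of the brief's method, and the component names are abbreviated from the FIP example): each component is assigned to a stage, and each arrow is verified to point from an earlier stage to a later one.

```python
# Hypothetical sketch: a logic model as a directed graph.
# Component names are shortened from the FIP example; the
# representation itself is an illustration, not a prescription.

STAGES = ["inputs", "activities", "short-term outcomes", "long-term outcomes"]

# Each component (box) is assigned to one stage of the model.
components = {
    "resources": "inputs",
    "recruit and train parent leaders": "activities",
    "increase # of parents trained by parent leaders": "short-term outcomes",
    "increase involvement of FIP parents": "long-term outcomes",
}

# Each arrow connects a component to the component it leads to.
arrows = [
    ("resources", "recruit and train parent leaders"),
    ("recruit and train parent leaders",
     "increase # of parents trained by parent leaders"),
    ("increase # of parents trained by parent leaders",
     "increase involvement of FIP parents"),
]

def check_arrows(components, arrows):
    """Verify every arrow points from an earlier stage to a later one."""
    for source, target in arrows:
        if STAGES.index(components[source]) >= STAGES.index(components[target]):
            return False
    return True

print(check_arrows(components, arrows))  # True: the model reads left to right
```

A check like this simply enforces the drafting criteria above (left-to-right flow); the substance of the model still comes from the program knowledge gathered in Steps 1 and 2.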

STEP 4: USE YOUR LOGIC MODEL AS AN EVALUATION FRAMEWORK.

Now that you have your logic model, you can begin to use it as a framework for your evaluation. Because the focus of this brief is on logic models and not on evaluation principles and techniques,2 the implementation of these next steps is described only briefly below. While an evaluator will typically lead the implementation of these steps, this should be a participatory process, with program personnel providing input to guide the evaluator's direction.

Develop indicators for your logic model components. Indicators are measures used to determine whether the boxes, or components, in your logic model have been achieved. When applied and interpreted together, they will help you determine whether your program is operating as shown in your logic model. Develop indicators for each component on the logic model.3 You will need multiple indicators for each component of your model in order to understand fully whether the component has been achieved. Multiple indicators will also strengthen an argument you may need to make later that your program is working as shown in the logic model.

Use process indicators to measure your activities. For example, to look at the FIP activity of providing technical assistance to parent leaders, these indicators can be used:

- Number of technical assistance requests received
- Number of technical assistance requests answered
- Number of types of technical assistance provided
- Satisfaction ratings from parent leaders on the technical assistance provided

Use outcome indicators to measure your short- or long-term outcomes.
For example, to measure the FIP long-term outcome of increasing involvement of parents associated with FIP in their children's education, the following indicators can be used:

- Number of parents who report dedicating more time to their children's education after receiving FIP training
- Number of parents who report increasing their levels of involvement in their children's education after receiving FIP training
- Number of different types of involvement reported by parents (e.g., helping children with work at home, volunteering in the classroom, getting involved in school reform)
- Number of teachers who report increases in parent involvement after FIP training

In addition to process and outcome indicators, you may need to develop indicators to track the relationships between the components on your logic model. These will help you determine whether the arrows you drew are accurate and meaningful. For example, indicators to determine whether increasing parent knowledge of how to become involved in their children's education can lead to increased involvement may include the following:

- Number of findings showing a relationship between FIP parents' knowledge of involvement and involvement in their children's education
- Number of findings from the research literature showing a positive relationship between parent knowledge of involvement and involvement in their children's education

2 For a more in-depth discussion of evaluation techniques, see the references cited under Additional Resources.
3 For a more in-depth discussion of the types of indicators, see another brief in the Reaching Results series: Horsch, Karen. (1997). Indicators: Definition and use in a results-based accountability system. Cambridge, MA: Harvard Family Research Project.

- Number of findings from the research literature showing no relationship between parent knowledge of involvement and involvement in their children's education
- Number of parents who report that they increased their involvement because they became more aware of different ways of becoming involved

Keep in mind that not all indicators are created equal. While you can likely generate a long list of possible indicators for each component on your logic model, some will make more sense for you to track than others. For example, some will require fewer resources, or you might be able to use a single indicator for multiple components on your model. Consider these questions as you choose your indicators:4

- Is the indicator relevant? Does it enable you to know about the expected result or condition?
- Is the indicator defined, and are data collected, in the same way over time?
- Will data be available for the indicator?
- Are data for the indicator currently being collected, or can cost-effective instruments for data collection be developed?
- Will the indicator provide sufficient information about a condition or result to convince both supporters and skeptics?
- Is the indicator quantitative?

Track your indicators. Once you have identified your indicators, you are ready to determine the data sources and methods for tracking them and collecting the information you need. Common data sources include program records, program participants, community members, and trained observers. Common methods include document and record review, questionnaires, surveys, interviews, focus groups, and checklists. Remember that you might already be collecting some of the data needed to track your indicators. For example, a review of FIP program records would likely reveal that staff are already collecting useful data, such as the number of parents trained by each parent trainer or measures of parents' satisfaction with the trainings.
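As a rough illustration of what tracking looks like in practice, indicator data pulled from program records can be tallied against the logic model component they measure. The sketch below is hypothetical (the brief prescribes no particular tooling, and the counts are invented): it stores two of the process indicators for the FIP technical-assistance activity and computes the share of requests answered.

```python
# Hypothetical sketch: applying indicator data to a logic model component.
# The indicator names follow the FIP technical-assistance example above;
# the numbers are invented for illustration.

indicator_data = {
    "provide technical assistance to parent leaders": {
        "requests received": 42,
        "requests answered": 39,
    },
}

def answer_rate(component):
    """Share of technical assistance requests answered, one rough
    signal of whether this activity operates as shown in the model."""
    data = indicator_data[component]
    return data["requests answered"] / data["requests received"]

rate = answer_rate("provide technical assistance to parent leaders")
print(f"{rate:.0%}")  # 93%
```

A single ratio like this is only one of the multiple indicators the brief recommends per component; it becomes meaningful when interpreted alongside the others.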
STEP 5: REVISIT YOUR MODEL OFTEN AND USE IT AS A LEARNING TOOL.

At regular intervals in your data collection process, apply your indicator data to your logic model. Lay the data out directly on the model so you can get a complete picture of whether your program is working as intended. Determine which parts of the model are working well and which are not, and whether you need to make programming changes or revise the model to portray your program more accurately.

In addition, periodically use your data to revisit and reexamine your overall theory of change. You may find that you need to modify some of the assumptions on which you based your original model and that, as a result, the model needs to be revised. For example, consider again the FIP logic model. As described in Step 1, the program could be experiencing the problem that parent leaders are not recruiting and training as many parents in their communities as originally estimated. The evaluation may uncover the finding that parent leaders need support in their outreach efforts beyond the technical assistance that FIP staff members provide. In response, FIP staff may decide to add the development of peer networks in communities to their activities. The addition of this component will change the overall program theory and, therefore, the logic model.

4 Horsch, Karen. (1997). Indicators: Definition and use in a results-based accountability system. Cambridge, MA: Harvard Family Research Project.

The important point is to use the logic model as a learning tool throughout the evaluation. Do not set it aside once the model is completed and the evaluation designed. When your data are applied directly to your model, you will find the data easier to interpret and the findings easier to apply and use. The goal is to use the logic model as a feedback and learning tool, with the model initially informing the data and the data ultimately informing the model.

ADDITIONAL RESOURCES

Developing and Using Logic Models/Theories of Change for Evaluation

Connell, James, & Kubisch, Anne. (1996). Applying a theories of change approach to the evaluation of comprehensive community initiatives: Progress, prospects and problems. New York, NY: The Aspen Institute, Roundtable on Comprehensive Community Initiatives for Children and Families.

Julian, David. (1997). The utilization of the logic model as a system level planning and evaluation device. Evaluation and Program Planning, 20(3), 251-257.

Milligan, Sharon, Coulton, Claudia, York, Peter, & Register, Ronald. (1996). Implementing a theories of change evaluation in the Cleveland Community Building Initiative. Cleveland, OH: Center on Urban Poverty and Social Change.

Evaluation Techniques

United Way of America. (1996). Measuring program outcomes: A practical approach. To order, call Sales Service/America at (800) 772-0008.

W.K. Kellogg Foundation. (1998). Evaluation handbook. To order, contact Collateral Management Company, 1255 Hill Brady Road, Battle Creek, MI 49015; (616) 964-0700. Ask for item number 1203.

by Julia Coffman
Harvard Family Research Project

Copyright 1999 by the President and Fellows of Harvard College (Harvard Family Research Project). All rights reserved. No part of this publication may be reproduced in any way without the written permission of the Harvard Family Research Project.

Founded in 1983 by Dr. Heather Weiss, the Harvard Family Research Project conducts research on programs and policies that serve children and families throughout the United States. By publishing and disseminating its research widely, HFRP plays a vital role in examining and encouraging programs and policies that enable families and communities to help children reach their potential.

38 Concord Avenue, Cambridge, MA 02138
Tel: (617) 495-9108  Fax: (617) 495-8594
E-mail: hfrp_gse@harvard.edu
Web site: http://hugse1.harvard.edu/~hfrp