
Table of Contents

Executive Summary
Program Evaluation Design
Program Description
Literature Review
FY2006 Implementation Analysis
FY2006 Outcome Analysis
FY2006 Return on Investment Analysis
FY2007 Recommended Action Plan

Implementation Appendices (I)
I-1. Implementation Evaluation Results by Elementary & Middle School
I-2. CHAMPs Student Survey Results for each Middle School
I-3. Effective Teaching Strategies Student Survey Results for each Middle School

Outcome Appendices (O)
O1. Educational Effect Size Rubric
O2. Math Comparison of Key Student Characteristics for each Elementary School
O3. Reading Comparison of Key Student Characteristics for each Elementary School
O4. Math Comparison of Key Student Characteristics for each Middle School
O5. Reading Comparison of Key Student Characteristics for each Middle School

FY2006 Evaluation of the CHAMPs Program

Executive Summary

The Department of Research and Evaluation and the Department of Safe Schools have completed a formative evaluation of the CHAMPs Program (the Program) at eight elementary schools and three middle schools in FY2006. The purpose of a formative evaluation is to provide Program administrators with the implementation and student outcome results needed to create a specific and appropriate action plan for the following year. The purpose of the action plan is to remediate implementation-related concerns that may have adversely impacted student performance. The evaluation has two components: an implementation component and an outcome component.

1. The implementation component analyzes the extent, consistency, and quality of the Program's implementation and progress toward meeting the Program's objectives.
2. The outcome component analyzes the extent to which the student outcome objectives of the Program were met.

Based on the results of the Program's implementation and outcome analyses, and on the FY2006 cost of the Program, the District is able to determine the Program's Return on Investment (ROI). The District uses the ROI to make recommendations regarding program-related action that should be taken in the next fiscal year.

Summary of the FY2006 Findings

1. Implementation Analysis Results
The analysis of the Program at the District's elementary and middle schools indicates that implementation has largely been successful. The results of the student surveys pertaining to Program implementation at the middle school level only somewhat confirm these findings. Finally, the effective instructional strategies survey suggests that Program teachers employ few effective strategies at the middle school level; steps should be taken to encourage teachers to employ more such strategies.

2. Outcome Analysis Results
The analyses of various aspects of the CHAMPs Program at the eight District elementary schools provided no evidence that the Program has made a significant impact. The analyses at the three District middle schools indicated that, across all schools as a group, there is no evidence that the Program has made a significant difference. It should be noted that the program group at Don Estridge High Tech Middle School consistently out-performed its comparison group in a majority of the areas analyzed. Nevertheless, it cannot be determined that the positive performance at this school is due to the implementation of the Program, as there is no pattern of positive performance elsewhere. The majority of middle school students in the Program did not recognize two of the ten survey statements ("My teacher talks about CHAMPs in my class" and "My teacher reviews with us the CHAMPs expectations for different activities").

3. Cost of the Program
Currently, $385,000 in operating funds is allocated to the CHAMPs Classroom Management initiative. These funds cover the salaries of three resource teachers (CHAMPs Coaches) and an Assistant Director. The majority of the remaining funds are used to pay for materials, including a CHAMPs book for each teacher and school administrator. No capital funds are allocated to the program. The total cost since the initiative's implementation is $1.5 million. Currently, the cost per student is approximately $35 per year.

Recommendations

Based on the results of the evaluation, the Program Evaluation Steering Committee has made the following recommendations regarding Program-related action to be taken in FY2007:

1. Continue District support for the Program;
2. Make changes to descriptors that will more accurately reflect levels of implementation at individual schools and for individual staff members;
3. Review and analyze the appropriateness of descriptors for outcome measures;
4. Develop an action plan to maximize implementation;
5. Conduct a second formative evaluation in FY2007.

FY2006 Evaluation Design of the CHAMPs Program

The purpose of this report is to provide the Superintendent of the School District of Palm Beach County with a formative evaluation of the CHAMPs Program (the Program) in FY2006. In addition to a description and literature review of the Program, this evaluation consists of three types of analyses: an implementation analysis, a student outcome analysis, and a return on investment analysis.

Implementation Analysis Design

The implementation analysis design follows the procedures described in the School District of Palm Beach County Program Evaluation Procedures manual. Four of the six questions in the Guiding Principles of Program Evaluation Design section of that manual address the quality of a program's implementation.(1) From the outset it was agreed to limit the evaluation's focus to answering two of the four questions, specifically question numbers three and four:

3. Is the program being implemented according to the program description?
4. Is the program satisfactorily supported (e.g., equipment, facilities, management, materials, supplies, and professional development)?

To provide a second perspective on the extent of the implementation of the Program, a survey was completed by participating students(2) to answer the following questions:

- To what extent do students agree that the Program has been implemented according to the Program description?
- To what extent do students agree that Program teachers employ research-based, effective instructional strategies?

Note that the implementation analysis does not evaluate the quality of instruction provided by individual teachers or by the teachers as a group. It evaluates only the extent to which the program of instruction meets the criteria of the Program guidelines.

(1) Questions two through five in the Guiding Principles of Program Evaluation Design section of the Program Evaluation Procedures manual address the quality of a program's implementation:
2. Is there a written program description that includes program objectives and satisfactory operational guidelines, and were the operational guidelines available to and understood by all program personnel?
3. Is the program being implemented according to the program description?
4. Is the program satisfactorily supported (e.g., equipment, facilities, management, materials, supplies, and professional development)?
5. Is there evidence of success or progress towards a product or service that is not an explicit program objective as a result of the program?
(2) Surveys employed in the program evaluation process are administered at the secondary level only.

Student Outcome Analysis Design

Like the implementation analysis design, the outcome analysis design follows the procedures described in the School District of Palm Beach County Program Evaluation Procedures manual. This analysis answers question six in the Guiding Principles of Program Evaluation Design section of that manual:

6. To what degree does the program meet its objectives and other evaluation criteria established for the program evaluation?

Research Questions

A. To determine the extent to which students clearly understand the behavioral expectations in every learning situation, the following questions were asked:

1. In FY2006, were the percentages of students receiving one or more discipline referrals in the Program schools significantly lower than those of matched comparison groups?
2. In FY2006, were the percentages of students receiving one or more out-of-school suspensions in the Program schools significantly lower than those of matched comparison groups?

B. To determine the extent to which an ethos of fairness was created at Program schools, the following questions were asked:

1. In FY2006, was the mean Positive School Climate and Safe and Orderly Environment (PSC) correlate on the School Effectiveness Questionnaire (SEQ) significantly higher at Program schools than in FY2005?
2. In FY2006, were the mean numbers of student discipline referrals at Program schools significantly lower than in FY2005?
3. In FY2006, were the mean numbers of out-of-school suspensions at Program schools significantly lower than in FY2005?

C. To determine the extent to which teachers were provided with more focused instructional time and students with more focused time for learning at Program schools, the following questions were asked:

1. In FY2006, did 80% of all teachers surveyed at Program schools respond 3, 4, or 5 on a 5-point rubric that Program strategies contribute to more focused instructional time?
2. From FY2005 to FY2006, did cohort groups of Program students significantly out-perform matched comparison groups on FCAT SSS Reading
   a. in Developmental Scale Score (DSS) gains?
   b. in the portion of a year's growth in reading achieved in one year?
   c. in the estimated number of years required to move students from a basic to a proficient level of reading?(3)
3. In FY2006, were the percentages of cohort groups of Program students scoring Level 3 or higher on FCAT SSS Reading significantly greater than those of matched cohort comparison groups?
4. From FY2005 to FY2006, did cohort groups of Program students significantly out-perform matched comparison groups on FCAT SSS Mathematics
   a. in Developmental Scale Score (DSS) gains?
   b. in the portion of a year's growth in mathematics achieved in one year?
   c. in the estimated number of years required to move students from a basic to a proficient level of mathematics?
5. In FY2006, were the percentages of cohort groups of Program students scoring Level 3 or higher on FCAT SSS Mathematics significantly greater than those of matched cohort comparison groups?

Return on Investment Analysis Design

This analysis determines the extent of the return on the District's investment in the Program during FY2006. Return on Investment (ROI) was determined by combining the results of the Program's implementation and outcome analyses with the cost of the Program.

(3) From the lower end of Level 2 on FCAT SSS Reading to the lower end of Level 3.
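Research questions C.2 and C.4 rely on two derived growth metrics that the design does not define computationally. The sketch below illustrates one plausible reading, assuming that "portion of a year's growth" is the observed DSS gain divided by the expected one-year DSS gain and that "years from basic to proficient" is the DSS distance from the lower end of Level 2 to the lower end of Level 3 divided by the observed annual gain; the formulas and all numeric values are assumptions for illustration only, not the District's actual method or FCAT cut scores.

# Hypothetical sketch of the growth metrics named in questions C.2 and C.4.
# Formulas and numbers are illustrative assumptions, not the report's method.

def portion_of_year_growth(observed_dss_gain: float, expected_annual_dss_gain: float) -> float:
    """Observed DSS gain expressed as a fraction of one expected year of growth."""
    return observed_dss_gain / expected_annual_dss_gain

def years_basic_to_proficient(level2_low_dss: float, level3_low_dss: float,
                              observed_annual_dss_gain: float) -> float:
    """Estimated years to move from the lower end of Level 2 to the lower end of
    Level 3 at the observed annual rate of DSS growth (see footnote 3)."""
    return (level3_low_dss - level2_low_dss) / observed_annual_dss_gain

# Placeholder values, not actual FCAT SSS Reading scale scores:
print(portion_of_year_growth(observed_dss_gain=120, expected_annual_dss_gain=150))  # 0.8 of a year
print(years_basic_to_proficient(level2_low_dss=1600, level3_low_dss=1900,
                                observed_annual_dss_gain=120))                      # 2.5 years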

Program Description

CHAMPs Classroom Management Strategies
Prepared by Dave Benson, Assistant Director of Safe Schools

A. Description of Intervention

CHAMPs Classroom Management is a Single School Culture for Behavior initiative designed to create a uniform set of classroom management practices that staff members apply across a school campus. Teachers and administrators develop clear behavioral expectations for every type of learning activity and transition. These expectations are posted and taught in the C-H-A-M-P (Conversation-Help-Activity-Movement-Participation) format. These "sets of decisions" teachers make concerning activities and transitions are critical to managing an effective classroom and maintaining instructional momentum. Staff teach, refer to, and reteach these expectations frequently throughout the school year. An initial training with school-site administrators before pre-school covers basic strategies they can use to observe and coach staff. During pre-school, each school staff receives an initial half-day training to begin implementing the CHAMPs strategies. Ongoing training occurs during staff meetings, professional development days, or other set-aside time frames. Additionally, the CHAMPs Coach visits every classroom and provides written and verbal feedback to staff. Small group meetings or training sessions to discuss behavioral issues and classroom management strategies occur throughout the year.

B. History of the Program within the District

CHAMPs was initially introduced as a school-wide initiative in FY2003 as a pilot at Odyssey Middle School. Staff members were trained on four separate Professional Development Days (PDDs) during the year and implemented phases of the initiative after each session. In late July of FY2004, the majority of staff from twenty schools attended a two-day training with Randy Sprick (author and presenter of the CHAMPs materials). Three CHAMPs Coaches were assigned to assist with implementation at these schools an average of one day every two weeks. However, the focus was later narrowed to 13 schools, with a CHAMPs Coach assisting an average of one day per week. In FY2005, several schools received initial training during pre-school, with additional training scheduled for PDDs. Unfortunately, the hurricanes disrupted the training opportunities; therefore, direct service was provided to six schools while several other schools from previous years received ongoing assistance. In FY2006, several schools new to CHAMPs were trained during pre-school, and ten will receive services an average of one day per week from a CHAMPs Coach. An eleventh school (Don Estridge High Tech Middle School) received services last year but encountered some implementation barriers; this school will continue to have a CHAMPs Coach an average of one day per week. Currently, at least 63 schools have received some form of training and instruction on implementing the CHAMPs initiative. Additionally, similar training has been provided for approximately 300 afterschool personnel. There are 31 schools that are receiving or have received the initiative's training and coaching services, encompassing 1,775 teachers and approximately 25,000 students. During the last four years, approximately 3,500 teachers reaching close to 50,000 students have been provided effective classroom management strategies using the CHAMPs model.

C. Purpose, Goals, and Outcomes of the Program

The purpose of the CHAMPs Classroom Management initiative is to create an atmosphere on campus that is conducive to learning and increases responsible student behavior. Teachers formally develop and post clear expectations for each activity and teach these expectations to all students. Additionally, staff utilize effective strategies that promote positive interactions with students and provide fluent corrections for student misbehavior.

Goal 1: To provide teachers with more focused instructional time and students with more focused time for learning.

Objective 1.1: By June 2006, 80% of all teachers surveyed on the "CHAMPs Implementation Survey" will indicate, with a score of 4 or higher on a 5-point scale, that CHAMPs strategies contributed to more focused instructional time.
Objective 1.2: The mean gain in Developmental Scale Score of students at CHAMPs schools on FCAT SSS Reading from FY2005 to FY2006 will be significantly greater than that of a matched comparison group.
Objective 1.3: The percentage of students at CHAMPs schools scoring Level 3 or higher on FCAT SSS Reading will be significantly greater than that of a matched comparison group in FY2006.
Objective 1.4: The mean gain in Developmental Scale Score of students at CHAMPs schools on FCAT SSS Mathematics from FY2005 to FY2006 will be significantly greater than that of a matched comparison group.
Objective 1.5: The percentage of students at CHAMPs schools scoring Level 3 or higher on FCAT SSS Mathematics will be significantly greater than that of a matched comparison group in FY2006.

Goal 2: Students will clearly understand the behavioral expectations in every learning situation, resulting in reduced discipline incidents.

Objective 2.1: The percentage of students receiving one or more discipline referrals at each CHAMPs school will be significantly lower in FY2006 than that of a matched comparison group.
Objective 2.2: The percentage of students receiving one or more out-of-school suspensions (unduplicated) at each CHAMPs school will be significantly lower in FY2006 than that of a matched comparison group.

Goal 3: An ethos of fairness will be created in each classroom and across campus at CHAMPs schools.

Objective 3.1: The mean level of favorable teacher responses at CHAMPs schools to questions on the Positive School Climate and Safe and Orderly Environment section of the School Effectiveness Questionnaire will be significantly higher in FY2006 than in FY2005 and than that of a matched comparison group.
Objective 3.2: The total number of student discipline referrals at each CHAMPs school will be significantly lower in FY2006 than in FY2005.
Objective 3.3: The percentage of total student out-of-school suspensions (duplicated) at each CHAMPs school will be significantly lower in FY2006 than in FY2005.

D. Financial Components of the Program

Currently, $385,000 in operating funds is allocated to the CHAMPs Classroom Management initiative. These funds cover the salaries of three resource teachers (CHAMPs Coaches) and an Assistant Director. The majority of the remaining funds are used to pay for materials, including a CHAMPs book for each teacher and school administrator. No capital funds are allocated to the program. The total cost since the initiative's implementation is $1.5 million. Currently, the cost per student is approximately $35 per year.

E. Program Implementation

1. Instructionally Related Program Descriptors

School Administration Responsibilities
The Principal and the CHAMPs Coach will utilize the CHAMPs implementation rubric/checklist to determine each of the following:

1.1 Administrators will visit each classroom at least twice per month to determine the presence of effective classroom management strategies, i.e.,
a. Classroom/School-wide rules posted,
b. Activity Expectations posted,
c. Positive vs. negative interaction rate,
d. Effectiveness of the attention signal,
e. Student understanding of the activity expectations.
1.2 Administrators will model CHAMPs expectations in staff meetings, PDDs, and other meetings by utilizing strategies such as an attention signal and communicating expectations.
1.3 Administrators will create school-wide norms by "CHAMPsing" common area activities. Scripts and/or posters should be utilized to clearly communicate these "common area" expectations to all students and staff.
1.4 The principal, in concert with the CHAMPs Coach, will create and develop a Single School Culture Team to sustain the initiative after the initial year of implementation.
1.5 The Single School Culture Team will meet to review and address school-wide behavioral issues and assist staff on classroom management issues.

Instructional Staff Responsibilities
The Principal and the CHAMPs Coach will utilize the CHAMPs implementation rubric/checklist to determine each of the following:

1.6 All teachers will develop and post classroom rules.
1.7 All teachers will develop, create visuals for, and teach at least two Activity Expectations.
1.8 All teachers will develop an identifiable attention signal.
1.9 All teachers will complete a Classroom Management plan and submit the plan to the Principal.
1.10 Teachers will consistently address early-stage misbehaviors with classroom interventions in accordance with their classroom management plans and only utilize discipline referrals to the office for serious behavioral infractions.
1.11 All teachers will monitor their interactions with students by self-assessment, peer assistance, coach visitations, administrator observations, or another method.

Student Responsibilities
The Principal and the CHAMPs Coach will utilize the CHAMPs implementation rubric/checklist to determine each of the following items. The "CHAMPs Implementation Survey" will also be utilized to determine each of the following:

1.12 Students respond consistently to an attention signal from any staff member within 3-5 seconds.
1.13 Students can easily identify "what is expected of them" for each activity and transition.

2. Product Related Program Descriptors

CHAMPs Coach Responsibilities
The CHAMPs Coach will utilize the CHAMPs implementation rubric/checklist to determine each of the following:

2.1 Each teacher and administrator will receive a CHAMPs book.
2.2 Each elementary school will receive a Teacher's Encyclopedia for Behavior Management (TEBM) for each grade level. Each middle school should receive a TEBM for each team or instructional grouping. Each high school will receive a TEBM for every department.
2.3 Each elementary and middle CHAMPs school will receive a TEBM for every discipline administrator and guidance counselor.
2.4 The appearance of each classroom and office area (accessible to students) will be organized and conducive to structured learning that encourages appropriate student behavior.
2.5 A CHAMPs Coach will be on campus to support the CHAMPs initiative with school staff about fifteen days per semester during the school year (flexible by schedules).

3. Service Related Program Descriptors

CHAMPs Coach Responsibilities
The CHAMPs Coach will utilize the CHAMPs implementation rubric/checklist to determine each of the following:

3.1 Prior to implementation at a school center, administrators will receive a two-hour training session, CHAMPs for Administrators. Topics will include how to facilitate the initiative, strategies to model, effective strategies to observe in classrooms, utilizing the TEBM, and how to maintain focus on effective classroom management strategies throughout the year and in the years to follow.
3.2 Before the beginning of the school year, teachers and administrators will receive the Phase 1 training in the CHAMPs Classroom Management strategies.
3.3 The school principal will attend all CHAMPs trainings with staff to demonstrate support for the initiative.
3.4 The other school administrators will attend all CHAMPs trainings with staff to demonstrate support for the initiative.
3.5 Teachers will attend additional training sessions with the CHAMPs Coach throughout the year. These sessions will focus on additional strategies and analyzing behavioral issues.
3.6 Teachers will attend at least one individual feedback session with the CHAMPs Coach throughout the year.

CHAMPs Classroom Management Strategies Literature Review
Prepared by Dave Benson, Assistant Director of Safe Schools

A. Description of Intervention

The high-stakes testing environment placed upon schools has forced educators to explore innovative and effective instructional practices and strategies. A constant call for evaluating data provides administrators with a plethora of opportunities in the instructional arena. However, many educators are beginning to recognize the correlation between a well-managed classroom and student achievement. The data reflecting effective classroom management skills have traditionally been limited to discipline referrals and climate surveys, but more administrators are now analyzing these data as well as many other factors that contribute to creating an environment conducive to learning. Cotton (1995) indicated in an update of the meta-analysis of effective schooling practices that "Teachers and students work together over time to extend and refine each learner's knowledge and skills. Through careful preplanning, effective classroom management and instruction, positive teacher-student interactions, attention to equity issues, and regular assessment, teachers and students can achieve success." The realization that climate and culture have a profound effect on student achievement (Stolp, 1995) has underscored the need to place a greater emphasis on classroom management strategies.

In the Effective Schooling Practices synthesis, Cotton (1995) noted that effective teachers set clear standards for classroom behavior and apply those standards fairly and consistently. However, Maag (2001) noted that most approaches for dealing with student disruptions involve various forms of punishment, such as removals from the classroom, fines, restitutional activities, in-school and out-of-school suspensions, and expulsions. Although some of these approaches may make schools safer by removing the offending students, they have little effect on encouraging students to perform socially appropriate behaviors. Maag (2001) also provides evidence that positive reinforcement techniques have had a far greater impact on student misbehavior than the traditional negative reinforcing consequences. Sprick (1995) cites school effectiveness research indicating that effective teachers:

Establish smooth, efficient classroom routines;
Interact with students in positive, caring ways;
Provide incentives, recognition, and rewards to promote excellence;
Set clear standards for classroom behavior.

Sprick concludes from the study that a significant difference between teachers of students with academic gains and teachers of students without academic gains was the extent of effective classroom management strategies. Ironically, behavior management has been consistently mentioned by teachers as an area in which they would like more training (Maag, 2001). Teachers who practice effective classroom management strategies gain more time on task for instruction (McCloud, 2005). Teachers often comment that classroom management skills are not generally emphasized in college curricula. Likewise, professional development opportunities for teachers are usually limited to one-time presentations, which provide little, if any, follow-up and feedback and are generally recognized in staff development circles as ineffective training methods (Killion, 2002). The School District of Palm Beach County last embraced a District-wide initiative in the classroom management arena in the 1980s with Lee Canter's Assertive Discipline. Since that time, most efforts at providing effective classroom management training for teachers consistently across the District have been limited and relegated to individual school or teacher choice. The CHAMPs Classroom Management initiative has been presented as a school-wide effort focusing on creating classroom climates that are more conducive to effective instruction.

Sprick (1995) indicates that teachers can create positive classroom environments that enhance student responsibility, independence, and motivation by implementing proactive and positive techniques based on the school effectiveness literature, i.e.:

1. Classroom organization has a huge impact on student behavior; therefore, teachers should carefully structure their classrooms in ways that prompt responsible student behavior;
2. Teachers should overtly teach students how to behave responsibly (i.e., be successful) in every classroom situation;
3. Teachers should focus more time, attention, and energy on acknowledging responsible behavior than on responding to misbehavior;
4. Teachers should preplan their responses to misbehavior to ensure that they will respond in a brief, calm, and consistent manner.

The goal of increasing student achievement by providing more focused instructional time in the classroom, as well as reducing incidents of misbehavior, can be accomplished by focusing on sound classroom management strategies. These principles have been incorporated into the CHAMPs Classroom Management initiative.

B. Goals and Objectives of Similar Programs

Many schools and school districts throughout the United States have incorporated aspects of the CHAMPs Classroom Management model. While programs vary in implementation methods, the goal of increasing student achievement by developing clear behavioral expectations across campuses as well as in classrooms is consistent. A primary objective of all schools and districts implementing the CHAMPs classroom management process is to reduce the number of incidents of misbehavior. Among the schools implementing CHAMPs, Northern Elementary School in Lexington, Kentucky, an urban, ethnically diverse, and economically disadvantaged Pre-K-5 school, strived to increase student achievement by developing a school-wide discipline plan. Fayette County Public School District in Kentucky also maintained efforts to improve student achievement with the CHAMPs model by increasing instructional time and student engagement as well as reducing barriers to learning. In Broward County, Florida, implementation involves a train-the-trainer process: CHAMPs classroom management training is provided to staff members at schools with the goal of increasing student achievement by decreasing common behavior problems and instructional interruptions. All of these schools and districts share the following implementation goals and criteria:

Increase student achievement by providing more focused instruction and time on task;
Reduce disciplinary incidents by clearly teaching behavioral expectations;
Provide training for all staff;
Provide continued coaching from district staff or by school staff specifically designated and trained.

Though the models presented in this narrative all provide some form of coaching and assistance for teachers, the Palm Beach County School District implementation model includes written feedback as one of its objectives. Providing written feedback with observations and suggestions is certainly not unique to this district, but the objective was not incorporated into the implementation models discussed in this review. In fact, Teaching Strategies, Inc. and Safe and Civil Schools provide national training on this aspect of coaching (Safe and Civil Schools, 2006). The CHAMPs Classroom Management initiative in the Palm Beach County School District shares many of the same goals and objectives as similar initiatives in other districts.

C. Evidence of Success of CHAMPs Classroom Management in Similar Programs

There is evidence that the CHAMPs Classroom Management model has demonstrated success in student academic gains as well as decreased behavioral issues. Anecdotal observations from administrators and testimonials from teachers in Palm Beach County during initial implementation from 2002 through 2005 have indicated that behavioral issues decreased in classrooms where implementation occurred. A variety of implementation models were explored with more than 20 schools of various levels. Evidence indicated that the numbers of both discipline referrals and out-of-school suspensions decreased at several schools implementing the CHAMPs initiative. Northern Elementary School decreased the number of students suspended by 69% in three years, while incidents of misbehavior decreased 78% during the same period. Additionally, the results indicate a 74% increase in the number of students reading on grade level (partially attributed to the Literacy First reading program). The achievement gap by race and socio-economic status reached near-parity by May 2004. Though Fayette County in Kentucky is only in its third year of implementation, the preliminary data indicate that schools are relying more on data to develop strategies to increase student achievement. The Consolidated School Improvement Plan developed by each school after reviewing the data includes incorporating the CHAMPs classroom management strategies into the student achievement plan. Broward County, Florida, has discovered that by providing intensive training, each school's CHAMPs coach can offer his/her experience and skills and can provide non-evaluative support and guidance for staff. In all of the literature reviewed, as well as the data from Palm Beach County, the school-wide approach to implementation of the CHAMPs classroom management strategies has shown a direct correlation with the reduction in behavioral issues.

D. Ensuring Successful Implementation

Effective training and follow-up support are integral to successful implementation of any initiative, and particularly the CHAMPs Classroom Management initiative. The CHAMPs implementation builds in a process for a resource teacher, or CHAMPs coach, to be assigned to a school one day per week and to visit classrooms. The coaches provide individual written feedback on effective strategies they observe and meet with staff to discuss effective classroom management practices. Additionally, principals who model the strategies, encourage staff to meet with the CHAMPs coach, and emphasize and enforce the implementation school-wide have noticed a more positive learning environment. Thus, we have discovered through a variety of implementation models utilized during the last three years that the following elements are vital for effective implementation:

Training opportunities during pre-school and at least 2-3 more times during the year;
Principal/Administrator "buy-in" and modeling of CHAMPs practices;
Coach visitation of all classrooms with feedback provided;
A school-wide approach instead of sporadic implementation.

Conversely, the chief barrier to implementation is the tendency not to implement the elements identified above. As administrators put more pressure on teachers to increase student achievement through academic data analysis and instructional strategies, training opportunities for classroom management are infrequent or one-time events. In an effort to maximize implementation throughout the district, each coach is assigned four schools per year; as a result, visiting staff and providing individual feedback has become a challenge. The implementation model requires the principal and administrators to encourage staff to create and display CHAMPs expectations posters as well as to incorporate many of the strategies into their daily routines. Though the coach attempts to visit each classroom at each of the schools several times, the reality is that the visits may be limited to 1-3 times per year. As a result, we must rely more on administrative support and modeling of the strategies.

Some principals have demonstrated initiative and incorporated observation of specific strategies into their classroom walkthrough visits, as well as included CHAMPs-related comments on the summative instrument. If the District mandated that all schools utilize a specific classroom management model, such as CHAMPs, then principals and administrators would be required to acquire and/or internalize effective strategies. Principals are asked to be the instructional leaders of their schools, but that should include the skills of a classroom management coach as well! The District's commitment to the CHAMPs Classroom Management model would help overcome many of the implementation obstacles. Without District commitment, Principal support must be in place prior to investing in a coaching position at the school. Additionally, a Single School Culture committee at each school must be trained to sustain the initiative in successive years. Eventually, all classroom management training provided by the school district, including new teacher training, should incorporate the CHAMPs Classroom Management model.

E. District Program Recommendation

The CHAMPs Classroom Management initiative provides effective strategies to foster an environment conducive to learning for students. The goals and objectives of the initiative correspond to the district's goals for the Single School Culture for Behavior initiative. Common practices among teachers result in more meaningful relationships with students and, thus, should positively impact student achievement. CHAMPs was chosen for implementation in the Palm Beach County School District because:

1. The materials and resources for teachers are abundant and user-friendly;
2. Implementation at both the classroom and school-wide level is relatively easy;
3. Similar initiatives in other areas have shown evidence of increasing student achievement and reducing discipline issues;
4. Testimonials from teachers during the initial implementation phases indicate successes with the CHAMPs Classroom Management strategies.

REFERENCES

Cotton, Kathleen. "Effective Schooling Practices: A Research Synthesis Update." Northwest Regional Educational Laboratory, 1995.

Hogan, Tyyne M., School District of Broward County, Florida. E-mail interview. 13 March 2006.

Kalias, Kathy, Fayette County Public Schools, Kentucky. E-mail interview. 28 February 2006.

Killion, Joellen. Assessing Impact: Evaluating Staff Development. Oxford, OH: National Staff Development Council, 2002.

Maag, John W. "Rewarded by Punishment: Reflections on the Disuse of Positive Reinforcement in Schools." Exceptional Children, 67(2). Winter 2001: 173-86.

McCloud, Susan. "From Chaos to Consistency." Educational Leadership, 62(5). February 2005: 46-49.

Petrilli, Peggy. "Closing the Reading Gap." Principal, 84(4). March/April 2005: 32-35.

Sprick, Randall S. CHAMPS: A Proactive and Positive Approach to Classroom Management. Longmont, CO: Sopris West, 1995.

Stolp, Stephen, and Smith, Stuart C. "Transforming School Culture: Stories, Symbols, Values & the Leader's Role." ERIC Clearinghouse on Educational Management, Eugene, OR, 1995.

"Train the Trainers Workshop: Advertisement." Randy Sprick's Safe and Civil Schools, 2006. Accessed 12 April 2006. <http://www.safeandcivilschools.com/media/ttt_7_9_06.pdf>

FY2006 Implementation Analysis of the CHAMPs Classroom Management Strategies

I. Purpose of the Analysis

This report provides an implementation analysis of the CHAMPs Program (the Program) during FY2006 at eleven District elementary and middle schools. The analysis attempts to determine the extent to which the Program has impacted student behavior and achievement in FY2006. Specific research questions follow:

1. To what extent has the Program been implemented according to the Program description?

To provide a second perspective on the extent of the implementation of the Program, a survey was completed by participating students(1) to answer the following questions:

2. To what extent do students agree that the Program has been implemented according to the Program description?
3. To what extent do students agree that Program teachers employ research-based, effective instructional strategies?

II. Analyses

Question 1: To what extent has the Program been implemented according to the Program description?

Methods

As presented in the Implementation Rubric, there is a continuum of program implementation, ranging from Level 1, beginning implementation, to Level 4, full implementation. At one end of the continuum, Level 1 indicates that personnel at the site(s) of implementation are experiencing difficulties implementing one or more descriptors in all three descriptor groups (instruction, service, and product). At the other end, Level 4 indicates personnel have successfully implemented all descriptors in each of the three descriptor groups and have, in effect, achieved full implementation.

Implementation Rubric

Level 1 (Beginning implementation): In none of the three descriptor groups (instruction, product, and service) do 100% of descriptors meet success standards.
Level 2 (Minimal implementation): 100% of the instruction-related descriptor group, or 100% of the product-related descriptor group, or 100% of the service-related descriptor group meets its success standards.
Level 3 (Intermediate implementation): 100% of descriptors meet success standards in any two of the three descriptor groups (instruction, product, and service).
Level 4 (Full implementation): 100% of the instruction-related descriptor group, and 100% of the product-related descriptor group, and 100% of the service-related descriptor group meet their success standards.

(1) Surveys employed in the program evaluation process are administered at the secondary level only. A sample student survey may be found in Appendix I.
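The rubric above determines a school's implementation level from which descriptor groups have 100% of their descriptors meeting success standards. The sketch below illustrates that classification logic; it assumes the highest applicable level is assigned and that each school's results can be summarized as a pass/fail flag per descriptor group, which is an illustrative simplification rather than the District's actual scoring tool.

# Hypothetical sketch of the Implementation Rubric classification.
# Assumes results are summarized as, for each descriptor group, whether
# 100% of its descriptors met their success standards.

def implementation_level(groups_fully_met: dict[str, bool]) -> int:
    """Return the rubric level (1-4) given pass/fail flags for the
    instruction, product, and service descriptor groups."""
    met = sum(1 for passed in groups_fully_met.values() if passed)
    if met == 3:
        return 4  # Full implementation
    if met == 2:
        return 3  # Intermediate implementation
    if met == 1:
        return 2  # Minimal implementation
    return 1      # Beginning implementation

# Illustrative data, not actual school results:
example = {"instruction": True, "product": True, "service": False}
print(implementation_level(example))  # -> 3 (Intermediate implementation)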

Table 1: Key Instructional-Related Descriptors: Standards for Successful Implementation

Key to Surveys (Evidence Collected)
SEA = Self-Assessment Evaluation for Administrators
CPOS = CHAMPs Coach Perception and Observation Survey
TS = Teacher Survey
SS = Student Survey or Focus Group

For each descriptor, evidence is collected through the surveys listed in the key above (as applicable to each descriptor), housed at Safe Schools, with an April 2006 collection schedule. The measure is the number of schools meeting the descriptor divided by the total number of schools evaluated, with a success standard of 80%. Persons responsible are School Administration for descriptors 1.1-1.5, Instructional Staff for 1.6-1.11, and Students for 1.12-1.13. Detailed results by school appear in Appendix I.

1.1 Administrators will visit each classroom at least twice per month to determine the presence of effective classroom management strategies: classroom/school-wide rules posted; Activity Expectations posted; positive vs. negative interaction rate; effectiveness of the attention signal; and student understanding of the activity expectations.
1.2 Administrators will model CHAMPs expectations in staff meetings, PDDs, and other meetings by utilizing strategies such as an attention signal and communicating expectations.
1.3 Administrators will create school-wide norms by "CHAMPsing" common area activities, using scripts and/or posters to clearly communicate these "common area" expectations to all students and staff.
1.4 The principal, in concert with the CHAMPs Coach, will create and develop a Single School Culture Team to sustain the initiative after the initial year of implementation.
1.5 The Single School Culture Team will meet to review and address school-wide behavioral issues and assist staff on classroom management issues.
1.6 All teachers will develop and post classroom rules.
1.7 All teachers will develop, create visuals for, and teach at least two Activity Expectations.
1.8 All teachers will develop an identifiable attention signal.
1.9 All teachers will complete a Classroom Management plan and submit the plan to the Principal.
1.10 Teachers will consistently address early-stage misbehaviors with classroom interventions in accordance with their classroom management plans and only utilize discipline referrals to the office for serious behavioral infractions.
1.11 All teachers will monitor their interactions with students by self-assessment, peer assistance, coach visitations, administrator observations, or another method.
1.12 Students respond consistently to an attention signal from any staff member within 3-5 seconds.
1.13 Students can easily identify "what is expected of them" for each activity and transition.

Table 2: Key Product-Related Descriptors: Standards for Successful Implementation

For each descriptor, evidence is collected through the SEA, CPOS, and TS surveys, housed at Safe Schools, with an April 2006 collection schedule; the person responsible is the CHAMPs Coach. The measure is the number of schools meeting the descriptor divided by the total number of schools evaluated. Detailed results by school appear in Appendix I.

2.1 Each teacher and administrator will receive a CHAMPs book.
2.2 Each elementary school will receive a Teacher's Encyclopedia for Behavior Management (TEBM) for each grade level; each middle school will receive a TEBM for each team or instructional grouping; and each high school will receive a TEBM for every department.
2.3 Each elementary and middle CHAMPs school will receive a TEBM for every discipline administrator and guidance counselor.
2.4 The appearance of each classroom and office area (accessible to students) will be organized and conducive to structured learning that encourages appropriate student behavior.
2.5 A CHAMPs Coach will be on campus to support the CHAMPs initiative with school staff about fifteen days per semester during the school year (flexible by schedules).

Table 3: Key Service-Related Descriptors: Standards for Successful Implementation

For each descriptor, evidence is collected through the SEA, CPOS, and TS surveys, housed at Safe Schools, with an April 2006 collection schedule; the person responsible is the CHAMPs Coach. The measure is the number of schools meeting the descriptor divided by the total number of schools evaluated, with a success standard of 80%. Detailed results by school appear in Appendix I.

3.1 Prior to implementation at a school center, administrators will receive a two-hour training session, CHAMPs for Administrators. Topics will include how to facilitate the initiative, strategies to model, effective strategies to observe in classrooms, utilizing the TEBM, and how to maintain focus on effective classroom management strategies throughout the year and in the years to follow.
3.2 Before the beginning of the school year, teachers and administrators will receive the Phase 1 training in the CHAMPs Classroom Management strategies.
3.3 The school principal will attend all CHAMPs trainings with staff to demonstrate support for the initiative.
3.4 The other school administrators will attend all CHAMPs trainings with staff to demonstrate support for the initiative.
3.5 Teachers will attend additional training sessions with the CHAMPs Coach throughout the year. These sessions will focus on additional strategies and analyzing behavioral issues.
3.6 Teachers will attend at least one individual feedback session with the CHAMPs Coach during the year.

Tables 1-3 indicate that, as a group, all schools are at the Full Level of Implementation (Level 4). At the individual school level:

Seven elementary schools are at the Full Level of Implementation (Level 4) and one elementary school is at the Minimal Level of Implementation (Level 2). See Appendix I for detailed results at individual elementary schools.
All three middle schools are at the Full Level of Implementation (Level 4). See Appendix I for detailed results at individual middle schools.

Question 2: To what extent do students agree that the Program has been implemented according to the Program description?

Method

This portion of the student survey consisted of ten Program implementation statements that students were asked to confirm. All survey statements addressed the level to which the teacher employed Program-approved instructional strategies in class. The survey statements are listed below.

1. My teacher talks about CHAMPs in my class.
2. My teacher displays CHAMPs posters or visuals in the classroom.
3. My teacher uses a signal to get the attention of all students.
4. The students in my class are completely quiet within 3 seconds when my teacher tries to get our attention.
5. My teacher is very clear on how the students should behave during different activities.
6. I know the CHAMPs expectations for all activities in my classroom.
7. My teacher follows a procedure if someone comes to class late or for making up work if someone is absent.
8. My teacher reviews with us the CHAMPs expectations for different activities.
9. If a student misbehaves, the teacher does something about it.
10. My teacher lets students know when they are doing what they are supposed to be doing.

Each statement had five possible responses: Never, Rarely, Once a Month, Almost Everyday, and Everyday. When 50% or more of Program students responded to a statement with Everyday or Almost Everyday, the statement was confirmed as recognized. When 50% or more of Program students responded Never or Rarely, the statement was confirmed as not recognized.

Results

The results of the middle school surveys are reported in Figure 1; the results for individual middle schools can be found in Appendix I. Survey statements are listed on the left, and responses are sorted in descending order by the percentage of students answering Everyday or Almost Everyday.
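A minimal sketch of the confirmation rule described in the Method above appears below; the response labels follow the survey, but the example tallies are illustrative, not actual survey data, and the "inconclusive" branch is an assumption for statements where neither threshold is reached.

# Hypothetical sketch of the survey confirmation rule: a statement is
# "recognized" if >= 50% of responses are Everyday/Almost Everyday, and
# "not recognized" if >= 50% are Never/Rarely; otherwise it is neither.

from collections import Counter

HIGH = {"Everyday", "Almost Everyday"}
LOW = {"Never", "Rarely"}

def classify_statement(responses: list[str]) -> str:
    counts = Counter(responses)
    total = sum(counts.values())
    high_share = sum(counts[r] for r in HIGH) / total
    low_share = sum(counts[r] for r in LOW) / total
    if high_share >= 0.5:
        return "recognized"
    if low_share >= 0.5:
        return "not recognized"
    return "inconclusive"

# Illustrative example (not actual survey tallies):
sample = ["Everyday"] * 40 + ["Almost Everyday"] * 15 + ["Rarely"] * 30 + ["Never"] * 15
print(classify_statement(sample))  # -> "recognized" (55% high-frequency responses)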

Figure 1: CHAMPs Program Implementation Student Survey Results (statements shown in descending order of Everyday/Almost Everyday responses: Statements 2, 9, 5, 10, 7, 6, 3, 4, 1, 8)

Figure 1 indicates that:

The majority of middle school students in the Program recognized seven of the ten survey statements.

The majority of middle school students in the Program did not recognize two of the ten survey statements (My teacher talks about CHAMPs in my class and My teacher reviews with us the CHAMPs expectations for different activities).

Question 3: To what extent do students agree that Program teachers employ research-based, effective instructional strategies?

Method

The student survey included nine questions asking how often students perceived the use of research-based, effective instructional strategies by teachers in Program classes.2 The questions were based on nine teacher-related instructional strategies that positively impact student achievement, as identified by Robert Marzano (2003).3 The Effective Instructional Strategies Rubric below reports the nine instructional strategies, the mean percentile gain in student achievement attributed to the employment of each strategy, and the survey question asked to determine whether that strategy is employed in the Program classroom. The strategies are ranked according to the average percentile gain.4

2 Surveys employed in the program evaluation process are administered at the secondary level only. A sample student survey may be found in Appendix I.
3 Marzano, Robert J. (2003). What Works in Schools: Translating Research into Action (pp. 78-81). Alexandria, VA: Association for Supervision and Curriculum Development.
4 Percentile gains were based on conventional standardized tests. See Marzano, 80-81.

Effective Instructional Strategies Rubric
(Each entry: effective instructional strategy; mean percentile gain in student achievement; survey question)

Identifying similarities and differences (45 percentile points): How often do you compare and contrast things that you study?
Summarizing and note taking (34 percentile points): How often do you take notes or write a summary of your teacher's instruction?
Reinforcing effort and providing recognition (29 percentile points): How often do you receive praise or recognition from your teacher?
Homework and practice (28 percentile points): How often do you get feedback on your homework?
Nonlinguistic representations (27 percentile points): How often does your teacher ask you to make a diagram, picture, or model of something you are learning?
Cooperative learning (27 percentile points): How often do you work with other students in small groups?
Setting objectives and providing feedback (23 percentile points): How often are you told what you are going to learn at the beginning of the lesson?
Generating and testing hypotheses (23 percentile points): How often are you asked to make predictions during the lesson?
Questions, cues, and advance organizers (22 percentile points): How often does your teacher make a connection about what you are learning now and what you have already learned?

Each question had five possible responses: Never, Rarely, Once a Month, Almost Everyday, and Everyday. When 50% or more of Program students responded to a question with Everyday or Almost Everyday, the strategy was considered employed in the Program classroom. When 50% or more of Program students responded Never or Rarely, the strategy was considered not employed.

Results

The results of the middle school survey are reported in Figure 2. The results for individual middle schools can be found in Appendix I. Survey question numbers and corresponding effective instructional strategies are listed on the left. Responses are sorted in descending order by the percent of high-frequency responses.

Figure 2: CHAMPs Effective Instructional Strategies Student Survey Results (strategies shown in descending order of high-frequency responses: setting objectives and providing feedback; questions, cues, and advance organizers; homework and practice; summarizing and note taking; reinforcing effort and providing recognition; identifying similarities and differences; cooperative learning; generating and testing hypotheses; nonlinguistic representations)

Figure 2 indicates that:

The majority of middle school students report the presence of three of the nine effective instructional strategies in the Program:
o Setting objectives and providing feedback
o Questions, cues, and advance organizers
o Homework and practice

At the individual school level (see Appendix I), the majority of students at one middle school, Don Estridge, report the presence of four effective instructional strategies.

Summary of the Results

The analysis of the Program at the District's elementary and middle schools indicates that implementation has largely been successful. The results of the student surveys pertaining to Program implementation at the middle school level only somewhat confirm these findings.5 Finally, the effective instructional strategies survey suggests that Program teachers employ few effective strategies at the middle school level. Steps should be taken to encourage teachers to employ more such strategies.

5 The majority of middle school students in the Program did not recognize two of the ten survey statements (My teacher talks about CHAMPs in my class and My teacher reviews with us the CHAMPs expectations for different activities).

FY2006 Outcome Analysis of the CHAMPs Program

I. Purpose of the Analysis

This report provides an outcome analysis of the CHAMPs Program (the Program) during FY2006 at eleven District elementary and middle schools. The analysis attempts to determine the extent to which the Program has impacted student behavior and achievement in FY2006. Specific research questions follow.

A. To determine the extent to which students clearly understand the behavioral expectations in every learning situation, the following questions were asked:
1. In FY2006, were the percents of students receiving one or more discipline referrals in the Program schools significantly lower than those of matched comparison groups?
2. In FY2006, were the percents of students receiving one or more out-of-school suspensions in the Program schools significantly lower than those of matched comparison groups?

B. To determine the extent to which an ethos of fairness was created at Program schools, the following questions were asked:
1. In FY2006, was the mean Positive School Climate and Safe and Orderly Environment (PSC) correlate on the School Effectiveness Questionnaire (SEQ) significantly higher at Program schools than in FY2005?
2. In FY2006, were the mean numbers of student discipline referrals at Program schools significantly lower than in FY2005?
3. In FY2006, were the mean numbers of out-of-school suspensions at Program schools significantly lower than in FY2005?

C. To determine the extent to which teachers were provided with more focused instructional time and students with more focused time for learning at Program schools, the following questions were asked:
1. In FY2006, did 80% of all teachers surveyed at Program schools respond 3, 4, or 5 on a 5-point rubric that CHAMPs strategies contribute to more focused instructional time?
2. From FY2005 to FY2006, did cohort groups of Program students significantly out-perform matched comparison groups on FCAT SSS Reading
a. In Developmental Scale Score (DSS) gains?
b. In the portion of a year's growth in reading in one year?
c. In estimated number of years required to move students from a basic to proficient level of reading?1
3. In FY2006, were the percents of cohort groups of Program students scoring Level 3 or higher on FCAT SSS Reading significantly greater than those of matched cohort comparison groups?
4. From FY2005 to FY2006, did cohort groups of Program students significantly out-perform matched comparison groups on FCAT SSS Mathematics

1 The lower end of Level 2 on FCAT SSS Reading to the lower end of Level 3.

a. In Developmental Scale Score (DSS) gains?
b. In the portion of a year's growth in mathematics in one year?
c. In estimated number of years required to move students from a basic to proficient level of mathematics?
5. In FY2006, were the percents of cohort groups of Program students scoring Level 3 or higher on FCAT SSS Mathematics significantly greater than those of matched cohort comparison groups?

II. Analyses

Question A1. In FY2006, were the percents of students receiving one or more discipline referrals in the Program schools significantly lower than those of matched comparison groups?

Method

The percents of students receiving one or more discipline referrals in the Program schools were compared to those of matched comparison groups of students. A test of independent proportions was used to assess whether the differences between Program and comparison groups were statistically different from each other at the 0.05 level.

The comparison groups were randomly selected from students throughout the District who had never participated in the Program. These groups were selected by matching the key characteristics of Program students. Key characteristics were current grade level; retention in FY2005 and FY2006; race/ethnicity (Black, White, Hispanic, or other); LEP (limited English proficiency: those enrolled in ESOL classes or in the two-year monitoring program after exiting ESOL); ESE (exceptional student education status: all exceptionalities other than gifted, speech impaired, or hospital/homebound); and free/reduced lunch program status.

Separate analyses were completed for students in the Program across all elementary schools and across all middle schools, as well as for the students in the Program at each individual school. A detailed comparison of students in the Program group across all elementary schools evaluated and all middle schools evaluated and their comparison groups is reported in Tables A1a and A1b respectively. Retention was matched for both FY2005 and FY2006. The remaining student characteristics were matched on FY2006 demographic data.

The size of each comparison group was made as large as possible as long as the key characteristics remained within 3% of those of the Program group. This was done to maximize the stability2 of each comparison group without compromising the similarity between the Program and comparison groups. The Program students evaluated and the comparison group students were statistically similar in demographics.

2 Stability is necessary to ensure that any randomly selected comparison group has the same characteristics as any other randomly selected comparison group.
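The test of independent proportions described above can be illustrated with the short sketch below. It is not the evaluators' code, the report does not publish the exact formula it used, and the group counts are hypothetical; the sketch simply shows the usual pooled two-proportion z statistic evaluated at the 0.05 level.

    # Minimal sketch of a two-sample test of independent proportions (alpha = 0.05),
    # using only the Python standard library. All counts are hypothetical.
    from math import sqrt, erfc

    def two_proportion_ztest(x1, n1, x2, n2):
        """x = students with one or more referrals, n = group size."""
        p1, p2 = x1 / n1, x2 / n2
        pooled = (x1 + x2) / (n1 + n2)
        se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
        z = (p1 - p2) / se
        p_value = erfc(abs(z) / sqrt(2))  # two-sided p-value from the normal distribution
        return p1 - p2, z, p_value

    # Hypothetical Program group (120 of 1,000 students with a referral) versus a
    # hypothetical comparison group (5,500 of 40,000 students with a referral).
    diff, z, p = two_proportion_ztest(120, 1000, 5500, 40000)
    print(f"difference = {diff:.2%}, z = {z:.2f}, significant at 0.05 = {p < 0.05}")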

Table A1a: Comparison of Key Student Characteristics in the Program Group across All Elementary Schools Evaluated and its Comparison Group
(Each row: Program group percent (number of students); comparison group percent (number of students); difference between groups)
Kindergarten: 18.0% (807); 18.2% (6309); -0.2%
Grade 1: 17.8% (798); 17.9% (6195); -0.1%
Grade 2: 16.9% (757); 16.8% (5804); 0.1%
Grade 3: 17.4% (779); 17.6% (6085); -0.2%
Grade 4: 14.2% (635); 14.3% (4946); -0.1%
Grade 5: 15.6% (697); 15.2% (5271); 0.4%
Black: 52.0% (2324); 50.5% (17486); 1.5%
White: 22.0% (984); 22.7% (7870); -0.7%
Hispanic: 18.7% (836); 19.3% (6688); -0.6%
Other ethnicity: 7.4% (329); 7.4% (2566); 0.0%
LEP: 20.3% (908); 20.7% (7156); -0.4%
ESE: 13.3% (594); 12.8% (4428); 0.5%
Free/reduced lunch: 73.7% (3298); 73.0% (25278); 0.7%
Total Number of Students: 4473; 34610; -30137

Table A1b: Comparison of Key Student Characteristics in the Program Group across All Middle Schools Evaluated and its Comparison Group
(Each row: Program group percent (number of students); comparison group percent (number of students); difference between groups)
Grade 6: 32.6% (1188); 32.9% (6963); -0.3%
Grade 7: 33.4% (1217); 33.3% (7044); 0.1%
Grade 8: 34.0% (1241); 33.8% (7136); 0.2%
Black: 37.8% (1377); 35.8% (7568); 2.0%
White: 35.1% (1279); 36.2% (7663); -1.1%
Hispanic: 20.4% (744); 21.0% (4439); -0.6%
Other ethnicity: 6.7% (246); 7.0% (1473); -0.3%
LEP: 14.3% (520); 11.6% (2448); 2.7%
ESE: 12.5% (457); 12.6% (2673); -0.1%
Free/reduced lunch: 49.3% (1797); 48.6% (10279); 0.7%
Total Number of Students: 3646; 21143; -17497

Summaries of the key characteristics of students in the Program group at all Program elementary and middle schools as groups, and the range among individual Program elementary and middle schools, are shown in Tables A1c and A1d respectively.

Table A1c: Summary of the Key Characteristics in the Program Group for All Program Elementary Schools as a Group and the Variation among Individual Program Elementary Schools
(Each row: all Program elementary schools as one group, percent (number of students); variation among individual schools, percent range (range in number of students))
Kindergarten: 18.0% (807); 14%-25.3% (42-178)
Grade 1: 17.8% (798); 15%-27.8% (47-168)
Grade 2: 16.9% (757); 14%-21.6% (52-150)
Grade 3: 17.4% (779); 14.9%-25.3% (50-142)
Grade 4: 14.2% (635); 0%-17.3%
Black: 52.0% (2324); 14.3%-95.4% (39-538)
White: 22.0% (984); 0%-54.9% (0-327)
Hispanic: 18.7% (836); 2.1%-40.2% (12-295)
Other ethnicity: 7.4% (329); 2.5%-11.6% (14-108)
LEP: 20.3% (908); 6.7%-36% (21-264)
ESE: 13.3% (594); 11%-18.3% (38-104)
Free/reduced lunch: 73.7% (3298); 36.6%-96.5% (100-673)
Total Number of Students: 100% (4473); 273-929

Table A1d: Summary of the Key Characteristics in the Program Group for All Program Middle Schools as a Group and the Variation among Individual Program Middle Schools
(Each row: all Program middle schools as one group, percent (number of students); variation among individual schools, percent range (range in number of students))
Grade 6: 32.6% (1188); 28.4%-35.1% (327-458)
Grade 7: 33.4% (1217); 32.7%-34.3% (390-432)
Grade 8: 34.0% (1241); 31.7%-37.2% (399-428)
Black: 37.8% (1377); 10.3%-65.7% (134-755)
White: 35.1% (1279); 15.7%-64.7% (181-844)
Hispanic: 20.4% (744); 13%-32.1% (150-383)
Other ethnicity: 6.7% (246); 5.6%-8.8% (64-115)
LEP: 14.3% (520); 4.8%-20.5% (62-236)
ESE: 12.5% (457); 7.3%-16.7% (95-199)
Free/reduced lunch: 49.3% (1797); 19.8%-73.5% (258-876)
Total Number of Students: 100% (3646); 1150-1304

As shown in Tables A1c and A1d, the range of key characteristics from school to school is considerable. For this reason, when analyzing achievement at the school level, separate comparison groups were selected to match the unique characteristics of each school. A detailed comparison of the Program group and the statistically similar comparison group for each individual Program school can be found in Appendix S.

Results

Results of the analysis are reported in Tables A1e and A1f. Table A1e reports the results for elementary schools and Table A1f reports the results for middle schools. The total number of students and the percent of those students receiving one or more referrals in FY2006 are reported for both Program and comparison groups. The last two columns measure the relative value of the difference between the two groups. First, statistical significance of the difference in the percent of students receiving one or more referrals in FY2006 is indicated by the superscript S (S). No statistically significant difference between groups is indicated by the superscript NS (NS). When the difference in the percent of referrals is not statistically significant, the educational effect size3 of the difference is not reported (NR), as there is no evidence that the difference is not due to chance.

Table A1e: Comparison of the Percent of Students Receiving 1 or more Referrals in FY2006 in the Elementary School Program Groups and their Comparison Groups
(Each row: Program group number of students and percent with 1 or more referrals; comparison group number of students and percent; difference; educational effect size)
Grassy Waters: 929 students, 4.63%; comparison group 43,922 students
Hidden Oaks: 713 students, 2.95%; comparison group 44,776 students
Starlight Cove: 734 students, 10.90%; comparison group 37,795 students, 8.00%; difference 2.90% (S); Discernable (-)
S = Statistical Significance NS = No Statistical Significance NR = Not Reported

3 Effect sizes are reported as Negligible, Discernable, Substantial, Extensive, or Extreme.

Table A1f: Comparison of the Percent of Students Receiving 1 or more Referrals in FY2006 in the Middle School Program Groups and their Comparison Groups
(Each row: Program group number of students and percent with 1 or more referrals; comparison group number of students and percent; difference; educational effect size)
Carver: 1,150 students, 44.96%; comparison group 7,832 students, 42.05%; difference 2.91% (NS); NR
Don Estridge High Tech: 1,304 students, 21.47%; comparison group 21,880 students, 25.13%; difference -3.66% (S); Negligible (+)
Jeaga: 1,192 students, 44.13%; comparison group 22,312 students, 38.75%; difference 5.37% (S); Discernable (-)
S = Statistical Significance NS = No Statistical Significance NR = Not Reported

These tables indicate that:

For all elementary students as a group, there was no statistically significant difference between the Program group and the comparison group in the percent of students receiving one or more referrals.
o At three elementary schools evaluated, the percents of students receiving one or more referrals were significantly lower than those in their comparison groups: Dr. Mary McLeod Bethune (substantial effect size), Grassy Waters (discernable effect size), and Hidden Oaks (discernable effect size).
o At four elementary schools evaluated, the percents of students receiving one or more referrals were significantly higher than those in their comparison groups: D. D. Eisenhower (substantial effect size), Elbridge Gale (substantial effect size), Lincoln (substantial effect size), and Starlight Cove (discernable effect size).

For all middle students as a group, the percent of Program students receiving one or more referrals was statistically higher than that of the comparison group, but the effect size was negligible.
o At one of the three middle schools evaluated, Jeaga, the percent of students receiving one or more referrals was significantly higher than that of its comparison group, and the effect size was discernable.
o At one of the three middle schools evaluated, Don Estridge, the percent of students receiving one or more referrals was significantly lower than that of its comparison group, but the effect size was negligible.

Question A2. In FY2006, were the percents of students receiving one or more out-of-school suspensions in the Program schools significantly lower than those of matched comparison groups?

Method

The percents of students receiving one or more out-of-school suspensions in the Program schools were compared to those of matched comparison groups of students.

A test of independent proportions was used to assess whether the differences between Program and comparison groups were statistically different from each other at the 0.05 level. The selection of comparison groups is described in the Method section of Question A1.

Results

Results of the analysis are reported in Tables A2a and A2b. Table A2a reports the results for elementary schools and Table A2b reports the results for middle schools. The total number of students and the percent of those students receiving one or more out-of-school suspensions (OSS) in FY2006 are reported for both Program and comparison groups. The last two columns measure the relative value of the difference between the two groups. First, statistical significance of the difference in the percent of students receiving one or more OSS in FY2006 is indicated by the superscript S (S). No statistically significant difference between groups is indicated by the superscript NS (NS). When the difference in the percent of OSS is not statistically significant, the educational effect size4 of the difference is not reported (NR), as there is no evidence that the difference is not due to chance.

Table A2a: Comparison of the Percent of Students Receiving 1 or more Out-of-School Suspensions in FY2006 in the Elementary School Program Groups and their Comparison Groups
(Each row: Program group number of students and percent with 1 or more OSS; comparison group number of students and percent; difference; educational effect size)
Dr. Mary McLeod Bethune: 564 students, 5.32%; comparison group 14,563 students, 8.96%; difference -3.64% (S); Discernable (+)
Elbridge Gale: 273 students, 1.83%; comparison group 22,395 students; difference 0.19% (NS); NR
Galaxy: 454 students, 7.71%; comparison group 17,960 students, 7.54%; difference 0.17% (NS); NR
Grassy Waters: 929 students, 2.69%; comparison group 43,922 students, 4.40%; difference -1.71% (S); Negligible (+)
Hidden Oaks: 713 students, 1.40%; comparison group 44,776 students, 2.87%; difference -1.47% (S); Negligible (+)
Lincoln: 506 students, 14.82%; comparison group 15,422 students, 6.53%; Substantial (-)
Starlight Cove: 734 students, 5.04%; comparison group 37,795 students, 3.81%; difference 1.23% (NS); NR
S = Statistical Significance NS = No Statistical Significance NR = Not Reported

4 Effect sizes are reported as Negligible, Discernable, Substantial, Extensive, or Extreme.

Table A2b: Comparison of the Percent of Students Receiving 1 or more Out-of-School Suspensions in FY2006 in the Middle School Program Groups and their Comparison Groups
(Each row: Program group number of students and percent with 1 or more OSS; comparison group number of students and percent; difference; educational effect size)
Carver: 1,150 students, 30.96%; comparison group 7,832 students, 22.43%; difference 8.53% (S); Substantial (-)
Don Estridge High Tech: 1,304 students, 5.67%; comparison group 21,880 students, 11.09%
Jeaga: 1,192 students, 18.79%; comparison group 22,312 students, 20.55%; difference -1.76% (NS); NR
S = Statistical Significance NS = No Statistical Significance NR = Not Reported

These tables indicate that:

For all elementary students as a group, there was no statistically significant difference between the Program group and the comparison group in the percent of students receiving one or more out-of-school suspensions.
o At three of the eight elementary schools evaluated, the percents of students receiving one or more out-of-school suspensions were significantly lower than those in their comparison groups: Dr. Mary McLeod Bethune (discernable effect size), Grassy Waters (negligible effect size), and Hidden Oaks (negligible effect size).
o At one elementary school, Lincoln (substantial effect size), the percent of students receiving one or more out-of-school suspensions was significantly higher than that of its comparison group.

For all middle students as a group, there was no statistically significant difference between the Program group and the comparison group in the percent of students receiving one or more out-of-school suspensions.
o At one middle school, Don Estridge, the percent of students receiving one or more out-of-school suspensions was significantly lower than that of its comparison group, and the effect size was discernable.
o At one middle school, Carver, the percent of students receiving one or more out-of-school suspensions was significantly higher than that of its comparison group, and the effect size was substantial.

Question B1. In FY2006, was the mean Positive School Climate and Safe and Orderly Environment (PSC) correlate on the School Effectiveness Questionnaire (SEQ) significantly higher at Program schools than in FY2005?

Method

The means of the PSC correlate on the teacher version of the SEQ at Program schools were compared to the results at those same schools in FY2005. A t-test was used to assess whether the differences between FY2006 and FY2005 results were statistically significant at the 0.05 level.

Results

Results are reported in Tables Bla and Bl b. Table Bl a reports the results for elementary schools and Table Bl b reports the results for middle schools. The total number of Program teachers responding to the SEQ and mean PSC are reported for FY2005 and FY2006. The last two columns measure the relative value of the change between the two years. First, statistical significance of the difference in the mean in FY2005 and FY2006 is indicated by the use of the superscript S (3. No statistically significant difference between years is indicated by superscript NS ( '). When the difference in the mean is not statistically significant, the educational effect size5 of the difference is not reported (NR) as there is no evidence that differences are not due to chance. Table Bla: Mean Teacher Responses on the PSC Correlate of the SEQ in Program Elementary Schools All Program Elementary Schools 348 30 1 3.85 D. D. Eisenhower 49 28 3.78 3.94 Dr. Mary McLeod Bethune 58 39 3.34 0.1 1 NS Elbridge Gale NR 30 N R 4.90 Galaxy 39 29 3.98 Grassy Waters 66 87 4.06 4.10 Hidden Oaks NR 44 NR 4.20 N R NR Lincoln Starlight Cove 53 83 S = Statistical Significance NS = No Statistical Significance NR = Not Reported Table Bl b: Mean Teacher Responses on the PSC Correlate of the SEQ in Program Middle Schools School All Program Middle Schools Number of Teachers Responding 39 79 1 Carver 1 57 1 37 1 3.36 1 2.84 1 4.52s 1 Exceptional (-) 11 I I I I I I Don Estridge High Tech 2005 163 63 2006 174 53 3.45 4.27 Mean PSC 2005 3.47 Jeaga 43 84 3.40 3.47 0.07~~ I I I I I I1 I U S = Statistical Significance NS = No Statistical Significance NR = Not Reported 3.96 4.12 2006 3.50 0.50 Extensive (+) -- -0.14~~ N R Relative Value of Change Gain 0.03NS FY2006 Educational Effect Size NR 3.62 4.02 0.40~ Substantial (+) 5 Effect sizes are reported as Negligible, Discernable, Substantial, Exceptional, or Extreme. Page 33

These tables indicate that:

For all Program elementary teachers as a group, there was a statistically significant positive difference in the mean PSC correlate of the SEQ from FY2005 to FY2006, and the effect size was discernable.
o At Lincoln, the mean PSC was statistically higher in FY2006 than in FY2005, and the effect size was extensive.

For all Program middle teachers as a group, there was no statistically significant difference in the mean PSC correlate of the SEQ from FY2005 to FY2006.
o At Don Estridge High Tech, PSC was statistically higher in FY2006 than in FY2005, and the effect size was substantial.
o At Carver, PSC was statistically lower in FY2006 than in FY2005, and the effect size was exceptional.

Question B2. In FY2006, were the mean numbers of student discipline referrals at Program schools significantly lower than in FY2005?

Method

The mean numbers of student referrals in the Program schools in FY2006 were compared to those at the same schools in FY2005. A t-test was used to assess whether the differences between FY2006 and FY2005 results were statistically different from each other at the 0.05 level.

Results

Results are reported in Tables B2a and B2b. Table B2a reports the results for elementary schools and Table B2b reports the results for middle schools. The total number of Program students and the mean number of student referrals are reported for FY2005 and FY2006. The last two columns measure the relative value of the change between the two years. First, statistical significance of the difference in the mean number of referrals between FY2005 and FY2006 is indicated by the superscript S (S). No statistically significant difference between years is indicated by the superscript NS (NS). When the difference in the mean number of referrals is not statistically significant, the educational effect size6 of the difference is not reported (NR), as there is no evidence that the difference is not due to chance.

6 Effect sizes are reported as Negligible, Discernable, Substantial, Extensive, or Extreme.
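Before the tabulated results, the year-over-year comparison just described can be illustrated as follows. The sketch is not the evaluation's code: the report does not state whether a pooled or unequal-variance t-test was used (Welch's version is assumed here), and the per-student referral counts are simulated, not district data.

    # Illustrative Welch t-test comparing mean per-student referral counts in two
    # years. The simulated counts are hypothetical stand-ins for district records.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    referrals_fy2005 = rng.poisson(lam=1.6, size=3600)  # hypothetical FY2005 counts
    referrals_fy2006 = rng.poisson(lam=1.3, size=3600)  # hypothetical FY2006 counts

    t_stat, p_value = stats.ttest_ind(referrals_fy2005, referrals_fy2006, equal_var=False)
    gain = referrals_fy2005.mean() - referrals_fy2006.mean()
    print(f"mean decrease = {gain:.3f}, t = {t_stat:.2f}, "
          f"significant at 0.05 = {p_value < 0.05}")

The same form of test underlies the comparisons for Questions B1 and B3, with mean PSC responses and mean out-of-school suspensions, respectively, in place of mean referrals.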

Table B2a: Mean Number of Student Referrals in Program Elementary Schools
(Each row: number of students in 2005 and 2006; mean number of referrals in 2005 and 2006; gain; FY2006 educational effect size)
All Program Elementary Schools: 3881; 4784; 0.2113; 0.2028; gain 0.0085 (NS); NR
Dr. Mary McLeod Bethune: 514; 571; 0.1265; 0.0928
Elbridge Gale: NR; 278; NR; 0.1438
Galaxy: 436; 495; 0.2615; 0.3131
Grassy Waters: 859; 903; 0.1024; 0.0532; Negligible (+)
Hidden Oaks: NR; 718; NR; 0.032
Lincoln: 569; 526; 0.2355; 0.5019; gain -0.2664 (S); Substantial (-)
Starlight Cove: 905; 783; 0.221; 0.2299; gain -0.0089 (NS); NR
S = Statistical Significance NS = No Statistical Significance NR = Not Reported

Table B2b: Mean Number of Student Referrals in Program Middle Schools
(Each row: number of students in 2005 and 2006; mean number of referrals in 2005 and 2006; gain; FY2006 educational effect size)
All Program Middle Schools: 3661; 3630; 1.652; 1.2504; gain 0.4016 (S); Discernable (+)
Carver: 1346; 1149; 2.4933; 1.8477; gain 0.6456 (S); Substantial (+)
Don Estridge High Tech: 1107; 1302; 0.8338; 0.5492; gain 0.2846 (S); Negligible (+)
Jeaga: 1208; 1179; 1.4644; 1.4427; gain 0.0217 (NS); NR
S = Statistical Significance NS = No Statistical Significance NR = Not Reported

These tables indicate that:

For all Program elementary students as a group, there was no statistically significant difference in the mean number of referrals from FY2005 to FY2006.
o At Lincoln, the mean number of referrals was statistically higher in FY2006 than in FY2005, and the effect size was substantial.
o At Grassy Waters, the mean number of referrals was statistically lower in FY2006 than in FY2005, but the effect size was negligible.

For all Program middle students as a group, there was a statistically significant decrease in the mean number of referrals from FY2005 to FY2006, and the effect size was discernable.
o At 2 of the 3 middle schools evaluated, the mean number of referrals was statistically lower in FY2006 than in FY2005: Carver (substantial effect size) and Don Estridge High Tech (negligible effect size).

Question B3. In FY2006, were the mean numbers of out-of-school suspensions at Program schools significantly lower than in FY2005?

Method

The mean numbers of out-of-school suspensions in the Program schools in FY2006 were compared to those at the same schools in FY2005. A t-test was used to assess whether the differences between FY2006 and FY2005 results were statistically different from each other at the 0.05 level.

Results

Results are reported in Tables B3a and B3b. Table B3a reports the results for elementary schools and Table B3b reports the results for middle schools. The total number of Program students and the mean number of student out-of-school suspensions are reported for FY2005 and FY2006. The last two columns measure the relative value of the change between the two years. First, statistical significance of the difference in the mean number of out-of-school suspensions between FY2005 and FY2006 is indicated by the superscript S (S). No statistically significant difference between years is indicated by the superscript NS (NS). When the difference in the mean number of out-of-school suspensions is not statistically significant, the educational effect size7 of the difference is not reported (NR), as there is no evidence that the difference is not due to chance.

7 Effect sizes are reported as Negligible, Discernable, Substantial, Extensive, or Extreme.

Table B3a: Mean Number of Student Out-of-School Suspensions in Program Elementary Schools
(Each row: number of students in 2005 and 2006; mean number of out-of-school suspensions in 2005 and 2006; gain; FY2006 educational effect size)
All Program Elementary Schools: 3881; 4784; 0.0448; 0.0679; gain -0.0231 (S); Negligible (-)
D. D. Eisenhower: 598; 510; 0.0351; 0.0451; gain -0.01 (NS); NR
Dr. Mary McLeod Bethune: 514; 571; 0.0389; 0.0543; gain -0.0154 (NS); NR
Elbridge Gale: NR; 278; NR; 0.0179; NR; NR
Galaxy: 436; 495; 0.0711; 0.0949
Grassy Waters: 859; 903; 0.0419; 0.031; gain 0.0109 (NS); NR
Hidden Oaks: NR; 718; NR; 0.0139
Lincoln: 569; 526; 0.0351; 0.2567; Extensive (-)
Starlight Cove: 905; 783; 0.0508; 0.0587; gain -0.0079 (NS); NR
S = Statistical Significance NS = No Statistical Significance NR = Not Reported

Table B3b: Mean Number of Student Out-of-School Suspensions in Program Middle Schools
(Each row: number of students in 2005 and 2006; mean number of out-of-school suspensions in 2005 and 2006; gain; FY2006 educational effect size)
Carver: 1346; 1149; 0.8522; 0.7137; Discernable (+)
Don Estridge High Tech: 1107; 1302; 0.122; 0.0806; Negligible (+)
Jeaga: 1208; 1179; 0.4023; 0.3249; gain 0.0774 (NS); NR
S = Statistical Significance NS = No Statistical Significance NR = Not Reported

These tables indicate that:

For all Program elementary students as a group, there was a statistically significant increase in the mean number of out-of-school suspensions from FY2005 to FY2006, but its effect size was negligible.
o At Lincoln, the mean number of out-of-school suspensions was statistically higher in FY2006 than in FY2005, and its effect size was extensive.

For all Program middle students as a group, there was a statistically significant decline in the mean number of out-of-school suspensions from FY2005 to FY2006, and its effect size was discernable.
o At 2 of the 3 middle schools evaluated, the mean number of out-of-school suspensions was statistically lower in FY2006 than in FY2005: Carver (discernable effect size) and Don Estridge High Tech (negligible effect size).

Question C1. In FY2006, did 80% of all teachers surveyed at Program schools respond 3, 4, or 5 on a 5-point rubric that CHAMPs strategies contribute to more focused instructional time?

Method

The number of teachers responding 3, 4, or 5 in FY2006 was recorded for each Program school to assess whether it equaled 80% or more of all respondents.

Results

Results are reported in Tables C1a and C1b. Table C1a reports the results for elementary schools and Table C1b reports the results for middle schools. The total number of Program teachers surveyed and the percent of teachers responding 3, 4, or 5 to the survey in FY2006 are reported. The last column reports whether the 80% standard was met.

Table C1a: Percent of Teachers in Program Elementary Schools Responding 3, 4, or 5 to CHAMPs Survey Question #9
(Columns: number of teachers surveyed; percent of teachers responding 3, 4, or 5; 80% standard met)
All Program Elementary Schools: 90%; YES
D. D. Eisenhower: 24; 96%; YES

Table C1b: Percent of Teachers in Program Middle Schools Responding 3, 4, or 5 to CHAMPs Survey Question #9
(Columns: number of teachers surveyed; percent of teachers responding 3, 4, or 5)
All Program Middle Schools: 137; 79%
Carver: 32; 78%
Don Estridge High Tech: 46; 80%

These tables indicate that:

Across all Program elementary schools as a group, 90% of the teachers surveyed responded 3, 4, or 5 to the CHAMPs survey, which is 10% more than required by the standard.
o At seven elementary schools, 80% or more of the teachers surveyed responded 3, 4, or 5 to the survey. Those schools included D. D. Eisenhower, Dr. Mary McLeod Bethune, Elbridge Gale, Galaxy, Grassy Waters, Hidden Oaks, Lincoln, and Starlight Cove.

Across all Program middle schools as a group, only 79% of the teachers surveyed responded 3, 4, or 5 to the CHAMPs survey, which is 1% less than required by the standard.
o At only one middle school, Don Estridge, did 80% of the teachers surveyed respond 3, 4, or 5 to the survey.

Question C2. From FY2005 to FY2006, did cohort groups of Program students significantly out-perform matched comparison groups on FCAT SSS Reading
a. In Developmental Scale Score (DSS) gains?
b. In the portion of a year's growth in reading in one year?
c. In estimated number of years required to move students from a basic to proficient level of reading?8

Method

The reading achievement gains of Program students were compared to those of matched comparison groups of similar students. The comparison groups were randomly selected from students throughout the District who had never participated in the Program. Comparison groups were selected by matching the key characteristics of Program students.

8 The lower end of Level 2 on FCAT SSS Reading to the lower end of Level 3.

Key characteristics were current grade level; retention in FY2005 and FY2006; race/ethnicity (Black, White, Hispanic, or other); LEP (limited English proficiency: those enrolled in ESOL classes or in the two-year monitoring program after exiting ESOL); ESE (exceptional student education status: all exceptionalities other than gifted, speech impaired, or hospital/homebound); free/reduced lunch program status; and the prior year's FCAT achievement levels.

Separate analyses were completed for students in the Program across all elementary schools and across all middle schools, as well as for the students in the Program at each individual school. A detailed comparison of students in the Program group across all elementary schools evaluated and all middle schools evaluated and their comparison groups is reported in Tables C2a and C2b respectively. Reading scores were matched on FY2005 FCAT results. Retention was matched for both FY2005 and FY2006. The remaining student characteristics were matched on FY2006 demographic data.

The size of each comparison group was made as large as possible as long as the key characteristics remained within 3% of those of the Program group. This was done to maximize the stability9 of each comparison group without compromising the similarity between the Program and comparison groups. The Program students evaluated and the comparison group students were statistically similar in demographics and FY2005 FCAT scores.

Table C2a: Comparison of Key Student Characteristics in the Program Group across All Elementary Schools Evaluated and its Comparison Group
(Each row: Program group percent (number of students); comparison group percent (number of students); difference between groups)
Grade 3: 5.1% (61); 5.2% (410); -0.1%
Grade 4: 44.5% (533); 45.4% (3596); -0.9%
Grade 5: 50.4% (604); 49.4% (3916); 1.0%
Retention: 6.5% (78); 6.2% (495); 0.3%
Black: 56.9% (682); 55.4% (4389); 1.5%
White: 18.9% (227); 19.9% (1573); -1.0%
Hispanic: 17.3% (207); 17.9% (1421); -0.6%
Other ethnicity: 6.8% (82); 6.8% (539); 0.0%
ESE: 16.0% (192); 15.8% (1250); 0.2%
Prior Level 1 Reading: 28.5% (342); 26.1% (2064); 2.4%
Prior Level 2 Reading: 18.0% (216); 17.8% (1407); 0.2%
Prior Level 3 Reading: 35.3% (423); 37.2% (2950); -1.9%
Prior Level 4 Reading: 15.4% (184); 16.1% (1277); -0.7%
Prior Mean DS Score: 1327.4 (1198); 1338.1 (7922); -10.7
Total Number of Students: 100.0% (1198); 100.0% (7922); -6724

9 Stability is necessary to ensure that any randomly selected comparison group has the same characteristics as any other randomly selected comparison group.

Table C2b: Comparison of Key Student Characteristics in the Program Group across All Middle Schools Evaluated and its Comparison Group
(Each row: Program group percent (number of students); comparison group percent (number of students); difference between groups)
Grade 6: 31.8% (1037); 32.0% (4973); -0.2%
Grade 7: 33.8% (1103); 33.8% (5260); 0.0%
Grade 8: 34.3% (1119); 34.2% (5327); 0.1%
Retention: 2.9% (94); 2.3% (360); 0.6%
Black: 37.8% (1232); 36.4% (5659); 1.4%
White: 35.7% (1165); 37.1% (5774); -1.4%
Hispanic: 19.6% (638); 19.9% (3091); -0.3%
Other ethnicity: 6.9% (224); 6.7% (1036); 0.2%
ESE: 12.3% (401); 11.5% (1793); 0.8%
Free/reduced lunch: 48.4% (1576); 47.9% (7458); 0.5%
Prior Level 1 Reading: 23.7% (773); 22.3% (3471); 1.4%
Prior Level 2 Reading: 19.1% (624); 18.8% (2930); 0.3%
Prior Level 3 Reading: 30.4% (992); 31.4% (4887); -1.0%
Prior Level 4 Reading: 19.7% (643); 20.4% (3178); -0.7%
Prior Mean DS Score: 1660.5; 1670.3; -9.7
Total Number of Students: 100.0% (3259); 100.0% (15560); -12301

Summaries of the key characteristics of students in the Program group at all Program elementary and middle schools as groups, and the range among individual Program elementary and middle schools, are shown in Tables C2c and C2d respectively.

Table C2c: Summary of the Key Characteristics in the Program Group for All Program Grade 3 Grade 4 5.1 % 44.5% 61 533 1 0.7% - 12.6% 37.8% - 51.9% Grade 5 50.4% 604 43.2% - 57.4% Retention 6.5% 78 0.7% - 15.3% Black 56.9% 682 27.8% - 97.3% White 18.9% 227 0.0% - 44.4% Hispanic 17.3% 207 0.5% - 38.5% 1-14 44-116 48-141 LEP 19.9% 238 3.8% - 35% 7-79 ESE 16.0% 192 11.1%-21.6% Prior Level 1 Reading Prior Level 2 Reading Prior Level 3 Reading Prior Level 4 Reading Prior Mean DS Score Total Number of Students 28.5% 18.0% 35.3% 15.4% 1327.4 100.0% 342 216 423 184 1198 1198 16.0% - 43.2% 14.2% - 21.6% 30.6% - 45.1 % 5.9% - 24.5% 1225.0-1457.0 100.0% - 100.0% 17-75 1 17-49 34-102 11-58 106-267 106-267 Table C2d: Summary of the Key Characteristics in the Program Group for All Program Middle Schools as a Group and the Variation among Individual Program lhliddle Schools Key Characteristics Variation among Afl Program Number of Schools Individual Students as One Group Schools Grade 6 31.8% 1037 27.9% - 34.2% Grade 7 Grade 8 34.3% 1119 32.1% - 38.4% 326-400 Retention 2.9% 94 0.8% - 4.3% 10-42 Black 37.8% 1232 10.7% - 65.7% 133-673 White 35.7% 1165 16.1 % - 63.6% 165-794 Hispanic 19.6% 638 12.8% - 30.5% 131-301 Other ethnicity 6.9% 224 5.4% - 9.2% 54-115 LEP 13.1% 426 5% - 19% 62-195 ESE 12.3% 401 7.2% - 16.5% 90-163 Freelreduced lunch 48.4% 1576 20.2% - 74.2% 252-732 Prior Level 1 Reading 23.7% 773 9% - 35.4% 112-363 Prior Level 2 Reading 19.1% 624 14.8% - 23% 185-227 Prior Level 3 Reading 30.4% 992 24.8% - 34% 254-424 Prior Level 4 Reading 19.7% 643 12.5% - 29.2% 123-365 Prior Mean DS Score 1660.5 3259 1560.6-1812 987-1248 1 Total Number of Students I 100.0% 3259 1 100.0% - 100.0% 1 987-1248 1 10 Elbridge Gale Elementary School was not evaluated as an individual school because there were fewer than 30 students in the program. Page 42

As shown in Tables C2c and C2d, the range of key characteristics from school to school is considerable. For this reason, when analyzing achievement at the school level, separate comparison groups were selected to match the unique characteristics of each school. A detailed comparison of the Program group and the statistically similar comparison group for each individual Program school can be found in Appendix S2.

Student achievement was measured by calculating the mean gain scores for students in the Program groups and their respective comparison groups from FY2005 to FY2006 for FCAT SSS Reading DSS scores. A t-test was used to assess whether the mean gains of each Program and comparison group were statistically different from each other at the 0.05 level.

Results

The results are summarized in Tables C2e and C2f. Table C2e presents the results for Program elementary schools as a group and individually. Table C2f presents the results for Program middle schools as a group and individually. The tables are organized in the following manner: Program and comparison group means are reported as DSS gains made from FY2005 to FY2006, the mean portion of a year's growth in reading proficiency made from FY2005 to FY2006,11 and the mean predicted years needed to move students from a basic to a proficient12 level of reading. The last three columns measure the relative Program value means. Statistical significance of the differences in mean DSS gain between the Program groups and their comparison groups is indicated by the superscript S (S). No statistically significant difference between groups is indicated by the superscript NS (NS). When the difference in DSS mean gains is not statistically significant, the difference in the portion of a year's growth and the years needed to move a student from a basic to a proficient level of reading are not reported (NR).

11 A portion of a year's growth of 1.5 would indicate that approximately one and a half year's growth took place in one year.
12 The years needed to move a student from basic to proficient assume that the reported portion of a year's growth will remain constant during each year needed to move students to proficiency. When more than five years were needed to move students from a basic to a proficient level of reading, Not attainable was entered, as 9th grade students could not move from a basic to proficient level in reading by the year of their graduation.
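The "portion of a year's growth" column (footnote 11) can be read as the observed DSS gain divided by the DSS gain expected in one year for that grade transition. The expected one-year gains are not listed in this report, so the values in the sketch below are hypothetical placeholders rather than the district's actual norms.

    # Sketch of the "portion of a year's growth" calculation under the assumption
    # described above. EXPECTED_ANNUAL_GAIN is a hypothetical placeholder table.
    EXPECTED_ANNUAL_GAIN = {
        ("grade 4", "grade 5"): 125.0,  # hypothetical expected one-year DSS gain
        ("grade 7", "grade 8"): 100.0,  # hypothetical expected one-year DSS gain
    }

    def portion_of_years_growth(dss_prior, dss_current, transition):
        """Observed DSS gain divided by the expected one-year DSS gain."""
        return (dss_current - dss_prior) / EXPECTED_ANNUAL_GAIN[transition]

    # Hypothetical student: gained 150 DSS points where 125 were expected (1.2 years).
    print(round(portion_of_years_growth(1300.0, 1450.0, ("grade 4", "grade 5")), 2))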

pppp -- Table C2e: Comparison of Gain from FY2005 to FY2006 between Students in the Elementary School Program Groups and their Com~arison Grou~s Program Group Means Comparison Group Means Program Portion Years Portion Years school 2005 2006 DSS of Needed 2005 2006 DSS DSS DSS Gain Year's Basicto DSS DSS Gain ' Growth Proficient Growth Proficient All Program Elementary 1327.4 1446.3 118.9 137.3 1.09 attainable Schools --- Not Bethune 1235.0 1337.9 102.9 0.82 1235.6 1373.0 137.4 1.11 attainable Dwight D' 1457.0 1522.1 65.1 0.42 Eisenhower att -- ~~-~~~ Galaxy 1225.0 1392.1 167.0 1.19 1223.7 1392.1 168.4 1.20 attainable att ait Grassy 1356.8 1493.3 136.5 1.22 Waters Hidden Oaks Not 1401.7 1516.9 115.2 0.79 1419.3 1567.8 148.4 1.16 attainable Not Lincoln 1283.6 1344.8 61.3 1295.7 1408.2 112.5 1.02 -- Starlight Cove 1343.4 14860 142.6 1.00 Not attainable S = Statistical Significance NS = No Statistical Significance NR = Not Reported 1343.9 1492.8 149.0 1.14 att attainable, attainable Not 1-6.4" NR NR 13 The term Not calculable is reported in this column when the difference in DSS mean gains is significant but the years needed to move students from a basic to a proficient level in reading is not attainable by their year of graduation in both the Program and comparison group. Page 44

Table C2f: Comparison of Gain from FY2005 to FY2006 between Students in the Middle School Program Groups and their Comparison Groups
(For each group: FY2005 DSS, FY2006 DSS, DSS gain, portion of a year's growth, and years needed from basic to proficient; the last columns give the relative Program value means.)
All Middle Schools: Program 1660.5, 1779.3, gain 118.7, 1.04, Not attainable; Comparison 1670.3, 1780.7, gain 110.4, 0.95, Not attainable
Carver: Program 1560.6, 1691.9, gain 131.3, 1.01
Don Estridge: Program 1812.0, 1923.2, gain 111.2, 1.30, 7.9; Comparison 1791.9, 1879.0, gain 87.1, 0.90, Not attainable; relative: DSS gain 24.1 (S), 0.40, Fewer15
Jeaga: Program 1572.7, 1687.9, gain 115.2, 0.75; Comparison 1574.8, 1707.5, gain 132.7, 1.05, Not attainable; relative: portion of a year's growth -0.30, Not calculable14
S = Statistical Significance NS = No Statistical Significance NR = Not Reported

Table C2e indicates that, from FY2005 to FY2006, for all Program elementary schools as a group:

The Program group had a mean DSS gain that was statistically lower than that of its comparison group.

Table C2e also indicates that, from FY2005 to FY2006, at the seven individual Program schools evaluated:

The Program group at Dwight D. Eisenhower had a mean DSS gain that was statistically lower than that of its comparison group.

The Program groups at the remaining six schools had mean DSS gains that were not statistically different from those of their respective comparison groups.

Table C2f indicates that, from FY2005 to FY2006, for all Program middle schools as a group:

The Program group had a mean DSS gain that was statistically greater than that of its comparison group.
o The portion of a year's growth made by the Program group was greater than that of its comparison group by only 0.09 of one year. Were that difference to remain constant, it would take eleven years for students to gain only one year's worth of growth over the comparison group in moving students from a basic to proficient level of reading.

Table C2f also indicates that, from FY2005 to FY2006, at the three individual Program schools evaluated:

The Program group at Don Estridge had a mean DSS gain that was statistically greater than that of its comparison group.

14 The term Not calculable is reported in this column when the difference in DSS mean gains is significant but the years needed to move students from a basic to a proficient level in reading is not attainable by their year of graduation in both the Program and comparison group.
15 The term Fewer is used when the years needed to move students from basic to proficient is lower for the Program group than for the comparison group. The term More is used when the years needed to move students from basic to proficient is higher for the Program group than for the comparison group.

o The portion of a year's growth made by the Program group at Don Estridge was greater than that of its comparison group by 0.40 of one year. Were that difference to remain constant, it would take two and a half years for students to gain one year's worth of growth over the comparison group in moving students from a basic to proficient level of reading.

The Program group at Jeaga had a mean DSS gain that was statistically lower than that of its comparison group.

The Program group at the remaining school had a mean DSS gain that was not statistically different from that of its comparison group.

Question C3. In FY2006, were the percents of cohort groups of Program students scoring Level 3 or higher on FCAT SSS Reading significantly greater than those of matched cohort comparison groups?

Method

In FY2006, the percents of students proficient in the Program groups were compared to those of matched comparison groups of similar students. The comparison groups were randomly selected from students throughout the District who had never participated in the Program. The selection of comparison groups is described in the Method section of Question C2.

Student achievement was measured by calculating the percent of students both in the Program groups and their respective comparison groups who scored Level 3 or higher (proficient) on the FY2006 FCAT SSS Reading. A test of independent proportions was used to assess whether the percents of proficient students in each Program and comparison group were statistically different from each other at the 0.05 level. If the differences in proficiency were found to be statistically significant, they were also examined for educational significance.

Results

Results are reported in Tables C3a and C3b. Table C3a reports the results for all Program elementary schools as a group and each individual Program elementary school. Table C3b reports the results for all Program middle schools as a group and each individual Program middle school. The tables are organized in the following manner: the percent of proficient students in the Program and comparison groups in FY2005 and FY2006 are reported. The last two columns measure the relative value of the Program. First, statistical significance of the difference in the percent of proficient students in the Program groups and their comparison groups is indicated by the superscript S (S). No statistically significant difference between groups is indicated by the superscript NS (NS). When the difference in the percent of proficient students is not statistically significant, the educational effect size16 of the difference is not reported (NR), as there is no evidence that the difference is not due to chance.

16 Effect sizes are reported as Negligible, Discernable, Substantial, Extensive, or Extreme.
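The report does not publish the formula or cut points behind its educational effect size labels (Negligible, Discernable, Substantial, Extensive, Extreme). As one hedged illustration of how a difference between two proportions can be converted to such a label, the sketch below uses Cohen's h with entirely hypothetical cut points; it should not be read as the evaluation's actual procedure.

    # Hedged illustration only: Cohen's h for a difference between two proportions,
    # classified with HYPOTHETICAL cut points (the report's own thresholds are unknown).
    from math import asin, sqrt

    def cohens_h(p1, p2):
        """Effect size for the difference between two proportions."""
        return 2 * asin(sqrt(p1)) - 2 * asin(sqrt(p2))

    def classify(h, cuts=(0.05, 0.20, 0.50, 0.80)):  # hypothetical thresholds
        labels = ["Negligible", "Discernable", "Substantial", "Extensive", "Extreme"]
        for cut, label in zip(cuts, labels):
            if abs(h) < cut:
                return label
        return labels[-1]

    # Hypothetical groups: 45.3% proficient versus 37.3% proficient.
    h = cohens_h(0.453, 0.373)
    print(f"h = {h:.3f} -> {classify(h)}")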

Table C3a: Comparison of the Percent of Proficient Students in FY2006 in the Elementary School Program Groups and their Comparison Groups
(Columns: Program group percent proficient in 2005 and 2006; comparison group percent proficient in 2005 and 2006; relative Program value: percent of students proficient; educational effect size)
All Program Elementary Schools 53.4% 48.3% 56.2% 53.6% -5.2% Discernable (-)
Bethune 37.8% 28.1% 39.6% Discernable (-) -4.1% NS NR -3.5% NS NR
Grassy Waters 56.9% 56.2% 61.5% 56.2% 0.0% NS NR
Hidden Oaks 66.7% 60.5% 68.4% 65.7% -5.1% NS NR
Lincoln 43.4%
Starlight Cove 60.2% 52.7% 63.1% 56.4% -3.7% NS
S = Statistical Significance NS = No Statistical Significance NR = Not Reported

Table C3b: Comparison of the Percent of Proficient Students in FY2006 in the Middle School Program Groups and their Comparison Groups
(Columns: Program group percent proficient in 2005 and 2006; comparison group percent proficient in 2005 and 2006; relative Program value: percent of students proficient; educational effect size)
All Program Middle Schools 57.1% 34.6% 57.1% 45.3% 58.9% 57.3% -6.1% NS -0.2% NS
Carver 43.8% 43.3% 45.5% 45.9% -2.6% NS NR
Jeaga 46.8% 44.5% 48.3% 46.9% -2.4% NS NR
S = Statistical Significance NS = No Statistical Significance NR = Not Reported

Table C3a indicates that, in FY2006:

The percents of proficient students in the Program group as a whole and at Dr. Mary McLeod Bethune were statistically lower than those of their respective comparison groups.
o The educational effect sizes of the Program groups, both as a whole and at Dr. Mary McLeod Bethune, were discernable.

The percents of proficient students in the Program groups at the remaining six schools were not statistically different from those of their respective comparison groups.

Table C3b indicates that, in FY2006:

The percents of proficient students in the Program group as a whole and at Carver and Jeaga were not statistically different from those of their respective comparison groups.

The percent of proficient students in the Program group at Don Estridge was statistically greater than that of its comparison group.
o The educational effect size of the Program group was discernable.

Question C4. From FY2005 to FY2006, did cohort groups of Program students significantly out-perform matched comparison groups on FCAT SSS Mathematics?
a. In Developmental Scale Score (DSS) gains?
b. In the portion of a year's growth in mathematics in one year?
c. In estimated number of years required to move students from a basic to proficient level of mathematics?17

Method

The mathematics achievement gains of Program students were compared to those of matched comparison groups of similar students. The comparison groups were randomly selected from students throughout the District who had never participated in the Program. Comparison groups were selected by matching the key characteristics of Program students. Key characteristics were current grade level; retention in FY2005 and FY2006; race/ethnicity (Black, White, Hispanic, or other); LEP (limited English proficiency: those enrolled in ESOL classes or in the two-year monitoring program after exiting ESOL); ESE (exceptional student education status: all exceptionalities other than gifted, speech impaired, or hospital/homebound); free/reduced lunch program status; and the prior year's FCAT achievement levels.

Separate analyses were completed for students in the Program across all elementary schools and across all middle schools, as well as for the students in the Program at each individual school. A detailed comparison of students in the Program group across all elementary schools evaluated and all middle schools evaluated and their comparison groups is reported in Tables C4a and C4b respectively. Mathematics scores were matched on FY2005 FCAT results. Retention was matched for both FY2005 and FY2006. The remaining student characteristics were matched on FY2006 demographic data.

The size of each comparison group was made as large as possible as long as the key characteristics remained within 3% of those of the Program group. This was done to maximize the stability18 of each comparison group without compromising the similarity between the Program and comparison groups. The Program students evaluated and the comparison group students were statistically similar in demographics and FY2005 FCAT scores.

Table C4a: Comparison of Key Student Characteristics in the Program Group across All Elementary Schools Evaluated and its Comparison Group

17 The lower end of Level 2 on FCAT SSS Mathematics to the lower end of Level 3.
18 Stability is necessary to ensure that any randomly selected comparison group has the same characteristics as any other randomly selected comparison group.

Table C4b: Comparison of Key Student Characteristics in the Program Group across All Middle Schools Evaluated and its Comparison Grour, Difference Program Number of Comparison umber of Key Characteristics between Group Group Students Groups Grade 6 31.8% 1039 32.0% 4985-0.2% Grade 7 33.8% 1103 33.8% 5276 0.0% t Grade 8 34.4% 1121 34.2% 5331 0.2% Retention 3.0% 97 2.4% 375 0.6% I Black 1 37.8% 1 1233 1 36.2% 1 5638 1 1.6% 1 White Hispanic 35.9% 19.5% I Other ethnicitv 1 6.9% 1 224 1 6.7% 1 1044 1 0.2% 1 LEP ESE Freelreduced lunch Prior Level 1 Mathematics 13.1% 12.3% 48.3% 23.5% 1171 635 I Prior Level 2 Mathematics 1 21.9% 1 713 1 22.2% 1 3468 1-0.3% 1 Prior Level 3 Mathematics Prior Level 4 Mathematics Prior Mean DS Score Total Number of Students 27.7% 18.5% 1713.4 100.0% 426 400 1576 767 903 603 3263 3263 37.2% 19.9% 10.5% 11.7% 47.9% 21.8% 28.3% 19.0% 1717.0 100.0% 5801 3109 1631 1826 7476 3397 4415 2969 15592 15592-1.3% -0.4% 2.6% 0.6% 0.4% 1.7% -0.6% -0.5% -3.6-12329 Summaries of the key characteristics of students in the Program group at all Program elementary and middle schools as groups and the range among individual Program elementary schools and middle schools are shown in Tables C4c and C4d respectively. Table C4c: Summary of the Key Characteristics in the Program Group for All Program Elementary Schools as a Group and the Variation among Individual Program Elementary Schools Page 49

Key Characteristics as One Group Variation among Grade 4 44.5% 538 38.6% - 51.2% Grade 5 50.3% 608 43.3% - 56.1 % Retention Black 57.1% 690 28.0% - 96.7% 39-178 White 18.9% 228 0.0% - 44.5% Hispanic 17.2% 208 1.I% - 38.5% - Other ethnicity LEP ESE Freelreduced lunch Prior Level 1 Mathematics Prior Level 2 Mathematics Prior Level 3 Mathematics Prior Level 4 Mathematics Prior Mean DS Score Total Number of Students 20.0% 17.1% 79.1 % 24.3% 24.3% 37.7% 115% 1331.3 100.0% 241 207 955 294 293 456 139 1208 1208 4.3% - 35.0% 12.2% - 25.6% 50.0% - 97.3% 14.6% - 32.8% 20.5% - 31.9% 29.1% - 41.4% 4.9% - 16.5% 1278.6-1374.8 100.0% - 100.0% 8-79 20-47 74-205 25-63 23-72 34-111 9-34 107-268 107-268 Table C4d: Summary of the Key Characteristics in the Program Group for All Program Middle Grade 7 Grade 8 33.8% 34.4% 1 Retention 1 3.0% 1 97 1 0.8%-4.6% 1 10-45 1 Black White 37.8% 35.9% 1103 1121 1233 1171 33.7% - 34% 32.1 % - 38.3% 10.7% - 65.8% 16.1 % - 63.6% 336-421 328-400 133-675 165-794 Other ethnicity LEP 6.9% 13.1% 224 426 ESE 12.3% 400 Freelreduced lunch 48.3% 1576 090-161 20.2% - 74% I Prior Level 1 Mathematics / 23.5% 1 767 1 8.5%-37.5% / 106-385 1 Prior Level 2 Mathematics Prior Level 3 IWathematics Prior Level 4 Mathematics Prior Mean DS Score 21.9% 27.7% 18.5% 1713.4 713 903 603 3263 5.4% - 9.2% 5% - 19% 17.2% - 26.2% 20.5% - 31% 13% - 26.4% 1632.1-1826 54-115 62-195 252-732 215-259 210-386 129-329 989-1248 I Total Number of Students 1 100.0% 1 3263 1 100.0% - 100.0% 1 989-1248 1 As shown in Tables C4c and C4d, the range of key characteristics from school to school is considerable. For this reason, when analyzing achievement at the school level, separate 19 Elbridge Gale Elementary School was not evaluated as an individual school because there were fewer than 30 students in the program. Page 50

comparison groups were selected to match the unique characteristics of each school. A detailed comparison of the Program group and the statistically similar comparison group for each individual Program school can be found in Appendix S.

Student achievement was measured by calculating the mean gain scores for students in the Program groups and their respective comparison groups from FY2005 to FY2006 for FCAT SSS Mathematics DSS scores. A t-test was used to assess whether the mean gains of each Program and comparison group were statistically different from each other at the 0.05 level.

Results

The results are summarized in Tables C4e and C4f. Table C4e presents the results for Program elementary schools as a group and individually. Table C4f presents the results for Program middle schools as a group and individually. The tables are organized in the following manner: Program and comparison group means are reported as DSS gains made from FY2005 to FY2006, the mean portion of a year's growth in mathematics proficiency made from FY2005 to FY2006,20 and the mean predicted years needed to move students from a basic to a proficient21 level of mathematics. The last three columns measure the relative Program value means. Statistical significance of the differences in mean DSS gain between the Program groups and their comparison groups is indicated by the superscript S (S). No statistically significant difference between groups is indicated by the superscript NS (NS). When the difference in DSS mean gains is not statistically significant, the difference in the portion of a year's growth and the years needed to move a student from a basic to a proficient level of mathematics are not reported (NR).

20 A portion of a year's growth of 1.5 would indicate that approximately one and a half year's growth took place in one year.
21 The years needed to move a student from basic to proficient assume that the reported portion of a year's growth will remain constant during each year needed to move students to proficiency. When more than five years were needed to move students from a basic to a proficient level, Not attainable was entered, as 9th grade students could not move from a basic to proficient level by the year of their graduation.
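The "years needed to move students from a basic to a proficient level" column (footnote 21) follows from the observed annual DSS gain under one plausible reading: divide the DSS distance between the basic and proficient cut points by the observed annual gain, and report Not attainable when the result exceeds five years. The cut-point distance in the sketch below is a hypothetical placeholder; the report does not list the FCAT cut scores it used.

    # Sketch of the years-to-proficiency rule under the assumptions stated above.
    BASIC_TO_PROFICIENT_DSS = 250.0  # hypothetical distance from low Level 2 to low Level 3

    def years_to_proficient(observed_annual_gain, distance=BASIC_TO_PROFICIENT_DSS):
        if observed_annual_gain <= 0:
            return "Not attainable"
        years = distance / observed_annual_gain
        # Per footnote 21, more than five years is reported as Not attainable.
        return "Not attainable" if years > 5 else round(years, 1)

    print(years_to_proficient(98.7))  # hypothetical annual gain -> about 2.5 years
    print(years_to_proficient(40.0))  # slow growth -> "Not attainable"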

Table C4e: Comparison of Gain from FY2005 to FY2006 between Students in the Elementary School Program Groups and their Comparison Groups

S = Statistical Significance; NS = No Statistical Significance; NR = Not Reported

22 The term Not calculable is reported in this column when the difference in DSS mean gains is significant but the years needed to move students from a basic to a proficient level in mathematics is not attainable by their year of graduation in both the Program and comparison group.
23 The term Fewer is used when the years needed to move students from basic to proficient is lower for the Program group than for the comparison group. The term More is used when the years needed to move students from basic to proficient is higher for the Program group than for the comparison group.

Table C4f: Comparison of Gain from FY2005 to FY2006 between Students in the Middle School Program Groups and their Comparison Groups
(For each school: Program group means, comparison group means, and relative Program value means, reported as 2005 DSS, 2006 DSS, DSS gain, portion of a year's growth, and years needed to move from basic to proficient)
  All Program middle schools: Program group 1713.4, 1800.5, gain 87.1, portion 0.98, Not attainable; comparison group 1717.0, 1805.9, gain 88.9, portion 0.97, Not attainable; relative value: gain -1.8 (NS), NR, NR
  Carver: Program group 1632.1, 1713.5, gain 81.3, portion 0.88, Not attainable; relative value: gain -18.6 (S), portion -0.17, Not calculable (24)
  Don Estridge: Program group 1826.0, 1924.6, gain 98.7, portion 1.19; relative value: gain +18.8 (S), portion +0.25, Not calculable
  Jeaga: Program group 1655.7, 1734.1, gain 78.4, portion 0.82; comparison group 1652.4, 1746.1, gain 93.7; relative value: portion -0.19 (S), Not calculable
S = Statistical Significance; NS = No Statistical Significance; NR = Not Reported

Table C4e indicates that, from FY2005 to FY2006, for all Program schools as a group,
- There was no statistically significant difference in the mean DSS gain in mathematics between the Program group and the comparison group.

Table C4e indicates that, from FY2005 to FY2006, of the seven individual Program schools evaluated,
- The Program group at Grassy Waters had a mean DSS gain that was statistically greater than that of its comparison group;
  o The portion of a year's growth made by the Program group at Grassy Waters was greater than that of its comparison group by 0.50 of one year. Were that difference to remain constant, it would take two years for students to gain one year's worth of growth over the comparison group in moving students from a basic to a proficient level of mathematics.
- The Program group at Dr. Mary McLeod Bethune had a mean DSS gain that was statistically lower than that of its comparison group;
- The Program groups at the remaining five schools had mean DSS gains that were not statistically different from those of their respective comparison groups.

Table C4f indicates that, from FY2005 to FY2006, for all Program schools as a group,
- There was no statistically significant difference in the mean DSS gain in mathematics between the Program group and the comparison group.

Table C4f indicates that, from FY2005 to FY2006, of the three individual Program schools evaluated,
- The Program group at Don Estridge had a mean DSS gain that was statistically greater than that of its comparison group;

24 The term Not calculable is reported in this column when the difference in DSS mean gains is significant but the years needed to move students from a basic to a proficient level in mathematics is not attainable by their year of graduation in both the Program and comparison group.

  o The portion of a year's growth made by the Program group at Don Estridge was greater than that of its comparison group by 0.25 of one year. Were that difference to remain constant, it would take four years for students to gain one year's worth of growth over the comparison group in moving students from a basic to a proficient level of mathematics.
- The Program groups at Carver and Jeaga had mean DSS gains that were statistically lower than those of their respective comparison groups.

Question C5. In FY2006, were the percents of cohort groups of Program students scoring Level 3 or higher on FCAT SSS Mathematics significantly greater than those of matched cohort comparison groups?

Method

In FY2006, the percents of proficient students in the Program groups were compared to those of matched comparison groups of similar students. The comparison groups were randomly selected from students throughout the District who had never participated in the Program. The selection of comparison groups is described in the Method section of Question C4. Student achievement was measured by calculating the percent of students in both the Program groups and their respective comparison groups who scored Level 3 or higher (proficient) on the FY2006 FCAT SSS Mathematics. A Test of Independent Proportions was used to assess whether the percents of proficient students in each Program and comparison group were statistically different from each other at the 0.05 level. If the differences in proficiency were found to be statistically significant, they were also examined for educational significance.

Results

Results are reported in Tables C5a and C5b. Table C5a reports the results for all Program elementary schools as a group and each individual Program elementary school. Table C5b reports the results for all Program middle schools as a group and each individual Program middle school. The tables are organized in the following manner: The percents of proficient students in the Program and comparison groups in FY2005 and FY2006 are reported. The last two columns measure the relative value of the Program. First, statistical significance of the difference in the percent of proficient students between the Program groups and their comparison groups is indicated by the superscript S. No statistically significant difference between groups is indicated by the superscript NS. When the difference in the percent of proficient students is not statistically significant, the educational effect size (25) of the difference is not reported (NR), as there is no evidence that the differences are not due to chance.

25 Effect sizes are reported as Negligible, Discernable, Substantial, Extensive, or Extreme.
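As a companion to the Method above, the sketch below (Python) shows one common formulation of a test of independent proportions, a two-proportion z-test with a pooled standard error. The report does not state which formulation it used, and the counts in the example are hypothetical, not drawn from Tables C5a or C5b.

```python
# A minimal sketch of a "Test of Independent Proportions" as a two-proportion
# z-test with a pooled standard error.  The report does not spell out the
# exact formulation it used, and the counts below are hypothetical rather
# than taken from the report's tables.
from math import sqrt
from scipy.stats import norm

def two_proportion_test(x1, n1, x2, n2, alpha=0.05):
    """Compare the proportion proficient in a Program group (x1 of n1)
    with that of its comparison group (x2 of n2)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)                        # pooled proportion
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))  # pooled standard error
    z = (p1 - p2) / se
    p_value = 2 * norm.sf(abs(z))                         # two-sided p-value
    return p1 - p2, z, p_value, p_value < alpha

# Hypothetical example: 440 of 1,000 Program students proficient versus
# 488 of 1,000 comparison students proficient.
diff, z, p, significant = two_proportion_test(440, 1000, 488, 1000)
print(f"difference = {diff:+.1%}, z = {z:.2f}, p = {p:.3f}, significant at 0.05: {significant}")
```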

Table C5a: Comparison of the Percent of Proficient Students in FY2006 in the Elementary School Program Groups and their Comparison Groups
(For each school: Program group percent proficient in 2005 and 2006; comparison group percent proficient in 2005 and 2006; difference in the percent of proficient students; educational effect size)
  All Program elementary schools: Program 51.4%, 44.0%; comparison 53.8%, 48.8%; difference -4.7% (S); Negligible (-)
  Dr. Mary McLeod Bethune: Program 41.8%, 27.7%; comparison 43.0%, 35.8%; difference -8.0% (S); Discernable (-)
  Dwight D. Eisenhower: Program 55.1%, 46.7%; comparison 56.8%, 51.3%; difference -4.5% (NS); NR
  [school name illegible]: Program 45.3%, 47.9%; comparison 46.4%, 44.0%; difference +3.8% (NS); NR
  Grassy Waters: Program 56.0%, 52.2%; comparison 59.3%, 51.1%; difference +1.1% (NS); NR
  Hidden Oaks: Program 59.1%, 50.6%; comparison 62.4%, 57.3%; difference -6.6% (NS); NR
  Lincoln: Program 46.0%, 32.1%; comparison 47.7%, 38.8%; difference -6.6% (NS); NR
  Starlight Cove: Program 53.5%, 46.5%; comparison 55.8%, 51.4%; difference -4.9% (NS); NR
S = Statistical Significance; NS = No Statistical Significance; NR = Not Reported

Table C5b: Comparison of the Percent of Proficient Students in FY2006 in the Middle School Program Groups and their Comparison Groups
  All Program middle schools: Program 54.6%, 57.6%; comparison 56.0%, 58.4%; difference -0.7% (NS); NR
  Carver: Program 39.2%, 41.1%; comparison 40.3%, 44.0%; difference -2.8% (NS); NR
  Don Estridge: Program 74.3%, 80.3%; comparison 73.6%, 75.3%; difference +5.0% (S); Discernable
  Jeaga: Program 45.9%, 46.1%; comparison 45.4%, 48.6%; difference -2.4% (NS); NR
S = Statistical Significance; NS = No Statistical Significance; NR = Not Reported

Table C5a indicates that, in FY2006,
- The percents of proficient students in the Program elementary group as a whole and at Dr. Mary McLeod Bethune Elementary were statistically lower than those of their respective comparison groups;
  o The educational effect size of the Program group as a whole was negligible and was discernable at Dr. Mary McLeod Bethune.
- The percents of proficient students in the Program groups at the remaining six schools were not statistically different from those of their respective comparison groups.

Table C5b indicates that, in FY2006,
- The percents of proficient students in the Program secondary group as a whole and at Carver and Jeaga were not statistically different from those of their respective comparison groups;

- The percent of proficient students in the Program group at Don Estridge was statistically greater than that of its comparison group;
  o The educational effect size of the Program group was discernable.

III. Conclusion

The analyses of various aspects of the CHAMPs Program at the eight District elementary schools provided no evidence that the Program has made a significant impact. The analyses at the three District middle schools indicated that, across all schools as a group, there is no evidence that the Program has made a significant difference. It should be noted that the Program group at Don Estridge High Tech Middle School consistently outperformed its comparison group in a majority of the areas analyzed. Nevertheless, it cannot be determined that the positive performance at this school is due to the implementation of the Program, as there is no pattern of positive performance elsewhere.

CHAMPs Program FY2006 Analysis of Return on Investment

I. Purpose of the Report

This report provides an analysis of the extent of return on the District's investment in the CHAMPs Program (the Program) at eight elementary schools and three middle schools during FY2006.

II. Return on Investment

Return on Investment (ROI) was determined by combining the results of the Program's implementation and outcome analyses with the FY2006 average cost of the Program per participating student and the total cost of the Program. An ROI analysis is conducted only for those programs that have a high level of implementation (Level 3 or 4).

Method

The results of the implementation analysis were used to determine the level of Program implementation. Rubric 1 reports how the implementation analysis results for all Program schools as a group were related to the level of implementation.

Rubric 1: Relationship of Implementation Analysis Results to Level of Implementation
  Beginning (Level 1): Extreme problems in implementation due to great difficulty in the implementation process or misunderstanding of implementation requirements.
  Minimal (Level 2): Significant problems in implementation due to difficulty in the implementation process or some misunderstanding of implementation requirements.
  Intermediate (Level 3): Some problems in implementation due to difficulty in the implementation process or misunderstanding of implementation requirements.
  Full (Level 4): Little or no problems in implementation.

The results of the outcome analysis were used to determine the level of impact on student achievement. Rubric 2 reports how the outcome analysis results for all Program schools as a group were related to the level of student performance attributed to the Program.

Rubric 2: Relationship of Outcome Analysis Results to Level of Impact on Student Performance
  No or negligible educational value (Level 1): No or negligible difference in student performance.
  Discernable educational value (Level 2): A discernibly higher level of performance among Program students when contrasted with students in the comparison group.
  Substantial educational value (Level 3): A substantially higher level of performance among Program students when contrasted with students in the comparison group.
  Extensive educational value or higher (Level 4): An extensively higher level of performance among Program students when contrasted with students in the comparison group.

The average annual cost needed for each student to participate in the Program and the total annual cost of the Program were used to determine the cost of the implementation. Rubric 3 reports how the two costs were related to the level of cost of implementation. Note that both the per-student and the total cost conditions must be met (1).

Rubric 3: Relationship of Average Annual Cost Per Student to Participate in the Program and Total Annual Cost of Program to Level of Cost of Implementation
  Level 1 (Great cost to implement): more than $1,000 per participating student and more than $4,000,000 total
  Level 2 (Significant cost to implement): $1,000 or less per participating student and $4,000,000 or less total
  Level 3 (Moderate cost to implement): $500 or less per participating student; total annual cost threshold illegible in the source
  Level 4 (Little or no cost to implement): $100 or less per participating student and $500,000 or less total

Return on Investment analyses were conducted for this Program at the elementary and middle school levels because the level of implementation was high for both. Figures 1 and 2 report the Program ROI results for elementary schools and middle schools, respectively. The Cost of Program Implementation Level (1-4) and the Level of Impact on Student Performance (1-4) are plotted along the X and Y axes, respectively, within each figure. The Program name in each figure represents the point in one of the quadrants where the Levels of Cost and Impact on Performance intersect. The interplay of the two levels within the quadrants guides interpretation of the ROI and the decision-making process regarding what program-related action to take.

1 For example, a program with a $90 annual cost per student and a total annual cost of $700,000 is at a Level 3 cost of implementation.
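The sketch below (Python) walks through Rubric 3's logic. The total-cost threshold for Level 3 is not legible in the source, so the $1,000,000 figure used here is purely a placeholder assumption; the per-student thresholds and the Level 1, 2, and 4 totals come from the rubric itself.

```python
# A minimal sketch of Rubric 3.  Both the per-student and the total-cost
# conditions must be met to claim a lower-cost level; otherwise the program
# falls back to the next, more costly level.  The total-cost threshold for
# Level 3 is not legible in the source, so $1,000,000 below is purely a
# placeholder assumption.
def cost_level(cost_per_student: float, total_annual_cost: float) -> int:
    """Return the Level of Cost: 1 = great cost ... 4 = little or no cost."""
    if cost_per_student <= 100 and total_annual_cost <= 500_000:
        return 4      # little or no cost to implement
    if cost_per_student <= 500 and total_annual_cost <= 1_000_000:   # placeholder total threshold
        return 3      # moderate cost to implement
    if cost_per_student <= 1_000 and total_annual_cost <= 4_000_000:
        return 2      # significant cost to implement
    return 1          # great cost to implement

# The footnoted example ($90 per student, $700,000 total) lands at Level 3,
# and the CHAMPs Program itself (about $35 per student, $385,000 per year)
# lands at Level 4, matching the cost level reported in the Discussion below.
print(cost_level(90, 700_000))   # 3
print(cost_level(35, 385_000))   # 4
```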

Figure 1: Return on Investment in Program Elementary Schools
(X axis: Cost of Program Implementation Level, from High (1) to Low (4); Y axis: Level of Impact on Student Performance; the point labeled CHAMPs marks where the two levels intersect.)

Figure 2: Return on Investment in Program Middle Schools
(X axis: Cost of Program Implementation Level, from High (1) to Low (4); Y axis: Level of Impact on Student Performance; the point labeled CHAMPs marks where the two levels intersect.)

Figures 1 and 2 indicate that, at both Program elementary and middle schools,
  o The level of the Program's impact on student performance was low (Level 1);
  o The level of the Program's cost was low (Level 4).

III. Discussion

Level of Implementation

The analysis of the Program at the District's elementary and middle schools indicates that implementation has largely been successful. The results of the student surveys pertaining to Program implementation at the middle school level only somewhat confirm these findings: the majority of middle school students in the Program did not recognize two of the ten survey statements (My teacher talks about CHAMPs in my class and My teacher reviews with us the CHAMPs expectations for different activities). Finally, the effective instructional strategies survey suggests that Program teachers employ few effective strategies at the middle school level. Steps should be taken to encourage teachers to employ more such strategies.

Level of Impact on Student Performance

The analyses of various aspects of the CHAMPs Program at the eight District elementary schools provided no evidence that the Program has made a significant impact. The analyses at the three District middle schools indicated that, across all schools as a group, there is no evidence that the Program has made a significant difference. It should be noted that the Program group at Don Estridge High Tech Middle School consistently outperformed its comparison group in a majority of the areas analyzed. Nevertheless, it cannot be determined that the positive performance at this school is due to the implementation of the Program, as there is no pattern of positive performance elsewhere.

Level of Cost

Currently, $385,000 of operating funds is allocated to the CHAMPs Classroom Management initiative. These funds cover the salaries of three resource teachers (CHAMPs Coaches) and an Assistant Director. The majority of the remaining funds are used to pay for materials, including a CHAMPs book for each teacher and school administrator. No capital funds are allocated to the Program. The total cost since the initiative's implementation is $1.5 million. Currently, the cost per student is approximately $35 per year.

CHAMPs Program Recommendation and Evaluation Action Plan for FY2007

The Department of Research and Evaluation and the Division of Learning Support have completed a formative evaluation of the CHAMPs Program in FY2006. The purpose of a formative evaluation is to provide Program administrators with the implementation and student outcome results needed to create a specific and appropriate FY2007 action plan to remediate implementation-related concerns that may be adversely impacting student performance. Based on the results of the evaluation, the Program Evaluation Steering Committee recommends the following actions for FY2007:
1. Continue District support for the Program;
2. Make changes to descriptors that will more accurately reflect levels of implementation at individual schools and for individual staff members;
3. Review and analyze the appropriateness of descriptors for outcome measures;
4. Develop an action plan to maximize implementation;
5. Conduct a second formative evaluation in FY2007.

Table 1a: Tasks Responding to Evaluation Findings
PROGRAM: CHAMPs    PROGRAM CONTACT: ALISON ADLER / DAVE BENSON

Task: Share the tasks listed in the FY2006 Action Plan with Principals of the FY2006 target schools as well as the FY2007 target schools, including tasks responding to evaluation findings and tasks responding to additional concerns. | Responsible: Dave Benson | Start: August 1, 2006 | End: September 29, 2006
Task: Emphasize with teachers and administrators the importance of regularly communicating the CHAMPs expectations with students. | Responsible: Dave Benson / Principals | Start: August 9, 2006 | End: June 8, 2007
Task: In FY2007, continue to work with staff at the FY2006 schools to implement the CHAMPs Classroom Management initiative. | Responsible: Dave Benson / CHAMPs Coaches | Start: August 16, 2006 | End: June 8, 2007
Task: Review and provide feedback for all staff on classroom management plans. Train administrators to utilize the classroom management plan when providing assistance to staff. | Responsible: Dave Benson / CHAMPs Coaches | Start: August 16, 2006 | End: October 26, 2006
Task: Gain vertical support by informing area superintendents of progress. | Responsible: Alison Adler / Dave Benson | Start: August 6, 2006 | End: June 8, 2007
Task: In FY2007, implement the CHAMPs Classroom Management initiative with additional schools, utilizing descriptors that more accurately reflect implementation. | Responsible: Dave Benson | Start: August 9, 2006 | End: June 8, 2007

Table 1b: Tasks Responding to Additional Concerns
Responsible: CHAMPs Coaches / Dave Benson

Table 1c: Tasks Responding to Evaluation Findings for which no Current Program Descriptors Exist
PROGRAM: CHAMPs    PROGRAM CONTACT: ALISON ADLER / DAVE BENSON

Task: Determine descriptors not included in the FY06 Program Evaluation that would more accurately reflect success of school-wide norms, classroom issues, and individual staff members. | Responsible: Dave Benson
Task: Provide Area Superintendents and other district staff a set of questions to ask when visiting CHAMPs schools to help assess progress. | Responsible: Dave Benson / Coaches | Start: August 16, 2006 | End: June 8, 2007

APPROVED - SUPERVISOR OF PROGRAM CONTACT: Alison Adler, Chief, Safety and Learning Environment

FY2006 Evaluation of CHAMPs: Appendices


CHAMPS ELEMENTARY - INSTRUCTIONAL ITEMS (1.1-1.13)
  Lincoln ES: YES on all items except item 1.2 (NO)
  Starlight Cove ES: YES on all items

CHAMPS ELEMENTARY - PRODUCT AND SERVICE ITEMS (2.1-2.5, 3.1-3.6)
  Lincoln ES: YES on all items except one (NO); school summary 92%, MINIMAL
  Starlight Cove ES: YES on all items; school summary 100%, FULL
  Percent YES by item across schools: 100% on all items except one item at 88%; overall implementation level FULL

CHAMPS MIDDLE SCHOOL - INSTRUCTIONAL ITEMS (1.1-1.13)
  Carver MS: YES on all items
  Don Estridge HTMS: YES on all items
  Jeaga MS: YES on all items

CHAMPS MIDDLE SCHOOL - PRODUCT AND SERVICE ITEMS (2.1-2.5, 3.1-3.6)
  Carver MS: YES on all items; 100%, FULL
  Don Estridge HTMS: YES on all items; 100%, FULL
  Jeaga MS: YES on all items; 100%, FULL
  Percent YES by item across schools: 100% on all items; overall implementation level FULL

CHAMPs STUDENT SURVEY RESULTS FOR EACH MIDDLE SCHOOL

CHAMPs Student Survey (Pgm SURVCHP1, run 06/20/2006), School Type: MS
(Response categories in each table below, in order: Never / Rarely / Less than once a week / Once or twice a week / Everyday or almost every day; counts of students in parentheses; N is the total for the row.)

Q02. My teacher displays CHAMPs posters or visuals in the classroom.
  Total: 10% (253) / 5% (136) / 4% (100) / 5% (125) / 74% (1799); N=2413
  L C Swain Middle School: 37% (132) / 8% (30) / 4% (15) / 4% (17) / 45% (161); N=355
  Carver Middle School: 5% (20) / 7% (27) / 4% (17) / 5% (22) / 77% (299); N=385
  Jeaga Middle School: 9% (69) / 7% (51) / 5% (41) / 6% (49) / 70% (504); N=714
  Don Estridge High Tech Middle: 3% (32) / 2% (28) / 2% (27) / 3% (37) / 87% (835); N=959

Q03. My teacher uses a signal to get the attention of all students.
  Total: 16% (389) / 16% (407) / 10% (257) / 16% (...) / ...; N=2413
  L C Swain Middle School: 28% (102) / 23% (84) / 11% (42) / 10% (...) / ...; N=355
  Carver Middle School: 7% (28) / 12% (47) / 11% (45) / 17% (66) / 51% (199); N=385
  Jeaga Middle School: 19% (136) / 12% (90) / 8% (58) / 15% (114) / 44% (316); N=714
  Don Estridge High Tech Middle: 12% (123) / 19% (186) / 11% (112) / 18% (178) / 37% (360); N=959

Q04. The students in my class are completely quiet within 3 seconds when my teacher tries to get our attention.
  Total: 21% (525) / 25% (612) / 15% (381) / 18% (454) / 18% (441); N=2413
  L C Swain Middle School: 30% (110) / 31% (113) / 11% (42) / 10% (38) / 14% (52); N=355
  Carver Middle School: 23% (89) / 23% (90) / 17% (67) / 21% (82) / 14% (57); N=385
  Jeaga Middle School: 28% (203) / 24% (177) / 13% (96) / 14% (103) / 18% (135); N=714
  Don Estridge High Tech Middle: 12% (123) / 24% (232) / 18% (176) / 24% (231) / 20% (197); N=959

Q05. My teacher is very clear on how the students should behave during different activities.
  Total: 4% (119) / 6% (156) / 12% (294) / 19% (461) / 57% (1383); N=2413
  L C Swain Middle School: 8% (29) / 7% (28) / 12% (43) / 18% (67) / 52% (188); N=355
  Carver Middle School: 2% (11) / 7% (28) / 11% (45) / 15% (61) / 62% (240); N=385
  Jeaga Middle School: 6% (46) / 7% (54) / 13% (96) / 17% (127) / 54% (391); N=714
  Don Estridge High Tech Middle: 3% (33) / 4% (46) / 11% (110) / 21% (206) / 58% (564); N=959

Q06. I know the CHAMPs expectations for all activities in my classroom.
  Total: 13% (337) / 12% (296) / 12% (298) / 15% (372) / 46% (1110); N=2413
  L C Swain Middle School: 32% (115) / 20% (71) / 13% (49) / 9% (34) / 24% (86); N=355
  Carver Middle School: 7% (28) / 10% (40) / 10% (41) / 17% (66) / 54% (210); N=385
  Jeaga Middle School: 17% (121) / 14% (103) / 14% (101) / 12% (87) / 42% (302); N=714
  Don Estridge High Tech Middle: 8% (73) / 8% (82) / 11% (107) / 19% (185) / 53% (512); N=959

Q07. My teacher follows a procedure if someone comes to class late or for making up work if someone is absent.
  Total: 8% (213) / 9% (234) / 11% (280) / 18% (455) / 51% (1231); N=2413
  Carver Middle School: 7% (27) / 9% (36) / 13% (51) / 16% (65) / 53% (206); N=385
  Jeaga Middle School: 10% (75) / 7% (56) / 10% (72) / 16% (118) / 55% (393); N=714
  Don Estridge High Tech Middle: 7% (71) / 9% (95) / 12% (119) / 23% (222) / 47% (452); N=959

Q08. My teacher reviews with us the CHAMPs expectations for different activities.
  L C Swain Middle School: 49% (176) / 25% (90) / 11% (40) / 5% (20) / 8% (29); N=355
  Carver Middle School: 13% (53) / 21% (81) / 17% (67) / 19% (76) / 28% (108); N=385
  Jeaga Middle School: 28% (206) / 22% (162) / 15% (113) / 13% (97) / 19% (136); N=714
  Don Estridge High Tech Middle: 24% (238) / 28% (273) / 20% (192) / 14% (137) / 12% (119); N=959

Q09. If a student misbehaves, the teacher does something about it.
  Total: 5% (126) / 7% (172) / 10% (244) / 17% (425) / 59% (1446); N=2413
  L C Swain Middle School: 7% (27) / 8% (29) / 6% (24) / 16% (57) / 61% (218); N=355
  Carver Middle School: 5% (21) / 9% (36) / 10% (39) / 14% (56) / 60% (233); N=385
  Jeaga Middle School: 5% (37) / 7% (54) / 9% (69) / 13% (97) / 64% (457); N=714
  Don Estridge High Tech Middle: 4% (41) / 5% (53) / 11% (112) / 22% (215) / 56% (538); N=959

Q10. My teacher lets students know when they are doing what they are supposed to be doing.
  Total: 7% (172) / 7% (183) / 12% (291) / 20% (478) / 53% (1289); N=2413
  L C Swain Middle School: 11% (41) / 8% (30) / 10% (38) / 16% (59) / 52% (187); N=355
  Carver Middle School: 5% (20) / 7% (30) / 10% (41) / 21% (81) / 55% (213); N=385
  Jeaga Middle School: 9% (66) / 8% (61) / 12% (92) / 15% (110) / 53% (385); N=714
  Don Estridge High Tech Middle: 4% (45) / 6% (62) / 12% (120) / 23% (228) / 52% (504); N=959

EFFECTIVE TEACHING STRATEGIES STUDENT SURVEY RESULTS FOR EACH MIDDLE SCHOOL
(Same response categories as above: Never / Rarely / Less than once a week / Once or twice a week / Everyday or almost every day.)

Q11. How often do you compare and contrast things that you study?
  Total: 17% (414) / 18% (441) / 21% (530) / 26% (644) / 15% (384); N=2413
  L C Swain Middle School: 22% (80) / 23% (83) / 19% (68) / 23% (83) / 11% (41); N=355
  Carver Middle School: 11% (45) / 19% (74) / 21% (82) / 26% (103) / 21% (81); N=385
  Jeaga Middle School: 18% (135) / 15% (114) / 20% (148) / 26% (188) / 18% (129); N=714
  Don Estridge High Tech Middle: 16% (154) / 17% (170) / 24% (232) / 28% (270) / 13% (133); N=959

Q12. How often do you take notes or write a summary of your teacher's instruction?
  Total: 17% (422) / 15% (379) / 14% (354) / 22% (537) / 29% (721); N=2413
  L C Swain Middle School: 19% (67) / 15% (56) / 12% (45) / 21% (77) / 30% (110); N=355
  Carver Middle School: 12% (48) / 16% (62) / 17% (66) / 27% (107) / 26% (102); N=385
  Jeaga Middle School: 17% (125) / 18% (135) / 13% (98) / 19% (140) / 30% (216); N=714
  Don Estridge High Tech Middle: 18% (182) / 13% (126) / 15% (145) / 22% (213) / 30% (293); N=959

Q13. How often do you receive praise or recognition from your teacher?
  Total: 15% (379) / 17% (432) / 19% (466) / 24% (594) / 22% (542); N=2413
  L C Swain Middle School: 20% (74) / 20% (74) / 18% (67) / 20% (73) / 18% (67); N=355
  Carver Middle School: 12% (49) / 17% (68) / 16% (63) / 18% (73) / 34% (132); N=385
  Jeaga Middle School: 19% (136) / 18% (134) / 18% (131) / 23% (169) / 20% (144); N=714
  Don Estridge High Tech Middle: 12% (120) / 16% (156) / 21% (205) / 29% (279) / 20% (199); N=959

Q14. How often do you have homework that is about what you learn in class?
  Total: 11% (273) / 11% (282) / 11% (283) / 22% (547) / 42% (1028); N=2413
  L C Swain Middle School: 10% (39) / 18% (64) / 13% (49) / 21% (77) / 35% (126); N=355
  Carver Middle School: 6% (26) / 13% (52) / 11% (45) / 21% (81) / 47% (181); N=385
  Jeaga Middle School: 10% (74) / 15% (108) / 14% (107) / 23% (168) / 35% (257); N=714
  Don Estridge High Tech Middle: 13% (134) / 6% (58) / 8% (82) / 23% (221) / 48% (464); N=959

Q15. How often do you get feedback on your homework?
  Total: 15% (364) / 14% (338) / 15% (365) / 23% (569) / 32% (777); N=2413
  L C Swain Middle School: 18% (67) / 19% (68) / 16% (60) / 21% (76) / 23% (84); N=355
  Carver Middle School: 8% (34) / 17% (68) / 16% (63) / 23% (92) / 33% (128); N=385
  Jeaga Middle School: 15% (109) / 16% (119) / 16% (115) / 21% (155) / 30% (216); N=714
  Don Estridge High Tech Middle: 16% (154) / 8% (83) / 13% (127) / 25% (246) / 36% (349); N=959

Q16. How often do you work with other students in small groups?
  Total: 14% (357) / 24% (599) / 18% (458) / 23% (559) / 18% (440); N=2413
  L C Swain Middle School: 12% (44) / 29% (104) / 21% (75) / 23% (85) / 13% (47); N=355
  Carver Middle School: 13% (51) / 20% (77) / 18% (71) / 26% (103) / 22% (83); N=385
  Jeaga Middle School: 14% (102) / 25% (182) / 16% (118) / 23% (166) / 20% (146); N=714
  Don Estridge High Tech Middle: 16% (160) / 24% (236) / 20% (194) / 21% (205) / 17% (164); N=959

Q17. ... beginning of the lesson?

Q18. How often are you asked to make predictions during the lesson?
  Total: 21% (508) / 22% (549) / 20% (506) / 19% (466) / 15% (384); N=2413
  Carver Middle School: 15% (60) / 20% (79) / 18% (73) / 21% (84) / 23% (89); N=385
  Jeaga Middle School: 20% (147) / 22% (159) / 18% (130) / 18% (132) / 20% (146); N=714
  Don Estridge High Tech Middle: 23% (224) / 21% (210) / 24% (234) / 19% (183) / 11% (108); N=959

Q19. How often does your teacher ask you to make a diagram, picture, or model of something you are learning?
  Total: 20% (489) / 25% (609) / 21% (518) / 19% (463) / 13% (334); N=2413
  L C Swain Middle School: 19% (70) / 33% (120) / 18% (66) / 15% (54) / 12% (45); N=355
  Carver Middle School: 17% (69) / 23% (90) / 18% (72) / 19% (75) / 20% (79); N=385
  Jeaga Middle School: 20% (146) / 23% (171) / 21% (156) / 19% (141) / 14% (100); N=714
  Don Estridge High Tech Middle: 21% (204) / 23% (228) / 23% (224) / 20% (193) / 11% (110); N=959

Q20. How often does your teacher make a connection about what you are learning now with what you have already learned?
  Total: 10% (250) / 11% (283) / 18% (453) / 25% (607) / 33% (820); N=2413
  L C Swain Middle School: 12% (44) / 16% (58) / 21% (76) / 23% (83) / 26% (94); N=355
  Carver Middle School: 7% (30) / 9% (36) / 16% (65) / 23% (91) / 42% (163); N=385
  Jeaga Middle School: 9% (69) / 13% (95) / 18% (131) / 22% (163) / 35% (256); N=714
  Don Estridge High Tech Middle: 11% (107) / 9% (94) / 18% (181) / 28% (270) / 32% (307); N=959


Explanation of Educational Effect Size

The educational effect size provides a method of understanding the extent to which a difference in scores impacts student achievement. It is used to categorize the size of (1) growth, (2) year-to-year change, or (3) differences between groups. Each effect size category (Negligible, Discernable, Substantial, Extensive, and Extreme, in either a positive (+) or negative (-) direction) is defined in terms of the portion of a year's growth in Reading, the portion of a year's growth in Math, and the years needed to move from a basic to a proficient level, with Average Growth as the reference point.

Note: A "Portion of a Year's Growth" of 1.5 would indicate that these students made, on average, one and a half year's growth in a single year. "Years Needed Basic to Proficient" is the estimated number of years it would take to move students from a basic to a proficient level (roughly equivalent to moving students from the lower end of FCAT Level 2 to the lower end of Level 3). "N/A" indicates that students could not move from a basic to a proficient level by the year of their graduation. Substantial, Extensive, and Extreme correspond to Cohen's categories of small, medium, and large.
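Because the cell values of the effect size table did not survive in this copy, the sketch below (Python) only illustrates the note's logic rather than the report's actual thresholds. It assumes the basic-to-proficient gap can be expressed as a number of years of average growth (a hypothetical input) and that the group's observed pace stays constant, as the note states; the function name and parameters are illustrative, not taken from the report.

```python
# A minimal sketch of the "Years Needed Basic to Proficient" idea from the
# note above.  The exact formula is not given in the appendix; this version
# assumes the basic-to-proficient gap is expressed as years of average
# growth (a hypothetical input) and that the group's observed portion of a
# year's growth stays constant until graduation.
def years_basic_to_proficient(portion_of_years_growth: float,
                              gap_in_average_years: float,
                              years_until_graduation: int) -> str:
    if portion_of_years_growth <= 0:
        return "N/A"                              # no forward growth at all
    years_needed = gap_in_average_years / portion_of_years_growth
    if years_needed > years_until_graduation:
        return "N/A"                              # not attainable by graduation
    return f"{years_needed:.1f} years"

# Hypothetical examples: a gap worth 2 years of average growth.
print(years_basic_to_proficient(0.8, 2.0, 5))   # "2.5 years"
print(years_basic_to_proficient(0.3, 2.0, 5))   # "N/A" (would need ~6.7 years)
```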