Interim Report. Roisin P. Corcoran, Ph.D. Joseph M. Reilly. Johns Hopkins University

Implementation Quality of the ORIGO Stepping Stones Program
Interim Report
Roisin P. Corcoran, Ph.D.
Joseph M. Reilly
Johns Hopkins University
January 30, 2016

ORIGO Evaluation 2015

Introduction

As described by the developers, the ORIGO Stepping Stones program (Stepping Stones) is an elementary mathematics curriculum for grades K-5. It blends digital and print resources to create a unique learning environment for students. The teacher print resources and digital tools are all available online, and the student materials are available in either print or digital versions. Features include:

- 144 detailed lessons in grades 1-5, 72 whole-class lessons in kindergarten, and a range of small-group activities written to meet the Common Core State Standards (CCSS) and the Texas Essential Knowledge and Skills (TEKS), with student materials in both English and Spanish.
- The flexibility for every teacher to view and access all content for grades K-5.
- The ability to differentiate instruction using extra help, extra practice, and extra challenge materials and procedures.
- Embedded online professional learning for teachers that provides the materials and pedagogical knowledge needed to deliver content aligned with the CCSS and TEKS.
- An appropriate blend of print and digital resources to assist teachers and schools in transitioning to digital core programs.

According to its supporting materials, Stepping Stones was written and developed by a team of mathematics and education experts. Each module includes materials covering essential background information for the module's lessons, providing teachers the mathematical focus, learning targets, and Standards for Mathematical Practice alignment for the module. The research behind the pedagogical approach and the development of strategies within each module is also explained and referenced. In all, there are over 72 online pages explaining how the program reflects current research.

Study Design

The purpose of the present study is to evaluate the impact of Stepping Stones in the Worthington (OH) School District. The mixed-methods evaluation design involves classroom observations, an online teacher survey, and student achievement scores on the Northwest Evaluation Association Measures of Academic Progress (NWEA MAP) and Measures of Academic Progress for Primary Grades (NWEA MAP/MPG). The sample included the 11 elementary schools in the Worthington School District that implemented the program. In the 2012-2013 school year, the program was initiated in grades K-2. In the 2013-2014 school year, implementation was expanded to include grades 3-5. During both the 2013-2014 and 2014-2015 school years, the program was the lead math curriculum used in the district across all elementary grade levels. This report presents findings from the teacher survey and classroom observations. Analysis and discussion of the study's quantitative results involving student achievement scores on the NWEA MAP and NWEA MAP/MPG assessments will be presented in a subsequent report in Spring 2016.

Evaluation Questions

This report seeks to address the following questions:

1. To what degree do teachers regard ORIGO Stepping Stones to be effective with regard to:
   - Teaching mathematics in general and addressing the CCSS in particular?
   - Being easy to implement?
   - Fostering student interest in and learning of mathematics?
2. To what degree are students engaged and interested in ORIGO Stepping Stones during classes?
3. To what degree do teachers implement ORIGO Stepping Stones with high fidelity?

Participants

The study involved schools from the Worthington Public School District in Worthington, Ohio, using the Stepping Stones program. The district contains 11 elementary schools and is located in suburban central Ohio, approximately 10 miles north of Columbus, Ohio's largest city and state capital (Worthington City Government [WCG], 2014). According to the district website, "90% of our students move on to higher education; 80% of our teachers have a master's degree or more" (Worthington Public School District [WPSD], 2014). In total, 215 teachers (including special educators and intervention specialists) and approximately 4,500 students across all 11 elementary schools in the district implemented the Stepping Stones program during the 2014-2015 school year. As part of the outlined evaluation, all teachers implementing the program were invited to participate in an online teacher survey, and classroom observations were conducted in six classrooms across three participating schools. Student achievement data from the NWEA MAP and MAP/MPG assessments will be collected from all participating schools at the close of the 2015-2016 school year.

Measures

Teacher survey. In the present study, teachers using the Stepping Stones program were administered a brief online survey via the Qualtrics online survey platform in Fall 2015. The survey consisted of both Likert-type rating items and open-ended questions that address Evaluation Questions 1-3. Specifically, the survey comprised items involving fidelity of implementation, the nature of implementation experiences, and teacher perceptions of the program. The survey was developed for the purpose of evaluating the Stepping Stones program in the Worthington Public School District and was administered by the JHU research team with the assistance of the Worthington Public School District Office of Academic Achievement.
The Likert-type items were used to ascertain teachers' perspectives on fidelity of implementation,

fidelity of intervention, and general program efficacy. The response format varies, but typically participants rated their level of agreement on a 5-point Likert-type scale from 1 (strongly disagree) to 5 (strongly agree). In October 2015, the research team, with the assistance of the WPS Office of Academic Achievement, invited all WPS elementary educators assigned Stepping Stones subscriptions and participating in program implementation to complete the online survey (a total of 215 individuals, including special educators and intervention specialists). In total, 73 teachers representing kindergarten through fifth grade across all 11 elementary schools completed the survey during the administration window. This convenience sample consisted of 72 classroom teachers and one intervention specialist, with a gender composition of 65 females and eight males.

Classroom Organization Rubric (COR). Classroom observations occurred in Fall 2015 and involved use of the Classroom Organization Rubric (COR). Each observation consisted of researchers using the COR for three consecutive 15-minute intervals, followed by a summary section. Presented in the appendix, the COR is divided into four key areas: instructional orientation, instructional strategies, behavior management, and student engagement. Each of these areas is assessed by a set of three items. Observations for the presented project consisted of two raters independently observing six classrooms from three WPS elementary schools in October 2015, all of which were first-grade classes. First-grade classes were selected specifically for observation based on the findings of research conducted on the Stepping Stones program in the Worthington district during the 2013-2014 school year. This research (also conducted by JHU) indicated that first-grade students receiving the Stepping Stones program scored significantly higher than control students in other districts not receiving the program.
Further, first grade was the only grade to achieve statistical significance of this type in the study (Corcoran, Eisinger,

Reilly, & Ross, 2015). For this reason, first grade was selected for priority observation in the outlined study in order to begin to descriptively characterize the factors unique to this grade that might be contributing to these significant effects. For ease of comparison, all observations were conducted during the students' mathematics classes. The observation measure (COR) was developed for the purpose of this research and was loosely based on the Classroom Assessment Scoring System (CLASS; Pianta, La Paro, & Hamre, 2008).

Results

Teacher Survey Findings

Teacher program use. As presented in Table 1, below, the teacher survey asked teachers to rate various aspects of their program use. Participants reported, on average, that they felt they were able to conduct the lessons effectively (M = 3.84, SD = 0.79), followed the steps on how to use the program (M = 3.81, SD = 0.60), and felt confident that they implemented the program correctly (M = 3.58, SD = 0.74). However, participants also often indicated that they changed parts of the program to better fit their curriculum goals (M = 3.94, SD = 0.81). Additionally, participants often indicated that they implemented the program differently from other teachers because of changes they made for their students (M = 3.69, SD = 0.82). In terms of training and support, on average, participants agreed that they had sufficient training in how to fit the program into their lessons (M = 3.60, SD = 0.82) and sufficient support from their principal to implement the program properly (M = 3.89, SD = 0.68). Lastly, participants slightly disagreed that parents were actively involved in the program (M = 2.56, SD = 0.80), but were neutral as to whether parents were supportive (M = 3.02, SD = 0.71).

Table 1
Teacher Use of the ORIGO Stepping Stones Program (values are percentages of respondents)

Statement | Strongly disagree | Disagree | Neutral | Agree | Strongly agree | M | SD
I follow the steps on how to use this program. | 0.0 | 3.2 | 19.4 | 71.0 | 6.5 | 3.81 | 0.60
I can't find the time to teach every lesson in each module of this program. | 25.8 | 35.5 | 11.3 | 21.0 | 6.5 | 2.47 | 1.26
I have a hard time teaching parts of this program. | 9.7 | 25.8 | 21.0 | 38.7 | 4.8 | 3.03 | 1.12
I feel like I am able to conduct the lessons effectively. | 1.6 | 6.5 | 11.3 | 67.7 | 12.9 | 3.84 | 0.79
I have sufficient training on how to fit this program into my lessons. | 1.6 | 11.3 | 17.7 | 64.5 | 4.8 | 3.60 | 0.82
I find the MathEd professional development videos useful. | 1.6 | 14.5 | 54.8 | 27.4 | 1.6 | 3.13 | 0.74
I am confident that I implement this program correctly. | 0.0 | 8.1 | 32.3 | 53.2 | 6.5 | 3.58 | 0.74
I changed parts of the program to fit them into my curriculum goals. | 0.0 | 9.7 | 6.5 | 64.5 | 19.4 | 3.94 | 0.81
The principal or district made revisions to how we use the program. | 25.8 | 43.5 | 24.2 | 4.8 | 1.6 | 2.13 | 0.91
I implement this program differently from other teachers because of changes that I've made for my students. | 0.0 | 11.3 | 19.4 | 58.1 | 11.3 | 3.69 | 0.82
Through using this program, I increased the frequency of activities involving differentiated instruction. | 1.6 | 30.6 | 25.8 | 33.9 | 8.1 | 3.16 | 1.01
My principal provides the needed support (e.g., materials, training) for the program to be used properly. | 0.0 | 3.2 | 19.4 | 62.9 | 14.5 | 3.89 | 0.68
Parents are actively involved in this program. | 8.1 | 38.7 | 41.9 | 11.3 | 0.0 | 2.56 | 0.80
Parents are supportive of this program. | 1.6 | 19.4 | 54.8 | 24.2 | 0.0 | 3.02 | 0.71

Note. N = 62 for each item.
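As an arithmetic sanity check (ours, not part of the report's analysis), each item's mean and standard deviation can be approximately re-derived from its response-percentage distribution by treating the percentages as weights on the scores 1-5. The sketch below does this for the first survey item; the small gap from the reported SD = 0.60 comes from rounding in the published percentages and the sample (n - 1) correction.

```python
import math

def likert_stats(percentages):
    """Mean and (population) SD of a 5-point Likert item, given the
    percentage of respondents choosing each score 1..5."""
    total = sum(percentages)  # may differ slightly from 100 due to rounding
    mean = sum(score * pct
               for score, pct in enumerate(percentages, start=1)) / total
    var = sum(pct * (score - mean) ** 2
              for score, pct in enumerate(percentages, start=1)) / total
    return mean, math.sqrt(var)

# Distribution for "I follow the steps on how to use this program." (Table 1)
mean, sd = likert_stats([0.0, 3.2, 19.4, 71.0, 6.5])
print(round(mean, 2), round(sd, 2))  # prints: 3.81 0.59
```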

Fidelity of implementation. As shown in Table 2, participants reported that, on average, they used program Steps 1, 2, 3, and 4 (M = 3.34-4.37, SD = 0.68-0.97), with Steps 2 and 3 (starting the lesson and teaching the lesson, respectively) receiving the most extensive use. Of the additional program materials, participants reported using the Practice Pages from the practice book (M = 3.98, SD = 1.51) and the ongoing practice activities (M = 3.92, SD = 1.14) most extensively.

Table 2
Teacher Use of Lesson-Based Program Components (values are percentages of respondents)

Component/Lesson Step | Not at all | Rarely | Occasionally | Frequently | Extensively | M | SD
Preparing the Lesson (Step 1) | 6.2 | 6.2 | 24.6 | 56.9 | 6.2 | 3.51 | 0.94
Starting the Lesson (Step 2) | 3.1 | 4.6 | 21.5 | 60.0 | 10.8 | 3.71 | 0.84
Teaching the Lesson (Step 3) | 1.5 | 0.0 | 1.5 | 53.8 | 43.1 | 4.37 | 0.68
Reflecting on the Work (Step 4) | 6.2 | 10.8 | 32.3 | 44.6 | 6.2 | 3.34 | 0.97
Extra Practice Differentiation Activities | 7.7 | 23.1 | 36.9 | 24.6 | 7.7 | 3.02 | 1.05
Extra Help Differentiation Activities | 10.8 | 29.2 | 33.8 | 20.0 | 6.2 | 2.82 | 1.07
Extra Challenge Differentiation Activities | 13.8 | 18.5 | 36.9 | 21.5 | 9.2 | 2.94 | 1.16
Ongoing Practice Activities | 4.6 | 7.7 | 16.9 | 32.3 | 38.5 | 3.92 | 1.14
Use/Have students complete Practice Pages from practice book | 16.9 | 1.5 | 6.2 | 16.9 | 58.5 | 3.98 | 1.51

Note. N = 65 for each item.

Table 3
Typical Durations of Lesson Script Steps (values are percentages of respondents)

Lesson step | < 10 mins | 10-19 mins | 20-29 mins | 30-45 mins | > 45 mins
On average, Preparing the Lesson (Step 1) lasts | 69.2 | 27.7 | 1.5 | 1.5 | 0.0
On average, Starting the Lesson (Step 2) lasts | 75.4 | 23.1 | 1.5 | 0.0 | 0.0
On average, Teaching the Lesson (Step 3) lasts | 12.3 | 63.1 | 21.5 | 3.1 | 0.0
On average, Reflecting on the Work (Step 4) lasts | 72.3 | 24.6 | 3.1 | 0.0 | 0.0

Note. N = 65 for each item.

In terms of the module-based materials (i.e., those materials that teachers would not use every day but would likely use every few weeks), teachers made by far the most extensive use of the assessment materials (M = 4.25, SD = 0.99), followed distantly by the problem-solving activities (M = 3.15, SD = 1.00) and the investigation activities (M = 2.77, SD = 1.09). As for the remaining module-based materials (cross-curricular activities and materials from higher and lower grades), more than half of respondents indicated using these materials either rarely or not at all.

Table 4
Teacher Use of Module-Based Program Components (values are percentages of respondents)

Component | Not at all | Rarely | Occasionally | Frequently | Extensively | M | SD
Materials from a grade lower than the grade I am teaching | 33.8 | 27.7 | 26.2 | 10.8 | 1.5 | 2.18 | 1.07
Materials from a grade higher than the grade I am teaching | 24.6 | 27.7 | 24.6 | 15.4 | 7.7 | 2.54 | 1.24
Investigation Activities | 10.8 | 32.3 | 33.8 | 15.4 | 7.7 | 2.77 | 1.09
Problem-Solving Activities | 6.2 | 15.4 | 44.6 | 24.6 | 9.2 | 3.15 | 1.00
Cross-Curricular Activities | 27.7 | 43.1 | 23.1 | 6.2 | 0.0 | 2.08 | 0.87
Assessment materials | 3.1 | 4.6 | 6.2 | 36.9 | 49.2 | 4.25 | 0.99

Note. N = 65 for each item.

Teacher program perceptions. Participants reported, on average, that they supported the goals of the program (M = 4.00, SD = 0.60) and that they enjoyed teaching the program (M = 3.61, SD = 0.91). In terms of the program's perceived influence on their instructional abilities in math, more than half of participants reported that they felt the program enhanced their knowledge of math content (M = 3.63, SD = 0.85) and increased their confidence in teaching math (M = 3.37, SD = 1.01). In terms of participants' perceptions of the implementation support they received, over 65% of participants expressed satisfaction with the frequency of implementation support (M = 3.73, SD = 0.85), while over 75% of participants expressed satisfaction with the quality of implementation support (M = 3.85, SD = 0.87). Lastly, although participants on average agreed that the program met the needs of all or most of their students (M = 3.29, SD = 0.98), they were mostly neutral in their attitudes about whether the program helped create a cooperative classroom community (M = 3.18, SD = 0.78).

Table 5
Teacher Perceptions of the ORIGO Stepping Stones Program (values are percentages of respondents)

Statement | Strongly disagree | Disagree | Neutral | Agree | Strongly agree | M | SD
I enjoy teaching this program. | 4.8 | 4.8 | 24.2 | 56.5 | 9.7 | 3.61 | 0.91
This program enhances my knowledge of math content. | 1.6 | 9.7 | 22.6 | 56.5 | 9.7 | 3.63 | 0.85
This program increases my confidence in teaching math. | 3.2 | 21.0 | 19.4 | 48.4 | 8.1 | 3.37 | 1.01
This program meets the needs of all or most of my students. | 1.6 | 27.4 | 16.1 | 50.0 | 4.8 | 3.29 | 0.98
I support the goals of this program (i.e., to make math more focused, coherent, and rigorous). | 0.0 | 1.6 | 12.9 | 69.4 | 16.1 | 4.00 | 0.60
This program has helped to create a cooperative classroom community. | 3.2 | 9.7 | 56.5 | 27.4 | 3.2 | 3.18 | 0.78

Note. N = 62 for each item.

Table 6
Teacher Perceptions of ORIGO Stepping Stones Implementation Support (values are percentages of respondents)

Statement | Not at all | Not very | Neither satisfied nor dissatisfied | Somewhat | Very much | M | SD
So far, how satisfied are you with the frequency of implementation support? | 0.0 | 10.2 | 22.0 | 52.5 | 15.3 | 3.73 | 0.85
So far, how satisfied are you with the quality of implementation support? | 1.7 | 6.8 | 15.3 | 57.6 | 18.6 | 3.85 | 0.87

Note. N = 59 for each item.

Participants' perceptions of student engagement. Participants reported, on average, that students' engagement in learning math was generally average or slightly above average relative to what was typical in their respective grades (M = 3.23, SD = 0.78). Participants also

reported, on average, that they felt the program fostered students' thinking/reasoning skills (M = 3.71, SD = 0.78), that students were active learners in the program (M = 3.65, SD = 0.81), and that students enjoyed participating in the program (M = 3.58, SD = 0.74). Though participants also generally reported that students learned how to problem solve more effectively as a result of the program (M = 3.27, SD = 0.85), less consensus existed among participants in this area.

Table 7
Student Engagement in Math This Year

Student engagement | N | %
Significantly below average | 1 | 1.4
Slightly below average | 9 | 12.9
Average | 36 | 51.4
Slightly above average | 21 | 30.0
Significantly above average | 3 | 4.3
Total | 70 | -

Table 8
Teachers' Perceptions of Student Engagement While Using the ORIGO Stepping Stones Program (values are percentages of respondents)

Statement | Strongly disagree | Disagree | Neutral | Agree | Strongly agree | M | SD
Students enjoy participating in this program. | 0.0 | 9.7 | 27.4 | 58.1 | 4.8 | 3.58 | 0.74
Students are active learners in this program. | 1.6 | 8.1 | 22.6 | 59.7 | 8.1 | 3.65 | 0.81
This program fosters thinking/reasoning skills in students. | 1.6 | 8.1 | 14.5 | 69.4 | 6.5 | 3.71 | 0.78
Children have learned how to problem solve more effectively as a result of this program. | 1.6 | 16.1 | 40.3 | 37.1 | 4.8 | 3.27 | 0.85

Note. N = 62 for each item.

Participant perceptions of program strengths and weaknesses. In addition to the Likert-type items, the survey included open-ended questions in which participants

commented on their perceptions of the program's strengths and weaknesses, made suggestions concerning possible program improvements, and ultimately indicated whether they felt the program should continue to be implemented in their school moving forward. Participant responses to these questions are discussed below.

What parts of the program have been most successful for you so far? Of those who responded to this question (N = 59), the following areas were reported most frequently as program successes or strengths:

1. Design, structure, and scope and sequence of the lessons (13 participants)
2. Assessment materials, including formative, summative, and pretest-oriented materials (11 participants)
3. Workbook, Practice Page, and journal materials (10 participants)
4. Online supplemental tools and materials, including Flare, Staticware Images, and Fundamentals (9 participants)
5. The overall variety of strategies and resources (including those related to differentiation) that the program employs (6 participants)
6. The Step In Discussions (4 participants)
7. Manipulatives, mats, and various hands-on materials (3 participants)

Other areas reported by multiple participants as program strengths included program elements directly related to SmartBoard use (interactive lessons and visuals), the ability to review and utilize materials from grade levels above and below the grade they taught, and the opinion that the program worked particularly well for students performing below grade level. Specific participant comments regarding program successes and strengths included:

"The journal pages provide good practice to those who need it."
"It gives us a framework for teaching math."

"The lessons are laid out clearly and are relatively easy to follow."
"Teaching the lesson works well for our students who are below grade level, and the differentiation activities help them to better understand the concept when they are still lost after the mini-lesson portion."
"The student journals, step in discussions, and assessments have been very useful for me."

What parts of the program are most challenging? Of those who responded to this question (N = 59), the following areas were reported most frequently as challenges with using the program:

1. Meeting the needs of all learners while using the program, particularly those who are already high performing (11 participants)
2. Covering all the material in the lessons and modules (7 participants)
3. Finding materials that extend skills and provide students with extra practice (7 participants)
4. Preparation time (6 participants)
5. Differentiation with the program (5 participants)
6. The program does not go deep enough into skills, teaches too many skills each module, or jumps around (4 participants)
7. The number lines (4 participants)

Other areas reported by multiple participants as challenges included not having hard-copy materials provided to them, using the doubles plus strategy with students, creating meaningful student-led learning experiences while using the program, and issues with directions and examples in the assessment activities. Specific participant comments involving challenges included:

"Extending the lesson for higher level math students. They complete the journal practice very quickly and correctly."
"Getting through all of the modules! Having time to reteach when necessary. Too much!"

Should Stepping Stones or another math program be implemented next year? Participants were asked if they thought Stepping Stones or another math program should be implemented again the following school year. Of those who responded (N = 59), 43 participants were in favor of continuing to implement Stepping Stones, with seven of these participants indicating that they would prefer to implement the program alongside other supplemental materials. Of the remaining responses, five participants specifically indicated that they would prefer another program, three participants expressed that they didn't know or couldn't decide, and eight participants provided responses that either praised or criticized elements of Stepping Stones but did not provide a definitive answer (five criticized, three praised). Specific participant comments in response to this question included:

"Stepping Stones. Students are making growth and successfully meeting expectations with this program."
"Stepping Stones should be kept for at least 5 years.... It takes 5 years to implement a successful program."

Suggestions to improve Stepping Stones. Participants were asked what suggestions they had to improve Stepping Stones. In response to this question, 59 participants offered suggestions, with the following being noted most frequently:

1. Improved alignment between program components (8 participants). In this area, participants made specific note of:

   a. Improving alignment between homework, ongoing practice activities, and the lessons (3 participants)
   b. Creating a document specifying the alignment between the games and the lessons (3 participants)
   c. Improving the program's alignment to the CCSS (1 participant)
   d. Improving the alignment between the Step In activities and the Journal activities (1 participant)
2. No suggestion/none (8 participants)
3. Provide more (and deeper) challenge and extension activities (6 participants)
4. Provide more differentiated instruction activities (ideally for every lesson) with greater variety and access (6 participants)
5. Provide more professional development (6 participants). Participants noted that, ideally, training should occur during the school year (as opposed to the summer) and should provide demonstrations of best teaching practices using Stepping Stones.
6. Make revisions to assessment materials (5 participants). Participants in this area suggested providing more extensive posttest materials with more examples and creating more consistency in formatting between the assessment and practice materials (i.e., assessment materials should be in a similar format to the practice problems).
7. Create more program focus on investigation and project-based activities for students that are connected to real-life tasks (5 participants)

In addition to these suggestions, which made up the majority of participant responses, several other suggestions were reported. These included specifying an alternate method of pacing for students who are lower performing, providing more options and resources (i.e., games, books, videos, homework), providing a more in-depth parent component, providing a greater degree of program access to students online, providing extra practice problems in the journal, and

possibly reducing the number of activities for each lesson and module. Specific participant comments involving suggestions included:

Create a space for student-led real world extension projects.

Offer training that involves showcasing best teaching practices and implementation of Stepping Stones instead of just teaching what is available.

I would love some differentiation built into each lesson so I'm not left having to search the grade below or the grade above.

Other participant comments. Lastly, 14 participants offered additional comments concerning their experiences with the program. The majority of these comments reiterated participants' general perceptions of the program, reiterated perceptions of the program's strengths and weaknesses, reiterated suggestions for improving the program, or provided details concerning their implementation of the program. Specific participant closing comments included:

I really appreciate how supported I feel in the resources that I have from ORIGO: all the teacher tools, related games, step-by-step lessons, etc.

I think this program is great for average and below average students. Create more resources and posters for the higher grades.

The number line lessons can be very difficult for young learners. More lessons are needed for this strategy so that students are more comfortable using it.

Classroom Observation Findings

Observations were conducted in Fall 2015 using the Classroom Organization Rubric (COR). Each observation consisted of researchers using the COR for three consecutive 15-minute intervals, followed by a summary section. Presented in the appendix, the COR is divided into four key areas: instructional orientation, instructional strategies, behavior management, and student engagement. Each of these areas is assessed by a set of three items.

As outlined, observations for the presented project consisted of two raters independently observing six classrooms from three WPS elementary schools in October 2015, all of which were first-grade classes. As previously discussed, first grade was targeted for observation because of prior research demonstrating the statistically significant positive results exhibited by this grade in comparison to control students not receiving Stepping Stones (Corcoran et al., 2015). For ease of comparison, all observations were conducted during the students' mathematics classes.

Inter-rater agreement. Table 9 shows the percent agreement between the two independent classroom observers by scale item and time interval on the COR. As shown, the raters had relatively high levels of agreement both across time intervals and for each individual scale item, with an overall 74.07% agreement rate across all scale items and time intervals. Furthermore, of the ratings the observers disagreed upon, only six differed by more than one point on the observation Likert scale. In other words, 89.29% of all observer rating differences were within one Likert point of each other.

Table 9
Inter-Rater Percent Agreement by Time Interval and Scale Item

Item       Interval 1   Interval 2   Interval 3    Sum   % Agree
IO 1            6            4            3         13     72.22
IO 2            5            6            4         15     83.33
IO 3            6            4            2         12     66.67
IS 1            6            5            4         15     83.33
IS 2            3            5            6         14     77.78
IS 3            6            3            2         11     61.11
BM 1            6            5            5         16     88.89
BM 2            4            3            2          9     50.00
BM 3            4            6            2         12     66.67
SE 1            6            6            5         17     94.44
SE 2            4            6            6         16     88.89
SE 3            5            2            3         10     55.56
Sum            61           55           44        160
% Agree     84.72        76.39        61.11                74.07

Note. Across all time intervals and scale items, the raters agreed on 160 of the paired ratings (74.07%).
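The agreement arithmetic behind Table 9 can be sketched as follows. This is an illustrative calculation only, using the per-item agreement counts reported in the table (each count is out of six observed classrooms per interval); the variable names are ours, not part of the COR.

```python
# Per-item agreement counts across the three 15-minute intervals (from Table 9).
agreements = {
    "IO 1": (6, 4, 3), "IO 2": (5, 6, 4), "IO 3": (6, 4, 2),
    "IS 1": (6, 5, 4), "IS 2": (3, 5, 6), "IS 3": (6, 3, 2),
    "BM 1": (6, 5, 5), "BM 2": (4, 3, 2), "BM 3": (4, 6, 2),
    "SE 1": (6, 6, 5), "SE 2": (4, 6, 6), "SE 3": (5, 2, 3),
}
N_CLASSROOMS = 6           # six observed first-grade classrooms
N_INTERVALS = 3            # three 15-minute intervals
N_ITEMS = len(agreements)  # 12 COR items

# Every classroom x item x interval cell is one paired rating.
total_ratings = N_CLASSROOMS * N_ITEMS * N_INTERVALS               # 216
total_agree = sum(sum(counts) for counts in agreements.values())   # 160

overall_pct = 100 * total_agree / total_ratings                    # 74.07%

# Of the disagreements, six differed by more than one Likert point,
# so the remainder were within one point of each other.
disagreements = total_ratings - total_agree                        # 56
within_one = disagreements - 6                                     # 50
within_one_pct = 100 * within_one / disagreements                  # 89.29%

print(f"{overall_pct:.2f}")     # prints 74.07
print(f"{within_one_pct:.2f}")  # prints 89.29
```

Both figures match the percentages reported in the text and Table 9.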

Observation results. In terms of instructional orientation, whole group instruction was the most prevalent orientation teachers used throughout most of the lessons, with teachers generally making frequent or extensive use of it during the first 30 minutes of lessons. Use of this orientation gradually decreased, however, as the lessons progressed, and it was generally observed only occasionally during the final 15 minutes of the observed lessons. In contrast, cooperative learning and individual tutoring, though generally not used as prevalently as whole group instruction, both increased in prevalence during the middle and closing sections of many lessons, with both orientations, on average, peaking in usage during the final 15 minutes of the observations.

Second, in terms of instructional strategies, teachers made the most extensive use of providing students with feedback on learning throughout the observed lessons. On average, this strategy was exhibited frequently to extensively across all three observation time intervals. Though not used quite as extensively as feedback on learning, high-level questioning strategies were also, on average, used frequently by participants throughout the observations. Finally, though teachers were observed making frequent to extensive communication with students concerning the purpose behind the lesson and instructional activities during the first 15 minutes of lessons, teachers used this strategy progressively less as the lessons went on (i.e., they typically communicated the lesson's purpose early in the lesson but would not make extensive reference to it later).

In terms of the behavior management domain, students appeared to consistently understand the procedures, rules, and expectations regarding classroom conduct in the observed classes.
Furthermore, teachers generally made frequent use of preventative behavior management strategies throughout the observed lessons and, on average, occasionally provided students with feedback on behavior.

Lastly, in terms of the student engagement domain, students in the observed lessons exhibited consistent and thorough engagement and on-task behavior. In fact, ratings for student engagement represented the highest (or tied for the highest) observation ratings on the COR across all three observation time intervals. Furthermore, students were consistently observed appearing to enjoy the work they were completing. On average, ratings in this area were frequent to extensive throughout the observations and, though generally consistent across observation time intervals, peaked slightly during the observations' final 15 minutes. Finally, though students were, on average, frequently observed thinking deeply about the material, ratings in this area were lower than those concerning student engagement and student enjoyment and, on average, declined slightly between the lessons' beginning and end.

Table 10
Classroom Observation Ratings by Time Interval

                                      Interval 1      Interval 2      Interval 3
                                      Mean    SD      Mean    SD      Mean    SD
Instructional Orientation
  Whole Group Instruction             3.83   0.39     3.42   0.90     2.08   1.44
  Cooperative Learning                0.67   1.23     1.17   1.40     2.17   1.03
  Individual Tutoring                 0.00   0.00     1.50   1.31     2.58   1.17
Instructional Strategies
  Lesson Purpose Communicated         3.17   0.94     2.92   1.44     1.67   1.30
  Strategic Questioning               2.92   0.67     2.92   0.67     2.67   0.78
  Feedback on Learning                3.33   0.49     3.25   0.45     3.33   0.65
Behavior Management
  Rules/Expectations Communicated     3.33   0.49     3.25   0.45     3.25   0.45
  Preventative Management             3.00   0.60     2.75   0.45     2.58   1.00
  Feedback on Behavior                2.33   0.89     2.00   1.21     2.17   1.34
Student Engagement
  Consistently Engaged                3.83   0.39     3.83   0.39     3.75   0.45
  Students Enjoy the Work             3.33   0.65     3.33   0.78     3.67   0.49
  Students Think Deeply               3.25   0.45     2.67   0.65     2.58   0.52

Conclusions and Discussion

Data gathered from the teacher surveys and classroom observations yielded several key conclusions concerning the Stepping Stones program as it pertains to the outlined research project. Ultimately, when considered in combination, findings from these data demonstrated overall support for the program as it pertained to user experiences and perceptions. To this point, the vast majority of teachers indicated that they supported the goals of the program (85.5%), and about two-thirds of teachers indicated that they enjoyed using the program and found that it had enhanced their content knowledge in math. Furthermore, the majority of teachers indicated that they felt students were engaged with and enjoyed using the program. This finding was further highlighted during the classroom observations, where student engagement scores represented the highest scores recorded by researchers across the 12-item observation tool.

With these positive findings in mind, it is important to consider some additional findings concerning the program's implementation and perceived effectiveness. First, although the vast majority of teachers expressed that they felt confident in their abilities to conduct the lessons effectively (80.6%) and follow the steps on how to use the program (77.5%), 83.9% of teachers indicated that they changed parts of the program to better match their curriculum goals, and 69.4% indicated that they implemented the program differently from other teachers because of changes they made for their students. Additionally, these changes appeared to be teacher driven and not necessarily the result of a lack of training or principal support. To be specific, teachers expressed overall satisfaction with the frequency and quality of the professional development and training they had received, and very few teachers (only 6.4%) indicated that their principal or the district had made revisions to how they used the program.
Furthermore, nearly 80% of teachers agreed that their principal provided the support necessary for them to use the program properly.

Though a detailed investigation of the nature and impact of teacher alterations such as these is beyond the scope of the presented research, it is worth considering how other teacher survey responses may orient these results. One such result worth discussing in light of these findings is the perception, expressed by numerous participants, that the program may benefit from supplementation in order to better address the needs of diverse learners, particularly students who are already high performing. Specifically, difficulty using the program to meet the needs of all learners represented the challenge expressed most frequently by teachers in the survey. Similarly, only slightly more than half of teachers (54.8%) indicated that they felt the program met the needs of all or almost all of their students.

With these findings in mind, it is important to consider the ways in which teacher-driven program adaptations and modifications of this type may affect the program's functioning. Though contextualized modifications designed by teachers to target specific student or curricular needs may be efficacious in certain instances, fidelity of program implementation, particularly as it pertains to core program components, is often treated as a crucial requisite for a core curriculum program to function as intended. Thus, it is important to consider ways in which program fidelity, particularly in utilizing the supplemental components embedded within the program to reach learners with diverse abilities (including differentiation activities and activities from higher and lower grade levels), may be better cultivated among teachers. By building capacity in this area in schools and teachers implementing Stepping Stones, potential drawbacks associated with premature program modification may be averted and more consistent implementation may occur.
Ultimately, despite these variations in implementation, it is worth reiterating the overall positive results of the outlined study. Over 75% of teachers indicated that they felt the program fostered students' thinking and reasoning skills, and 72.8% recommended that the

program continue being used in their schools. Moving forward, exploration of the outlined study's quantitative results will provide further insight into the program's efficacy as it relates to improving student achievement. This research, combined with further research investigating program implementation outcomes and the program's impact on students across multiple years, will result in an even greater understanding of this promising program.

References

Corcoran, R. P., Eisinger, J. M., Reilly, J. M., & Ross, S. M. (2015). Preparing students for the Common Core State Standards in mathematics: An evaluation of the ORIGO Stepping Stones mathematics program. Presentation at the annual meeting of the American Educational Research Association, Chicago, IL.

Council of Chief State School Officers (CCSSO) & National Governors Association Center for Best Practices (NGA Center). (2014). Common Core State Standards Initiative: Resources. Washington, DC: CCSSO & NGA Center. Retrieved from http://www.corestandards.org/about-the-standards/

Northwest Evaluation Association. (2013). Measure of Academic Progress (MAP). Portland, OR: NWEA.

Pianta, R. C., La Paro, K., & Hamre, B. (2008). Classroom assessment scoring system: K-3. Baltimore, MD: Brookes Publishing.

Worthington City Government (WCG). (2014). Worthington, Ohio: About section. Worthington, OH: WCG. Retrieved from http://www.worthington.org/index.aspx?nid=81

Worthington Public School District (WPSD). (2014). Worthington Public Schools: Academic achievement section. Worthington, OH: WPSD. Retrieved from http://www.worthington.k12.oh.us/site/default.aspx?pageid=1