TI.4.b Rationale for and Evidence of Changes in TI Implementation

Refinements to the Pirate CODE

We do not deem the changes noted below significant; rather, we view them as refinements to the original TI/Pirate CODE submission. The refinements reflect the evolution of the Pirate CODE innovations and the increased research capacity of the lead faculty involved with those innovations.

1. Refinements in the R&D language used to describe the stages of implementation.

Figure 3 in the original ECU TI Proposal, the Pirate CODE, outlined a five-step research and development (R&D) methodology from problem analysis through adoption. While the language used in this model aligns with the research literature, it did not endure as the Pirate CODE components were developed and implemented. Led by the Dean of the College of Education, faculty adopted an implementation language that differed from the R&D methodology language. This became the common implementation language among the faculty and is now part of the faculty's lexicon as they converse about the Pirate CODE. Additionally, the Pirate CODE elements were originally called projects, components, or even project components. As implementation continued, the faculty settled on referring to the Pirate CODE projects as innovations. The table below pairs the original implementation language used at each step of Pirate CODE innovation implementation and study with the current implementation language.

Original TI Language                            Current TI Implementation Language
Problem Analysis                                Squishy Pilot
Prototype Design                                Formal Pilot
Field Test and Iterative Refinement             Refinement & Expansion
Summative Evaluation                            Scale Up
Component Adoption with Evaluation Monitoring   Impact and Reflective Studies

Last updated: July 8, 2014. TI4b Rationale and Evidence TI Changes
2. Refinements in Proposed Major Research Questions.

Table 1 in the original ECU TI Proposal listed proposed research questions for each Pirate CODE innovation. As the innovations moved through the R&D model and faculty research capacity matured, the research questions also evolved. The table below contains the revised major research questions as of May 2014.

Innovation: Introductory Clinical Observation for Novice Observers/Video Grand Rounds (VGR)
Purpose: Develop and validate a structured observation protocol using video segments prior to field experiences.
Original Research Question: Will the incorporation of classroom video segments for observation, in conjunction with an observational guide, result in higher quality classroom observations and student course satisfaction than the traditional unstructured observation process in the current ELEM 2123 course?
Current Research Questions:
1. Is there a difference between teacher candidates' observation skills and knowledge transfer from video observation to observations in the field when candidates are exposed to the VGR model (incorporation of classroom videos for observation, structured observation protocol, in-class debriefing conversations)?
2. How do opportunities to observe, reflect upon, and discuss videos of classroom interactions affect teacher candidates' observations of and reflections on local classroom interactions?
3. In what ways do observation skills and knowledge transfer from VGR (incorporation of classroom videos for observation, structured observation protocol, in-class debriefing conversations) to non-structured observation events?

Innovation: ISLES Instructional Strategies Modules
Purpose: Design a series of online modules to increase knowledge of select, research-based instructional strategies.
Original Research Question: How effectively do interns apply ISLES instructional strategies in authentic classroom settings?
Current Research Questions:
1. What elements comprise a fidelity of implementation measure?
2. To what extent, if any, can fidelity of implementation measures be validated?
3. What practical challenges are inherent in implementing fidelity measures?
4. What is the fidelity of implementation regarding the ISLES (ISLES 1, 2, 3) module series?
5. Do candidates appropriately utilize ISLES strategies?
6. To what extent, if any, is candidate performance on ISLES 3 predictive of edTPA Task 2 performance?
7. To what extent, if any, is the edTPA predictive of positive student achievement (VAM)?
8. What is the variation of edTPA performance in relation to the fidelity of ISLES implementation?

Innovation: edTPA Preparation Modules Integrating ISD Development Strategies
Purpose: Incorporate edTPA-relevant instructional development strategies leading to effective edTPA module design and implementation.
Original Research Question: Were candidates able to display mastery of and incorporate the series of ISD strategies in effective simulated and actual edTPA tasks?
Current Research Questions: Not applicable.

Innovation: Observation Model Support with Instructional Coaches
Purpose: Provide coordinated support for enhancing clinical internship effectiveness using a coaching model.
Original Research Question: What impact does the instructional coach have on the pre-service teacher's ability to effectively use the TQP instructional strategies?
Current Research Questions:
1. Do ECU graduates who received instructional coaching impact student achievement at a higher rate than ECU graduates who did not receive coaching?
2. Do ECU graduates who received instructional coaching return for a second year of teaching at a higher rate than ECU graduates who did not receive coaching?
3. What impact does instructional coaching have on the use of effective instructional practices in the clinical internship?

Innovation: Model for Coordinating Clinical Support and Professional Development Experience
Purpose: Develop a professional development model to link and clarify the roles of clinical teachers, university supervisors, and instructional coaches during the internship.
Original Research Questions: What impact does the revised professional development model have on the clinical internship model? Does professional development increase the ability to effectively support interns? How effective is the internship coordination of the roles of communication between clinical teachers, university supervisors, instructional coaches, and faculty?
Current Research Questions:
1. What components are necessary for online training modules for CTs and USs to be effective?
2. Given limited financial support, what configuration of support teams for teacher education candidates is most effective and plausible?

Innovation: Co-Teaching Model
Purpose: Experiment with different co-teaching models to optimize teacher candidate learning.
Original Research Question: Of the co-teaching models being implemented, what differences exist in the teaching ability of the participants as compared to traditional placements?
Current Research Questions:
1. Is there a difference between the student achievement of K-12 students placed in co-teaching classrooms and that of K-12 students not placed in co-teaching classrooms?
2. Is there a difference between the readiness to teach of candidates placed in co-teaching classrooms and that of candidates not placed in co-teaching classrooms?
3. Which model of co-teaching (2:1 or 1:1) has the greater impact on K-12 student learning?
4. Are teacher candidates who are prepared using the co-teaching model more collaborative than teacher candidates prepared in the traditional model? Are there differences between the collaboration of 2:1 teacher candidates and 1:1 teacher candidates?
5. How does co-teaching impact the teaching efficacy of candidates?
6. How well does the teacher candidate who co-taught during the internship perform in the first year of teaching solo in the field?

Innovation: edTPA Administration
Purpose: Develop a replicable model for edTPA implementation to improve candidate readiness, consistency of instruction, and interrater reliability.
Original Research Question: Was the edTPA administrative model successful in implementing the edTPA summative portfolio, collecting the necessary data, and scoring candidate performance?
Current Research Questions:
1. How can the fidelity of edTPA implementation be improved?
2. To what extent does performance on the edTPA predict student achievement of beginning teachers?
3. What is the relationship between local evaluations and national scores?
4. Is candidate performance on embedded signature assessments predictive of performance on the edTPA?
5. What is the effect of pathway on edTPA performance?

3. Refinements in Data Collection.

Table 6 in the original ECU TI Proposal, the Pirate CODE, outlined research data collection metrics for each Pirate CODE innovation. As the innovations were implemented, the data collection metrics and processes were refined to support each individual innovation. How the data collection evolved, including how the Pirate CODE impacted the EPP's assessment system, is documented in Section 2.3.