edtpa MYTHS and FACTS


Myth: Pearson hires part-time employees who are unqualified and don't score reliably. External scorers don't know our candidates and can't or shouldn't judge their teaching. Pearson scores edtpa.

Fact: Although Pearson manages edtpa scoring activities, all scoring training is designed by the Stanford Center for Assessment, Learning and Equity (SCALE), and all scoring is conducted by educators. edtpa scorers include teacher educators, clinical supervisors of student teachers, K-12 teachers, administrators, and National Board Certified Teachers. All scorers are selected for their verified experience both with beginning teachers and with teaching the subject-matter area in which they will score. The criteria for selecting and training scorers are rigorous and include: expertise in the subject matter or developmental level of the teaching field (degree and/or professional experience); teaching experience in the field (or experience teaching methods courses or supervising student teachers in that field); and experience mentoring or supervising beginning teachers or administering programs that prepare them. Overall, approximately 50 percent of scorers hired are faculty/supervisors and 50 percent are teacher leaders.

All scorers complete an extensive 20-plus hour training curriculum that includes multiple checks to ensure that they score consistently. In addition, all scorers complete an anti-bias module that highlights potential sources of bias that might influence scoring accuracy. These include characteristics of teacher candidates (gender, socioeconomic status, region/location, and language); context or features of instruction; portfolio response characteristics (quality of writing, technical quality of materials, nature of instructional materials); instructional context characteristics (classroom setting or context, curriculum constraints, grade level or teaching assignment); and halo/pitchfork effects (undue influence of performance on initial rubrics on later scoring). In addition, SCALE has trained all subject-specific scoring trainers to review these potential sources of scoring bias when conducting interactive sessions that analyze evidence and score justifications for the practice portfolio. These interactive sessions precede the scoring of qualification portfolios that determine scorers' readiness to score actual portfolios. Finally, while scoring, scorers are back-read by scoring supervisors and score previously scored validity portfolios to ensure they continue to score consistently and without bias.

Myth: Pearson owns and is in control of the assessment design.

Fact: Stanford University is the exclusive author and owner of edtpa. All assessment and assessment materials were created with extensive input from teachers and teacher educators from across the country over a four-year development process. The Stanford Center for Assessment, Learning and Equity (SCALE) is solely responsible for developing all edtpa handbooks, rubrics, scoring training, supervision of scoring trainers, all benchmarking and training materials, and support resources for candidates and programs. Evaluation Systems, a group of Pearson, is the operational partner that provides the technical infrastructure to collect candidate materials, hires educators to score the materials, and delivers score reports to teacher candidates and preparation programs. edtpa's architecture was developed and many handbooks were drafted prior to licensing with Pearson.

Myth: edtpa handbooks and rubrics are poorly constructed and unclear.

Fact: An extensive, multi-year development process involved teachers and teacher educators in the assessment's design, review, piloting, and field tests. edtpa design and review team members included a wide range of university faculty, P-12 teachers, and representatives of national subject-matter organizations, such as the Specialized Professional Associations (SPAs) associated with the Council for Accreditation of Educator Preparation (CAEP). Throughout the pilot, the field test, and the first year of operation, the language and structure of prompts and rubrics have been vetted by subject-matter experts, candidates, and faculty, who have provided input to revise any language perceived as confusing. In addition, working closely with scorers and scoring trainers, SCALE reviews scoring data to identify, revise, and clarify confusing language in handbook directions, prompts, and rubrics. Finally, the questions posed in the Online Community at edtpa.aacte.org by edtpa users from across the country inform revisions to handbooks and rubrics. Based on all sources of feedback, refreshed versions of edtpa handbooks for 2014-15 will be issued this summer, featuring changes that improve clarity.

Myth: Candidates are not allowed to retake the assessment.

Fact: Candidates can retake the entire edtpa or a single edtpa task to meet their institution's or state's requirement and to demonstrate that they can plan, teach, and assess the learning of their students. More information about retake guidelines and what candidates submit can be found in the Resource Library at edtpa.aacte.org and on www.edtpa.com.

Myth: Faculty cannot assist candidates to prepare for edtpa.

Fact: The actual policy is just the opposite; faculty are encouraged and expected to provide formative support to candidates. Of course, program coursework and feedback during fieldwork are the most important supports for developing candidate competencies in planning, instructing, and assessing learning. In addition, faculty working in educator preparation programs are expected to support candidates as they prepare for edtpa. Faculty can provide candidates with support documents (like Making Good Choices), handbooks, samples of previously completed edtpa materials, and lesson planning templates that help them understand the rubrics and other materials. More information can be found here.

Myth: edtpa requires direct instruction.

Fact: We have developed edtpa to allow preparation programs to support candidates using multiple approaches to teaching and learning. The design teams included educators with subject-specific expertise who used their subject-matter content and pedagogical standards to determine the types of teaching and learning edtpa handbooks would emphasize for their field. For all fields, the central focus of student learning must go beyond facts and skills to develop conceptual understandings and engage with content in meaningful ways. edtpa's focus on deep, meaningful subject-matter learning for students, the importance of connecting instruction to students' prior academic learning and lived experiences, and the emphasis on high-leverage pedagogical practices can all be accomplished through a variety of instructional approaches. Lastly, edtpa prompts and rubrics were reviewed by subject-matter experts (faculty and classroom teachers) as part of an extensive content validation process (see the Summary Report), and the teaching practices evaluated were examined as part of a job analysis (appropriateness and frequency of use) by more than 100 educators.

Myth: edtpa ignores/restricts culturally relevant pedagogy.

Fact: As a nationally accessible assessment, edtpa is designed so that teacher candidates from all routes (traditional, alternative, etc.) and from different geographic regions and contexts are able to demonstrate their readiness to teach students in diverse contexts and classrooms. A key part of developing edtpa was building rubrics that would help candidates learn to effectively teach their subject matter to all students, taking into account students' needs and strengths, backgrounds, contexts, and lived experiences. To that end, embedded within and across the rubrics are elements identified as being essential to culturally relevant pedagogical practices.

Higher education faculty and administrators who use edtpa find that it helps translate awareness of culturally relevant pedagogy into classroom practice. See this article for more information. A cornerstone of effective teaching within edtpa is attention to instructional context and to what students bring to their learning. Candidates design learning segments for edtpa that are based on deep knowledge of their students. Candidates describe their instructional context and the specific learning needs of students in the Context artifact and at six different points across the Task commentaries, wherein candidates are prompted to: Consider the variety of learners in your class who may require different strategies/support (e.g., students with IEPs or 504 plans, English language learners, struggling readers, underperforming students or those with gaps in academic knowledge, and/or gifted students). In addition, candidates develop lesson plans and justify in the Planning Commentary their choices or adaptations of learning tasks, instructional activities, and materials based on their students' prior academic learning and their personal, cultural, and community assets. Candidates must explain how and why their lessons link prior learning with new learning and how they will draw upon students' lived experiences to support meaningful learning. Scoring rubrics for Planning (rubric 3 in most fields) and Instruction (rubric 7 in most fields) examine the extent to which candidates have addressed both prior academic learning and students' personal, cultural, and community assets as they plan and enact those plans. The upper levels of these rubrics apply when candidates make these connections explicitly.

Myth: Whether a candidate is prepared for the classroom cannot be determined by assessing a 20-minute recording of classroom practice that can be edited and/or chosen as a representation of their practice as a whole.

Fact: Much like the National Board portfolio and the Measures of Effective Teaching (MET) studies, which include video clips of similar length, candidates are not judged solely on a 20-minute video. The video clips, which illustrate how candidates enact various aspects of effective teaching, are inextricably linked with multiple sources of evidence, including real artifacts of teaching such as lesson plans, student work samples, and instructional materials. Candidates also provide commentaries justifying their lesson plans based on their students' strengths and needs, analyzing student work, explaining feedback to students, and proposing next steps for teaching and learning. Developing a reliable understanding of whether a candidate is prepared to be the teacher of record in their own classroom requires the use of multiple measures of skills, practices, and performance.

There is no single approach that can measure teacher competence, given how complex and varied the knowledge base for measuring effective teaching is. edtpa is a capstone, summative assessment that contributes to a multiple-measures system for licensure. Programs are encouraged to develop assessment systems that include formative, embedded signature assessments as well as summative assessments that align with state, national, and specialized professional association standards for beginning teachers.

Myth: Having our candidate portfolios scored by Pearson and receiving score reports made up of numbers does not support program renewal or reform.

Fact: edtpa portfolios are scored by educators using analytic rubrics that describe candidate performance at each of five levels. Similar to the rubrics used to evaluate and score portfolios for the National Board for Professional Teaching Standards, results from edtpa rubrics provide valuable information to programs and candidates. Score reports, in combination with faculty examination of the candidate work submitted, provide rich, actionable information about each candidate. On a program level, an analysis of score patterns and candidate portfolios can identify particular areas, represented by rubrics, on which a program might want to focus renewal efforts. SCALE encourages faculty to engage in local evaluation of candidate portfolios as part of continuous improvement of programs and curriculum, and has provided resources to support this effort. See these videos for examples.

Myth: edtpa promotes teaching to the test and restricts the teacher education curriculum.

Fact: edtpa represents a broad consensus of the teaching field about what knowledge and skills matter for a beginning teacher's performance and for good teaching in general. edtpa is built on core aspects of teaching (planning for instruction, engaging students in learning, assessing learning, and supporting academic language development) and requires them to be linked together to show the full cycle of teaching. This is why it is a capstone event in student teaching. It is also why the assessment requires real, job-related artifacts of teaching (lesson plans, video, and student work samples) that show the complexity of the local teaching context and the way the candidate responds to real students when trying to teach them in a real setting. In other words, if the assessment measures practices that teacher candidates should be expected to know how to do, then teaching to the assessment is consistent with those expectations.

For a more complete analysis and suggestions for maintaining the integrity of curricular features when implementing edtpa, see the FAQs on Teacher Education Curriculum.

Myth: The $300 fee to take edtpa is unfair and is pure profit for Pearson.

Fact: The $300 fee for edtpa covers all development costs and operational assessment services associated with the resources and support for implementation, delivery, scoring, and reporting of edtpa, as well as customer support services for candidates and faculty. Assessment services include access to and support within the edtpa Online Community network for faculty, and the use of the technology platform that registers the candidate, receives the portfolio, coordinates the logistics of scoring the portfolio, analyzes the results, and reports the results to the candidate. Assessment services also include the recruiting and management of qualified educators who serve as scorers, scoring supervisors, and trainers. Scorers are trained specifically on the edtpa rubrics; they use standardized scoring procedures and are calibrated and monitored during scoring.

The fee does not need to be paid directly by the teacher candidate. Some states or programs pay for or subsidize the cost. Some campuses embed the cost of edtpa in a program fee so that students can use financial aid to pay for it. Pearson has also provided an allotment of financial-assistance fee waivers, for distribution to candidates with financial need, to states that have a formal agreement to participate in edtpa and that use edtpa for consequential purposes. See this document for alternatives.

Fees are not unusual for professional assessments. Aspiring architects, accountants, and dental hygienists can spend nearly $1,000 to become licensed or certified. Nurses are charged $200 for the exam to be certified by the National Council of State Boards of Nursing. The cost for a teacher to become National Board Certified is $2,500.

Myth: edtpa has not been tested for reliability and validity.

Fact: edtpa is the most rigorously and widely field-tested performance assessment of new teachers ever introduced to the field. Field-test data and analyses have been reviewed by independent technical advisory committees in three edtpa-adopting states (NY, WA, and OH) and by a national technical advisory committee composed of nationally and internationally recognized psychometricians, researchers, and assessment scholars. Though it is new, edtpa's design and architecture are much like those of the highly respected assessment of veteran teachers administered by the National Board for Professional Teaching Standards.

edtpa is aligned with the Interstate Teacher Assessment and Support Consortium (InTASC) standards for beginning teachers. Meeting the highest standards for assessment development, edtpa was field-tested during the 2011-12 and 2012-13 academic years. More than 12,000 students submitted edtpa portfolios for scoring. The reliability and validity studies established that the assessment is aligned to professional standards, reflects the actual work of teaching, and that its scores measure primary traits of effective teaching. Scoring was highly reliable, with rates of scorer agreement ranging from .83 to .92. In other words, edtpa is a trustworthy measure of beginning teacher skills.

Educators developed edtpa to focus on the characteristics of teaching that research has found to be most important: how teacher candidates plan and teach lessons in ways that make the content clear and help diverse students learn, assess the effectiveness of their own teaching, and adjust their instruction as necessary. Establishing edtpa's predictive validity will require following candidates into their teaching practice for several years to obtain a stable estimate of student learning. SCALE is committed to conducting predictive validity studies in the future that connect candidate scores to on-the-job performance.

# # #