The Dynamic Learning Maps Alternate Assessment System: Planning for District Implementation in 2014-15
Meagan Karvonen
June 9, 2014

Topics in the Session
- Key features of the DLM system
- Lessons learned in 2013-14
- Assessment delivery in 2014-15
- District roles and responsibilities
- Resources and training
- Overview of teacher responsibilities
- Score reporting
- Seeking your feedback

KEY FEATURES OF THE DLM SYSTEM
- Academic content
- Assessment design
- Accessibility by design
- The integrated assessment model
- Scoring

A Portion of the Math Map
- Learning map nodes shown: Claims, Conceptual Areas, Essential Elements (and other nodes)
Major Claims (English Language Arts)
- Students can comprehend text in increasingly complex ways
- Students can produce writing for a range of purposes and audiences
- Students can communicate for a range of purposes and audiences
- Students can investigate topics and present information

English Language Arts Conceptual Areas
- Determining critical elements of text
- Constructing understandings of text
- Integrating ideas and information from text
- Using writing to communicate
- Integrating ideas and information in writing
- Using language to communicate with others
- Clarifying and contributing to discussion
- Using sources and information
- Collaborating and presenting ideas

Example EE for English Language Arts (Constructing understandings of text)
- Common Core State Standard RL.6.2: Determine a theme or central idea of a text and how it is conveyed through particular details; provide a summary of the text distinct from personal opinions or judgments.
- Essential Element EE.RL.6.2: Determine the theme or central idea of a familiar story and identify details that relate to it.
- Identify two related points the author makes in an informational text

Assessment Design
- Testlets in linkage levels; each testlet has 3-5 items
- Engagement activity: ELA, a text or writing task; Math, a context at the beginning, carried throughout
- Item types: teacher-administered; system-administered (MC, sorting, matching)
- Linkage levels: Initial Precursor, Distal Precursor, Proximal Precursor, Target, Successor
- Connect the map to the items developed
Testlet Types
- Computer-administered: the majority of testlets in DLM; allowable adaptations; PNP features are utilized
- Teacher-administered: the teacher uses the computer to receive instructions, assist with test administration, and enter responses; often used at the Initial Precursor level; also used when content cannot be assessed in KITE

Accessibility by Design
- Accessible content: linkage levels; vocabulary; multiple and alternate pathways; items tagged; item-writing guidelines based on Universal Design; prior knowledge
- Personal Learning Profile:
  - Personal Needs and Preferences (PNP): display enhancements; language & braille; audio & environment; system-independent
  - First Contact: communication; academics; sensory characteristics; motor characteristics; computer access; attention
- Technology: special user interface; dynamic routing

Integrated Assessment Model
- Fall-early spring: instructionally embedded testlets; teacher/IEP team choice with some constraints; could be multiple shorter windows
- Late spring: test again on a limited sample of EEs from within the year
- Summative scores based on the combination
- Each testlet measures one Essential Element/level
Blueprints in the Integrated Model
- Consortium approved a subset of Essential Elements in each grade and minimum requirements for breadth of coverage
- Teachers (possibly with state guidance) pick the EEs from those available; it doesn't have to be just the minimum!
- Spring window: the system picks 5 EEs to re-assess

Example: 3rd Grade ELA
- 17 of 35 EEs in 3rd grade are in the blueprint
- Minimum expectation for each student's assessment:
  - Three EEs in C1.1, including at least one RL and one RI
  - Two EEs in C1.2 (L, RL, or RI); EEs must be from different strands (e.g., RL and L, not RL and RL)
  - One EE in C1.3 (RL or RI)
- All students assessed in writing (EE.W.3.2.a and EE.W.3.4); no choice, one testlet covers all writing EEs

Example: 6th Grade Math
- 11 of 12 EEs in 6th grade are in the blueprint
- Minimum expectation for each student's assessment:
  - Two EEs from Claim 1 in different conceptual areas
  - One EE from Claim 2
  - One EE from Claim 3
  - Two EEs from Claim 4

Score Reporting (Draft)
- Mastery and growth
- On-demand reports by Essential Element: reports to help teachers plan instruction
- Year-end reports: 3 levels of information

Prototype EE Report
- Shows every Essential Element assessed that year
- Shows the levels mastered within each Essential Element

Prototype Accountability Report
- Overall performance in the subject
- Labels for mastery and growth
- Compares the student to peer groups
LESSONS LEARNED IN 2013-14

Fall 2013 Pilot
- Grades 3/4, 7/8, HS; 1 EE per subject/grade band; fixed form, 3 levels
- N = 1,409
- Assignment to complexity band; compared difficulty & attempt rates by level/band across models
- Initial modeling
- Feedback: navigation, response modes, engagement, tools meeting student needs

Lessons Learned
- Expressive communication: a more conservative approach to initial assignment (impacted 5% to 9% of students)
- Relationship between a student's complexity band and testlet level was (mostly) as expected

Field Test #1 (Feb 2014)
- Grades 3-12; 2 EEs per subject/grade; 3 testlets, matrix design
- N = 9,615 in 14 states
- Ongoing evaluation of initialization into the map
- Ongoing modeling analysis
- Feedback: evaluation of teacher resources; subjective evaluation of testlet difficulty; opportunity to learn assessed content

Field Test #2 (Mar-Apr 2014)
- Grades 3-12; 2 EEs per subject/grade; 3 testlets, matrix design
- N = 10,428 in 16 states
- Similar foci as FT #1: initialization; modeling; teacher surveys

Field Test Participants

           Field Test 1           Field Test 2
           Students   Teachers    Students   Teachers
  MS       2,682      766         2,666      102
  Total    9,615      3,288       10,428     2,023
Field Test #3 (May-June 2014)
- Grades 3-12; 4 to 5 EEs per grade/subject; variation on matrix design
- By June 2: 88,977 tests completed (~14,800 students)
- Modeling the map structure
- Feedback: accessibility; instructional relevance
- Initial steps evaluating the theory of action

Lessons Learned
- Field test results still under analysis:
  - Initialization
  - How the map is modeled
  - Teacher feedback
  - Which testlets meet criteria for operational assessment

Lessons Specifically for B/VI Students
- Background
- Recommendations: (1) revisions; (2) alternate forms; (3) alternate pathways
- Late May 2014: (1) alternate forms in 3 grades; (2) teacher survey

Other R&D in 2013-14
- Roll-out of accessibility features, e.g., read-aloud synthetic voice and full support for switch users
- Auto-enrollment (2 versions)
- iPad delivery
- A few technology-enhanced items

Anticipated Assessment Changes
- Enhancements to further promote accessibility: interface; alternate forms
- Building out the instructionally embedded system: more content; better integration of instructional resources

2014-15 FROM THE DISTRICT PERSPECTIVE
Topics

What we will tell you:
1. Overview
2. District staff responsibilities
3. Resources & training
4. Steps for teachers

Further guidance from MDE (not in this session):
- Ongoing field test participation & testing sub-windows (if applicable)
- Who uploads data
- Delivery method for required training
- Policy guidance for teachers: selecting content; accessibility decisions

OVERVIEW

Integrated Assessment Model
- Fall-early spring: instructionally embedded testlets; teacher/IEP team choice with some constraints; could be multiple shorter windows
- Late spring: test again on a limited sample of EEs from within the year
- Summative scores based on the combination
- Each testlet measures one Essential Element/level

Special Exception: Integrated Model in 2014-15

  Phase   When               Assessments
  A       Late Oct           Tier I field testing
  B       Nov-Dec            Tier II field testing; operational instructionally embedded
  C       Jan-early Mar      Tier III field testing (including braille); operational instructionally embedded
  D       TBD (MS chooses)   Operational spring window (retest 5 EEs/subject)

Choices in 2014-15
- Teacher choice during the year (Phases A-C, Oct-March)
- Still TBD for MDE: when choices are made; who is involved in choosing
- Spring window: the system selects the EEs to repeat (not teacher choice)

DISTRICT STAFF: ROLES & RESPONSIBILITIES
Roles & Responsibilities

Assessment Coordinator
- Leads implementation for the district (makes sure others do what needs to be done)
- Oversees teacher training
- Local support for teachers during assessment

Data Steward
- User accounts
- Accuracy of enrollment and roster files (including exit records)
- Data verification: two windows

Technical Liaison
- Install the KITE client (after August 1)
- Troubleshoot device problems, firewall issues, etc.
- Local caching server

Training & Resources
- Each role has a user guide and a quick start guide (more detailed guide for the Data Steward)
- Webinars scheduled in early fall, separate for each role; recorded & available on the website
- How-to videos on specific procedures in Educator Portal
- First versions available by August; updates in mid-September; release notes Oct-May

STEPS FOR TEACHERS

High-Level View
1. Professional development for instruction
2. IEP team decisions
3. Managing student information
4. Plan, deliver, adjust instruction
5. Complete required training
6. Administer instructionally embedded assessments & field tests*
7. Administer assessments in the spring window
8. All year: review data and talk to parents

Required Training vs. Professional Development

Training
- Multiple modules
- Available in self-directed and facilitated formats (states decide which to use)
- Covers critical content for managing and delivering the assessment
- Required for all test administrators; no tests delivered without it

PD
- Multiple modules
- Available in self-directed and facilitated formats (states decide which to use)
- Covers a variety of topics to support instruction in academics
- States determine what is required/optional
Required Training: Delivery
- Available mid-August for 2014-15
- Must be completed before the teacher's first assessment window (mid-October)
- Self-directed or facilitated format
- Successful completion of post-test quizzes
- Topics are still under review by states

Other Resources for Test Administrators
- Test Administration Manual
- Accessibility Manual
- Quick Start Guide
- Information about EEs and levels
- Access to released testlets and practice activities
- Supplemental modules & videos (optional)

THANK YOU! Questions?

For more information, please contact dlm@ku.edu or go to: www.dynamiclearningmaps.org
For professional development, contact: dlm@unc.edu

The present publication was developed under grant 84.373X100001 from the U.S. Department of Education, Office of Special Education Programs. The views expressed herein are solely those of the author(s), and no official endorsement by the U.S. Department of Education should be inferred.