Classroom Assessment for Student Learning Jan Chappuis et al. Second Edition


Pearson Education Limited
Edinburgh Gate
Harlow
Essex CM20 2JE
England and Associated Companies throughout the world

Visit us on the World Wide Web at: www.pearsoned.co.uk

© Pearson Education Limited 2014

All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, photocopying, recording or otherwise, without either the prior written permission of the publisher or a licence permitting restricted copying in the United Kingdom issued by the Copyright Licensing Agency Ltd, Saffron House, 6-10 Kirby Street, London EC1N 8TS.

All trademarks used herein are the property of their respective owners. The use of any trademark in this text does not vest in the author or publisher any trademark ownership rights in such trademarks, nor does the use of such trademarks imply any affiliation with or endorsement of this book by such owners.

ISBN 10: 1-292-02120-9
ISBN 13: 978-1-292-02120-1

British Library Cataloguing-in-Publication Data
A catalogue record for this book is available from the British Library

Printed in the United States of America

Chapter 5

Learning Targets
At the end of this chapter you will know how to do the following:
- Make a test blueprint for a selected response assessment.
- Choose from among selected response formats.
- Create high-quality items.
- Audit any selected response test for quality.
- Use selected response assessments to plan further instruction.
- Use selected response assessments as feedback to students and for student self-assessment and goal setting.

FIGURE 5.1 Keys to Quality Classroom Assessment

Key 1: Clear Purpose
Who will use the information? How will they use it? What information, in what detail, is required?

Key 2: Clear Targets
Are learning targets clear to teachers? What kinds of achievement are to be assessed? Are these learning targets the focus of instruction?

Key 3: Sound Design
Do assessment methods match learning targets? Does the sample represent learning appropriately? Are items, tasks, and scoring rubrics of high quality? Does the assessment control for bias?

Key 4: Effective Communication
Can assessment results be used to guide instruction? Do formative assessments function as effective feedback? Is achievement tracked by learning target and reported by standard? Do grades communicate achievement accurately?

Key 5: Student Involvement
Do assessment practices meet students' information needs? Are learning targets clear to students? Will the assessment yield information that students can use to self-assess and set goals? Are students tracking and communicating their evolving learning?

WHEN TO USE SELECTED RESPONSE ASSESSMENT

The first condition for using selected response is that it must be capable of reflecting the type of learning target to be assessed. Selected response formats are ideal for assessing knowledge-level learning targets, some patterns of reasoning, and a very small number of skill targets, as described in Chapter 4. Several other key conditions influence choosing the selected response method of assessment. Use it when:
- The content to be assessed is broad, requiring wide-ranging coverage. Since the response time for one item is so short, you can include lots of items per unit of testing time and thus sample student achievement thoroughly.
- You want to diagnose student misconceptions and flaws in reasoning.
- Students can read English well enough to understand what each test item is asking of them.

FAQ 5.1 Misconceptions about Selected Response Assessment

Question: Shouldn't we be using mostly multiple-choice tests because all the high-stakes tests use them?
Answer: No. Although high-stakes tests use this format extensively, the reason for that choice is not that it is a better method. Large-scale tests usually need to be administered and scored in as little time as possible, as inexpensively as possible. These requirements lead to the use of selected response formats such as multiple choice. The obvious problem is that, unless other formats are also part of the assessment, learning targets representing important patterns of reasoning, skills, and products go unmeasured. Giving students practice on answering large-scale test items is one thing, but mirroring characteristics of high-stakes tests that are not instructionally useful, or that do not provide accurate results in the classroom, is not a good idea.

Question: Shouldn't we minimize the use of selected response assessments because they are not authentic?
Answer: First, let's define authentic. The New Oxford American Dictionary offers this as one definition: "made or done in a way that faithfully resembles the original" (p. 107). In the usual application to assessment, authentic refers to the context of the assessment mirroring the use or application of the learning in a situation that would require it in life. (We prefer to call this "life beyond school" rather than "real-world" because school can and should be part of the real world for students.) By that definition, selected response methodology is not inauthentic. Life beyond school often calls for correct answers and solutions chosen from a variety of options. We believe it is more helpful to think of authenticity as a dimension of assessments, not as a label given to some forms rather than others. We can keep it as a consideration when writing assessments of any sort, as long as the application or context doesn't interfere with the accuracy of the item, task, or scoring guide.

DEVELOPING A SELECTED RESPONSE TEST

We will follow the steps in the Assessment Development Cycle described in Chapter 4.

Planning
1. Determine who will use the assessment results and how they will use them.
2. Identify the learning targets to be assessed.
3. Select the appropriate assessment method or methods.
4. Determine sample size.

Development
5. Develop or select items, exercises, tasks, and scoring procedures.
6. Review and critique the overall assessment for quality before use.

Use
7. Conduct and score the assessment.
8. Revise as needed for future use.

PLANNING STEPS

As we saw in Chapter 4, careful attention to each of the four planning steps is essential to ensuring that the resulting assessment will do what you want it to.

Step 1: Determine Users and Uses
We begin planning by answering these questions: How do we want to use the information? Who else will use it? What decisions will they make? Typically, we will use assessment information for one or more of the following purposes:
- To plan instruction, as with a pretest
- To offer feedback to students so they can self-assess and set goals for further learning
- To differentiate instruction according to student needs, as with a mid-unit quiz or an interim assessment
- To measure level of achievement to inform grading decisions, as with a post-test

Each one of these purposes can be accomplished with selected response formats, as long as we keep the intended use in mind while making further planning and design decisions.

Step 2: Identify Learning Targets
At this step we simply list the specific learning targets we have identified for the assessment. If one or more targets on the list are complex or unclear, clarify them or deconstruct them first, following the processes outlined in Chapter 3.

Step 3: Select Assessment Method(s)
Although we have already determined that we will use selected response, we must make sure the list of clarified targets includes only knowledge and reasoning learning targets, and that those targets can be assessed well with selected response methodology. So, review the list of learning targets to verify that they are knowledge and reasoning targets and that selected response items can capture an accurate picture of achievement.

Step 4: Determine Sample Size
This step requires that we assign a relative importance to each learning target. One simple way to do this with selected response questions is to decide how many points the test will be worth and then divide the points according to the relative importance of each learning target. (When using a test that you didn't develop, check carefully that it matches the learning targets you taught and that the amount of emphasis each receives is appropriate.) The number of points we assign to each learning target outlines our sample, which should represent the breadth of the learning targets and their importance relative to each other in the instructional period the test is to cover. At this step, you may want to review the sampling considerations described in Chapter 4.

Remember, when identifying the relative importance of each learning target, we consciously match our emphasis in assessment to our emphasis in the classroom. If, say, we spend 50 percent of the time learning how to read maps, then roughly 50 percent of the assessment should focus on map reading. If only 5 percent of the course deals with reading maps, then in most cases it would misrepresent learning to devote 50 percent of the final assessment to map reading. If, on the other hand, the results are to be reported by individual learning target, or if the test measures only a single target, the sample must be sufficient to defend an inference about mastery of that individual target.

COMBINING PLANNING DECISIONS INTO AN ASSESSMENT BLUEPRINT

For selected response assessments, we offer two useful types of blueprints. One is a list of the learning targets; the other is a table crossing content with the knowledge and pattern(s) of reasoning to be assessed. Each is suited to different types of content, but both are equally effective as test planning instruments.
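To make the arithmetic of Step 4 concrete, here is a minimal sketch in Python of dividing a fixed point total according to the share of instructional time each target received. The target names, weights, and point total are hypothetical illustrations; the text itself prescribes no such calculation.

```python
# A minimal sketch (not from the text): allocate test points in
# proportion to the share of instructional time each target received.
# Targets and weights below are hypothetical examples.

instructional_emphasis = {  # share of class time spent on each target
    "Read maps": 0.50,
    "Identify landforms": 0.30,
    "Explain time zones": 0.20,
}
total_points = 40

allocation = {
    target: round(total_points * weight)
    for target, weight in instructional_emphasis.items()
}

for target, points in allocation.items():
    print(f"{target}: {points} points")
# Read maps: 20 points -> roughly 50% of the test, matching 50% of class time
```

Because of rounding, the allocated points can land slightly over or under the intended total, so a final check against the planned test length is worth adding before writing items.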

Figure 5.2 shows plans for a third-grade mathematics test and a third-grade reading test, each consisting of a list of learning targets and how many points each will be worth.

FIGURE 5.2 Blueprints for Third-Grade Mathematics and Reading Tests

Mathematics Learning Targets (Number of Points)
- Identify place value to thousands: 6
- Read, write, order, and compare numbers through four digits: 10
- Use place value understanding to round whole numbers to the nearest 10 or 100: 4

Reading Learning Targets (Number of Points)
- Determine the lesson of a fable: 1
- Identify key supporting details: 2
- Infer a character's feelings: 2
- Distinguish literal from nonliteral language: 2
- Identify meanings of words in a text: 3

Note that on the reading test, only one to three points are assigned to each learning target. That is because, for any one reading passage, it can be difficult to develop more than one or two items to assess targets such as "Infer a character's feelings." So, especially with the shorter passages at the lower grades, you would want to construct similar items for a variety of reading passages at the same level of difficulty to obtain a sufficient sample size from which to draw conclusions about students' level of mastery.

Figure 5.3 shows a list of learning targets for a fifth-grade social studies unit on westward expansion. The test blueprint consists of a list of the content embedded in the learning targets in the left-hand column, labeled "Content Categories." Each category represents many facts and concepts, some of which will be sufficiently important to test. The blueprint also includes columns labeled for the cognitive action to be carried out: know outright and reason comparatively. These patterns will be emphasized during the unit of study. The numbers in each cell represent its relative importance in the unit as planned. This kind of test plan is especially useful if we want to ensure that the test covers both recall of important information and reasoning processes we have taught. (Remember that there could be other learning targets taught during the unit; this blueprint represents only those covered by the selected response portion of the test.)
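As an illustration of this second blueprint type, the sketch below represents a content-by-reasoning table of the kind Figure 5.3 describes and totals its margins. The content categories and cell values are invented placeholders, since Figure 5.3 itself is not reproduced here.

```python
# A hypothetical sketch of a two-way test blueprint: content categories
# crossed with the cognitive actions named in the text ("know outright"
# and "reason comparatively"). Cell values are invented placeholders,
# not the actual Figure 5.3 numbers.

blueprint = {
    "Trails west":       {"know outright": 4, "reason comparatively": 2},
    "Pioneer life":      {"know outright": 3, "reason comparatively": 3},
    "Native encounters": {"know outright": 2, "reason comparatively": 4},
}

# Row totals show each content category's weight in the test.
for category, cells in blueprint.items():
    print(f"{category}: {sum(cells.values())} points")

# Column totals show the balance between recall and reasoning overall.
column_totals = {}
for cells in blueprint.values():
    for action, points in cells.items():
        column_totals[action] = column_totals.get(action, 0) + points
print(column_totals)  # e.g. {'know outright': 9, 'reason comparatively': 9}
```

Checking both margins this way mirrors the stated purpose of this blueprint type: making sure the test covers both recall of important information and the reasoning processes taught.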