Rubrics & Assessment Data Collection Making Things Good, Better, & Innovative. March 7, 2014 Presented by: Kathleen Voge & Kathi Trawver
2 Session Topics
- Academic Assessment Committee Purpose
- Developing and Evaluating Rubrics
- Samples of Rubrics
- Methods/Trends in Assessment Data Collection
- Innovative Data Collection & Analysis
- Discussion and Sharing
3 AAC Purpose
The Academic Assessment Committee (AAC) was created to provide peer leadership, support, and review of academic assessment. The AAC serves as a cross-campus forum for the exchange of ideas, information, and advice on methods and practices of academic assessment.
Related resources: FSAAC; UAA Assessment
4 A Symbol for Wisdom & Knowledge
- Dimensions/Criteria
- Usage
- Rating Scales
- Descriptors
5 Why Use Rubrics?
They help us:
- Evaluate student performance
- Improve communication of expectations
- Improve transparency of what matters
- Enhance thinking and learning
- Increase grading/assessment objectivity
6 Structure and Components of a Typical Rubric (University of Colorado, 2014)
A rubric is a matrix of criteria and their descriptors. Rubrics generally contain three primary components:
- Dimensions: the left side of the rubric matrix lists the elements that make up the full set of criteria for the scale/performance standards.
- Scale: across the top of the rubric matrix is the rating scale (including qualitative markers and numbers) that provides a set of values for rating the quality of performance on each criterion (levels of performance, milestones).
- Descriptors: the cells under the rating scale provide examples or concrete indicators for each level of performance.
7 Anatomy of a Rubric

Rubric to Assess X Skills (scale across the top; dimensions/criteria down the left; descriptors in the cells):

                          | Exemplary 4 | Satisfactory 3 | Developing 2 | Unsatisfactory 1
Performance Indicator 1   | Descriptor  | Descriptor     | Descriptor   | Descriptor
Performance Indicator 2   | Descriptor  | Descriptor     | Descriptor   | Descriptor
Performance Indicator 3   | Descriptor  | Descriptor     | Descriptor   | Descriptor
Performance Indicator 4   | Descriptor  | Descriptor     | Descriptor   | Descriptor
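The matrix structure above can be sketched as a simple data structure. This is a minimal, hypothetical sketch: the criteria, scale labels, and descriptor text below are illustrative placeholders, not taken from the presentation.

```python
# A rubric as a matrix: dimensions (criteria) down the left, a rating scale
# across the top, and a descriptor in each cell.

SCALE = {4: "Exemplary", 3: "Satisfactory", 2: "Developing", 1: "Unsatisfactory"}

rubric = {
    "Explanation of issues": {
        4: "States the issue clearly and comprehensively",
        3: "States the issue with minor omissions",
        2: "States the issue incompletely",
        1: "Does not state the issue",
    },
    "Use of evidence": {
        4: "Interprets evidence insightfully",
        3: "Interprets evidence accurately",
        2: "Summarizes evidence without interpretation",
        1: "Uses little or no evidence",
    },
}

def describe(criterion, score):
    """Look up the descriptor cell for a given criterion and scale point."""
    return f"{criterion} -- {SCALE[score]} ({score}): {rubric[criterion][score]}"

print(describe("Use of evidence", 3))
# Use of evidence -- Satisfactory (3): Interprets evidence accurately
```

Storing the rubric this way makes each cell addressable by (criterion, scale point), which mirrors how scorers read the matrix.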
8 Rubric Dimensions (Down the Left)
The dimensions list the performance indicators that make up the whole SLO (e.g., critical thinking = explanation of issues, evidence, influence of context and assumptions, student's perspective/hypothesis, and conclusions/outcomes).
Dimensions should be:
- Measurable
- Focused on what is most important to you
- Essential to the learning objective being measured
- Distinct and not overlapping
- Specific and clear
9 Rubric Scale (Across the Top)
Read the aspiring/exemplary description first so you have the ideal in mind as you read the work (left to right).
Rating scales should be:
- Clear to everyone
- Inclusive of both qualitative and quantitative descriptions
- Differentiated between scale points
- Able to be used consistently
There is disagreement in the field about the ideal number of scale points and whether to include a neutral midpoint.
10 Rubric Descriptors (The Cells)
Descriptors should be:
- Positively worded: describe what is present rather than what is absent (strengths-based rather than deficit-based)
- Focused on what students are DOING
- Built from verbs in developmental order (e.g., from identify to evaluate)
Writing descriptors this way also helps you determine whether what you consider exemplary is actually part of the assignment/measure.
11 Reporting Rubric Data
A rubric should measure SLOs consistently across evaluators (reliability) and should measure what it was designed to measure (validity).
Treatment of data from rubrics:
- Rubric scores are ordinal-level data only
- Report percentages
- Use non-parametric tests (e.g., Spearman's rank correlation coefficient for correlations)
- Report median distributions or percentage distributions rather than means
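As a sketch of these reporting conventions, the snippet below summarizes a hypothetical set of ordinal rubric scores with a percentage distribution and a median (not a mean), and computes Spearman's rank correlation using only the standard library. The scores and rater data are invented for illustration.

```python
from collections import Counter
from statistics import median

# Hypothetical rubric ratings (1-4 ordinal scale) for one SLO.
scores = [4, 3, 3, 2, 4, 3, 1, 2, 3, 4]

# Percentage distribution and median, because rubric data are ordinal.
counts = Counter(scores)
pct = {level: 100 * counts[level] / len(scores) for level in sorted(counts)}
print(pct)             # {1: 10.0, 2: 20.0, 3: 40.0, 4: 30.0}
print(median(scores))  # 3

def average_ranks(xs):
    """Assign 1-based ranks, sharing the average rank across tied values."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(xs):
        j = i
        while j + 1 < len(xs) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j + 2) / 2  # average of 1-based positions i+1 .. j+1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(xs, ys):
    """Spearman's rank correlation: Pearson correlation of the ranks."""
    rx, ry = average_ranks(xs), average_ranks(ys)
    n = len(xs)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

rater_a = [4, 3, 3, 2, 4]
rater_b = [4, 3, 2, 2, 3]
print(round(spearman(rater_a, rater_b), 3))  # 0.806
```

Ranking before correlating is what makes the test non-parametric: only the ordering of the scores matters, not the spacing between scale points.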
12 Evaluating Your Rubric
Norming or calibrating your measure helps you answer how well the rubric:
- Reflects what is really important to you in terms of student learning and competencies
- Addresses performance levels/scoring
- Describes performance levels
- Uses specific language
- Provides utility/usefulness in its ratings
- Performs in terms of inter-rater reliability
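The inter-rater reliability check during a norming session can be sketched as a simple agreement calculation: exact agreement and within-one-point (adjacent) agreement between two raters scoring the same set of papers. The ratings below are hypothetical.

```python
# A minimal inter-rater agreement check for rubric norming/calibration.

def agreement_rates(rater_a, rater_b):
    """Return (exact agreement %, within-one-point agreement %) for two raters."""
    n = len(rater_a)
    exact = sum(a == b for a, b in zip(rater_a, rater_b))
    adjacent = sum(abs(a - b) <= 1 for a, b in zip(rater_a, rater_b))
    return 100 * exact / n, 100 * adjacent / n

# Two raters scoring the same ten papers on a 1-4 rubric scale.
rater_a = [4, 3, 2, 4, 3, 1, 2, 3, 4, 2]
rater_b = [4, 3, 3, 4, 2, 1, 2, 3, 3, 2]

exact, within_one = agreement_rates(rater_a, rater_b)
print(f"exact: {exact:.0f}%  within one point: {within_one:.0f}%")
# exact: 70%  within one point: 100%
```

Running this after each norming round shows whether discussion of the descriptors is actually improving scorer consistency.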
13 Rubric to Evaluate Your Rubric Example 1 Source: (Teaching, Learning, and Technology Group, 2002)
14 Criteria and levels (4 Exemplary / 3 Good / 2 Acceptable / 1 Unacceptable)

Criterion: Clarity of criteria
- 4 Exemplary: Each criterion is distinct, clearly delineated, and fully appropriate for the assignment(s)/course
- 3 Good: Criteria being assessed are clear, appropriate, and distinct
- 2 Acceptable: Criteria being assessed can be identified, but are not clearly differentiated or are inappropriate
- 1 Unacceptable: Criteria being assessed are unclear, inappropriate, and/or have significant overlap

Criterion: Distinction between levels
- 4: Each level is distinct and progresses in a clear and logical order
- 3: Distinction between levels is apparent
- 2: Some distinction between levels is evident, but remains unclear
- 1: Little/no distinction can be made between levels

Criterion: Reliability of scoring
- 4: Cross-scoring of assignments using the rubric results in consistent agreement between scorers
- 3: There is general agreement between different scorers when using the rubric
- 2: Cross-scoring occasionally produces inconsistent results
- 1: Cross-scoring often results in significant differences

Criterion: Clarity of expectations/guidance to learners
- 4: Rubric serves as the primary reference point for discussion and guidance for the course/assignment(s) and evaluation of assignment(s)
- 3: Rubric is used to explicitly introduce an assignment and guide learners
- 2: Rubric is shared and provides some idea of the assignment/expectations
- 1: Rubric is not shared with learners

Criterion: Support of metacognition
- 4: Rubric is regularly referenced and used to help learners identify the skills and knowledge they are developing throughout the course/assignment(s)
- 3: Rubric is shared and identified as a tool for helping learners understand what they are learning through the assignment/in the course
- 2: Rubric is shared but no further reference is made to it in the course/assignment(s)
- 1: Learners do not see/know of the rubric

Criterion: Engagement of learner in rubric use
- 4: Faculty and learners are jointly responsible for the design of rubrics, and learners use them in peer and/or self-evaluation
- 3: Learners discuss and offer feedback/input into the design of the rubric, and are responsible for use of rubrics in peer and/or self-evaluation
- 2: Learners are offered the rubric and may choose to use it for self-assessment
- 1: Learners are not engaged in either development or use of the rubrics

Scoring: 0-10 = needs improvement = workable = solid/good = exemplary (TLT Group, 2002)
15 Rubric to Evaluate Your Rubric Example 2 Source: (Educational Testing Service, 2006)
16 Coverage/Organization (1A: Covers the Right Content) (Educational Testing Service, 2006)

Strong = 5
1. The content of the rubric represents the best thinking in the field about what it means to perform well on the skill or product under consideration.
2. The content of the rubric aligns directly with the content standards/learning targets it is intended to assess.
3. The content has the ring of truth: the content is truly what you look for when you evaluate the quality of a performance or product. In fact, the rubric is insightful; it helps you organize your own thinking about what it means to perform well.

Medium = 3
1. Much of the content represents the best thinking in the field, but there are a few places that are questionable.
2. Some features don't align well with the content standards/learning targets it is intended to assess.
3. Much of the content is relevant, but you can easily think of some important things that have been left out or given short shrift, or it contains an irrelevant criterion or descriptor that might lead to an incorrect conclusion about the quality of student performance.

Weak = 1
1. You can't tell what learning target(s) the rubric is intended to assess; or you can guess at the learning targets, but they don't seem important; or the content is far removed from current best thinking in the field about what it means to perform well on the skill or product under consideration.
2. The rubric doesn't seem to align with the content standards/learning targets it is intended to assess.
3. You can think of many important dimensions of a quality performance or product that are not in the rubric, or the content focuses on irrelevant features. You find yourself asking, "Why assess this?" or "Why should this count?" or "Why should students have to do it this way?"
17 Coverage/Organization (1B: Criteria Are Well Organized)

Strong = 5
1. The rubric is divided into easily understandable criteria. The number of criteria reflects the complexity of the learning target. If a holistic rubric is used, a single criterion adequately describes performance.
2. The details that are used to describe a criterion go together; you can see how they are facets of the same criterion.
3. The emphasis on various features of performance is right: things that are more important are stressed more; things that are less important are stressed less.
4. The criteria are independent. Each important feature that contributes to quality work appears in only one place in the rubric.

Medium = 3
1. The number of criteria needs to be adjusted a little: either a single criterion should be made into two criteria, or two criteria should be combined.
2. Some details that are used to describe a criterion are in the wrong criterion, but most are placed correctly.
3. The emphasis on some criteria or descriptors is either too small or too great; others are all right.
4. Although there are instances when the same feature is included in more than one criterion, the criteria structure holds up pretty well.

Weak = 1
1. The rubric is holistic when an analytic one is better suited to the intended use or learning targets; or the rubric is an endless list of everything; there is no organization; the rubric looks like a brainstormed list.
2. The rubric seems mixed up: descriptors that go together don't seem to be placed together, and things that are different are put together.
3. The rubric is out of balance: features of more importance are emphasized the same as features of less importance.
4. Descriptors of quality work are represented redundantly in more than one criterion, to the extent that the criteria are really not covering different things.
18 Coverage/Organization (1C: Number of Levels Fits Targets and Uses)

Strong = 5: The number of levels of quality used in the rating scale makes sense. There are enough levels to be able to show student progress, but not so many levels that it is impossible to distinguish among them.

Medium = 3: Teachers might find it useful to create more levels to make finer distinctions in student progress, or to merge levels to suit the rubric's intended use. The number of levels could be adjusted easily.

Weak = 1: The number of levels is not appropriate for the learning target being assessed or the intended use. There are so many levels that it is impossible to reliably distinguish between them, or too few to make important distinctions. It would take major work to fix the problem.
19 Clarity (2A: Levels Defined Well)

Strong = 5
1. Each score point (level) is defined with indicators and/or descriptors. A plus: there are examples of student work that illustrate each level of each trait.
2. There is enough descriptive detail, in the form of concrete indicators, adjectives, and descriptive phrases, to allow you to match a student performance to the right score.
3. Two independent users, with training and practice, assign the same rating most of the time. A plus: there is information on rater agreement rates showing that raters can agree exactly on a score 65% of the time, and within one point 98% of the time.

Medium = 3
1. Only the top level is defined; the other levels are not defined.
2. There is some attempt to define terms and include descriptors, but some key ideas are fuzzy in meaning.
3. You question whether independent raters, even with practice, could assign the same rating most of the time.

Weak = 1
1. No levels are defined; the rubric is little more than a list of categories to rate followed by a rating scale.
2. Wording of the levels, if present, is vague or confusing. You find yourself saying, "I'm confused," or "I don't have any idea what this means." Or the only way to distinguish levels is with words such as extremely, very, some, little, and none; or completely, substantially, fairly well, little, and not at all.
3. It is unlikely that independent raters could consistently rate work the same, even with practice.
20 Clarity (2A: Levels Defined Well, Continued)

Strong = 5
4. If counting the number or frequency of something is included as an indicator, changes in such counts really are indicators of changes in quality.
5. There is enough descriptive detail, in the form of concrete indicators, adjectives, and descriptive phrases, to allow you to match a student performance to the right score.

Medium = 3
4. There is some descriptive detail in the form of words, adjectives, and descriptive phrases, but counting the frequency of something or vague quantitative words are also present.
5. Wording is mostly descriptive of the work, but there are a few instances of evaluative labels.

Weak = 1
4. Rating is almost totally based on counting the number or frequency of something, even though quality is more important than quantity.
5. Wording is mostly descriptive of the work, but there are a few instances of evaluative labels.
22 Clarity (2B: Levels Are Parallel)

Strong = 5: The levels of the rubric are parallel in content: if an indicator of quality is discussed in one level, it is discussed in all levels. If the levels are not parallel, there is a good explanation why.

Medium = 3: The levels are mostly parallel in content, but there are some places where an indicator present at one level is not present at the other levels.

Weak = 1: Levels are not parallel in content, and there is no explanation why, or the explanation doesn't make sense.
23 Rubric to Evaluate the Quality of Your Rubric Example 3 Source: SBE Design Team, 1997
24 Criteria and levels (Clearly Written / Acceptable, but needs more clarity if used for high-stakes assessment / Needs to Be Reworked)

Criterion: Performance levels addressed
- Clearly Written: Scoring guide is descriptive of each level of performance
- Acceptable: Scoring guide provides for different performance levels
- Needs to Be Reworked: Scoring is open-ended

Criterion: Description of performance levels
- Clearly Written: The descriptions define clear and significant differences between the performance levels
- Acceptable: Differences between the levels rely on looking for a number of examples or responses
- Needs to Be Reworked: There are no specific descriptions of the different performance levels

Criterion: Language specificity
- Clearly Written: The critical attributes between each level of performance are included
- Acceptable: Subjective words (good, excellent, some) are used to discriminate between levels, but are further defined
- Needs to Be Reworked: Vague words are used to discriminate between levels (some, many, few, good, excellent)

Criterion: Usefulness
- Clearly Written: Ratings provide useful instructional information
- Acceptable: Ratings provide instructional information that needs further task analysis
- Needs to Be Reworked: Ratings do not provide useful instructional information
25 Methods & Trends in Assessment Data Collection
- Student self-efficacy surveys
- Indirect versus direct assessment
- Activity-based learning
- Computer-based assessment
- Online learning
As trends in education constantly evolve, so will assessment of learning!
26 Analyzing Collected Data
Data can be compared to:
- Previous assessment results
- Baseline data
- Existing standards
- Specific competencies/criteria
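A comparison against baseline data can be as simple as tracking the percentage of students scoring at or above a benchmark level. The sketch below uses invented cohort data and an illustrative benchmark of 3 on a 1-4 rubric scale.

```python
# Compare current rubric results against a baseline using an
# ordinal-appropriate summary: percent at or above a benchmark level.

def pct_at_or_above(scores, benchmark=3):
    """Percentage of scores at or above the benchmark scale point."""
    return 100 * sum(s >= benchmark for s in scores) / len(scores)

baseline = [2, 3, 2, 3, 1, 2, 3, 4, 2, 3]  # earlier cohort (hypothetical)
current  = [3, 3, 2, 4, 3, 2, 4, 3, 3, 4]  # this cohort (hypothetical)

print(f"baseline: {pct_at_or_above(baseline):.0f}% at/above benchmark")  # 50%
print(f"current:  {pct_at_or_above(current):.0f}% at/above benchmark")   # 80%
```

The same function works for comparison against an existing standard: set `benchmark` to the scale point that the standard defines as acceptable.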
27 Trends in Rubrics & Data Collection
- Students like rubrics
- Rubrics promote transparency and consistency
- Summarized rubric results promote an ongoing dialogue about teaching and learning
28 Examples / Discussion
- Handouts
- Online resources
29 A Symbol for Wisdom & Knowledge
- Dimensions/Criteria
- Usage
- Rating Scales
- Descriptors
30 Online Resources
- Evaluating Rubrics, < CollegeAssessmentCenter/RubricDirectory/evaluatingrubrics.pdf>, accessed on February 3,
- Creating a Rubric, An Online Tutorial for Faculty, accessed on February 3,
- Practical Assessment, Research & Evaluation, accessed on February 3,
- Using Data to Guide Instruction and Improve Student Learning, accessed on February 3,
- VALUE: Valid Assessment of Learning in Undergraduate Education, accessed on March 5, 2014.
31 References
- Educational Testing Service. (2006). Creating and recognizing quality rubrics.
- Teaching, Learning, and Technology Group. (2002). A rubric for rubrics: A tool for assessing the quality and use of rubrics in education.
- Teaching, Learning, and Technology Group. (2006). Creating and recognizing quality rubrics.
- University of Colorado. (2014). Creating a rubric: An online tutorial for faculty.
Writing a Basic Assessment Report What is a Basic Assessment Report? A basic assessment report is useful when assessing selected Common Core SLOs across a set of single courses A basic assessment report
More informationFacing our Fears: Reading and Writing about Characters in Literary Text
Facing our Fears: Reading and Writing about Characters in Literary Text by Barbara Goggans Students in 6th grade have been reading and analyzing characters in short stories such as "The Ravine," by Graham
More informationAssessment for Student Learning: Institutional-level Assessment Board of Trustees Meeting, August 23, 2016
KPI SUMMARY REPORT Assessment for Student Learning: -level Assessment Board of Trustees Meeting, August 23, 2016 BACKGROUND Assessment for Student Learning is a key performance indicator aligned to the
More informationEssentials of Ability Testing. Joni Lakin Assistant Professor Educational Foundations, Leadership, and Technology
Essentials of Ability Testing Joni Lakin Assistant Professor Educational Foundations, Leadership, and Technology Basic Topics Why do we administer ability tests? What do ability tests measure? How are
More informationLITERACY ACROSS THE CURRICULUM POLICY
"Pupils should be taught in all subjects to express themselves correctly and appropriately and to read accurately and with understanding." QCA Use of Language across the Curriculum "Thomas Estley Community
More informationThe Writing Process. The Academic Support Centre // September 2015
The Writing Process The Academic Support Centre // September 2015 + so that someone else can understand it! Why write? Why do academics (scientists) write? The Academic Writing Process Describe your writing
More informationStandards-Based Bulletin Boards. Tuesday, January 17, 2012 Principals Meeting
Standards-Based Bulletin Boards Tuesday, January 17, 2012 Principals Meeting Questions: How do your teachers demonstrate the rigor of the standards-based assignments? How do your teachers demonstrate that
More informationMultiple Measures Assessment Project - FAQs
Multiple Measures Assessment Project - FAQs (This is a working document which will be expanded as additional questions arise.) Common Assessment Initiative How is MMAP research related to the Common Assessment
More informationWhy Pay Attention to Race?
Why Pay Attention to Race? Witnessing Whiteness Chapter 1 Workshop 1.1 1.1-1 Dear Facilitator(s), This workshop series was carefully crafted, reviewed (by a multiracial team), and revised with several
More informationFinal Teach For America Interim Certification Program
Teach For America Interim Certification Program Program Rubric Overview The Teach For America (TFA) Interim Certification Program Rubric was designed to provide formative and summative feedback to TFA
More informationAnnual Report Accredited Member
International Assembly for Collegiate Business Education Annual Report Accredited Member Institution: Academic Business Unit: Palm Beach Atlantic University Rinker School of Business Academic Year: 2013-14
More informationVIEW: An Assessment of Problem Solving Style
1 VIEW: An Assessment of Problem Solving Style Edwin C. Selby, Donald J. Treffinger, Scott G. Isaksen, and Kenneth Lauer This document is a working paper, the purposes of which are to describe the three
More informationCommon Core State Standards for English Language Arts
Reading Standards for Literature 6-12 Grade 9-10 Students: 1. Cite strong and thorough textual evidence to support analysis of what the text says explicitly as well as inferences drawn from the text. 2.
More informationThe Political Engagement Activity Student Guide
The Political Engagement Activity Student Guide Internal Assessment (SL & HL) IB Global Politics UWC Costa Rica CONTENTS INTRODUCTION TO THE POLITICAL ENGAGEMENT ACTIVITY 3 COMPONENT 1: ENGAGEMENT 4 COMPONENT
More informationIntensive Writing Class
Intensive Writing Class Student Profile: This class is for students who are committed to improving their writing. It is for students whose writing has been identified as their weakest skill and whose CASAS
More informationGeorge Mason University Graduate School of Education Program: Special Education
George Mason University Graduate School of Education Program: Special Education 1 EDSE 590: Research Methods in Special Education Instructor: Margo A. Mastropieri, Ph.D. Assistant: Judy Ericksen Section
More informationSectionalism Prior to the Civil War
Sectionalism Prior to the Civil War GRADE 7 This sample task contains a set of primary and authentic sources about how the differences between the North and South deepened the feelings of sectionalism
More informationPresentation Advice for your Professional Review
Presentation Advice for your Professional Review This document contains useful tips for both aspiring engineers and technicians on: managing your professional development from the start planning your Review
More informationOakland Unified School District English/ Language Arts Course Syllabus
Oakland Unified School District English/ Language Arts Course Syllabus For Secondary Schools The attached course syllabus is a developmental and integrated approach to skill acquisition throughout the
More informationSETTING STANDARDS FOR CRITERION- REFERENCED MEASUREMENT
SETTING STANDARDS FOR CRITERION- REFERENCED MEASUREMENT By: Dr. MAHMOUD M. GHANDOUR QATAR UNIVERSITY Improving human resources is the responsibility of the educational system in many societies. The outputs
More informationCEFR Overall Illustrative English Proficiency Scales
CEFR Overall Illustrative English Proficiency s CEFR CEFR OVERALL ORAL PRODUCTION Has a good command of idiomatic expressions and colloquialisms with awareness of connotative levels of meaning. Can convey
More informationTeaching a Laboratory Section
Chapter 3 Teaching a Laboratory Section Page I. Cooperative Problem Solving Labs in Operation 57 II. Grading the Labs 75 III. Overview of Teaching a Lab Session 79 IV. Outline for Teaching a Lab Session
More informationCurriculum and Assessment Policy
*Note: Much of policy heavily based on Assessment Policy of The International School Paris, an IB World School, with permission. Principles of assessment Why do we assess? How do we assess? Students not
More informationNo Parent Left Behind
No Parent Left Behind Navigating the Special Education Universe SUSAN M. BREFACH, Ed.D. Page i Introduction How To Know If This Book Is For You Parents have become so convinced that educators know what
More informationAlpha provides an overall measure of the internal reliability of the test. The Coefficient Alphas for the STEP are:
Every individual is unique. From the way we look to how we behave, speak, and act, we all do it differently. We also have our own unique methods of learning. Once those methods are identified, it can make
More informationAssessment of Student Academic Achievement
Assessment of Student Academic Achievement 13 Chapter Parkland s commitment to the assessment of student academic achievement and its documentation is reflected in the college s mission statement; it also
More informationA process by any other name
January 05, 2016 Roger Tregear A process by any other name thoughts on the conflicted use of process language What s in a name? That which we call a rose By any other name would smell as sweet. William
More informationPROGRAM REVIEW REPORT EXTERNAL REVIEWER
PROGRAM REVIEW REPORT EXTERNAL REVIEWER MASTER OF PUBLIC POLICY AND ADMINISTRATION DEPARTMENT OF PUBLIC POLICY AND ADMINISTRATION CALIFORNIA STATE UNIVERSITY SACRAMENTO NOVEMBER, 2012 Submitted by Michelle
More informationPerson Centered Positive Behavior Support Plan (PC PBS) Report Scoring Criteria & Checklist (Rev ) P. 1 of 8
Scoring Criteria & Checklist (Rev. 3 5 07) P. 1 of 8 Name: Case Name: Case #: Rater: Date: Critical Features Note: The plan needs to meet all of the critical features listed below, and needs to obtain
More informationPIRLS. International Achievement in the Processes of Reading Comprehension Results from PIRLS 2001 in 35 Countries
Ina V.S. Mullis Michael O. Martin Eugenio J. Gonzalez PIRLS International Achievement in the Processes of Reading Comprehension Results from PIRLS 2001 in 35 Countries International Study Center International
More informationABET Criteria for Accrediting Computer Science Programs
ABET Criteria for Accrediting Computer Science Programs Mapped to 2008 NSSE Survey Questions First Edition, June 2008 Introduction and Rationale for Using NSSE in ABET Accreditation One of the most common
More informationMASTER S THESIS GUIDE MASTER S PROGRAMME IN COMMUNICATION SCIENCE
MASTER S THESIS GUIDE MASTER S PROGRAMME IN COMMUNICATION SCIENCE University of Amsterdam Graduate School of Communication Kloveniersburgwal 48 1012 CX Amsterdam The Netherlands E-mail address: scripties-cw-fmg@uva.nl
More informationCopyright Corwin 2015
2 Defining Essential Learnings How do I find clarity in a sea of standards? For students truly to be able to take responsibility for their learning, both teacher and students need to be very clear about
More informationAN ANALYSIS OF GRAMMTICAL ERRORS MADE BY THE SECOND YEAR STUDENTS OF SMAN 5 PADANG IN WRITING PAST EXPERIENCES
AN ANALYSIS OF GRAMMTICAL ERRORS MADE BY THE SECOND YEAR STUDENTS OF SMAN 5 PADANG IN WRITING PAST EXPERIENCES Yelna Oktavia 1, Lely Refnita 1,Ernati 1 1 English Department, the Faculty of Teacher Training
More informationQUESTIONS ABOUT ACCESSING THE HANDOUTS AND THE POWERPOINT
Answers to Questions Posed During Pearson aimsweb Webinar: Special Education Leads: Quality IEPs and Progress Monitoring Using Curriculum-Based Measurement (CBM) Mark R. Shinn, Ph.D. QUESTIONS ABOUT ACCESSING
More informationMini Lesson Ideas for Expository Writing
Mini LessonIdeasforExpositoryWriting Expository WheredoIbegin? (From3 5Writing:FocusingonOrganizationandProgressiontoMoveWriters, ContinuousImprovementConference2016) ManylessonideastakenfromB oxesandbullets,personalandpersuasiveessaysbylucycalkins
More informationre An Interactive web based tool for sorting textbook images prior to adaptation to accessible format: Year 1 Final Report
to Anh Bui, DIAGRAM Center from Steve Landau, Touch Graphics, Inc. re An Interactive web based tool for sorting textbook images prior to adaptation to accessible format: Year 1 Final Report date 8 May
More informationQUESTIONS and Answers from Chad Rice?
QUESTIONS and Answers from Chad Rice? If a teacher, who teaches in a self contained ED class, only has 3 students, must she do SLOs? For these teachers that do not have enough students to capture The 6
More informationIntra-talker Variation: Audience Design Factors Affecting Lexical Selections
Tyler Perrachione LING 451-0 Proseminar in Sound Structure Prof. A. Bradlow 17 March 2006 Intra-talker Variation: Audience Design Factors Affecting Lexical Selections Abstract Although the acoustic and
More informationA Study of Metacognitive Awareness of Non-English Majors in L2 Listening
ISSN 1798-4769 Journal of Language Teaching and Research, Vol. 4, No. 3, pp. 504-510, May 2013 Manufactured in Finland. doi:10.4304/jltr.4.3.504-510 A Study of Metacognitive Awareness of Non-English Majors
More informationNORTH CAROLINA STATE BOARD OF EDUCATION Policy Manual
NORTH CAROLINA STATE BOARD OF EDUCATION Policy Manual Policy Identification Priority: Twenty-first Century Professionals Category: Qualifications and Evaluations Policy ID Number: TCP-C-006 Policy Title:
More informationUnpacking a Standard: Making Dinner with Student Differences in Mind
Unpacking a Standard: Making Dinner with Student Differences in Mind Analyze how particular elements of a story or drama interact (e.g., how setting shapes the characters or plot). Grade 7 Reading Standards
More informationAchievement Level Descriptors for American Literature and Composition
Achievement Level Descriptors for American Literature and Composition Georgia Department of Education September 2015 All Rights Reserved Achievement Levels and Achievement Level Descriptors With the implementation
More informationIndividual Component Checklist L I S T E N I N G. for use with ONE task ENGLISH VERSION
L I S T E N I N G Individual Component Checklist for use with ONE task ENGLISH VERSION INTRODUCTION This checklist has been designed for use as a practical tool for describing ONE TASK in a test of listening.
More informationFurther, Robert W. Lissitz, University of Maryland Huynh Huynh, University of South Carolina ADEQUATE YEARLY PROGRESS
A peer-reviewed electronic journal. Copyright is retained by the first or sole author, who grants right of first publication to Practical Assessment, Research & Evaluation. Permission is granted to distribute
More information