Quantitative Reasoning Assessment


QR Assessment Final 2014/2015

Quantitative Reasoning (QR) is one of Pitzer College's educational objectives and one of WASC's core competencies. As such, it is important to take a close look at how QR is represented at Pitzer and to what degree our students are meeting the QR educational objective. In order to gain a strong understanding of Quantitative Reasoning at Pitzer College, it is important to examine QR from the following perspectives:

1) In what ways students are meeting the QR requirement (Course analysis)
2) The common themes/criteria evident in the top QR courses (Syllabus analysis)
3) How well students are achieving the QR educational objective (Direct assessment)

The foundational data for this assessment is the graduating student body of 2013-2014, a population of 273 students. All graduation check forms and transcripts for that population were used to create the database for this analysis.

College Analysis

According to the data collected, the vast majority of students (79.9%) fulfill the QR requirement through a course here at Pitzer College. However, a number of students (6.2%) fulfilled this requirement through transfer credit. Although this is not a large number of students, it is important to note that it is challenging to use student work from outside of Pitzer for assessment purposes, and the further away from Pitzer, the more challenging (Table 1).

Table 1: School Where QR Courses Are Taken

School          %
Pitzer          79.9
CMC             2.9
Pomona          5.9
Scripps         2.9
Harvey Mudd     0.4
Keck Science    1.5
Transfer        6.2

Course Analysis

Initially, we assessed all Degree Verification forms at face value: whatever was identified in that record as meeting the QR requirement (indicated as "FOR" in the Degree Verification Form) was used to determine the department and course dispersion. On this first analysis, most students (59.3%) met the QR requirement by taking a course within the math department, with economics the next closest department (15.8% of graduating students) (Table 2).

Office of Academic Assessment Omar Safie Pg. 1

However, upon closer examination, the majority of those identified as meeting the QR requirement through a course in the economics department (a total of 43 students) did so through ECON 52 (Microeconomics). Because that course does not meet the QR requirement as defined by Pitzer, we looked closely at each student's Degree Verification Form and transcript to determine if another course met the requirement. When recalculated, the percentage of students meeting the QR requirement through a course in the math department increased by over four percentage points, and economics department courses fell by slightly over five percentage points; economics fell to third highest with 10.6%. However, it remains true that the majority of students meet the QR requirement through a course in the math department (Table 2).

Table 2: QR Course Breakdown by Department

Post Recalculation          Prior to Recalculation
Dept.          %            Dept.          %
Math           63.7         Math           59.3
Economics      10.6         Economics      15.8
Psychology     11.7         Psychology     11.4
Sociology      4.8          Sociology      4.8
Biology        1.5          Biology        1.5
Other          4.8          Other          4.8

Using the recalculated numbers, which substitute an appropriate QR course for ECON 52 (if there is an alternative course listed in the student's transcript), we were able to identify the top 10 courses, which make up slightly more than 68% of all courses identified in graduating students' Degree Verification Forms as fulfilling the QR requirement (Table 3).

Table 3: Top 10 Courses Taken to Meet the Quantitative Reasoning Requirement

Course No.   Course Name                        % of Students   Number of Students
Math 30      Calculus I                         13.6            37
Psyc 91      Psychological Stats.               11.0            30
Math 52      Intro to Stats.                    9.9             27
Math 001     Math Philosophy & the Real World   8.1             22
Math 010G    Math in Many Cultures              7.7             21
Econ 52      Microeconomics                     6.6             18
Math 31      Calculus II                        3.3             9
Soc 101      Quantitative Research Methods      3.3             9
Math 005     Rubik's Cube and Math Puzzles      2.6             7
Math 006     Two Player Games                   2.2             6
Total                                           68.3            186

Yellow highlighted courses indicate courses used in the direct assessment discussed later.
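The recalculation described above amounts to a small data-cleaning step: substitute a qualifying course for ECON 52 where one exists on the transcript, then recompute department shares. The sketch below is a hypothetical illustration of that logic, not the office's actual tooling; the record field names (qr_course, alt_qr_course) are assumptions.

```python
# Hypothetical sketch of the ECON 52 recalculation: where ECON 52 was recorded
# as the QR course but an alternative QR course appears on the transcript,
# substitute it, then recompute department percentages.
# Field names (qr_course, alt_qr_course) are illustrative assumptions.
from collections import Counter

def department_shares(records):
    """records: one dict per graduating student.
    Returns {department: percent of students}, rounded to one decimal."""
    courses = []
    for r in records:
        course = r["qr_course"]
        if course == "ECON 52" and r.get("alt_qr_course"):
            course = r["alt_qr_course"]  # substitute a course that meets QR
        courses.append(course.split()[0])  # department prefix, e.g. "Math"
    counts = Counter(courses)
    total = len(courses)
    return {dept: round(100 * n / total, 1) for dept, n in counts.items()}
```

Students with no alternative course on file remain counted under the recorded course, which matches the report's note that substitution happened only "if there is an alternative course listed in the student's transcript."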

Direct Assessment

QR Methodology

For the direct assessment of student learning, we utilized the Association of American Colleges and Universities (AAC&U) VALUE Rubric for Quantitative Literacy (see Appendix A). It was applied ex post facto, as there were no common QR criteria, learning outcomes, or assessment rubrics at the time of this assessment. Quantitative Literacy is defined in the AAC&U Quantitative Literacy VALUE rubric as:

Quantitative Literacy (QL), also known as Numeracy or Quantitative Reasoning, is a habit of mind, competency, and comfort in working with numerical data. Individuals with strong QL skills possess the ability to reason and solve quantitative problems from a wide array of authentic contexts and everyday life situations. They understand and can create sophisticated arguments supported by quantitative evidence and they can clearly communicate those arguments in a variety of formats (using words, tables, graphs, mathematical equations, etc., as appropriate).

This general definition was created over many years with feedback and input from expert faculty and practitioners in higher education at various institutions across the country (see https://www.aacu.org/value/rubrics for more information). However, it is important to keep in mind that this rubric was applied ex post facto without any modification, which means it is not specific to Pitzer College QR. The drawback to this approach is that the rubric may not be directly aligned with Pitzer's definition of QR and its related outcomes. Given that there is no clear definition of QR at Pitzer, or of its associated outcomes, this was the only direct assessment approach that could be utilized.

In order to conduct the direct assessment of QR at Pitzer, we first needed to identify the courses that could be used in this assessment. From the course analysis, we identified the top 10 courses in terms of enrollment and identification by graduating 2013-2014 students as meeting the QR requirement. All faculty who taught these courses were contacted by the Office of Academic Assessment to request their participation in the institutional-level assessment of QR. Each faculty member who taught one of the top ten courses was asked to provide samples of student work representative of the work students completed relative to the expected outcomes. This could be a final exam, essay, project, or any combination of work that could be considered representative. In the end, we were able to capture samples of student work from six of the top ten courses (see Table 3), representing 142 students. From this pool, the Office of Academic Assessment used a stratified random selection process to randomly select 50% of the samples of student work from each course. Once all samples were selected, we were left with a sample of 71 artifacts to be used in the direct assessment of QR at Pitzer.

The methodology for direct assessment was straightforward. The Office of Academic Assessment reviewed each student artifact (or group of artifacts, if more than one item was provided for a student in a specific course) and then scored it using the QL VALUE rubric (see Appendix A). The scores were then tabulated and are presented here.
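The stratified random selection described above can be sketched as follows. This is an illustrative sketch under stated assumptions (the function name, its `fraction` parameter, and the data shapes are not from the report); it shows why stratifying by course guarantees every course is represented in the final sample.

```python
# Hypothetical sketch of the stratified random selection described above:
# draw the same fraction of student-work artifacts from each course (stratum),
# so every course contributes proportionally to the assessment sample.
# Names and the `fraction` parameter are illustrative assumptions.
import math
import random

def stratified_sample(artifacts_by_course, fraction=0.5, seed=None):
    """artifacts_by_course: {course: [artifact, ...]}.
    Returns a flat list with ceil(fraction * n) artifacts drawn at random,
    without replacement, from each course's n artifacts."""
    rng = random.Random(seed)
    selected = []
    for course in sorted(artifacts_by_course):
        artifacts = artifacts_by_course[course]
        k = math.ceil(fraction * len(artifacts))
        selected.extend(rng.sample(artifacts, k))
    return selected
```

Sampling within each stratum, rather than from the pooled 142 artifacts, prevents a large course from crowding small courses out of the sample.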

QR Findings

Overall, the majority of student work represented Developed or Highly Developed work quality in terms of the QR criteria of Representation, Calculation, and Interpretation. However, the student work assessed was not representative of the Assumption criterion of QR (see Figure 1). This was true to a certain degree for all but two of the QR criteria assessed; as such, a closer look at the student learning outcomes and the alignment of courses and outcomes to the QR institutional outcome should be conducted.

[Figure 1: Quantitative Reasoning Overview. Bar chart showing, for each criterion (Representation, Calculation, Communication, Interpretation, Application/Analysis, Assumption), the percentage of student scores that were Not Evident versus Developed/Highly Developed.]

When disaggregated, students performed very well on the Representation criterion. Here, students were assessed on their ability to represent and/or present information in quantitative/mathematical formats. With the vast majority of students scored at either Developed or Highly Developed, students at Pitzer are able to do this well (see Figure 2).

[Figure 2: QR Criterion Representation. Representation score distribution across rubric levels.]

Office of Academic Assessment Omar Safie Pg. 4
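The tabulation behind Figures 1 through 7 is a simple tally of rubric scores into percentage distributions. The sketch below is an assumption-laden illustration: the four level labels follow the report, but the 0-3 numeric coding and the example data are invented for demonstration.

```python
# Hypothetical sketch: tallying rubric scores for one criterion into the
# percentage distribution shown in the figures.
# The 0-3 coding of rubric levels is an assumption for illustration.
from collections import Counter

LEVELS = {0: "Not Evident", 1: "Emerging", 2: "Developed", 3: "Highly Developed"}

def score_distribution(scores):
    """scores: list of integer rubric scores (0-3) for one criterion.
    Returns {level label: percent of artifacts}, rounded to whole percents."""
    counts = Counter(scores)
    total = len(scores)
    return {label: round(100 * counts.get(code, 0) / total)
            for code, label in LEVELS.items()}
```

Running one such tally per criterion over the 71 artifacts would reproduce the per-criterion distributions the figures report.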

As part of the QR direct assessment, students were also assessed on their ability to calculate quantitative information. As with Representation, the vast majority of students are able to calculate quantitative information very well. While not a considerably large percentage, it should be noted that 21% of students were scored at Emerging given the work samples provided. It will be important to re-examine this criterion at the next assessment of QR to see how it changes over time (see Figure 3).

[Figure 3: QR Criterion Calculation. Calculation score distribution across rubric levels.]

Students were also assessed on their ability to communicate quantitative information to support an argument or purpose. Again, the majority of students were scored at Developed or Highly Developed, indicating that they are able to communicate quantitative information, at least through written work, very well. However, with nearly a quarter of the work assessed not showing evidence of the Communication criterion, a closer examination of the alignment of course assessment strategies to QR criteria and associated learning outcomes should be conducted prior to the next assessment of QR (see Figure 4).

[Figure 4: QR Criterion Communication. Communication score distribution across rubric levels.]

As part of the QR assessment, students were also evaluated on their ability to interpret and explain information presented in quantitative formats. Only a slight majority of students were scored at Developed or Highly Developed; a considerable percentage were scored at Emerging, and an even greater percentage of student work did not demonstrate evidence of the Interpretation criterion. As such, a review of the alignment of this criterion to course assessment strategies and associated learning outcomes should be conducted prior to the next assessment of QR (see Figure 5).

[Figure 5: QR Criterion Interpretation. Interpretation score distribution across rubric levels.]

The QR criterion Application/Analysis assesses students' ability to make judgments and draw conclusions based on the analysis of quantitative data, and to recognize the limits associated with those conclusions. While only a small percentage of student work was scored at Emerging, nearly half of the student work assessed did not demonstrate any evidence of this criterion. Again, alignment of this criterion with the associated student work and with student learning outcomes at the institutional and course levels will be necessary to determine the fidelity of this criterion in future assessments (see Figure 6).

[Figure 6: QR Criterion Application/Analysis. Application/Analysis score distribution across rubric levels.]

The Assumption criterion is meant to determine how well students are able to identify important assumptions in quantitative work and findings. It is very clear that the vast majority of student work used in this assessment did not provide evidence of this criterion. In addition to examining the alignment of this criterion to associated student work and to student learning outcomes at the institutional and course levels, an additional assessment of whether this criterion is applicable to Pitzer's QR Educational Objective should be conducted (see Figure 7).

[Figure 7: QR Criterion Assumptions. Assumption score distribution across rubric levels; the vast majority of scores were Not Evident.]

Syllabus Analysis

In order to provide a general picture of the top ten courses and how well they were aligned with the QL VALUE rubric used in this direct assessment, it was important to examine the courses as described in the syllabi provided, and the alignment of the student artifacts to the rubric. In terms of course student learning outcomes (SLOs), eight of the ten courses did have SLOs, and of these, an average of 83% of the SLOs were aligned to QL VALUE rubric criteria. Five of the ten courses had all of their SLOs linked to QL VALUE rubric criteria. While these are strong numbers, there is still room for improvement, particularly because some SLOs were not clear about the expected student outcome, nor was it clear whether the SLOs were aligned to the student assessments.

In addition to the syllabus-specific findings, the student artifacts provided for the direct assessment were also limited in their representation of the QL VALUE rubric criteria. As part of the direct assessment, we also assessed whether each rubric criterion was evident in the student work. Combined with the findings from the direct assessment presented earlier, it is very clear that student work strongly presented evidence of some criteria while not providing evidence for others. This is extremely important to review further in relation to the SLOs written into each syllabus, to determine whether all of these aspects of QL should be utilized for all QR courses at Pitzer.

[Figure 8: Percentage of QL VALUE Criteria Not Evident in Student Artifacts. Bar chart showing, for each criterion (Assumptions, Application/Analysis, Communication, Interpretation, Calculation, Representation), the percentage of student artifacts in which the criterion was not evident.]

Conclusions and Next Steps

It is clear that when specific QR criteria are represented in course expectations, students tend to address them at a high level. However, the criteria must be represented in the syllabi through SLOs, aligned curriculum, and aligned student assessments, which is not the norm here at Pitzer. In addition, QR is currently represented only in typically mathematically oriented courses; how QR may be represented in other courses should be examined further as part of an institution-wide discussion. With this in mind, here are some suggestions for next steps:

1) Clearly define QR within the Pitzer context, with specific institutional-level outcomes.
2) Ensure that identified QR courses are aligned to institutional-level outcomes and expectations of students.
3) Expand the discussion of QR, and of where it is evidenced at Pitzer, beyond the traditional courses, such as mathematics and statistics courses.

Appendix A: Quantitative Literacy VALUE Rubric

QUANTITATIVE LITERACY VALUE RUBRIC
For more information, please contact value@aacu.org.

The VALUE rubrics were developed by teams of faculty experts representing colleges and universities across the United States through a process that examined many existing campus rubrics and related documents for each learning outcome and incorporated additional feedback from faculty. The rubrics articulate fundamental criteria for each learning outcome, with performance descriptors demonstrating progressively more sophisticated levels of attainment. The rubrics are intended for institutional-level use in evaluating and discussing student learning, not for grading. The core expectations articulated in all 15 of the VALUE rubrics can and should be translated into the language of individual campuses, disciplines, and even courses. The utility of the VALUE rubrics is to position learning at all undergraduate levels within a basic framework of expectations such that evidence of learning can be shared nationally through a common dialog and understanding of student success.

Definition

Quantitative Literacy (QL), also known as Numeracy or Quantitative Reasoning (QR), is a "habit of mind," competency, and comfort in working with numerical data. Individuals with strong QL skills possess the ability to reason and solve quantitative problems from a wide array of authentic contexts and everyday life situations. They understand and can create sophisticated arguments supported by quantitative evidence and they can clearly communicate those arguments in a variety of formats (using words, tables, graphs, mathematical equations, etc., as appropriate).

Quantitative Literacy Across the Disciplines

Current trends in general education reform demonstrate that faculty are recognizing the steadily growing importance of Quantitative Literacy (QL) in an increasingly quantitative and data-dense world. AAC&U's recent survey showed that concerns about QL skills are shared by employers, who recognize that many of today's students will need a wide range of high-level quantitative skills to complete their work responsibilities. Virtually all of today's students, regardless of career choice, will need basic QL skills such as the ability to draw information from charts, graphs, and geometric figures, and the ability to accurately complete straightforward estimations and calculations.

Preliminary efforts to find student work products which demonstrate QL skills proved a challenge in this rubric creation process. It is possible to find pages of mathematical problems, but what those problem sets don't demonstrate is whether the student was able to think about and understand the meaning of her work. It is possible to find research papers that include quantitative information, but those papers often don't provide evidence that allows the evaluator to see how much of the thinking was done by the original source (often carefully cited in the paper) and how much was done by the student herself, or whether conclusions drawn from analysis of the source material are even accurate. Given widespread agreement about the importance of QL, it becomes incumbent on faculty to develop new kinds of assignments which give students substantive, contextualized experience in using such skills as analyzing quantitative information, representing quantitative information in appropriate forms, completing calculations to answer meaningful questions, making judgments based on quantitative data, and communicating the results of that work for various purposes and audiences. As students gain experience with those skills, faculty must develop assignments that require students to create work products which reveal their thought processes and demonstrate the range of their QL skills.

This rubric provides for faculty a definition for QL and a description of four levels of QL achievement which might be observed in work products within work samples or collections of work. Members of AAC&U's rubric development team for QL hope that these materials will aid in the assessment of QL but, equally important, we hope that they will help institutions and individuals in the effort to more thoroughly embed QL across the curriculum of colleges and universities.

Framing Language

This rubric has been designed for the evaluation of work that addresses quantitative literacy (QL) in a substantive way. QL is not just computation, not just the citing of someone else's data. QL is a habit of mind, a way of thinking about the world that relies on data and on the mathematical analysis of data to make connections and draw conclusions. Teaching QL requires us to design assignments that address authentic, data-based problems. Such assignments may call for the traditional written paper, but we can imagine other alternatives: a video of a PowerPoint presentation, perhaps, or a well-designed series of web pages. In any case, a successful demonstration of QL will place the mathematical work in the context of a full and robust discussion of the underlying issues addressed by the assignment.

Finally, QL skills can be applied to a wide array of problems of varying difficulty, confounding the use of this rubric. For example, the same student might demonstrate high levels of QL achievement when working on a simplistic problem and low levels of QL achievement when working on a very complex problem. Thus, to accurately assess a student's QL achievement it may be necessary to measure QL achievement within the context of problem complexity, much as is done in diving competitions where two scores are given, one for the difficulty of the dive and the other for the skill in accomplishing the dive. In this context, that would mean giving one score for the complexity of the problem and another score for the QL achievement in solving the problem.

Office of Academic Assessment Omar Safie Pg. 9

Rubric Criteria

Interpretation: Ability to explain information presented in mathematical forms (e.g., equations, graphs, diagrams, tables, words).
Representation: Ability to convert relevant information into various mathematical forms (e.g., equations, graphs, diagrams, tables, words).
Calculation
Application/Analysis: Ability to make judgments and draw appropriate conclusions based on the quantitative analysis of data, while recognizing the limits of this analysis.
Assumptions: Ability to make and evaluate important assumptions in estimation, modeling, and data analysis.
Communication: Expressing quantitative evidence in support of the argument or purpose of the work (in terms of what evidence is used and how it is formatted, presented, and contextualized).

Evaluators are encouraged to assign a zero to any work sample or collection of work that does not meet benchmark (cell one) level performance.

Interpretation
- Capstone (4): Provides accurate explanations of information presented in mathematical forms. Makes appropriate inferences based on that information. For example, accurately explains the trend data shown in a graph and makes reasonable predictions regarding what the data suggest about future events.
- Milestone (3): Provides accurate explanations of information presented in mathematical forms. For instance, accurately explains the trend data shown in a graph.
- Milestone (2): Provides somewhat accurate explanations of information presented in mathematical forms, but occasionally makes minor errors related to computations or units. For instance, accurately explains trend data shown in a graph, but may miscalculate the slope of the trend line.
- Benchmark (1): Attempts to explain information presented in mathematical forms, but draws incorrect conclusions about what the information means. For example, attempts to explain the trend data shown in a graph, but will frequently misinterpret the nature of that trend, perhaps by confusing positive and negative trends.

Representation
- Capstone (4): Skillfully converts relevant information into an insightful mathematical portrayal in a way that contributes to a further or deeper understanding.
- Milestone (3): Competently converts relevant information into an appropriate and desired mathematical portrayal.
- Milestone (2): Completes conversion of information but resulting mathematical portrayal is only partially appropriate or accurate.
- Benchmark (1): Completes conversion of information but resulting mathematical portrayal is inappropriate or inaccurate.

Calculation
- Capstone (4): Calculations attempted are essentially all successful and sufficiently comprehensive to solve the problem. Calculations are also presented elegantly (clearly, concisely, etc.).
- Milestone (3): Calculations attempted are essentially all successful and sufficiently comprehensive to solve the problem.
- Milestone (2): Calculations attempted are either unsuccessful or represent only a portion of the calculations required to comprehensively solve the problem.
- Benchmark (1): Calculations are attempted but are both unsuccessful and not comprehensive.

Application/Analysis
- Capstone (4): Uses the quantitative analysis of data as the basis for deep and thoughtful judgments, drawing insightful, carefully qualified conclusions from this work.
- Milestone (3): Uses the quantitative analysis of data as the basis for competent judgments, drawing reasonable and appropriately qualified conclusions from this work.
- Milestone (2): Uses the quantitative analysis of data as the basis for workmanlike (without inspiration or nuance, ordinary) judgments, drawing plausible conclusions from this work.
- Benchmark (1): Uses the quantitative analysis of data as the basis for tentative, basic judgments, although is hesitant or uncertain about drawing conclusions from this work.

Assumptions
- Capstone (4): Explicitly describes assumptions and provides compelling rationale for why each assumption is appropriate. Shows awareness that confidence in final conclusions is limited by the accuracy of the assumptions.
- Milestone (3): Explicitly describes assumptions and provides compelling rationale for why assumptions are appropriate.
- Milestone (2): Explicitly describes assumptions.
- Benchmark (1): Attempts to describe assumptions.

Communication
- Capstone (4): Uses quantitative information in connection with the argument or purpose of the work, presents it in an effective format, and explicates it with consistently high quality.
- Milestone (3): Uses quantitative information in connection with the argument or purpose of the work, though data may be presented in a less than completely effective format or some parts of the explication may be uneven.
- Milestone (2): Uses quantitative information, but does not effectively connect it to the argument or purpose of the work.
- Benchmark (1): Presents an argument for which quantitative evidence is pertinent, but does not provide adequate explicit numerical support. (May use quasi-quantitative words such as "many," "few," "increasing," "small," and the like in place of actual quantities.)