2006-853: USING RUBRICS FOR THE ASSESSMENT OF SENIOR DESIGN PROJECTS


John K. Estell, Ohio Northern University
JOHN K. ESTELL is Chair of the Electrical & Computer Engineering and Computer Science Department at Ohio Northern University. He received his doctorate from the University of Illinois at Urbana-Champaign. His areas of research include simplifying the outcomes assessment process, user interface design, and the pedagogical aspects of writing computer games. Dr. Estell is a Senior Member of IEEE, and a member of ACM, ASEE, Tau Beta Pi, Eta Kappa Nu, and Upsilon Pi Epsilon.

Juliet Hurtig, Ohio Northern University
JULIET K. HURTIG is an Associate Professor of Electrical Engineering and Assistant Dean of the T.J. Smull College of Engineering at Ohio Northern University. Her doctorate is from The Ohio State University. Research interests include control systems, nonlinear system identification, and undergraduate pedagogical methods. Dr. Hurtig is a member of IEEE, ASEE, and Tau Beta Pi.

American Society for Engineering Education, 2006

Using Rubrics for the Assessment of Senior Design Projects

Abstract

The process of evaluating senior design projects typically involves assessing reports and presentations, then assigning relatively broad performance categories to the work. Unfortunately, the use of professional judgment in this process varies from faculty member to faculty member; as a consequence, one person's "excellent" can be another person's "very good." The lack of standard definitions for such terms acts as an impediment to fair and impartial grading of student performance. At its 2002 Faculty Retreat, the Electrical & Computer Engineering and Computer Science (ECCS) Department at Ohio Northern University examined the effectiveness of the senior design evaluation process. Senior design at this school is a year-long endeavor, with multiple teams of faculty grading several capstone projects each at the end of each quarter. The differences between the individual graders and between each team of graders were readily apparent, making it difficult to negotiate a final, fair course letter grade for the students. Accordingly, standard definitions were developed in the form of rubrics for each of the four communication formats utilized in our senior design sequence. These rubrics are distributed at the beginning of each term to both students and faculty. The written report evaluation is conducted through the use of three separate rubrics: writing style, technical design, and consideration factors for addressing the coverage of multiple realistic design constraints as called for in the ABET Criteria. Each student team is required to make two oral presentations throughout the capstone process; a rubric specific to oral presentations guides these evaluations. External evaluators are invited to campus to judge the senior design poster competition, and these individuals follow a rubric specific to the poster format. A final rubric focuses on web site design, where students provide an overview of their project and the results obtained. Since the introduction of rubrics in the 2002-2003 academic year, subsequent evaluations performed at annual Faculty Retreats have indicated that the rubrics have been a successful model for conducting the evaluation of the various aspects of the senior design experience. Additionally, by coalescing subjective faculty judgments into an objective numerical form through the use of rubrics, the results can be readily used for program outcomes assessment. As a result of this methodology, what was once considered by the students to be an essentially random process is now a more uniform process grounded in a fundamental set of definitions accessible to all.

I. Senior Design Sequence

All seniors in the College of Engineering are required to complete a capstone project. The senior capstone project is not the student's first exposure to formal design work; however, it does challenge students to draw from all of their previous coursework and complete a design that is large enough in scope to require a team effort and a six-month time period. The ECCS Department developed a year-long, three-course senior design sequence common to all majors offered in the department [1]. This approach allows students to work on interdisciplinary comprehensive projects, and also allows for participation on interdepartmental teams. The students are presented with a mixture of faculty- and industry-sponsored projects and are assigned to project teams based upon their specified preferences. The course sequence requires the students to research an open-ended problem statement, develop a proposal, design a prototype, validate the design, produce a physical deliverable, and report the results. As part of the experience, students deal with various management issues and technical aspects of design. Both written and oral communication of the proposal and final project results are required, and all aspects of project documentation are available on the team's website. Applying this methodology to the capstone process has improved the overall quality of the project designs and better prepared our graduates for their industrial careers. Faculty evaluate projects at the end of each quarter through sets of rubrics; external feedback is obtained through project group interactions with the department's industrial advisory board and with the local IEEE branch. Two competitions, for the best poster and oral presentations, are held to provide performance incentives.

II. Why Rubrics?

The assigning of grades to a senior design project can be a cumbersome experience. By its very nature, a culminating design experience such as that called for in ABET Criterion 4 draws from several performance areas, and the evaluation of student performance in many of these areas can be very subjective and time-consuming. Accordingly, there is the temptation to utilize a holistic approach to the grading of such design projects. The efficiency of assigning a single grade to the overall project, or to an individual component of a project such as an oral presentation, makes such an approach compelling, especially for those instructors who profess to intrinsically know the difference between 'A'-level and 'B'-level work. However, what is gained in efficiency is more than offset by the lost opportunity for an understanding of student performance, or, more to the point, of deficiencies in student performance. When a professor holistically assigns a grade of 'D' for an oral presentation, how can one properly evaluate student performance such that appropriate action can be taken as part of a continuous improvement process? It could be that the low grade was for glaring grammatical errors, or for a flawed design based on a poor understanding of certain engineering concepts. If it was determined that the curriculum was to blame, an action plan for correcting poor grammar would be radically different from an action plan for reinforcing the pertinent engineering concepts. Furthermore, as senior design projects usually involve multiple faculty members, there is a question of fairness, as grading standards will often differ between faculty members. Simply put, the holistic approach does not provide sufficient detail for use in the outcomes assessment process [2].

From a practical standpoint, the evaluation of senior design projects needs to be conducted with sufficient justification and consistency present in the results to support objective decisions for the assignment of grades and for the assessment of multiple program outcomes. This requires an analytic approach to grading, where the assignment is broken down into its constituent parts, with each part being scored independently [3]. Rubrics are becoming popular as a mechanism for conducting assessment, particularly in performance areas where there can be an inherent amount of subjectivity. A rubric is simply a scoring guide, consisting of a set of performance criteria against which a student is evaluated [4]. The criteria describe traits that constitute specified goals which are embodied within the course assignment.

To measure how well a criterion is being achieved, descriptive indicators are used that identify typical traits commensurate with a specified performance level. The use of rubrics presents many benefits. Instructors are forced to examine an assignment and determine ahead of time the criteria upon which that assignment is to be graded. The amount of time spent evaluating student work is lessened, as performance in each criterion can be categorized according to exhibited traits that correspond to the specified descriptive indicators. If the rubrics are distributed to students at the time the assignment is made, student performance is often improved, as there are now clear guidelines as to what the expectations are. Collectively, the individual criterion scores on an assignment provide information to the instructor as to which instructional areas are in need of improvement. When multiple faculty are involved, the use of rubrics provides a common framework that establishes a level playing field for all students, minimizing the potential for inconsistent scoring. Finally, the use of rubrics constitutes a form of authentic assessment, where work can be measured according to real-life criteria; for example, written reports can be evaluated under the same criteria as those used for rating manuscripts submitted for journal publication.

The creation of rubrics can be a challenging process; fortunately, there are abundant resources available in the educational community providing tips on the design of rubrics [5,6,7]. One can apply these tips toward the development of rubrics for use in senior design. First and foremost, one needs to identify the relevant criteria for each assignment in the senior design curriculum; this is best done by consulting one's program outcomes and identifying those aspects of each assignment that can be used as evidence to support said outcomes. For scoring the indicators, the adoption of a 4-point scale has been found useful [7], as it avoids the neutral response typically found in a 5-point Likert scale, thereby forcing the selection of a positive or negative rating. An additional benefit is that the use of a 4-point scale provides a one-to-one correspondence between the rubric and established mechanisms used to report the results of both course and program outcomes assessment [8,9]. There are different ways to approach the delineation of the categories. Newell, Dahm, and Newell [7] give example classifications of "meta-cognitive expert," "skilled problem solver lacking meta-cognition," "student with some skills but lacking the level of performance required for an engineering graduate," and "novice problem solver." Andrade [10] borrows from an unnamed elementary school teacher in providing gradation descriptions of "yes," "yes, but," "no, but," and "no" for the indicators. When writing the descriptive indicators for each criterion, it is best to build from a position of excellence, and then work one's way down in quality, in order to establish a valid target to which students can aspire. The descriptive indicators need to focus on the central features of performance, which are not necessarily the easiest to count, see, or score [7]. Use of bulleted indicators makes descriptions less ambiguous, and therefore more reliable, by providing concrete examples of what to look for. The descriptions need to be written such that the jumps between categories are as continuous as possible, preferably constructed in parallel with all other indicators in terms of the language used, and coherent in that language changes should only reflect the variance in quality and not implicitly introduce new criteria. Each descriptive indicator for a criterion should avoid both unclear and unnecessarily negative language. Additionally, the descriptive language used should be sufficiently rich to allow for student self-evaluation, and it should be reliable such that it enables consistent scoring across both judges and time. This requires that evaluative language ("excellent," "poor") and comparative language ("better than," "worse than") be transformed into highly descriptive language that specifies the distinctive features of each performance level [5].
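To make this structure concrete, the following minimal sketch represents a rubric in the way described above: a set of criteria, each with four descriptive indicators indexed by the 0-3 scale. The criterion names and indicator text are abbreviated hypothetical placeholders, not the department's actual rubric language.

```python
# Illustrative sketch of a rubric as a data structure: each criterion maps to
# four descriptive indicators, indexed by score (3 = Expert ... 0 = Novice).
# Criterion names and indicator text are hypothetical placeholders.

RUBRIC = {
    "Organization": {
        3: "Document is easily navigated; ideas flow progressively.",
        2: "Document is organized; layout could be more effective.",
        1: "Ordering of ideas within sections is occasionally confusing.",
        0: "No apparent ordering of paragraphs; no progressive flow of ideas.",
    },
    "Language": {
        3: "Sentences are complete and grammatical; terms are used precisely.",
        2: "Only minor errors that do not distract the reader.",
        1: "Errors occasionally distract the reader and interfere with meaning.",
        0: "Frequent errors interfere with meaning; jargon is unexplained.",
    },
}

def validate_score_sheet(scores: dict[str, int]) -> None:
    """Check that a completed score sheet covers every criterion with a 0-3 score."""
    for criterion in RUBRIC:
        if scores.get(criterion) not in (0, 1, 2, 3):
            raise ValueError(f"Missing or invalid score for: {criterion}")

validate_score_sheet({"Organization": 2, "Language": 3})
```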

Ideally, the rubrics should be sufficiently generic for use throughout the curriculum, as this requires less overall work and provides a basis for conducting cohort longitudinal assessment.

III. Utilizing Rubrics in Senior Design

ABET Criterion 4 states that students must participate in a culminating design experience that incorporates appropriate engineering standards and multiple realistic constraints. The senior capstone process as implemented in the ECCS Department [1] runs on a three-quarter approach that breaks down the senior capstone into three main components: proposal development, prototype development and verification, and final reporting. Rubrics assist each phase of the capstone evaluation process, and are presented in this section. Examples of all of these rubrics can be found in the appendix.

A. Engineering Design Proposal

The capstone proposal begins with a problem identification statement that specifically addresses the history of the problem and the project goals and deliverables, answering the basic "What?" question. Students are then tasked to complete further research and information gathering, which will support the definition of the project. Here, various design solutions are proposed and evaluated via decision matrices. For each project, the realistic constraints as listed in Criterion 3(c) (economic, environmental, sustainability, manufacturability, ethical, health and safety, social, and political factors) are individually assessed by each team member. For each constraint, the team as a whole formulates a position which is then incorporated into its proposal. The proposal then presents the best solution for the stated problem and a detailed system block diagram with team member responsibilities assigned to each block. The proposal concludes with an economic budget presentation and a Gantt chart for the project scheduling. The course culminates with both the submission of the written project proposal and an oral presentation based upon the proposal. All faculty members in the department participate in the review process; however, due to the number of both design teams and faculty members, each written proposal and oral presentation is reviewed by a three- to five-member committee of the faculty, which allows for both fewer reports to review and the setting up of parallel tracks to speed up the oral presentation process. Each committee is composed of representatives from all degree programs in the department, thereby providing a broad experience base that can be collectively brought to bear upon each proposal and presentation.

The written proposal employs three separate rubrics: one for the evaluation of the technical writing aspects of the report, one for the evaluation of the technical design aspects of the project, and one specifically for the evaluation of the realistic constraints inherent to the project. The oral presentation employs one rubric that covers elements of oral technical communication skills as applied to the project. Each rubric category is evaluated on a zero-to-three-point scale, where descriptions are provided for every performance indicator level for all of the present criteria. Figure 1 provides examples of criteria from the rubric used to evaluate the technical writing aspects of the proposal.

Visual Format and Organization
  3 - Expert: The document is visually appealing and easily navigated. Appropriate typography and usage of white space are used as appropriate to separate blocks of text and add emphasis.
  2 - Practitioner: The document is organized. Use of white space and typography help the reader navigate the document, although the layout could be more effective.
  1 - Apprentice: Errors in the Table of Contents are present. Within sections, the order in which ideas are presented is occasionally confusing.
  0 - Novice: The document is not visually appealing and there are few cues to help the reader navigate the document. There is no apparent ordering of paragraphs, and thus there is no progressive flow of ideas.

Language (Word Choice, Grammar)
  3 - Expert: Sentences are complete and grammatical. They flow together easily. Words are chosen for their precise meaning. Engineering terms and jargon are used correctly. No misspelled words are present.
  2 - Practitioner: For the most part, sentences are complete and grammatical, and they flow together easily. Any errors are minor and do not distract the reader. Repetition of words and phrases is mostly avoided. For the most part, terms and jargon are used correctly with some attempt to define them. There are one or two misspelled words.
  1 - Apprentice: In a few places, errors in sentence structure and grammar distract the reader and interfere with meaning. Word choice could be improved. Occasionally, technical jargon is used without definition. There are a few misspelled words.
  0 - Novice: Errors in sentence structure and grammar are frequent enough that they distract the reader and interfere with meaning. There is unnecessary repetition of the same words and phrases. There is an overuse of jargon and technical terms without adequate explanation of their meaning. There are many misspelled words.

Figure 1. Example criteria from the written report rubric.

Following the evaluation of the proposals and presentations, the faculty return their individual rubric score sheets for each team to the senior design sequence coordinator, who enters the data into a spreadsheet for tabulating the results. The results for each evaluation category are averaged and weighted to form the total score for each team, and are reported to each faculty team supervisor, who then assigns letter grades based in large part upon this information. An example of such a report is presented in Figure 2.

Design Team #1             Prof A   Prof B   Prof C   Prof D   Average
Oral Presentation (25%)      91       72       63       57      70.3
Written Report (30%)         83       67       50       56      63.9
Technical Design (20%)       96       79       67       58      75.0
Constraints (25%)            88       66       50       50      63.3
Weighted Total %                                                67.6

Design Team #2             Prof A   Prof B   Prof C   Prof D   Average
Oral Presentation (25%)      78       66       63       47      63.3
Written Report (30%)         67       50       50       50      54.2
Technical Design (20%)       92       67       50       50      64.6
Constraints (25%)            72       56       50       50      57.0
Weighted Total %                                                59.2

Figure 2. Rubric evaluation report.
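For readers who wish to mirror the coordinator's spreadsheet, the sketch below implements the tabulation just described: per-category scores from each evaluator are averaged, then combined using the category weights shown in Figure 2. The evaluator scores in this sketch are hypothetical, and the actual spreadsheet may apply rounding or weighting details not spelled out in the text.

```python
# Sketch of the coordinator's tabulation: average each category across
# evaluators, then apply the category weights from Figure 2.
# The scores below are hypothetical, not taken from an actual evaluation.

WEIGHTS = {
    "Oral Presentation": 0.25,
    "Written Report":    0.30,
    "Technical Design":  0.20,
    "Constraints":       0.25,
}

# One percentage score per evaluating faculty member, keyed by category.
scores = {
    "Oral Presentation": [85, 70, 65, 60],
    "Written Report":    [80, 65, 55, 60],
    "Technical Design":  [90, 75, 70, 65],
    "Constraints":       [85, 70, 60, 55],
}

averages = {cat: sum(vals) / len(vals) for cat, vals in scores.items()}
weighted_total = sum(WEIGHTS[cat] * averages[cat] for cat in WEIGHTS)

for cat in WEIGHTS:
    print(f"{cat} ({WEIGHTS[cat]:.0%}): {averages[cat]:.1f}")
print(f"Weighted Total %: {weighted_total:.1f}")
```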

B. Prototyping

The ECCS Department's design process follows a seven-step procedure. The proposal incorporates the areas of problem identification, research and information gathering, definition of the project, and the development of the plan. Winter quarter begins with each student team resolving any comments or suggestions regarding their graded and reviewed proposal. The team is then ready to move on to the next steps: execution of the plan and verification of the design.

The winter quarter course is a three-credit-hour course which does not meet for lecture. Instead, each student is expected to log a minimum of nine hours per week working on their project. The students track their hours in a lab book which includes the student's name, the amount of time spent, and a brief description of the task addressed. The log book is stored in their project lab where it can be accessed by all team members and the faculty advisor. Most faculty advisors will meet weekly with the teams to evaluate progress, and require team meeting minutes and other progress report memos. The course grade is again determined using a rubric system, as found in the appendix. Each rubric category is evaluated on a zero-to-three-point scale, where definitions are provided for each performance indicator level. In this quarter, the grade is determined solely by the team's faculty advisor. The weekly status reports, team meetings, and lab log books are worth 45% of the course grade. The students utilize a peer-peer evaluation, again for 30% of their course grade, and an end-of-quarter written or oral progress report contributes the remaining 25%. If chosen, the oral progress report is typically made in front of the department's freshman engineering students. This allows students just entering the program to realize the possibilities of projects and the level of presentations required in the engineering profession.

C. Final Reporting

Engineering Technical Communication is the last course in the senior design sequence and focuses on the last step of the ECCS Department's seven-step design process, namely communicating the design results. Ideally, at this point the design implementation is complete and the project verification indicates that all project specifications have been met. Often delays will occur, and if a project finds itself needing two or three more weeks for completion, the students can continue to work on their designs while the course lectures focus on what technical communication requires: to inform and/or persuade the audience. By the end of the quarter, students will complete a final written report, an oral presentation, and a project website, and enter a poster competition. These tasks will require teamwork in order to be accomplished within the course time frame. The course grade is based mainly on the group projects (70%), with the remaining points earned on an individual basis. This 30% is composed of homework and attendance (10%) and the peer-peer evaluation score (20%). The group projects are evaluated again with the use of rubrics, with the same point scale as the previous ones. The significant difference this quarter comes via the evaluators, as both faculty and external evaluators are utilized. The final written report and team website are graded by a committee of the department's faculty, just as in the fall quarter course. The external evaluators enter the process via the two poster competitions and the oral presentation. The first poster competition is hosted by the local section of IEEE. Members of this section come to campus for a banquet dinner with the seniors and then judge the posters. The department also schedules an afternoon meeting that day with its industrial advisory board, and invites the members to attend the banquet and participate in the judging. The College's advisory board also judges the posters of all capstone projects later in the quarter. For this event, the posters are hung in the hallways of the building to allow all students to view the posters as they walk to class. These external evaluators provide opportunities in which the team members have to assess their audience and tailor their presentations appropriately.

IV. Continuing Improvements

The adoption of a rubrics-based approach to the evaluation of senior design projects in the ECCS Department occurred as a result of implementing an outcomes-based assessment process in preparation for an accreditation visit under the then-new Engineering Criteria 2000 guidelines. An analysis of the evaluation methodology then in place showed a lack of definition. An example can be seen in the Mechanics section of the Oral Presentations Evaluation Form that is presented in Figure 3.

Evaluation Form - Senior Design Oral Presentations - ECCS Department

Project: ______________________________

Oral Presentation Mechanics: (60%)                  A  B  C  D  F
  Introduction ....................................  5  4  3  2  1
  Body (organization, progression) ................  5  4  3  2  1
  Smooth transition between speakers ..............  5  4  3  2  1
  Visuals (appropriate, readable, spelling) .......  5  4  3  2  1
  Presentation level appropriate for audience .....  5  4  3  2  1
  Delivery ........................................  5  4  3  2  1
  Conclusion ......................................  5  4  3  2  1
  Handling of questions ...........................  5  4  3  2  1

Figure 3. Original Evaluation Tool for Oral Presentations

The problem observed with this approach was with the definition of the scores through the use of letter grades, as it assumes that an evaluator "knows" what 'A'-level work is in each category and can differentiate it from the other letter-grade levels of work. Unfortunately, these types of definitions vary from instructor to instructor, which leads to grading inconsistencies; it was because of these observed inconsistencies that a rubrics-based approach was adopted at the department's 2002 Faculty Retreat. A five-point scale was adopted for use, where the middle indicator was used to indicate satisfactory performance. The rubrics provided specifications for only three of the indicators, with the premise that the other two unnamed indicator values would be for instances where the student performance fell in between the classifications provided by the indicator descriptions. Figure 4 presents an example of a portion of the original oral presentation rubric.

Problem Identification
  4 - Exemplary: Insightful and in-depth background information is provided to illuminate the issues through inclusion of: history relevant to the presentation; the big picture; a succinct description of the significance of the project.
  2 - Satisfactory: Background information is provided, including references to the work of others and an explanation of why the project was undertaken, to help put the presentation in context.
  0 - Unacceptable: Little or no background information is presented to help the audience understand the history and significance of the project.

Visuals
  4 - Exemplary: Communication aids enhance the presentation. They are prepared in a professional way. Font on visuals is large enough to be seen by all. Information is organized to maximize audience understanding. Details are minimized so that main points stand out.
  2 - Satisfactory: Communication aids contribute to the quality of the presentation. Font size is appropriate for reading. Appropriate information is included. Some material is not supported by visual aids.
  0 - Unacceptable: Communication aids are either not used, or they are poorly prepared or used inappropriately. Font is too small to be easily seen. Too much information is included. Unimportant material is highlighted. Listeners may be confused.

Clarity of Purpose
  4 - Exemplary: The project's objectives are clearly stated. Motivation for pursuing the project and its relevance are clearly established. The audience can understand the significance of the work.
  2 - Satisfactory: The project's objectives are presented. The motivation for pursuing the project and its relevance are addressed. The discussion is reasonably clear.
  0 - Unacceptable: The project's objectives are missing or incomplete. There is little or no discussion of motivation or relevance. The listener is confused about the nature of the project and why it was undertaken.

Technical Detail
  4 - Exemplary: A high level of relevant detail is presented to allow the audience to make judgments about the content. The details are not so elaborate that the presentation becomes tedious.
  2 - Satisfactory: Sufficient technical detail is included to enable the audience to understand the nature of progress. In places, the information was too detailed or was lacking.
  0 - Unacceptable: Significant amounts of technical detail are lacking or inadequate, so that the audience cannot appreciate the progress that has been made.

Figure 4. Portion of Original Oral Presentation Rubric

While the adoption of this rubric style did provide definition to the assessment criteria, and as a result worked better than the previous assessment method, there were problems observed in its use. First, some faculty erroneously assumed that the only possible scores that could be assigned were the three defined indicators. Second, several of the rubrics had a large number of criteria that, when the descriptive indicators were included, took up both sides of a sheet of paper. The descriptions for the indicators were written in complete sentences, with a new sentence often starting on the same line as the end of the old one. Additionally, the scores were reported on a separate sheet of paper. This combination resulted in a cumbersome evaluation process where, while trying to listen to an oral presentation, one was flipping the rubric page over in search of a criterion, having to read the descriptive indicators, and then being required to record the result elsewhere.
Another problem was that, for the reporting of course and program outcomes, the department had adopted the use of a 4-tuple performance vector [9]; this set up an impediment when trying to convert the assessment of a 5-point criterion into a 4-tuple vector. The students were also reporting problems with the rubrics. As students reviewed their score sheets at the end of each quarter, teams were questioning why one faculty member scored them as "4 - Exemplary" straight down the categories while another faculty member scored them consistently as "2 - Satisfactory." It can be explained that professional judgment varies from one professor to another, but that there is consistency within one faculty member's grading across all teams. However, no student easily accepts that they were always at the same performance level in all categories. They expect to see some "good job" marks in some categories and room-for-improvement marks in other areas. If a faculty member truly feels a team is consistent across all categories, there must be written notes to substantiate to the students that the professor took the time to come to this conclusion and that it was not just a cursory judgment.
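The revision described below resolves the conversion impediment by making the rubric scale line up with the reporting scale. As a rough illustration (an assumption made for this sketch; reference 9 defines the actual reporting mechanism), if the 4-tuple performance vector is taken to record the fraction of rubric scores observed at each performance level, then a 4-point rubric converts by a direct tally, with no arbitrary merging of levels:

```python
from collections import Counter

# Hedged sketch: assume, for illustration only, that the 4-tuple performance
# vector records the fraction of rubric scores at each performance level
# (Expert, Practitioner, Apprentice, Novice); see reference 9 for the actual
# reporting mechanism used by the department.

def performance_vector(scores: list[int]) -> tuple[float, ...]:
    """Tally 0-3 rubric scores into per-level fractions, ordered 3 down to 0."""
    counts = Counter(scores)
    n = len(scores)
    return tuple(counts[level] / n for level in (3, 2, 1, 0))

# Example: the scores one criterion received across all teams and evaluators.
print(performance_vector([3, 2, 2, 1, 2, 3, 0, 2]))  # (0.25, 0.5, 0.125, 0.125)
```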

As a result of these observations, in the 2005-2006 academic year the rubrics were revised so that the indicators mapped in a one-to-one relationship to the performance levels of the 4-tuple performance vector. Furthermore, the number of criteria was reduced, and the descriptors for each indicator were presented in a short, bulleted format. Faculty are encouraged to circle bullets within different performance level boxes directly on the rubric to quickly indicate student performance. The final score for that category is then a reflection of all of these bullets combined. This provides immediate feedback to students as to where they did well and where they can improve. Figure 5 presents a portion of the current oral presentation rubric that illustrates the changes made to our assessment device.

Content
  3 - Expert: Addresses all specified content areas. Material abundantly supports the topic. Use of engineering terms and jargon matches audience knowledge level.
  2 - Practitioner: Addresses most content areas. Material sufficiently supports the topic. Use of engineering terms and jargon mostly matches audience knowledge level.
  1 - Apprentice: Addresses some of the content areas. Material minimally supports the topic. Use of engineering terms and jargon minimally matches audience knowledge level.
  0 - Novice: Addresses few of the content areas. Material does not support the topic. Use of engineering terms and jargon does not match audience knowledge level.

Visuals
  3 - Expert: Text is easily readable. Graphics use constantly supports the presentation. Slide composition has a professional look that enhances the presentation.
  2 - Practitioner: Text is readable. Graphics use mostly supports the presentation. Slide composition is not visually appealing, but does not detract from the presentation.
  1 - Apprentice: Text is readable with effort. Graphics use rarely supports the presentation. Slide composition sometimes distracts from the presentation.
  0 - Novice: Text is not readable. Graphics use does not support the presentation. Slide composition format is clearly distracting, obscuring the presentation.

Presentation Skills
  3 - Expert: Clearly heard and polished. Attitude indicates confidence and enthusiasm. Audience attention is constantly maintained.
  2 - Practitioner: Clearly heard but not polished. Attitude indicates confidence but not enthusiasm. Audience attention is mostly maintained.
  1 - Apprentice: Difficult to hear and/or moments of awkwardness. Attitude indicates some lack of confidence and/or disinterest in subject. Audience attention is minimally maintained.
  0 - Novice: Inaudible; several awkward pauses. Attitude indicates lack of confidence and/or disinterest in subject. Audience attention is not maintained.

Figure 5. Portion of Current Oral Presentation Rubric

V. Conclusions

The use of performance-based assessments is critical for an accurate portrayal of student performance. Rubrics provide a methodology that, when properly used, greatly assists in the evaluation of such assessments. Unfortunately, there is a great deal of misunderstanding in the engineering education community as to the use, the value, and the construction of rubrics.
In terms of use, there will be those individuals who, with rubrics in hand, will ignore the content of the descriptive performance indicators and score the students based on their own criteria; the scores provided by Prof A in Figure 2 give evidence of that. There will also be those individuals who agree that providing a rubric to students is a good thing to do, but who as faculty intrinsically know how to grade and therefore feel no need to use those same rubrics to guide them. Handling these situations requires a combination of education and patience on the part of department chairs and assessment committees. Dealing with the value question is often part of a bigger issue, which is that of faculty not buying into the use of assessment as a component of a continuous quality improvement process. If faculty support is not present to at least some degree, then it does not matter what sort of assessment instrument is in place since, regardless of its benefits, it will not be effectively used. If anything, rubrics offer a way of streamlining the assessment process, resulting in a reduction in faculty assessment workload [3].

The construction of rubrics can be a daunting task; however, once they are constructed they are relatively easy to use. Information such as that contained in this paper is of great value for developing rubrics; unfortunately, the authors learned that lesson the hard way, as our original rubrics, while better than a subjective, holistic score, could have been written better. Additionally, as part of our literature search for this paper, the authors encountered published examples of rubrics that consisted solely of criteria with nothing more than ill-defined indicator labels such as "good" and "fair" being used to differentiate between performance levels. It is the authors' contention that descriptive performance levels are a necessary component of rubrics in order to provide specific information on expectations to both students and faculty.

The incorporation of rubrics into our senior design sequence has proved to be an interesting and enlightening experience. Over time, our understanding of rubrics has grown, and as a result the department has developed a well-defined set of rubrics. The appendix to this paper contains all of the rubrics currently used in our senior design courses; these rubrics can also be downloaded from our senior design web site [11]. Furthermore, the department has accumulated a significant amount of high-quality data for use in both course and program outcomes assessment. Rubrics have been a successful mechanism for conducting the evaluation of the various aspects of the senior design experience in a uniform, objective, and equitable process, to the benefit of both students and faculty.

Bibliography

1. J. K. Hurtig and J. K. Estell, "Truly Interdisciplinary: The ONU ECCS Senior Design Experience," Proceedings of the 2005 American Society for Engineering Education Annual Conference & Exposition.
2. "Why Rubrics?" Teach-nology.com, 2005. [Online]. Available: http://www.teachnology.com/tutorials/teaching/rubrics
3. S. M. Blanchard, M. G. McCord, P. L. Mente, D. S. Lalush, C. F. Abrams, E. G. Loboa, H. T. Nagle, "Rubrics Cubed: Tying Grades to Assessment to Reduce Faculty Workloads," Proceedings of the 2004 American Society for Engineering Education Annual Conference & Exposition.
4. V. L. Young, D. Ridgeway, M. E. Prudich, D. J. Goetz, B. J. Stuart, "Criterion-Based Grading for Learning and Assessment in the Unit Operations Laboratory," Proceedings of the 2001 American Society for Engineering Education Annual Conference & Exposition.
5. "What is a Rubric?" Relearning by Design, Inc., 2000. [Online]. Available: http://www.relearning.org/resources/pdf/rubric_sampler.pdf
6. H. Goodrich, "Understanding Rubrics," Educational Leadership, vol. 54, no. 4, pp. 14-17. Available online: http://www.middleweb.com/rubricshg.html
7. J. A. Newell, K. D. Dahm, H. L. Newell, "Rubric Development and Inter-Rater Reliability Issues in Assessing Learning Outcomes," Proceedings of the 2002 American Society for Engineering Education Annual Conference & Exposition.

8. R. L. Miller, B. M. Olds, "Performance Assessment of EC-2000 Student Outcomes in the Unit Operations Laboratory," Proceedings of the 1999 American Society for Engineering Education Annual Conference & Exposition.
9. J. K. Estell, "Streamlining the Assessment Process with the Faculty Course Assessment Report," Proceedings of the Best Assessment Practices VIII Symposium, Rose-Hulman Institute of Technology, Terre Haute, IN, 2006.
10. H. G. Andrade, "Using Rubrics to Promote Thinking and Learning," Educational Leadership, vol. 57, no. 5, pp. 6-12. Available online: http://www.nycenet.edu/nr/rdonlyres/5cf749a8-d90f-4646-beaf-9DD3130EB82E/2716/AppendixC.pdf
11. http://www.onu.edu/engineering/eccs/rubrics.html

Appendix

ECCS Department - Written Report Rubric

Visual Format and Organization
  3 - Expert: The document is visually appealing and easily navigated. Appropriate typography and usage of white space are used as appropriate to separate blocks of text and add emphasis.
  2 - Practitioner: The document is organized. Use of white space and typography help the reader navigate the document, although the layout could be more effective.
  1 - Apprentice: Errors in the Table of Contents are present. Within sections, the order in which ideas are presented is occasionally confusing.
  0 - Novice: The document is not visually appealing and there are few cues to help the reader navigate the document. There is no apparent ordering of paragraphs, and thus there is no progressive flow of ideas.

Language (Word Choice, Grammar)
  3 - Expert: Sentences are complete and grammatical. They flow together easily. Words are chosen for their precise meaning. Engineering terms and jargon are used correctly. No misspelled words are present.
  2 - Practitioner: For the most part, sentences are complete and grammatical, and they flow together easily. Any errors are minor and do not distract the reader. Repetition of words and phrases is mostly avoided. For the most part, terms and jargon are used correctly with some attempt to define them. There are one or two misspelled words.
  1 - Apprentice: In a few places, errors in sentence structure and grammar distract the reader and interfere with meaning. Word choice could be improved. Occasionally, technical jargon is used without definition. There are a few misspelled words.
  0 - Novice: Errors in sentence structure and grammar frequently distract the reader and interfere with meaning. There is unnecessary repetition of the same words and phrases. There is an overuse of jargon and technical terms without definition. There are many misspelled words.

Equations, Numerical Usage, and Illustrations
  3 - Expert: All equations are clear, accurate, and labeled. All variables are defined and units specified. Discussion regarding the equation development and use has been stated. All figures, graphs, charts, and drawings are accurate, consistent with the text, and of good quality. They enhance understanding of the text. All items are labeled in accordance with engineering standards and are referred to in the text.
  2 - Practitioner: Most equations are accurate and clear. Most variables are defined and units specified. With some minor exceptions, adequate discussion regarding the equation development and usage has been stated. For the most part, illustrations are accurate, consistent with the text, and of good quality. All items are generally labeled in accordance with engineering standards and are referred to in the text.
  1 - Apprentice: Most equations are accurate. Too many variables are not defined. Discussion regarding the development and usage of the equation is unclear. In some cases, illustrations are not conveying information clearly. While items are labeled, references to these items are missing.
  0 - Novice: There may be inaccuracies within the equation. Little or no attempt is made to make it easy for the reader to understand the use of an equation or its derivation. Figures, graphs, charts, and drawings are of poor quality, have numerous inaccuracies and mislabeling, or may be missing. There is no corresponding explanatory text for included items.

Use of references
  3 - Expert: Prior work is acknowledged by referring to sources for theories, assumptions, quotations, and findings. References are exact with author, journal, volume number, page number, and year.
  2 - Practitioner: With an occasional oversight, prior work is acknowledged by referring to sources for theories, assumptions, quotations, and findings. With some minor exceptions, references are exact with author, journal, volume number, page number, and year.
  1 - Apprentice: On several instances, references are not stated when appropriate. Bibliographical entries are not complete.
  0 - Novice: Little attempt is made to acknowledge the work of others. Most references that are included are inaccurate or unclear.

Use of appendices
  3 - Expert: Information is placed appropriately in either the main text or an appendix. Appendices are documented and referenced in the text.
  2 - Practitioner: Appendices are used when appropriate. Selection and/or extent of material in appendix may not be optimal.
  1 - Apprentice: While appendices are present, material in appendix is not referred to properly in text. Content in appendix is not complete.
  0 - Novice: Appendices were not utilized when appropriate. There is unnecessary inclusion of detailed information in the main body of the text.

ECCS Department - Technical Design Rubric

Identification of the problem
  3 - Expert: The problem has been shown (not just stated) to exist with supporting factual evidence.
  2 - Practitioner: A problem statement has been stated.
  1 - Apprentice: The problem statement has weak support.
  0 - Novice: Problem has not been stated clearly and lacks any supporting evidence.

Research and information gathering
  3 - Expert: Existing solutions to the problem, including their good and bad points, have been stated.
  2 - Practitioner: Existing solutions have been stated. Additional discussion may be warranted in places.
  1 - Apprentice: A complete review of existing solutions and research related to this problem is not presented. Connection between references and what is written is not clear.
  0 - Novice: Little investigation has been done.

Definition of the project
  3 - Expert: There are clear expectations of the specific outputs or deliverables for the project. A set of measurable performance requirements has been created.
  2 - Practitioner: Expectations have been stated. Some objectives may not be measurable.
  1 - Apprentice: Expectations have been stated. Most objectives are not measurable.
  0 - Novice: Expectations are not clear. Expectations are not measurable.

Development of a plan
  3 - Expert: A system block diagram has been developed to assist the team in solving the design. All blocks have been broken down to a manageable level.
  2 - Practitioner: A system block diagram has been developed to assist the team in solving the design. Not all blocks have been broken down to a manageable level.
  1 - Apprentice: A system block diagram has not been fully developed. A few blocks have been broken down.
  0 - Novice: A system block diagram has not been fully developed. The problem has not been divided into manageable tasks and blocks.

Execution of the plan
  3 - Expert: All major points of the project were completed.
  2 - Practitioner: Most major project points were accomplished.
  1 - Apprentice: Few of the major project points were accomplished.
  0 - Novice: None of the major project points were accomplished.

Verification of the design
  3 - Expert: The prototype has been tested against the performance requirements listed in the definition of the project.
  2 - Practitioner: The prototype has not been fully developed or tested.
  1 - Apprentice: Little verification of design was accomplished.
  0 - Novice: No verification of design was accomplished.

Project Scheduling
  3 - Expert: A plan stating the cost, completion date, and required resources has been presented. Gantt charts and a budget spreadsheet have been generated.
  2 - Practitioner: Some aspects of the plan have not been fully developed.
  1 - Apprentice: Few aspects of the plan have been developed.
  0 - Novice: Lack of planning is evident.

Technical level of project
  3 - Expert: A significant portion of this project involves technical information outside the scope of the undergraduate curriculum. Several technical aspects were new to the students and required research.
  2 - Practitioner: This project contains some research but mostly involves technical information taught at the junior and senior levels.
  0 - Novice: This project did not challenge the students to perform much research, as it relied mainly on information taught within the curriculum.

ECCS Department - Realistic Constraints Rubric

For All Realistic Constraints:
  3 - Expert: Analysis correctly reasons how the design is constrained and provides sufficient, in-depth discussion in a clear and easy-to-follow manner.
  2 - Practitioner: Analysis correctly reasons how the design is constrained but provides only a cursory discussion.
  1 - Apprentice: Analysis contains a mixture of correct and incorrect reasoning as to how the design is constrained.
  0 - Novice: Analysis incorrectly reasons how the design is constrained.

Each constraint is marked as applicable (Yes/No) and, if applicable, scored 3-2-1-0 using its pertinent questions:

Economic: What are the development costs? What are the production costs? What are the operational costs?
Environmental: What valuable resources are being used? Is pollution being produced as a result of production and/or use of the product?
Sustainability: To what degree over time will the product be useful and viable? Can the resources associated with the product be used effectively in a sustainable economy?
Manufacturability: Can the product be built? How can the product be designed to eliminate manufacturing errors? How can the product be designed to minimize manufacturing costs?
Ethical: Are there any foreseen potential conflicts with a profession's Code of Ethics arising from the development or use of the product?
Health and Safety: Are there any laws that determine how safe this product has to be? Are there related ethical issues? Are there relevant health effects that are affected by this product?
Social: What anticipated impacts will the product have within a community?
Political: Is the project being conducted for a governmental entity? Is the project being regulated, or being proposed for regulation, by a governmental entity?

ECCS Department - Oral Presentation Rubric

Content
  3 - Expert: Addresses all specified content areas. Material abundantly supports the topic. Use of engineering terms and jargon matches audience knowledge level.
  2 - Practitioner: Addresses most content areas. Material sufficiently supports the topic. Use of engineering terms and jargon mostly matches audience knowledge level.
  1 - Apprentice: Addresses some of the content areas. Material minimally supports the topic. Use of engineering terms and jargon minimally matches audience knowledge level.
  0 - Novice: Addresses few of the content areas. Material does not support the topic. Use of engineering terms and jargon does not match audience knowledge level.

Visuals
  3 - Expert: Text is easily readable. Graphics use constantly supports the presentation. Slide composition has a professional look that enhances the presentation.
  2 - Practitioner: Text is readable. Graphics use mostly supports the presentation. Slide composition is not visually appealing, but does not detract from the presentation.
  1 - Apprentice: Text is readable with effort. Graphics use rarely supports the presentation. Slide composition sometimes distracts from the presentation.
  0 - Novice: Text is not readable. Graphics use does not support the presentation. Slide composition format is clearly distracting, obscuring the presentation.

Presentation Skills
  3 - Expert: Clearly heard and polished. Attitude indicates confidence and enthusiasm. Audience attention is constantly maintained.
  2 - Practitioner: Clearly heard but not polished. Attitude indicates confidence but not enthusiasm. Audience attention is mostly maintained.
  1 - Apprentice: Difficult to hear and/or moments of awkwardness. Attitude indicates some lack of confidence and/or disinterest in subject. Audience attention is minimally maintained.
  0 - Novice: Inaudible; several awkward pauses. Attitude indicates lack of confidence and/or disinterest in subject. Audience attention is not maintained.

Organization
  3 - Expert: Information is presented in a logical and interesting sequence. The audience can easily follow the presentation.
  2 - Practitioner: Information is presented in a logical sequence. The audience can follow the presentation.
  1 - Apprentice: Information is not always presented in a logical sequence. The audience has difficulty following the presentation.
  0 - Novice: Information is not presented in a logical sequence; the audience cannot understand the presentation.

Handling of Questions
  3 - Expert: Demonstrates full knowledge of the material; can explain and elaborate on expected questions.
  2 - Practitioner: Demonstrates sufficient knowledge of the material to answer expected questions.
  1 - Apprentice: Demonstrates difficulty answering expected questions beyond a rudimentary level.
  0 - Novice: Demonstrates an inability to answer expected questions.

Grading Rubrics for Senior Design, Winter Quarter

Weekly Status Report (15%)
  3 - Excellent: All weekly status reports turned in on time; reports accurately reflect and summarize project status.
  2 - Adequate: All weekly status reports turned in, with no report being more than a few days late; a handful of reports do not fully reflect or summarize project status, but contain sufficient information for independent judgments to be made regarding progress on the project.
  1 - Minimal: 1 status report not turned in, or no more than 2 turned in more than a week late; some reports do not provide sufficient information for independent judgments to be made regarding progress.
  0 - Unsatisfactory: 2 or more status reports not turned in, or more than 2 reports turned in more than a week late; the majority of reports do not provide sufficient information for independent judgments to be made regarding progress.

Weekly Meeting Attendance (15%)
  3 - Excellent: Student attended and participated in all scheduled meetings.
  2 - Adequate: Student attended all meetings, but did not fully participate in 1 or 2 of the meetings or was unprepared for 1 meeting.
  1 - Minimal: Student missed 1 meeting; did not fully participate in more than 2 meetings or was unprepared for more than 1 meeting.
  0 - Unsatisfactory: Student missed 2 or more meetings; student was unprepared for more than 2 meetings or did not fully participate in a majority of the meetings.

Maintaining Lab Books (15%)
  3 - Excellent: Lab books accurately reflect what each student worked on and the time spent on each activity.
  2 - Adequate: Lab books accurately reflect activities, but have occasional lapses of information about who did what or how much time was spent.
  1 - Minimal: Lab books sporadically kept, or are inaccurate, or are not up to date when spot-checked.
  0 - Unsatisfactory: No lab book.

Peer-Peer Evaluation (30%)
  3 - Excellent: Students get along well; Team Charter successfully used in cases of conflict. Students write a self-assessment on their performance for each evaluation. Group is fully functional. Student receives a high evaluation from peers.
  2 - Adequate: Students are able to get along; Team Charter was referred to in cases of conflict and conflicts are for the most part resolved. Group is able to function. Student receives an acceptable (positive) evaluation from peers. Self-assessment not always performed. Or: members of the group tolerate each other; group is able to function, but without full participation from all of its members; student receives a positive evaluation from peers.
  1 - Minimal: Members of the group tolerate each other. Group is able to function, but without full participation from all of its members. Student does not receive a positive evaluation from peers. Or: group is dysfunctional and the project suffers from unresolved conflict, but the student's score is dramatically higher than peers.
  0 - Unsatisfactory: Group is dysfunctional. Project suffers from unresolved conflict. Student score is equal to or dramatically lower than peers.

Written/Oral Progress Report (25%)
  3 - Excellent: Conference-quality paper or presentation which gives a clear and accurate picture of the standing of the project.
  2 - Adequate: Paper or presentation which gives a mostly clear and accurate picture of the standing of the project; some minor questions might be left in the mind of an independent reviewer.
  1 - Minimal: Paper or presentation which provides only a partially clear and accurate picture of the standing of the project; major questions are left in the mind of an independent reviewer.
  0 - Unsatisfactory: Paper or presentation that does not provide a clear and accurate picture of the standing of the project; leaves significant questions in the mind of an independent reviewer, or has serious inaccuracies.