College of EMS ABET Process Workshop
Student-Centered Assessment: Know Our Students, Know Ourselves
Janel Miller, College of EMS
Assessment Showcase 2017: 17 May 2017
University of Wisconsin-Platteville
The objective of this session is to enhance understanding of, and promote engagement in, the program's ABET assessment and evaluation processes:
1. Understand ABET's terminology and expectations and your role in the process
2. Articulate student performance expectations and link them to course assessments
3. Enhance your toolkit for identifying problems and implementing effective improvements
ABET terminology defines key accreditation criteria:
1. Students
2. Program Educational Objectives (PEOs)
3. Student Outcomes (Outcomes)
4. Continuous Improvement
5. Curriculum
6. Faculty
7. Facilities
8. Institutional Support
plus applicable Program Criteria

Key definitions:
- Program Educational Objectives (PEOs): broad statements of what graduates are expected to attain within 3-5 years after graduation, established with input from employers and graduate schools
- Student Outcomes (Outcomes): what students are expected to know and be able to do by graduation (knowledge, skills, behaviors)
- Performance Indicators: specific, measurable statements that describe the performance required to achieve a student outcome. What do we look for to have confidence that students have attained the Student Outcome?

ABET, Assessment: Choosing Assessment Methods Webinar, http://www.abet.org/accreditation/get-accredited-2/assessment-planning/
ABET's criteria drive a program-level perspective in course design, focused on student attainment of prescribed outcomes.

Program level (flowing from the University Mission):
- Program Educational Objectives: "In 3-5 years, graduates will..."
- Student Outcomes: "By graduation, students will..."
- Performance Indicators: "I know students have attained the outcome when..."

Course level:
- Course Learning Objectives: "By the end of this course, students will..."
- Unit Learning Objectives: "By the end of this unit, students will..."
- Assessment Methods: "I know students have achieved the learning objectives because they will successfully..."
- Instructional Materials and Course Activities

Performance indicators link Student Outcomes to course objectives, content, and assessments.
ABET's Student Outcomes are broadly defined and apply to all engineering programs:
(a) an ability to apply knowledge of mathematics, science, and engineering
(b) an ability to design and conduct experiments, as well as to analyze and interpret data
(c) an ability to design a system, component, or process to meet desired needs within realistic constraints such as economic, environmental, social, political, ethical, health and safety, manufacturability, and sustainability
(d) an ability to function on multidisciplinary teams
(e) an ability to identify, formulate, and solve engineering problems
(f) an understanding of professional and ethical responsibility
(g) an ability to communicate effectively
(h) the broad education necessary to understand the impact of engineering solutions in a global, economic, environmental, and societal context
(i) a recognition of the need for, and an ability to engage in, life-long learning
(j) a knowledge of contemporary issues
(k) an ability to use the techniques, skills, and modern engineering tools necessary for engineering practice
ABET's assessment process drives continuous improvement so graduates are able to achieve PEOs in the workplace.
- Each program's Assessment Plan defines an ongoing process of assessment and evaluation to identify program strengths and opportunities for improvement
- The goal is to enhance student learning through improvements to courses, curriculum, program, and processes

Assessment and evaluation cycle (under the University Mission, Program Educational Objectives, and Student Outcomes):
1. Identify performance indicators
2. Identify assessment methods
3. Conduct assessments
4. Collect and assemble data (Assessment)
5. Compile and analyze data
6. Interpret results (Evaluation)
7. Identify needed improvements
8. Implement improvements
ABET's assessment process drives continuous improvement so graduates are able to achieve PEOs in the workplace.
- The Assessment Plan describes the activities and responsibilities within these process steps
- Evaluation determines how well Student Outcomes are attained in the program
[Assessment and evaluation cycle diagram repeated]
The program's Assessment Plan should focus on assessing student learning efficiently. The Assessment Plan:
- Is the heart of Criterion 4: Continuous Improvement
- Focuses on student learning by assessing student achievement defined by common Student Outcomes; it assesses the program, not each student
- Describes processes to identify, collect, and prepare data to evaluate the attainment of student outcomes, and to identify and implement improvements in the program
- Defines processes that need to be "sustainable with a manageable workload... provide meaningful, useful information to guide improvements in student learning in the program"

ABET Self-Study template, Criterion 4, http://www.abet.org/accreditation/self-study-templates/
Performance indicators are specific, measurable statements that describe the performance required to achieve each student outcome. What do students do and produce that demonstrates attainment of a Student Outcome? Examples:
- Organize, prepare, and deliver a clear, complete, logical presentation
- Estimate the cost to produce a product
- Apply ethical principles and the Code of Ethics to evaluate an ethical dilemma
- Formulate and solve differential equations to describe material behavior under complex loading
[Assessment and evaluation cycle diagram repeated]

ABET, Assessment: Choosing Assessment Methods Webinar, http://www.abet.org/accreditation/get-accredited-2/assessment-planning/
Performance indicators reflect the level of student understanding and performance, using an appropriate action verb plus content in instruction:

Level      | Action verbs                                       | Instructional emphasis
Create     | Assemble, compose, construct, formulate, propose   | Emphasize
Evaluate   | Assess, critique, evaluate, justify, validate      | Emphasize
Analyze    | Analyze, characterize, compare, contrast, research | Reinforce
Apply      | Apply, demonstrate, illustrate, implement, model   | Reinforce
Understand | Associate, explain, interpret, summarize           | Introduce
Remember   | Define, identify, list, reproduce                  | Introduce

Example: "Estimate the cost to produce a product" (action verb + content in instruction)

ABET, Assessment: Choosing Assessment Methods Webinar, http://www.abet.org/accreditation/get-accredited-2/assessment-planning/
ABET, Assessment: Developing Rubrics Webinar, http://www.abet.org/accreditation/get-accredited-2/assessment-planning/
TeachOnline@UW, Bloom's Taxonomy, Plan and Design Unit 2: Learning Objectives and Alignment, Fall 2015
Workgroup session #1: Writing Performance Indicators
For a course you teach:
1. Examine the syllabus and your program's PEOs and Student Outcomes. Does the syllabus specify Student Outcomes that apply to the course? If not, identify two or three Student Outcomes that apply.
2. Write at least one Performance Indicator for two or three Student Outcomes
3. Share and discuss what makes a good Performance Indicator

Example - Student Outcome (g): Ability to communicate effectively
Performance Indicators (with columns for Assessment, Type (D or I), and Scoring and expected attainment left to complete):
- Organize, prepare, and deliver a clear, complete, logical presentation of technical information
- Organize and compose a coherent technical report appropriate for a multi-disciplinary audience
- Demonstrate confidence in their ability to communicate effectively

Remember: Performance Indicators are specific, measurable statements that describe the performance required to achieve a Student Outcome; they are worded at the program level rather than the course level, using an action verb plus content in instruction.
Assessment methods provide evidence of student knowledge, skills, and behaviors to achieve performance indicators and student outcomes.
- How will students demonstrate their ability to perform at an acceptable level?
- How do we define "acceptable"?
[Assessment and evaluation cycle diagram repeated]

G. Rodgers, Do Grades Make the Grade for Program Assessment?, Assessment Tips with Gloria Rodgers, http://www.abet.org/accreditation/get-accredited-2/assessment-planning/
A robust assessment plan includes both direct and indirect assessments.

Direct assessments directly examine or observe student knowledge or skills:
- Standardized exams, e.g., certification exams, FE exam
- Local assessments, e.g., projects, tests, homework, presentations
- Portfolios to showcase best work or growth in learning over time
- Simulations approximating the real world
They demonstrate what students actually know and can do, and provide strong evidence of learning.

Indirect assessments measure the perceived extent or value of learning (opinions):
- Student surveys: what students think they know and can do
- Exit interviews/surveys: can be worded to include elements of direct measures, e.g., lifelong learning
- Focus groups: small-group discussions with a trained moderator
They sample students' perceptions about their own learning, or how learning is valued by themselves or employers. They are weaker than direct measures because they are based on opinions and perspectives that may not be realistic, and are used to supplement direct assessments.

Direct and Indirect Assessment, Assessment Tips with Gloria Rodgers, http://www.abet.org/accreditation/get-accredited-2/assessment-planning/
ABET, Assessment: Choosing Assessment Methods Webinar, http://www.abet.org/accreditation/get-accredited-2/assessment-planning/
A multi-method, multi-source approach maximizes validity and reduces bias.

Example - Student Outcome: Ability to communicate effectively
Assessments:
- Student work (D): Report
- Student work (D): Presentation
- Survey (I): Student course survey or graduate exit survey
Associated Performance Indicators:
- Organize, prepare, and deliver a clear, complete, logical presentation
- Organize and compose a coherent report appropriate for a multi-disciplinary audience
- Demonstrate confidence in their ability to communicate effectively

Credit: L. Grossenbacher, Director of Undergraduate Program Review, College of Engineering, UW-Madison
When choosing assessment methods, look for the best fit for program needs, satisfactory validity, usefulness of data, and affordability:
- Relevance: Does the method measure the student outcome as directly as possible?
- Accuracy: Does the method measure the student outcome as correctly as possible?
- Utility: Does the method provide useful data to evaluate the program and identify improvements?
- Affordability: Is the time, effort, and cost of the method acceptable?

Discussion questions:
- Should course grades be used to assess attainment of student outcomes?
- Should final exams be used to assess attainment of student outcomes?
- Do we include every assignment in the Assessment Plan?
- Must we assess every student on every assessment included in the plan?

ABET, Assessment: Choosing Assessment Methods Webinar, http://www.abet.org/accreditation/get-accredited-2/assessment-planning/
Workgroup session #2: Identifying assessment methods
For each Performance Indicator you wrote, list the assessment in your course that best indicates the desired student performance at the program level.
- Consider relevance, accuracy, utility, and affordability
- Consider both direct methods (e.g., quiz, exam question, report, presentation) and indirect methods (e.g., student survey, project mentor survey)

Example - Student Outcome (g): Ability to communicate effectively

Performance Indicator | Assessment | Type (D or I) | Scoring and expected attainment
Organize, prepare, and deliver a clear, complete, logical presentation of technical information | Water treatment plant conceptual design presentation | D |
Organize and compose a coherent technical report appropriate for a multi-disciplinary audience | Water quality indicators report | D |
Demonstrate confidence in their ability to communicate effectively | Student survey | I |
A robust assessment plan employs tools that enable a consistent, objective assessment of student work.
- What is the best way to articulate expectations and score assessments to determine student performance?
[Assessment and evaluation cycle diagram repeated]

G. Rodgers, Do Grades Make the Grade for Program Assessment?, Assessment Tips with Gloria Rodgers, http://www.abet.org/accreditation/get-accredited-2/assessment-planning/
Rubrics articulate expectations for student performance and are a way of scoring direct assessments.

Analytic rubrics: raters evaluate performance on each dimension or competency, with descriptions of each competency at each level of performance (1 Unsatisfactory, 2 Developing, 3 Proficient, 4 Exemplary):
- Take time to create and use
- Tend to provide detail on where and how to improve the program
- Work for multi-dimensional assessments or complicated skills

Holistic rubrics: raters judge the best fit of an overall impression of performance against a description of performance and behaviors at each level:
- Work for one-dimensional assessments
- Easier to write and apply, but require more judgment if performance spans levels
- Provide less detail on where and how to improve the program than analytic rubrics

ABET, Assessment: Developing Rubrics Webinar, http://www.abet.org/accreditation/get-accredited-2/assessment-planning/
Rubrics require planning to provide accurate, useful data:
1. Decide rubric type and application
   - Type: analytic, holistic, or a variation
   - Application: generic (applied across the program, e.g., problem solving, communication) or task-specific (designed for a single task, e.g., a specific exam question or homework problem)
2. Design the rubric
   - Use student work to guide the descriptions
   - Populate the extremes of the scale first, then the middle
3. Pilot the rubric
   - Test the rubric by scoring samples of student work
   - Identify flaws in the rubric and calibrate raters
4. Implement the rubric in courses
5. Compile and analyze results
   - Analyze by course and section
   - Identify causes of inconsistencies and determine improvements

ABET, Assessment: Developing Rubrics Webinar, http://www.abet.org/accreditation/get-accredited-2/assessment-planning/
Workgroup session #3: Scoring assessments and expected attainment
For each assessment, identify how you will score the assessment objectively and the expected attainment that indicates adequate student performance.

Example - Student Outcome (g): Ability to communicate effectively

Performance Indicator | Assessment | Type (D or I) | Scoring and expected attainment
Organize, prepare, and deliver a clear, complete, logical presentation of technical information | Water treatment plant conceptual design presentation | D | Presentation rubric; 70% earn 3 or 4
Organize and compose a coherent technical report appropriate for a multi-disciplinary audience | Water quality indicators report | D | Writing competency of lab report rubric; 70% earn 3 or 4
Demonstrate confidence in their ability to communicate effectively | Student survey | I | Graduate survey Q12; 80% rate 4 or 5
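The attainment check in the table above is simple arithmetic, and programs often script it when compiling results across sections. The sketch below is a hypothetical illustration, not part of any program's actual Assessment Plan: the function name, the sample scores, and the 70% target are all made up for the example.

```python
# Hypothetical sketch: compute attainment for one performance indicator
# from rubric scores (1-4 scale) against a target such as "70% earn 3 or 4".
# All names and data here are illustrative.

def attainment(scores, passing_levels=(3, 4)):
    """Return the fraction of students scoring at a passing rubric level."""
    if not scores:
        raise ValueError("no scores to evaluate")
    return sum(1 for s in scores if s in passing_levels) / len(scores)

# Example: rubric scores for the presentation assessment (fabricated data)
presentation_scores = [4, 3, 3, 2, 4, 3, 1, 4, 3, 3]

rate = attainment(presentation_scores)  # 8 of 10 students earned 3 or 4
target = 0.70                           # "70% earn 3 or 4"
status = "met" if rate >= target else "not met"
print(f"Attainment: {rate:.0%} (target {target:.0%}) -> {status}")
# -> Attainment: 80% (target 70%) -> met
```

The same function works for the indirect survey row by passing `passing_levels=(4, 5)` against an 80% target; only the scores, levels, and threshold change.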
ABET's Criterion 4 (Continuous Improvement) requires a description of how the program uses results to identify and implement improvements.
- What are our program's strengths? What problems have we identified? How can we improve?
- The goal is to enhance students' learning in the program through continuous improvements to courses, curriculum, and program
[Assessment and evaluation cycle diagram repeated]

G. Rodgers, Do Grades Make the Grade for Program Assessment?, Assessment Tips with Gloria Rodgers, http://www.abet.org/accreditation/get-accredited-2/assessment-planning/
Instructors have the responsibility to conduct assessments, interpret results, and identify and implement improvements within their courses.
- How well did the students achieve the level of performance in my course?
- What problems did I identify?
- How can I improve my course to improve students' learning?
[Assessment and evaluation cycle diagram repeated]

G. Rodgers, Do Grades Make the Grade for Program Assessment?, Assessment Tips with Gloria Rodgers, http://www.abet.org/accreditation/get-accredited-2/assessment-planning/
Evaluation of results often reveals shortcomings and problems with the program, curriculum, and courses. An example:
- Performance Indicator: Students organize, prepare, and deliver a clear, complete, logical presentation
- Assignment/rubric: Clear, complete, logical presentation on a relevant controversial topic, scored on a 1-5 scale
- Result: Students summarize facts clearly ("Facts about my topic"), but fail to synthesize and interpret information to draw persuasive conclusions about the topic
Asking "5 Whys?" helps to peel away the layers of symptoms and uncover the root cause(s) of a problem:
- The first layer is usually a symptom, not a cause
- The answer to each "Why?" helps to reveal relationships between symptoms and causes
- Answers to four to six "Whys?" usually expose the root cause

EBA, 3 Steps to Using 5 Whys Problem Solving and Finally Eradicating those Pesky Problems, 2016, http://www.educational-business-articles.com/5-whys/
Charles Duhigg, How Asking 5 Questions Allowed Me to Eat Dinner With My Kids, The New York Times, March 10, 2016, https://well.blogs.nytimes.com/2016/03/10/how-asking-5-questions-allowed-me-to-eat-dinner-with-my-kids/
What are the root cause(s) of the performance the students demonstrated?

Result: Students summarize facts clearly ("Facts about my topic"), but fail to synthesize and interpret information to draw persuasive conclusions.
- Why 1? Students didn't seem to know that a persuasive presentation was required
- Why 2? (1) Expectations of the assignment were unclear; (2) how to develop a persuasive presentation was not covered adequately in class
- Why 3? The assignment focused on clarity, completeness, and logic
- Why 4? (1) The assignment didn't specifically require students to demonstrate persuasion; (2) class discussions focused on research tactics and slide design
- Why 5? The performance indicator emphasizes clarity, completeness, and logic, but not persuasion; it doesn't describe the desired skills and behaviors
What kind of improvements should we make?

Revised Performance Indicator: Students organize, prepare, and deliver a clear, complete, logical, and persuasive presentation ("Intro, facts, interpretation, conclusions!")

Solution: fix the disconnect between performance indicator, instructional materials, assignment, and rubric:
- Revise the performance indicator to define the required skills and behaviors
- Revise the assignment description to reflect the expected performance
- Verify that the rubric is aligned with the assignment description and performance indicator
- Revise instructional content to reinforce the required skills and behaviors

Close the loop: verify the effectiveness of the solution in subsequent assessment cycles.
Workgroup session #4: Determining root causes and solutions
Think about a course you teach and the results of some of your assessments (e.g., direct assessments and course surveys). Where results suggest a problem or shortcoming in your course:
- Write the problem statement
- Ask 4 to 6 "Whys?"; write responses to each "Why?" to identify root causes
- List possible solutions in terms of what you can do

Remember: the first layer is usually a symptom, not a cause; answers to four to six "Whys?" usually expose the root cause.
The College of EMS is entering the final year of ABET's 6-year reaccreditation cycle and is preparing for the Fall 2018 site visit.

Timeline (AY 2012-2013 through AY 2017-2018):
- AY 2012-2013: 2012 ABET site visit; 7-day response; ABET's Draft Statement; program responds to Draft Statement; ABET's Final Statement; August: ABET informs programs of Final Action; October: ABET releases Final Action to the public
- AY 2012-2013 through AY 2016-2017: ongoing assessment and evaluation, annual program reviews, and periodic review of PEOs
- AY 2017-2018 (we are here): 2018 Self-Study preparation
  - December: facility walk-through and safety checks
  - January 31: RFE due to ABET
  - July 1: final Self-Study due
  - Display room preparation
  - Fall 2018: ABET site visit
Everyone has an important role in the program's ABET assessment and evaluation processes:
- Know and use ABET's terminology: Criteria, Program Educational Objectives, Student Outcomes, Performance Indicators
- Know which Student Outcomes are assessed in your course
- Know, and have input into, the assessment plan for your course: assessments, frequency, scoring
- Act on your responsibility and authority to implement improvements
- Provide input on strengths and opportunities for improvement
- Follow through on responsibility and requests to collect assessment data and samples of student work for your course
- Close the loop to verify the effectiveness of improvements