Developing an Effective Program Assessment Plan or What Do They Want?

Developing an Effective Program Assessment Plan or What Do They Want? Bill Knight Academic Assessment and Institutional Research

The Three Basic Steps of Assessment
- Goals. What do we want students to be able to do when they complete our program?
- Information. How well are students achieving these goals, and what factors influence their learning?
- Action. How can we use the information to improve student learning?
(Walvoord, 2010)

Adding Value with Two Additional Steps
- Goals. What do we want students to be able to do when they complete our courses of study?
- Curriculum. Where do students have the opportunity for learning?
- Information. How well are students achieving these goals, and what factors influence their learning?
- Expectations. What is the expected level of performance? (now required by HLC)
- Action. How can we use the information to improve student learning?
(Maki, 2004; Walvoord, 2010)

Where and When Do Our Students Learn This?
EXCERPT FROM A HYPOTHETICAL BIOLOGY PROGRAM CURRICULUM MATRIX
Key: I = introduced; R = reinforced and opportunity to practice; M = mastery at the senior or exit level; A = assessment evidence collected

Intended student learning outcomes:
1. Apply the scientific method
2. Develop laboratory techniques
3. Diagram and explain major cellular processes
4. Awareness of careers and job opportunities in the biological sciences

Courses:
- BIOL 101: I, I, I
- BIOL 202: R, R, I
- BIOL 303: R; M, A; R
- BIOL 404: M, A; M, A; R
- Other (exit interview): A

Retrieved from http://manoa.hawaii.edu/assessment/howto/mapping.htm
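For programs that keep their curriculum map in a spreadsheet or script rather than a table, the matrix above can be represented as a simple data structure and queried for gaps. This is an illustrative sketch only, not part of the original slides; the specific course-to-cell assignments and the function name below are hypothetical.

```python
# Illustrative sketch of a curriculum map as a data structure.
# Codes follow the key above: I = introduced, R = reinforced,
# M = mastery, A = assessment evidence collected.
# The cell assignments here are hypothetical examples.

curriculum_map = {
    "Apply the scientific method": {"BIOL 101": "I", "BIOL 202": "R", "BIOL 404": "M,A"},
    "Develop laboratory techniques": {"BIOL 101": "I", "BIOL 303": "M,A"},
    "Diagram and explain major cellular processes": {"BIOL 101": "I", "BIOL 202": "R", "BIOL 404": "R"},
    "Awareness of careers and job opportunities": {"BIOL 202": "I", "Exit interview": "A"},
}

def outcomes_without_evidence(cmap):
    """List outcomes for which no course or activity collects assessment evidence (no 'A')."""
    return [outcome for outcome, cells in cmap.items()
            if not any("A" in code for code in cells.values())]

# In this hypothetical map, the cellular-processes outcome is taught
# but never assessed, so the check surfaces it as a gap.
print(outcomes_without_evidence(curriculum_map))
# ['Diagram and explain major cellular processes']
```

A check like this makes the "A" column of the matrix actionable: any outcome it returns is one where the program teaches but never collects evidence.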

The Basic No-Frills Department Assessment System (Walvoord, 2010)
- Learning goals for each degree program, co-curricular program, etc.
- Two measures of how well your students are achieving each goal:
  - One direct measure (e.g., student work samples near the time of graduation)
  - One indirect measure (e.g., surveys, interviews, or focus groups that ask students how well they feel they achieved each learning goal, what aspects of their program helped them achieve those goals, and what the department might do differently to help students learn more effectively)
- One two-hour department meeting per year in which assessment results are discussed, at least one follow-up action to improve student learning is agreed upon, and meeting notes are kept

What do we mean by direct and indirect measures?

Examples of Direct Measures of Student Learning
- Ratings of student skills by their field experience supervisors
- Scores and pass rates on appropriate licensure or certification exams
- Capstone experiences, such as research projects, presentations, theses, dissertations, oral defenses, exhibitions, and performances, scored using a rubric
- Other written work, performances, and presentations, scored using a rubric
- Portfolios of student work

Examples of Direct Measures of Student Learning
- Scores on locally designed multiple-choice or essay tests, such as final examinations in key courses, qualifying examinations, and comprehensive examinations
- Score gains (referred to as "value added") between entry and exit on published or local tests or writing samples
- Observations of student behavior (such as presentations and group discussions), undertaken systematically and with notes recorded systematically
- Summaries and assessments of electronic class discussion threads

Examples of Direct Measures of Student Learning
- Think-alouds, which ask students to think aloud as they work on a problem or assignment
- Classroom response systems ("clickers") that allow students in their classroom seats to instantly answer questions posed by the instructor, providing an immediate picture of student understanding
- Feedback from computer-simulated tasks, such as information on patterns of action, decisions, and branches
- Student reflections on their values, attitudes, and beliefs, if developing those is an intended outcome of the program
(Suskie, 2009)

Examples of Indirect Measures of Student Learning
- Course grades and grade distributions
- Assignment grades, if not accompanied by a rubric or scoring criteria
- Retention and graduation rates
- Admission rates into graduate programs and graduation rates from those programs
- Scores on tests required for further study (such as the GRE) that evaluate skills learned over a lifetime
- Quality and reputation of graduate programs into which alumni are accepted

Examples of Indirect Measures of Student Learning
- Placement rates of graduates into appropriate career positions, and starting salaries
- Alumni perceptions of their career responsibilities and satisfaction
- Students' self-reports of their knowledge and skills, and reflections on what they have learned over the course of their program
- Questions on end-of-course student evaluation forms that ask about the course rather than the instructor
- Student, alumni, and employer satisfaction with learning, collected through surveys, exit interviews, or focus groups

Examples of Indirect Measures of Student Learning
- Student participation rates in faculty research, publications, and conference presentations
- Honors, awards, and scholarships earned by students and alumni
(Suskie, 2009)

Using Samples of Student Work for Assessment: Advantages
- Information is already available
- No student motivation problems, since students must complete the work for a grade
- No direct cost
- Reflects what faculty actually teach, not what is included on standardized tests, so faculty members are more motivated

Using Samples of Student Work for Assessment: Disadvantages
- Evidence is not comparable across institutions
- Everyone evaluates differently, so common standards or rubrics and training are needed
- Information is in multiple parts and multiple formats, so it needs to be collected in portfolios
- Quite a bit of work, especially at the beginning
(Walvoord, 2010)

Developing a Rubric
1. Clearly define the assignment, including the topic, the process students will work through, and the product they are expected to produce.
2. Brainstorm a list of what you expect to see in the student work that demonstrates the particular learning outcome(s) you are assessing. Keep the list manageable (3-8 items) and focus on the most important abilities, knowledge, or attitudes expected.
3. Edit the list so that each component is specific and concrete (for instance, what do you mean by "coherence"?); use action verbs when possible, and descriptive, meaningful adjectives (e.g., not "adequate" or "appropriate" but "correctly" or "carefully").

Developing a Rubric
4. Establish clear and detailed standards for performance for each component. Avoid relying on comparative language when distinguishing among performance levels; for instance, do not define the highest level as "thorough" and the medium level as "less thorough". Find descriptors that are unique to each level.
5. Develop a scoring scale.
6. Test the rubric with more than one rater by scoring a small sample of student work. Are your expectations too high or too low? Are some items difficult to rate and in need of revision?
(University of Virginia Office of Institutional Assessment & Studies, n.d.)

Using a Rubric
- Evaluators should meet together for a training/norming session, in which a sample of student work is examined and scored.
- More than one faculty member should score the student work.
- Check to see whether raters are applying the standards consistently. If two faculty members disagree significantly (e.g., by more than 1 point on a 4-point scale), a third person should score the work.
- If frequent disagreements arise about a particular item, the item may need to be refined or removed.
(University of Virginia Office of Institutional Assessment & Studies, n.d.)
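The disagreement rule above is mechanical enough to automate when rubric scores are collected electronically. The following is a minimal sketch, not part of the original slides; the function name, threshold default, and sample scores are all invented for illustration.

```python
# Minimal sketch: flag rubric items where two raters disagree by more
# than 1 point on a 4-point scale, so a third rater can adjudicate.
# All names and numbers here are illustrative, not from the slides.

def needs_third_rater(scores_a, scores_b, threshold=1):
    """Return the indices of rubric items where |a - b| > threshold."""
    return [i for i, (a, b) in enumerate(zip(scores_a, scores_b))
            if abs(a - b) > threshold]

# Two raters score the same student work on a five-item, 4-point rubric.
rater_a = [4, 3, 2, 4, 1]
rater_b = [3, 3, 4, 4, 3]

flagged = needs_third_rater(rater_a, rater_b)
print(flagged)  # [2, 4] -> items 2 and 4 differ by more than 1 point
```

Items that are flagged repeatedly across many work samples are the ones the slide suggests refining or removing.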

Available Rubric Libraries
- California State University Fresno: http://www.csufresno.edu/oie/assessment/rubric.shtm
- University of Delaware: http://assessment.udel.edu/resources/rubrics.html
- University of Virginia: http://www.web.virginia.edu/iaas/assess/tools/rubrics.shtm

How Can We Decide What Are Reasonable Expectations? Benchmarks or Standards for Interpreting Assessment Results
- Local Standards: Are students meeting our own standards?
- External Standards: Are students meeting standards set by someone else?
- Internal Peer Benchmark: How do our students compare to others within Ball State?
- External Peer Benchmark: How do our students compare with those of other universities that are similar to Ball State?
- Best Practices Benchmark: How do our students compare to the best of their peers?
- Value-Added Benchmark: Are our students improving?
- Historical Trends Benchmark: Is our program improving?
- Strengths and Weaknesses Perspective: What are our students' areas of strength and weakness?
- Capability Benchmark: Are our students doing as well as they can?
- Productivity Benchmark: Are we getting the most for our investment?

Case Studies of Department Assessment Activities

What Is a Good Assessment Effort? How Will It Help Us and Our Students?
A good assessment effort:
- gets faculty members, within and across disciplines, talking about their goals for student learning
- gets students to see how courses fit together
- makes our expectations more clear to students
- provides detailed feedback to students about their learning
(Suskie, 2009; Walvoord, 2010)

A good assessment effort helps us to:
- Increase our confidence that we are putting our time and resources into activities that we value as an institution
- Increase our confidence that we are allocating resources to areas that are producing the outcomes we value
- Gather and use data that will enable us to make decisions that lead to improved instruction, stronger curricula, and effective and efficient policies
- Strengthen our ability to say that our graduates are well prepared to succeed in their future endeavors

A good assessment effort helps us to:
- Have ready access to data that will satisfy the requirements of accrediting agencies and funding agencies, and will inform various accountability-driven conversations
- Gather and use data that will strengthen arguments for increased funding and/or resource allocations to areas that are producing valued outcomes
- Increase the effectiveness of our communications about the value of a Ball State education
(University of Delaware, n.d.)

Discussion