Direct Measures of Student Learning

Learning outcomes assessment programs must be based on clear and public statements of faculty's expectations for student achievement. The College must provide evidence that it systematically collects and examines direct indicators of student learning at the academic program level and uses the data to document and enhance student learning.

Why use direct measures?

Direct measures provide strong evidence of:
1. Student attainment of competencies or expected learning outcomes.
2. Quality and consistency within a discipline/program across campuses and faculty.
3. Student attainment of college learning outcomes across applicable courses.

How are direct measures used?

1. Identify common learning outcomes or competencies students are expected to attain in courses, course sequences, and/or programs.
2. Establish a program of systematic direct assessment using measures that are most appropriate to the discipline or program.
3. Build in systematic discipline or program review of assessment results.
4. Use the review of results to improve instruction and curriculum as indicated.

1. Tests and Assessments

In most cases, a test will be one part of a fully developed assessment plan. Tests are commonly used in association with cognitive goals in order to review student achievement with respect to a common body of knowledge associated with a discipline or program. Tests are traditionally used in assessment programming to measure whether students have acquired certain process- and content-related knowledge. There are two primary testing alternatives: 1) locally developed/faculty-generated tests and assessments, and 2) commercially produced standardized tests and examinations.

Locally developed tests and assessments are probably the most widely used method for evaluating student progress. For assessing the attainment of academic program learning outcomes, assessments designed by the instructors who set the educational goals and teach the courses are often the best approach. Cost benefits, interpretation advantages, and quick turnaround time all make locally designed tests an attractive method for assessing student learning. The reliability and validity of the tests and assessments for their intended purpose must be established and documented before interpreting and using results (one common reliability check is sketched below). As with most assessments, the information gleaned should be used in conjunction with results from other approaches.

Locally developed tests and assessments designed for a specific curriculum or for targeted outcomes can often prove more valuable for assessing student achievement than commercial instruments. These tests will be closely aligned with curricular competencies and outcomes, focus on the missions, goals, and objectives of the departments, and permit useful measurement of student behavior and learning. A well-constructed and carefully administered test or assessment that is graded by two or more judges for the specific purpose of determining program strengths and weaknesses remains one of the most popular instruments for assessing program or major coursework as well as general education or common college-wide learning outcomes.
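As noted above, reliability and validity must be established before results from a locally developed test are interpreted. The short Python sketch below illustrates one common internal-consistency check, Cronbach's alpha. It is only an illustration: the function name and the item-score matrix are hypothetical, and a real analysis would use the scored responses from the actual exam and would also examine validity evidence.

import numpy as np

def cronbach_alpha(item_scores):
    """Estimate internal-consistency reliability (Cronbach's alpha).
    item_scores: rows = students, columns = test items (e.g., 1/0 for correct/incorrect)."""
    n_items = item_scores.shape[1]
    sum_item_var = item_scores.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = item_scores.sum(axis=1).var(ddof=1)        # variance of students' total scores
    return (n_items / (n_items - 1)) * (1 - sum_item_var / total_var)

# Hypothetical scores for six students on a five-item quiz (1 = correct, 0 = incorrect)
scores = np.array([
    [1, 1, 1, 0, 1],
    [1, 0, 1, 1, 1],
    [0, 0, 1, 0, 0],
    [1, 1, 1, 1, 1],
    [0, 1, 0, 0, 1],
    [1, 1, 1, 1, 0],
])
print(f"Cronbach's alpha = {cronbach_alpha(scores):.2f}")

Values near 0.7 or higher are often cited as acceptable for program-level decisions, though the threshold a department adopts is a local judgment.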

Commercially generated or state-developed tests and licensure examinations are used to measure student competencies under controlled conditions. These tests are developed and normed nationally to determine the level of learning that students have acquired in specific fields of study. For example, nationally standardized multiple-choice tests are widely used and assist departments in determining programmatic strengths and weaknesses when compared with other programs and national data. Compilations of data on the performance of students who voluntarily take national or state licensure examinations provide faculty with useful information that often leads to programmatic improvements. When using commercially generated tests, national or state standards are typically used to compare institutional pass rates, mean and median scores, and overall student achievement with other institutions. In many cases, standardized testing is useful for benchmarking and demonstrating external validity. Care must be taken to ensure that the exams have been normed on similar student populations and that the comparison groups are appropriate.

There are a number of important advantages to using commercial/standardized tests and examinations to measure student achievement. First, institutional comparisons of student learning are possible. Second, very little professional time is needed beyond faculty efforts to analyze examination results and develop appropriate curricular changes that address the findings. Third, in most cases, nationally developed tests are devised by experts in the discipline and should be current and well-constructed. Fourth, these tests are traditionally given to students in large numbers and do not require faculty involvement while students take the exams.

The Florida State Basic Skills Exit Test (College Preparatory Exit Test) is currently administered to students in the highest levels of college preparatory coursework at MDC. Results from this test can be used to evaluate the consistency of instruction within and across campuses and student attainment of the specific learning outcomes defined for the college preparatory program. These results cannot be used to compare MDC's performance with that of other institutions, however, since test forms and administration procedures are not standard across institutions.

Some of the more commonly used national tests include:

California Critical Thinking Skills Tests (CCTST), published by Insight Assessment, measure critical thinking in paper/pencil or on-line test administrations. These tests can be used to assess individual students' or group-level critical thinking and reasoning skills. More information is available at http://www.insightassessment.com/cctst%20family.html

College Basic Academic Subjects Exam (College BASE) was developed by the University of Missouri-Columbia and is a criterion-referenced academic achievement test. The test measures skills in four broad curricular domains: English, mathematics, science, and social studies. More information is available at http://arc2.missouri.edu/cb/cbase_%20folder_for_the_web_final.pdf

The Collegiate Assessment of Academic Proficiency (CAAP), published by ACT, measures the achievement levels of students in selected core academic skills. Test modules are available in Reading, Writing Skills, Writing Essay, Mathematics, Science, and Critical Thinking. Each test module may be used as a stand-alone exam or in combination with other CAAP exams. The modular format offers flexibility to select the assessment components that meet the College's mission, goals, and educational objectives. More information is available at www.act.org/caap

Collegiate Learning Assessment Project (CLA), published by the RAND Corporation's Council for Aid to Education, consists of performance assessments that measure the value added by an institution to students' skills. Students respond to open-ended prompts and write essays, which are used to assess the institution's contribution to students' critical thinking, analytical reasoning, problem solving, and communication skills on authentic tasks. More information is available at http://www.collegiatelearningassessment.org/

The ICT Literacy Assessment (Information and Communication Technology Literacy), published by ETS, is administered on-line. Students are presented with scenario-based tasks that measure cognitive and technical skills related to seven information literacy proficiencies. More information is available at www.ets.org/ictliteracy

The Major Field Tests, published by ETS, are designed to assess the outcomes of higher education by measuring undergraduate and master's-level learning in thirteen specific disciplines/majors. Tests (e.g., Chemistry, Education, Literature in English, Mathematics, Physics) reflect basic knowledge and understanding gained from courses. They go beyond measurement of factual knowledge, however, because they also evaluate students' ability to analyze and solve problems, understand relationships, and interpret material. A content review by appropriate faculty members should be undertaken to determine whether the content and coverage of the test are consistent with the coverage expected of undergraduates majoring in that field. More information is available at http://www.ets.org/hea/mft/index.html

Measures of Academic Proficiency and Progress (MAPP), published by ETS, replaces the Academic Profile. MAPP uses a multiple-choice format to measure four core skill areas: critical thinking, reading, writing, and mathematics. An optional essay is available. Testing time is 2 hours for the standard form or 40 minutes for the abbreviated version, both of which can be administered paper/pencil or on-line. Times may vary based on the number of optional questions added by an institution. More information is available at www.ets.org/mapp

Watson-Glaser II Critical Thinking Appraisal. This test dates back to the 1960s, but newer versions (on-line and paper/pencil) are available today from Pearson (TalentLens). The latest version measures three factors of critical thinking: recognizing assumptions, evaluating arguments, and drawing conclusions. More information is available at http://www.talentlens.com/en/watson/

WorkKeys, published by ACT, is marketed as a job skills assessment that measures real-world skills. The assessments provide educators and employers with evidence of students' skill levels. Foundational skills assessment areas are business writing, reading for information, listening, writing, applied mathematics, locating information, observation, applied technology, and teamwork. Additional assessments are available specifically for health care professions. Assessments can be administered on-line or paper/pencil, and most take 60 minutes or less to complete. More information is available at http://www.act.org/workkeys/

2. Course-Embedded Assessment

Assessment practices embedded in academic courses generate information about what and how students are learning within the program and classroom environment. Course-embedded assessment takes advantage of existing curricular offerings: it uses data instructors already collect and/or introduces new assessment measures into courses. The embedded methods most commonly used involve the collection of student data from assessment questions developed and/or selected by the discipline. These questions, intended to assess specific student learning outcomes, are incorporated or embedded into final exams, research reports, and/or term papers and projects. Alternatively, rubrics to evaluate the intended outcomes can be developed and applied to a variety of assessment artifacts. A sample of student responses or artifacts is typically collected and evaluated by two or more discipline faculty to determine the extent to which students are achieving the intended learning outcomes. This assessment activity is a separate process from the one the course instructor uses to grade the exam, report, or term paper.

There are a number of advantages to using course-embedded assessment. First, student information gathered from embedded assessment draws on collective and integrated educational experiences and familiarity with specific areas or disciplines. Second, embedded assessment often does not require additional time for data collection, since the tools used can be derived from course assignments already planned as part of the requirements. Third, assessment feedback can be presented to faculty and students very quickly, facilitating ongoing program or course improvement. Finally, students tend to take this method seriously since course-embedded assessment is part of their course assignments or examinations. Course-embedded assessment can be used in virtually any discipline or to evaluate general education programs.

There are also challenges in using this approach. Faculty agreement on assessment questions, prompts, assignments, and/or rubrics is necessary. Time for a sufficient number of faculty to participate in evaluating assessments or artifacts is also needed, as is appropriate training and/or norming to increase the validity of findings; a simple agreement check that can support such norming is sketched below.
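When two or more faculty score the same sample of artifacts against a shared rubric, a quick agreement summary shows whether additional norming is needed before results are reported. The Python sketch below is illustrative only: the rater labels and rubric scores are hypothetical, and a department may prefer a more formal statistic such as Cohen's kappa.

# Hypothetical scores from two faculty raters on the same ten artifacts, 4-point rubric
rater_a = [4, 3, 3, 2, 4, 1, 3, 2, 4, 3]
rater_b = [4, 3, 2, 2, 4, 2, 3, 2, 3, 3]

pairs = list(zip(rater_a, rater_b))
exact = sum(a == b for a, b in pairs) / len(pairs)        # identical scores
adjacent = sum(abs(a - b) <= 1 for a, b in pairs) / len(pairs)  # within one rubric level

print(f"Exact agreement:    {exact:.0%}")
print(f"Adjacent agreement: {adjacent:.0%}")
# Low agreement suggests more norming/calibration before findings are reported.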

3. Portfolio Evaluation

Portfolios used for assessment purposes are most commonly characterized as collections of student work that exhibit to the faculty and the student the student's progress and achievement in specific courses, course sequences, or programs. Portfolios often include samples of student work such as research papers and other projects or reports, multiple-choice or essay examinations, self-evaluations, personal essays, journals, computational exercises and problems, case studies, audiotapes, videotapes, and short-answer quizzes. This information may be gathered from in-class or out-of-class assignments and compiled electronically or in paper format. Information about students' skills, knowledge, development, quality of writing, and critical thinking can be acquired through a comprehensive collection of work samples. A student portfolio can be assembled within a course, in a sequence of courses in the major, or throughout the program coursework.

The faculty determine what information or student products should be collected and how these products will be used to evaluate or assess student learning outcomes. Collecting student work over time gives departments a unique opportunity to assess a student's progression in acquiring a variety of learning objectives. Using student portfolios also gives faculty the ability to determine the content and control the quality of the assessed materials. Student portfolios are typically learning experiences in their own right, as students prepare and improve the artifacts and examples to be included in their portfolios.

Electronic portfolios are currently used to demonstrate student learning outcomes within the Baccalaureate in Education programs at MDC using LiveText software. Additional information about LiveText is available at: http://college.livetext.com/college/portfolios.html. In addition, the InterAmerican Campus developed electronic portfolio resources and templates as part of a Title V grant. More information on this project is available at: http://www.mdc.edu/iac/learningresources/epf/. FACTS.org includes information about a free electronic career portfolio resource at: http://facts23.facts.org/collegestudents.portfolios

E-portfolio tools are expanding in the marketplace. A quick web search will identify numerous software packages and tools to support this assessment approach.

4. Pre-test/Post-test Evaluation

Pre-test/post-test assessment is a method in which locally developed tests are administered at the beginning and at the end of courses or academic programs. These test results enable faculty to monitor student progression and learning over prescribed periods of time and sequences of courses. The results are often useful for determining where skill and knowledge deficiencies exist and for informing curricular changes. By controlling for entry-level skills, this method documents the learning gains students attain through the courses or programs; a simple way to summarize such gains is sketched below.
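The Python sketch below illustrates one way a department might summarize paired pre-test/post-test scores, reporting both the average raw gain and the average normalized gain (the share of possible improvement actually achieved). It is a minimal example: the scores are hypothetical, and a real analysis would also match records by student and account for missing data.

# Hypothetical paired scores (percent correct), one entry per student
pre  = [45, 60, 52, 70, 38, 65, 55]
post = [68, 75, 70, 82, 60, 80, 73]

raw_gains = [b - a for a, b in zip(pre, post)]
avg_gain = sum(raw_gains) / len(raw_gains)

# Normalized gain: fraction of the possible improvement achieved, which helps
# compare students who started at different entry levels.
norm_gains = [(b - a) / (100 - a) for a, b in zip(pre, post) if a < 100]
avg_norm_gain = sum(norm_gains) / len(norm_gains)

print(f"Average raw gain:        {avg_gain:.1f} points")
print(f"Average normalized gain: {avg_norm_gain:.2f}")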

5. Thesis or Capstone Project Evaluation

A graduating student's thesis, research project, performance, paper, or other project that is structured by the department to give students an opportunity to demonstrate mastery of an array of skills and knowledge appropriate to the program can be a useful assessment instrument. This method has the advantages of being authentic and integrative, allowing students to put their best work forward. The project or thesis may also prove useful to students as they apply to upper-division institutions or for employment.

6. Capstone Course Evaluation

Capstone courses integrate the knowledge, concepts, and skills associated with an entire sequence of study in a program. This method of assessment is unique because the courses themselves become the instruments for assessing teaching and student learning. Evaluation of students' work in these courses is used as a means of assessing student outcomes. For academic units where a single capstone course is not feasible or desirable, a department may designate a small group of courses in which the competencies or outcomes expected upon completion of the program will be measured.

Capstone courses provide students with a forum in which to combine various aspects of their programmatic experiences. For departments and faculty, the courses provide a forum to assess student achievement across a variety of knowledge- and skills-based areas by integrating students' educational experiences. Also, these courses can provide a final common experience for students in the discipline.

7. Videotape and Audiotape Evaluation

Videotapes and audiotapes have been used by faculty as pre-test/post-test or summative assessments of student skills and knowledge. Disciplines that have a performance component (e.g., theatre, music, art, communication, and student teaching) are most likely to use this assessment method. This can be a very powerful assessment tool when videotapes and audiotapes are evaluated by two or more faculty using a common rubric to assess the extent to which students have attained the expected learning outcomes.