ASSESSMENT GLOSSARY 06/2018


The intent of this glossary is to provide a common vocabulary for the terms frequently used in the context of college outcome assessment. It is a work in progress by the Assessment Committee. The definitions provided in this list were adapted from various sources.

A

Academic Course Map: The specific order of courses a student needs to take each semester to finish a degree. At CCA, there are two-, three-, and four-year academic course maps.

Accreditation: 1. An outward-focused activity in which an institution reports on its financial health; physical and technological infrastructure; staff and faculty capacities; and educational effectiveness. The purpose of accreditation is to provide public accountability to external audiences. The accrediting body of CCA is the Higher Learning Commission (HLC). 2. The recognition that an institution maintains standards requisite for its graduates to gain admission to other reputable institutions of higher learning or to achieve credentials for professional practice. The goal of accreditation is to ensure that education provided by institutions of higher education meets acceptable levels of quality.

Anchor (also called Exemplar): A sample of student work (product or performance) used to illustrate each level of a scoring rubric; critical for training scorers of performances, since it serves as a standard against which other student work is compared.

Alignment: The process of intentionally connecting course, program, general education, and institutional learning outcomes. At the program level, alignment represents the ideal cohesive relationship between curriculum and outcomes. Checking alignment allows program faculty to determine whether the curriculum provides sufficient and appropriately sequenced opportunities for students to develop the knowledge, skills, and dispositions identified in SLOs.

Assessment: One or more processes that identify, collect, and prepare data to evaluate the attainment of student outcomes. Effective assessment uses relevant direct, indirect, quantitative, and qualitative measures as appropriate to the outcome being measured. Appropriate sampling methods may be used as part of an assessment process.

Assessment Cycle: An assessment cycle includes a period for planning and submission of an assessment plan, a period for data collection, and, lastly, data analysis and the preparation of an assessment report.

Assessment Plan: A collaboratively developed planning document that establishes a multi-year plan for outcomes assessment. Assessment plans articulate when each student learning objective will be assessed; the types of direct and indirect evidence (aligned to each learning outcome) that will be collected and analyzed; plans for analyzing the data; procedures to guide discussion and application of results; and timelines and responsibilities.

Assessment Report: A document that discusses assessment data, analysis, and application of results.

B

Backward Design (also called Reverse Design): A method of designing educational curriculum by setting goals before choosing instructional methods and forms of assessment. Backward design of curriculum typically involves three stages: identifying outcomes, determining assessments, and developing learning activities and lesson plans to achieve outcomes.

Benchmark (also called Threshold): 1. A standard or point of reference against which gathered data may be compared or assessed. 2. The rate for an accepted level of performance or success for the given outcome. The expected level is often expressed as a percentage in relation to the criterion (for example, 80% of students meeting or exceeding the criterion).

Budget Process: The process by which budgets are created and approved.

C

Capstone: A course or experience toward the end of a program in which students have the opportunity to demonstrate their cumulative knowledge, skills, and dispositions related to some or all of the learning outcomes. In capstone courses and experiences, students produce direct evidence of their learning. Examples of capstone assignments include standardized assessments, exhibitions, presentations, performances, and research papers.

Common Course Numbering System (CCNS): Established by the Colorado Community College System (CCCS), the CCNS establishes shared course numbers, course descriptions, and course outcomes across the colleges in the system.

Closing the Loop: Closing the loop encompasses analyzing results from outcome assessments, using results to make changes to improve student learning, and re-assessing outcomes in order to determine the effect those changes had on student learning.

Constituents: People the institution serves, advocates for, or organizes.

Content or Content Area: The subject matter of a discipline.

Course: A unit of instruction that has the following: a formalized syllabus; a description; a condensed outline or statement; an approval in accordance with board policy; and an instructor of record.

Course-Level Outcomes (CLO) (also called objectives, competencies, or goals): Statements which articulate, in measurable terms, what students should know and be able to demonstrate as a result of, and at the conclusion of, a course. These are established using CCNS outcomes, GT requirements, and departmental standards. These outcomes communicate course goals explicitly and foster transfer of responsibility for learning from faculty to students.

Course-Level Assessment: The intentional collection of evidence of student learning with which the instructor can assess mastery of one or more course-level outcomes. Through course-level assessment, faculty provide timely and useful feedback to students, use data to assign grades, and record data related to students' achievement of the CLOs in question. Course-embedded assessment that occurs toward the end of a program can also yield data for program outcomes assessment efforts.

Criteria: The discrete domains of a subject against which a learning performance is rated. For example, criteria included in an assessment of student writing might include accuracy of content, appropriate use of evidence to support an argument, organization, and adherence to the conventions of academic English.

Criterion-Referenced Testing: Evaluating students against an absolute standard of achievement, rather than evaluating them in comparison with the performance of other students. A standard of performance is set to represent a level of expertise or mastery of skills or knowledge.

Curriculum Mapping: The analytic process in which faculty examine the alignment between course-level, program-level, and institution-level outcomes. The primary purpose of curriculum mapping is to identify courses in which program- and institution-level outcomes are introduced (I), practiced (P), or should be demonstrated (D). Ideally, this analytic process results in a publicly available visual representation: a curriculum map. In addition to promoting transparency, curriculum mapping helps faculty identify courses from which to gather student work for the assessment of a particular learning outcome.

Curriculum Map: A document that shows the alignment of learning outcomes at different levels. It is the product of curriculum mapping.

D

Degrees with Designation (DWDs): A Colorado Department of Higher Education (CDHE) approved associate of arts or associate of science degree with a prearranged program path within a specific academic discipline, which allows students to transfer their degrees and enroll as juniors at any Colorado public four-year college or university.

Diagnostic Assessment: Information gathering at the beginning of a course or program. Diagnostic assessment can yield actionable information about students' prior knowledge; additionally, diagnostic assessment activities provide information for students about what they will be expected to know and do at the conclusion of a course or program. It often takes the form of a pre-test.

Direct Assessment of Learning: Gathers evidence, based on student performance, that demonstrates the learning itself rather than just the collection of grade data. Can be value-added, related to standards, qualitative or quantitative, embedded or not, and based on local or external criteria. Examples: most classroom testing for grades is direct assessment (in this instance within the confines of a course), as is the evaluation of a research paper in terms of the discriminating use of sources. The latter example could assess learning accomplished within a single course or, if part of a senior requirement, could also assess cumulative learning.

E

Embedded Assessment: A means of gathering information about student learning that is built into, and a natural part of, the teaching-learning process. Often uses, for assessment purposes, classroom assignments that are evaluated to assign students a grade. Can assess individual student performance or aggregate the information to provide evidence about the course or program; can be formative or summative, quantitative or qualitative.

Evaluation: A judgment about whether course, program, or institutional goals were achieved.

External Assessment: Use of criteria (a rubric) or an instrument developed by an individual or organization external to the one being assessed. This kind of assessment is usually summative, quantitative, and often high-stakes, such as the SAT or GRE exams. In CTE programs, this type of assessment is often part of the credentialing process for the student.

F

Formative Assessment: Information-gathering strategies that provide actionable evidence related to students' progress toward mastery of the learning outcomes during the term (or class period). An integral part of excellent instruction, regular formative assessment provides valuable information to faculty regarding which instructional strategies are or are not producing student learning. Formative assessment also provides students with information about their progress in a course.

G

GT Competencies: Learning outcomes established by the Colorado Department of Higher Education (CDHE) that are required in courses that are part of the GT (Guaranteed Transfer) pathways. https://highered.colorado.gov/academics/transfers/gtpathways/criteria/competency.html

GT Pathways: GT Pathways (Guaranteed Transfer) courses, in which the student earns a C- or higher, will always transfer and apply to GT Pathways requirements in AA, AS, and most bachelor's degrees at every public Colorado college and university. GT Pathways requirements do not apply to some degrees (such as many engineering, computer science, and nursing degrees, among others).

Guided Pathways: The whole package of academic pathways, programs, advising, and other support services that get students from application to graduation. Degrees and programs at CCA are organized into six Guided Pathways.

H

High-Stakes Assessment: The decision to use the results of assessment to set a hurdle that must be cleared for completing a program of study, receiving certification, or moving to the next level. Most often, the assessment so used is externally developed, based on set standards, carried out in a secure testing situation, and administered at a single point in time. Examples: at the secondary school level, statewide exams required for graduation; in postgraduate education, the bar exam.

Holistic Scoring Method: A scoring method that assigns a single score based on an overall appraisal or impression of a performance, rather than analyzing the various dimensions separately. A holistic scoring rubric can be specifically linked to focused (written) or implied (general-impression) criteria. Some forms of holistic assessment do not use written criteria at all but rely solely on anchor papers for training and scoring.

I

Indirect Evidence: Data from which it is possible to make inferences about student learning. Sources of indirect evidence include students' perceptions of their own learning gathered through self-report surveys; focus groups; exit interviews; alumni and current student surveys; and graduation and retention data and reports. Indirect evidence alone is insufficient to make meaningful decisions about program or institutional effectiveness.

Institutional Outcome: The knowledge, skills, abilities, and attitudes that students are expected to develop as a result of their overall experiences with any aspect of the college, including courses, programs, and student services.

Institution-Level Assessment: Uses the institution as the level of analysis and focuses on Institutional Outcomes. Can be quantitative or qualitative, formative or summative, standards-based or value-added, and used for improvement or for accountability.

Inter-Rater Reliability: Also known as inter-rater agreement or concordance: the degree of agreement among raters, expressed as a score of how much homogeneity, or consensus, there is in the ratings given by judges.
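To make the Inter-Rater Reliability entry concrete, the sketch below is a minimal, illustrative Python example (not part of any CCA system; the rubric scores are invented) that computes two common agreement statistics for two raters scoring the same ten student papers on a four-point rubric: simple percent agreement and Cohen's kappa, which adjusts for agreement expected by chance.

```python
from collections import Counter

def percent_agreement(rater_a, rater_b):
    """Proportion of artifacts on which the two raters gave the same score."""
    matches = sum(1 for a, b in zip(rater_a, rater_b) if a == b)
    return matches / len(rater_a)

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    n = len(rater_a)
    observed = percent_agreement(rater_a, rater_b)
    counts_a = Counter(rater_a)
    counts_b = Counter(rater_b)
    # Chance agreement: probability both raters assign the same category,
    # given each rater's own distribution of scores.
    expected = sum(
        (counts_a[cat] / n) * (counts_b[cat] / n)
        for cat in set(rater_a) | set(rater_b)
    )
    return (observed - expected) / (1 - expected)

# Hypothetical rubric scores (1-4) assigned by two raters to ten papers.
rater_1 = [4, 3, 3, 2, 4, 1, 3, 2, 4, 3]
rater_2 = [4, 3, 2, 2, 4, 1, 3, 3, 4, 3]

print(f"Percent agreement: {percent_agreement(rater_1, rater_2):.2f}")  # 0.80
print(f"Cohen's kappa:     {cohens_kappa(rater_1, rater_2):.2f}")       # ~0.71
```

A kappa near 1 indicates strong agreement beyond chance; values near 0 suggest the raters should re-norm on the rubric (see Norming, below).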

L

LEAP: Launched in 2005, Liberal Education and America's Promise (LEAP) is a national public advocacy and campus action initiative. LEAP essential outcomes and rubrics were used during the GT competencies revision process. https://www.aacu.org/leap

Learning Outcome (also called goal, competency, objective): Statements that describe the skills or knowledge learned by students in a course, program, activity, etc., which can be demonstrated and measured.

Local Assessment: Means and methods that are developed by an institution's faculty or staff based on their teaching approaches, students, and learning goals. An example would be an English Department's construction and use of a writing rubric to assess incoming freshmen's writing samples, which might then be used to assign students to appropriate writing courses or might be compared to senior writing samples to get a measure of value-added.

M

Metric: The end product of measurement, also known as results. Metrics must be contextualized to carry meaning.

N

Norming: A process of conversation and analysis through which assessors reach consistent agreement about the meaning and applicability of assessment criteria, such as a rubric. When such agreement is reached, the readers are said to be normed to the particular instrument. It is important to check for inter-rater agreement and re-norm as needed. Also known as calibration, this process promotes consistent application of assessment standards.

Norm-Referenced Assessment: Measurement of relative performance; e.g., a measure of how well one student performs in comparison to other students completing the same evaluative activity. The usefulness of norm-referencing is limited for program assessment because it does not provide information about students' performance on criteria related to program or course learning outcomes.

P

Performance Criteria: The standards by which student performance is evaluated. Performance criteria help assessors maintain objectivity and provide students with important information about expectations, giving them a target or goal to strive for.

Performance Indicator: A sign that something has happened (i.e., an indicator of learning). A performance indicator provides examples or concrete descriptions of what is expected at varying levels of mastery.

Program: 1. Each and every individual degree, certificate, and/or department, including all of the core, required, and elective courses on the academic side as well as the college services on the non-academic side that support a student's completion efforts. 2. A coherent sequence of courses designed to prepare individuals for employment or further education in a specific occupational area. A certificate program requires fewer than 60 credit hours (usually fewer than 45), while a degree program requires 60 or more college-level credit hours. A degree program can be designed for either employment or transfer. Certificate programs lead to employment or are stackable toward a degree program, allowing students to complete a certificate while working toward a degree. At CCA, the term program may be defined as a department, service, degree, or certificate.

Program Assessment, Co-Curricular: Assessment of the co-curricular programs, activities, and learning experiences that complement the student learning experience.

Program Assessment, Curricular: Assessment of the programs, activities, and learning that occur in the classroom, leading toward completion of a certificate or degree.

Program Review (also called Department Review in some cases): A cyclical process through which program staff and faculty engage in inquiry to support evidence-based decision making. The purpose of program review is to generate actionable and meaningful results to inform discussions about program effectiveness, sustainability, budget, and strategic planning. Best practices for program review call for the inclusion of multiple sources of indirect and direct evidence, gathered and analyzed over time rather than all at once in advance of a self-study.

Program-Level Outcomes (PLO): Statements which articulate, in measurable terms, what students should know and be able to demonstrate as a result of, and at the conclusion of, a program. PLOs communicate program goals explicitly and foster transfer of responsibility for learning from faculty to students.

Program-Level Assessment: The systematic and intentional collection and analysis of aggregated evidence (direct and indirect) of student learning to inform conversations about program effectiveness. The results of program-level assessment can be used in self-studies prepared as part of regular Program Review. Program-level assessment is inquiry-driven. For CCA assessment purposes, programs may be defined as departments, services, degrees, or certificates.

Q

Qualitative Assessment: Collects data that do not lend themselves to quantitative methods but rather to interpretive criteria.

Quantitative Assessment: Collects data that can be analyzed using quantitative methods.

R

Rubric: An instrument that describes the knowledge and skills required to demonstrate mastery of an assignment. Rubrics often use scales that include four or more categories. Unlike checklists, rubrics are designed as scoring guides that clearly articulate what mastery looks like at each performance level. If shared beforehand, rubrics communicate the expectations of a given assignment or task, and they structure how student work is evaluated.

S

Stakeholder: Any group or individual who can affect, or is affected by, the achievement of the college's objectives.

Stakeholders, External: Alumni, accrediting bodies, employers, governing entities, the community served, and donors.

Stakeholders, Internal: Students, parents, staff, faculty, instructors, and administration.

Strategic Planning: An organization's process of defining its strategy, or direction, and making decisions on allocating its resources to pursue this strategy.

Student Involvement: The process of engaging students in every facet of the educational process for the purpose of strengthening their commitment to the college experience.

Summative Assessment: A snapshot of student learning at a particular point in time, usually at the end of a course or program. Data from summative assessment can inform an individual faculty member's planning for the next quarter, or inform program faculty interested in assessing students' mastery of program learning outcomes at a particular time.

V

Validity: Describes how well an instrument measures what it is intended to measure. Also refers to the trustworthiness of conclusions drawn from analyses. It is important to consider validity when making claims about the effectiveness of a particular program or instructional approach.

Value-Added: The effects educational providers have on students during their programs of study comprise the value-added feature of academics. Participation in higher education has a value-added impact when student learning and development occur at levels above those that occur through natural maturation, usually measured as longitudinal change or as the difference between pretest and posttest.
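As a deliberately simplified illustration of the pretest/posttest measurement mentioned under Value-Added, the short Python sketch below computes individual and average gain scores for a hypothetical cohort. The scores are invented, and a raw gain score does not adjust for the natural maturation the definition describes; real value-added models make that adjustment.

```python
# Hypothetical (pretest, posttest) scores, as percent correct, for one cohort.
# A minimal sketch: raw gains only, with no adjustment for expected growth.
students = {
    "Student A": (52, 78),
    "Student B": (61, 80),
    "Student C": (45, 70),
    "Student D": (70, 84),
}

gains = [post - pre for pre, post in students.values()]
mean_gain = sum(gains) / len(gains)

print(f"Individual gains: {gains}")          # [26, 19, 25, 14]
print(f"Mean gain: {mean_gain:.1f} points")  # 21.0 points
```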