Elizabethtown Community and Technical College

Similar documents
Developing an Assessment Plan to Learn About Student Learning

Assessment System for M.S. in Health Professions Education (rev. 4/2011)

SACS Reaffirmation of Accreditation: Process and Reports

Assessment and Evaluation

ACADEMIC AFFAIRS GUIDELINES

EQuIP Review Feedback

Assessment for Student Learning: Institutional-level Assessment Board of Trustees Meeting, August 23, 2016

What does Quality Look Like?

STUDENT LEARNING ASSESSMENT REPORT

GUIDE TO EVALUATING DISTANCE EDUCATION AND CORRESPONDENCE EDUCATION

E-3: Check for academic understanding

Standards and Criteria for Demonstrating Excellence in BACCALAUREATE/GRADUATE DEGREE PROGRAMS

Number of students enrolled in the program in Fall, 2011: 20. Faculty member completing template: Molly Dugan (Date: 1/26/2012)

CONNECTICUT GUIDELINES FOR EDUCATOR EVALUATION. Connecticut State Department of Education

Writing a Basic Assessment Report. CUNY Office of Undergraduate Studies

Qualification Guidance

Kelso School District and Kelso Education Association Teacher Evaluation Process (TPEP)

NC Global-Ready Schools

Maximizing Learning Through Course Alignment and Experience with Different Types of Knowledge

Multiple Measures Assessment Project - FAQs

An Analysis of the Early Assessment Program (EAP) Assessment for English

SURVEY RESEARCH POLICY TABLE OF CONTENTS STATEMENT OF POLICY REASON FOR THIS POLICY

Georgia Department of Education

Self Assessment. InTech Collegiate High School. Jason Stanger, Director 1787 Research Park Way North Logan, UT

1. Faculty responsible for teaching those courses for which a test is being used as a placement tool.

July 17, 2017 VIA CERTIFIED MAIL. John Tafaro, President Chatfield College State Route 251 St. Martin, OH Dear President Tafaro:

D direct? or I indirect?

ASSESSMENT OF STUDENT LEARNING OUTCOMES WITHIN ACADEMIC PROGRAMS AT WEST CHESTER UNIVERSITY

Qualification handbook

Analyzing Linguistically Appropriate IEP Goals in Dual Language Programs

DOCTOR OF PHILOSOPHY BOARD PhD PROGRAM REVIEW PROTOCOL

ACCREDITATION STANDARDS

Higher Education Review (Embedded Colleges) of Navitas UK Holdings Ltd. Hertfordshire International College

M.S. in Environmental Science Graduate Program Handbook. Department of Biology, Geology, and Environmental Science

INSTRUCTOR USER MANUAL/HELP SECTION

Orientation Workshop on Outcome Based Accreditation. May 21st, 2016

HIGHLAND HIGH SCHOOL CREDIT FLEXIBILITY PLAN

Portfolio-Based Language Assessment (PBLA) Presented by Rebecca Hiebert

LITERACY-6 ESSENTIAL UNIT 1 (E01)

Document number: 2013/ Programs Committee 6/2014 (July) Agenda Item 42.0 Bachelor of Engineering with Honours in Software Engineering

DATE ISSUED: 11/2/ of 12 UPDATE 103 EHBE(LEGAL)-P

College of Engineering and Applied Science Department of Computer Science

Procedures for Academic Program Review. Office of Institutional Effectiveness, Academic Planning and Review

Purpose of internal assessment. Guidance and authenticity. Internal assessment. Assessment

A Diagnostic Tool for Taking your Program's Pulse

MASTER'S COURSES FASHION START-UP

VIEW: An Assessment of Problem Solving Style

Assuring Graduate Capabilities

Instructional Supports for Common Core and Beyond: FORMATIVE ASSESSMENT

PROPOSAL FOR NEW UNDERGRADUATE PROGRAM. Institution Submitting Proposal. Degree Designation as on Diploma. Title of Proposed Degree Program

PREPARING FOR THE SITE VISIT IN YOUR FUTURE

How to Judge the Quality of an Objective Classroom Test

Audit Documentation. This redrafted SSA 230 supersedes the SSA of the same title in April 2008.

Teachers Guide Chair Study

Spring Valley Academy Credit Flexibility Plan (CFP) Overview

Executive Summary. Laurel County School District. Dr. Doug Bennett, Superintendent 718 N Main St London, KY

RED 3313 Language and Literacy Development course syllabus Dr. Nancy Marshall Associate Professor Reading and Elementary Education

Honors Mathematics. Introduction and Definition of Honors Mathematics

Assessment of Student Academic Achievement

A Survey of Authentic Assessment in the Teaching of Social Sciences

STANDARDS AND RUBRICS FOR SCHOOL IMPROVEMENT 2005 REVISED EDITION

SETTING STANDARDS FOR CRITERION- REFERENCED MEASUREMENT

Oklahoma State University Policy and Procedures

Frequently Asked Questions and Answers

CONTINUUM OF SPECIAL EDUCATION SERVICES FOR SCHOOL AGE STUDENTS

Writing Effective Program Learning Outcomes. Deborah Panter, J.D. Director of Educational Effectiveness & Assessment

Technical Skills for Journalism

Final Teach For America Interim Certification Program

Chapter 2. University Committee Structure

Requirements for the Degree: Bachelor of Science in Education in Early Childhood Special Education (P-5)

Revision and Assessment Plan for the Neumann University Core Experience

Colorado's Unified Improvement Plan for Schools for Online UIP Report

Linking the Ohio State Assessments to NWEA MAP Growth Tests *

Department of Social Work Master of Social Work Program

Instructional Intervention/Progress Monitoring (IIPM) Model Pre/Referral Process. and. Special Education Comprehensive Evaluation.

Basic Skills Initiative Project Proposal Date Submitted: March 14, Budget Control Number: (if project is continuing)

Program Guidebook. Endorsement Preparation Program, Educational Leadership

College of Education & Social Services (CESS) Advising Plan April 10, 2015

School Inspection in Hesse/Germany

Providing Feedback to Learners. A useful aide memoire for mentors

Grade 4. Common Core Adoption Process. (Unpacked Standards)

TSI Operational Plan for Serving Lower Skilled Learners

Assessment Pack HABC Level 3 Award in Education and Training (QCF)

General study plan for third-cycle programmes in Sociology

The ELA/ELD Framework Companion: a guide to assist in navigating the Framework

Delaware Performance Appraisal System Building greater skills and knowledge for educators

Focus on. Learning THE ACCREDITATION MANUAL 2013 WASC EDITION

Programme Specification

Higher Education / Student Affairs Internship Manual

History of CTB in Adult Education Assessment

Handbook for Graduate Students in TESL and Applied Linguistics Programs

Access Center Assessment Report

MSW POLICY, PLANNING & ADMINISTRATION (PP&A) CONCENTRATION

Evaluation of a College Freshman Diversity Research Program

Assessment of Generic Skills. Discussion Paper

School Leadership Rubrics

ASSISTANT DIRECTOR OF SCHOOLS (K 12)

Unit 3. Design Activity. Overview. Purpose. Profile

On-the-Fly Customization of Automated Essay Scoring

Indicators Teacher understands the active nature of student learning and attains information about levels of development for groups of students.


Elizabethtown Community and Technical College

Higher education is unique in that its workforce comes from many fields, each with its own language that those in the field understand and use with ease. As we cross disciplinary lines, the language each of us uses is often misunderstood, leading to confusion and frustration. To improve the effectiveness of the institution and reduce that frustration, the Institutional Effectiveness Office has been creating a glossary of terms that are used often on campus and in higher education in general. The purpose of the glossary is to create shared meaning for words and terms so that we can communicate more effectively with each other and with the external agencies we interact with. The glossary is a work in progress and will be revised as needed. I encourage everyone to review the glossary and submit words or terms for consideration for inclusion.

Glossary

ACCESS: has two components: the percentage of the community that is served by the college, and the ratio of diversity in the student population as compared with the community.

ACCOUNTABILITY: the public reporting of student, program, or institutional data to justify decisions or policies.

ANALYTICAL SCORING: evaluating student work across multiple dimensions of performance rather than from an overall impression (holistic scoring). In analytic scoring, individual scores for each dimension are scored and reported. For example, analytic scoring of a history essay might include scores for the following dimensions: use of prior knowledge, application of principles, use of original source material to support a point of view, and composition. An overall impression of quality may be included in analytic scoring. (A short illustrative sketch follows this group of entries.)

ANCHOR(S): a sample of student work that exemplifies a specific level of performance. Raters use anchors to score student work, usually by comparing the student performance to the anchor. For example, if student work were being scored on a scale of 1-5, there would typically be anchors (previously scored student work) exemplifying each point on the scale.

ASSESSMENT: the systematic collection of data and information across courses, programs, and the institution, with a focus on outcomes, especially student learning outcomes, but also on process, especially in seeking ongoing improvement.

AUTHENTIC ASSESSMENT: assessment that requires students to perform a task rather than take a test, in a real-life context or a context that simulates one. It is designed to judge students' ability to use specific knowledge and skills and to actively demonstrate what they know rather than recognize or recall answers to questions.
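To make the contrast between analytic and holistic scoring above concrete, here is a minimal sketch in Python. It is not part of the original glossary; the essay dimensions follow the example in the ANALYTICAL SCORING entry, and the scores are invented for illustration only.

```python
# Hypothetical illustration of analytic vs. holistic scoring of a history essay.
# Dimension names follow the ANALYTICAL SCORING example; the scores are invented.

analytic_scores = {
    "prior_knowledge": 4,            # each dimension scored on a 1-5 scale
    "application_of_principles": 3,
    "use_of_sources": 5,
    "composition": 4,
}

# Analytic scoring reports every dimension separately...
for dimension, score in analytic_scores.items():
    print(f"{dimension}: {score}/5")

# ...and may also roll them up into an overall impression of quality.
overall = sum(analytic_scores.values()) / len(analytic_scores)
print(f"overall impression: {overall:.1f}/5")

# Holistic scoring, by contrast, records a single judgment of the finished
# product against an agreed-upon standard (see HOLISTIC SCORING below).
holistic_score = 4
print(f"holistic score: {holistic_score}/5")
```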

BASIC SKILLS: below college-level reading, writing, and mathematics.

BENCHMARK: a sample of student work, or a detailed description of a specific level of student performance, that illustrates a category or score on a scoring rubric.

CAPSTONE (course or experience, used interchangeably): the capstone course is designed to be a culminating educational experience for the undergraduate student. The class provides for learning, but not in the traditional sense, as no new skills are taught. The capstone course can be a self-directed, integrated learning opportunity. The course is the singular opportunity to determine whether the student has assimilated the various goals of his or her total education. An example would be business students working in teams within the community to develop a business plan for a business; the team might include students from accounting, marketing, finance, management, computer information, and other technical fields, depending on the type of business.

COMPLIANCE: SACS compliance categories are defined as follows.

Compliance: the institution concludes that it complies with each aspect of the requirement or standard and supports this judgment in a narrative response supported by documentation.

Partial Compliance: the institution judges that it complies with some but not all aspects of the requirement or standard and supports this judgment in a narrative response supported by documentation justifying its claim of partial compliance, an explanation of its partial non-compliance, and a detailed action plan for bringing the institution into compliance that includes a list of documents to be presented to support compliance and a date for completing the plan.

Non-Compliance: the institution determines that it does not comply with any aspect of the requirement or standard and provides a thorough explanation of its non-compliance and a detailed action plan for bringing the institution into compliance that includes a list of documents to be presented to support compliance and a date for completing the plan. (See Appendix C, p. 47, for a description and examples of narratives for the Compliance Certification.) SACS HANDBOOK FOR REAFFIRMATION OF ACCREDITATION 2004

CLASSROOM ASSESSMENT TECHNIQUES (CATs): http://honolulu.hawaii.edu/intranet/committees/facdevcom/guidebk/teachtip/assess-1.htm This link provides useful information about CATs.

COHORT: a group (of students). Examples may include all freshmen starting this semester, or all students beginning in a specific course whose progress will be tracked throughout an identified set of courses or a program (e.g., basic math through college algebra).

COMPETENCY: a combination of skills, ability, and knowledge needed to perform a specific task at a specified criterion.

Compliance Certification is the document used by the institution in attesting to its determination of the extent of its compliance with each of the Core Requirements and Comprehensive Standards (SACS).

Comprehensive Standard is the part of the accrediting process that mandates a policy or procedure: that the policy or procedure is in writing, approved through appropriate institutional processes, published in appropriate institutional documents accessible to those affected by it, and implemented and enforced by the institution. An example is 3.3.1: The institution identifies expected outcomes for its educational programs and its administrative and educational support services; assesses whether it achieves these outcomes; and provides evidence of improvement based on analysis of those results.

Core Requirements are basic qualifications that an institution must meet to be accredited with the Commission on Colleges. An example is 2.12: The institution has developed an acceptable Quality Enhancement Plan and demonstrates that the plan is part of an ongoing planning and evaluation process. (Quality Enhancement Plan)

COURSE ASSESSMENT: assessment of student learning outcomes at the course level.

CRITERIA**: guidelines, rules, characteristics, or dimensions that are used to judge the quality of student performance. Criteria indicate what we value in student responses, products, or performances. They may be holistic, analytic, general, or specific. Scoring rubrics are based on criteria and define what the criteria mean and how they are used.

CRITERION-REFERENCED ASSESSMENT**: an assessment in which an individual's performance is compared to a specific learning objective or performance standard and not to the performance of other students. Criterion-referenced assessment tells us how well students are performing on specific goals or standards rather than just how their performance compares to a norm group of students nationally or locally. In criterion-referenced assessments, it is possible that none, or all, of the examinees will reach a particular goal or performance standard. (A short illustrative sketch follows this group of entries.)

DEVELOPMENTAL EDUCATION: a term used to describe basic skills/remedial courses and support systems (e.g., placement testing and placement, counseling/advising, and such academic support services as tutoring, learning centers, and computer-assisted instruction, or CAI). (At COD: one of the academic divisions, focusing on adult basic education, non-credit ESL, and GED (high school equivalency).)
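Because the criterion-referenced definition above turns on comparing each student to a fixed standard rather than to other students, a brief sketch may help. It is not part of the original glossary; the cutoff and scores are hypothetical.

```python
# Hypothetical criterion-referenced judgment: every student is compared to the
# same performance standard, not ranked against classmates, so none or all of
# them can meet it.

PERFORMANCE_STANDARD = 75  # invented cutoff for "meets the learning objective"

scores = {"student_a": 82, "student_b": 74, "student_c": 91}

for student, score in scores.items():
    status = "meets standard" if score >= PERFORMANCE_STANDARD else "does not yet meet standard"
    print(f"{student}: {score} -> {status}")

# One student's result does not depend on anyone else's score, which is the
# defining contrast with NORM-REFERENCED ASSESSMENT later in the glossary.
```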

DIRECT ASSESSMENT: the measurement of actual student learning, competency, or performance. Examples include essays, tests, speeches, recitals, capstone experiences, and portfolios.

DOMAIN: a set of skills or sub-skills in a particular educational area; for example, the specific skills that make up algebra or critical thinking.

EMBEDDED ASSESSMENT: a method of sampling that allows broad assessment activities to be carried out within the course structure by embedding those activities within the course content, syllabus, and assessment/grading practices rather than conducting them separately from the course. This encourages students to be motivated and to perform to the best of their abilities.

EQUITY: the extent to which an institution or program achieves a comparable level of outcomes, direct and indirect, for the various groups of enrolled students.

GENERAL EDUCATION: the content, skills, and learning outcomes expected of students who achieve a college degree regardless of program or major. This includes both skills in such areas as writing, critical thinking, problem solving, quantitative reasoning, and information competency, and content knowledge in a spectrum of learning outcomes including communications, arts, humanities, mathematics, sciences, and social sciences.

HOLISTIC SCORING: a scoring process in which a score is based on an overall rating or judgment of a finished product compared to an agreed-upon standard for that task.

INDIRECT ASSESSMENT: the measurement of variables from which student learning is assumed, such as retention/persistence, transfer and graduation rates, advisory boards, and surveys.

INPUT: the demographics and skills students bring with them as they enter a course, program, or institution.

INSTITUTIONAL EFFECTIVENESS: a term used by various components of the institution, or the institution itself, to review how effectively goals are achieved.

ITEM: an individual question or exercise in an assessment or evaluative instrument.

LONGITUDINAL COHORT ANALYSIS: a form of evaluation or assessment in which a particular group (cohort) is defined on a set of predetermined criteria and followed over time (longitudinal) on one or more variables. (A short illustrative sketch follows this group of entries.)

MATRICULATION: a process to assist entering college students to be successful, including admissions, registration, orientation, placement testing, counseling, and evaluation.
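The LONGITUDINAL COHORT ANALYSIS entry above, and the persistence-style measures listed under INDIRECT ASSESSMENT, both rest on following a defined group over time. The sketch below is not part of the original glossary; the cohort definition, terms, and enrollment records are invented.

```python
# Hypothetical longitudinal cohort analysis: define a cohort by predetermined
# criteria (first-time students entering in Fall 2023) and follow one variable
# (term-to-term persistence) over time.

students = [
    {"id": 1, "first_term": "Fall 2023", "terms_enrolled": ["Fall 2023", "Spring 2024", "Fall 2024"]},
    {"id": 2, "first_term": "Fall 2023", "terms_enrolled": ["Fall 2023"]},
    {"id": 3, "first_term": "Spring 2024", "terms_enrolled": ["Spring 2024", "Fall 2024"]},
]

# Cohort definition: only students whose first term matches the criterion.
cohort = [s for s in students if s["first_term"] == "Fall 2023"]

# Follow the cohort across terms and report the persistence rate each term.
for term in ["Fall 2023", "Spring 2024", "Fall 2024"]:
    persisting = sum(1 for s in cohort if term in s["terms_enrolled"])
    rate = persisting / len(cohort)
    print(f"{term}: {persisting}/{len(cohort)} of the cohort enrolled ({rate:.0%})")
```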

NORM-REFERENCED ASSESSMENT: an assessment in which student performance or performances are compared to a larger group. Usually the larger group, or "norm group," is a national sample representing a wide and diverse cross-section of students. Students, schools, districts, and even states are compared or rank-ordered in relation to the norm group. The purpose of a norm-referenced assessment is usually to sort students rather than to measure achievement toward some criterion of performance.

OPEN-RESPONSE ITEMS: items requiring short written answers.

OUTCOME: results; what is expected to be produced after certain services or processes. (See STUDENT LEARNING OUTCOMES below.)

OUTPUT*: anything an institution or system produces; a value-neutral quantity measure, usually expressed in terms of volume of work accomplished, and often confused with a measure of the quality of degrees, research, student services, etc.

PERSISTENCE: the ongoing enrollment of students over multiple semesters or terms.

PERFORMANCE-BASED ASSESSMENT (also known as authentic assessment): items or tasks that require students to apply knowledge in real-world situations.

PERFORMANCE INDICATORS: a set of measures that are used to evaluate and report performance.

PLACEMENT: the counseling/advising process, using multiple variables, usually including the results of a placement test, to assist entering college students in enrolling in beginning college courses, especially remedial/basic skills courses.

PLACEMENT TESTING: the process of assessing the basic skills proficiencies or competencies of entering college students.

PLANNING UNIT: a subunit of the organization that is linked together by skill, specialty, function, or purpose to concentrate on specific aspects of the institution's mission (e.g., student affairs, financial aid, the nursing program). This term may be used interchangeably with variations of terms linked to program review and program outcomes.

PORTFOLIO: a representative collection of a student's work, including some evidence that the student has evaluated the quality of his or her own work. A method of evaluating the work is important, as is determining the reasons the student has chosen the work included in the portfolio.

Principles of Accreditation: Foundations for Quality Enhancement is the primary source document describing the accreditation standards and process. Participants in the review process should consult it throughout the accreditation process. It contains the Core Requirements and Comprehensive Standards with which institutions must comply in order to be granted candidacy, initial accreditation, or reaffirmation. The Principles of Accreditation contains four sections: Section 1, Principles and Philosophy of Accreditation; Section 2, Core Requirements; Section 3, Comprehensive Standards; and Section 4, Federal Regulations for Title IV Funding.

PROGRAM ASSESSMENT: assessing the student learning outcomes or competencies of students in achieving a certificate or degree, beyond basic skills and general education.

PROGRAM OUTCOMES: the results of the planning process in which criteria for success were defined and assessed by institutional planning units to determine effectiveness.

PROGRAM REVIEW: a process of systematic evaluation of multiple variables of program effectiveness, including assessment of the student learning outcomes of an instructional or student services program or other institutional unit, as determined by the institution (see PLANNING UNIT).

PROMPT: a short statement or question that provides students a purpose for writing; also used in areas other than writing.

Quality Enhancement Plan (QEP): a document developed by the institution that describes a course of action for institutional improvement crucial to enhancing educational quality and directly related to student learning. The QEP is based upon a comprehensive analysis of the effectiveness of the institution in supporting student learning and accomplishing the mission of the institution.

RATER: a person who evaluates or judges student performance on an assessment against specific criteria.

RATER TRAINING: the process of educating raters to evaluate student work and produce dependable scores. Typically, this process uses anchors to acquaint raters with criteria and scoring rubrics. Open discussions between raters and the trainer help to clarify scoring criteria and performance standards, and provide opportunities for raters to practice applying the rubric to student work. Rater training often includes an assessment of rater reliability that raters must pass in order to score actual student work.

RELIABILITY**: the degree to which the results of an assessment are dependable and consistently measure particular student knowledge and/or skills. Reliability is an indication of the consistency of scores across raters, over time, or across different tasks or items that measure the same thing. Thus, reliability may be expressed as (a) the relationship between test items intended to measure the same skill or knowledge (item reliability), (b) the relationship between two administrations of the same test to the same student or students (test/retest reliability), or (c) the degree of agreement between two or more raters (rater reliability). An unreliable assessment cannot be valid. (A short illustrative sketch follows this group of entries.)

RETENTION: in California community colleges, the completion of a course or semester (called course completion outside of California). Outside of California, used in the same manner as persistence: the re-enrollment of students over multiple semesters or terms.

RUBRIC: a set of scoring guidelines for evaluating students' work. Typically a rubric will consist of a scale used to score students' work on a continuum of quality or mastery. Descriptors provide standards or criteria for judging the work and assigning it to a particular place on the continuum. Rubrics make explicit the standards by which a student's work is to be judged and the criteria on which that judgment is based. http://www.ncsu.edu/midlink/ho.html This site provides examples and good information about rubrics.

SCAFFOLDING: giving support in order to help the performance of a task, with the support gradually faded. This contrasts with modeling (presenting a desired behavior or process so that it can be imitated by the learner) and coaching (support to help the performance of a task, aimed at improving the performance of the learner).

SCALE: values given to student performance. Scales may be applied to individual items or performances; for example, checklists (yes or no), numerical scales (1-6), or descriptive scales (e.g., "the student presented multiple points of view to support her essay"). Scaled scores occur when participants' responses to any number of items are combined and used to establish and place students on a single scale of performance.

STANDARDIZATION: a consistent set of procedures for designing, administering, and scoring an assessment. The purpose of standardization is to assure that all students are assessed under the same conditions, so that their scores have the same meaning and are not influenced by differing conditions. Standardized procedures are very important when scores will be used to compare individuals or groups.

STUDENT LEARNING OUTCOMES (SLO): the competencies and skills expected of students as they complete a course, program, or institution.

STANDARD: a predetermined criterion of a level of student performance; a measure of competency set by experts representing a variety of constituents (e.g., employers, educators, students, community members). The criterion (standard) may be set within the institution or externally by industry/employers.

TASK: an activity, exercise, or question requiring students to solve a specific problem or demonstrate knowledge of specific topics or processes.
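Because the RELIABILITY entry above singles out rater reliability, the degree of agreement between two or more raters, a minimal sketch follows. It is not part of the original glossary; the scores are invented, and the exact-agreement rate shown is only the simplest of several possible indices.

```python
# Hypothetical rater reliability check: two trained raters score the same set
# of portfolios on a 1-5 rubric, and we report the exact-agreement rate.
# (Real studies often also use correlation or kappa-style statistics.)

rater_1 = [4, 3, 5, 2, 4, 3]
rater_2 = [4, 3, 4, 2, 4, 3]

agreements = sum(1 for a, b in zip(rater_1, rater_2) if a == b)
agreement_rate = agreements / len(rater_1)
print(f"exact agreement: {agreements}/{len(rater_1)} ({agreement_rate:.0%})")
```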

VALIDITY: the extent to which an assessment measures what it is supposed to measure, and the extent to which inferences and actions made on the basis of test scores are appropriate and accurate. For example, if a student performs well on a reading test, how confident are we that that student is a good reader? A valid standards-based assessment is aligned with the standards intended to be measured, provides an accurate and reliable estimate of students' performance relative to the standard, and is fair. An assessment cannot be valid if it is not reliable.

VALUE ADDED*: a comparison of the knowledge, skills, and developmental traits that students bring to the educational process with the knowledge, skills, and developmental traits they demonstrate upon completion of the educational process. (A short illustrative sketch follows the glossary.)

These terms and their definitions were derived from multiple public-domain sources, including the National Postsecondary Education Cooperative (nces.ed.gov/npec); the CRESST Glossary, Graduate School of Education, UCLA; and SACS.
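As a closing illustration of the VALUE ADDED entry, the sketch below compares what hypothetical students demonstrate on entry with what they demonstrate at completion. It is not part of the original glossary, and the scores are invented.

```python
# Hypothetical value-added comparison: the same skill measure taken when
# students enter and when they complete the educational process.

entry_scores = {"student_a": 58, "student_b": 64, "student_c": 71}
exit_scores = {"student_a": 79, "student_b": 70, "student_c": 88}

for student in entry_scores:
    gain = exit_scores[student] - entry_scores[student]
    print(f"{student}: {entry_scores[student]} -> {exit_scores[student]} (gain {gain})")

average_gain = sum(exit_scores[s] - entry_scores[s] for s in entry_scores) / len(entry_scores)
print(f"average gain across the group: {average_gain:.1f} points")
```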