Program/Departmental Self-Assessment Procedure and Action Plan


Stark State College of Technology
Financial Aid and Academic Records Assessment Biennial Report

Department Name: Financial Aid and Academic Records
Individual Completing Report: Amy Welty / Pamela Arrington
Date: 9/11/13

Program/Departmental Self-Assessment Procedure and Action Plan

Purpose: To self-identify the status of the program/department in the outcomes assessment process, as well as the action steps and timetable for the development of assessment processes.

Procedure: All programs and departments must complete the self-assessment process. Programs that do not demonstrate how the program/department meets each of the self-assessment criteria must submit an assessment plan documenting the proposed action steps and timelines along with the self-assessment form. A follow-up self-assessment report on the implementation of the assessment plan will be due the following academic year. Programs meeting effective assessment standards will be required to submit an assessment report on a biennial basis.

Directions: Mark the appropriate response to the Yes/No items with an X. Provide a brief summary of action steps to meet the criteria (for example, "the department will meet twice a month over the next term to develop goals"). Please note that it is critical that due diligence is given to the development of goals and associated outcome measures. Do not attempt to create goals, identify measures, and implement the assessment plan in the same term!

Assessment Criteria

1. Goals
Does the department have specific student learning or academic/student service goals which reflect the discipline or service area professional standards? Yes X

2. Outcome Measures
Are direct and indirect outcome measures identified for each goal? Yes X

3. Research
Is research systematically conducted to evaluate success or failure in achieving outcomes? Yes ___ No X
If no, what are the proposed action steps to meet the criteria?
What is the proposed timetable for the action steps?

4. Findings
Are research results analyzed and interpreted, and findings determined? Yes ___ No X
If no, what are the proposed action steps to meet the criteria?
What is the proposed timetable for the action steps?

5. Review Process
Are findings discussed and reviewed by appropriate groups and individuals, and recommendations made for action? Yes ___ No X
If no, what are the proposed action steps to meet the criteria?
What is the proposed timetable for the action steps?

6. Proposed Actions
Are recommendations acted upon? Yes ___ No X
If no, what are the proposed action steps to meet the criteria?
What is the proposed timetable for the action steps?

7. Improvements
Have actions resulted in documented improvements in student learning or academic/student services? Yes ___ No X
If no, what are the proposed action steps to meet the criteria?
What is the proposed timetable for the action steps?

Assessment Measures Inventory

Purpose: To identify benchmarked outcome measures and the benchmarking level (internal, state, national, etc.).

Instructions: Enter the appropriate response for each question. Place an X in the box that corresponds to the level/type of benchmarking data that is available for each measure. The table can be appended as needed by adding or deleting rows.

Assessment Measures for Goals (outcome measures from assessment report) | Is trend data available for the measure? (Yes or No) | Has a performance benchmark(s) been identified for the measure? (Yes or No) | Type of performance benchmark (check all that apply): SSCT (Internal); State-level (OACC, OBR, etc.); National (professional org., accrediting group, etc.)
Goal 1, Reports checklist | N | NA |
Goal 1, Employee satisfaction survey (items TBD) | N | N | X
Goal 2, Student focus groups | N | N | X
Goal 2, Faculty focus groups | N | N | X
Goal 2, Employee satisfaction survey (items TBD) | N | N | X
Goal 2, User data (reports TBD) | N | N | N
Goal 3, Web registrations report | Y | Y | X
Goal 3, ACT SOS items | Y | Y | X X
Goal 4, Reports checklist | N | NA | X X X
Goal 5, Reports checklist; written summary of activities (emails, documentation, seminars, meetings, webinars, conferences, etc.) | N | NA | X X X
Goal 6, Deadline met or not | N | NA | X X X
Goal 7, Alumni survey | N | Y | X
Goal 7, ACT SOS | Y | Y | X X
Goal 7, Orientation Survey | Y | Y | X
Goal 7, Employee satisfaction survey (items TBD) | N | N | X
Goal 7, CAS review FA | N | N | X N
Goal 7, CAS review (Registrar) | N | N | X N
Goal 7, OBR Academic Records audits (currently suspended) | NA | NA |
Goal 7, FA audits | Y | Y | Y

Technical Competencies/Student Service Goals

Goal 1: To provide information to the College community regarding at-risk students.
Goal 2: To enhance the accessibility and utilization of the student and financial aid components of the Banner system.
Goal 3: To assist students with self-sufficiency by promoting the use of mystarkstate and college email for general information and self-service.
Goal 4: To meet Ohio Board of Regents HEI system reporting deadlines.
Goal 5: To keep abreast of changes in federal, state, and institutional policies and procedures.
Goal 6: To develop and implement a records retention plan by June 30, 2012.
Goal 7: To provide high quality, efficient, and courteous services to the College community.

Summary Narrative

Over the past few years, enrollment has increased, yet staffing has increased only slightly to accommodate the growth. Consequently, we have not made other changes that would assist us in serving students better, and we have seen a decline in the level of service we provide to our students via communication, clarity of forms, and effective use of mystarkstate. The federal processes continue to be complicated, requiring us to find better ways to adequately assist students through the process. This will involve more intensive training of the staff. In addition, we need to continue to evaluate, assess, and streamline our processes internally, allowing staff the opportunity to educate students on the process rather than processing paperwork. We need to invest our time in helping students understand the process, beginning with the benefits of mystarkstate for registration and financial aid purposes. We need to look at ways to reduce our wait times on the phone and in person while still maintaining quality, student-centered service. In addition, we need to find more effective ways to communicate with students, providing them with information that will educate them on their role and responsibility in the process.

Our office is working to form stronger relationships and partnerships with the academic side of the college. We want to provide better training to faculty who assist students with advising questions and issues. The advisors need to have some general knowledge of financial aid regulations as they relate to the scheduling of classes and course completion. The whole process can be complicated when dealing with multiple regulations, but we can still provide the general knowledge and the tools necessary to effectively advise students. We will continue to seek effective ways to partner with the Deans and Department Chairs for assignment of advisors, scheduling of classes, and a comprehensive communication plan relating to curriculum changes. We will provide guidance and direction to the curriculum committee that enhances and supports student success and access.

We need to be at the forefront of providing data to the administration regarding at-risk students that can be utilized to promote success inside and outside the classroom. We need to review our systems to ensure the accuracy of data and make this our number one priority. This has to be a priority of every staff member to ensure that we report accurately to state and federal agencies.

Assessment Results Report

Purpose: The report is a summary compilation of key assessment methods, findings, review processes, actions, and improvements related to the academic/student service or learning goals of the department/unit on an annual basis. As a historical record of assessment activities, the report provides for and supports the systematic assessment of academic support outcomes.

Instructions: Enter the outcome measure in the space provided. Please note that for each goal it is expected that a mix of quantitative and qualitative as well as direct and indirect measures is employed. Mark the term of assessment with an X (for example, if a survey is conducted in the fall term, mark Fall for that measure). Provide a brief summary of key findings, either as bulleted points or in short paragraph form. Provide a brief summary of the review committee/process (for example, "Findings are reviewed by the Director and staff on a per-term basis and recommendations are forwarded to the VP for further review"). Provide a brief summary of any proposed actions for the next term/academic year. Please note that not all findings result in actions. Provide a brief summary of any improvements from the previous year (this does not apply to new measures in their first year). Finally, goals and/or outcome measures can be added (or deleted) as needed by copying and pasting.

Goal 1: To provide information to the College community regarding at-risk students.

Outcome Measure 1: Reports checklist (were the reports run)
Terms of Assessment: Summer ___ Fall ___ Spring X Annual ___
Findings: We need to communicate the at-risk students to campus partners in a timely manner.
Review Committee/Process: The Financial Aid and Academic Records Department Assessment Committee reviews the assessment results as collected.
Proposed actions for next term/academic year: Review all at-risk programs and coordinate a communication plan both internally and externally. This will allow the college to increase student retention and success.

Outcome Measure 2: Employee satisfaction survey (items TBD)
Terms of Assessment: Summer ___ Fall ___ Spring X Annual ___
Findings: We need to be more customer focused and provide more timely responses.
Review Committee/Process: Through the new enrollment management plan we will evaluate our processes and job responsibilities.
Proposed actions for next term/academic year: Review processes, communications, and equity in job responsibilities to assist with a quicker response to the college as a whole.

Goal 2: To enhance the accessibility and utilization of the student and financial aid components of the Banner system.

Outcome Measure 1: Student focus groups (did not do; will perform Fall 2013)
Terms of Assessment: Summer ___ Fall ___ Spring ___ Annual ___
Findings:
Review Committee/Process:
Proposed actions for next term/academic year:

Outcome Measure 2: Faculty focus groups
Terms of Assessment: Summer ___ Fall X Spring X Annual ___
Findings: Worked with the Department Chairs on an effective scheduling process, use and functionality of mystarkstate, reports, and curriculum changes.
Review Committee/Process: Worked with the staff and the curriculum committee to identify improvements.
Proposed actions for next term/academic year: With the new Registrar on board, we will continue to work through effective processes and communication with the academic side.

Outcome Measure 3: Employee satisfaction survey (items TBD)
Terms of Assessment: Summer ___ Fall ___ Spring X Annual ___
Findings: We need to be more customer focused and provide more timely responses.
Review Committee/Process: Through the new enrollment management plan we will evaluate our processes and job responsibilities.
Proposed actions for next term/academic year: Review processes, communications, and equity in job responsibilities to assist with a quicker response to the college as a whole.

Outcome Measure 4: User data (reports TBD); not applicable
Terms of Assessment: Summer ___ Fall ___ Spring ___ Annual ___
Findings:
Review Committee/Process:
Proposed actions for next term/academic year:

Goal 3: To assist students with self-sufficiency by promoting the use of mystarkstate and college email for general information and self-service.

Outcome Measure 1: Web registrations report (not available)
Terms of Assessment: Summer ___ Fall ___ Spring ___ Annual ___
Findings:
Review Committee/Process:
Proposed actions for next term/academic year:

Outcome Measure 2: ACT SOS items
Terms of Assessment: Summer ___ Fall X Spring ___ Annual ___
Findings: The findings were not very favorable; we saw a decline in our ratings for the services we offer.
Review Committee/Process: Reviewed with staff.
Proposed actions for next term/academic year: We need to review our processes and provide better information to our students.

Goal 4: To meet Ohio Board of Regents HEI system and other reporting deadlines.

Outcome Measure 1: Reports checklist
Terms of Assessment: Summer ___ Fall ___ Spring ___ Annual ___
Findings: During this assessment period, the reports to HEI have been completed within the timeframe posted by OBR.
Review Committee/Process: The Registrar completed a list of necessary reports to be submitted to HEI in the proper time frame.
Proposed actions for next term/academic year: This process changed at the end of the assessment cycle; HEI reporting is now being handled by Institutional Research.

Goal 5: To keep abreast of changes in federal, state, and institutional policies and procedures.

Outcome Measure 1: Written summary of activities (emails, documentation, seminars, meetings, webinars, conferences, etc.)
Terms of Assessment: Summer ___ Fall X Spring X Annual ___
Findings: The office staff attended many conferences, trainings, and webinars in order to stay abreast of changes.
Review Committee/Process: This is reviewed every year by the senior staff to determine the priorities for the year and who should attend necessary trainings.
Proposed actions for next term/academic year: Continue to investigate trainings that will benefit the staff in order to serve students effectively.

Goal 6: To develop and implement a records retention plan by June 30, 2012.

Outcome Measure 1: We have not met the deadline for this and will be working on it during the 2013-2014 year.
Terms of Assessment: Summer ___ Fall ___ Spring ___ Annual ___
Findings:
Review Committee/Process:
Proposed actions for next term/academic year:

Outcome Measure 2:
Terms of Assessment: Summer ___ Fall ___ Spring ___ Annual ___
Findings:
Review Committee/Process:
Proposed actions for next term/academic year:

Goal 7: To provide high quality, efficient, and courteous services to the College community.

Outcome Measure 1: Alumni survey
Terms of Assessment: Summer ___ Fall ___ Spring ___ Annual ___
Findings: Overall, alumni are satisfied with the staff.
Review Committee/Process: Senior staff reviewed the report.
Proposed actions for next term/academic year: We need to continue to provide more outreach to alumni and find ways to utilize them to assist current students in getting through hurdles toward graduation.

Outcome Measure 2: ACT SOS
Terms of Assessment: Summer ___ Fall X Spring ___ Annual ___
Findings: The findings were not very favorable; we saw a decline in our ratings for the services we offer.
Review Committee/Process: Reviewed with staff.
Proposed actions for next term/academic year: We need to review our processes and provide better information to our students.

(Repeat Goals and Outcome Measures format as needed.)

Outcome Measure 3: Orientation Survey
Terms of Assessment: Summer ___ Fall X Spring X Annual ___
Findings: Over the past couple of years, the survey has shown satisfaction levels under 80%.
Review Committee/Process: Senior staff reviewed the surveys.
Proposed actions for next term/academic year: We need to review the communications to students, be clearer with our forms, and review the portal to provide a more meaningful step-by-step process for our students.

Outcome Measure 4: Employee satisfaction survey (see previous statement)
Terms of Assessment: Summer ___ Fall ___ Spring ___ Annual ___
Findings: We need to be more customer focused and provide more timely responses.
Review Committee/Process: Through the new enrollment management plan we will evaluate our processes and job responsibilities.
Proposed actions for next term/academic year: Review processes, communications, and equity in job responsibilities to assist with a quicker response to the college as a whole.

Outcome Measure 5: CAS review FA
Terms of Assessment: Summer ___ Fall ___ Spring ___ Annual X
Findings: There are some areas we need to focus on in terms of not meeting the specified standards.
Review Committee/Process: Reviewed by the Registrar and the Dean of the department.
Proposed actions for next term/academic year: We will look at each item and determine a course of action that will allow us to fall in line with the CAS standards.

Outcome Measure 6: CAS review Registrar
Terms of Assessment: Summer ___ Fall ___ Spring ___ Annual X
Findings: There are some areas we need to focus on in terms of not meeting the specified standards.
Review Committee/Process: Reviewed by the Registrar and the Dean of the department.
Proposed actions for next term/academic year: We will look at each item and determine a course of action that will allow us to fall in line with the CAS standards.

Outcome Measure 7: OBR audits (currently suspended)
Terms of Assessment: Summer ___ Fall ___ Spring ___ Annual ___
Findings:
Review Committee/Process:
Proposed actions for next term/academic year:

Outcome Measure 8: FA audits
Terms of Assessment: Summer ___ Fall ___ Spring X Annual ___
Findings: There were zero findings with the OBR audit.
Review Committee/Process: Amy Welty
Proposed actions for next term/academic year: Continue to ensure that we have data accuracy and data integrity.

Assessment Report Review Rubric

Purpose: A rubric is a guide that differentiates between levels of development in outcomes assessment. The rubric is designed to clearly show departments/units how the assessment report will be evaluated and where further action may be needed.

Directions: Mark the response to each item. If any item is not completed in its entirety, the appropriate response is No. An Assessment Report review committee will use the same rubric to evaluate your assessment report.

Are the goals for the department/service area measurable? Yes X
Comments:

Is a mix of quantitative and qualitative measures used to assess outcomes for each goal? Yes X
Comments:

Was research conducted and findings determined for each goal? Yes ___ No X
Comments: Some of the goals were no longer relevant to our area and will be excluded as future goals. Also, one of our goals was not accomplished and we will be moving it to the next assessment cycle.

Is there a review process in place for the department/service area? Yes X
Comments:

Are action steps outlined where applicable? Yes X
Comments:

Was the self-assessment and action plan completed? Yes X
Comments:

Was the assessment measures inventory completed? Yes X
Comments:

Key Assessment Terms

Competencies/Goals are clear, meaningful statements of purpose or aspirations for the academic program or support service. Programs and services typically have several goals.

Outcome Measures are direct or indirect measures of student learning or of support services. Direct measures provide evidence of actual learning, e.g., a paper, exam, or artistic performance. Indirect measures provide evidence about characteristics associated with learning, e.g., student perception surveys, focus group interviews, and alumni surveys. See below for detailed examples.

Research is the systematic collection and evaluation of outcomes data.

Findings are the results of research.

Review Process is the method(s) by which findings are discussed and reviewed by faculty, staff, and administrators.

Proposed Actions are the result of the review process and are based on findings.

Improvements are positive changes in student learning or support services as noted through the assessment process. It takes at least two iterations of the research and review process to document systematic improvement.

Examples of Direct Measures of Student Learning/Services

- Scores and pass rates on standardized tests (licensure/certification as well as other published tests determining key student learning outcomes)
- Writing samples
- Score gains indicating the value added to students' learning experiences by comparing entry and exit tests (either published or locally developed) as well as writing samples
- Locally designed quizzes, tests, and inventories
- Portfolio artifacts (these artifacts could be designed for introductory, working, or professional portfolios)
- Capstone projects (these could include research papers, presentations, theses, dissertations, oral defenses, exhibitions, or performances)
- Case studies
- Team/group projects and presentations
- Oral examinations
- Internships, clinical experiences, practica, student teaching, or other professional/content-related experiences engaging students in hands-on experiences in their respective fields of study (accompanied by ratings or evaluation forms from field/clinical supervisors)
- Service-learning projects or experiences
- Authentic and performance-based projects or experiences engaging students in opportunities to apply their knowledge to the larger community (accompanied by ratings, scoring rubrics, or performance checklists from the project/experience coordinator or supervisor)
- Graduates' skills in the workplace rated by employers
- Online course asynchronous discussions analyzed by class instructors

Whenever appropriate, scoring keys help identify the knowledge, skills, and/or dispositions assessed by means of the particular assessment instrument, thus documenting student learning directly.

Examples of Indirect Measures of Student Learning/Services

- Course grades provide information about student learning indirectly for a series of reasons: a) because they reflect student performance or achievement at the level of an individual class, such grades do not represent an indication of learning over a longer period than the duration of that particular class or across different courses within a program; b) grading systems vary from class to class; and c) grading systems in one class may be used inconsistently from student to student
- Grades assigned to student work in one particular course also provide information about student learning indirectly for the reasons mentioned above. Moreover, graded student work in isolation, without an accompanying scoring rubric, does not lead to relevant meaning related to overall student performance or achievement in a class or a program
- Comparison between admission and graduation rates
- Number or rate of graduating students pursuing their education at the next level
- Reputation of graduate or post-graduate programs accepting graduating students
- Employment or placement rates of graduating students into appropriate career positions
- Course evaluation items related to overall course or curriculum quality, rather than instructor effectiveness
- Number or rate of students involved in faculty research, collaborative publications and/or presentations, service learning, or extension of learning in the larger community
- Surveys, questionnaires, open-ended self-reports, focus-group or individual interviews dealing with current students' perception of their own learning
- Surveys, questionnaires, focus-group or individual interviews dealing with alumni's perception of their own learning or of their current career satisfaction (which relies on their effectiveness in the workplace, influenced by the knowledge, skills, and/or dispositions developed in school)
- Surveys, questionnaires, focus-group or individual interviews dealing with faculty and staff members' perception of student learning as supported by the programs and services provided to students
- Quantitative data, such as enrollment numbers
- Honors, awards, scholarships, and other forms of public recognition earned by students and alumni

[Adapted from Maki, P. L. (2004). Assessing for learning: Building a sustainable commitment across the institution. Sterling, VA: AAHE; and Suskie, L. (2004). Assessing student learning: A common sense guide. San Francisco, CA: Anker Publishing Company, Inc.]