Guide to Institutional Effectiveness: Administrative and Student Support, Research, and Community Service Units
Dr. Sharon Huo, Associate Provost, Academic Affairs
Dr. Theresa Ennis, University Assessment Director
Outline
SACSCOC Overview
Core Requirements/Comprehensive Standards
Helpful Hints
Common Mistakes
Rubric Used by SACSCOC Evaluators
Direct Versus Indirect Measures
Benchmarking for Results
IE Guide and Template
IE Samples
Deadlines
SACSCOC Overview The Southern Association of Colleges and Schools Commission on Colleges (SACSCOC) is the regional body for the accreditation of degree-granting higher education institutions in eleven Southern states.
TTU Reaffirmation Timeline
September 10, 2015 - Compliance Certification Report (Self-Study Report)
November 2015 - Off-Site Review
January/February 2016 - QEP Report; Optional Focused Report
April 2016 - SACSCOC On-Site Review
December 2016 - Accreditation Action by SACSCOC Board of Trustees
SACSCOC Core Requirements
2.5 The institution engages in ongoing, integrated, and institution-wide research-based planning and evaluation processes that (1) incorporate a systematic review of institutional mission, goals, and outcomes; (2) result in continuing improvement in institutional quality; and (3) demonstrate the institution is effectively accomplishing its mission. (Institutional Effectiveness)
SACSCOC Comprehensive Standards (CS)
3.3.1 The institution identifies expected outcomes, assesses the extent to which it achieves these outcomes, and provides evidence of improvement based on analysis of the results in each of the following areas: (Institutional Effectiveness)
3.3.1.1 educational programs, to include student learning outcomes
3.3.1.2 administrative support services
3.3.1.3 academic and student support services
3.3.1.4 research within its mission, if appropriate
3.3.1.5 community/public service within its mission, if appropriate
Most Cited Criteria, 2011-2012 (Off-Site Review, N=157)*
1. CS 3.7.1 Faculty competence - 93.0%
2. CS 3.3.1.1 IE: educational programs - 65.6%
3. CR 2.11.1 Financial resources - 55.4%
4. CS 3.3.1.3 IE: academic and student support services - 54.8%
5. CS 3.3.1.2 IE: administrative support services - 54.6%
6. CS 3.3.1.5 IE: community/public service - 53.5%
7. CS 3.5.1 General education competencies - 52.9%
8. CR 2.8 Faculty - 47.8%
9. CS 3.4.11 Academic program coordination - 47.8%
10. CS 3.7.2 Faculty evaluation - 44.6%
*Preliminary data
Administrative and Student Support Assessment Plans and Reports
Well-defined/explained focus and content of your plan
Based on the needs of the unit
Clear, measurable goals/objectives
Use of assessment for program changes
Helpful Hints
Provide an initial outline or roadmap for reviewers to follow: charted overviews, policy outlines, and a summary of goals.
Organize the narrative by the key terms in the IE guidelines: ongoing, integrated, institution-wide, research-based, systematic, accomplishing mission, continuing improvement (2.5 & 3.3.1).
Reviewers look for numbers, percentages, and comparative and longitudinal data.
Combine direct and indirect measures; use multiple assessments in each area. (Research-based, 2.5)
Helpful Hints
Documentation must be ongoing and systematic: include a minimum of two cycles (the most current available) when comparing measures and making changes. (Ongoing, Systematic, 2.5)
Include proof of analysis and integration of data and changes: meeting minutes, agendas, and email discussions show that leaders have shared, discussed, analyzed, and acted upon the results. (Analysis, Integrated, 3.3.1)
Highlight the sections pointing to proof: practice, policy, product. (Evidence of Improvement, 3.3.1)
Common Mistakes
No overview or clear roadmap to guide the evaluator
Multiple formats in the documentation
Confusion between traditional/nontraditional learning and on/off-campus learning
Inconsistent names for the same program
Inconsistent terminology throughout the document
Assessments poorly aligned with goals
Mismatch between unit documentation and information in the catalog or on the website
Common Mistakes
Not enough focus on modifications and continuous improvement, or on program changes/actions resulting from assessment
Hiding behind academic jargon, or instructing reviewers on what IE is and is not
Being either not specific enough or too specific
Writing too much to cover a lack of substance
Confusing personnel evaluation with unit evaluation
Common Mistakes
Attributing a lack of consistency to a prior format, method, or person
Listing portfolios, papers, or presentations as assessments without developing a rubric for program evaluation
Failing to close the loop: modifications come from nowhere and are not tied to assessment results; no assessment results are cited (no results = no use = no improvement = no compliance); nothing is done about the assessment results that are cited
Listing only a summary of improvements: you must include the why
Rubric for Evaluation: Institutional Effectiveness Audit Form
The audit form is a spreadsheet that tracks, for each unit and for each cycle or year (Cycle/Year 1, 2, and 3), the following columns: Outcomes/Goals; Assessments; Cites Results; Use of Results/Evidence of Improvement; Analysis; and Notes & Names/Types of Assessment.
Rows are keyed to the institutional mission and grouped by standard, each group with subtotals:
3.3.1.1 Academic Programs: General Education, Undergraduate, Graduate, Nontraditional, and Professional Programs
3.3.1.2 & 3.3.1.3 Academic & Student Support Services
3.3.1.4 Research
3.3.1.5 Community/Public Service
A grand total row completes the form.
Institutional Effectiveness Audit Form, 2004 Marila Palmer; modified 2011. All rights reserved. Note: the spreadsheet includes enough room for only a few administrative and academic units; rows may be expanded to include all units.
Sample Assessments
Direct Measures:
Benchmarking/unit standards
Retention/admission rates
Internal and external reviews
Focus groups/discussion groups
Usage tracking
Budget tracking
Training tracking
Indirect Measures:
Surveys of current students
Surveys of faculty members
Surveys of internship supervisors
Surveys of graduates
Surveys of alumni
Surveys of employers
Surveys of transfer students
Student satisfaction questionnaires
Advisory board information
Codes for Categorizing Use of Results
Revised Services - Modified the way a service is offered, the frequency of the service, etc.
Revised Process - Changed a reporting form or process; changed a tabulation process
Implemented New Policy - New policy to improve service
Implemented New Process - New process added, not simply a change in an existing one
Informed Budget - Requested fiscal or human resources
Changed Assessment - Developed and implemented a new assessment method or modified an existing one
Changed Criteria - Modified the criteria for success
Consultant - Engaged someone to study and recommend changes
Create/Modify Instruction - Changed a workshop or training session in response to goal assessment
Development/Training - Provided staff development or training
Benchmarking for Results
Benchmarks provide a standard for measuring and help identify where opportunities for improvement may reside.
TTU has some university-wide data that can be used for comparisons if your area does not have its own.
Caution: setting outcomes as percent increases over time can sometimes lead to a ceiling effect in the progress of outcomes tied to those metrics.
Suggested ways to set benchmarks for comparisons (a small worked sketch of the first option follows below):
Compare to a rolling three-year average
Compare to national means on standardized tests/surveys
Compare to cohorts of peers
Others
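To illustrate the first suggestion above, here is a minimal Python sketch of comparing a unit's current-year result to a rolling three-year average benchmark. The metric name and all numbers are hypothetical placeholders, not TTU data; a spreadsheet AVERAGE formula over the prior three years accomplishes the same comparison.

```python
# Minimal sketch: compare a current-year result to a rolling
# three-year average benchmark. All values below are hypothetical.

def rolling_three_year_average(results):
    """Average of the three most recent yearly results."""
    if len(results) < 3:
        raise ValueError("need at least three years of results")
    return sum(results[-3:]) / 3

# Hypothetical yearly values for one assessment metric,
# e.g. percent of students satisfied with advising.
history = [78.0, 81.5, 80.0]   # AY 2010-11 through AY 2012-13
current = 83.2                 # AY 2013-14 result

benchmark = rolling_three_year_average(history)
print(f"Benchmark (3-yr rolling avg): {benchmark:.1f}")
print(f"Current result: {current:.1f}")
print("Met benchmark" if current >= benchmark else "Below benchmark")
```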
IE Guide and Template
Academic Year:
Administrative/Student Support Unit:
Submission Date:
Contact (Person submitting this report):
IE Guide and Template
I. Definition of Support Service Unit (mission, vision, purpose)
II. Goals/Objectives: list each goal, with objectives related to the goal if applicable
III. Assessments (related to the goals/objectives above)
IV. Rationale for Goals, Assessments, and the Process of Data Analysis (four points to cover)
V. Results: use current results compared to past results or benchmarks if applicable; report results for each goal, using appendices to extend the data if needed
VI. Modifications and Continuing Improvement: program changes due to assessments (the most important part of the report!)
VII. Improvements to the Assessment Plan
IE Samples
Alumni Relations (from Evaluator)
Disability Services
Student Government Association
Important Dates
September-October: Present guidelines
November 22, 2013: Unit/Dept IE reports due to Academic Affairs following the IE template (AY 2012-13; one Word document for each unit/dept.)
November 22, 2013 - January 10, 2014: Review of reports
January 13, 2014: Reviews returned to units/departments
February 14, 2014: Final report revisions due to Academic Affairs (one PDF for each unit/dept.)
Sources for this Information
Principles of Accreditation: Foundations for Quality Enhancement (2011), http://www.sacscoc.org/pdf/principlesofaccreditation.pdf
Through the Eyes of an Institutional Effectiveness Evaluator (2012 & 2013), Dr. Marila Palmer, SACSCOC Summer Institute Presentation, Atlanta, GA