NOTES TO PROGRAMS Spring 2015 SPECIAL SSR EDITION


WHY THE SSR (SELF-STUDY REPORT)?

The cornerstone of ongoing PA program accreditation by the ARC-PA rests on continual program self-assessment and the ability of accredited programs to have a robust and systematic process of ongoing self-assessment to review the quality and effectiveness of their educational practices, policies and outcomes. Programs are asked to assess their compliance with the Standards, to identify areas in need of attention to bring the program into compliance and to take appropriate actions toward that end.

STANDARDS, 4TH EDITION, SECTION C, EVALUATION

Section C of the Standards includes the following introduction:

INTRODUCTION

It is important for programs to have a robust and systematic process of ongoing self-assessment to review the quality and effectiveness of their educational practices, policies and outcomes. This process should be conducted within the context of the mission and goals of both the sponsoring institution and the program, using the Accreditation Standards for Physician Assistant Education (Standards) as the point of reference. A well-developed process occurs throughout the academic year and across all phases of the program. It critically assesses all aspects of the program relating to sponsorship, resources, students, operational policies, curriculum and clinical sites. The process is used to identify strengths and weaknesses and should lead to the development of plans for corrective intervention with subsequent evaluation of the effects of the interventions.

Standards sections C1 and C2 address the process of self-assessment, and the self-study report (SSR) documents the results of that process.

C1 ONGOING PROGRAM SELF-ASSESSMENT

C1.01 The program must implement an ongoing program self-assessment process that is designed to document program effectiveness and foster program improvement.
ANNOTATION: A well-designed self-assessment process reflects the program's ability to collect and interpret evidence of student learning, as well as program administrative functions and outcomes. The process incorporates the study of both quantitative and qualitative performance data collected and critically analyzed by the program. The process provides evidence that the program gives careful thought to data collection, management and interpretation. It shows that outcome measures are used in concert with thoughtful evaluation of the results, the relevance of the data and the potential for improvement or change.

C1.02 The program must apply the results of ongoing program self-assessment to the curriculum and other dimensions of the program.

C2 SELF-STUDY REPORT

C2.01 The program must prepare a self-study report as part of the application for continuing accreditation that accurately and succinctly documents the process and results of ongoing program self-assessment. The report must follow the guidelines provided by the ARC-PA and, at a minimum, must document:

a) the program process of ongoing self-assessment,
b) results of critical analysis from the ongoing self-assessment,
c) faculty evaluation of the curricular and administrative aspects of the program,
d) modifications that occurred as a result of self-assessment,
e) self-identified program strengths and areas in need of improvement and
f) plans for addressing areas needing improvement.

ANNOTATION: The ARC-PA expects results of ongoing self-assessment to include critical analysis of student evaluations for each course and rotation, student evaluations of faculty, failure rates for each course and rotation, student remediation, student attrition, preceptor evaluations of students' preparedness for rotations, student exit and/or graduate evaluations of the program, the most recent five-year first-time and aggregate graduate performance on the PANCE, sufficiency and effectiveness of faculty and staff, and faculty and staff attrition.

DATA ANALYSIS AND THE SSR

Analysis is defined as the study of compiled or tabulated data to interpret cause and effect relationships and trends, with the subsequent understanding and conclusions used to validate current practices or to make changes as needed for program improvement. There are four key elements of analysis.

1) The first element is the regular and ongoing collection of data. For ease of use and interpretation, the collected data must be clearly displayed in tables and charts.

2) The second element is the analysis of data. This includes discussing and interpreting the cause and effect relationships and trends, relating the data to the expectations or issues of the program.
This is to be demonstrated by succinctly written narratives that highlight the cause and effect relationships and trends.

3) The third element is the application of results and the development of conclusions based on the study and discussion of the data. These must be succinctly stated.

4) The fourth element is the development of an action plan to operationalize the conclusions. Action plans, too, must be succinctly stated.

Within the SSR, programs are asked to "Provide Narrative about the analysis (interpretations and conclusions) based on data collected and displayed." In relation to this narrative, the ARC-PA expects the program to use the data it has collected and placed in the tables and templates (as provided by the ARC-PA or as provided by the program, if so asked) to discuss and interpret the cause and effect relationships and trends, relating the data to the expectations or identified issues or concerns of the program. It expects the program to draw conclusions based on and related to the data and the relationships of the data to the program expectations, issues or concerns.

Programs also are asked to "Provide Narrative detailing the actions (modifications or non-modifications) taken based on the analysis." The ARC-PA expects the program to present the modifications or non-modifications it has chosen to make based on the conclusions it has drawn, as conveyed in the earlier question. It expects these to be supported by the program's analysis of data.
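The four key elements can be illustrated with a minimal sketch. Everything below is hypothetical: the pass rates, the 85% benchmark and the wording of the conclusion and action are invented for illustration only, and the ARC-PA prescribes no particular tooling.

```python
# Element 1: regular and ongoing collection of data
# (hypothetical first-time PANCE pass rates by cohort year).
pance_first_time = {2010: 0.96, 2011: 0.94, 2012: 0.91, 2013: 0.88, 2014: 0.84}

# Element 2: analysis of the data -- year-over-year changes and overall trend.
years = sorted(pance_first_time)
changes = [pance_first_time[b] - pance_first_time[a]
           for a, b in zip(years, years[1:])]
downward_trend = all(c < 0 for c in changes)

# Element 3: conclusions drawn from the data.
BENCHMARK = 0.85  # program-defined minimum (illustrative assumption)
below_benchmark = pance_first_time[years[-1]] < BENCHMARK
conclusion = ("Sustained downward trend; latest cohort below benchmark."
              if downward_trend and below_benchmark
              else "No action triggered by this measure.")

# Element 4: an action plan operationalizing the conclusion.
action = ("Review curriculum mapping against the PANCE blueprint; reassess next cycle."
          if downward_trend and below_benchmark
          else "Continue routine monitoring.")

print(conclusion)
print(action)
```

The point of the sketch is the sequence, not the arithmetic: data are collected and displayed, the trend is interpreted, a conclusion is stated, and an action plan follows from it, exactly the chain the SSR narrative is expected to document.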

RESPONSIBILITY FOR ANALYSIS

It is the program's responsibility to demonstrate compliance with the Standards. Programs are expected to document analysis in a clear, coherent, succinct narrative that shows the cause and effect relationships and trends used to arrive at the conclusions and plans. It is not the obligation of the site visitors or commissioners to combine fragments of data and sentences that may represent analysis into a coherent demonstration of compliance.

WHEN IS THE SSR REQUIRED?

All accredited programs must submit an SSR as one component of an accreditation application. All programs scheduled for a comprehensive validation review for continuing accreditation must submit an SSR two years in advance of the program validation review by the commission. This SSR 2-years out is reviewed by the ARC-PA. A post-review letter is sent to the program and institution indicating specific expectations to be addressed by the program when it submits the SSR that accompanies the accreditation application.

THE SSR APPENDICES REQUIREMENTS

The SSR report format for the Standards, 4th edition, includes multiple data templates as well as specific questions related to the analysis and actions for each topical area. Data templates have been developed to address areas within the annotations of the C1 and C2 standards. All programs are required to provide data over a three- to four-year period of time. In addition to the tabular presentation of data, programs are asked at least the two narrative questions above (about analysis and actions) and may be asked others based on the data requested in each template. Responses to questions should be clear and succinct. Responses within an SSR submitted as a component of an accreditation application may refer to other specifically referenced parts of the application only if appropriate (when in doubt, the program should contact the ARC-PA office).
Detailed additional data from which the analysis and actions stem are NOT to be included in the application except as noted in the application materials. Paper copies of each document supporting compliance must be readily available for site visitors at the time of the site visit and as requested by the commission. If documents are posted on the web, the specific web address for each document supporting compliance also must be available.

Responses within an SSR submitted as an SSR 2-years out must be freestanding. Since there is no application to accompany the SSR, responses must be clear in their own right. References to past materials submitted to the ARC-PA are inappropriate.

In addition to the data required in the SSR, programs should provide only enough data to support pertinent conclusions in the analysis. However, all source data should be available to site visitors and should be organized to demonstrate the method of analysis used by the program. For example, comments/data could be grouped by theme or to show trends over time. Minutes from committee meetings and/or faculty meetings should reflect the program's consideration of qualitative data and decisions based upon it.

DOES THE SSR 2-YEARS OUT COUNT?

In an effort to support an ongoing and honest assessment process by programs and institutions, the SSR 2-years out is not graded. It is not used to make an accreditation decision. It is not reviewed by the commission on a commission agenda. It is not reviewed two years later when the program submits its application to include an SSR.

It is used as a point in a continuum from which plans, actions and changes can be monitored by the program and the commission.

THE SSR 2-YEARS OUT FEEDBACK LETTER

The review of the SSR 2-years out is used to provide feedback to programs as they prepare their validation visit applications and SSRs. Areas identified in the letter are used to structure the subsequent validation site visit agenda for the program. The letter addresses what the commission may expect to see or what questions the commission may want the program to address in the SSR that accompanies its application and at the time of the validation visit. This letter is provided to the site visitors and the commission in order for them to assess whether the program has addressed the specific expectations noted.

While each letter is customized based on the SSR submitted, the beginning of each letter remains the same and is included below:

The purpose of this correspondence is to provide you with comments to consider in the final preparation of your application materials with SSR due DATE. Additionally, this feedback will serve to guide you and the commissioners as they start to work with ARC-PA staff in planning the site visit agenda for your validation visit.

The feedback provided based on the review of this SSR does not address the quality of the document in its entirety and is not intended to provide consultation. It does not imply compliance or noncompliance with the Standards and is not used as a component of the application materials you will be submitting in the future. Comments provided will not be used directly to determine the outcome of the program's next validation review. However, if the commentary provided includes specific actions to be taken by the program in the presentation or content of the next SSR submitted for the validation review, the ARC-PA expects the program to take those actions.
Areas that have not received any commentary still need to be considered by the program for analysis based on data collected and program outcomes. As always, the commission expects the program to apply the four key elements of analysis within a robust process of ongoing self-assessment. The program should not prepare any formal response to this review for submission to the ARC-PA.

Each letter ends differently, but often includes a section as below:

The commission expects the program to complete the next SSR according to the directions. It expects the report to include critical analysis of the data, discussing and interpreting the cause and effect relationships and trends and relating the data to the expectations or issues of the program. The data and analysis should logically lead to application of results and development of conclusions, resulting in an action plan to operationalize the conclusions. The commission expects the report to provide appropriate follow-up for all modifications, strengths, areas in need of improvement and plans.

ESTABLISHING BENCHMARKS

Establishing benchmarks is important to program self-assessment. Programs use internal and external evidence to establish minimum benchmarks for student performance. Likewise, programs should use evidence to establish benchmarks for their own performance based on expected program outcomes. This approach contributes to evidence-based education. Internal evidence could include mean scores, trends over time, and/or correlation to other dimensions of program/student outcomes. External evidence can include national data and/or institutional data. Some programs establish benchmarks with multiple measures, e.g., mean scores, downward trends and abrupt changes of more than a specified amount.

QUALITATIVE AND QUANTITATIVE DATA

As noted in standard C1.01, the commission expects programs to use qualitative and quantitative data in their self-assessment processes. Student/preceptor/graduate comments, focus groups and feedback from student representatives are examples of potentially valuable qualitative data. Programs should define a method for analyzing qualitative data. Such methods could include summarizing comments with analysis by the number or percentage of comments with a specific theme, or noting trends in comments over time. These methods bring a quantitative aspect to qualitative data. Qualitative data also is filtered through the lens of the faculty's collective knowledge and experience, since faculty may have a different perspective than students. Programs aren't expected to adopt modifications based solely on qualitative feedback from students or other stakeholders. This filtering can be described as part of the program's self-assessment process and explained in the narrative.

SSR RELATED RESOURCES

In addition to these Notes, several resources related directly or indirectly to the SSR are available on the ARC-PA web site Accreditation Resources Page:
- The PowerPoint handout from the ARC-PA presentation at the PAEA Memphis meeting in October 2013 about Program-Defined Expectations as they relate to the Standards.
- The Data Analysis Resource (May 2015) document addressing the components of data analysis as they relate to the Standards, 4th edition, and the Self-Study Report.
- A listing of parameters to be considered and correlated in relation to PANCE outcomes for the SSR or for required PANCE reports to the ARC-PA.
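The multiple-measure benchmarks and the theme-percentage method for qualitative data described above can be sketched as follows. This is a hypothetical illustration only: the threshold values, the helper names and the coded comments are invented, and a program would define its own measures and coding scheme.

```python
from collections import Counter

# Multiple-measure benchmark check: mean score, downward trend,
# and abrupt change of more than a specified amount (all illustrative).
def flag_measure(values, minimum, abrupt_drop):
    """Return the benchmark flags triggered by a series of yearly values."""
    flags = []
    if sum(values) / len(values) < minimum:
        flags.append("mean below benchmark")
    if all(b < a for a, b in zip(values, values[1:])):
        flags.append("downward trend")
    if any(a - b > abrupt_drop for a, b in zip(values, values[1:])):
        flags.append("abrupt change")
    return flags

# Bringing a quantitative aspect to qualitative data: percentage of
# comments per theme, using comments already coded by faculty reviewers.
def theme_percentages(coded_comments):
    counts = Counter(theme for _, theme in coded_comments)
    total = len(coded_comments)
    return {theme: round(100 * n / total, 1) for theme, n in counts.items()}

flags = flag_measure([88, 86, 79], minimum=85, abrupt_drop=5)
pct = theme_percentages([("more cadaver lab time", "anatomy resources"),
                         ("lab access limited", "anatomy resources"),
                         ("great preceptor", "clinical experience")])
```

With the invented numbers above, the three-year series trips all three flags, and two of the three comments fall under a single theme; either result would then be interpreted through the faculty's collective judgment before any modification is adopted, as the narrative describes.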