MEDICAL EDUCATION

Five-Year Summary of COMLEX-USA Level 2-PE Examinee Performance and Survey Data

Erik E. Langenau, DO; Caitlin Dyer, MA; William L. Roberts, EdD; Crystal Wilson, MEd; and John Gimpel, DO, MEd

The authors present data on examination format and examinee demographics, performance, and survey results for the Comprehensive Osteopathic Medical Licensing Examination-USA Level 2-Performance Evaluation (COMLEX-USA Level 2-PE) from the first five testing cycles (2004-2005 to 2008-2009). First-time examinees in the 2004-2005 testing cycle had a pass rate of 96.1%, compared with a pass rate of 94.7% for first-time examinees in the 2008-2009 testing cycle. Pass rates were fairly consistent across all testing cycles. Based on postexamination survey results from all testing cycles, the majority of examinees reported that the cases in COMLEX-USA Level 2-PE represented appropriate challenges for fourth-year osteopathic medical students. The majority of examinees also reported that comprehensive standardized patient-based examinations and exercises were administered through their colleges of osteopathic medicine. In addition, survey results indicated overall satisfaction among examinees with the administration of COMLEX-USA Level 2-PE.

J Am Osteopath Assoc. 2010;110(3):114-125

From the National Board of Osteopathic Medical Examiners (NBOME) National Center for Clinical Skills Testing in Conshohocken, Pennsylvania. Financial Disclosures: None reported. Address correspondence to Erik E. Langenau, DO, Vice President for the National Center for Clinical Skills Testing, NBOME, 101 W Elm St, Suite 150, Conshohocken, PA 19428-2004. E-mail: elangenau@nbome.org. Submitted August 24, 2009; accepted October 7, 2009.

June 25, 2009, marked the end of the fifth testing cycle for the Comprehensive Osteopathic Medical Licensing Examination-USA Level 2-Performance Evaluation (COMLEX-USA Level 2-PE), administered by the National Board of Osteopathic Medical Examiners (NBOME). This examination was developed to assess the fundamental clinical skill competence of osteopathic medical students who are preparing to enter graduate medical education and working to obtain medical licensure.

Testing medical students for clinical skills competence using the objective structured clinical examination format was first described in Scotland in 1975.1 Objective structured clinical examinations and standardized patient-based examinations, including simulations by standardized patients (SPs) portraying typical medical complaints within an ambulatory setting, have since become the methods of choice for assessing the clinical skills of medical students.2-7 Since 2004, the COMLEX-USA Level 2-PE has been used to fulfill NBOME's mission of protecting the public and enhancing patient safety through the evaluation of the clinical skills proficiency of graduates from colleges of osteopathic medicine (COMs). All students graduating from COMs after January 1, 2005, have been required to pass COMLEX-USA Level 2-PE in order to complete the entire COMLEX-USA series.8 In addition, the American Osteopathic Association (AOA) Commission on Osteopathic College Accreditation (COCA) has instituted a policy requiring all osteopathic medical students graduating after December 1, 2007, to pass both the PE and the Cognitive Evaluation (CE, a multiple-choice test format) components of COMLEX-USA Level 2 before graduation.9
In the present report, we use data from the first five testing cycles (2004-2005 to 2008-2009) to describe the format of and requirements for COMLEX-USA Level 2-PE; the demographic characteristics of examinees; the pass/fail rates and trends in examinee performance on the two domains of COMLEX-USA Level 2-PE (the Humanistic Domain and the Biomedical/Biomechanical Domain); and the survey responses by examinees regarding the examination and their previous exposure to standardized patient encounters.

Format

Standardized patient-based examinations and objective structured clinical examinations are routinely used for formative assessments in osteopathic and allopathic medical schools7,10-13 and graduate medical education programs,14-17 as well as for summative assessments in medical schools12,18,19 and licensure.4 During COMLEX-USA Level 2-PE, examinees rotate through 12 stations in which they evaluate SPs who have been trained to simulate a variety of typical clinical presentations. Each of the 12 stations includes a 14-minute patient-physician encounter (ie, case), followed by a 9-minute period in which the examinee completes a written note about the encounter, the SOAP (Subjective, Objective, Assessment, Plan) Note.

Cases are assigned under blueprint content categories (eg, cardiovascular, gastrointestinal, neuromusculoskeletal, respiratory, other clinical presentations) and are designed to vary in SP age, sex, and race/ethnicity. Cases also vary in clinical complaints, which may be acute or chronic in nature or provide opportunities for health promotion or disease prevention. At the start of the examination, students are randomly assigned to a pretest station, which may contain unscored material used to evaluate new SPs and cases. Research shows no benefit from the administration of unscored material at any time during the examination.20 Examinee performance measures are collected from each of the test stations.

A student is eligible to take the COMLEX-USA Level 2-PE if he or she has passed the COMLEX-USA Level 1, has completed his or her second academic year at a COM accredited by the COCA, and has been approved to take the examination by the COM's office of the dean. If an individual has already graduated from a COM, that individual must provide a verified copy of his or her diploma to take COMLEX-USA Level 2-PE.8

Examinee Characteristics

The number of COM matriculants and graduates has increased since 2004 as a result of expanded class sizes among existing osteopathic medical schools and the opening of new branch campuses.21 The first-year enrollment for osteopathic medical students in 2004 was 3646, increasing to 4528 by 2007.22 With this increase in the number of students entering the osteopathic medical profession, the total number of COMLEX-USA Level 2-PE examinees, including first-time examinees and repeat examinees, has also increased with each testing cycle (Figure 1).

The COMLEX-USA Level 2-PE is offered year-round, though the number of examinees tested per month varies throughout each testing cycle (Figure 2). The number of first-time examinees in the 2004-2005 testing cycle was greatest in the months near the end of the cycle. During the past four testing cycles, however, there has been a steady increase in the number of first-time examinees at the beginning of each cycle. In response to the COCA requirement that students complete both COMLEX-USA Level 2-CE and Level 2-PE before graduation, the NBOME increased testing capacity so that students could take the COMLEX-USA Level 2-PE before January 31 during the 2007-2008 and 2008-2009 testing cycles.

Figure 1. Number of examinees taking the Comprehensive Osteopathic Medical Licensing Examination-USA Level 2-Performance Evaluation (COMLEX-USA Level 2-PE), by testing cycle: 2742 (2004-2005), 3005 (2005-2006), 3261 (2006-2007), 3751 (2007-2008), and 4698 (2008-2009). The 2008-2009 testing cycle consisted of 13 months (June 1, 2008-June 30, 2009); the other testing cycles consisted of 12 months.

By taking this examination before January 31 of the expected year of graduation, most first-time failing students should have sufficient opportunity to reschedule and retest, receiving a second score report before graduation. Data from the 2008-2009 testing cycle, as shown in Figure 2, suggest that the majority of examinees schedule the examination early in the academic year to take advantage of the opportunity to retest if needed.

Figure 2. Number of first-time examinees taking the Comprehensive Osteopathic Medical Licensing Examination-USA Level 2-Performance Evaluation (COMLEX-USA Level 2-PE), by testing cycle and month. The 2008-2009 testing cycle consisted of 13 months (June 1, 2008-June 30, 2009), with 393 first-time examinees in June 2008 and 371 first-time examinees in June 2009. The other testing cycles consisted of 12 months.

Demographic characteristics of first-time examinees across all testing cycles are presented in Table 1. The percentage of examinees who were women was slightly higher than the percentage of examinees who were men in the third, fourth, and fifth testing cycles, the opposite of the proportions in the first two cycles. The primary racial group in all testing cycles was white, followed by Asian and black or African American. English was the primary spoken language in all testing cycles. These examinee demographic data are consistent with data reported by the American Association of Colleges of Osteopathic Medicine for osteopathic medical student matriculants and graduates.23

Competency Measures

Examinees must pass two domains, the Humanistic Domain and the Biomedical/Biomechanical Domain, to pass COMLEX-USA Level 2-PE. Failure in either domain results in failure of the entire examination. Separately scored components are used to measure performance in each of the domains (Figure 3).

The Humanistic Domain, as measured by NBOME's Global Patient Assessment (GPA) Tool, consists of the following six components: listening skills, respectfulness, empathy, professionalism, ability to elicit information, and ability to provide information. Each of these components is evaluated by the SP using the GPA Tool, a holistic Likert scale-based scoring instrument.24 For the 2008-2009 testing cycle, the generalizability coefficient for the Humanistic Domain, as measured by the GPA Tool, was 0.85, a coefficient similar to those measured in previous testing cycles. This reliability measure is also similar to measures found for other high-stakes clinical skills examinations.25

The Biomedical/Biomechanical Domain comprises three weighted component scores that reflect medical and scientific knowledge synthesized with clinical skill performance across all SP stations. The components are as follows: (1) data gathering, which reflects the examinee's ability to obtain a medical history and perform a physical examination; (2) osteopathic manipulative treatment (OMT), which reflects the examinee's ability to integrate osteopathic principles into clinical practice and to use OMT; and (3) SOAP Notes, which reflect the examinee's written communication skills and ability to synthesize information, develop a differential diagnosis, and formulate a diagnostic and treatment plan.
Data gathering, a clinical skill competency measure, is scored on a percent metric based on the number of checklist items correctly obtained during the medical history and the number of correctly performed physical examination maneuvers recorded by the SP. Although osteopathic principles are assessed throughout the examination, the examinee's performance of OMT is specifically assessed in 25% to 40% of the encounters. Performance of OMT is scored by osteopathic physician examiners who are trained to evaluate OMT skills using a holistic Likert scale-based scoring instrument. The SOAP Notes are also scored by osteopathic physician examiners using another Likert scale-based instrument. For the 2008-2009 testing cycle, the generalizability coefficient for the Biomedical/Biomechanical Domain score was 0.76. As measures of reliability, other clinical skills performance assessments show similar generalizability coefficients for data gathering and postencounter notes.25
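For readers less familiar with generalizability coefficients, a minimal single-facet formula for an examinees-crossed-with-stations design is sketched below. The analyses cited here rely on a more elaborate multivariate design,25 so this should be read as an illustration of the general idea rather than the exact NBOME computation:

$$ E\hat{\rho}^{2} = \frac{\hat{\sigma}^{2}_{p}}{\hat{\sigma}^{2}_{p} + \hat{\sigma}^{2}_{ps,e}/n_{s}} $$

Here \(\hat{\sigma}^{2}_{p}\) is the variance attributable to examinees (persons), \(\hat{\sigma}^{2}_{ps,e}\) is the examinee-by-station interaction variance confounded with residual error, and \(n_{s}\) is the number of scored stations. More scored stations, or less station-to-station noise relative to true differences among examinees, pushes the coefficient toward 1.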

Table 1
Characteristics of First-Time Examinees in COMLEX-USA Level 2-PE Testing Cycles, No. (%)*
Values are listed for the 2004-2005 (N=2720), 2005-2006 (N=2856), 2006-2007 (N=3099), 2007-2008 (N=3476), and 2008-2009 (N=4353) testing cycles, in that order.

Sex
  Men:    1422 (52.3), 1485 (52.0), 1529 (49.3), 1715 (49.3), 1888 (43.4)
  Women:  1256 (46.2), 1337 (46.8), 1557 (50.2), 1737 (50.0), 1984 (45.6)
Race
  American Indian/Alaska Native:    26 (1.0), 20 (0.7), 22 (0.7), 17 (0.5), 11 (0.3)
  Asian:                            404 (14.9), 390 (13.7), 418 (13.5), 554 (15.9), 588 (13.5)
  Black or African American:        16 (0.6), 14 (0.5), 10 (0.3), 130 (3.7), 139 (3.2)
  Native Hawaiian/Pacific Islander: 67 (2.5), 101 (3.5), 124 (4.0), 19 (0.5), 26 (0.6)
  White:                            2073 (76.2), 2201 (77.1), 2384 (76.9), 2561 (73.7), 2664 (61.2)
Ethnicity
  Hispanic/Latino:      83 (3.1), 96 (3.4), 108 (3.5), 125 (3.6), 157 (3.6)
  Not Hispanic/Latino:  2448 (90.0), 2391 (83.7), 2614 (84.3), 2326 (66.9), 2294 (52.7)
English as Primary Language
  Yes:  2451 (90.1), 2626 (91.9), 2853 (92.1), 3175 (91.3), 3548 (81.5)
  No:   211 (7.8), 199 (7.0), 232 (7.5), 271 (7.8), 226 (5.2)

* Percentages are based only on examinees who responded and, therefore, may not total 100.
Abbreviation: COMLEX-USA Level 2-PE, Comprehensive Osteopathic Medical Licensing Examination-USA Level 2-Performance Evaluation.

Figure 3. The Comprehensive Osteopathic Medical Licensing Examination-USA Level 2-Performance Evaluation (COMLEX-USA Level 2-PE) consists of two domains, the Humanistic Domain and the Biomedical/Biomechanical Domain. The Humanistic Domain includes six separately scored components (listening skills, respectfulness, empathy, professionalism, ability to elicit information, ability to provide information), which are measured by the National Board of Osteopathic Medical Examiners' Global Patient Assessment Tool. The Biomedical/Biomechanical Domain has three separately scored, weighted components: data gathering (reflecting the examinee's ability to obtain a medical history and perform a physical examination); osteopathic manipulative treatment (OMT, reflecting the examinee's ability to integrate osteopathic principles into clinical practice and to use OMT); and SOAP Notes (reflecting the examinee's written communication skills and ability to synthesize information, develop a differential diagnosis, and formulate a diagnostic and treatment plan).

The component scores for patient encounters in the COMLEX-USA Level 2-PE are averaged across all scored stations to compute the examinee's average performance during the testing day. The average performance rating provides a compensatory score within each domain of the examination, allowing poor performance in one station to be compensated by better performance in another station. After all scores are computed and equated to take into account the relative leniency or stringency of SPs and physician examiners, the pass/fail decision for COMLEX-USA Level 2-PE is based on standards applied to the cut score of each domain. Pass/fail standards are based on minimal competency for entry into graduate medical education.
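As a rough illustration of this compensatory-within-domain, conjunctive-across-domain decision rule, the following Python sketch averages station scores within each domain and requires both domain averages to reach a cut score. The 0-to-100 scale, the cut scores, and the omission of equating and of the component weighting described above are assumptions made for illustration only; they are not the NBOME's actual values or algorithm.

from statistics import mean

# Hypothetical cut scores on an assumed 0-100 scale (for illustration only).
HYPOTHETICAL_CUTS = {"humanistic": 70.0, "biomedical_biomechanical": 60.0}

def domain_score(station_scores):
    # Compensatory scoring: only the average across scored stations matters,
    # so a weak station can be offset by a stronger one.
    return mean(station_scores)

def level2pe_decision(scores_by_domain):
    # Conjunctive rule: failing either domain fails the entire examination.
    return all(
        domain_score(stations) >= HYPOTHETICAL_CUTS[domain]
        for domain, stations in scores_by_domain.items()
    )

examinee = {
    "humanistic": [82, 75, 68, 90, 71, 77, 80, 74, 79, 85, 72, 76],
    "biomedical_biomechanical": [55, 70, 62, 58, 66, 61, 73, 59, 64, 60, 68, 57],
}
print("Pass" if level2pe_decision(examinee) else "Fail")  # -> Pass under these assumed cuts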

Evaluation of SOAP Note Accuracy

Beginning with the 2007-2008 testing cycle, the NBOME officially implemented screening, reviewing, and reporting processes for SOAP Note fabrication. The 2009-2010 Orientation Guide for COMLEX-USA Level 2-PE26 contains the following description of SOAP Note fabrication:

"Falsification of the medical record (written SOAP Notes) in COMLEX-USA Level 2-PE by documenting medical history that was not elicited, or physical examination maneuvers or techniques that were not performed, is considered irregular conduct and will be thoroughly investigated and dealt with as specified in NBOME's Bulletin of Information."

Examinees are informed that misrepresenting findings in the SOAP Note is a form of irregular conduct that can result in a failing score with an annotated score report and transcript. Additional information about the SOAP Note fabrication process can be obtained not only in NBOME's Orientation Guide,26 but also in the instructional program video available at NBOME's Web site and in various local and national presentations to student groups. This information is also presented in each 50-minute orientation session that occurs immediately before the COMLEX-USA Level 2-PE is administered.

Sandella et al27 describe NBOME's two flagging and screening procedures for SOAP Note fabrication. In one procedure, physician examiners are instructed to report any inconsistencies in note documentation. In the other procedure, psychometric analysis is used to detect inconsistencies between data gathering and note documentation scores. When SOAP Notes have been flagged for potential fabrication, a comprehensive review is undertaken by the NBOME. When the NBOME determines that a substantial number of SOAP Notes contain information that was not elicited during the patient encounter, the examinee receives a failing score report with the annotation of irregular conduct. Results of SOAP Note fabrication reviews for the 2007-2008 and 2008-2009 testing cycles, listed by flagging procedure, are shown in Table 2. The data reveal that a total of 17 examinees have failed the COMLEX-USA Level 2-PE based on SOAP Note fabrication since NBOME's adoption of the policy in 2007.

Passing Standards and Examinee Performance

Based on The Standards for Educational and Psychological Testing,28 published by the American Educational Research Association, the American Psychological Association, and the National Council on Measurement in Education, pass/fail decisions for the two domains of the COMLEX-USA Level 2-PE are reevaluated approximately every 3 years.29 The NBOME uses a process of triangulation for determining cut points for Level 2-PE, a process that is common in medical testing organizations and highly recommended in high-stakes testing in general.4,30 This process includes standard-setting surveys, standard-setting panel meetings using an examinee-centered method, and a comprehensive final review.29,31,32 In examinee-centered methods, samples of actual performance covering the ability continuum of examinees are presented to standard-setting panelists, who provide expert judgments about the demonstration of the minimal clinical skills needed to pass the examination. The standard-setting panels are selected to assure broad representation of the osteopathic medical profession in regard to geographic considerations, age, sex, race, and ethnicity. Panelists are selected to include representation from clinical practice, state medical licensing boards, graduate medical education programs, and COMs.
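One common examinee-centered approach of this kind is the contrasting-groups method cited above,31 in which a cut score is placed between the score distributions of performances judged minimally competent and those judged not competent. The sketch below is only a schematic illustration of that idea, using a simple midpoint between group means on an arbitrary scale; the article does not specify the exact computation the NBOME panels use.

from statistics import mean

def contrasting_groups_cut(judgments):
    """judgments: list of (domain score of a sampled performance,
    True if panelists judged it minimally competent, else False)."""
    competent = [score for score, ok in judgments if ok]
    not_competent = [score for score, ok in judgments if not ok]
    # One simple convention: place the cut midway between the two group means.
    return (mean(competent) + mean(not_competent)) / 2

# Hypothetical panel judgments on six sampled performances.
sample = [(58, False), (61, False), (63, False), (66, True), (69, True), (72, True)]
print(contrasting_groups_cut(sample))  # -> 64.83... on this illustrative scale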
The numbers and percentages of first-time examinees who passed or failed the COMLEX-USA Level 2-PE in each testing cycle are shown in Table 3. Of the 16,504 first-time examinees in all testing cycles, 15,604 (94.5%) passed the examination. Of the 2720 first-time examinees in the 2004-2005 testing cycle, 2613 (96.1%) passed. The pass rates for first-time examinees in the subsequent four testing cycles were somewhat lower than in the 2004-2005 cycle, ranging from 93.4% (in 2006-2007) to 95.3% (in 2005-2006). Of the 4353 first-time examinees in the most recent testing cycle (2008-2009), 4124 (94.7%) passed COMLEX-USA Level 2-PE.

Domain fail rates shown in Table 3 are based on the total number of first-time examinees tested within each testing cycle. For example, 229 (5.3%) of the 4353 examinees in the 2008-2009 testing cycle failed the examination. Because passing each domain is required to pass the examination, these examinees could have failed the Biomedical/Biomechanical Domain, the Humanistic Domain, or both.

Table 2
Results of SOAP Note Fabrication Reviews, by Flagging Procedure, for the 2007-2008 and 2008-2009 COMLEX-USA Level 2-PE Testing Cycles

                            Psychometric Analysis            Physician Examiner Reports
Testing Cycle       N       Flagged, No.   Failures, No.     Flagged, No.   Failures, No.
2007-2008           3751    64             3                 27             5
2008-2009           4698    96             2                 181*           7
Total failures, No.                        5                                12

* New flagging instructions and procedures were given to physician examiners at the beginning of this testing cycle.
Abbreviations: COMLEX-USA Level 2-PE, Comprehensive Osteopathic Medical Licensing Examination-USA Level 2-Performance Evaluation; SOAP, Subjective, Objective, Assessment, Plan.
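To make the psychometric flagging idea described in the Evaluation of SOAP Note Accuracy section more concrete (an inconsistency between what the data-gathering checklist shows was elicited and what the SOAP Note documents), here is a hypothetical sketch. The standardized-score inputs, the threshold, and the rule itself are assumptions for illustration only; the NBOME's actual screening criteria are those described by Sandella et al27 and are not reproduced here.

FLAG_THRESHOLD = 2.0  # hypothetical cutoff, in standardized-difference units

def flag_encounters(encounters, threshold=FLAG_THRESHOLD):
    """encounters: iterable of (station_id, data_gathering_z, soap_note_z),
    where both scores are already standardized across examinees."""
    flagged = []
    for station_id, data_gathering_z, soap_note_z in encounters:
        # A note documenting far more than was actually elicited or performed
        # appears as a large positive (note minus data-gathering) discrepancy.
        if soap_note_z - data_gathering_z > threshold:
            flagged.append(station_id)
    # Flagged encounters would go to comprehensive human review, not automatic failure.
    return flagged

print(flag_encounters([("station01", -1.3, 1.1), ("station02", 0.4, 0.6)]))  # -> ['station01']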

Table 3
Pass and Fail Rates of First-Time Examinees Tested on COMLEX-USA Level 2-PE, by Testing Cycle, No. (%)

Examination Outcome          2004-2005     2005-2006     2006-2007     2007-2008     2008-2009
                             (N=2720)      (N=2856)      (N=3099)      (N=3476)      (N=4353)
Pass, Overall                2613 (96.1)   2722 (95.3)   2896 (93.4)   3249 (93.5)   4124 (94.7)
Fail, Overall                107 (3.9)     134 (4.7)     203 (6.6)     227 (6.5)     229 (5.3)
Fail, by Domain*
  Biomedical/Biomechanical   83 (3.1)      112 (3.9)     126 (4.1)     83 (2.4)      81 (1.9)
  Humanistic                 20 (0.7)      15 (0.5)      55 (1.8)      121 (3.5)     115 (2.6)
  Both                       4 (0.1)       7 (0.2)       22 (0.7)      23 (0.7)      33 (0.8)

* Domain sample sizes and percentages reflect overall fail percentages within each testing cycle.
Abbreviation: COMLEX-USA Level 2-PE, Comprehensive Osteopathic Medical Licensing Examination-USA Level 2-Performance Evaluation.

Results show that 81 (1.9%) of all examinees in 2008-2009 failed the Biomedical/Biomechanical Domain, 115 (2.6%) failed the Humanistic Domain, and 33 (0.8%) failed both domains.

New standards were introduced in the 2007-2008 testing cycle, yielding a technical adjustment to the cut points on each domain. Applying the new standards resulted in little change between the overall fail rate for 2006-2007 and that for 2007-2008 (percentage difference=0.1%). However, adjustment to the cut points under the new standards resulted in a reversal in fail rates for each domain in 2007-2008 (Table 3). Fail rates for the Biomedical/Biomechanical Domain in 2006-2007 and earlier were consistently higher than those for the Humanistic Domain. Fail rates were estimated for each domain under the new standards, which predicted a lower fail rate for the Biomedical/Biomechanical Domain and a higher fail rate for the Humanistic Domain, bringing the fail rates of the two domains closer together.

As shown in Figure 4, the overall fail rate for first-time examinees across all testing cycles was 5.4%. For those taking the examination a second time, the fail rate was 12.9%. For those taking the examination for the third time or more, the fail rate was 21.2%.

Figure 4. Percentages of examinees who failed the Comprehensive Osteopathic Medical Licensing Examination-USA Level 2-Performance Evaluation (COMLEX-USA Level 2-PE), by number of takes, for all five testing cycles (2004-2005 to 2008-2009): first take, 5.4%; second take, 12.9%; three or more takes, 21.2%.

Onsite Surveys

On the day of the examination, two onsite surveys are administered to examinees, one before and one after the examination. The surveys evaluate examinees' previous educational experience with SPs and their perceptions of the COMLEX-USA Level 2-PE and testing center. The surveys are given to both first-time and repeat examinees.

Standardized Patient Survey

The standardized patient survey, given to examinees before the examination, includes questions about examinees' previous experience with and exposure to SPs. Each examinee must include his or her name on this survey. Questions address the number of SP encounters that the examinees experienced during their osteopathic medical school training and the nature of their COMs' comprehensive SP-based examinations.

As depicted in Table 4, 2984 (63.5%) examinees in the 2008-2009 testing cycle reported that SPs were used at their COMs for nongraded teaching purposes, a finding consistent with the rate (63.9%) reported in the 2007-2008 testing cycle. More than 94% of examinees in the 2004-2005, 2005-2006, and 2006-2007 testing cycles noted that SPs were used at their COMs for teaching purposes. Thus, a noticeable decrease has occurred in the percentages of examinees responding that SPs are used for teaching purposes at COMs. This decrease may reflect a curricular shift in which COMs began using SPs more for assessment purposes rather than teaching.
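As a quick arithmetic cross-check of the aggregate first-time pass rate quoted above, the per-cycle counts in Table 3 can be summed directly; a small Python sketch:

# First-time examinees who passed, and first-time examinees tested, by cycle (Table 3).
passed = {"2004-2005": 2613, "2005-2006": 2722, "2006-2007": 2896,
          "2007-2008": 3249, "2008-2009": 4124}
tested = {"2004-2005": 2720, "2005-2006": 2856, "2006-2007": 3099,
          "2007-2008": 3476, "2008-2009": 4353}

total_passed = sum(passed.values())   # 15,604
total_tested = sum(tested.values())   # 16,504
print(f"{total_passed}/{total_tested} = {100 * total_passed / total_tested:.1f}%")  # 94.5%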

Table 4
Examinee Responses to Standardized Patient Survey in All COMLEX-USA Level 2-PE Testing Cycles, No. (%)*
For each item, values are listed for the 2004-2005 (N=2742), 2005-2006 (N=3005), 2006-2007 (N=3261), 2007-2008 (N=3751), and 2008-2009 (N=4698) testing cycles, in that order; ellipses (...) indicate no data available.

1. Are SPs used at your school for teaching purposes?†
   Yes: 2600 (94.8), 2891 (96.2), 3144 (96.4), ..., ...
   No:  129 (4.7), 103 (3.4), 86 (2.6), ..., ...
2. Are SPs used at your school for non-graded teaching purposes?
   Yes: ..., ..., ..., 2396 (63.9), 2984 (63.5)
   No:  ..., ..., ..., 1138 (30.3), 1151 (24.5)
3. Is performance on SP encounters incorporated into your evaluation/grade?
   Yes: 2092 (76.3), 2377 (79.1), 2688 (82.4), 3136 (83.6), 3626 (77.2)
   No:  624 (22.8), 611 (20.3), 525 (16.1), 404 (10.8), 515 (11.0)
4. Does your school administer its own SP examination?‡
   Yes: 2343 (85.4), 2622 (87.3), ..., ..., ...
   No:  373 (13.6), 340 (11.3), ..., ..., ...
5. Does your school administer its own comprehensive SP examination (ie, an assessment similar to Level 2-PE)?
   Yes: ..., ..., 2658 (81.5), 2908 (77.5), 3420 (72.8)
   No:  ..., ..., 540 (16.6), 494 (13.2), 514 (10.9)
5A. If yes, are you required to pass in order to graduate?
   Yes: 1642 (59.9), 1761 (58.6), 1819 (55.8), 2278 (60.7), 2705 (57.6)
   No:  766 (27.9), 881 (29.3), 652 (20.0), 659 (17.6), 772 (16.4)
5B. If yes, have you completed this [examination]?
   Yes: ..., ..., 2516 (77.2), 2890 (77.0), 3430 (73.0)
   No:  ..., ..., 291 (8.9), 222 (5.9), 223 (4.7)

* Percentages are based only on examinees who responded and, therefore, may not total 100.
† Question 1 was rephrased as question 2 as of the 2007-2008 testing cycle.
‡ Question 4 was rephrased as question 5 as of the 2006-2007 testing cycle.
Abbreviations: COMLEX-USA Level 2-PE, Comprehensive Osteopathic Medical Licensing Examination-USA Level 2-Performance Evaluation; SP, standardized patient.

As shown in Table 5, the median number of school-based SP encounters experienced by examinees during the first year of osteopathic medical school increased from 3 in the 2004-2005 testing cycle to 5 in the 2008-2009 testing cycle. A similar trend is seen for the second year of osteopathic medical school, with the median number of SP encounters increasing from 5 in 2004-2005 to 8 in 2008-2009.

A total of 3420 (72.8%) examinees in the 2008-2009 testing cycle agreed that their COMs administered comprehensive SP examinations that were "similar to [COMLEX-USA] Level 2-PE" (Table 4). This percentage represented a decline from the 81.5% who agreed to that statement in 2006-2007, the first testing cycle in which the "similar to Level 2-PE" wording was used. In the 2008-2009 testing cycle, 3430 (73%) examinees responded that they had completed the comprehensive SP-based examination at their respective COMs, and 2705 (57.6%) responded that they were required to pass the examination before graduation. As shown in Table 4, each of these percentages has remained relatively consistent over all testing cycles.

Table 5
School-Based Standardized Patient Encounters Experienced by COMLEX-USA Level 2-PE Examinees During First 2 Years of Osteopathic Medical School, Median No.

Osteopathic Medical    2004-2005    2005-2006    2006-2007    2007-2008    2008-2009
School Year            (N=2742)     (N=3005)     (N=3261)     (N=3751)     (N=4698)
First Year             3.0          4.0          4.0          5.0          5.0
Second Year            5.0          6.0          6.0          8.0          8.0

Abbreviation: COMLEX-USA Level 2-PE, Comprehensive Osteopathic Medical Licensing Examination-USA Level 2-Performance Evaluation.

Table 6
Standardized Patient (SP) Encounters Experienced by COMLEX-USA Level 2-PE Examinees in Comprehensive SP-Based Examinations Administered by COMs, Median No.

                            2004-2005    2005-2006    2006-2007    2007-2008    2008-2009
                            (N=2742)     (N=3005)     (N=3261)     (N=3751)     (N=4698)
Individual SP Encounters    8.0          8.0          5.0          6.0          5.0

Abbreviations: COM, college of osteopathic medicine; COMLEX-USA Level 2-PE, Comprehensive Osteopathic Medical Licensing Examination-USA Level 2-Performance Evaluation.

Respondents in the 2008-2009 testing cycle who noted that their COMs administered comprehensive SP-based examinations reported that they experienced a median number of 5 SP encounters at the COMs. As shown in Table 6, this number and the numbers from the 2006-2007 and 2007-2008 testing cycles represented decreases in SP encounters at the COMs, compared with the 2004-2005 and 2005-2006 testing cycles.

Postexamination Survey

The postexamination survey is an anonymous 22-item survey given immediately after the examination. It is meant to monitor examinee satisfaction with the NBOME's National Center for Clinical Skills Testing and the administration of the COMLEX-USA Level 2-PE. For each of the postexamination survey items, examinees are asked to respond using a four-point Likert scale: 1 (strongly disagree), 2 (disagree), 3 (agree), and 4 (strongly agree). Written comments are also solicited from examinees. The postexamination survey data and written comments are used to monitor examinee satisfaction and to make improvements to the COMLEX-USA Level 2-PE and testing center.

Overall, the responses of the 4698 examinees from the 2008-2009 testing cycle were favorable regarding the examination and testing center, as illustrated by a majority of "strongly agree" or "agree" responses to survey items (Table 7). As shown in Table 7, 3846 (81.8%) examinees agreed or strongly agreed that the cases "represented appropriate challenges for a 4th year student," 3734 (79.4%) examinees agreed or strongly agreed that the [standardized] patients "portrayed the case in a believable manner," and 3557 (75.7%) examinees agreed or strongly agreed that the [standardized] patients' "responses to the physical [examination] were realistic." Survey items that elicited somewhat lower levels of agreement among respondents included those related to the adequacy of the SOAP Note abbreviations list and the sufficiency of the time allotted to complete SOAP Notes and SP encounters. As illustrated in Figure 5, responses to these selected survey items and the item on whether the cases represented appropriate challenges remained fairly consistent across all testing cycles.

Responses collected during the 2008-2009 testing cycle suggest that the National Center for Clinical Skills Testing and the administration of the COMLEX-USA Level 2-PE were favorably received by examinees. In all testing cycles, examinees generally appeared to be satisfied with their testing experiences.

Written Feedback

Examinees are encouraged to provide written feedback regarding their experiences with the COMLEX-USA Level 2-PE. Comments in the surveys analyzed in the present study ranged from congratulatory to persecutory.
The following are samples of selected examinee comments from surveys in the 2008-2009 testing cycle:

"[Patients] were very realistic and cases were challenging but not overly difficult or confusing. Proctors were very friendly and helpful. I enjoyed the experience."

"Test center was incredibly easy to find, lots of places to stay [within] walking distance. Lunch was good too. Thank you!"

"Excellent SPs, very believable. I felt like it was a normal rotation."

"I feel more time should be allotted for the SP and SOAP Note portions since we are to perform OMT on some patients. I feel this time limit makes me feel rushed during note writing and the physical examination."

"The proctors were very helpful and pleasant to be around."

Table 7
Examinee Responses to Postexamination Survey in the 2008-2009 COMLEX-USA Level 2-PE Testing Cycle, No. (%) (N=4698)*
For each item, values are listed as strongly disagree, disagree, agree, and strongly agree, in that order.

1. The cases represented appropriate challenges for a 4th year student: 23 (0.5), 78 (1.7), 1988 (42.3), 1858 (39.5)
2. The equipment in the exam stations was easy to locate: 28 (0.6), 29 (0.6), 1173 (25.0), 2738 (58.3)
3. The list of abbreviations for the SOAP Note was sufficient: 370 (7.9), 893 (19.0), 1471 (31.3), 1230 (26.2)
4. The designated area provided to write the SOAP Note was comfortable: 53 (1.1), 182 (3.9), 1740 (37.0), 1985 (42.2)
5. The designated area provided to write the SOAP Note was quiet: 30 (0.6), 68 (1.4), 1417 (30.2), 2462 (46.9)
6. The temperature of the test center was comfortable: 39 (0.8), 222 (4.7), 1513 (32.2), 2202 (46.9)
7. The external noise level in the exam rooms was acceptable: 26 (0.6), 66 (1.4), 1455 (31.0), 2424 (51.6)
8. The instructions provided in the orientation were clear: 24 (0.5), 35 (0.7), 1171 (24.9), 2746 (58.4)
9. The orientation provided ample opportunity to ask questions: 31 (0.7), 64 (1.4), 1255 (26.7), 2628 (55.9)
10. The proctors were helpful: 22 (0.5), 27 (0.6), 997 (21.2), 2927 (62.3)
11. The receptionist greeted me in a friendly manner: 73 (1.6), 154 (3.3), 1177 (25.0), 2571 (54.7)
12. The refreshments provided were satisfactory: 61 (1.3), 180 (3.8), 1406 (29.9), 2322 (49.4)
13. It was easy to find my way to the test center in Conshohocken, Pennsylvania: 95 (2.0), 290 (6.2), 1572 (33.5), 2024 (43.1)
14. The patients portrayed the case in a believable manner: 33 (0.7), 196 (4.2), 1778 (37.8), 1956 (41.6)
15. Everything the patients reported was consistent with the case: 34 (0.7), 344 (7.3), 2003 (42.6), 1579 (33.6)
16. The patients' affect appropriately matched the chief complaint: 42 (0.9), 259 (5.5), 2049 (43.6), 1613 (34.3)
17. The patients' responses to the physical exam were realistic: 39 (0.8), 369 (7.9), 2105 (44.8), 1452 (30.9)
18. The patients answered my questions completely: 48 (1.0), 282 (6.0), 2012 (42.8), 1627 (34.6)
19. All equipment provided in the exam room functioned properly: 25 (0.5), 38 (0.8), 1313 (27.9), 2593 (55.2)
20. The doorway information sheet made the task (clinical challenge) clear: 27 (0.6), 83 (1.8), 1639 (34.9), 3962 (47.1)
21. The time allowed to complete the SOAP Note was sufficient: 245 (5.2), 633 (13.5), 1655 (35.2), 1420 (30.2)
22. The time allowed to complete the encounter with the SP was sufficient: 247 (5.3), 667 (14.2), 1747 (37.2), 1298 (27.6)

* Percentages are based only on examinees who responded and, therefore, may not total 100%.
Abbreviations: COMLEX-USA Level 2-PE, Comprehensive Osteopathic Medical Licensing Examination-USA Level 2-Performance Evaluation; SOAP, Subjective, Objective, Assessment, Plan; SP, standardized patient.
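The "agreed or strongly agreed" percentages quoted in the text can be reproduced by summing the two rounded agreement columns published in Table 7 above, as in this short Python sketch (item numbers refer to the table; the labels are shorthand added here):

# Rounded column percentages from Table 7 (N = 4698 examinees in 2008-2009).
table7_pct = {
    "item 1, appropriate challenges": (42.3, 39.5),   # (agree, strongly agree)
    "item 14, believable portrayal": (37.8, 41.6),
    "item 17, realistic physical exam responses": (44.8, 30.9),
}
for label, (agree, strongly_agree) in table7_pct.items():
    print(f"{label}: {agree + strongly_agree:.1f}% agreed or strongly agreed")
# -> 81.8%, 79.4%, and 75.7%, matching the percentages quoted in the text.

# The same arithmetic for item 22 (encounter time) gives the 19.5% disagreement
# figure discussed in the Comment section: 5.3% + 14.2% = 19.5%.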

Figure 5. Examinee mean responses to selected postexamination survey items ("The cases represented appropriate challenges for a 4th year student," "The list of abbreviations for the SOAP Note was sufficient," "The time allowed to complete the SOAP Note was sufficient," and "The time allowed to complete the encounter with the SP was sufficient"), by Comprehensive Osteopathic Medical Licensing Examination-USA Level 2-Performance Evaluation (COMLEX-USA Level 2-PE) testing cycle (2004-2005 to 2008-2009). Response scale: 1 (strongly disagree), 2 (disagree), 3 (agree), 4 (strongly agree). Abbreviation: SP, standardized patient.

"I think that the use of OMT should not be a requirement but instead an additional way for students to gain points. I did not have any 3rd or 4th year rotations in which it was actually used so I felt ill prepared."

"Thank you, I feel a great sense of accomplishment and realize what I can do better. This has given me more confidence in seeing and treating patients. Please thank the SPs for their time and participation."

"The proctors were so helpful and friendly that it felt almost like being in a real clinic. In other words, no high stress of being watched and monitored. Makes a big difference when stress level is not too overwhelming. Thank you!"

Such written comments, which are reviewed regularly by the NBOME, have led to improvements in the administration of the COMLEX-USA Level 2-PE, including adjusting directions posted on the Web site, adding a list of hotel accommodations in the testing area, and streamlining the registration process. Examinees' comments have also triggered in-depth reviews and analyses of various operational issues, such as the use of electronic medical records for data entry and the use of alternative testing sites.

Comment

Passing standards and passing rates have remained relatively stable over all five testing cycles for the COMLEX-USA Level 2-PE. The introduction of adjusted standards for the 2007-2008 testing cycle resulted in the fail rates for each of the domains becoming closer in magnitude, with the proportion of examinees failing the Humanistic Domain increasing slightly while the overall fail rate remained stable. This outcome was predicted under the NBOME policy with application of the new standards.

Beginning in the 2007-2008 testing cycle, the NBOME formally began evaluating SOAP Notes about patient encounters for accuracy and integrity.27 We can only speculate why examinees include clinical findings on SOAP Notes that are not elicited or obtained during patient encounters. Examinees' intent or motivation cannot be determined with the current evaluation processes. Nevertheless, a note containing inaccurate information that was not elicited during a patient encounter may lead to dangerous outcomes for the patient. Thus, consistent with its mission to protect the public, the NBOME has taken a strong stance against SOAP Note fabrication and inaccurate documentation.

Examinees overwhelmingly reported that cases presented in the COMLEX-USA Level 2-PE were realistic and represented an appropriate level of challenge for fourth-year osteopathic medical students (Table 7). This finding was consistent with previous findings.2
However, in contrast to previous reports and pilot studies indicating that 14 minutes is sufficient for a patient-physician encounter,2,33 19.5% of examinees in the present study disagreed or strongly disagreed that sufficient time was allotted for this aspect of the examination. Despite these survey results, examinees spend an average of 12.1 minutes with SPs during each clinical encounter for COMLEX-USA Level 2-PE, well below the 14 minutes allotted for the encounter.33

A minority of examinees in the present study also reported insufficient time for completing SOAP Notes about the encounters (Table 7). In a COMLEX-USA Level 2-PE prototype pilot study,2 7 minutes was shown to be sufficient for SOAP Note completion, well below the 9 minutes allotted for examinees to complete SOAP Notes.

A minority of examinees in the present study reported that the list of common abbreviations to use with SOAP Notes was inadequate (Table 7). Rather than publishing a comprehensive list of acceptable medical abbreviations, which would be impractical, The Joint Commission, in compliance with National Patient Safety Goal Requirement 2B, has published a list of abbreviations that are not to be used for medical documentation.34 Consistent with this recommendation, the NBOME advises examinees to avoid abbreviations on SOAP Notes that might lead to medical error, and the list provided to examinees is meant as a guide for commonly used abbreviations.

Between the 2004-2005 and 2008-2009 testing cycles, examinees reported increases in the median number of SP encounters in both the first and second years of osteopathic medical school (Table 5). This increase implies that osteopathic medical students are being exposed to a greater number of SP encounters. Yet, because the reported number of SPs used for teaching purposes has decreased (Table 4), this pattern may represent a shift toward using SPs for summative, rather than formative, assessments at individual COMs. Although SPs were predominantly used for teaching clinical skills several years ago,7 results of the present study are consistent with recent reports that COMs are now using SPs more for assessment than for teaching.13 Addressing this shift in SP use more thoroughly requires a more detailed investigation of current SP-based programs at COMs.

Examinees reported a reduction in the median number of encounters used in comprehensive SP examinations at COMs (8 in 2004-2005 vs 5 in 2008-2009), as shown in Table 6. As a measure of reliability, generalizability coefficients decrease substantially when examinations include fewer than 10 encounters.25,35 The selected number of encounters per student may depend on SP availability, class size, and examination administration costs, as well as the level of reliability that individual COMs are willing to accept for such an examination.

The present study recorded a decrease over time in the number of examinees who reported that their COMs administered comprehensive SP examinations (85.4% in 2004-2005 vs 72.8% in 2008-2009), as shown in Table 4. Although this trend might be explained by a genuine reduction in the use of comprehensive SP examinations at COMs, it is more likely explained by COMs consolidating resources and outsourcing such examinations to their affiliated branch campuses. Alternatively, the trend could be the result of student misinterpretation of the survey item or misunderstanding of the school-sponsored SP experiences. A formal analysis of this trend would require a detailed survey to be administered directly to the COMs.

Hauer et al36 demonstrated that allopathic medical schools have designed new school-based examinations or modified existing examinations to simulate the United States Medical Licensing Examination Step 2 Clinical Skills. Gimpel et al13 reported that the use of standardized patient programs and mechanical simulators within osteopathic medical schools increased between 2001 and 2005.
Data from the present study support the finding that SPs are being used by COMs for both teaching and evaluative purposes. A more detailed school-based analysis would provide additional insight regarding how clinical skills teaching curricula are designed and how SPs are used for formative and summative assessments.

Conclusion

For the first five testing cycles, examinee performance has been relatively consistent, and examinees have reported general satisfaction with the administration and assessment methodologies of the COMLEX-USA Level 2-PE. As the osteopathic medical profession continues to grow, the NBOME looks forward to administering examinations for many years to come. These examinations will allow the NBOME to meet its mission of protecting the public by providing the means to assess competencies for osteopathic medicine and related healthcare professions.

Acknowledgment

We thank Taunya Cossetti, executive assistant for the NBOME, for her assistance in reviewing and editing the information provided in the present report.

References

1. Harden RM, Stevenson M, Downie WW, Wilson GM. Assessment of clinical competence using objective structured examination. Br Med J. 1975;1(5955):447-451. http://www.ncbi.nlm.nih.gov/pmc/articles/pmc1672423/?tool=pubmed. Accessed December 3, 2009.
2. Gimpel JR, Boulet JR, Errichetti AM. Evaluating the clinical skills of osteopathic medical students. J Am Osteopath Assoc. 2003;103(6):267-279. http://www.jaoa.org/cgi/reprint/103/6/267. Accessed December 3, 2009.
3. Swanson DB, Norman GR, Linn RL. Performance-based assessment: lessons from the health professions. Educ Res. 1995;24(5):5-11.
4. Boulet JR, Smee SM, Dillon GF, Gimpel JR. The use of standardized patient assessments for certification and licensure decisions. Simul Healthc. 2009;4(1):35-42.
5. Adamo G. Simulated and standardized patients in OSCEs: achievements and challenges 1992-2003. Med Teach. 2003;25(3):262-270.
6. Boulet JR, De Champlain AF, McKinley DW. Setting defensible performance standards on OSCEs and standardized patient examinations. Med Teach. 2003;25(3):245-249.
7. Errichetti AM, Gimpel JR, Boulet JR. State of the art in standardized patient programs: a survey of osteopathic medical schools. J Am Osteopath Assoc. 2002;102(11):627-631. http://www.jaoa.org/cgi/reprint/102/11/627. Accessed December 3, 2009.
8. National Board of Osteopathic Medical Examiners. Bulletin of Information 2009-2010. Chicago, IL: National Board of Osteopathic Medical Examiners; 2009. http://www.nbome.org/docs/comlexboi.pdf. Accessed July 15, 2009.

9. AOA Commission on Osteopathic College Accreditation. Accreditation of Colleges of Osteopathic Medicine: COM Accreditation Standards and Procedures. Chicago, IL: American Osteopathic Association; 2009. http://www.doonline.org/pdf/sb03-standards%20of%20accreditation%20july%202009.pdf. Accessed December 21, 2009.
10. Brazeau C, Boyd L, Crosson J. Changing an existing OSCE to a teaching tool: the making of a teaching OSCE. Acad Med. 2002;77(9):932.
11. Barrows HS. An overview of the uses of standardized patients for teaching and evaluating clinical skills. AAMC [review]. Acad Med. 1993;68(6):443-451.
12. Townsend AH, McLlvenny S, Miller CJ, Dunn EV. The use of an objective structured clinical examination (OSCE) for formative and summative assessment in a general practice clinical attachment and its relationship to final medical school examination performance. Med Educ. 2001;35(9):841-846.
13. Gimpel JR, Weidner AC, Boulet JR, Wilson CD, Errichetti AM. Standardized patients and mechanical simulators in teaching and assessment at colleges of osteopathic medicine. J Am Osteopath Assoc. 2007;107(12):557-561. http://www.jaoa.org/cgi/content/full/107/12/557. Accessed December 3, 2009.
14. Altshuler L, Kachur E, Krinshpun S, Sullivan D. Genetics objective structured clinical exams at the Maimonides Infants & Children's Hospital of Brooklyn, New York. Acad Med. 2008;83(11):1088-1093.
15. Kligler B, Koithan M, Maizes V, Hayes M, Schneider C, Lebensohn P, et al. Competency-based evaluation tools for integrative medicine training in family medicine residency: a pilot study. BMC Med Educ. 2007;18(7):7. http://www.ncbi.nlm.nih.gov/pmc/articles/pmc1855050/?tool=pubmed. Accessed December 3, 2009.
16. Aeder L, Altshuler L, Kachur E, Barrett S, Hilfer A, Koepfer S, et al. The Culture OSCE: introducing a formative assessment into a postgraduate program [published online ahead of print April 18, 2007]. Educ Health (Abingdon). 2007;20(1):11.
17. Cohen R, Reznick RK, Taylor BR, Provan J, Rothman A. Reliability and validity of the objective structured clinical examination in assessing surgical residents. Am J Surg. 1990;160(3):302-305.
18. Barrows HS, Williams RG, Moy RH. A comprehensive performance-based assessment of fourth-year students' clinical skills. J Med Educ. 1987;62(10):805-809.
19. Prislin MD, Fitzpatrick CF, Lie D, Giglio M, Radecki S, Lewis E. Use of an objective structured clinical examination in evaluating student performance. Fam Med. 1998;30(5):338-344.
20. Roberts WL, McKinley DW, Boulet JR. Effect of first encounter pretest on pass/fail rates of a clinical skills medical licensure examination [published online ahead of print September 13, 2009]. Adv Health Sci Educ Theory Pract.
21. Shannon SC, Teitelbaum HS. The status and future of osteopathic medical education in the United States. Acad Med. 2009;84(6):707-711.
22. American Association of Colleges of Osteopathic Medicine. Fast Facts About Osteopathic Medical Education: Enrollment Growth in the Nation's COMs. Chevy Chase, MD: American Association of Colleges of Osteopathic Medicine; 2009. http://publish.aacom.org/about/fastfacts/documents/fast-Facts/FF-Enrollment-NationCOMs.pdf. Accessed December 21, 2009.
23. American Association of Colleges of Osteopathic Medicine. 2006 Annual Statistical Report on Osteopathic Medical Education. Chevy Chase, MD: American Association of Colleges of Osteopathic Medicine; 2006. http://www.aacom.org/resources/bookstore/2006statrpt/documents/asrome2006.pdf. Accessed December 21, 2009.
24. Weidner AC, Gimpel JR, Boulet JR, Solomon M. Using standardized patients to assess the communication skills of graduating physicians for the Comprehensive Osteopathic Medical Licensing Examination (COMLEX) Level 2-Performance Evaluation (Level 2-PE). Teach Learn Med. 2010;22(1):8-15.
25. Clauser BE, Harik P, Margolis MJ. A multivariate generalizability analysis of data from a performance assessment of physicians' clinical skills. J Educ Meas. 2006;43(3):173-191.
26. National Board of Osteopathic Medical Examiners. 2009-2010 Orientation Guide for COMLEX-USA Level 2-PE. Chicago, IL: National Board of Osteopathic Medical Examiners; 2009. http://www.nbome.org/docs/peorientationGuide.pdf. Accessed December 21, 2009.
27. Sandella JM, Roberts WL, Gallagher LA, Gimpel JR, Langenau EE, Boulet JR. Patient note fabrication and consequences of unprofessional behavior in a high-stakes clinical skills licensing examination. Acad Med. 2009;84(10):S70-S73.
28. American Educational Research Association, American Psychological Association, National Council on Measurement in Education. The Standards for Educational and Psychological Testing. Washington, DC: American Educational Research Association, American Psychological Association, National Council on Measurement in Education; 1999.
29. Cohen AS, Kane MT, Crooks TJ. A generalized examinee-centered method for setting standards on achievement tests. Appl Meas Educ. 1999;12(4):343-366.
30. Norcini JJ, Shea JA. The credibility and comparability of standards. Appl Meas Educ. 1997;10(1):39-59.
31. Clauser BE, Clyman SG. A contrasting-groups approach to standard setting for performance assessments of clinical skills. Acad Med. 1994;69(10 suppl):S42-S44.
32. Hambleton RK, Jaeger RM, Plake BS, Mills CN. Setting performance standards on complex educational assessments. Appl Psychol Meas. 2000;24(4):355-366.
33. Sandella JM, Roberts WL, Wilson CD. The relationship between time spent in the encounter and student performance in a high-stakes standardized patient assessment. Poster presented at: Meeting of the Society for Simulation in Healthcare; January 2009; Lake Buena Vista, FL.
34. The official "Do Not Use" list of abbreviations; April 1, 2009. The Joint Commission Web site. http://www.jointcommission.org/patientsafety/donotUseList/. Accessed December 21, 2009.
35. Margolis MJ, Clauser BE, Swanson DB, Boulet JR. Analysis of the relationship between score components on a standardized patient clinical skills examination. Acad Med. 2003;78(10 suppl):S68-S71.
36. Hauer KE, Teherani A, Kerr KM, O'Sullivan PS, Irby DM. Impact of the United States Medical Licensing Examination Step 2 Clinical Skills exam on medical school clinical skills assessment. Acad Med. 2006;81(10 suppl):S13-S16.