NATIONAL BOARD OF MEDICAL EXAMINERS Subject Examination Program. Comprehensive Clinical Science Examination. Score Interpretation Guide


NBME subject examinations provide medical schools with a tool for measuring examinees' understanding of the clinical sciences. Although these examinations are designed to be broadly appropriate as part of overall examinee assessment, course objectives vary across schools, and the congruence between subject examination content and course objectives should be considered when interpreting test scores and determining grading standards. Specifically, subject examination scores should not be used alone, but rather in conjunction with other indicators of examinee performance in the determination of grades.

Subject Examination Scores

The subject examination score is scaled to a mean of 70 and a standard deviation of 8. A Comprehensive Clinical Science Examination (CCSE) score of 70 is approximately equivalent to a score of 200 on the United States Medical Licensing Examination (USMLE) Step 2 Clinical Knowledge (CK). The vast majority of scores range from 45 to 95, and although the scores have the "look and feel" of percent-correct scores, they are not. Because the CCSE and USMLE Step 2 CK cover very similar content, this scale provides a useful tool for comparing the scores of your examinees with those of a large, nationally representative group taking the licensing examination at the end of the second year of medical school. Additional information about the relationship between CCSE scores and USMLE Step 2 CK scores is provided in the table on the next page.

Unlike percent-correct scores, subject examination scores are statistically equated across test administrations. Scores are statistically adjusted for shifts in test difficulty across different forms of the examination. This makes it possible to track school and examinee performance over time.

Precision of Scores

Measurement error is present on all tests, and the standard error of measurement (SEM) provides an index of the (im)precision of scores. The SEM indicates how far an examinee's score on the examination might stray from his/her true proficiency level across repeated testing using different sets of items covering the same content. Using the SEM, it is possible to calculate a score interval that will encompass about two thirds of the observed scores for a given true score by adding and subtracting the SEM from that score. For this examination, the SEM is approximately 4 points. For example, if an examinee's true proficiency on the examination is 60, the score he/she achieves on the examination will usually (two times out of three) fall between 56 and 64 (60 - 4 and 60 + 4).

Score Reporting and Performance Feedback

Summary information on the examinee group tested, the examination purpose, and the number of items scored is provided on each page of the feedback. The Roster of Scaled Scores reports a total test scaled score for each examinee. Reported scores also appear in a comma-separated text file that can be downloaded and used to export scores. If there were at least 2 examinees, Scaled Score Descriptive Statistics for reported scores are provided along with a Frequency Distribution of the total test scaled score. If there were at least 10 examinees, a Content Area Item Analysis Report is provided. If there were at least 15 examinees, a School Summary Performance Profile is provided. An Examinee Performance Profile, which graphically displays content areas of strength and weakness, is provided for each examinee.
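As a worked illustration of the SEM calculation described above, the sketch below computes the one-SEM score band around a true score. The 4-point SEM is the approximate value quoted in this guide; the function name and structure are illustrative only, not part of any NBME software.

```python
# Sketch: score interval of +/- one SEM around a true score, as described above.
# SEM of 4 points is the approximate value quoted in this guide.

SEM = 4  # approximate standard error of measurement for this examination

def score_band(true_score: float, sem: float = SEM) -> tuple[float, float]:
    """Return the interval expected to contain about two thirds of
    observed scores for an examinee with the given true proficiency."""
    return (true_score - sem, true_score + sem)

# Example from the guide: a true proficiency of 60 gives a band of 56 to 64.
low, high = score_band(60)
print(f"Observed scores will usually fall between {low} and {high}")
```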

Approximate USMLE Performance Equivalents

The table below provides approximate performance equivalents for USMLE Step 2 CK, making it possible for you to translate the CCSE scores of your examinees to the scale used for USMLE Step 2 CK. Specific information on USMLE Step 2 CK and the current minimum passing score is available on the USMLE web site at www.usmle.org. To use the table, locate an examinee's CCSE score in the associated column and note the entry in the column labeled "Step 2 CK Equivalent". For example, if an examinee's CCSE score is 60, the corresponding entry of 175 indicates that the examinee's performance on the CCSE is approximately equivalent to a Step 2 CK score of 175.

Note: This examination is not intended to predict performance on USMLE. Rather, it is intended to be used to determine an examinee's relative areas of strength and weakness in general topic areas.

[Table: Approximate Step 2 CK Equivalents (CCSE score vs. Step 2 CK Equivalent, approximately 135 to 260)]

Norms for Examinee Performance

The table below provides norms to aid in the interpretation of examinee performance. The norms reflect the performance of a group of 3,002 examinees from LCME-accredited medical schools who took the paper version of the CCSE for the first time during the 2008-09, 2009-10, and 2010-11 academic years (8/1/2008 to 7/31/2011). This group had a mean of 70 and a standard deviation of 11 on the CCSE score scale. To use the table, locate a student's score in the associated column and note the entry in the column labeled "Percentile Rank". For example, if an examinee's score is 70, the corresponding percentile rank indicates the percentage of the national group of examinees taking this examination that had scores at or below 70.

[Table: 2008-2011 Academic Years Norms (CCSE score vs. Percentile Rank)]
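To make the meaning of the percentile ranks concrete, here is a minimal sketch of how a percentile rank of the kind reported in the norms table could be computed from a norm group's scores. The sample scores below are hypothetical and are not the actual 2008-2011 norm group data.

```python
# Sketch: percentile rank as the percentage of the norm group scoring at or
# below a given CCSE score. The norm_scores list is hypothetical; the published
# table is based on the 3,002-examinee 2008-2011 norm group.

def percentile_rank(score: float, norm_scores: list[float]) -> float:
    at_or_below = sum(1 for s in norm_scores if s <= score)
    return 100.0 * at_or_below / len(norm_scores)

# Hypothetical norm group, for illustration only
norm_scores = [52, 58, 61, 64, 67, 70, 70, 73, 76, 80, 84, 88]
print(percentile_rank(70, norm_scores))  # percent of this sample at or below 70
```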

Scaled Score Descriptive Statistics

Scaled score descriptive statistics (mean, standard deviation, lowest score, and highest score) for the examination administered on the specified test date(s) are listed below. Please refer to the Score Interpretation Guide for information about how to interpret the scores.

[Sample report page, MSS - 2009, Test # 01: # Scored Items, Test Mean, Standard Deviation, Low Score, High Score]
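For reference, the statistics reported on this page can be reproduced from a list of scaled scores with standard Python; the scores below are hypothetical, not taken from an actual report.

```python
# Sketch: the descriptive statistics shown on this report page (mean, standard
# deviation, lowest score, highest score) computed from a list of scaled scores.
# The scores below are hypothetical.
from statistics import mean, stdev

scaled_scores = [58, 63, 67, 70, 72, 75, 79, 84]  # hypothetical examinee scores

print("Test Mean:", round(mean(scaled_scores), 1))
print("Standard Deviation:", round(stdev(scaled_scores), 1))
print("Low Score:", min(scaled_scores))
print("High Score:", max(scaled_scores))
```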

Roster of Scaled Scores

The roster of scores contains the scaled score(s) for each examinee who tested on the specified test date(s). These same scores also appear in a comma-separated value text file that can be downloaded and used to export scores into your local database. Please refer to the Score Interpretation Guide for information about how to interpret the scores.

[Sample report pages, MSS - 2009, Test # 01: examinee ID, name, and scaled score for each examinee tested]
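Because the roster is also delivered as a comma-separated value file, it can be loaded into a local database or spreadsheet with standard tools. The sketch below assumes a file named roster.csv with ID, Name, and Score columns; the actual file name and column headers may differ from these assumptions.

```python
# Sketch: reading the downloadable comma-separated roster file.
# The file name "roster.csv" and the column names "ID", "Name", and "Score"
# are assumptions for illustration; check the actual file's header row.
import csv

with open("roster.csv", newline="") as f:
    for row in csv.DictReader(f):
        print(row["ID"], row["Name"], row["Score"])
```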



Scaled Score Frequency Distribution

[Sample report page, MSS - 2009, Test # 01: frequency distribution of total test scaled scores from 44 to 98, showing the frequency count and percent and the cumulative count and percent at each scaled score]
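A frequency distribution like the one summarized above can be rebuilt from roster scores by counting examinees at each scaled score and accumulating the counts; the sketch below uses hypothetical scores for illustration.

```python
# Sketch: frequency and cumulative frequency distribution of scaled scores,
# matching the columns shown on this report page. Scores are hypothetical.
from collections import Counter

scaled_scores = [58, 63, 63, 67, 70, 70, 70, 75, 79, 84]  # hypothetical
counts = Counter(scaled_scores)
total = len(scaled_scores)

cumulative = 0
for score in sorted(counts):
    cumulative += counts[score]
    print(score,
          counts[score], f"{100 * counts[score] / total:.0f}%",
          cumulative, f"{100 * cumulative / total:.0f}%")
```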