Reporting Your Evidence


Reporting Your Evidence. Glenda Breaux, Formative Evaluation Specialist for the Inquiry Brief Pathway (glenda.breaux@caepnet.org)

Session Description This workshop will provide participants with guidelines for reporting evidence and exercises in which they will have the opportunity to work through various challenges in reporting quantitative and qualitative evidence.

Agenda We will begin with an overview of the CAEP Evidence Guide and the principles of good evidence it contains. Next we will review some formatting options for reporting quantitative and qualitative data and discuss how reporting choices affect the strength of the case. The handouts contain examples and exercises that address common issues we have seen and questions we have been asked about reporting evidence.

CAEP Evidence Guide: principles of good evidence
- Relevant: directly related
- Verifiable: sufficiently documented for later confirmation by outsiders
- Representative: captures the typical state of affairs
- Cumulative: multiple sources are additive
- Actionable: directly informs planning/decision-making
- Valid: aligned, unbiased, informs understanding
- Consistent: accurate within/across sources and over time

General Strategies for Reporting

Relevance: Label/tag each result by the component it supports.

Verifiability: Provide written documentation of events.

Representativeness: Compare the sample to the population and describe the sampling procedure. For example: "The audit trail is depicted in Figure A.1. We entered the audit trail with a modified random sample of currently enrolled students and recent program graduates chosen from within the target numbers outlined in the table below. These target numbers were established to reflect the approximate proportion of students enrolled by level, endorsement, and gender. This distribution list was given to our Field Placement Director, who went into the files and randomly pulled files from within each of the categories above. The files for this group of 40 candidates and program completers served as the starting point for a number of different probes that relate to the quality control dimensions."
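To illustrate the kind of procedure described above, here is a minimal sketch of proportional stratified sampling in Python with pandas (GroupBy.sample requires pandas >= 1.1); the strata columns, records, and sampling fraction are all hypothetical, not taken from the EPP's brief.

```python
import pandas as pd

# Hypothetical candidate files with two stratification characteristics.
files = pd.DataFrame({
    "level":  ["UG"] * 6 + ["Grad"] * 4,
    "gender": ["F", "F", "F", "F", "M", "M", "F", "F", "M", "M"],
})

# Draw the same fraction from every level-by-gender cell so the sample
# mirrors the population's distribution across those characteristics.
sample = files.groupby(["level", "gender"], group_keys=False).sample(
    frac=0.5, random_state=0
)
print(sample)
```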

Representativeness: Compare sample to population. A different EPP provided comparisons in a series of tables such as the one below that compared the sample to the population of completers by program option, ethnicity, and gender.

Graduation Semester  Gender  Candidates (N=90)  Sample (N=45)
Fall 2013            Male    10                 5
Fall 2013            Female  20                 10
Spring 2014          Male    20                 10
Spring 2014          Female  40                 20

Representativeness: Ideally, the table would contain totals, and the table or narrative would directly compare the percentages for each characteristic rather than leaving that to the reader. A cross-tabulation or Excel pivot table that lists the nested characteristics in a single table would be best if a table format is used. Many helpful tutorials on creating pivot tables can be found with a Google search.
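As one illustration of that advice, the following sketch (assuming pandas) rebuilds the counts from the table above, computes the column percentages, and appends totals so the sample and population distributions can be compared directly; this is one possible cross-tabulation layout, not a required format.

```python
import pandas as pd

# Counts copied from the example table above.
counts = pd.DataFrame(
    {"Candidates": [10, 20, 20, 40], "Sample": [5, 10, 10, 20]},
    index=pd.MultiIndex.from_product(
        [["Fall 2013", "Spring 2014"], ["Male", "Female"]],
        names=["Graduation Semester", "Gender"],
    ),
)

# Column percentages: each group sums to 100, so rows compare directly.
pct = counts.div(counts.sum(axis=0), axis=1).mul(100).round(1)

# Side-by-side Ns and percentages, with a totals row.
table = pd.concat({"N": counts, "%": pct}, axis=1)
table.loc[("Total", ""), :] = table.sum(axis=0)
print(table)
```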

Cumulativeness: Does it all add up? Since multiple measures are used, the conclusions the program draws about whether a component or standard is met should refer to all of the measures used in support, as below:
EPP goal: Completers should be able to teach content effectively.
Proposed EPP measures: content GPA, methods GPA, licensure test score, student teaching assessment items.
Results: [data tables that show means (s.d.), ranges, pass rates, etc.]
Conclusion: Candidates' performance in InTASC-aligned courses, on the licensure test, and on InTASC- and state standards-aligned student teaching evaluations each show cohort mastery and indicate that completers are prepared to teach effectively.
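For the bracketed [data tables] placeholder above, a summary like the following could be produced. This is a minimal pandas sketch; the measure names, scores, rubric scale, and the assumed licensure cut score of 160 are hypothetical placeholders.

```python
import pandas as pd

# Hypothetical scores on each proposed measure, one row per candidate.
scores = pd.DataFrame({
    "content_gpa":   [3.1, 3.6, 2.9, 3.4],
    "methods_gpa":   [3.3, 3.8, 3.0, 3.5],
    "licensure":     [168, 175, 160, 171],   # assumed cut score: 160
    "student_teach": [3.2, 3.7, 2.8, 3.4],   # 1-4 rubric; goal mean: 3
})

# One row per measure: N, mean, standard deviation, and range.
summary = pd.DataFrame({
    "N":    scores.count(),
    "Mean": scores.mean().round(2),
    "SD":   scores.std().round(2),
    "Min":  scores.min(),
    "Max":  scores.max(),
})
# The pass rate applies only to the licensure test in this sketch.
summary.loc["licensure", "Pass rate"] = (scores["licensure"] >= 160).mean()
print(summary)
```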

Actionability: Data/evidence is sufficiently fine-grained and disaggregated to tell you what happened, in which subgroup, and what you might need to do to change the outcome.
Actionable finding: According to Praxis II results disaggregated by program, the level of subject matter preparation is inconsistent across licensure areas: the first-attempt pass rate was lower for elementary education candidates, a significant percentage of whom struggled on the mathematics and social studies subtests.
Action plan for continuous improvement: review sessions, tutoring, or new/different course requirements in these areas for elementary candidates.
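A disaggregated table behind a finding like this one could be computed as below; this is a minimal pandas sketch in which the program names, pass indicators, and subtest scores are hypothetical.

```python
import pandas as pd

# Hypothetical first-attempt results, one row per candidate.
results = pd.DataFrame({
    "program":            ["Elementary", "Elementary", "Secondary", "Secondary"],
    "first_attempt_pass": [False, True, True, True],
    "math_subtest":       [148, 163, 171, 169],
    "ss_subtest":         [150, 161, 170, 168],
})

# First-attempt pass rate and mean subtest scores by licensure area.
by_program = results.groupby("program").agg(
    n=("first_attempt_pass", "size"),
    pass_rate=("first_attempt_pass", "mean"),
    math_mean=("math_subtest", "mean"),
    ss_mean=("ss_subtest", "mean"),
)
print(by_program)
```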

Validity and Consistency: Data/evidence is trustworthy and stable. These topics will be covered in more detail in another session at the conference, but it is important to note that: If, for example, you conduct a content analysis for assessing alignment, report the actual results of the analysis, not just the conclusion that alignment was found to be sufficient. If, for example, you use grades as evidence, report how the grades were calculated (e.g., 50% weight for tests/papers, 30% weight for discussion/participation, etc.) to the extent that you know this.

Validity and Consistency (cont.): Present rank correlations showing that candidates with higher content-area grades tend to have higher education course grades, etc. If ratings are used, report the qualifications of the raters that convince you they are able to rate accurately. If a rater scores multiple candidates, report the extent to which they give the same performance the same score. If multiple raters are used, report the extent to which they assign the same ratings to the same candidate performance.
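For instance, the rank correlation and a simple rater-agreement rate could be computed as in this sketch (assuming pandas and SciPy are available); the grades and ratings are hypothetical.

```python
from scipy.stats import spearmanr
import pandas as pd

# Hypothetical paired GPAs, one row per candidate.
grades = pd.DataFrame({
    "content_gpa":   [2.9, 3.1, 3.4, 3.6, 3.8],
    "education_gpa": [3.0, 3.2, 3.3, 3.7, 3.9],
})
# Spearman rank correlation between content and education course grades.
rho, p = spearmanr(grades["content_gpa"], grades["education_gpa"])
print(f"Spearman rho = {rho:.2f} (p = {p:.3f})")

# Exact-agreement rate between two raters scoring the same performances.
rater_a = [3, 4, 2, 3, 3]
rater_b = [3, 4, 3, 3, 3]
agreement = sum(a == b for a, b in zip(rater_a, rater_b)) / len(rater_a)
print(f"Exact agreement: {agreement:.0%}")
```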

Reporting Qualitative Data: Context is especially important. Be sure to report the who, what, when, where, how, and why. See the methods description from an Inquiry Brief below: Analysis of Interview Transcripts: Our analysis of the data generated by the focus group interview began with a process of collectively open coding, or reading the data line by line and attaching labels to what we believed was taking place. In the introduction to his coding manual for qualitative research, Saldaña (2009) captures the idea of a code more fully: "A code in qualitative inquiry is most often a word or short phrase that symbolically assigns a summative, salient, essence-capturing, and/or evocative attribute for a portion of language-based or visual data" (p. 3). Similarly, we assigned attributes to portions of the data until larger categories started to take form. An overview of our coding process is depicted in Figure 2; however, the steps described were not necessarily intended to be formulaic, but rather to serve as a guide, thereby loosening the grip on this research technique.

Reporting Qualitative Data (cont.): They went on to present the coding categories, the distribution of data across categories, and exemplar statements. They reported the dominant themes and the criteria they used to determine dominance (e.g., repetition, emphasis, etc.). They then drew conclusions from the results. This is a powerful description of a rigorous qualitative analytic process of conceptualization, review, coding, categorization, classification, and summarization. They also described another process in which a priori codes were applied to textual data.
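The distribution of data across coding categories, and a repetition-based criterion for dominance, can be tabulated simply; in this sketch the codes and excerpts are hypothetical placeholders, not the EPP's actual categories.

```python
from collections import Counter

# Hypothetical coded segments as (code, excerpt) pairs.
coded_segments = [
    ("mentoring",  "My cooperating teacher modeled..."),
    ("mentoring",  "Weekly debriefs helped me..."),
    ("assessment", "I learned to use exit tickets..."),
    ("mentoring",  "Feedback on my lesson plans..."),
]

# Distribution of data across categories, so dominance criteria such as
# repetition can be reported alongside exemplar statements.
counts = Counter(code for code, _ in coded_segments)
total = sum(counts.values())
for code, n in counts.most_common():
    print(f"{code}: {n} segments ({n / total:.0%})")
```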

Reporting Quantitative Data
- Disaggregate data by licensure area. As appropriate, also disaggregate by mode of instruction (e.g., in-person vs. online), delivery location (e.g., main campus vs. branch campus), and level (e.g., UG vs. MAT).
- Report Ns, score ranges, means, and standard deviations (see the sketch below).
- Report passing or goal scores and the meaning of score levels (e.g., student teaching evaluations might be scored on a scale of 1=unacceptable to 4=exemplary, with a goal mean score of 3=proficient for each group).
- Report response rates for surveys.
- Report benchmarks when available, in the form of means or ranges for other candidates in the state, the nation, or some other comparison group.
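As referenced in the list above, here is a minimal pandas sketch of a disaggregated summary (by mode of instruction) with an assumed state benchmark mean and a survey response rate; all scores and counts are hypothetical.

```python
import pandas as pd

# Hypothetical student teaching evaluation scores by mode of instruction.
evals = pd.DataFrame({
    "mode":  ["in-person", "in-person", "in-person", "online", "online"],
    "score": [3.4, 3.1, 3.3, 2.9, 3.0],   # 1=unacceptable ... 4=exemplary
})

# N, mean, SD, and range per mode, plus the difference from an assumed
# state benchmark mean of 3.1.
by_mode = evals.groupby("mode")["score"].agg(
    ["count", "mean", "std", "min", "max"]
)
by_mode["vs_benchmark"] = (by_mode["mean"] - 3.1).round(2)
print(by_mode.round(2))

# Survey response rate (hypothetical counts).
invited, returned = 120, 84
print(f"Response rate: {returned / invited:.0%}")
```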

Tips: Whether the data/evidence is quantitative or qualitative, it may be helpful to make a chart and record the way in which each piece and each set supporting a component/standard reflects the qualities of good evidence described in the CAEP Evidence Guide (pp. 35-38). To the extent that you can, take advantage of CAEP's phase-in period. Our own experience transitioning from the legacy pathways to CAEP illustrates how the steepest climb is at the start of the journey, and planning time leads to better outcomes.

Feedback Opportunity Engaged feedback is vital to CAEP. You will have an opportunity to complete a survey at the end of the conference. Surveys will be sent via email on Friday, April 10. We encourage your participation. Thank You!