Continuous Improvement Planning Document


Candidate and Completer Data Evidence

Using Data on Candidate Progress (formative) and Performance (summative)*

Lesson Plan Assessment
What does it assess? Standards 1-8: planning for instruction.
Who does the evaluating? Primary instructors of EDUC 220, 321, and 346/7.
Collected how often? In EDUC 220 (baseline for an assessment cohort), EDUC 321 (midpoint for the assessment cohort), and EDUC 346/7 (endpoint for the assessment cohort).
Analyzed how? Scores received by candidates on each indicator will be aggregated by class to see which indicators pose difficulty at each stage in the program. Each course assessment point has a different acceptable score because we anticipate growth through the program and therefore raise the bar for acceptability: candidates in EDUC 220 are expected to be between developing and proficient (average score of 2.5 on all indicators); candidates in EDUC 321 are expected to be proficient (average score of 3); candidates in EDUC 346/7 are expected to be between proficient and exemplary (average score of 3.5). Scores are also taken as a whole to see if patterns emerge across the program in indicators that pose difficulty. Lesson plans submitted by candidates in 321 and 346/7 will also be compared to their previous lesson plan scores to evaluate individual and group progress.
Who will participate in analysis? Faculty who deliver the program.
What do we hope to learn? Insight as to which standards (1-8) are most difficult for candidates. Analysis should also yield insight into the progress of individual candidates so that we may offer additional support for struggling learners.

Dispositions Assessment
What does it assess? Standards 9-10: professional dispositions.
Who does the evaluating? Candidates (self-assessment); instructors of EDUC 220, 321, and 346/7.
Collected how often? In EDUC 220, 321, and 346/7.
Analyzed how? Individual candidates' self-assessment scores on the indicators will be compared to the scores given by their instructor. If discrepancies exist, or if a candidate has scored unacceptable on any indicator, a meeting will be scheduled with two faculty members and the candidate to (1) discuss concerns and (2) write a plan of action. Each year, documentation of these meetings (along with any previous dispositional assessments submitted) will be shared at department meetings and with the Teacher Preparation Committee during application review and prior to student teaching. Scores received by candidates on each indicator will also be aggregated by course to see which indicators pose difficulty at each of three stages in the program (beginning, middle, and endpoint in a time series).
What do we hope to learn? Information about individual candidates' dispositional performance so that we can address concerns with candidates and implement action plans. Consistent concerns (especially those that persist despite intervention) may lead to counseling the candidate out of the program.

Diversity Reflection
What does it assess? Standards 1-2: understanding of diversity as it relates to the profession of teaching.
Who does the evaluating? Instructors of EDUC 220, 321, and 346/7.
Collected how often? In EDUC 220, 321, and 346/7.
Analyzed how? Scores received by candidates on each indicator will be aggregated by course to see which indicators pose difficulty at each of the three stages in the program. Each course assessment point has a different acceptable score because we anticipate growth through the program and therefore raise the bar for acceptability (see the expectations for EDUC 220, 321, and 346/7 above). Scores are taken in aggregate to see if patterns emerge across the program in indicators that pose difficulty. Diversity assessments submitted by candidates in 321 and 346/7 will also be compared to their previous diversity assessment scores to evaluate individual and group progress.
What do we hope to learn? Insight as to which aspects of diversity (responsibility; kinds of diversity; skills, knowledge, and dispositions needed to teach diverse students well; and importance of self-awareness) are most difficult for candidates. Analysis should also yield insight into the progress of individual candidates so that we may offer additional support for struggling learners (or counsel them out of the program).

Clinical and Field Evaluation
What does it assess? Standards 1-10: planning and implementation of instruction as well as professional deportment.
Who does the evaluating? Host teachers (220, 321, 346/7) and college supervisors (346/7).
Collected how often? In EDUC 220, EDUC 321, and twice in EDUC 346/7.
Analyzed how? Scores received by candidates on each of the 20 indicators will be aggregated by class to see which indicators pose difficulty at each stage in the program. Each class has a different acceptable score (see above). Scores are taken in aggregate to see if patterns emerge across the program in indicators that pose difficulty. Field/clinical evaluations of candidates in 321 and 346/7 will also be compared to their previous field/clinical evaluations to evaluate individual and group progress.
What do we hope to learn? Insight as to which of the indicators are most difficult for candidates. Analysis should also yield insight into the progress of individual candidates so that we may offer additional support for struggling learners (or counsel them out of the program).

Electronic Portfolio Evidence
What does it assess? Competence in each of the 10 Standards (summative).
Collected how often? Each spring in EDUC 346/7 (not a time-series assessment).
Analyzed how? Scores received by candidates on each of the ten indicators will be aggregated to see if patterns emerge in indicators that pose difficulty. Candidates at this stage are expected to be at least proficient on all indicators.
What do we hope to learn? Insight as to which standards (1-10) are most difficult for candidates. Analysis should also yield insight into the progress of individual candidates so that we may offer additional support for struggling learners (or counsel them out).
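The rising-bar rule in the candidate assessments above (class average of at least 2.5 per indicator in EDUC 220, 3.0 in EDUC 321, and 3.5 in EDUC 346/7) can be sketched as a short script. This is purely illustrative: the plan prescribes no software, and the data shape, function names, and 1-4 rubric scale (1 = unacceptable, 2 = developing, 3 = proficient, 4 = exemplary) are assumptions.

```python
# Illustrative sketch only: the plan itself prescribes no software.
# Course keys and the 1-4 rubric scale are assumptions, not part of the plan.
from statistics import mean

# Minimum acceptable class average rises at each assessment point.
THRESHOLDS = {"EDUC 220": 2.5, "EDUC 321": 3.0, "EDUC 346/7": 3.5}

def indicator_averages(scores_by_candidate):
    """Aggregate per-indicator class averages.

    Hypothetical data shape: {candidate: {indicator: score}}.
    """
    indicators = {}
    for ratings in scores_by_candidate.values():
        for indicator, score in ratings.items():
            indicators.setdefault(indicator, []).append(score)
    return {ind: mean(vals) for ind, vals in indicators.items()}

def flag_difficult_indicators(course, scores_by_candidate):
    """Return indicators whose class average falls below the course's bar."""
    bar = THRESHOLDS[course]
    averages = indicator_averages(scores_by_candidate)
    return sorted(ind for ind, avg in averages.items() if avg < bar)

# Example: two candidates in EDUC 321 scored on two indicators.
scores = {
    "Candidate A": {"Std 1": 3.0, "Std 2": 2.0},
    "Candidate B": {"Std 1": 3.5, "Std 2": 2.5},
}
print(flag_difficult_indicators("EDUC 321", scores))  # ['Std 2']
```

Because the bar rises with each course, the same class averages that pass in EDUC 220 can be flagged at a later assessment point.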

Using Data on Completer Effectiveness and Impact**

Employer Survey
What does it assess? Completers' impact on P-12 learning.
Who does the evaluating? Employers.
Collected how often? Employers of completers are asked to complete this survey.
Analyzed how? Ratings on each of the questions will be aggregated to see if patterns emerge in indicators that pose difficulty. Data will be disaggregated by program area to see how completers from each program perform.
What do we hope to learn? Analysis of survey data and interview notes should yield insight into areas of strength and weakness among completers of our program, and where we need to adjust our teaching approaches.

Completer Survey
What does it assess? Program effectiveness.
Who does the evaluating? Completers (self-assessment).
Collected how often? Completers are asked to complete this survey.
Analyzed how? Ratings on each question will be aggregated to see if patterns exist in indicators that pose difficulty. Four of the questions specifically ask completers to reflect on the degree to which R-MC prepared them for success in each category. These responses are numerical and narrative; they will be compiled separately and used to reflect on program effectiveness.
What do we hope to learn? Analysis of survey data should yield insight into areas of strength and weakness among completers of our program, and where we need to adjust our teaching approaches. Analysis of the program effectiveness questions should provide specific insight into the degree to which completers felt R-MC prepared them and yield specific recommendations for program improvement.

Completer Focus Group Interview
What does it assess? Completers' impact on P-12 learning; R-MC program effectiveness.
Who does the evaluating? Completers.
Collected how often? Completers are asked to participate in a focus group interview.
Analyzed how? Questions focus on how completers measure student learning and use assessment to inform pedagogical changes. Notes from the focus group interview will be compiled in order to provide additional insight into strengths and areas of weakness related to measuring student learning and using data to drive instruction, as well as R-MC program effectiveness, and used to make program changes.
What do we hope to learn? Analysis of focus group feedback should yield insight into how well completers are prepared to teach diverse learners and measure student learning. Areas of strength and weakness will be identified and used to make program changes.

Completer VA Teacher Summative Performance Evaluation (T-LIPES)
What does it assess? Completers' impact on P-12 learning; program effectiveness.
Who does the evaluating? School-based administrators.
Collected how often? Completers are invited to submit copies of their final formal evaluations.
Analyzed how? Scores on final evaluations and final recommendations by employers (for continued employment, areas of growth) are considered alongside employer surveys and completer surveys to assess the preparedness of our completers for the work of teaching. Section 7 on Virginia teacher evaluations focuses specifically on teachers' impact on student learning; scores on this section will be compiled separately to assess completer impact on P-12 learning.
What do we hope to learn? Analysis of formal evaluations should yield insight into areas of strength and weakness among completers of our program, and where we need to adjust our teaching approaches. Analysis of scores on Section 7 will yield insight into completers' impact on student learning.

Completer Observations (Clinical/Field Evaluation)
What does it assess? Completers' impact on P-12 learning.
Collected how often? Recent completers (from the last year) are asked if we may send a member of the faculty to their classroom to observe their teaching.
Analyzed how? Scores on each of the indicators will be aggregated to see if there are patterns in the indicators that pose difficulty. Data will be disaggregated by program to see how completers perform.
What do we hope to learn? Analysis of evaluations should yield insight into areas of strength and weakness among completers of our program, and where we need to adjust our teaching approaches.

*Progress Note: Because we spent AY 2017-2018 designing, validating, establishing inter-rater reliability for, and implementing new assessments, our pilot cycle of data was completed in May 2018. Actual data gathering and analysis to be used for program improvements begins in AY 2018-2019.

**Data-Gathering Note: Because the state of Virginia does not allow the release of K-12 student data, we are unable to assess the impact of our completers on learning as measured in end-of-year exams (SOLs). We believe that the combined data we collect on completer effectiveness (employer surveys and interviews, formal evaluations, completer surveys and focus group interview, and observations) will shed light on their impact on student learning.
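The Dispositions Assessment rule described above, comparing each candidate's self-assessment to the instructor's ratings and convening a two-faculty meeting when discrepancies exist or any indicator is scored unacceptable, can also be sketched in code. Again, this is a hypothetical illustration: the indicator names, the 1-4 scale (with 1 = unacceptable), and the discrepancy margin are assumptions, since the plan does not quantify what counts as a discrepancy.

```python
# Illustrative sketch only; the 1-4 scale and the size of a "discrepancy"
# are assumptions, not specified by the plan.
UNACCEPTABLE = 1

def needs_meeting(self_scores, instructor_scores, margin=1):
    """Flag a candidate for a two-faculty meeting when any instructor rating
    is unacceptable, or when self- and instructor ratings on the same
    indicator diverge by more than `margin` points."""
    for indicator, instructor_score in instructor_scores.items():
        if instructor_score == UNACCEPTABLE:
            return True
        self_score = self_scores.get(indicator, instructor_score)
        if abs(self_score - instructor_score) > margin:
            return True
    return False

# A candidate who rates herself exemplary (4) where the instructor sees
# developing (2) is flagged; matching ratings are not.
print(needs_meeting({"Std 9": 4}, {"Std 9": 2}))  # True
print(needs_meeting({"Std 9": 3}, {"Std 9": 3}))  # False
```

A flagged candidate would then move to the documented steps in the plan: the meeting, the written plan of action, and annual sharing with the Teacher Preparation Committee.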