CONTINUOUS IMPROVEMENT PLAN


PAST PRACTICES

Assessment in the College was relatively dynamic during the period between accreditation visits. In the past, accreditation was largely a function of an external consultant hired prior to the site visit. This practice, coupled with a number of changes in leadership, not surprisingly yielded a 2007 NCATE visit in which the College was cited for deficiencies in assessment. A number of faculty worked to improve assessments for the focused visit that occurred the following year, but the changes were made at the programmatic level. The ongoing assessments that were conducted were largely compliance driven: Title II, the Professional Education Data System (PEDS), the Colorado Department of Education (CDE), and the Specialty Professional Associations (SPAs). The identification of assessment as an area needing improvement, together with the College's understanding that better assessments and a unit-wide assessment plan are needed to move it forward, are the primary factors that led to assessment becoming the area of focus for the Continuous Improvement Plan.

Currently the College is undergoing a number of important transitions related to assessment. The most important is a movement from compliance-driven assessment with little long-term vision to an internally directed, program-improvement framework. The second is a shift from NCATE to CAEP, which marks not only a new use of assessment data but also new assessments and audiences. The third is a shift from the College's internally validated assessments to more nationally and externally validated measures. Understandably, the College is working to adjust to these differences and to integrate substantial cultural and practical shifts. Departments often functioned as autonomous units with limited integration of assessment, data sharing, or concrete steps toward achieving shared goals. This culture has begun to visibly shift under a newly revitalized Assessment and Accreditation Committee supported by improved business practices, dedicated personnel and program support, and increased availability of data and new tools. Among these important steps are the creation of an integrated database capable of tracking students from inquiry to the workplace (Cherwell) and the addition of an Assessment and Operations Specialist supporting data flow and assessment needs. Finally, the University has begun to realize some of the benefits of the multi-million-dollar, system-wide student record system (ISIS), which is capable of supporting advanced business analytics.

As a result of this evolution, the Assessment and Accreditation Committee and the College as a whole have begun to identify the key needs that must be met to support this change. Among these are regular reports and new assessment tools that can answer vital questions in a more timely and transparent manner. More importantly, they have begun to address broader questions of candidate achievement and program improvement. These needs were highlighted by the convergence of two key events over the past 14 months. The first was the submission of SPA reports. As the first step toward national program recognition and CAEP accreditation, the majority of programs submitted SPA reports. Because of the multitude of loosely integrated data streams and a limited assessment framework, the work proved exceptionally taxing. The difficulty in producing strong SPA reports highlighted the need for better tools, planning, and reporting to departments and program leaders.

Nearly simultaneously, the University of Colorado Board of Regents initiated a program prioritization process based largely on the work of Dickeson (2010). As with the SPA process, the campus struggled to provide the required data and was not positioned to supply the program-specific metrics needed to benefit fully from the prioritization process. Although the College was able to provide program-specific data, doing so was a cumbersome and time-consuming process.

CURRENT STATE

Concurrent with this shift in how data are collected and utilized are a number of key drivers of change. The most important is Colorado's entry into the use of value-added measures (VAM) tied to teacher effectiveness. While the legitimacy of tying student growth data to individual teachers continues to be researched, the State's data collection will provide critical data to Educator Preparation Programs (EPPs). An outgrowth of Senate Bill 191, this system not only requires districts to use measures of student growth for teacher retention, it also links these data back to EPPs for the first time. The data, when fully available, will allow EPPs to longitudinally track the workforce participation and results of program completers in a systematic way never previously possible. With this increase in data quality and availability will surely come increased scrutiny of EPPs to account for differences in workforce outcomes. As the College and the Colorado Department of Education work through implementation, and as has been the case in other states, it will take multiple iterations to reach full implementation.

The wealth of information provided through this framework is hard to overstate. These data will help inform the College about teachers' impact on student growth. Unlike other measures such as employer surveys, portfolios, or alumni surveys, the data returned to the College will provide a view of teachers as their districts see them. They will shift the foundation of conversations between school district superintendents and human resource directors and the College toward a more outcome-centric view. Given the diversity of districts served by the College, ranging from high-risk urban districts to exceptionally small rural ones, the College can better address the needs of regional employers. The data will serve as a lever toward measuring and improving program efficacy.

The second key driver is the rapid growth of both UCCS and the College. A number of new programs are entering the assessment system, often bringing new perspectives on the role and nature of assessment. These include: UCCSTeach, a UTeach replication site with its first program completer in fall 2013; a new, cross-disciplinary Bachelor of Innovation in Inclusive Early Childhood Education (BI:IECE), started in fall 2013; a proposed Bachelor of Arts in Inclusive Elementary Education, to begin in 2015; and school-district-level cohorts in principal preparation and educational leadership, begun in 2013. All of these programs have leaders eager for feedback on how to improve and grow them. As a result, there is a renewed and pressing need for candidate- and course-level data to inform questions about programs, as well as for an infrastructure to support unprecedented levels of growth.

Another driver of change is the college-level conversations that have emerged over the past three years. Their impact is seen in departmental meetings, cross-department discussions, and the creation of new programs.
The lessons from the SPA and program prioritization processes have also shaped this change. The combination of experiences highlighted the demand for quality, timely data and the need to build the requirements of accreditation into the regular operations of programs and departments. The first steps in this process were identifying regular reports to inform practice. In conjunction with this need, the College has taken tangible steps to automate processes. The combination of the revitalized Assessment and Accreditation Committee and new data streams has laid the foundation for departments to work collaboratively on strategic investigations for regular program improvement.

A final critical component in the move toward an integrated assessment framework is the reconstitution of the College Advisory Board (CAB). The CAB has members from a number of stakeholder groups beyond the College, including district and community representatives. As a result, the CAB provides an authentic audience for the work of the assessment committee in addition to serving as a source of feedback. Fully integrating the CAB into the assessment framework represents a key component of the shift from predominantly internally generated assessments with largely internal audiences to a variety of internal and external assessments reported to a larger audience.

FUTURE STATE

In order to accomplish the critical changes needed to move assessment and accreditation initiatives forward and to institutionalize practices, the College has already taken a number of concrete steps. The first is regular reporting of data on key internal measures. These data will provide new and more complete summative measures of program impact and will serve as a springboard for more detailed conversations and investigations into candidate quality and content knowledge. Because a large number of programs utilize a cohort model, the College is able to link course-taking patterns, year of completion, and instructor to groups as well as to individual students. Thus, if a program makes a change within a given year, it is reasonably easy to trace the effects through cohorts.

The biggest changes include incorporating new, externally validated measures into programs. Educator preparation is implementing a number of new instruments in the near future. These include:

Student Perception Survey (SPS): jointly developed by the Gates Foundation, the Tripod Project, and the Colorado Legacy Foundation. The instrument is directly aligned to the Colorado Teacher Quality Standards and measures multiple domains of student perceptions, including learning environment and classroom management.

edTPA: developed at Stanford to predict candidates' classroom readiness, it is a performance-based assessment of teaching. COE educators will also score candidates' work to compare their predicted scores with those of nationally certified examiners. UCCSTeach, TELP, and ALP are piloting edTPA in 2014.

Classroom Assessment Scoring System (CLASS): developed by the University of Virginia's Curry School of Education. The instrument's domains target the craft of teaching, and it will be administered multiple times to measure candidate growth. Both Special Education and UCCSTeach are currently piloting CLASS.

The Novice Teacher Project (The Novice Teacher Core Competencies: Impacts on Student Achievement and Effectiveness of Teacher Preparation): a collaborative statewide effort piloted by a number of institutions and led by the University of Denver. The participating EPPs have received data from the 2013 administration of the survey and anticipate second-year data in summer 2014.

Internally developed measures will remain an important part of the College's assessment framework. The College already utilizes an exit survey for program completers. Drawing on workforce data from SB 191, the College is building a number of new instruments, including a post-employment survey to be administered annually for the first five years after a completer exits a program and an employer survey, also administered annually. The post-employment survey is being jointly developed with the Office of Alumni Relations to provide feedback for the University's strategic plan without creating redundant surveys.

Paramount among the internally developed measures are the key assignments linked to SPA standards in each course throughout the College. Departments and programs are understandably at different places, culturally and technically, in this process. Leadership and Special Education are front runners, having already implemented initial rounds of key assignments aligned to the ELCC and CEC standards as well as the appropriate Colorado standards. To facilitate this process, the key assignments and rubrics are incorporated into the Blackboard platform faculty already use. This also allows reporting to be automated and rubrics to be shared easily among faculty members teaching the same course. In the case of TELP and ALP, assignments are linked to the Colorado standards in both Blackboard and TaskStream.

Reflecting the collaborative nature of educator preparation, the College has initiated an annual dialogue with faculty in the College of Letters, Arts and Sciences (LAS) who teach in the relevant content areas. This dialogue is designed to serve a number of functions, including disseminating SPA requirements regarding content knowledge, establishing a framework for sharing internally and externally collected data about candidates and completers, and identifying areas for improved articulation between the two colleges.

The College has utilized a modified version of the Plan/Do/Study/Act cycle outlined in the CAEP Evidence Guide when introducing its major college-wide initiatives. These include a multitude of departmental projects such as key assignment reporting, new courses, online offerings, and the implementation and adoption of external measures such as CLASS.

IMPLEMENTATION TIMELINE

WHEN: Spring 2014

WHAT: Full implementation of the student exit survey
WHAT'S NEXT: Departments utilize data in reports to the A&A Committee, College Advisory Board, and Dean. Recommend changes to the instrument if needed.

WHAT: edTPA piloted
WHAT'S NEXT: Programs will analyze data to determine program weaknesses and strengths. Make programmatic adjustments based on findings.

WHAT: CLASS piloted
WHAT'S NEXT: Scores and process will be analyzed by programs. Cross-departmental faculty will make recommendations for next steps.

WHEN: 2014-15 Academic Year

WHAT: CAEP BOE on-site visit, November 16-18
WHAT'S NEXT: Consider lessons learned from the visit; make changes if needed.

WHAT: Pilot of the Student Perception Survey (SPS)
WHAT'S NEXT: Programs will utilize data to inform candidates of their performance and analyze data for programmatic use.

WHAT: Full implementation of CLASS and edTPA
WHAT'S NEXT: Analyze and disseminate findings; make programmatic adjustments if necessary.

WHAT: Pilot employer survey (TELP, ALP, SELP, LEAD)
WHAT'S NEXT: Analyze and disseminate findings; adjust instrument and process; consider programmatic adjustments based on results.

WHAT: Pilot alumni survey (TELP, ALP, SELP, LEAD, school counselors)
WHAT'S NEXT: Analyze and disseminate findings; adjust instrument and process; consider programmatic adjustments based on results.

WHEN: 2015-16 Academic Year

WHAT: First wave of SB 191 data
WHAT'S NEXT: Determine accuracy of data per program; establish a process for analyzing and disseminating data. COE discussion of the data's implications.

WHAT: Initial analysis and reflection on CLASS and edTPA
WHAT'S NEXT: Analyze and disseminate findings; consider programmatic adjustments.

WHAT: Full implementation of the Student Perception Survey (SPS)
WHAT'S NEXT: Analyze and disseminate findings; consider programmatic adjustments.

WHAT: Unified COE employer survey (TELP, ALP, SELP, LEAD)
WHAT'S NEXT: Finalize instrument and conduct surveys. Analyze and disseminate findings; adjust instrument and process. Consider programmatic adjustments based on results.

WHAT: Unified COE alumni survey (TELP, ALP, SELP, LEAD, school counselors)
WHAT'S NEXT: Finalize instrument and conduct surveys. Analyze and disseminate findings; adjust instrument and process. Consider programmatic adjustments based on results.

WHEN: 2016-2017 Academic Year

WHAT: Second wave of SB 191 data (first year with two waves; begin examining growth)
WHAT'S NEXT: Continue to monitor the data's accuracy. Analyze and disseminate findings; determine patterns that need to be addressed; consider programmatic adjustments.

WHAT: Continuous improvement adjustments for CLASS and edTPA
WHAT'S NEXT: Continue to collect and analyze findings; disseminate results and consider programmatic adjustments if needed.

WHEN: 2017-2018 Academic Year

WHAT: Initial analysis and reflection on SB 191 data
WHAT'S NEXT: Continue to monitor the data's accuracy. Analyze and disseminate findings; determine patterns that need to be addressed; consider programmatic adjustments.

WHAT: Initial analysis and reflection on the Student Perception Survey (SPS)
WHAT'S NEXT: Analyze and disseminate findings; consider programmatic adjustments.

WHAT: Full implementation of surveys (TELP, ALP, SELP, LEAD, school counselors)
WHAT'S NEXT: Finalize instruments and conduct surveys. Analyze and disseminate findings; adjust instrument and process. Consider programmatic adjustments based on results.

WHAT: Ongoing monitoring of CLASS and edTPA
WHAT'S NEXT: Analyze and disseminate findings; consider programmatic adjustments.

WHEN: Ongoing

WHAT: Regular reporting: Title II; CAEP annual report; SPA updates; PLACE, PRAXIS, and edTPA scores; course grades; enrollment; degree completions; internship placements; faculty course loads; Faculty Report of Professional Activities (FRPA); budget data.

CONCEPTUAL DIAGRAM OF THE ASSESSMENT FRAMEWORK

Figure 1 is a visual representation of the assessment and accreditation framework for the College. It shows the flow of aggregated data from the College's support personnel to departments. Departments, and programs within departments, make use of the data both internally, to generate an annual report to the Assessment and Accreditation Committee, and externally, for reporting purposes such as SPAs. The annual report is framed around the CAEP, CACREP, and SPA standards and is presented to the Assessment and Accreditation Committee at its February meeting. The Assessment and Accreditation Committee provides feedback to departments, and the Associate Dean aggregates the reports into an annual report for the College. This report is presented to the CAB and Dean at their April meeting. The Dean and Associate Dean are integral to this process at a number of junctures, providing feedback to the departments on the Colorado, CAEP, CACREP, and SPA standards as well as on the College's own initiatives, such as the mission/vision and candidate learning outcomes. The Associate Dean is also central in providing feedback on data quality and collection to support personnel throughout the process.

The values and mission of the College are infused throughout all our work, from program design to courses and assessments, and ensure that innovation, equity, and inquiry are woven throughout programs. They are also considered when determining that program completers are competent representatives of the College of Education as well as of their chosen professions. Among these internal measures are specific visions regarding equity, diversity, and social justice.

I. Preliminary Response to the Continuous Improvement Plan (CIP)

Use the Rubric for Evaluating the Capacity and Potential in the CIP to provide feedback on:

a. The EPP's capacity for initiating, implementing, and completing the CIP.

The EPP acknowledges the importance of assessment leadership in the process of continuous improvement and outlines basic information in the form of five key features related to the success of the CIP: the Assessment and Accreditation Committee, improved business practices, dedicated personnel, program support, and increased availability of data and tools. While the CIP lays out a timetable for the implementation of assessments, the collection of assessment data, and the subsequent review of those data, it demonstrates only a moderate level of commitment by the EPP. The CIP stresses the importance of assessment and offers a narrative explaining how the EPP's culture of assessment has evolved since the previous accreditation visit, but more detailed information about the key personnel and resources needed to carry out the CIP is required. The EPP's capacity for initiating, implementing, and completing the CIP is emerging.

b. The potential of the CIP to have a positive impact on the EPP and its candidates.

While the EPP emphasizes a shift in assessment culture, the CIP's goals are very loosely defined in the plan. The EPP is dedicated to moving from a compliance-driven framework to a program-improvement-driven framework with more nationally and externally validated measures, but the CIP offers limited information as to how the EPP will achieve this goal. Specific goals for EPP programs would add focus and clarity to the plan. The potential of the CIP to have a positive impact on the EPP and its candidates is emerging.

c. The proposed use of data and evidence.

As evidenced by the CIP, the EPP plans to implement assessments and collect data in a systematic manner. The CIP does not, however, include any mention of collecting and analyzing baseline and final data. Plans for baseline data and yearly objectives are needed beyond the planned implementation of assessments. Further, the CIP should include a plan to gather final evidence based on processes and outcomes, in addition to moving beyond piloted assessments. The proposed use of data and evidence is undefined.

d. The potential of the EPP to demonstrate a higher level of excellence beyond what is required in the standards.

The CIP offers a clear plan to implement current and piloted assessments as well as to regularly collect assessment data in a systematic fashion. By the conclusion of the 2017 academic year, the EPP will have implemented more than five assessment measures, allowing for the collection of data related to candidate performance, impact on P-12 learners, and EPP operations. The EPP includes planned instances of internal and external stakeholder review of assessment data and indicates that it will respond with program enhancements accordingly. The CIP should, however, include a statement as to how it will lead to a higher level of excellence beyond what is required for most of the CIP's focal areas.

The CIP shows some promise; there are more than two elements at the emerging level and two elements that are undefined, needing clarification and/or enhancement. The CIP offers basic information on the personnel and resources expected to support successful implementation. Further, the CIP outlines basic plans to implement multiple measures. Assessment implementation appears to be planned, but no yearly indicators are identified. CIP goals are very loosely defined and do not address specific programs. There does not appear to be any mention of baseline or final data; while this CIP would not include final data, as it projects multiple academic terms into the future, it does not mention any plans for comparing multiple data sets aside from statements concerning the annual review of data. The EPP identifies its commitment to an evolving culture of assessment; it does not, however, include a statement as to how the CIP will lead to a higher level of excellence beyond what is required for most of the CIP's focal areas.

FIGURE 1: Assessment and Accreditation Visual Representation

[Figure 1 depicts the flow among the Dean, Associate Dean, College Advisory Board (CAB), and the Assessment and Accreditation (A&A) Committee, which generates the annual report aligned to the CAEP, CACREP, and SPA standards and provides structure and guidance for the departments (Curriculum & Instruction, Counseling, Leadership, Special Education). Data support is provided by COE Technology Support, the Finance and Budget Specialist, the Office of Assessment and Accreditation, and Faculty. Regular reporting covers Title II, the CAEP annual report, CACREP, SPA updates, PLACE, PRAXIS, edTPA, course grades, enrollment, degree completions, internship placements, faculty course loads, FRPA, and budget data.]