Integrated Analytics for Student Success
Nancy Whitaker and Tracy Hribar
University of Wisconsin-Parkside Innovation Grant


2015-2016

I. Executive Summary

The University of Wisconsin-Parkside Institute of Professional Educator Development (IPED) used this project to pilot the use of LiveText analytics to expand its use of data to drive student success. Student artifacts were uploaded to LiveText and rated by multiple faculty assessors. These assessments were analyzed for ease of implementation and interrater reliability. The pilot enabled IPED to create and integrate the necessary LiveText analytic support structures. Project personnel worked through the operational components of utilizing a new software platform, as well as training faculty, staff, and students in its use.

II. Purpose and Objectives

This project will improve Educator Development faculty and staff use of analytics to support student learning in course and clinical experiences by expanding the existing browser-based e-portfolio and assessment management web application, LiveText.

III. Organization and Approach

LiveText consists of two modules: C1, used for storing and evaluating student work to create a portfolio for licensure and post-graduation job applications, and the Field Experience Module (FEM), in which students create pre-observation planning documents shared with university supervisors, receive feedback on clinical field experiences, and complete assessments (self-assessment, supervisor assessment, and P-12 mentor teacher assessment). LiveText is a student-centric system (UWS Operations emphasis) that has the capacity to connect storage and assessment of academic artifacts (text and video) in the C1 module as well as storage and assessment of clinical experience in the FEM module. Students self-assess and are assessed by university supervisors, faculty, and P-12 mentor teachers using this web application.
The Institute for Professional Educator Development (IPED) at the University of Wisconsin-Parkside was established in 2012. IPED is committed to unbiased, fair, and accurate assessment of teacher education candidates. All teacher education candidates in Wisconsin must create a curated electronic portfolio and pass the edTPA (a high-stakes, performance-based assessment) to achieve teacher licensure. Every teacher education program has distinguishing features, and IPED has been built on a foundation of progressive involvement in clinical field experience through collaborative partnerships between students and P-12 teachers.

The model of clinical field experience used by IPED comprises seven co-teaching strategies that increase in responsibility as the student develops pedagogical skills. IPED students engage in extensive clinical experiences prior to the student teaching semester, ranging from 240 to 440 hours in area P-12 schools. Clinical training for all P-12 teachers and teacher education students includes an orientation to the levels of co-teaching. Course experiences and clinical experiences are linked in a progression from the first to the fourth year, and students receive a basic level of rubric feedback from the LiveText Field Experience Module. This level of assessment feedback represents a baseline level of support for student success in clinical field experience. The IPED Clinical Coordinator currently uses an Access database in conjunction with LiveText basic reporting to provide metrics on student demographics, clinical performance, and progress in the program. Creating metrics with these two separate software applications is inefficient and yields limited metrics. An additional LiveText analytics package would provide an efficient, integrated assessment approach (UWS Operations emphasis) that would include portfolio components, video analysis, and field experience rubric-based feedback.

Nancy Whitaker is the department chair of the Institute for Professional Educator Development and is a faculty member and clinical supervisor in the unit. She is the direct supervisor of Tracy Hribar. She is also an experienced lead technology faculty member, a member of the university assessment committee, and the assessment liaison for IPED and the Music Department on campus. Professor Whitaker will be involved in every aspect of the grant implementation and evaluation.

III.A. Methodology

The UW-Parkside teacher education program utilized a data-driven approach (UWS Learning Technology emphasis) to: 1) implement the C1 module to provide instructor and P-12 teacher collaborative feedback on student-created videos of co-teaching connected to program goals; 2) utilize the enhanced analytics package to identify predictive flags in the program related to student success; 3) close the loop by providing detailed early intervention for students (UWS Learning Technology emphasis) based on analytics; and 4) utilize predictive analytics (UWS Learning Technology emphasis) to make changes in course and fieldwork structure and student support.

IPED requested funding for expansion of this technology-enabled learning space (UWS Learning Technology emphasis) to support extended faculty and staff use of the application and to develop a climate of data-driven assessment that involves P-12 teachers, university faculty, and the Clinical Coordinator as a network for student success. We anticipated that the Clinical Coordinator's efficiency would be increased by the use of enhanced, integrated analytic capacities in LiveText, a powerful data management tool, rather than by relying primarily on basic LiveText and Access metrics. We anticipated that students would benefit from the analytics on interrater reliability in the C1 module as they received feedback on portfolio components.

This request focused on the implementation of an add-on Analytics Module and a two-stage pilot implementation. Stage I (Foundation) consisted of creating an assessment template using the Analytics Module in conjunction with the C1 Module and running a pilot assessment on a stratified random sample of text and video student artifacts from the 300- and 400-level courses. Stage II (Integration) consisted of running analytics on a stratified random sample of one text artifact and one video artifact from students enrolled in the 300- and 400-level EDU courses.

III.B. Task Structure

This project focused on the implementation of an add-on Analytics Module in LiveText and a two-stage pilot implementation.

1. Stage I (Foundation) consisted of creating an assessment template using the Analytics Module in conjunction with the C1 Module and running a pilot assessment on a stratified random sample of text and video student artifacts from the 300- and 400-level courses (n = 8 students). Implementation of the Analytics Module and C1 Module required approximately 80 hours of work time to integrate within LiveText and prepare training documents. Professor Whitaker trained selected students in the use of the voice-responsive Swivl cameras that provided video records of clinical work, and supported the students pre-pilot in elementary clinical placements through clinical consent and upload of video data.

2. Stage II (Integration) consisted of running analytics on a stratified random sample of one text artifact and one video artifact from each student enrolled in the 300- and 400-level EDU courses (n = 8). Sample expansion required approximately 80 hours of work time. The Clinical Coordinator generated reports related to student success in individual course experiences and across the program. Utilization of the analytics required approximately 40 hours of work time. Professor Whitaker structured the identification of students within the courses, secured consent from clinical sites, and assisted students with uploading video data.

III.C. Initial Budget

1. Cost of LiveText Analytics Package for one year: $1,500

2. Salary and fringe for support staff to assist the Clinical Coordinator in creating and providing video-based training modules for students and P-12 mentor teachers, and in setting up and implementing the use of extended analytics in the C1 LiveText module: $6,000

3. Two Swivl robotic platforms (including carrying case and camera mount) to facilitate the collection of video data in P-12 classrooms @ $482.99 ea. = $965.98. These platforms will be used with iPads.

Total request: $8,465.98

IV. Analysis and Findings

The process of implementing multiple reviewers of student text and video samples required that the Clinical Coordinator create a training module with multiple student artifacts. Creating the training module was by no means simple; due in part to an upgrade of the LiveText interface in early January 2016, the Clinical Coordinator had to rely on extensive customer service support to make the Analytics Module operational with the training module assessments. The rubric used for the common assessment of student work is used throughout the IPED program. It is based on the nationally recognized teacher evaluation model by Charlotte Danielson, and it also serves as the basis for one of the two teacher evaluation models used in Wisconsin P-12 settings. Four IPED faculty members reviewed the student artifacts. Setting up the structures in LiveText to support multiple raters necessitated additional training from LiveText for the coordinator. This is noteworthy because glitches in artifact review and assessment were identified and fully addressed as part of this pilot. During the Spring 2016 semester, student teachers created a portfolio of artifacts to be assessed by multiple raters. As a result of this grant, IPED can support analysis of this portfolio data with confidence during this semester. This pilot analytic assessment was designed to understand the issues involved in creating structures supported by analytics.

The Clinical Coordinator customized the analytics to include interrater reliability by the learning standards used in the two courses represented by the artifacts. Interrater reliability on the five standards, on a scale of 1.0-4.0, ranged from a mean of 2.0 to 2.5 (SD 0.707 to 1.414). We underestimated the levels and types of training required by the addition of the C1 module; this challenge was exacerbated by the substantial changes in the LiveText platform on 1/1/16. Faculty required substantial training to assess multiple artifacts in the platform. Analytic capability cannot be used with the LiveText FEM (field experience) module, which consists of multiple assessments of student field experiences using a common rubric. This limitation can be overcome by having students submit videos into the C1 module, where analytics can be employed, as was done during this study.
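The per-standard means and standard deviations reported above can be reproduced from raw rubric scores with a short script. The rater scores and standard names below are hypothetical placeholders for illustration, not the pilot's actual ratings.

```python
from statistics import mean, stdev

# Hypothetical rubric scores (scale 1.0-4.0) from four faculty raters
# on one artifact, keyed by learning standard. These values are
# illustrative placeholders, not the pilot's actual data.
scores_by_standard = {
    "Standard 1": [2.0, 2.0, 3.0, 1.0],
    "Standard 2": [2.0, 3.0, 2.0, 3.0],
}

for standard, ratings in scores_by_standard.items():
    m = mean(ratings)    # central tendency across raters
    sd = stdev(ratings)  # sample SD; a large SD signals low rater agreement
    print(f"{standard}: mean={m:.2f}, SD={sd:.3f}")
# e.g. Standard 1: mean=2.00, SD=0.816
```

A small SD on a standard indicates raters converged after training; standards with SDs near the high end of the reported range (1.414) would be candidates for additional rater calibration.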

While the vendor plans to implement this as a feature of the program, it was essential, and prudent, to have tested it prior to larger-scale, program-wide implementation.

V. Conclusions and Recommendations

This study was invaluable: the current group of student teachers will have a seamless experience when uploading artifacts into the C1 module this semester. Working through the granular aspects of LiveText structure and faculty training proved essential; the process has been substantially debugged for our campus. The pilot supports not only spring-semester student teachers but the entire teacher preparation program. Students at the 100 through 400 levels are being asked to upload artifacts into the C1 module, and faculty have been trained to use a common rubric for assessment. Next steps include:

1. Identification of specific points in coursework at which students need assistance. The identification of these points will drive differentiated instruction in the classroom.

2. Identification of specific student demographic and achievement data and correlation with student retention in coursework and in clinical field experience. As a result of this pilot, demographic data can be correlated with specific elements of the common program rubric. If students do not put artifacts into the C1 module, we cannot analyze their progress.

3. Identification of the demographics of P-12 school clinical experience placements (classroom demographics) and correlation with student success in clinical field experiences. Due to the limitations of the FEM module, this can only be accomplished through multiple assessments of a student video in the C1 module by multiple faculty. LiveText has indicated that it hopes to import analytic capacity into the FEM module in the future; this project demonstrates a successful workaround. What has become clear is that program analytics must include the program Access database; using LiveText together with the database will continue to provide the most detailed and powerful program metrics.

4. Expansion of the number of Swivl robotic platforms (to 12) for student use, which would facilitate extending this project across the IPED program.

5. Internal replication of this study during the Spring 2016, Fall 2016, and Spring 2017 semesters. During that time frame we anticipate that LiveText may have expanded analytics to the FEM (field experience) module; if not, we will continue to refine the structure and training supported by the combination of LiveText analytics and the Access database.
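Next step 2 above (correlating demographic data with rubric performance) amounts to joining an Access-style demographic export with a LiveText rubric-score export and flagging at-risk students. The sketch below illustrates that join; the CSV layouts, column names (`student_id`, `score`), file names, and the 2.0 flag threshold are all assumptions for illustration, not actual IPED or LiveText schemas.

```python
import csv
from statistics import mean

def load_rows(path):
    """Read a CSV export into a list of dicts (header row as keys)."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def flag_students(demographics_csv, rubric_csv, threshold=2.0):
    """Join demographic rows with rubric scores by student_id and flag
    students whose mean rubric score falls below the threshold.
    All field names here are hypothetical."""
    demo = {r["student_id"]: r for r in load_rows(demographics_csv)}
    scores = {}
    for r in load_rows(rubric_csv):
        scores.setdefault(r["student_id"], []).append(float(r["score"]))
    flagged = []
    for sid, vals in scores.items():
        if mean(vals) < threshold and sid in demo:
            flagged.append({**demo[sid], "mean_score": mean(vals)})
    return flagged
```

The flagged list could then feed the early-intervention step described in the methodology, with demographic columns carried along for the retention correlations in next steps 2 and 3.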

VI. Appendix: Final Budget (identical to original budget)

1. Cost of LiveText Analytics Package for one year: $1,500

2. Salary and fringe for support staff to assist the Clinical Coordinator in creating and providing video-based training modules for students and P-12 mentor teachers, and in setting up and implementing the use of extended analytics in the C1 LiveText module: $6,000

3. Two Swivl robotic platforms (including carrying case and camera mount) to facilitate the collection of video data in P-12 classrooms @ $482.99 ea. = $965.98. These platforms will be used with iPads.

Total request: $8,465.98
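The budget total can be double-checked with integer-cent arithmetic, which avoids floating-point rounding on the $482.99 unit price (item labels are abbreviated):

```python
# Line-item amounts from the budget above, in integer cents.
items_cents = {
    "LiveText Analytics Package (1 year)": 150_000,  # $1,500.00
    "Support staff salary and fringe":     600_000,  # $6,000.00
    "2 Swivl platforms @ $482.99 ea.":  2 * 48_299,  # $965.98
}
total = sum(items_cents.values())
print(f"Total request: ${total // 100:,}.{total % 100:02d}")
# → Total request: $8,465.98
```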