TECHNOLOGY TRAINING FOR IN-SERVICE TEACHERS: AN EVALUATION

Rimjhim Banerjee

Abstract

A process to determine impact is needed for all training programs. The Tech Mentors Program, a three-year initiative of the PT3 project at Florida International University, focuses on the professional development of cooperating teachers in the Miami-Dade County School District as a key strategy for improving teacher education. In the first year of this program, 100 participating teachers received a full week of technology training and agreed to accept an FIU intern in their classroom in fall 2002 or spring 2003, use what they learned to support technology integration into the student intern's field experiences, and mentor the intern and help to evaluate his or her use of instructional technology. Because data had not been gathered on the impact of technology training on classroom mentoring practice and student intern achievement, the purpose of this evaluation study is to determine whether the program actually helps cooperating teachers acquire the knowledge, skills, and dispositions required to mentor teacher education students. Kirkpatrick's model of evaluation is used as a framework for building the evaluation questions. Based on the results at each level of the evaluation, the program has been successful in providing Miami-Dade public school teachers with much needed technology integration training.

Introduction

Technology Training for Teachers in the United States

Technology pervades society and affects everyone's life in innumerable ways. According to the International Society for Technology in Education (ISTE), school and college graduates must be computer literate in order to live and work successfully. To stress the importance of technology knowledge and skills, ISTE developed the National Educational Technology Standards for Students, which provide a framework linking performance indicators at four developmental levels in schools. In reality, however, our educational system lags far behind in its use of technology. The National Center for Education Statistics (NCES) found in its survey that 99% of full-time, regular public school teachers have access to computers or the Internet in their schools, an increase from 35% in 1994. However, only one third of the teachers reported being prepared to use computers and the Internet in their instruction (Lumpkin & Clay, 2001). The literature suggests that teacher readiness to use technology hinges on more funding, technology training, and administrator support for instructional technology. In most schools, despite the millions of dollars spent on Internet connectivity and on computer hardware and software, most children continue to be taught through traditional methods.

Technology Preparedness

Jones (2001) stated that less experienced teachers indicated that they felt better prepared to use technology than their more experienced colleagues. Most teachers said that they had not received the training necessary to incorporate technology in their classrooms (Wetzel, 2001). Teachers face several challenges when trying to embrace technology. Learning new software is one challenge; developing lesson plans that incorporate new technology is another. Lack of release time to learn how to use computers and the Internet was one of the most frequently reported barriers to public school teachers using computers and the Internet in instruction. Training, preparation, and work environments also play roles in a teacher's readiness to use technology.

Research shows that traditional professional development activities are often short term, devoid of adequate follow-up, and do not address school contexts (Anderson, 2002).

Evaluation of Technology Training Programs

Evaluation is the systematic investigation of the merit or worth of an object (a program) for the purpose of reducing uncertainty in decision making. A process to determine impact is needed for all training programs. Schwab and Foa (2001) indicate that the states that conducted serious pilot projects of the US WEST Foundation's five-year training program had the most effective first-year training results. The majority of these evaluation procedures were not highly sophisticated and did not require a great deal of statistical analysis; they simply required a willingness on a regular basis for everyone involved to analyze openly and cooperatively what was working, what was not, and how it could be improved. While current evaluation data clearly indicate that an aggressive staff development effort provides teachers with enhanced technology skills and knowledge about best practice, data had not been gathered regarding the impact on classroom mentoring practice and student intern achievement.

The Tech Mentors Program, a three-year initiative of the PT3 project at Florida International University (FIU), focuses on the professional development of cooperating teachers in the Miami-Dade County Public Schools (MDCPS) as a key strategy for improving teacher education. In the first year of this program, 100 participating teachers received a full week of technology training and agreed to a) accept an FIU intern in their classroom in fall 2002 or spring 2003, b) use what they learned to support technology integration into the student intern's field experiences, and c) mentor the intern and help to evaluate his or her use of instructional technology. In response to the need to identify the impact that technology training has on classroom practice, the purpose of this evaluation study is to determine whether the program actually helps cooperating teachers acquire the knowledge, skills, and dispositions required to mentor teacher education students, who need authentic experiences with technology in their field experiences and who need to see mentor teachers using technology to support teaching and learning in a variety of ways.

Evaluation of the Tech Mentor Program

The goal of the evaluation is to determine the value of the Tech Mentor Program based on the accomplishment of its objectives. The program was judged on a) the impact on the participants' ability to integrate technology with teaching and learning in their classrooms in terms of time, methods, and resources, b) participant feedback about the strengths and weaknesses of the workshops and the training program as a whole, c) participants' demonstrated mastery of the workshop objectives, d) participants' use of technology integration in the student interns' field experiences, and e) the ability of interns to use technology with their learners.

Evaluation Model

Kirkpatrick's model of evaluation is used as a framework for building the evaluation questions. Participants' reactions were gathered immediately after the training regarding the effectiveness of the training methods, time allocation, utility and resources, and the strengths and weaknesses of the program. Participants completed Likert-type survey forms at the end of the training week.
Frequencies and percentages of responses, the mean and standard deviation, and a content analysis of the open-ended questions were used to analyze the data. These responses constituted the Reaction Level evaluation. Using four-point evaluation rubrics, artifacts produced in each workshop and technology journals produced by each participant were assessed to measure the accomplishment of the workshop objectives; the number and percent judged acceptable or unacceptable were used to evaluate the knowledge acquired by each trainee. These assessments constituted the Learning Level evaluation. Participant feedback collected through survey forms about eight months after training, during on-the-job application, provided the data for the Behavior Level evaluation; frequencies and percentages of responses and the mean and standard deviation were again computed. Student interns' feedback collected through survey forms in fall 2002 and spring 2003 provided the data for the Results Level evaluation. Both quantitative and qualitative data have been interpreted to provide information about the effectiveness of the program.
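
As a concrete illustration of this kind of descriptive summary, the short Python sketch below (a minimal example with invented ratings, not the project's survey data or actual analysis code) computes frequencies and percentages, the mean, and the standard deviation for a single hypothetical five-point Likert item.

    # Minimal illustration of the descriptive statistics used at the Reaction and
    # Behavior Levels; the ratings below are invented, not project data.
    from collections import Counter
    from statistics import mean, pstdev

    ratings = [5, 5, 4, 5, 3, 5, 4, 5, 5, 4]  # hypothetical 1-5 Likert responses

    counts = Counter(ratings)
    n = len(ratings)
    for value in sorted(counts):
        pct = 100 * counts[value] / n
        print(f"rating {value}: n={counts[value]} ({pct:.1f}%)")
    print(f"mean = {mean(ratings):.2f}")
    print(f"standard deviation = {pstdev(ratings):.2f}")  # population SD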

Participants

Program participants (trainees) belonged to a group of current or former cooperating teachers for FIU's teacher preparation programs. One hundred teachers participated in the training, of whom 76% were regular teachers and 24% were Special Education teachers. Of the 76 regular teachers, 40.8% (n=31) were 6-12 teachers and 59.2% (n=45) were K-5 teachers. Of the 24 Special Education teachers, 25% (n=6) were 6-12 teachers, 66.7% (n=16) were K-5 teachers, and 8.3% (n=2) were Special Center teachers. Participant characteristics were collected through an application form: number of years of teaching experience, number of computers in the classroom, technology proficiency (advanced, proficient, beginner), and frequency of technology use (regular, occasional, rare).

Stakeholders

There were five categories of stakeholders for the Tech Mentor program. The students, faculty, and administration at FIU's College of Education wanted to know what technology competencies mentor teachers had acquired and how the training would affect the internship experience for future student interns. MDCPS central administration was interested in expanding the number of teachers who have technology infusion training. Principals wanted to know what classroom technology tools and techniques teachers can or will use in the classroom. The US Department of Education (the funding source) wanted data indicating whether the PT3 project is meeting its stated goals and objectives. Participants (mentors for FIU interns) wanted to know how other teachers experienced the training, how they planned to apply it, and whether other teachers shared their concerns. Project staff wanted to know which aspects of the training were most effective, which tools and techniques classroom teachers valued most, and what to do differently next year to make the training more effective.

Key Findings

At the Reaction Level, participant ratings of the program's quality and value were very favorable. Nearly ninety-four percent (n=91) of the respondents rated the overall value of the institute as excellent, 5.2% (n=5) rated it as good, and only 1% (n=1) rated it as average. The mean rating was 4.93 on a scale of 1-5, with a low standard deviation of 0.30. Ninety percent of the participants strongly agreed that the facilities provided for the training were adequate. Eighty-five to 86% of the participants felt strongly prepared to integrate technology into their classroom teaching and into their mentoring experiences with student interns, and 84% agreed strongly with the statement that the training supported them in meeting their professional responsibilities.
The learning activities and the items developed in the institute were rated as extremely valuable by more than 80% of the participants, who felt that, with practice, they could do what they had learned at the institute. Responses to the open-ended questions in the Institute Evaluation instrument were analyzed by breaking them into categories.

A large number of respondents (67.3%, n=66) felt that the training program benefited them most by giving them new technical skills and knowledge for use in the classroom. Nearly 27% (n=26) indicated that the program increased their proficiency with computers, and 13.3% indicated that their confidence in using technology rose as a result of the training. Nearly 20% of the respondents (n=19) indicated that they were able to use the Internet for resources, and 6% (n=6) indicated they would be able to mentor co-teachers and interns as a result of the training. Eight respondents said they benefited from the hands-on activities. Only 89 participants provided feedback on ways to improve the training program. Although 28.1% of these respondents (n=25) felt that the program was perfect and had no suggestions for improvement, about 20% (n=18) felt the need for more training time in each workshop, while 5.6% (n=5) wanted more time for hands-on activities. Grouping of participants by technology skill level was sought by 9% of the respondents (n=8), and nearly 5% (n=4) wanted follow-up practice sessions. A notable observation is the improvement sought in the Media on the Move workshop: about 18% of the respondents (n=16) expressed the need to improve various aspects of that particular workshop.

At the Learning Level, artifacts produced in each workshop and technology journals produced by each trainee were assessed to measure the accomplishment of the training objectives. Technology journals were based on the KWL format (what you know, what you want to know, what you learned). The artifacts and the journals were graded on a four-point rubric, where a score of 3 or 4 was considered acceptable. Overall, 57% of the journals and 69% of the artifacts were acceptable. Analyzing the data further by category, 78% (n=18) of the graded journals of the 31 regular 6-12 teachers were acceptable, as were 84% (n=26) of their artifacts. For the 45 regular K-5 teachers, 60% (n=27) of the journals and 65% (n=29) of the artifacts were acceptable. Fifty percent (n=12) of the Special Education teachers' journals were acceptable, as were 58% (n=14) of their artifacts.
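
To make the scoring rule above concrete, the short Python sketch below (with invented rubric scores, not the project's data) shows how the acceptable/unacceptable tallies and percentages could be computed for each group of teachers.

    # Illustration of the Learning Level scoring rule: on the four-point rubric,
    # a score of 3 or 4 counts as acceptable. The scores below are invented.
    rubric_scores = {
        "Regular 6-12": [4, 3, 2, 4, 3],
        "Regular K-5": [3, 2, 4, 3, 1],
        "Special Education": [2, 3, 4, 2, 3],
    }
    ACCEPTABLE = {3, 4}

    for group, scores in rubric_scores.items():
        acceptable = sum(1 for s in scores if s in ACCEPTABLE)
        pct = 100 * acceptable / len(scores)
        print(f"{group}: {acceptable}/{len(scores)} acceptable ({pct:.0f}%)")
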
At the Behavior Level evaluation stage, survey forms were sent to the 100 participants about eight months after the training. Thirty participants responded, and their responses constitute the data for the Behavior Level evaluation. About 72% (n=21) of the respondents rated their preparedness to use technology in the classroom as much enhanced as a result of the training, and 60% (n=17) were actually able to increase their use of technology in the classroom in the year following the training. Interestingly, 35% of the trainees considered limited access to computers and other technology a major barrier to integrating technology into their classrooms. Sixty-two percent (n=18) considered the training to have vastly improved their ability to mentor student teachers on technology. Other teachers and the technical support person in the school were ranked as the biggest facilitators for integrating technology in the classroom.

When questioned about changes in teaching practices due to the use of technology, about 83% (n=24) stated that they were now more comfortable with small group activities, and 75% (n=21) were more comfortable with students working independently. Seventy-one percent (n=20) were already able to differentiate instruction to accommodate students' diverse learning styles, while 14% (n=4) indicated that, although they could not yet do so, they expected to be able to in the future. Seventy percent of the teachers (n=21) were already able to present complex materials to their students. About 52% (n=15) indicated that, as a result of the training, they were now able to mentor or collaborate with other teachers on technology, and 62% (n=18) stated that the training had enhanced their ability to mentor student teachers on technology.

At the Results Level, data from the 95 student interns who were mentored by the trainees and responded to the survey were analyzed. Nearly 58% of the interns (n=55) rated the overall support they received from their cooperating teacher for integrating technology into their internship experience as excellent, while 24.2% (n=23) rated it as good. About 80% (n=75) of the interns stated that their mentor shared his or her experience with integrating technology to support teaching and learning. Eighty-two percent (n=78) stated that their cooperating teacher gave them suggestions and/or feedback on their use of technology in the classroom, and 90% of the interns (n=85) stated that their mentors ensured that they had access to the available technology resources at the school.

Implications and Recommendations

The results of this evaluation study will be important to the practice of adult education and human resource development because they contribute to the existing body of knowledge on the impact of technology training for in-service teachers. Since few such evaluation studies exist, this study will help technology training program planners design and implement effective training programs by identifying some of the factors that do and do not work at the classroom level as a result of such training. The study also contributes to the body of knowledge on the impact of technology training on teachers' mentoring experience and on the internship experience of teacher education students.

Based on the evaluation results at all four levels of Kirkpatrick's model, it can be said that the Tech Mentor program succeeded in providing Miami-Dade public school teachers with much needed technology integration training. Since this study focuses on the first year of the program, the evaluation results have been used to improve the quality of the program in successive years. Data from the Reaction Level revealed that the Media on the Move workshop needed improvement in various aspects; the program planners took this into consideration while planning for the following year. Teachers considered limited access to computers and other technology the major barrier to integrating technology into their classrooms, and most of them considered other teachers and the technical support person in the school the biggest facilitators. The barriers and facilitators identified in this study can be used by school administrations to explore the possibility of collaborative classroom sessions between teachers to make the best possible use of limited technology access, and they could also guide program planners in developing technology training programs. The Tech Mentor program can be evaluated further based on the initial technology proficiency of the trainees. Further studies could also determine whether there is a difference in the internship experience of student interns who have been mentored by the trainees compared with those who have not.

References

Anderson, J. (2002). District initiative keys in on classroom. Journal of Staff Development, 23(1), 36-38.

Jones, C. A. (2001). Tech support: Preparing teachers to use technology. Principal Leadership (High School Ed.), 1(9), 35-39.

Jones, C. A. (2001). When teachers' computer literacy doesn't go far enough. The Education Digest, 67(2), 57-61.

Lumpkin, A., & Clay, M. N. (2001). A college of education's technology journey: From neophyte to national leader. Action in Teacher Education, 23(3), 20-26.

Schwab, R., & Foa, L. J. (2001). Integrating technologies throughout our schools. Phi Delta Kappan, 82(8), 620-624.

Wetzel, K. (2001). Preparing teacher leaders: A partnership that works, part 2. Learning and Leading with Technology, 29(3), 50-53.

Rimjhim Banerjee, Information Management Coordinator, College of Education, Florida International University. Rimjhim.Banerjee@fiu.edu

Acknowledgements: Dr. Charles Divita, Dr. Mary Haley, and my family.

Presented at the Midwest Research to Practice Conference in Adult, Continuing, and Community Education, Indiana University, Indianapolis, IN, October 6-8, 2004.