Counselor Keys Effectiveness System


Counselor Keys Effectiveness System: Preliminary Analyses of Georgia CKES Piloting

A collaborative research project of the Georgia Department of Education, Georgia School Counselor Association, and Georgia Southern University College of Education

Overview

In 2014-15 the Georgia School Counselor Association, in partnership with the Georgia Department of Education, conducted a pilot of the Georgia Counselor Keys Effectiveness System (CKES). Building on the Teacher Keys Effectiveness System (TKES), the CKES aimed to provide an instrument that would: (1) serve as a useful tool for administrators conducting annual evaluations of school counselors; and (2) align with best practices and national standards as delineated by the American School Counselor Association (ASCA).

Guiding Questions

The research and analyses of the CKES pilot were guided by three questions: How did administrators perceive the CKES? How did school counselors perceive the CKES? How did the CKES instrument perform? This research brief provides an overview of the preliminary findings of the CKES research and analysis.

What is the CKES Based on?

Evaluation and accountability are foundational to school counseling. With an outline similar to the TKES instrument, the CKES minimizes formatting and structural differences between the two systems. The CKES aligns with state DOE expectations while simultaneously infusing the school counseling professional standards and expectations set forth by the American School Counselor Association (ASCA). School counselor evaluation thus aligns with the ASCA comprehensive guidance and counseling program framework (i.e., the ASCA National Model).

School Counseling in Georgia

School counselors serve PK-12 students' academic, personal/social, and college/career needs across all school levels. Implementing and maintaining comprehensive guidance and counseling programs, school counselors are an integral part of student success. In its Role of the Professional School Counselor statement, the American School Counselor Association (ASCA) describes the purpose and necessity of evaluation:

ACCOUNTABILITY: To demonstrate the effectiveness of the school counseling program in measurable terms, school counselors analyze school and school counseling program data to determine how students are different as a result of the school counseling program. School counselors use data to show the impact of the school counseling program on student achievement, attendance and behavior and analyze school counseling program assessments to guide future action and improve future results for all students. The performance of the school counselor is evaluated on basic standards of practice expected of school counselors implementing a comprehensive school counseling program.

More information about the role of school counselors, and suggestions for how administrators can effectively evaluate and partner with their school counselor(s), is available at: http://www.schoolcounselor.org/administrators

How Many School Counselors Practice in Georgia?

Based on Department of Education (DOE) statistics from March 2015, approximately 3,768 school counselors practice in the state of Georgia. The majority (98.0%) practice in traditional public districts (i.e., county, city), with only 2.0% serving at state charter schools.

Figure 1 lists the 20 districts with the highest concentrations of school counselors. These 20 districts account for 60% of school counselor placements in the state; the remaining 40% are dispersed across 171 districts throughout the state.

Figure 1. School Counselor Placement
 1. Gwinnett   9.8%     11. Muscogee   2.0%
 2. Cobb       7.4%     12. Richmond   1.7%
 3. DeKalb     7.4%     13. Columbia   1.6%
 4. Fulton     5.0%     14. Douglas    1.5%
 5. Atlanta    3.4%     15. Paulding   1.5%
 6. Clayton    2.7%     16. Houston    1.5%
 7. Henry      2.6%     17. Bibb       1.4%
 8. Chatham    2.4%     18. Fayette    1.3%
 9. Forsyth    2.2%     19. Hall       1.3%
10. Cherokee   2.1%     20. Coweta     1.2%

Figure 2 presents the reported levels (i.e., elementary, middle, and high) at which school counselors practice. DOE data indicate 40% of school counselors practice at the elementary level, 23% at the middle level, 35% at the high level, and 2% at other placements. For reporting purposes, "Other" included counseling paraprofessionals, special education (SPED) counselors, and GNETS counselors.

Figure 2. School Counselor Placement by Level (pie chart): ES 40%, MS 23%, HS 35%, Other 2%

Reporting the Results: Administrator Perceptions

During the 2014-15 academic year, a small initial sample of administrators (N = 24) reviewed the CKES instrument, used the CKES to evaluate their school counselor(s), and then completed a 33-question survey. This CKES pilot was conducted in addition to normal evaluative practices. For participant convenience, the survey was administered online, and the majority of questions were multiple-choice perception questions. Participants were presented with a statement (e.g., "The directions on the instrument were easy to follow") and asked to choose the answer that best represented how they felt. The majority of survey questions used a 5-point Likert-type scale: Strongly Disagree, Disagree, Unsure, Agree, and Strongly Agree.

Majority of Administrators Rated CKES Favorably

Of the 33 questions presented to participating administrators, 11 seem germane to discussion of the CKES reception, use, and performance. Item responses not included here (e.g., "This instrument changed or influenced my understanding of the role of the school counselor.") are available from the project researcher. Figure 3 presents results indicating the majority of participating administrators found the CKES instrument a reasonable, easy, helpful, and fair tool for evaluating school counselors. Notably, 91.7% of administrators felt that the CKES was an improvement compared to evaluation tools previously used for school counselors (Question 19).
Figure 3. Administrator Responses
Question/Perception                                                   Strongly Agree   Agree    TOTAL
(Q2)  CKES easy to follow                                                  66.7%       25.0%    91.7%
(Q3)  CKES easy to understand                                              70.8%       25.0%    95.8%
(Q4)  CKES easy to use                                                     62.5%       29.2%    91.7%
(Q5)  CKES reasonable length                                               70.8%       29.2%   100.0%
(Q7)  CKES terminology easy to understand                                  62.5%       29.2%    91.7%
(Q8)  CKES rating scale fair and equitable                                 54.2%       41.7%    95.9%
(Q9)  CKES examples were helpful                                           54.2%       37.5%    91.7%
(Q12) Distinguishing CKES Proficient and Exemplary ratings was clear       58.3%       33.3%    91.6%
(Q13) CKES helpful in defining role of school counselor per ASCA           58.3%       29.2%    87.5%
(Q14) CKES reflects role of school counselor at my school                  50.0%       37.5%    87.5%
(Q19) CKES is an improvement compared to former evaluation tool            66.7%       25.0%    91.7%
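The totals in Figure 3 are simply the "Strongly Agree" and "Agree" percentages summed for each item. A minimal sketch of that tabulation, using hypothetical response counts (the raw survey responses were not published with this brief):

```python
# Sketch: tabulating 5-point Likert responses into the Strongly Agree / Agree /
# combined "top-two-box" totals reported in Figures 3 and 4. Data are hypothetical.
from collections import Counter

def top_two_box(responses):
    """Return (% Strongly Agree, % Agree, combined total) for one survey item."""
    counts = Counter(responses)
    n = len(responses)
    pct_sa = 100.0 * counts["Strongly Agree"] / n
    pct_a = 100.0 * counts["Agree"] / n
    return round(pct_sa, 1), round(pct_a, 1), round(pct_sa + pct_a, 1)

# Hypothetical N = 24 administrator responses to Q2 ("CKES easy to follow"):
# 16 Strongly Agree, 6 Agree, 2 Unsure reproduces the reported 66.7 / 25.0 / 91.7.
q2 = ["Strongly Agree"] * 16 + ["Agree"] * 6 + ["Unsure"] * 2
print(top_two_box(q2))  # (66.7, 25.0, 91.7)
```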

School Counselor Perceptions

During the 2014-15 academic year, a small initial sample of school counselors (N = 40) reviewed the CKES instrument, agreed to be evaluated by administration using the CKES, and then completed a 33-question online survey similar in structure to the administrators' survey (i.e., responses to statements, multiple-choice, 5-point Likert-type scale, etc.). This CKES pilot was conducted in addition to normal evaluative practices. Again, of the 33 questions presented to participants, 11 seem germane to discussion of the CKES. Item responses not included here (e.g., "This instrument helped open dialogue with my administrator about my role as a school counselor at our school.") are available from the project researcher.

Majority of School Counselors Rated CKES Favorably

Figure 4 presents results indicating that, for most survey items, the majority of participating school counselors found the CKES instrument a reasonable, easy, helpful, and fair tool for evaluating school counselors. Notably, where the vast majority of administrators felt the CKES was an improvement over previous evaluation tools, only 47.5% of participating school counselors indicated a similar perception (Question 19). This is interesting considering the favorable ratings (62.5-82.5%) participating school counselors gave the other 10 survey questions, and it warrants further study.
Figure 4. School Counselor Responses
Question/Perception                                                   Agree   Strongly Agree   Total
(Q2)  CKES easy to follow                                             65.0%       10.0%        75.0%
(Q3)  CKES easy to understand                                         62.5%       10.0%        72.5%
(Q4)  CKES easy to use                                                57.5%        7.5%        65.0%
(Q5)  CKES reasonable length                                          62.5%        5.0%        67.5%
(Q7)  CKES terminology easy to understand                             72.5%        7.5%        80.0%
(Q8)  CKES rating scale fair and equitable                            60.0%       12.5%        72.5%
(Q9)  CKES examples were helpful                                      47.5%       22.5%        70.0%
(Q12) Distinguishing CKES Proficient and Exemplary ratings was clear  55.0%       15.0%        70.0%
(Q13) CKES helpful in defining role of school counselor per ASCA      57.5%       25.0%        82.5%
(Q14) CKES reflects my current role as school counselor               50.0%       12.5%        62.5%
(Q19) CKES is an improvement compared to former evaluation tool       27.5%       20.0%        47.5%

CKES Instrument Performance

During the 2014-15 academic year, CKES pilot data were collected for approximately 117 participating school counselors. This CKES pilot was conducted in addition to normal evaluative practices. Participating school counselors represented 9 districts (county or city) across the state. Figure 5 lists participating districts in descending order; Figure 6 presents participant distribution across levels. Mirroring statewide trends, the majority of participants were school counselors in Gwinnett County schools, and the majority practiced at the elementary school level.

Figure 5. Participating Districts
1. Gwinnett   62.2%
2. Lowndes    15.1%
3. Burke       9.2%
4. Peach       6.7%
5. Decatur     2.5%
6. Chatham     1.7%
7. Glynn       0.8%
8. Towns       0.8%
9. Valdosta    0.8%

Figure 6. Participants by Level (pie chart): ES 44%, MS 28%, HS 25%, Other 3%

Preliminary assessment of the CKES instrument's performance involved review of descriptive statistics and statistical analyses in three areas: normality, reliability, and item correlations.

Normality of the distribution of scores from the CKES piloting was assessed by reviewing individual items' histograms, skewness and kurtosis statistics for individual items, and each item's skewness and kurtosis statistics divided by their standard errors (SE). Initial review of items' histograms showed minimal concern, and no item's skewness statistic fell outside general parameters. However, items #1, #3, #7, and #8 all presented kurtosis statistics outside general parameters. Further analysis, dividing each kurtosis statistic by its standard error, suggested that data from items #7 and #8 performed outside the expectations of a normal distribution.
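The normality screen described above (skewness and kurtosis statistics divided by their standard errors) can be sketched as follows. The data are hypothetical, since the pilot's item-level scores were not published, and the |ratio| > 2 flagging threshold is one common rule of thumb, not necessarily the exact criterion the researchers applied:

```python
# Sketch of a skewness/kurtosis normality screen (hypothetical data).
import math

def moments(xs):
    """Sample skewness and excess kurtosis (biased moment estimators)."""
    n = len(xs)
    mean = sum(xs) / n
    m2 = sum((x - mean) ** 2 for x in xs) / n
    m3 = sum((x - mean) ** 3 for x in xs) / n
    m4 = sum((x - mean) ** 4 for x in xs) / n
    return m3 / m2 ** 1.5, m4 / m2 ** 2 - 3.0  # skew, excess kurtosis

def se_skewness(n):
    # Standard error of skewness for a sample of size n.
    return math.sqrt(6.0 * n * (n - 1) / ((n - 2) * (n + 1) * (n + 3)))

def se_kurtosis(n):
    # Standard error of excess kurtosis for a sample of size n.
    return 2.0 * se_skewness(n) * math.sqrt((n * n - 1) / ((n - 3) * (n + 5)))

def normality_screen(scores, flag=2.0):
    """Divide skew/kurtosis by their SEs; |ratio| > flag suggests non-normality."""
    n = len(scores)
    sk, ku = moments(scores)
    sk_ratio, ku_ratio = sk / se_skewness(n), ku / se_kurtosis(n)
    return {"skew_ratio": sk_ratio,
            "kurt_ratio": ku_ratio,
            "non_normal": abs(sk_ratio) > flag or abs(ku_ratio) > flag}

# Hypothetical item (n = 117) with ratings split near-evenly across two adjacent
# categories -- the flat, platykurtic pattern the brief attributes to items #7 and #8.
item = [3] * 58 + [4] * 59
print(normality_screen(item)["non_normal"])  # True: kurtosis ratio is well below -2
```

Note that a two-category pile-up produces a strongly negative kurtosis ratio even when skewness stays within bounds, matching the pattern the brief describes.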

Reliability of the CKES instrument was assessed with Cronbach's alpha. The Cronbach's alpha statistic for the CKES was α = 0.920, and removing any of the 10 individual items yielded no increase in alpha. Finally, inter-item correlations between the 10 CKES items were reviewed. Correlations ranged from 0.281 (items #6 and #9) to 0.691 (items #7 and #9), with a mean of 0.54.

CKES Performed Moderately Well

Overall, the CKES instrument performed moderately well in this preliminary piloting. Concerns regarding the normality of the data centered on items #7 and #8. General practice recognizes that some degree of skewness and kurtosis will appear when working with perceptual Likert-type scale instruments. Furthermore, kurtosis may be more pronounced when the data present as flat because ratings are near-evenly distributed across two categories (e.g., Proficient and Exemplary).

The Cronbach's alpha statistic of 0.920 suggests relatively high internal consistency. Further encouraging, the CKES's reliability would not be improved by removing any individual item. Review of the inter-item correlations indicated values higher than expected. This may be due to various causes, such as overlapping items measuring the same content in different ways, or users applying a more holistic perception of school counselor functioning as they complete each item.

Recommendations for Further Research

INCREASED SAMPLE SIZE: An increased sample size would provide data for continued and more in-depth analyses.

FACTOR ANALYSIS: An integral part of survey development and validation is the use of factor analysis. Such analyses would identify and assess the various dimensions the CKES attempts to measure.

RESOURCES: Funded by a grant from the US DOE, the 2013 TKES/LKES pilot evaluation stands as a noteworthy example. Incorporating observational data, focus groups, district trainings, and surveys, that study included approximately 5,800 participants. Administrators and school counselors implementing the CKES would benefit from similar support.

AGENCY COLLABORATION: Continued collaboration between the GA DOE, GSCA, and school counselor preparation programs (e.g., Georgia Southern University) will provide consistent information, resources, and support to administrators and school counselors using the CKES.
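The reliability and inter-item statistics reported for the pilot can be reproduced with a short sketch. Cronbach's alpha is k/(k-1) multiplied by (1 minus the ratio of summed item variances to the variance of total scores). The scores below are hypothetical, since the pilot's raw ratings were not published:

```python
# Sketch: Cronbach's alpha and pairwise inter-item correlations (hypothetical data).
import math
from statistics import variance

def cronbach_alpha(items):
    """items: one list of scores per instrument item, all rated by the same people."""
    k = len(items)
    sum_item_vars = sum(variance(scores) for scores in items)
    totals = [sum(person) for person in zip(*items)]  # each rater's total score
    return (k / (k - 1)) * (1.0 - sum_item_vars / variance(totals))

def pearson(xs, ys):
    """Pearson correlation between two items' score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def inter_item_summary(items):
    """Min, max, and mean of all pairwise inter-item correlations."""
    rs = [pearson(items[i], items[j])
          for i in range(len(items)) for j in range(i + 1, len(items))]
    return min(rs), max(rs), sum(rs) / len(rs)

# Three hypothetical items, each rated 1-4 by six evaluators:
items = [[3, 4, 3, 2, 4, 3],
         [3, 4, 4, 2, 4, 3],
         [2, 4, 3, 2, 3, 3]]
print(round(cronbach_alpha(items), 3))  # 0.921
```

The "would alpha improve if an item were dropped" check amounts to recomputing `cronbach_alpha` on each leave-one-out subset and comparing against the full-scale alpha.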

Project Contact Information

Who do I contact for more information about CKES? More information about the CKES instrument is available by contacting the Georgia School Counselor Association (GSCA) at: CKES@gaschoolcounselor.org

Julie Hartline, EdD
President, 2014-2015, Georgia School Counselor Association
Julie.hartline@gaschoolcounselor.org

The CKES instrument was created in 2013-14 by the GSCA Performance Evaluation Committee. Members include:

Shellie Caplinger, Fulton County Public Schools
Mark Ellis, Fulton County Public Schools
Julie Hartline, Cobb County School District
Stacey Miller, Gwinnett County Public Schools
Sloane Molloy, Glynn County Public Schools
Tinisha Parker, Gwinnett County Public Schools
Lakeshia Williams, Bibb County
Robin Zorn, Gwinnett County Public Schools

Dr. Julie Hartline chaired the GSCA Performance Evaluation Committee. Dr. Hartline entered the field of education in 1991 after serving as a parole officer in Atlanta, Georgia, and discovering that over 85% of her caseload had not completed high school. She chaired the Campbell High School counseling department for 14 years, where in 2008 her department became one of the first high school counseling departments in the state to receive the Recognized ASCA Model Program (RAMP) award from the American School Counselor Association (ASCA). In 2009, Dr. Hartline was nationally recognized as the ASCA Counselor of the Year. She was elected 2013-2014 President-Elect of the Georgia School Counselor Association and chaired the committee that created the CKES. Dr. Hartline is the School Counseling and Advisement Consultant for the Cobb County School District. She is passionate about the field of school counseling and the difference comprehensive school counseling programs make in the lives of students, and she enjoys training school counselors and administrators in her district, the state, and around the nation on the ASCA National Model and the RAMP application process.

Project Researcher: Dr. Richard E. Cleveland

While a full-time school counselor in Washington, Richard was involved with planning and coordinating district-wide initiatives, leading school counselor in-service trainings, and serving on the district's crisis response team. In 2008 he received the Washington School Counselor Association's (WSCA) School Counselor of the Year Award, and in 2009 he was elected President-Elect of WSCA. After leaving public schools, Richard served as an adjunct instructor while pursuing his doctoral degree. Since 2014 he has served as an Assistant Professor in the College of Education at Georgia Southern University. Richard's presentations include district in-service trainings; workshop sessions at the state, national, and international levels; invited appearances on television and radio; and keynote addresses at educational events. He has written for various publications, including Counseling and Spirituality, Counselors for Social Justice's The Activist, The Journal of Research on Christian Education, the American School Counselor Association's School Counselor, and the Washington School Counselor Association's Insights.

Richard E. Cleveland, PhD
Assistant Professor
Leadership, Technology, & Human Development
College of Education
Georgia Southern University
(912) 478-8022 (office)
rcleveland@georgiasouthern.edu
https://richardcleveland.me
Twitter: @RichieKinz