
Los Angeles Valley College
CAREER-TECHNICAL EDUCATION
Program Outcomes Assessment Report

CTE Program Assessment Group: Elmida Baghdaserians (Child Development), Alan Cowen (Emergency Services), Joan Hackeling (SLO Program Assessment Coordinator), Annette Jennings (Computer Applications & Office Technology), Laurie Nalepa (CTE Dean) and Ron Reis (Technology)

May 2012

Career-Technical Education Program Outcomes Assessment Report, June 2012

TABLE OF CONTENTS

I. Executive Summary
II. Career-Technical Education Program Assessment
III. Career-Technical Education Program Student Learning Outcomes
IV. Overview of the Program Assessment Process
    A. Participating Offices, Departments, Committees, Workgroups and Coordinators
    B. Program Assessment Workgroup
    C. Program Assessment Timeline
V. Methodology
    A. Evidence
    B. Sampling
    C. Data Collection
    D. Data Assessment
VI. Findings
    A. Student Learning Outcomes
    B. Program Assessment Process
VII. Recommendations
    A. Student Learning Outcomes
    B. Program Assessment Process
VIII. Plans for Implementation
    A. Student Learning Outcomes
    B. Program Assessment Process
IX. Plan to Share Results
X. List of Appendices
XI. Appendices

I. EXECUTIVE SUMMARY

The goal of all student learning assessment is to improve student learning. Assessment helps us recognize actual student skills and learning, which is a crucial part of any effort to strengthen those outcomes. This report presents the results of the first assessment of student learning in the Career-Technical Education (CTE) program at Los Angeles Valley College. It addresses the program's four broad student learning outcomes (SLOs) related to communication skills, reasoning skills, technical skills and professional behavior.

What did we learn about the skills and learning of our CTE students? We learned that the majority of students met or exceeded faculty expectations across all four areas, though skill levels varied considerably across disciplines. We also identified strengths and weaknesses. In terms of writing skills, our students generally know how to provide relevant content in support of a main idea, but they need more guidance and practice in organizing that content and in the correct use of grammar and sentence structure. In terms of reasoning skills, our students are able to acquire information, but are less skilled in evaluating, analyzing, interpreting and applying that information to solve problems. Finally, students showed room for improvement in the areas of responsibility and punctuality, though no norming process was conducted among the faculty who evaluated students' professional behavior.

We learned important things about the program assessment process as well. We learned that participating faculty are generally supportive of this work, given enough information, advance notice and guidance in collecting the necessary data. We saw that adjustments to our assessment tools (rubrics, surveys) and to the format of student work samples would allow us to better differentiate among diverse skill levels and improve the accuracy of our assessment.
We learned that we need to develop more ways to assess student reading, speaking and listening skills, beyond students' self-assessments. We also need to do a better job of communicating our criteria for achieving learning outcomes to all parties who evaluate the skills of our students, including advisory board members and employers. Finally, we must develop better strategies and tools to engage faculty in the very time-consuming process of analyzing detailed data; this engagement is essential to a shared, sustainable and credible program assessment process.

We make several broad recommendations. We recommend that CTE learning goals and expectations be made much more visible and explicit to all relevant faculty, students, advisory boards, employers, administrators and staff. We urge faculty to find ways to integrate program SLO assessment rubrics into coursework. We also request that faculty create more opportunities for students to practice and assess their program SLO skills. We recommend that materials be developed to support faculty in making these adjustments in their courses. Finally, we request that professional development credit be extended to faculty who participate actively as members of program SLO assessment or implementation teams.

We support the formation of a CTE faculty SLO leadership team that will take the lead in implementing the recommendations of this report. We ask department chairs to facilitate wider discussions of program-level skills and changes as part of existing conversations, such as department meetings. We request that discipline-specific technical skills be communicated to all students pursuing particular CTE degrees and/or certificates (via the college catalog and college website).

This report represents a year-long project of the CTE Program Assessment Workgroup.
Members of the workgroup are: Elmida Baghdaserians (Child Development), Alan Cowen (Emergency Services), Annette Jennings (Computer Applications and Office Technology), Laurie Nalepa (CTE Dean), Ron Reis (Technology) and Joan Hackeling (SLO Program Assessment Coordinator).

II. CAREER-TECHNICAL EDUCATION PROGRAM ASSESSMENT

Los Angeles Valley College has defined an instructional program as a major education pathway that a student pursues through the institution. Career-Technical Education (CTE) is one of three such pathways, and program assessment at the College is based on these pathways. For a complete statement of the pathways model for program assessment, see Appendix 1.

The College's Career-Technical Education (CTE) program includes disciplines with a CTE TOP code. This program emphasizes the achievement of students' goals relative to employment and includes general skills in communication and reasoning, specific technical skills appropriate to the field of study, and the demonstration of professional behavior.

III. CTE PROGRAM STUDENT LEARNING OUTCOMES

The student learning outcomes (SLOs) for the CTE program at LA Valley College are:

Communication Skills SLO: Students will be able to communicate clearly through writing, speaking, listening and reading.

Reasoning Skills SLO: Students will be able to acquire, evaluate and interpret information. As a result, students will be able to solve problems relevant to their field of study.

Technical Skills SLO: Students will be able to demonstrate technical skills appropriate to their field of study.

Professional Behavior SLO: Students will be able to demonstrate appropriate professional behavior (e.g., timeliness) and interpersonal skills (e.g., teamwork, leadership, cultural diversity).

IV. OVERVIEW OF THE PROGRAM ASSESSMENT PROCESS

A. Participating Offices, Academic Departments, Faculty Committees, Workgroups, Faculty and Coordinators

The assessment of the CTE program was overseen by the Office of Research and Planning. It was guided by the SLO Committee, the SLO Coordinator and the SLO Executive Team. It was conducted by a CTE faculty assessment workgroup and coordinated by the SLO Program Assessment Coordinator.
Additional faculty from across 28 CTE disciplines also participated in the collection of student work samples and the assessment of student professional behavior. The following CTE disciplines participated in this assessment cycle: Accounting, Administration of Justice, Architecture, Banking and Finance, Broadcasting, Business, Business Law, Child Development, Cinema, Commercial Music, Computer Applications and Office Technology, Computer Science and Information Technology, Electronics, Engineering-General Technology, Fire Technology, International Business, Journalism, Management, Machine Shop, Marketing, Media Arts, Numerical Control, Nursing, Photography, Real Estate, Respiratory Therapy, Theater Arts (Technical) and Tool and Manufacturing.

For a variety of reasons, the following CTE departments or disciplines were not included in this cycle: Continuing Education (Vocational), Cooperative Education, Correctional Science, Educational Technologies, GIS Mapping, Graphic Design, Homeland Security and Scientific Visualization.

B. CTE Assessment Workgroup

The CTE assessment workgroup was formed in early Spring 2011. Members of the workgroup represented a broad range of CTE disciplines, including the two largest program departments: Child Development and Emergency Services. They were: Elmida Baghdaserians (Child Development), Alan Cowen (Emergency Services), Annette Jennings (Computer Applications and Office Technology), Ron Reis (Technology), Laurie Nalepa (Dean, Career Technical Education) and Joan Hackeling (Program Assessment Coordinator).

C. CTE Program Assessment Timeline

This CTE program assessment cycle began formally in Spring 2011, with the formation of our assessment workgroup, though planning and preliminary work began in Fall 2010. The process was completed in Spring 2012. For a visualization of the steps in the CTE assessment process, see Appendix 2.

Fall 2010: The Program Assessment Coordinator met with the CTE Committee in October to begin preparing for data collection the following Spring. Faculty completed course-program alignment grids for each CTE discipline. These grids specify the degree of emphasis each CTE course places on each of the program student learning outcomes. A sample alignment grid is attached in Appendix. CTE faculty also began developing lists of the discipline-specific technical skills that students should master upon completion of a certificate or degree.

Winter 2011: The PA Coordinator worked with individual CTE faculty to facilitate the completion of the course-program alignment grids and the lists of discipline-specific technical skills.
The PA Coordinator also planned for the assessment data collection beginning in Spring 2011.

Spring 2011: The assessment workgroup met throughout Spring 2011 to (1) develop the program SLO assessment rubrics, (2) develop the student survey, (3) plan for the collection of student work samples and faculty assessments of student professional behavior, and (4) develop an appropriate and valid sampling methodology in consultation with the Dean of Research and Planning. Student work samples and assessments of student professional behavior were collected from participating faculty across the CTE program by the end of May.

Summer 2011: The workgroup convened for its assessment week August 1- to assess the student work samples, review student professional behavior assessments, summarize the results, and draft preliminary findings and recommendations for improvement.

Fall 2011: The workgroup presented its preliminary findings to the College on August 25, 2011, Opening Day of the 2011-2012 academic year (see Appendix X). The workgroup met again in September to review and refine its preliminary findings and recommendations, and to draft a plan to implement those recommendations. The CTE survey of students was also finalized and conducted online in December 2011. Personal e-mail survey invitations were sent to all students who had applied for CTE degrees or certificates in 2011. Two follow-up e-mails were also sent to remind students to complete the survey.

Spring 2012: The results of the CTE student survey, along with post-graduation Nursing surveys and professional licensing exam results for Nursing and Respiratory Therapy, were circulated to the workgroup in February. Several members of the workgroup met on March 6 and March 1 to review and discuss the additional assessment data and to update our findings, recommendations and implementation plan. The Program Assessment Coordinator will present the CTE Assessment Report to the following College bodies in Spring 2012 for feedback and approval: CTE Committee (April 10), SLO Committee (April 16), Program Effectiveness and Planning Committee (May ), Institutional Effectiveness Committee (May 15), Academic Senate (May 17), and Chairs and Directors' Committee (May 22).

V. METHODOLOGY

A. Evidence

The evidence gathered to assess CTE student learning was both direct and indirect. The direct evidence included (1) student writing work samples, (2) student reasoning work samples, (3) faculty assessments of student professional behavior, and (4) the results of professional licensing exams. Indirect evidence included (5) a survey of students who recently completed CTE certificates and/or degrees, (6) surveys of CTE advisory boards, and (7) a survey of employers. For an overview of the CTE assessment model, see Appendix.

B. Sampling

A sampling process was conducted in order to identify a valid subset of CTE students from whom writing and reasoning work could be collected and assessed. Instructors were also asked to assess the professional behavior of students identified in the sample. Sampling was conducted at three levels: courses, sections and students.

1. Courses: In order to capture student learning across the CTE program, the workgroup identified courses in each CTE discipline that students were more likely to take toward the end of their LAVC experience.
We sought the input of department chairs and other faculty representatives in order to identify and prioritize "capstone" or advanced courses. Some disciplines, however, have no progression of courses. In such cases, we identified courses that were most likely taken by majors or students pursuing degrees or certificates in that CTE discipline. This was done in close collaboration with the Dean of Research and Planning.

2. Sections: If more than one section was offered for a course, sections were chosen for sampling to reflect the diversity of instruction: full-time and adjunct instructors, day and evening classes, and online, hybrid or face-to-face class meetings. So, if there were 5 sections of a course, we selected 2- sections with this diversity in mind. We also sought to have our sample population reflect the distribution of students across CTE disciplines. Larger disciplines, such as Child Development and Administration of Justice, should make up larger proportions of the sample population than smaller disciplines, such as Theater or Machine Shop.

3. Students: The Office of Research and Planning generated post-census student enrollment rosters for 70 course sections across 27 CTE disciplines. These rosters were then randomized by student ID number.

Career-Technical Education Program Outcomes Assessment Report June 2012 We then developed several rules for sampling from each section. For example, sections with 10 or fewer students were sampled at 100% (all students). Sections with 11-19 students were sampled at 50%. Sections with 20 or more students were sampled at 25%. These rules determined the number of students to be sampled from each section. Once the section sample size was calculated, the Program Assessment Coordinator randomly (though systematically) selected the appropriate number of students from the rosters already randomized by student ID number. An additional -6 students were then added to the list in the event the work of the sample population was unavailable for collection. Instructors were sent these lists, which included students' names, to facilitate their identification. Each student was also given a unique number. To protect the identity of students, instructors used these unique numbers to identify students on the professional behavior assessment rubrics they completed. A total of 2,180 students were enrolled in these courses/sections at the time of sampling. Our sample size was 55 students, approximately 25%, within a -% margin of error at a confidence level of 95%. The 25% sample size is also slightly more than the minimum 20% sample indicated, according to the accepted sampling practices for SLO assessment (to the extent that there is a standard in place). This provided a small margin for uncollected data. C. Data Collection 1. Student Writing Samples: Participating faculty were sent lists of the students in their classes identified for sampling. Guidelines were sent regarding the collection and submission of these works (see Appendix ). The exact format and timing was left to the discretion of individual faculty. Whenever possible, faculty were encouraged to submit student work based on existing student assignments. 
There was considerable discussion among participating faculty, the workgroup, the SLO Coordinator, the Dean of the Office of Research & Planning and the Chair of the Curriculum Committee about the provision of work samples from some disciplines, such as Computer Science and Nursing, for which there seemed to be no appropriate existing assignments.

2. Student Reasoning Samples: Participating faculty submitted student reasoning work collected from the same sample population. Guidelines were also sent to faculty regarding the collection and submission of these work samples. Again, the exact format was left to the discretion of individual faculty. Whenever possible, faculty were encouraged to submit existing student assignments. In most cases, the student writing samples also served as reasoning samples.

3. Results of Professional Licensing Exams: The National Council of State Boards of Nursing provides the college with a detailed report of the performance of its students on professional licensing exams (NCLEX) each year. The report also includes a comparison of LAVC students' performance from year to year and relative to the performance of students from comparable regional and national institutions. Nursing exam results used in this assessment cycle were drawn from the April 2010 - March 2011 reporting period. The National Board for Respiratory Care (NBRC) provides the college with the passing rates of its students on a number of licensing exams: the CRT exams for certification as an entry-level respiratory therapist, as well as the Clinical Simulation Exam (CSE) and Written Registry Respiratory Therapy

(WRRT) exam for the more advanced credential as a registered respiratory therapist (RRT). For more detail, visit the NBRC website: http://www.nbrc.org/.

4. Assessments of Professional Behavior: Participating faculty used an assessment rubric to evaluate the professional behavior of their students identified in the sample. This professional behavior assessment rubric was approved by the CTE Committee. Students were evaluated in terms of their personal integrity, respect for others, responsibility and punctuality.

5. Assessments of Student Reading, Listening and Speaking Skills: The workgroup was unable to develop a mechanism to assess the reading, listening and speaking skills of CTE students within the time frame of this first assessment cycle.

6. Student surveys: The workgroup developed a survey that asked students to assess their own skills and provide feedback about their educational experience at LAVC. The questions were developed with input from the CTE Committee. A unique survey was developed for each CTE discipline. Each survey included a set of common questions along with unique questions about technical skills specific to each discipline. The workgroup, in consultation with the SLO Coordinator and Dean of Research & Planning, opted to survey students who had applied for CTE degrees and certificates in 2011, and to do so via Survey Monkey. No surveys were completed by students in disciplines in which no certificates or degrees were awarded. The survey was drafted and prepared in Survey Monkey by June 2011. At the end of Fall 2011, personalized e-mails were sent to 06 students, inviting them to complete the online survey. Two follow-up e-mails were sent to unresponsive students to remind them to complete the survey before the deadline. An example of the student survey is included in Appendix 5. Nursing and Respiratory Therapy students were not included in this survey of students.
These disciplines conduct their own exit surveys of graduating students each Spring.

7. Advisory Board surveys: Each department or discipline within the CTE pathway is required to have an Advisory Board, and that Board meets every Spring. At those meetings, Board members are asked to complete a paper survey about different aspects of the program. The survey includes questions about the students' communication skills, reasoning skills, technical skills and professional behavior (see Appendix 6).

8. Employer surveys: Employers who hire LAVC students as interns as part of the College's Cooperative Education program are asked to evaluate the skills of those interns. The survey is included in Appendix 7. Recruiters who participate in the Annual Cooperative Education Job Fair on campus are also surveyed. The Job Fair survey is included in Appendix 8.

The CTE assessment model also calls for the review of course SLO assessment reports to assess student technical skills. Too few course assessment reports had been completed at the time of this assessment cycle to be included as evidence.

D. Data Assessment

Student writing and reasoning samples were assessed by the workgroup according to criteria developed and approved by the CTE Committee. The criteria are detailed in the assessment rubrics.

In order to ensure inter-rater reliability, members of the assessment workgroup participated in norming processes for both writing and reasoning work samples. Each member received copies of five student writing samples and five student reasoning samples. They assessed these samples using the assessment rubrics for those skills. The group then reviewed and discussed their assessments and identified areas in which evaluations differed. The group then developed consensus regarding what constituted different levels of success across each skill dimension.

Following the norming process, the workgroup assessed the remaining 5 writing samples and 6 reasoning samples. Each sample was evaluated by one group member, but no one member assessed all the student work from a particular discipline. At least two faculty assessed samples from each discipline.

The workgroup assessed writing skills in terms of the following four dimensions: organization, content, spelling and vocabulary, and grammar and sentence structure. The rubric is attached as Appendix 9. Reasoning skills were assessed according to the following four dimensions: acquisition of information, analysis or evaluation of information, interpretation of information or drawing of conclusions, and application to solve a problem or issue. The reasoning skills assessment rubric is attached in Appendix 10. Student professional behavior was assessed directly by individual instructors in terms of personal integrity, respect for others, responsibility and punctuality. The rubric is attached in Appendix 11.

The results of all of these outcomes assessments were then tabulated, summarized and reviewed by the workgroup and PA Coordinator by the end of the third day of our 2011 summer assessment week.
On the fourth day, members reviewed and discussed these results, along with the results of the Advisory Board surveys, and drafted preliminary findings and recommendations. The results of the 2011 Nursing and Respiratory Therapy licensing exams and Nursing exit surveys were reviewed in Spring 2012.

VI. FINDINGS

This section summarizes the results and findings of the data assessment process. The workgroup's recommendations and plans to implement changes for each program learning outcome are included in the sections that follow.

A. Student Learning Outcomes

1. Communication Skills

a. A total of 58 student writing samples from across 25 CTE disciplines were assessed. A summary of the results is shown in Table 1. The majority of students met or exceeded expectations across all four dimensions of writing, based on the CTE workgroup's assessment of student writing samples. Student skills showed the greatest room for improvement in terms of grammar and sentence structure. 1

b. 8-87% of CTE student survey respondents indicated that the program prepared them well or very well to communicate clearly through writing and speaking and to listen and read effectively (see Table ).

c. Graduating nursing students strongly agreed that the Nursing program prepared them to utilize effective communication to achieve positive client outcomes. No Respiratory Therapy exit survey data was available at the time this report was prepared.

d. 67% of Advisory Board survey respondents agreed or somewhat agreed that LAVC students left their programs with appropriate communication skills. 2% indicated they were unable to rate.

e. Three employer surveys of LAVC Spring 2011 interns were submitted as part of the College's Cooperative Education program. They indicated that 100% of student employees possessed appropriate communication skills. These surveys are available upon request.
TABLE 1: Results of Faculty Assessment of Student Writing Samples

ASSESSMENT OF STUDENT WRITING SKILLS

                                Exceeds        Meets          Does Not Meet   Insufficient
                                Expectations   Expectations   Expectations    Data
Organization                    5%             52%            9%              %
Content                         5%             58%            5%              %
Grammar & Sentence Structure    21%            61%            16%             %
Vocabulary & Spelling           22%            70%            6%              2%

1 No writing samples were submitted from the following participating disciplines: Finance, Geographic Information Systems, Graphic Design and Respiratory Therapy.

2. Reasoning Skills

a. Faculty assessed 51 student reasoning work samples from across 25 CTE disciplines. A summary of the results is shown in Table 2. The majority of students met or exceeded expectations across all four dimensions of reasoning, according to the assessment of student reasoning samples. Student reasoning skills were strongest in the acquisition of information. They showed the greatest room for improvement in the higher-order cognitive tasks of evaluation/analysis, interpretation and application/problem-solving. 2

b. 89% of student survey respondents indicated that the program had prepared them well or very well to acquire, evaluate and interpret information and to solve problems relevant to their field of study (see Table 4).

c. Graduating nursing students strongly agreed that the LAVC Nursing program prepared them to: (1) assess client needs, (2) formulate clinical judgments and decision-making utilizing critical thinking skills, (3) utilize evidence-based practice in clinical decision-making, and (4) use the dosage calculation math required for the Nursing program. Twenty-six graduating Nursing students (100%) completed the Nursing Graduate Exit Survey in Spring 2011. The results are included in Appendix 1. No exit survey of graduating Respiratory Therapy students was available at the time this report was prepared.

d. 66% of Advisory Board survey respondents agreed or somewhat agreed that LAVC students left their programs with appropriate problem-solving abilities. 1% indicated they were unable to rate.

e. Three employer surveys of LAVC Spring 2011 interns were submitted as part of the College's Cooperative Education program. They indicated that 100% of student employees possessed appropriate problem-solving skills. A summary of the results is shown in Table 5.

f.
Between 2007 and 2011, 92% of Respiratory Therapy program graduates passed the Certified Respiratory Therapist (CRT) licensing exam on their first attempt; 98% passed on their first or second attempt over the same period. Approximately 80% of graduates during this time frame also took the Clinical Simulation Examination (CSE) and Written Registry Respiratory Therapy (WRRT) exams for the more advanced Registered Respiratory Therapist (RRT) credential. These exams assess knowledge of content as well as reasoning and critical thinking skills in a clinical context at a more advanced level. Of that 80%, 96% passed, on average, across both exams for the same 5-year period.

2 No reasoning samples were submitted from the following participating disciplines: Geographic Information Systems, Graphic Design, and Respiratory Therapy.

TABLE 2: Results of Faculty Assessment of Student Reasoning Skills

ASSESSMENT OF STUDENT REASONING SKILLS

                                Exceeds        Meets          Does Not Meet   Insufficient
                                Expectations   Expectations   Expectations    Data
Acquisition of Information      6%             %              7%              2%
Analysis / Evaluation           8%             6%             12%             2%
Interpretation                  %              56%            10%             %
Application                     1%             51%            1%              %

3. Professional Behavior

a. Instructors evaluated the professional behavior of 97 students. A summary of the results is shown in Table 3. 99% of students either met or exceeded expectations in terms of personal integrity and respect for others, according to instructor assessments. About 10% fewer students met or exceeded expectations in the areas of responsibility and punctuality; 10%-12% of sampled students did not meet expectations for these behaviors. However, no norming process was conducted for participating faculty regarding the use of the professional behavior assessment rubric; it was left to individual faculty to decide how to interpret and apply it.

b. 92% of student survey respondents indicated that the program had prepared them well or very well to conduct themselves in an ethical and professional manner in the workplace (see Table 4).

c. Graduating nursing students strongly agreed that the Nursing program prepared them to (1) demonstrate caring behaviors toward clients and (2) manage client care based on the standards of professional practice within the ethical and legal framework of the profession.

d. 67% of Advisory Board survey respondents agreed or somewhat agreed that LAVC students left their programs exhibiting professional behavior. The remaining 33% indicated they were unable to rate.

e. Three employer surveys of LAVC Spring 2011 interns were submitted as part of the College's Cooperative Education program. They indicated that 100% of student employees exhibited professional behavior. A summary of the results is shown in Table 5.
TABLE 3: Results of Faculty Assessment of Student Professional Behavior

FACULTY ASSESSMENT OF STUDENT PROFESSIONAL BEHAVIOR TOTALS

                        Exceeds        Meets          Does Not Meet
                        Expectations   Expectations   Expectations
Personal Integrity      79%            21%            1%
Respect for Others      70%            29%            1%
Responsibility          2%             8%             10%
Punctuality             9%             9%             1%

4. Technical Skills

a. 89% of student survey respondents indicated that they had been prepared well or very well to perform the skills associated with their field of study (see Table 4). The survey did not ask students to assess their discipline-specific technical skills, as called for by the assessment model.

b. 72% of Advisory Board survey respondents agreed or somewhat agreed that students leave LAVC with appropriate technical skills. 29% indicated they were unable to rate.

c. Three employer surveys of LAVC Spring 2011 interns were submitted as part of the College's Cooperative Education program. They indicated that 100% of student employees possessed the appropriate technical skills. A summary of the results is shown in Table 5.

d. Graduating nursing students strongly agreed that the Nursing program prepared them (1) to pass the NCLEX exam and (2) for entry-level Registered Nurse employment. Between 2009 and 2011, an average of 90% of graduates of the Nursing program passed the NCLEX-RN licensing exam on their first attempt.

e. Between 2007 and 2011, 92% of Respiratory Therapy program graduates passed the Certified Respiratory Therapist (CRT) licensing exam on their first attempt; 98% passed on their first or second attempt over the same period. Approximately 80% of LAVC graduates during this time frame also took the Clinical Simulation Examination (CSE) and Written Registry Respiratory Therapy (WRRT) exams for the more advanced Registered Respiratory Therapist (RRT) credential. These exams assess knowledge of content and reasoning/critical thinking skills in a clinical context at a more advanced level. Of that 80%, 96% passed, on average, across both exams for the same 5-year period. The results of the Nursing (NCLEX) and Respiratory Therapy (NBRC) professional licensing exams are included in Appendices 15 and 16.
TABLE 4: Results of Student Survey - Students' Self-Assessments of Program SLO Skills

HOW WELL DID THE LAVC PROGRAM PREPARE YOU TO DO THE FOLLOWING? (responses given as count and percent)

                                                 Very well   Well      Adequately   Inadequately   Total
Conduct yourself in an ethical and
professional manner in the workplace             6 (67%)     2 (25%)   7 (7%)       0 (0%)         95 (100%)
Perform the skills associated with
your field of study                              62 (6%)     2 (25%)   11 (11%)     0 (0%)         97 (100%)
Communicate clearly through writing              7 (9%)      (5%)      12 (1%)      (%)            95 (100%)
Communicate clearly through speaking             50 (5%)     (6%)      11 (11%)     1 (1%)         96 (100%)
Listen effectively                               55 (59%)    2 (%)     7 (7%)       0 (0%)         9 (100%)
Read effectively                                 9 (52%)     (5%)      12 (1%)      0 (0%)         9 (100%)
Acquire, evaluate and interpret information      5 (56%)     1 (%)     9 (10%)      1 (1%)         9 (100%)
Solve problems relevant to your field of study   52 (57%)    0 (%)     9 (10%)      1 (1%)         92 (100%)

Of the 306 students invited to complete the CTE student survey, 100, or 33%, responded within the 3-week allowable time frame. For a complete summary of the results of this survey, see Appendix 12. The response rate was uneven across departments, ranging from 19% to 67%, and the results skew toward the views of students in departments with the greatest number of respondents. For example, 54 of the respondents (54%) applied for degrees or certificates in Child Development, and eleven applied for degrees or certificates in Administration of Justice. Students from these two departments, the two largest CTE departments on campus, constituted 65% of survey respondents. Some of the other disciplines had only 1-2 respondents. When asked broader questions about their preparation at LAVC (Table 4, rows 1 and 2), student responses were more positive; when asked more specifically about concrete skills (Table 4, rows 3-8), students responded somewhat less positively.

B. Program Assessment Process

1. Many faculty expressed concern that they did not have existing assignments that could serve as work samples for program assessment purposes, particularly writing samples. Some also expressed that they did not have enough room in their class schedules or enough advance notice to prepare additional assignments or to inform students that such work would be expected of them.

2. Student writing and reasoning work samples were extremely diverse in terms of format (narrative or PowerPoint outline), length (research paper or two-sentence response), content, and the context in which they were completed (in class or outside of class, exam or ungraded exercise). This diversity made it challenging to assess student skills meaningfully.

3. The writing and reasoning assessment rubrics were not clear enough. Simpler and more specific rubrics would make for a more accurate and meaningful assessment.

4. We were unable to develop a tool for evaluating student reading, listening and speaking skills in time for this assessment cycle.

5. The CTE student survey did not ask students to self-assess their discipline-specific technical skills.

6. Faculty who provided assessments of the professional behavior of their sampled students did not participate in a norming process. It was left to individual faculty to decide how to interpret and apply the rubric.

7. Students in Continuing Education's vocational courses were not included in this CTE assessment cycle, nor were students in Cooperative Education.

8. Program assessment is a lot of work for participating faculty. This cycle took one year, and the workgroup's membership has dwindled, due in part to the PA Coordinator's commitments to organize the assessment of the other two programs. As a result, the Implementation Plan reflects the contributions of fewer faculty.

9. Approximately 0% of Advisory Board survey respondents indicated they were unable to rate student skills.

10. Fifty-seven CTE Advisory Board members responded to the survey. The size of each Advisory Board varies considerably, from 2 to 15 members, and the results of the survey skewed toward departments with the greatest number of respondents. A summary of the Advisory Board Survey results is included in Appendix 1.

11. More than 0% of CTE Advisory Board members have not employed LAVC students.

12. The specifics of the CTE program student learning outcomes are not communicated to advisory boards, employers or students, all of whom are asked to assess these skills. This makes it more difficult to link their assessments of these skills with those of faculty.

13. A number of participating faculty and workgroup members expressed that they were uneasy assessing their students' personal integrity and respect for others. Some said they had no basis for evaluating these qualities. They indicated that they assumed students possessed personal integrity and held respect for others unless they had explicit evidence to the contrary, and were therefore more likely to assess students as exceeding expectations in these areas. Faculty indicated that they were in a better position to evaluate the more readily observable student behaviors of responsibility and punctuality.

VII. RECOMMENDATIONS

This section outlines the recommendations of the CTE program assessment workgroup in light of the findings discussed in the previous sections and the workgroup's year-long experience conducting the College's first program assessment process. In general, our recommendations focus on four areas: (1) improving the visibility of our expectations regarding student learning outcomes to all constituencies: faculty, students, advisory boards and employers; (2) creating more opportunities for CTE students to practice their program-SLO-related skills; (3) bringing students into the assessment process and calling upon them to engage more actively in the evaluation of their own work; and (4) rewarding faculty who participate actively on program assessment and implementation workgroups with professional development credit or another form of recognition.

A. Student Learning Outcomes

1. Communication Skills

a. Make expectations regarding all communication skills more transparent and explicit to students, faculty, advisory boards and employers (all parties asked to demonstrate or assess these skills). Include the assessment rubrics on all syllabi. Share, review and discuss the rubrics as part of existing meetings and conversations at the departmental and program levels.

b. Create more opportunities for students to practice and assess their own communication skills. Incorporate one or more narrative writing assignments into CTE courses in support of this program learning outcome, per California Code of Regulations Title 5, Article 1, Section 55002 and individual course outlines. Have students use the assessment rubrics to evaluate their own work.

c. Familiarize students with the dimensions of effective communication. Make these dimensions explicit in class and throughout coursework, whenever possible.

d. Develop a worksheet to help faculty make these changes in their courses.

e.
Include a statement about the LAVC Writing Center on all CTE course syllabi and encourage students to use this valuable campus resource.

2. Reasoning Skills

a. Make expectations regarding reasoning skills more transparent and explicit to students, faculty, advisory boards and employers (all parties asked to demonstrate or assess these skills). Include the assessment rubric on all syllabi. Share, review and discuss the assessment rubrics as part of existing meetings and conversations at the departmental and program levels.

b. Create more opportunities for students to practice and assess their higher-order reasoning/critical thinking skills. Incorporate one or more reasoning assignments into all CTE courses in support of this program learning outcome, per California Code of Regulations Title 5, Article 1, Section 55002, and per individual course outlines. Have students use the assessment rubric to evaluate their own reasoning/critical thinking skills.

c. Familiarize students with the dimensions of reasoning/critical thinking. Make the dimensions of reasoning explicit in class and throughout coursework, whenever possible.

d. Develop a worksheet to help faculty make these changes in their courses.

e. Approach the Writing Center Director about making the dimensions of reasoning/critical thinking more explicit as part of effective written communication in the Center's work with students.

3. Technical Skills

a. Make expectations regarding discipline-specific technical skills more transparent and explicit to students, faculty, advisory boards and employers (all parties asked to demonstrate or assess these skills). Share, review and discuss lists of discipline-specific technical skills as part of existing conversations and meetings at the departmental and program levels.

b. Share, review and discuss at the departmental and program levels student survey feedback regarding the discipline-specific technical skills students value most after leaving the program.

4. Professional Behavior

a. Make expectations regarding professional behavior more transparent and explicit to students, faculty, advisory board members and employers (all parties asked to demonstrate or assess these skills). Share, review and discuss the assessment rubric as part of existing meetings and conversations at the departmental and program levels.

b. Teach and model professional behavior more explicitly across the CTE disciplines, particularly responsibility and punctuality.

c. Have students use the assessment rubric to evaluate their own professional behavior.

d. Revise the professional behavior assessment rubric to better reflect the description of this program learning outcome.

e. Address faculty concerns regarding the assessment of personal integrity and respect for others.
Clarify the descriptions of each performance level.

B. Program Assessment Process

1. Give faculty more lead time and direction to prepare work samples as part of the next cycle of CTE assessment. The effectiveness of integrating faculty expectations regarding critical thinking skills into coursework was validated as part of a research project at Washington State University. For more information on this project, visit: http://wsuctproject.wsu.edu/ph.htm

2. Refine the assessment rubrics for writing, reasoning and professional behavior. Modify the writing assessment rubric to separate vocabulary and spelling into distinct dimensions.

3. Develop tools to assess student reading, speaking and listening skills for the next assessment cycle.

4. Share the relevant program assessment rubrics with advisory boards and employers, who are asked to evaluate CTE student skills. This will give them a clearer idea of what we consider effective communication skills, reasoning skills, technical skills and professional behavior.

5. Find ways to improve the consistency of student writing and reasoning work samples for the next cycle of program assessment. Propose that students write an autobiographical essay that asks them to connect general course content with their experiences, interests and goals. Also develop a more uniform critical thinking question that makes sense across CTE disciplines. These changes will address the issue of the uneven cognitive demands made of students in producing work samples.

6. Develop a consensus among CTE faculty regarding the appropriate conditions under which students will produce writing samples for the next assessment cycle.

7. Find ways to increase the number of CTE advisory board members who employ our students. Revise the Cooperative Education Employer Survey to include a question about becoming a member of a CTE advisory board.

8. Reward faculty efforts by linking resources and funding to participation in program assessment and/or completion of program student learning initiatives. Make the link explicit on funding request forms.

9. Include Continuing Education's vocational education courses and students as part of CTE assessment.

10. Clarify how best to include Cooperative Education as part of CTE assessment.

11.
Raise the issue of the assessment of students with special needs with Services for Students with Disabilities (SSD), the CTE Committee and the SLO Committee. How should students with special needs be assessed? Success may not mean the same thing for these students.

12. Review the program assessment process in light of online learning. Some CTE faculty who teach online indicated they felt the process is currently better suited to face-to-face teaching environments.

VIII. IMPLEMENTATION PLAN

This section outlines our plan to implement the changes we recommend. It describes the steps to be taken, semester by semester, for each program student learning outcome over the next two years, between Fall 2012 and Spring 2014.

A. Student Learning Outcomes

1. Form a CTE faculty SLO leadership team. The team will take the lead in elaborating upon and carrying out the implementation plan, including setting up additional groups to take the lead on specific projects. Target date: Spring 2012

2. Respectfully request that CTE chairs convey and discuss program-level changes at department meetings. Target date: ongoing

3. Respectfully request that CTE faculty incorporate CTE learning outcomes on syllabi. Target date: Fall 2012

4. A CTE faculty team will develop a worksheet or other materials to help departments and faculty make adjustments, e.g. developing writing assignments and reasoning exercises, and integrating CTE learning expectations and assessment rubrics into coursework. Target date: Fall 2012 - Spring 2014

5. Respectfully request that CTE Chairs work with their faculty to incorporate one or more narrative writing assignments into their courses in support of this CTE learning goal, per Title 5, Article 1, Section 55002 and per course outlines. Target date: Fall 2012

6. CTE faculty will develop a boilerplate statement (e.g. "Did you know...") regarding discipline-specific technical skills for inclusion on relevant course syllabi. These skills statements will be incorporated into the college catalog and into online college information about certificates and degrees. Target date: Fall 2012 - Spring 2014

7. Respectfully request that the Professional Development Committee extend professional development credit to faculty who participate actively as members of program SLO assessment or implementation workgroups.

B. Program Assessment Process

1.
Request that the Professional Development Committee extend professional development credit to faculty who participate actively as members of program SLO assessment or implementation workgroups.

2. Have the CTE communication SLO workgroup recommend guidelines regarding the appropriate conditions under which students will produce writing samples for the next assessment cycle.

3. Request that the Director of Cooperative Education revise the employer survey by Fall 2012 to include a question about joining an Advisory Board.

4. Request that employers use the assessment rubric to evaluate the professional behavior of student interns and employees.

5. Request that all campus supervisors of student employees or interns assess the professional behavior of the students they supervise. Request that they also assess the communication, reasoning and technical skills of student employees, when applicable.

6. Revise the advisory board surveys. Ask members to specify on what basis they are evaluating student skills, e.g. direct contact with individual students as interns or employees, or information provided by the department.

7. Discuss the advisory board survey mechanism with the Office of Research and Planning and the CTE Committee. What mechanism for surveying the advisory boards is most effective, i.e. will generate the best data? Should Advisory Board surveys be conducted online?

8. Develop and conduct a norming process for faculty regarding the assessment of professional behavior in time for the next assessment cycle.

9. Develop a consensus among CTE Committee members regarding the make-up of CTE advisory boards. What best practices can be agreed upon to ensure board members can fulfill their roles effectively?

10. Ask students to self-assess their discipline-specific technical skills as part of the next CTE student survey.

11. Include Continuing Education's vocational education as part of CTE assessment.

12. Include Cooperative Education fully in the next cycle of CTE program assessment.

IX. PLAN FOR SHARING RESULTS

X. LIST OF APPENDICES

A. Appendix 1: CTE Program Assessment Model Grid
B. Appendix 2: CTE Program Assessment Flowchart
C. Appendix 3: Course-Program Alignment Grid
D. Appendix 4: CTE Advisory Board Survey
E.
Appendix 5: CTE Advisory Board Survey Results
F. Appendix 6: CTE Employer Survey

G. Appendix 7: CTE Student Survey Results
H. Appendix 8: CTE Graduates' Assessments of the Value of Technical Skills
I. Appendix 9: CTE Program SLO Assessment Rubric - Writing Skills
J. Appendix 10: CTE Program SLO Assessment Rubric - Reasoning Skills
K. Appendix 11: CTE Program SLO Assessment Rubric - Professional Behavior
L. Appendix 12: Results of Nursing student exit survey
M. Appendix 13: Results of Nursing licensing exams
N. Appendix 14: Results of Respiratory Therapy licensing exams