
Program: MS in Software Engineering
Department: Computer Science
Number of students enrolled in the program in Fall 2011: 19
Faculty member completing template: Nikrouz Faroughi, Graduate Coordinator and member of Assessment Committee
Date: 2/3/2012
Period of reference in the template: 2006-2007 to present

1. Please describe your program's learning-outcomes trajectory since 2006-07: Has there been a transformation of organizational culture regarding the establishment of learning outcomes and the capacity to assess progress toward their achievement? If so, during which academic year would you say the transformation became noticeable? What lies ahead; what is the next likely step in developing a learning-outcomes organizational culture within the program?

The Department offers two master's programs, the MS in Computer Science (MS-CSC) and the MS in Software Engineering (MS-SE). The Department Graduate Curriculum and Assessment Committees initially developed nine (a-i) student learning outcomes during 2008-2010, as outlined in the 2009-2010 Graduate Program Self Study Report, dated June 7, 2010. While the lists of courses in the MS-CSC and MS-SE differ, all the courses are shared as either required or elective courses; therefore, no distinction is made between the two programs in the list of outcomes and measurement methods. (For information on the history of the establishment of the MS-SE Program, see item 8.b of this report.)

Periodic meetings with the Department Industry Advisory Committee (IAC), onsite visits with alumni, supervisors' evaluations of graduate student internships, faculty evaluation of master's thesis/project reports, faculty evaluation of students' oral communication skills, and a survey of graduating MS students were used to assess the learning outcomes. Five skill sets, as outlined in the Fall 2010 Graduate Assessment Progress Report, were highly rated by the IAC.
More importantly, one skill set, the ability to work independently and deal with incomplete information, which was not included among our initial set of outcomes, was rated high by the IAC. The discussions with the IAC served as a catalyst for updating the student learning outcomes in Fall 2010, as outlined in the progress report. Also included in that report are three new rubrics for the evaluation of term projects, term papers, and master's thesis/project works, which, in addition to the existing writing and oral communication rubrics, we believe are more appropriate tools for assessing graduate-level learning outcomes.

In addition to the updates made to the student learning outcomes, the lessons learned from previous undergraduate and graduate assessment cycles are used as guidelines to develop methods that we believe will reduce evaluation errors, distribute data collection across the career of graduate students, make the data-collection task more routine and tied to the normal faculty workload on grading, and provide micro (criterion-level) and macro (outcome-level) performance information. The Department plans to finalize the assessment methods during the next academic year.

2. Please list in prioritized order (or indicate no prioritization regarding) up to four desired learning outcomes ("takeaways" concerning such elements of curriculum as perspectives, specific content knowledge, skill sets, confidence levels) for students completing the program. For each stated outcome, please provide the reason that it was designated as desired by the faculty associated with the program.

The four (4) learning outcomes listed below are unprioritized, because these outcomes are believed to cover different skill sets, and any outcome with less than satisfactory performance (if any) will be marked high priority.

A. An ability to apply knowledge from undergraduate and graduate software engineering and related disciplines to identify, formulate, and solve novel and complex computer science problems that require advanced knowledge within the field.

The faculty believe the core of Software Engineering graduate education should include students' ability to conduct research/project work on novel ideas that illustrates 1) application of new and contemporary advanced concepts; 2) application of fundamental concepts and skills used in the project (for example, the work is a culmination of the undergraduate experience and skill set and includes extensive software/hardware development); and 3) integration of knowledge and skills from multiple software engineering subject areas, such that the work shows innovation and creative application of the student's skill set. Additionally, students should 1) be able to recall applicable facts, theories, and learned materials; 2) be aware of the knowledge pieces and be able to compare, contrast, paraphrase, extend, and summarize them; 3) be able to break down a problem into its knowledge pieces to understand its organizational structure; 4) be able to combine knowledge pieces into new entities; and 5) be able to assess the performance of new entities and make performance-related decisions.

B. An ability to understand and integrate new knowledge within the field.
The faculty believe it is a valuable learning experience for graduate students to be able to summarize or present a published research/project work in a way that 1) identifies the motivation for the research/project work; 2) articulates the proposed solutions; 3) articulates new information into sentences; 4) analyzes and evaluates contributions such as advantages, disadvantages, and applications; and 5) uses logic and argumentation to illuminate the contributions of newly learned materials to topics covered in classes.

C. The ability to plan, conduct, and report (written communication) on an organized and systematic study of an advanced topic within the field.

The faculty believe an important component of graduate education is for students to be able to 1) provide the rationale for conducting thesis/project work; 2) provide a literature review and present it coherently, for example, organized by methodology, chronology, or relevancy of topics; 3) interpret results as they relate to existing solutions, for example, examining and discussing the sensitivity of results to different assumptions, considering the implications of results for key audiences, etc.; and 4) prepare quality reports that demonstrate good writing skills, make meaningful connections between sections, meet the requirements of a style guide, and are free of spelling and grammatical errors.

D. The ability to work as a team in a diverse, changing world, recognize ethical standards, and possess skills for effective oral communication.

The Department faculty members believe that for students to have successful careers they should 1) be able to collaborate and contribute as active team members; 2) possess oral communication skills, for example, articulate introductory and concluding statements with logical progression and a clear, consistently audible voice, consistently make eye contact, and use minimal gap-fillers such as "um"; 3) possess active-listener skills; and 4) abide by ethical standards.

3. N/A

4. For each desired outcome indicated in item 2 above, please:

a) Describe the method(s) by which its ongoing pursuit is monitored and measured.

Outcome A: Each student completes a thesis or project as part of the master's degree. The sponsoring faculty members (two for a project and three for a thesis) will each independently complete an evaluation based on a master's thesis/project rubric (part 1) that is designed to measure, in part, the student's performance. For those graduate courses that include term projects, the faculty member teaching the course will use a term project rubric to additionally evaluate students' performance.

Outcome B: For those graduate courses that include term papers, the faculty member teaching the course will complete a term paper rubric to evaluate students' performance.

Outcome C: The sponsoring faculty members (two for a project and three for a thesis) will each independently complete an evaluation based on a master's thesis/project rubric (part 2) as well as a written communication rubric, both designed to measure the student's performance.

Outcome D: For those graduate courses that include term paper presentations, the faculty member teaching the course will complete an oral communication rubric to evaluate students' performance. Additionally, students' performance is measured in part by group discussions in the IAC, onsite visits with alumni, and supervisors' evaluations of student internships.

b) Include a description of the sample of students (e.g., random sample of transfer students declaring the major; graduating seniors) from whom data were/will be collected and the frequency and schedule with which the data in question were/will be collected.
All the outcomes will be evaluated once every three years using all the data collected in the previous three years. The assessment data collection task is expected to be ongoing, with minimal faculty effort each semester, as it will be routine and directly tied to each faculty member's evaluation of students' work for grading purposes, as follows:

Outcome A: The sponsoring faculty members will evaluate the students' thesis/project work during each semester. For those courses that include term projects and are offered every semester, students' performance is evaluated once a year. For those courses that include term projects and are offered less frequently, students' performance is evaluated when the courses are offered. In the current curriculum, 11 courses have individual projects and 15 have group projects. The majority of students are expected to take several courses from this list to satisfy their degree program, and thus the collected data will be a good indication of student performance.

Outcome B: For those courses that include term papers and are offered every semester, students' performance is evaluated once a year. For those courses that include term papers and are offered less frequently, students' performance is evaluated when the courses are offered. In the current curriculum, 6 courses have individual term papers and 5 have group term papers. Students who take one or more of these courses will represent a random sample of all the students in the program.

Outcome C: The sponsoring faculty members will evaluate the students' thesis/project work during each semester; thus, data for this outcome will be collected for all graduating students.

Outcome D: For those courses that include term paper presentations, students' performance is evaluated every time the courses are offered. One IAC meeting per year and one onsite visit with alumni every two years are routine. Internship supervisors' evaluations serve as a reasonable sample with which to evaluate students' performance; 15-20 students enroll in the internship course every semester, including summer.

Additionally, in an effort to provide timely feedback to students on their learning outcomes, faculty are encouraged to return students' rubric scores, especially those of term projects and papers, along with each student's evaluated report. One would expect the feedback received in one semester to help students do better in a following semester.

c) Describe and append a sample (or samples) of the instrument (e.g., survey or test), artifact (e.g., writing sample and evaluative protocol, performance review sheet), or other device used to assess the status of the learning outcomes desired by the program.
Outcome A: Master's Thesis/Project Rubric, questions 1-4 (Appendix A); Term Project Rubric (Appendix B)

Outcome B: Term Paper Rubric (Appendix C)

Outcome C: Master's Thesis/Project Rubric, questions 5-7 (Appendix A); Written Communication Rubric (Appendix D). [Note: This version of the rubric was used during the previous assessments. It is anticipated that in future assessments, certain criteria from this rubric will be combined with those in the Master's Thesis/Project Rubric.]

Outcome D: Oral Communication Rubric (Appendix E); IAC questionnaire (Appendix F); Internship supervisor evaluation form (Appendix G)

d) Explain how the program faculty analyzed and evaluated (will analyze and evaluate) the data to reach conclusions about each desired student learning outcome.

Each rubric consists of a list of specific performance criteria (e.g., knowledge, comprehension, application, etc., from the Term Project Rubric, Appendix B), each of which will be scored between 1 and 4 by faculty members as follows:

4 - Exceeds expectations
3 - Meets expectations
2 - Progressing to expectations
1 - Below expectations

For a student's performance in a criterion to be considered satisfactory, faculty evaluators must score the student's performance as either 3 or 4. The following set of analyses will be used for assessment purposes (see Appendices H, I, and J for examples):

i. For each of the performance criteria (e.g., knowledge, comprehension, application, etc., from the Term Project Rubric, Appendix B), the average performance (AP) of all faculty scores will be computed. An AP of, say, 3.5 with a low standard deviation in, for example, the knowledge criterion would mean that on average the performance of all students meets or exceeds expectations in that criterion. An AP with a high standard deviation would signify a weakness in that performance criterion, either in a majority of students or only in a limited number of students. Faculty will discuss the results to address the underlying issues.

ii. The average of all APs of the respective list of performance criteria will be computed as the aggregate average performance (AAP) for the work done by all students as part of their master's theses/projects and term projects and papers.
Again, an AAP of, say, 3.5 with a low standard deviation in, for example, term projects would mean that on average all students meet or exceed expectations on term projects. An AAP with a high standard deviation would signify a weakness either in a majority of students or only in a limited number of students. Faculty will discuss the results to address the underlying issues.

The AAP computed from a specific list of performance criteria will be used to measure students' performance in each of the learning outcomes as follows:

Outcome A: The AAP is computed based on criteria 1 through 4 in the Master's Thesis/Project Rubric and the list of criteria in the Term Project Rubric.

Outcome B: The AAP is computed based on the list of criteria in the Term Paper Rubric.

Outcome C: The AAP is computed based on criteria 5 through 7 in the Master's Thesis/Project Rubric and the list of criteria in the Written Communication Rubric. [Please refer to the Note in Item 4.c.]

Outcome D: The AAP is computed based on the list of criteria in the Oral Communication Rubric and internship supervisors' evaluation forms.

5. Regarding each outcome and method discussed in items 2 and 4 above, please provide examples of how findings from the learning outcomes process have been utilized to address decisions to revise or maintain elements of the curriculum (including decisions to alter the program's desired outcomes). If such decision-making has not yet occurred, please describe the plan by which it will occur.

In item 4, we outlined the methods that will be used to evaluate the updated learning outcomes listed in item 2. In this section, however, we report on the methods and results of assessing students' written communication skills, which was conducted twice, as reported in "Re-Assessment of Written Communication," dated February 2, 2011, as well as the summary assessment reports on oral communication and internship supervisors' evaluations.

Outcome A: Table 1 shows the summary results of supervisors' evaluations of student internships from Fall 2006 to Fall 2009. All MS students evaluated met or exceeded the performance criteria assessed as part of the employer internship evaluations. As shown in the table, 98.68% of our students are consistently well regarded by their employers in their ability to develop a computerized solution to a real-life problem.

Table 1.
Interns Ranked Average to Outstanding by Their Supervisors

Performance Criteria                                                                  | % Students Meeting or Exceeding Criteria
Ability to Develop a Computerized Solution to a Real-Life Problem (Outcomes A and B)  | 98.68%
Ability to Function as a Team Member (Outcome D)                                      | 100.00%
Effective Oral Communication (Outcome D)                                              | 98.75%
Effective Written Communication (Outcome C)                                           | 98.67%
Appropriate Use of Presentation Tools (Outcome D)                                     | 100.00%
Awareness of Ethical and Societal Concerns                                            | 100.00%

Outcome B: The development of a computerized solution to a real-life problem requires the integration of various computer engineering concepts, and according to Table 1, employers are very satisfied with our students' ability to develop software.

Outcome C: Evaluation of Written Communication

In the 2009-2010 academic year graduate program assessment cycle, 20 master's projects were evaluated, each by two faculty members, using the Written Communication Rubric, which had nine performance criteria. The initial assessment results indicated possible deficiencies in the syntax, analysis-of-results, and conclusion performance criteria. However, after careful analysis of the data, it was determined that the results obtained were not conclusive due to 1) insufficient evaluation data for each project, and 2) a lack of evaluation standards; faculty members were not trained prior to their evaluation of the project reports to follow a set of common standards. Additionally, the Assessment Committee decided to add one more performance criterion: the student's overall ability to communicate the substance of the project in a coherent manner.

A decision was made to close the loop and repeat the assessment in the next assessment cycle, which took place in Fall 2010 with seven master's projects that were completed in Spring 2010. The number of evaluators for each project was increased from 2 to 4 or 5, and a general evaluation guideline intended to reduce individual differences in evaluation was given to each evaluator. The projects were evaluated using 10 (9+1) performance criteria. The results of this assessment indicated that the few deficiencies observed in the earlier assessment cycle, which had an insufficient number of evaluators, were this time limited to only two projects, and, based on an average score of 3.0 or higher, all projects met or exceeded expectations. The deficiencies present in those two project reports were discussed in a department meeting, and faculty were asked to pay more attention to those criteria in the future.

The two lessons from this assessment experience are 1) the need to increase the number of faculty evaluation data points without substantially increasing faculty workload, and 2) that faculty ought to agree on a set of standards when evaluating students' work.
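The AP/AAP aggregation described in Item 4.d, which underlies assessments like the one above, can be sketched as follows. This is a minimal illustration only; the criterion names and scores are hypothetical, not the Department's actual rubric data or tooling.

```python
from statistics import mean, stdev

# Hypothetical rubric scores: criterion -> one score per faculty evaluator,
# on the 1-4 scale described in Item 4.d.
scores = {
    "knowledge":     [4, 3, 4, 3],
    "comprehension": [3, 3, 4, 4],
    "application":   [2, 3, 2, 3],
}

SATISFACTORY = 3.0  # a criterion "meets expectations" at a score of 3 or 4

# Average performance (AP) and spread for each criterion across evaluators.
ap = {c: mean(v) for c, v in scores.items()}
spread = {c: stdev(v) for c, v in scores.items()}

# Aggregate average performance (AAP) over the whole list of criteria.
aap = mean(ap.values())

# Criteria whose average falls below the satisfactory threshold are flagged
# for faculty discussion.
flagged = [c for c, a in ap.items() if a < SATISFACTORY]

for c in scores:
    print(f"{c}: AP={ap[c]:.2f} (std dev {spread[c]:.2f})")
print(f"AAP={aap:.2f}; criteria below threshold: {flagged}")
```

A high standard deviation on a criterion would prompt a closer look at whether the weakness is widespread or confined to a few students, as the report describes.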
The list of new rubrics and the data collection and analysis methods described in Item 4 are attempts to collect more, and more detailed, data without substantially increasing faculty workload. In contrast to the results for the assessment of oral and written communication, it is worth noting that employers, as shown in Table 1, rated students much more highly than faculty did when specific performance criteria were used.

Outcome D: Oral Communication

Table 2 presents the evaluation results of student presentations during the graduate symposium on April 19, 2010. With the exception of the Supporting Materials criterion, the percentage of students satisfying the performance criteria for oral communication far exceeded the "Meets" level. In general, our students have effective oral communication skills. However, it was recommended that more emphasis be placed in CSc 209, Research Methodology, on the need to adequately document supporting materials used in projects or theses. Such documentation should be included in oral presentations. This recommendation has already been implemented in CSc 209.

Table 2. Students Meeting or Exceeding Oral Communication Categories.

Performance Criteria                  | % Students Meeting or Exceeding Criteria
Organization Pattern                  | 86%
Language Choices                      | 100%
Delivery Techniques                   | 93%
Supporting Materials                  | 71%
Communication of Technical Content    | 100%

Outcome D: Internship Supervisors' Evaluations

6. Has the program systematically sought data from alumni to measure the longer-term effects of accomplishment of the program's learning outcomes? If so, please describe the approach to this information-gathering and the ways in which the information will be applied to the program's curriculum. If such activity has not yet occurred, please describe the plan by which it will occur.

The College of Engineering and Computer Science conducted the Educational Benchmarking Inc. (EBI) survey of graduating MS students during the 2008-2009 academic year. A total of nineteen (19) MS-CSC and MS-SE students responded; the survey did not distinguish between the two groups of students. The survey consists of 15 overall criteria, referred to as factors, each comprising 2 to 8 questions. The EBI survey is primarily designed for graduating BS students; however, all the questions, except those of Factor 15, are general and could be used in part to collect indirect outcome-related data from graduating MS students. Table 3 is a summary of the survey results for the factors that relate, in part, to learning outcomes A through D.

Table 3. EBI survey results related in part to learning outcomes A through D.

Factor                                      | Outcome | Mean | Std Dev
3. Breadth of Curriculum                    | A, D    | 5.55 | 1.24
4. Team & Extracurricular Activities        | D       | 5.33 | 1.50
8. System Design & Problem Solving          | A       | 5.75 | 0.93
9. Impact of Engineering Solutions          | D       | 5.82 | 0.72
10. Use of Tools and Text                   | C, D    | 5.63 | 1.00
11. Apply Knowledge and Identify Problems   | B       | 5.52 | 1.49
12. Design Experience Built On Coursework   | B       | 5.82 | 0.81

The remaining 7 factors do not relate to learning outcomes A through D and include questions concerning college-life experiences, such as the quality of classroom lectures, course-related workload, quality of laboratory facilities and computing services, career services, social topics, and the student body.
The results from this survey indicate that the mean scores, except for Factor 4, were greater than the EBI benchmark goal of 5.5. The questions on extracurricular activities in Factor 4 refer to participation in student organization activities, in which graduate students typically do not participate. On the other hand, the response to the question "Value derived from team experiences" (also in Factor 4) had a mean of 5.46 and a standard deviation (std dev) of 1.45. The std dev results, especially those of Factors 3, 4, and 11, indicate a wider range in how students perceived the value of their MS degree education. This could be due in part to the educational backgrounds of the students: a large number of our graduate students are international students who earned their undergraduate computer science or related degrees elsewhere. The set of new rubrics, as well as data collection across the career of graduate students, should provide more objective assessment information that we hope will help to better gauge students' preparedness for graduate education.

Additionally, the survey results were compared to those of the "Select 6" group of institutions, which included CSU Northridge, CSU LA, University of the Pacific, University of Texas-Austin, Santa Clara University, and Oregon State University. This group includes several institutions that are similar to CSUS, including two CSU campuses. The combined performance of the students in the MS-CSC and MS-SE programs was comparable to, and in some cases better than, that of the Select 6 group.

7. Does the program pursue learning outcomes identified by an accrediting or other professional discipline-related organization as important? Does the set of outcomes pursued by your program exceed those identified as important by your accrediting or other professional discipline-related organization?

N/A

8. Finally, what additional information would you like to share with the Senate Committee on Instructional Program Priorities regarding the program's desired learning outcomes and assessment of their accomplishment?

The Department would like to share the following information with the Senate Committee on Instructional Program Priorities:

a. In February 2011, an external consultant and the University Program Review Team evaluated both the MS-CSC and MS-SE programs. The following comments are direct quotes from their combined report:

The two master's programs (Computer Science and Software Engineering) and the Certificates of Advanced Study in the Computer Science Program at the Department of Computer Sciences be approved for six years or until the next scheduled program review.

The review team commends the Department for being recognized as one of the premier programs in computing in the CSU system.

The review team commends the Department Chair and the Graduate Coordinator for their leadership and commitment to graduate program review and assessment.

The review team commends the faculty in the Department for their hard work in the graduate program review and assessment.
The review team commends the Department for developing an assessment plan that gets at the types of data that are necessary to understand what the students are learning.
The review team commends the Department for carefully constructing rubrics to evaluate the learning of graduate students.
The review team commends the Department for its efforts to close the loop with assessment data.
The review team commends the Department for engaging in and implementing strategies to improve the analysis efforts in the assessment process.

b. The MS-SE Program at Sacramento State is the first MS-SE program in the California State University system. After consulting with the computing and software development industry regarding its needs for professionals with graduate education in Software Engineering, the Department began to offer the MS-SE Program in Spring 1999 with the support of the College, the University, and the Office of the Chancellor. Although the MS-CSC and MS-SE programs have different degree requirements, the courses required by the MS-SE Program are part of the normally offered curriculum of the MS-CSC Program. More specifically, all the core courses of the MS-SE Program are among the elective courses of the MS-CSC Program, and all the other courses (core and electives) of the MS-CSC Program can be taken to meet the elective requirements of the MS-SE Program. Thus the MS-SE Program adds no cost to the MS-CSC Program or to the Department. The graduate courses offered by the Department have been well enrolled. The Department is proud of the curriculum design and development for the MS-SE Program, which have helped the Department ensure that the MS-SE Program has no resource implications.

c. The Department has recently received nationally competitive grants and recognitions. In 2007 the US Department of Homeland Security and the National Security Agency designated the Department as a national Center of Academic Excellence in Information Assurance and Security Education. This national-level designation recognizes the Computer Science faculty's achievements in research and in curriculum development for both undergraduate and graduate programs in response to the discipline's needs in information assurance and security.
In September 2010, the Department received about $1.2 million in funding from NSF for a Scholarship for Service Program to educate and train, over the next four years, 15 undergraduate and graduate students to become information assurance and security professionals, mainly for Federal and State governments. This NSF grant also provides the Department with funding support for faculty development. In January 2011, the Department also received $740K in funding from NSF for a three-year project on game design to mentor K-12 educationally disadvantaged students in computer science and math. These two most recently received nationally competitive NSF grants again recognize the Computer Science faculty's achievements in scholarly contributions and curriculum development. In addition, other grants were also awarded for funded research projects.

d. This report includes data and information from the following existing reports, which are available upon request:
Academic Program Review Report, Graduate Programs, College of Engineering and Software Engineering, Oct. 2011
Fall 2010 Graduate Program Assessment Program Report, Department of Software Engineering, January 19, 2011
Re-Assessment of Written Communication, dated February 2, 2011
2009-2010 Graduate Program Self Study Report, Department of Software Engineering, June 7, 2010

Appendix A Masters Thesis/Project Rubric
(Each item is rated: Exceeds / Meets / Progressing to / Below / Not Applicable. The descriptions below state the "Meets Criteria" level.)

Fundamental and advanced knowledge
1. Research/Project Topic: Includes some research and/or novel ideas
2. Advanced Concepts: Includes a few applications of new and contemporary advanced concepts
3. Fundamental concepts and skills: The work is a culmination of the undergraduate experience and skill set and includes good levels of software/hardware development
4. Integration: Includes integration of knowledge and skills from a few computer science and/or software engineering subject areas

Ability to plan and conduct research/development work
5. Research/Project Objectives: The research/project question (or statement) is clearly articulated to the reader, and sufficient background is provided for the reader to understand the importance of the topic. The chosen topic applies the student's skill set.
6. Understanding of Literature: Demonstrates a comprehensive review of peer-reviewed academic literature related to the student's topic; the student identifies limitations of the existing literature.
7. Analysis/Presentation of Results: The student presents results in tabular and/or graphical form to facilitate the reader's understanding (professional-quality tables and graphs). The student clearly summarizes results; the discussion of results is focused and tied to the proposed research/development question and describes implications for future research.

Appendix B Term Project Rubric
(Each item is rated: Exceeds / Meets / Progressing to / Below / Not Applicable.)

Fundamental and advanced knowledge
1. Knowledge: Ability to recall all the applicable facts, theories, and learned materials
2. Comprehension: Aware of the knowledge pieces (e.g., can compare, contrast, paraphrase, extend, summarize)
3. Application: Ability to apply the knowledge pieces to new ideas
4. Analysis: Ability to break down a problem into its knowledge pieces to understand its organizational structure
5. Synthesis: Ability to combine knowledge pieces into new entities
6. Evaluation: Ability to assess the performance of new entities and make performance-related decisions

Attachment C Term Paper Rubric
(Each item is rated: Exceeds / Meets / Progressing to / Below / Not Applicable.)

Knowledge integration within the field
1. Background information: Able to identify relevant supporting arguments for the work, such as the original author's motivation for the topic (e.g., how clearly the motivation is identified), its relevancy to the discussion that follows, etc.
2. Proposed solution(s): Able to articulate the proposed solution(s) (e.g., how understandable the solutions are; how well the discussion transitions from point to point)
3. Use of evidence: Able to use primary source information to support every point (e.g., how well information is integrated into sentences)
4. Analysis: Ability to analyze and identify contributions (e.g., advantages, disadvantages, and applications)
5. Logic and argumentation: Ability to use logic and argumentation to make points (e.g., how well ideas flow logically and make novel connections to other material, such as class subjects, that illuminate contributions)

Attachment D Written Communication Rubric

Table 1. Evaluation of composition and completeness
(Each criterion is scored: 4 Exceeds, 3 Meets, 2 Progressing to, 1 Below, or NA.)

Structure. This section evaluates the formal structure of the report, including the organization of sections and subsections. Reports should have a title and a table of contents showing logical sections and subsections.
Structure (organization and transitions)
4 Exceeds: The report is well organized and maintains a consistent style. Transitions are logical and smooth.
3 Meets: Report is organized with a reasonable flow of ideas. Most transitions are logical and smooth.
2 Progressing to: Report is somewhat organized. Transitions are not always logical and smooth.
1 Below: Report is not organized. Little sense of wholeness and completeness. Poor transitions.

Syntax, sentence structure, and conventions of standard English. This section evaluates the author's use of language to clearly communicate ideas. Spelling and grammar are included in the evaluation.
4 Exceeds: Words are chosen with care in consideration of fine differences in meaning. Very few errors in syntax, spelling, and/or grammar.
3 Meets: Sentence structure usually conveys the intended meaning. In general, there are few errors in syntax, spelling, and/or grammar.
2 Progressing to: Sentence structure sometimes conveys confusing meanings, but the intent can still be discerned from the context. A number of errors in syntax, spelling, and/or grammar.
1 Below: Sentence structure conveys misleading meanings. Many errors in syntax, spelling, and/or grammar.

Paragraph Structure. This section evaluates the author's integration of sentences into meaningful paragraphs. Please evaluate the report with respect to the following description of a well-written paragraph: The first sentence of a paragraph establishes some perspective for the remainder of the paragraph (e.g., a topic sentence or a transitional sentence). Within a paragraph, sentences are relevant to the paragraph and are in a logical order. Near the end of the paragraph, there is some statement that unifies or completes the ideas presented in that paragraph.
4 Exceeds: Paragraphs are on topic and understandable. Stylistic variations show command of language.
3 Meets: Most paragraphs are on topic and understandable, with some errors. Although there may be some loss of focus, paragraphs are reasonably written.
2 Progressing to: Some paragraphs indicate good structure, but often paragraphs do not show unifying thought and logic. Sentences within paragraphs seem to be related.
1 Below: Paragraphs are confusing, with unclear topic and meaning.

Table 2. Presentation of technical content
This is an evaluation of writing skills as used to convey technical content, not an evaluation of the perceived difficulty of the project. Consider whether the student has effectively communicated the attributes of the project. If any of the following aspects does not apply, then mark NA.

Problem Statement. This section evaluates the problem statement. A problem statement describes the purpose of the work (i.e., the need being addressed) as well as how the project results will accomplish that purpose.
4 Exceeds: Objective, nature of challenges, and value of the project are clearly established.
3 Meets: Objective, nature of challenges, and value of the project are adequately stated.
2 Progressing to: Some significant aspects of the objective, nature of challenges, and value of the project are missing.
1 Below: Significant aspects of the objective, nature of challenges, and value of the project are missing.

Design Requirements. This section includes specifications of functional and/or non-functional requirements.
4 Exceeds: Specifications are complete. Appropriate design constraints have been identified.
3 Meets: Specifications are fairly complete. Most design constraints have been identified.
2 Progressing to: Some specifications are missing. Some design constraints are not identified.
1 Below: Requirements are not specified. Design constraints are not identified.

Development Process. In this section, students document their development process. The purpose is not to write a history of the project, but to document key development decisions and the factors that should be considered in making those decisions. It is possible that this section will recommend to the reader an improvement over the development process that was actually followed.
4 Exceeds: Key development decision alternatives are well identified and/or compared. Reasoning shows a deep understanding of the problem area.
3 Meets: Key development decision alternatives are adequately identified and/or compared. Reasoning shows a good understanding of the problem area.
2 Progressing to: Limited key development decision alternatives are identified and/or compared. Reasoning shows a limited understanding of the problem area.
1 Below: Key development decision alternatives are not identified and compared. Reasoning does not show an understanding of the problem area.

Analysis of Project Results. In this section, do not evaluate how far the student has developed the project, but evaluate whether you understand what has been accomplished in the project on the basis of data analysis and performance results.
4 Exceeds: All important aspects of the performance of the project are described with measured results or precise evaluative statements. The implementation of specified requirements is fully analyzed and verified.
3 Meets: Most important aspects of the performance of the project are described with measured results or evaluative statements. The implementation of specified requirements is adequately analyzed and verified.
2 Progressing to: Some aspects of the performance of the project are described with measured results or evaluative statements. The implementation of specified requirements is minimally analyzed and verified.
1 Below: No aspect of the performance of the project is described with measured results or evaluative statements. The implementation of specified requirements is not analyzed and verified.

Conclusion. Evaluate how well the report summarizes and evaluates the major efforts involved in the project and discusses future work.
4 Exceeds: Conclusion succinctly describes the accomplishments of the effort and relates them to the original problem. Future work is fully discussed.
3 Meets: Conclusion clearly describes most of the accomplishments and relates them to the original problem statement. Future work is reasonably well discussed.
2 Progressing to: Conclusion describes some of the accomplishments and relates them to the original problem statement. Discussion of future work is very limited.
1 Below: No clear summary of project. No discussion of future work.

Appendix E Oral Communication Rubric
(Each item is rated: Exceeds / Meets / Progressing to / Below.)

ORGANIZATION
Organizes content logically and sequentially. Main points are clearly identified and concisely presented. Transitions are logical and smooth. Introduction, body, and conclusion are clearly delineated. Provides a clear summary of the project.

STYLE and DELIVERY
Attracts and holds the interest of the audience. Speaks clearly, distinctly, and with sufficient volume. Presents material effectively with confidence and enthusiasm. Maintains eye contact throughout the presentation.

LANGUAGE and VOCABULARY
Appropriate use of vocabulary. Accurate use of technical terms and phrases. Consistently follows rules of standard English.

COMMUNICATION OF TECHNICAL CONTENT
Presents ideas and arguments persuasively, logically, and clearly. Techniques used are clearly stated and presented. Demonstrates a thorough knowledge of the problem area. Uses appropriate visual aids that are clear, readable, and aid in better understanding of the project. Answers all questions clearly and to the point.

Attachment F Industrial Advisory Committee (IAC) Survey
B.S.* in Computer Science Program, CSUS

IAC Member Name:
Date of Survey Completion:

Please rate each of the 10 characteristics listed below in terms of relative importance by giving a numerical score: 4 Essential, 3 Important, 2 Desirable, or 1 Not relevant.

Characteristics of successful employees 3 to 5 years after completion of B.S. degrees in Computer Science (Score):
1. Significant contributor in one or more of the aspects associated with the development, maintenance, and support of real-world computing systems.
2. Effective and contributing member of a project team.
3. Continually engages in the pursuit of professional development opportunities and/or pursues postgraduate studies.
4. Assumes a leadership role in their chosen career and profession.
5. Demonstrates competence in producing high-quality written documents, and in reviewing and revising those prepared by others.
6. Demonstrates competence in developing and delivering technical and non-technical presentations of high quality to a variety of audiences.
7. Abides by the ethical standards of the profession and understands the social implications of their professional activities.
8. Able to work independently and function in an environment with incomplete information.
9. Able to take risks and think outside of the box.
10. Maintains currency in the discipline.

Additional (extra) characteristics that should also be included (please also rate their relative importance):

*The same criteria hold for MS students, according to IAC members.

Appendix G Internship Supervisor Evaluation Form (CSC 195 / CSC 295)

Student Name:
Organization:
Supervisor Name:                Phone:
Position:
Work period from:    to:        Average hours per week:
Copy of report received?        Date:

The above student has requested college units for the experience gained under your supervision. We would appreciate it if you would rate the student on each of the following (Outstanding / Above Average / Average / Below Average / Weak / Did Not Observe):

Ability to develop a computerized solution to a real-life problem using appropriate tools
Ability to function as a team member
Effective oral communication
Effective written communication
Appropriate use of presentation tools
Awareness of ethical and societal concerns

Additional comments (continue on back or attach a separate page if needed):

Thank you for your cooperation. Signature:

Attachment H Mock evaluation scores for a term project
Faculty Name: JS (#1)   Student: #1   Course: CSc2xx   Semester: Fall 2012
(Each item is rated: Exceeds Expectation / Meets Expectation / Progressing to Expectation / Below / Not Applicable.)

Fundamental and advanced knowledge
1. Knowledge: Ability to recall all the applicable facts, theories, and learned materials
2. Comprehension: Aware of the knowledge pieces (e.g., can compare, contrast, paraphrase, extend, summarize)
3. Application: Ability to apply the knowledge pieces to new ideas
4. Analysis: Ability to break down a problem into its knowledge pieces to understand its organizational structure
5. Synthesis: Ability to combine knowledge pieces into new entities
6. Evaluation: Ability to assess the performance of new entities and make performance-related decisions

Appendix I Mock faculty member evaluation scores
Faculty Member: #1   Course: CSc2xx   Semester: Fall 2012
Fundamental and advanced knowledge (Term Projects)

Student   1. Knowledge  2. Comprehension  3. Application  4. Analysis  5. Synthesis  6. Evaluation
#1        3             4                 3               2            3             4
#2        4             4                 4               4            3             4
#3        3             3                 2               4            4             3
#4        3             3                 3               3            3             3
Fac. AP:  3.25          3.5               3               3.25         3.25          3.5            AAP: 3.29  Std Dev: 0.17
Std Dev:  0.43          0.50              0.71            0.83         0.43          0.50
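The aggregation behind this table can be sketched in a few lines of Python. This is a minimal sketch using the mock scores above; the names `scores`, `faculty_ap`, `aap`, and `spread` are illustrative, not part of the report. Each criterion average (Fac. AP) is the mean across students, and the AAP and Std Dev are the mean and population standard deviation of those six averages.

```python
from statistics import mean, pstdev

# One row per student: scores for Knowledge, Comprehension, Application,
# Analysis, Synthesis, Evaluation (mock data from Appendix I).
scores = [
    [3, 4, 3, 2, 3, 4],  # student #1
    [4, 4, 4, 4, 3, 4],  # student #2
    [3, 3, 2, 4, 4, 3],  # student #3
    [3, 3, 3, 3, 3, 3],  # student #4
]

# Fac. AP: per-criterion average across students.
faculty_ap = [mean(col) for col in zip(*scores)]

# AAP: mean of the six criterion averages; Std Dev: population
# standard deviation of those averages.
aap = mean(faculty_ap)
spread = pstdev(faculty_ap)

print([round(x, 2) for x in faculty_ap])  # → [3.25, 3.5, 3.0, 3.25, 3.25, 3.5]
print(round(aap, 2), round(spread, 2))    # → 3.29 0.17
```

The printed values match the Fac. AP row, AAP, and Std Dev in the table, which suggests the report uses the population (not sample) standard deviation.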

Appendix J Mock faculty evaluation scores
Fundamental and advanced knowledge (AP Scores), Term Projects

Semester: Fall 2012
Faculty Member #  1. Knowledge  2. Comprehension  3. Application  4. Analysis  5. Synthesis  6. Evaluation  AAP    Std Dev
1                 3.25          3.5               3               3.25         3.25          3.5            3.29   0.17
2                 3.75          4                 3.8             3.2          2.9           3              3.44   0.42
3                 3.5           3.2               3.1             3            3.6           3              3.23   0.24
Dept. AP:         3.5           3.6               3.3             3.2          3.3           3.2            3.32   0.16
Std Dev:          0.20          0.33              0.36            0.11         0.29          0.24

Semester: Spring 2013
Faculty Member #  1. Knowledge  2. Comprehension  3. Application  4. Analysis  5. Synthesis  6. Evaluation  AAP    Std Dev
2                 4             3.5               3               3            2.9           3              3.233  0.39
3                 3.7           3.5               3.1             3.4          3             3.2            3.317  0.24
5                 3.5           3.6               3               3.25         3             3.2            3.258  0.23
Dept. AP:         3.73          3.53              3.03            3.22         2.97          3.13           3.27   0.27
Std Dev:          0.21          0.05              0.05            0.16         0.05          0.09

Department, two semesters: 3.30 (std dev 0.03)
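The department-level rows appear to follow the same convention one level up: each Dept. AP entry averages the faculty AP scores for that criterion, and the department AAP and Std Dev summarize the six department-level averages. A sketch, assuming that convention and using the Fall 2012 rows above (the names `fall_2012`, `dept_ap`, etc. are illustrative):

```python
from statistics import mean, pstdev

# Fall 2012 per-criterion AP scores, one row per faculty member
# (mock data from the Appendix J table).
fall_2012 = [
    [3.25, 3.5, 3.0, 3.25, 3.25, 3.5],  # faculty #1
    [3.75, 4.0, 3.8, 3.2, 2.9, 3.0],    # faculty #2
    [3.5, 3.2, 3.1, 3.0, 3.6, 3.0],     # faculty #3
]

# Dept. AP: per-criterion average across faculty members.
dept_ap = [mean(col) for col in zip(*fall_2012)]

# Department AAP and Std Dev: mean and population standard deviation
# of the six department-level criterion averages.
dept_aap = mean(dept_ap)
dept_spread = pstdev(dept_ap)

print(round(dept_aap, 2), round(dept_spread, 2))  # → 3.32 0.16
```

This reproduces the Fall 2012 Dept. AP AAP (3.32) and Std Dev (0.16); the two-semester department figure (3.30) is then the mean of the Fall 2012 and Spring 2013 department AAPs.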