
ABET Criteria for Accrediting Computer Science Programs Mapped to 2008 NSSE Survey Questions
First Edition, June 2008

Introduction and Rationale for Using NSSE in ABET Accreditation

One of the most common institutional uses of NSSE data is for accreditation. In fact, NSSE schools report that accrediting agencies are the primary external group with which they share NSSE results. There are two major reasons for this.

First, when assessing educational effectiveness, accrediting agencies give less weight to indicators of institutional resources, such as library holdings and student characteristics, and more to evidence of student learning. Indeed, regional associations and various specialized accrediting organizations urge colleges and universities to measure student learning more thoroughly; demonstrating that processes are in place to assess and enhance learning outcomes and institutional effectiveness on an ongoing basis is among their recommendations. Student engagement results from NSSE are a direct indicator of what students put into their education and an indirect indicator of what they get out of it. NSSE data can show how engaged various types of students are in effective educational practices during the first and last years of college.

Second, regional and professional accreditation standards encourage institutions to focus on self-evaluation and formative reviews that guide improvement efforts. Rather than fashioning self-studies as stand-alone documents for one-time use, institutions have begun to feature more elements of strategic planning and program evaluation that can be used to identify areas where they wish to improve. NSSE data are especially valuable for this purpose. The results are actionable; that is, they point to aspects of student and institutional performance that institutions can use to improve the curriculum, pedagogy, instructional emphases, and campus climate.
In addition, because NSSE results allow a school to compare itself to others, the data often point to areas where improvement may be desired. NSSE results help answer key questions related to institutional policies and programs associated with high levels of student engagement and learning.

An effective accreditation plan is context specific: no one approach or template can do justice to the wide variety of institutional missions, curricula, and campus environments the plan is designed to address. However, two common early steps in developing an accreditation plan are to identify the assessment practices already in place and the data that are available, and then to augment this evidence through the self-study process.¹ Specific applications of student engagement information for accreditation vary, from minimal use, such as including the results in a self-study appendix, to systematic incorporation of NSSE results over time to demonstrate the impact of improvement initiatives. This toolkit provides suggestions for incorporating NSSE into accreditation processes and products, with an emphasis on mapping student engagement results to specialized accreditation standards.

NSSE and the ABET Accreditation Process

NSSE results can be used in many components of the ABET accreditation process. These include, but are not limited to, (a) the initial self-evaluation report that responds to evaluation criteria established by the accrediting body; (b) the visit by the team of peer evaluators who gather additional evidence; and (c) ongoing review and maintenance, which may include annual reports, annual summaries, and periodic reviews of strategic progress. For specific schools or departments of computer science hoping to use NSSE results in specialized program accreditation processes, it is valuable to understand the institutional strategy for administering and utilizing NSSE. The following sections suggest how computer science schools or programs may work with central administration to ensure appropriate timelines, ensure that enough computer science students are sampled, and encourage participation in specialized consortia. Even for departments that have no direct influence on decisions about NSSE participation, understanding general institutional policies will help determine how NSSE data can be used in the specialized accreditation process.

¹ Alstete, J.W. (Ed.). (2004). Why does accreditation matter? [Special issue]. ASHE-ERIC Higher Education Journal, 30(4), 1-26.

ABET COMPUTER SCIENCE ACCREDITATION TOOLKIT 1

Timeline for NSSE Administration

Institutions establish different timelines to meet their self-study objectives; for this reason, some schools administer NSSE on an annual or biennial basis. The appropriate NSSE participation cycle for your school depends on how you intend to use your data. Many institutions have found it valuable to have several years of NSSE results to establish a reliable baseline, and then to assess their students every few years to allow time for institutional changes to take effect. This planned administration cycle maximizes the use of student engagement data for most accreditation purposes. A substantial number of schools have gathered student engagement information multiple times, which suggests they may be comparing results over time to estimate areas in which student performance is changing. It may also indicate that some of these colleges are carefully monitoring student learning processes to track trends and to make certain that institutional performance remains at the desired level. Since the ABET accreditation cycle may not coincide with regional accreditation or other priorities that drive an institution's NSSE participation, ABET review committees will need to plan their analyses and use of NSSE data around existing institutional participation timelines.
Ideally, ABET committees will have input into institutional decisions about the frequency of NSSE participation, to establish baselines, as well as about whether to include an oversample of computer science students based on selected criteria.

Sampling Computer Science Students

Although all institutions are required to use the standard NSSE random sample, they have the option to add an oversample. Institutions can oversample the general population by increasing the percentage of all students in their random sample, or they can choose a targeted oversample, which lets them identify a specific group of students to include in the sample. (Note that an additional fee is charged for an oversample.) One of the key issues for computer science schools hoping to use NSSE in ABET accreditation is ensuring that enough computer science students are surveyed to yield valid results. Oversampling allows an institution to choose the most appropriate sample for its purposes. Schools will need to determine what criteria to use to identify computer science students. Oversampling seniors in computer science may be straightforward; identifying first-year computer science students may not be, since many have not yet declared a major. An institution may select all students registered in certain classes for its sample, or use any student who self-identifies as a computer science major in its analyses.

Forming a Consortium

Computer science schools or programs that wish to use NSSE data to support ABET accreditation may want to explore participation in a consortium: a group of six or more colleges or universities participating in NSSE that share comparative, aggregated data among their institutions. Consortia may also ask up to 20 additional questions that address unique characteristics of the member schools (all consortium institutions receive the same set of questions).
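The question raised under Sampling Computer Science Students (will enough computer science majors be surveyed for valid results?) can be reasoned about with a standard margin-of-error calculation. The sketch below is not NSSE's official methodology; the population size, target margin, and response rate are hypothetical placeholders to be replaced with your program's own figures.

```python
import math

def completed_surveys_needed(population, margin=0.08, z=1.96, p=0.5):
    """Completed responses needed so an item proportion carries roughly a
    +/- `margin` 95% confidence interval, with finite-population correction."""
    n0 = (z ** 2) * p * (1 - p) / margin ** 2           # infinite-population size
    return math.ceil(n0 / (1 + (n0 - 1) / population))  # finite-population correction

def invitations_needed(population, margin=0.08, response_rate=0.35):
    """Oversample invitations to issue, capped at a census of the population."""
    needed = completed_surveys_needed(population, margin)
    return min(population, math.ceil(needed / response_rate))

# Hypothetical department: 300 CS seniors, 35% expected response rate.
print(completed_surveys_needed(300))  # 101 completed surveys needed
print(invitations_needed(300))        # invite 289 of the 300 seniors
```

Since NSSE charges an additional fee for oversampling, a back-of-the-envelope estimate like this can help a department justify the oversample size it requests from central administration; for a very small population the function simply recommends a census.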
More information on forming a consortium is available on the NSSE Web site: www.nsse.iub.edu/html/consortia.cfm.

Special Analyses

Special analyses allow more detailed comparisons between your students and those attending other institutions. A useful special analysis for an institution seeking ABET accreditation might compare the responses of your senior computer science majors with those of students in similar programs at selected institutions. Additional fees are associated with special analyses. Find more information on the NSSE Web site at www.nsse.iub.edu/special_analysis/index.cfm.

Partner Surveys

Finally, NSSE offers two partner surveys: the Faculty Survey of Student Engagement (FSSE) and the Beginning College Survey of Student Engagement (BCSSE). FSSE measures faculty members' expectations of student engagement, provides information on how faculty spend their time on professorial activities, and highlights the kinds of learning experiences faculty emphasize. BCSSE measures entering first-year students' high school academic and co-curricular involvement, as well as their expectations and attitudes regarding participation in educationally purposeful activities in the upcoming year. These surveys may be helpful in supporting ABET standards related to faculty and students' expectations and outcomes.

Mapping NSSE to ABET Criteria

This toolkit is not intended as a formula for mapping NSSE results to specialized accreditation standards; rather, it is designed to encourage institutions to think more broadly about how these data can serve as evidence in support of specific standards. NSSE findings and benchmark scores may be used to support and document institutional improvement efforts, but they will be most meaningful when coupled with other measures of student learning outcomes from your campus. We offer suggestions on how institutions might use their NSSE data as indirect evidence. For example, NSSE survey item 1l, "Used an electronic medium (listserv, chat group, Internet, instant messaging, etc.) to discuss or complete an assignment," and item 7d, which asks students whether they plan to work, or have worked, on a research project with a faculty member, may correlate with the availability of computer technology and resources under Standard V, Laboratories and Computing Facilities, particularly in computer science and information systems. Furthermore, because item 1l addresses the frequency with which students use electronic resources to communicate with faculty or complete assignments, the pervasiveness of access to technology and the level at which an institution supports an effective technological infrastructure may have a demonstrable impact on student use of technology; Standard V-3 in particular ensures that faculty have adequate access to computing resources for class.

2008 NSSE Survey Items Mapped to ABET Criteria - Computer Science Programs
(Corresponding ABET criteria, where mapped, are shown in parentheses.)

1. Academic and Intellectual Experiences
a. Asked questions in class or contributed to class discussions
b. Made a class presentation (IV-15)
c. Prepared two or more drafts of a paper or assignment before turning it in (IV-16)
d. Worked on a paper or project that required integrating ideas or information from various sources
e. Included diverse perspectives (different races, religions, genders, political beliefs, etc.) in class discussions or writing assignments
f. Come to class without completing readings or assignments
g. Worked with other students on projects during class
h. Worked with classmates outside of class to prepare class assignments
i. Put together ideas or concepts from different courses when completing assignments or during class discussions
j. Tutored or taught other students (paid or voluntary)
k. Participated in a community-based project (e.g., service learning) as part of a regular course
l. Used an electronic medium (listserv, chat group, Internet, instant messaging, etc.) to discuss or complete an assignment (II-2, V-1, V-2, V-4, V-5, VII-2, VII-3)
m. Used e-mail to communicate with an instructor (II-2, III-9, V-1 - V-5)
n. Discussed grades or assignments with an instructor (II-2, III-9)
o. Talked about career plans with a faculty member or advisor (II-3, II-4, III-9)
p. Discussed ideas from your readings or classes with faculty members outside of class (II-2)

q. Received prompt written or oral feedback from faculty on your academic performance (II-2, II-3, II-4, III-9)
r. Worked harder than you thought you could to meet an instructor's standards or expectations
s. Worked with faculty members on activities other than coursework (committees, orientation, student life activities, etc.)
t. Discussed ideas from your readings or classes with others outside of class (students, family members, coworkers, etc.)
u. Had serious conversations with students of a different race or ethnicity than your own
v. Had serious conversations with students who are very different from you in terms of their religious beliefs, political opinions, or personal values

2. Mental Activities
a. Memorizing facts, ideas, or methods from your courses and readings so you can repeat them in pretty much the same form
b. Analyzing the basic elements of an idea, experience, or theory, such as examining a particular case or situation in depth and considering its components (IV-7)
c. Synthesizing and organizing ideas, information, or experiences into new, more complex interpretations and relationships (IV-7, IV-9)
d. Making judgments about the value of information, arguments, or methods, such as examining how others gathered and interpreted data and assessing the soundness of their conclusions (IV-14)
e. Applying theories or concepts to practical problems or in new situations (IV-7)

3. Reading and Writing
a. Number of assigned textbooks, books, or book-length packs of course readings
b. Number of books read on your own (not assigned) for personal enjoyment or academic enrichment
c. Number of written papers or reports of 20 pages or more
d. Number of written papers or reports between 5 and 19 pages
e. Number of written papers or reports of fewer than 5 pages

4. Problem Sets
a. Number of problem sets that take you more than an hour to complete
b. Number of problem sets that take you less than an hour to complete

5. Exams
Mark the box that best represents the extent to which your examinations during the current school year have challenged you to do your best work.

6. Additional Collegiate Experiences
a. Attended an art exhibit, play, dance, music, theater, or other performance
b. Exercised or participated in physical fitness activities
c. Participated in activities to enhance your spirituality (worship, meditation, prayer, etc.)
d. Examined the strengths and weaknesses of your own views on a topic or issue
e. Tried to better understand someone else's views by imagining how an issue looks from his or her perspective
f. Learned something that changed the way you understand an issue or concept

7. Enriching Educational Experiences
a. Practicum, internship, field experience, co-op experience, or clinical assignment (V-1 - V-5)
b. Community service or volunteer work
c. Participate in a learning community or some other formal program where groups of students take two or more classes together
d. Work on a research project with a faculty member outside of course or program requirements (V-1 - V-5)
e. Foreign language coursework
f. Study abroad
g. Independent study or self-designed major
h. Culminating senior experience (capstone course, senior project or thesis, comprehensive exam, etc.)

8. Quality of Relationships
a. Relationships with other students
b. Relationships with faculty members (II-2, II-3, II-4, III-9)
c. Relationships with administrative personnel and offices (II-3, V-5)

9. Time Usage
a. Preparing for class (studying, reading, writing, doing homework or lab work, analyzing data, rehearsing, and other academic activities)
b. Working for pay on campus
c. Working for pay off campus
d. Participating in co-curricular activities (organizations, campus publications, student government, fraternity or sorority, intercollegiate or intramural sports, etc.)
e. Relaxing and socializing (watching TV, partying, etc.)
f. Providing care for dependents living with you (parents, children, spouse, etc.)
g. Commuting to class (driving, walking, etc.)

10. Institutional Environment
a. Spending significant amounts of time studying and on academic work
b. Providing the support you need to help you succeed academically (II-3, II-4, III-9, V-1, V-2, V-4, V-5, VI-7, VI-8, VII-1 - VII-5)
c. Encouraging contact among students from different economic, social, and racial or ethnic backgrounds
d. Helping you cope with your non-academic responsibilities (work, family, etc.)
e. Providing the support you need to thrive socially
f. Attending campus events and activities (special speakers, cultural performances, athletic events, etc.)
g. Using computers in academic work (II-2, IV-1, IV-17, V-1, V-2, V-4, V-5, VI-7)

11. Educational and Personal Growth
a. Acquiring a broad general education (IV-3)
b. Acquiring job or work-related knowledge and skills (II-3, II-4)
c. Writing clearly and effectively (IV-16)
d. Speaking clearly and effectively (IV-15)
e. Thinking critically and analytically (IV-7)
f. Analyzing quantitative problems

g. Using computing and information technology (IV-3, IV-5, IV-6, IV-8, IV-9, IV-17, V-1, V-2, V-4, V-5, VI-7, VII-1 - VII-4)
h. Working effectively with others
i. Voting in local, state, or national elections
j. Learning effectively on your own
k. Understanding yourself
l. Understanding people of other racial and ethnic backgrounds
m. Solving complex real-world problems (IV-7, IV-17)
n. Developing a personal code of values and ethics (IV-17)
o. Contributing to the welfare of your community
p. Developing a deepened sense of spirituality

12. Academic Advising
Overall, how would you evaluate the quality of academic advising you have received at your institution? (II-3, II-4, III-9)

13. Satisfaction
How would you evaluate your entire educational experience at this institution?

14. Satisfaction
If you could start over again, would you go to the same institution you are now attending?
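For a self-study team, the mapping above is often easiest to work with when inverted: given an ABET criterion, which NSSE items provide indirect evidence for it? A minimal sketch follows, seeded with a few entries transcribed from the table; the dictionary keys are NSSE question numbers, and a real inventory would include the remaining rows.

```python
# Partial NSSE-item -> ABET-criteria index, transcribed from the mapping table.
NSSE_TO_ABET = {
    "1b":  ["IV-15"],                         # Made a class presentation
    "1c":  ["IV-16"],                         # Prepared two or more drafts
    "1n":  ["II-2", "III-9"],                 # Discussed grades or assignments
    "1o":  ["II-3", "II-4", "III-9"],         # Talked about career plans
    "8b":  ["II-2", "II-3", "II-4", "III-9"], # Relationships with faculty members
    "11a": ["IV-3"],                          # Acquiring a broad general education
    "12":  ["II-3", "II-4", "III-9"],         # Quality of academic advising
}

def items_supporting(criterion):
    """Return the NSSE items that provide indirect evidence for one ABET criterion."""
    return sorted(item for item, codes in NSSE_TO_ABET.items() if criterion in codes)

print(items_supporting("III-9"))  # ['12', '1n', '1o', '8b']
```

An index like this lets a review committee pull, for each criterion in the self-study, the survey items whose frequency distributions belong in the evidence appendix.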

Institutional Examples

Interest in using NSSE for specialized accreditation is growing across all sectors and types of institutions. Because NSSE focuses on student behaviors and effective educational practices, colleges and universities have found productive ways to incorporate survey results into their institutional self-studies and annual progress reports for ABET. In this section, we describe how selected institutions are using NSSE in ABET accreditation.

Michigan Technological University (MTU)
MTU used NSSE benchmark evidence to assess the effectiveness of its General Education program, which MTU described as also supporting critical standards that apply to the ABET accreditation process.

The Ohio State University
In the accreditation summary section of the Program Self-Study Report (2005-2006) for Materials Science and Engineering, prepared for the Engineering Accreditation Commission of ABET, Ohio State uses NSSE results as evidence to support institution-level, assessment-related criteria. Ohio State specifies goals and objectives for programs and identifies direct and indirect measures to determine whether those goals are being met. Ohio State will administer NSSE every three years to chart improvement on university-wide goals such as promoting academic excellence, enhancing the overall undergraduate experience, and maintaining a commitment to quality.

Pace University
Accredited by ABET in 2001, the BS in Information Systems program at Pace was created to align educational objectives with the institutional mission to provide opportunities that "prepare men and women for meaningful work, lifelong learning, and responsible participation in a new and dynamic information age."
NSSE results were used to assess learning outcomes that correlated with ABET standards for information systems programs, such as (1) an ability to function effectively on teams to accomplish a common goal; (2) an understanding of professional, ethical, and social responsibilities; and (3) an ability to communicate effectively with a range of audiences.

Recent Trends in Accreditation

The following trends in accreditation support the use of student engagement results in assessment and institutional improvement initiatives in specialized programs:

- Campuses and accrediting bodies are moving toward self-studies that systematically review existing processes over time (e.g., strategic planning, program evaluation, student services, and enrollment management), as contrasted with one-point-in-time reports that have limited utility.
- Regional and specialized accrediting bodies have shifted away from setting and holding institutions to rigid quantitative standards that feature inputs and resources, toward empirically based indicators of institutional effectiveness and student learning.
- Regional and program accreditors are emphasizing the importance of cultivating cultures of evidence that nurture and sustain continuous improvement.
- Progressive campus leaders are increasingly harnessing the regional and specialized program re-accreditation process as a vehicle for change. Rather than viewing the process as a burden or a hurdle to be overcome, presidents, provosts, and deans are using the self-study and team visit as an opportunity to stimulate productive dialogue and to guide constructive change.

Accreditation Tips

Tip #1: Student engagement results provided by NSSE are direct indicators of what students put into their education and indirect indicators of what they get out of it.

Tip #2: NSSE items can be used to analyze the resources, and to appraise the effectiveness, of the institution in fulfilling the mission and goals of individual specialized programs.
Two such measures, included in the educational gains items, are the extent to which students' experiences at the institution have (a) contributed to their knowledge, skills, and personal development in specific program areas, and (b) helped them develop a personal code of values and ethics. Assessment of these experiences may help demonstrate achievement of program mission and goals.

Tip #3: NSSE data are actionable; that is, they point to aspects of student and institutional performance that institutions can address through the curriculum, pedagogy, instructional emphases, and campus climate. In addition, because NSSE benchmarks allow a school to compare itself to others, the results often point to areas where improvement may be desired.

Tip #4: The Faculty Survey of Student Engagement (FSSE) measures faculty expectations of student engagement in educational practices that are empirically linked with high levels of learning and development. Taken together, NSSE and FSSE results can be used to identify areas of strength as well as aspects of the undergraduate experience that may warrant attention, and to stimulate discussions related to improving teaching, learning, and the quality of students' educational experience.

Tip #5: NSSE results can help assess the degree to which the institution encourages contact among students from different economic, social, and racial or ethnic backgrounds, and the extent to which students report that their experiences at the institution have contributed to their knowledge, skills, and personal development in understanding people of other racial and ethnic backgrounds. Results can also be used to demonstrate institutional effectiveness in responding to increasing diversity in society through educational and co-curricular programs.

Additional Information

Copies of this document, accreditation toolkits mapped to additional specialized and regional accreditation standards, and research reports related to NSSE data and accreditation are available on the NSSE Institute Web site: www.nsse.iub.edu/institute/index.cfm?view=tools/accred.

Criteria for Accrediting Computing Programs: Effective for Evaluations During the 2008-2009 Accreditation Cycle. ABET. Approved November 3, 2007. NSSE Toolkit updated June 2008.

NSSE Institute for Effective Educational Practice
Indiana University Center for Postsecondary Research
1900 East Tenth Street, Suite 419
Bloomington, IN 47406-7512
Phone: 812-856-5824
Fax: 812-856-5150
E-mail: nsse@indiana.edu
www.nsse.iub.edu/institute