The Integration of Assessment of Student Learning Outcomes and Teaching Effectiveness


University of South Florida Scholar Commons
School of Information Faculty Publications, Fall 2002

The Integration of Assessment of Student Learning Outcomes and Teaching Effectiveness
Authors: Anna H. Perrault, Vicki L. Gregory, and James Carey

Scholar Commons Citation:
Perrault, Anna H.; Gregory, Vicki L.; and Carey, James, "The Integration of Assessment of Student Learning Outcomes and Teaching Effectiveness" (2002). School of Information Faculty Publications. 15. http://scholarcommons.usf.edu/si_facpub/15

This article is brought to you for free and open access by the School of Information at Scholar Commons. It has been accepted for inclusion in School of Information Faculty Publications by an authorized administrator of Scholar Commons. For more information, please contact scholarcommons@usf.edu.

SHAPING NEW STRATEGIES:
THE INTEGRATION OF ASSESSMENT OF STUDENT LEARNING OUTCOMES AND TEACHING EFFECTIVENESS

by Anna H. Perrault, Vicki L. Gregory, and James O. Carey
School of Library and Information Science, University of South Florida

Abstract

This paper reports the conceptual phase of the USF School of Library and Information Science's participation in the Pew Trust Challenge Grant project Innovations in Faculty Work Life, administered by the Harvard Graduate School of Education. The SLIS was selected as one of nine departments at USF to participate in a Pilot Study for Alternate Measures of Teaching Effectiveness, one of two foci in the USF Harvard Challenge Grant project. The SLIS developed student learning outcomes goals for the graduate program in library and information science. The goal for teaching is formulated in programmatic objectives for student learning outcomes and assessment measures for those objectives. These are reported along with a proposed model for the integrated assessment of student learning outcomes and teaching effectiveness.

Purpose

The School of Library and Information Science at the University of South Florida is engaged in a Pilot Study on Alternate Measures of Teaching Effectiveness as a module of the USF Harvard Grant Project, Innovations in Faculty Work Life. The SLIS is the only unit in the USF Harvard Challenge Grant Project that is concentrating on the assessment of student learning outcomes and teaching effectiveness for graduate professional curricula. This effort fits the ALISE 2000 National Conference themes of:

- developing strategies for excellence and innovation in teaching and learning
- meeting the challenges of accountability
- meeting the challenges of the changing nature of promotion and tenure

Background

In 1997, in response to ongoing faculty concerns about the adequacy of student evaluations to fully measure teaching effectiveness, the USF Provost appointed a task force from across

disciplines to consider the evaluation of teaching. The Committee on the Importance of Teaching did not believe it was possible, or desirable, to articulate uniform definitions or practices of effective teaching applicable across all disciplines. It did, however, identify several principles that ought to guide the assessment of quality teaching:

- Teaching is a multifaceted process that cannot be measured by a single instrument. Effective teaching practice varies between fields and individuals. As a result, a wider range of data should be considered in the assessment of teaching than student evaluations alone.
- Student learning outcomes should be a primary criterion of teaching effectiveness. Departments and faculty should identify their learning goals and demonstrate that those objectives are being met in teaching practice.
- Student learning is more than the sum of its parts. Students' learning experiences transcend individual courses and reflect the cumulative experience of their major and general education courses. As a result, the assessment of teaching effectiveness should include departmental and programmatic units as well as individual courses. Measures of teaching effectiveness should include the extent to which faculty and departments contribute to the larger unit's goals, as well as fulfilling the objectives of individual courses.
- The results of teaching assessment have multiple purposes: enhancement of individual faculty teaching, improvement of program development and delivery, and the provision of a broader base of material on which to base tenure and promotion decisions.

In 1998, USF began participation in Innovations in Faculty Work Life, a Pew Trust Challenge Grant project administered by the Harvard Graduate School of Education.
USF is one of seven institutions in the project, along with Olivet College, the University of North Carolina, Arizona State University, the University of South Dakota, Tulane University, and Seattle Pacific University. When USF was selected to participate in the Harvard Challenge Grant project, it was decided that the Pilot Study on alternate measures of effective teaching would become one of the foci of the project and would build upon the efforts and recommendations of the Committee on the Importance of Teaching. [i] The initial meetings for the teaching effectiveness group took place in March/April 1999. From these meetings the following Guiding Principles were formulated for the Pilot Study:

1) The pilot departments/units will incorporate a broad range of data in their faculty evaluation process that reflects the multifaceted nature of effective teaching.

2) The pilot departments will focus on student learning outcomes as a primary, but not necessarily exclusive, criterion of teaching effectiveness. The faculty members will first identify the primary goals both for their units and for individual courses. Teaching effectiveness will be measured and demonstrated, at least in part, in relation to these identified student outcomes.

3) Because student learning occurs across courses throughout a student's major, pilot departments will articulate unit-wide learning outcomes and measures of teaching effectiveness that reflect the unit's unique qualities. The pilot departments will create frameworks for evaluating individual faculty members on their contributions to the unit-wide goals, as well as their individual course goals.

4) The pilot assessment measures should produce data that can be used effectively to enhance individual faculty teaching practices and serve as a means to review and revise the delivery of a department-wide curriculum.

Process: USF SLIS Pilot Study

There was a confluence of factors in 1998, all driving reconsideration and revision of the School's mission and goals statement:

- USF designated a Florida Research 1 institution (USF is a Carnegie Research II)
- ALA Congress on Professional Education
- USF Pilot Study for Alternate Means of Teaching Effectiveness
- ALA COA accreditation in spring 2002

SLIS saw the Pilot Study for alternate measures of teaching effectiveness as an opportunity to integrate a number of planning initiatives. After selection by the Provost's office as one of nine units for participation in the Harvard Challenge Grant project, the SLIS curriculum committee agreed that reconsideration and revision of the School's mission and goals statements, to identify unit-wide goals on which to base assessments of student learning outcomes and teaching effectiveness, would begin in fall 1999.

Student Learning Outcomes Assessment

Although research on the intended outcomes of education can be dated to the landmark work of Benjamin Bloom and his colleagues, The Taxonomy of Educational Objectives, [ii] it was not until the 1980s that assessment became a topic for educational research. And it was not until the latter 1980s that the focus of assessment became accountability.
A number of practical projects to assist in the assessment process and the formulation of goals and objectives for student learning outcomes have taken place. Two prominent leaders in classroom assessment research have been Thomas A. Angelo and K. Patricia Cross, who began developing a Teaching Goals Inventory in 1986 as a first step in classroom assessment. The first edition of Classroom Assessment Techniques: A Handbook for Faculty [iii] was published in 1988. Over the next four years their work as the Classroom Research Project was supported by the Harvard Graduate School of Education, the Ford Foundation, and the Pew Charitable Trusts.

They found research on student learning outcomes objectives, but a lack of research or attention to helping educators think about what should happen, i.e., goals for student learning outcomes. A second edition (1993) reflected the work of the Classroom Research Project. [iv]

The Pew Charitable Trusts have funded a number of projects for quality assurance practices in higher education. The American Association for Higher Education (AAHE) began its Forum on Faculty Roles & Rewards in the early 1990s. In the latter half of the 1990s, the literature and conferences began to focus on the role of teaching faculty in the assessment process, both unit-based and at the institutional level, and on the role of the assessment process in the evaluation of teaching. Pervasive changes in the higher education environment, such as funding and distance learning, have led to serious consideration of the changing roles of faculty in higher education.

In a study conducted under a grant from the Pew Charitable Trusts, Wergin and Swingen's purpose was "to search for evaluation policies and practices that encourage constructive change in departments and a stronger culture of collective responsibility there. Our goal was to see how these models worked, what seemed most critical to their success, and how key ideas might be applied to other settings." [v] The researchers classify evaluation or assessment processes into five categories:

1. Program review. Nearly all institutions conduct program reviews, usually on a cyclical schedule. Typically these reviews consist of a department evaluating its strengths and weaknesses through a self-study; there is a plan for improvement, an external panel visit and report, and subsequently oversight of the follow-up to the program review. The focus is backward rather than forward. The process often unfolds in a way that encourages participants to get through it with a minimum of aggravation. The opportunity for critical reflection is lost in the desire to get the thing done.

2. Outcomes assessment. Assessment has become common more recently, dating back to the late 1980s. The assessment process, which emphasizes accountability, has been adopted by most regional accrediting associations. While program review focuses on the academic department, institutional assessment programs cut across the educational mission at various levels.

3. Specialized accreditation. There are now over one hundred specialized or professional accrediting agencies. Increasingly, these agencies have revised their standards and review procedures to focus on how well a program meets its learning goals in light of the individual institution's mission, rather than on rigid, nationally normed standards.

4. Financial accounting initiatives.

As the heading implies, these are reviews based on models for allocating resources. More recently developed models, such as the Stanford Cost Model and RCM (responsibility centered management), focus on locating budgetary responsibility at the unit level.

5. Internal quality assurance. In models 1-4, the stimulus for the review or assessment is external to the department, as is the accountability. The internal quality assurance model seeks to develop a more internalized sense of responsibility for quality.

According to Wergin and Swingen, campus practices of quality assurance at the departmental level usually suffer from two debilitating problems:

1. Most departments and faculty do not see the relevance of such practices to the work they do. The notion of continuous quality improvement hasn't taken hold. Further, faculty view institutional measures of quality as off the mark, as not congruent with what their own definitions of quality might be. Consequently, most program review and outcomes assessment exercises have only marginal impact.

2. There is little coordination at most institutions among assessment, program review, and external accreditation. [vi]

The researchers conclude that at the institutional level, demands for unit accountability should focus less on accountability for achieving results and more on how well units conduct evaluations for themselves and use the data these evaluations generate. The study found several examples of continuous quality improvement at the departmental level, but few seemed able to make the idea work. [vii] Faculty are typically rewarded according to standards of quality dictated by their disciplines, not by standards specific to their institutions or departments. [viii]

Wergin and Swingen propose that "What is needed is a different kind of culture, one in which faculty members are able to focus their efforts on activities which best draw upon their own skills, talents, interests, and experience, and which allows them to negotiate with their colleagues how they might best use these strengths to contribute to the work of the department. Thus, some faculty might put relatively more effort into research, others into teaching, still others into institutional and professional service; and these areas of emphasis could shift throughout the course of a career. But this kind of restructuring will be possible only when institutions shift the unit of analysis from the individual to the academic department, when the work of the department is evaluated as a whole, rather than simply as the aggregate of individual faculty accomplishments, and when faculty are evaluated according to their differential contributions to the group." [ix]

Hatfield, in "Department Level Assessment: Promoting Continuous Improvement," says the goal of a continuous improvement initiative is for a department to become self-regarding, self-monitoring, and self-correcting. The result is that the department is confident of the quality of its graduates' knowledge and skills before the students enter the work force. Finding out how students have fared years after graduation isn't good enough. Continuous improvement means that the correction cycle is shorter because progress is constantly being monitored. Deficiencies are identified and corrected while the student is still on site, and corrections are made to department-level processes to keep similar deficiencies from occurring. [x]

Department-level assessments must be self-renewing. Assessment data and information must feed back into the system, at both the university and departmental levels. A department-level assessment plan should identify the mission of the department, goals related to that mission, activities or processes supporting the achievement of the goals, and a number of measures which, when taken together, indicate the degree to which each goal is being achieved. Implementing the plan requires the collecting, analyzing, and benchmarking of data, revision of processes, and communication of results. [xi]

These assessment processes have been incorporated into a number of the regional accrediting associations' reporting procedures. The ALA COA looks for a continuous planning process. Might a continuous improvement process for programmatic effectiveness in reaching teaching, research, and service goals be a more efficacious system of assessment and accountability?

USF SLIS Model

The SLIS began with revision of the School's mission and goals statements. Since the last revision of the mission and goals in 1993-1994, in preparation for the last ALA COA reaccreditation in spring 1995, the University of South Florida has adopted a Vision and Values statement. The School's mission statement now incorporates the Vision and Values statement. [xii] Thus, the SLIS faculty felt that the School's mission statement could more specifically focus upon the broad mission of educating students for career and leadership roles in the library and information professions.

A matrix for the student learning outcomes model was developed which begins with the University mission and proceeds through the SLIS mission to department intended outcomes and objectives. An example of the model with one SLIS instructional goal area and its student learning outcomes objectives is shown in Appendix A. As a working matrix for the student learning outcomes, a content structure for Goal I: Teaching was developed. This matrix outlines six content goal areas with the objective areas for each. These are not meant to correlate directly with specific courses; rather, they outline the knowledge areas every student should master in the SLIS program. The threads of Accepted Practices; Trends and Issues; and Networks, Systems, and Technology are shown as pervading all content areas. The content matrix is shown in Appendix B.

The implementation of the model revolves around a very simple reporting form for each objective (Appendix C). The form provides for the recording of the assessment measures used; brief results; and, most importantly, in the last blank, the use of the results to improve the instructional program. It is in the use of the results that the student learning outcomes assessment data are applied to teaching effectiveness. The assessment process should be a continuous improvement process in which the program is evaluated for reaching its primary goal: educating students to become professional leaders. If the program is meeting its goals, then on that level the faculty are contributing effectively as a faculty of the whole.

Significance

What is different about this approach? It emphasizes student learning outcomes at the unit level; in other words, assessment of student learning outcomes at the exit (graduation) point and several years into practice. This approach emphasizes the preparation of students for careers in the information professions, and thus the major focal point of assessment is the cumulative learning of the student: outcomes of the program rather than of individual courses. Student evaluations of individual courses have been regarded as summative evaluation, the last and only input from students, and only at the individual course level. The emphasis on student learning outcomes at the programmatic level turns individual course evaluations into formative evaluations. They are only pieces of a much larger picture. The programmatic assessment of student learning outcomes becomes the summative evaluation. The crucial matter is how effective the program is in reaching its student learning outcomes goals. Individual courses contribute to the programmatic goals, but programmatic goals may not necessarily be achieved despite the excellence of many individual courses. In turn, the emphasis on programmatic effectiveness downplays or eases the pressure on individual instructors with regard to course-level evaluations.
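The chain from goal area to per-objective reporting form lends itself to a simple data model. The sketch below is a hypothetical illustration only: the field names follow the blanks of the reporting form described above (measures used, brief results, use of results), but the class names and sample objectives are invented for illustration and are not taken from the paper's appendices.

```python
from dataclasses import dataclass, field

@dataclass
class ObjectiveReport:
    """One per-objective reporting form: measures, results, use of results."""
    objective: str
    assessment_measures: list[str] = field(default_factory=list)  # measures used
    brief_results: str = ""   # summary of findings
    use_of_results: str = ""  # how findings feed back into the instructional program

@dataclass
class ProgramGoal:
    """A programmatic goal area (e.g. Goal I: Teaching) and its objectives."""
    title: str
    objectives: list[ObjectiveReport] = field(default_factory=list)

    def unassessed(self) -> list[str]:
        # Objectives with no recorded measures mark gaps in the improvement cycle.
        return [o.objective for o in self.objectives if not o.assessment_measures]

# Illustrative use with hypothetical objectives:
goal = ProgramGoal("Goal I: Teaching", [
    ObjectiveReport("Apply accepted professional practices",
                    assessment_measures=["exit portfolio review"]),
    ObjectiveReport("Evaluate networks, systems, and technology"),
])
print(goal.unassessed())  # -> ['Evaluate networks, systems, and technology']
```

A structure like this makes the "use of the results" blank a first-class piece of data, so a department can query which objectives have no measures or no recorded follow-up, which is the self-monitoring behavior the continuous improvement model calls for.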
The programmatic student learning outcomes goals address the external concerns of accountability: to the institution and the profession for the effectiveness of the program, and to the graduates for their preparation for a professional field. The evaluation of individual instruction at the course level should be an internal concern; that is, the school should be responsible for the quality of individual courses and instruction. The School reports on, and is accountable for, the overarching goals of program effectiveness in meeting student learning outcomes goals. Boards of education and institutional administrations have misplaced that accountability by micro-managing it down to the individual course level.

This approach emphasizes assessment of teaching effectiveness at the unit level, i.e., how well a faculty member contributes to the cumulative learning outcomes of students. It ties faculty teaching effectiveness to unit effectiveness: the assessment of student learning outcomes. Thus, the programmatic assessment becomes an alternate measure of teaching effectiveness.

The development of alternate measures for the assessment of student learning outcomes and teaching effectiveness by the USF SLIS began as a component of a university-wide initiative from the Provost's office at USF. The model developed by the USF SLIS for graduate-level professional program assessment and continuous improvement can serve as an example which other education programs in the library and information science fields may choose to use as a model for a different culture of assessment: one of accountability to graduates, employers, and the sponsoring institution. The integrated model can serve as a catalyst for changing the nature of faculty work life from that of individual instructor to that of team member sharing responsibility for the cumulative learning of students in the program.

References

i. The report from the Committee on the Importance of Teaching is available on the web: http://acad.usf.edu/cvisot/teachindex.html

ii. Benjamin Bloom, et al. Taxonomy of Educational Objectives. Vol. 1: Cognitive Domain. (New York: McKay, 1956).

iii. Thomas A. Angelo and K. Patricia Cross. Classroom Assessment Techniques: A Handbook for Faculty. (Ann Arbor: National Center for Research to Improve Postsecondary Teaching and Learning, University of Michigan, 1988).

iv. Thomas A. Angelo and K. Patricia Cross. Classroom Assessment Techniques: A Handbook for College Teachers. 2nd ed. (San Francisco: Jossey-Bass Publishers, 1993).

v. Jon Wergin and Judi N. Swingen. Evaluating Academic Departments: Best Practices, Institutional Implications. In press. (Washington, DC: American Association for Higher Education, 2000).

vi. Ibid.

vii. Ibid.

viii. Jon Wergin. "Assessment of programs and units," in Architecture for Change, Proceedings of the AAHE Assessment Conference, June 13-17, 1998, Cincinnati, OH: 59-65, p. 62.

ix. Ibid.

x. Susan R. Hatfield. Department Level Assessment: Promoting Continuous Improvement. IDEA Paper #35, May 1999 (IDEA Center, Kansas State University): [1].

xi. Ibid.

xii. http://usfweb.usf.edu/usfpres/vis-val.htm