UPR Libraries Evaluation Process: Identification of indicators and creation of tools for data collection

Julia Vélez, MLS, Ed. D.
Professional Accreditation and Evaluation, Academic Affairs Vice-President's Office, and Director of the Information and Technology Center, College of Natural Sciences, University of Puerto Rico, Río Piedras Campus

Abstract: During the UPR libraries evaluation process, one of the areas of most intensive work was the establishment of indicators for the ACRL Standards and the creation of tools for data collection. In this presentation we discuss the process used to select the indicators and the strategies used to encourage staff to use the indicators and tools developed. The experience shows how UPR librarians' skills were enhanced so that they could use qualitative and quantitative methods to gather data and demonstrate accomplishments with evidence. One result of the process is the creation of a data collection system that supports the gathering of performance data across the libraries. The discussion presents the changes that contributed to organizational performance and the mechanisms for integrating them into the daily work of the libraries.

Keywords: Library evaluation, Library assessment, Evaluation indicators, Qualitative and quantitative methodology, Library self-study, Library assessment culture, Systematic library evaluation

1. Introduction

For more than five years, the fourteen libraries of the University of Puerto Rico (UPR) have been conducting an evaluation process. The initiative is part of the university's Evaluation and Professional Accreditation Project. The assessment is done systematically, using the standards of the Association of College and Research Libraries (ACRL) (2004, 2005) and a set of indicators to show the results of the work, based on outcomes assessment. The ACRL standards cover twelve areas: planning, assessment, outcomes assessment, services, instruction, resources, access, human resources, facilities, communication and cooperation, administration, and budget.

The design is based on two stages: first, a summative evaluation, and second, a comparison between the units. The first stage has been carried out since 2005 and is composed of five phases: a study of the standards, preconditions, internal evaluation, external evaluation, and the integration of the two assessments to create an improvement plan (see Figure 1). The improvement plan rests on demonstrating the effectiveness of resources and services through a self-diagnosis and an external review by experts selected by ACRL across the UPR libraries. With the visit of the external evaluators selected by ACRL and UPR, the UPR libraries became the first library system to be evaluated and to obtain a certificate for having gone through the process of meeting ACRL's minimum standards.

Figure 1: UPR Libraries Evaluation Model

The evaluation process was developed in several steps aimed at identifying strengths, weaknesses, opportunities, and threats in order to generate an improvement plan for the areas that needed one and to initiate assessment initiatives. This involves collecting, organizing, and analyzing information in order to document operations, demonstrate effectiveness, and reaffirm the collaborative and participatory character of the UPR libraries. The internal and external evaluations were conducted using the ACRL Standards and the Indicators Guidelines of UPR Libraries (2006). One of the most controversial elements of the process was the creation and implementation of the indicators, tools, and evidence.

2. Creation of indicators and tools

In the first phase of the evaluation, it was found that the ACRL standards consist of a series of questions that help describe the areas covered by each standard, but they do not include measures that help determine whether the information provided to answer a question is based on evidence rather than on subjective description. For this reason, it was decided to create or identify a set of objective measures, known as indicators, that provide a clear and precise answer to each standard's question.
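Purely as an illustration of this structure (not a reproduction of the guide), the sketch below shows how a standard's question, its indicators, and the possible evidence could be represented in code; every name and value in it is hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Indicator:
    """An objective measure that helps answer a standard's question."""
    description: str
    evidence: list[str] = field(default_factory=list)  # possible instruments or documents

@dataclass
class StandardQuestion:
    """A question from an ACRL standard, with its associated indicators."""
    standard: str
    question: str
    indicators: list[Indicator] = field(default_factory=list)

# Hypothetical entry loosely modeled on question 6.1 (collection development criteria)
q = StandardQuestion(
    standard="6. Resources",
    question="Are there criteria for decisions on the acquisition, retention "
             "and use of printed, electronic and audiovisual resources?",
    indicators=[
        Indicator("A written, current collection development policy exists",
                  evidence=["policy document", "committee minutes"]),
        Indicator("Loans per capita",
                  evidence=["circulation statistics", "enrollment figures"]),
    ],
)

for ind in q.indicators:
    print(f"{q.standard} | {ind.description} | evidence: {', '.join(ind.evidence)}")
```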

One of the most difficult tasks during the evaluation process was the creation of the indicators and the selection of the methodologies and tools to demonstrate assessment results. A pilot group composed of six library assessment coordinators took on the task of creating or identifying indicators, methodologies, and tools that would answer the questions from the ACRL standards. The group was assisted by Dr. Mariano Maura, an expert in quantitative and qualitative methodologies applied to librarianship. Over the course of a year, the indicators and tools were created and the Indicators Guidelines of UPR Libraries (University of Puerto Rico, 2006) was completed.

A literature search was conducted to explore how to perform an evaluation process and which methodologies and tools were appropriate for gathering the data and completing the evaluation. Among the literature consulted were the standards for the Spanish library quality certificate (National Agency for Quality Assessment and Accreditation, 2004), the standards for Chilean university libraries (Council of Rectors of Chilean Universities, Advisory Commission on Libraries and Documentation, 2003), the Middle States Commission on Higher Education (2002), the Higher Education Council of Puerto Rico, Alonso Arévalo (2003), Alonso Arévalo, Echeverría Cubillas, and Martín Cerro (1999), Nelson and Fernekes (2002, 2005), Van House, Weil, and McClure (1990), Whitmire (2002), the Unit for Quality of the Andalusian Universities (2002), and others.

From the papers reviewed, a series of indicators that answered the ACRL questions was selected, together with a list of possible evidence that could show that each answer is based on facts. In some cases, indicators were created because none were found in the literature. To create them, several parameters were established: the criteria had to be accurate, consistent, and clear. The aim was to establish indicators that respond to the questions from the ACRL standards and that strictly measure what they are intended to measure. A question may have several indicators, but their number was limited in order to avoid confusion and simplify the process. The time needed to collect the information, relative to the time available for the assessment process, was also taken into consideration.

The Indicators Guidelines of UPR Libraries (University of Puerto Rico, 2006) is the product of this process and the document that guided the evaluation. It consists of 100 questions based on the ACRL standards, 151 indicators, and various possible instruments or evidence. Table 1 shows the distribution of questions and indicators across the ACRL Standards.

Table 1. Distribution of questions and indicators across the ACRL Standards

    ACRL Standard                      Questions    Indicators
    1.  Planning                            7             9
    2.  Assessment                          3             6
    3.  Outcomes Assessment                 8            11
    4.  Services                           10            20
    5.  Instruction                         9            11
    6.  Resources                           9            15
    7.  Access                              8            11
    8.  Human Resources                     7            17
    9.  Facilities                         11            16
    10. Communication & Cooperation         8             8
    11. Administration                      7             9
    12. Budget                             13            18
    Total                                 100           151

The 151 indicators were either adapted from the sources cited above or created by the UPR pilot group or the libraries' evaluation committees; roughly forty-five percent were newly created and fifty-five percent were selected from the literature review. For example, the indicators for Standards 1, 2, and 3 were newly created, while the Spanish and Chilean indicators were used for Standards 6 and 7. For instance, question 6.1 calls for criteria for making decisions about the acquisition, retention, and use of printed materials and electronic and audiovisual resources. Accordingly, collection development indicators from the Spanish standards were used: in-library use per capita, the rate of resource use, resource lending, loans per capita, and resources on loan per capita.
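The arithmetic behind these per-capita indicators is straightforward; the minimal sketch below, using invented counts, shows how they could be computed from a library's raw statistics.

```python
# Hypothetical raw counts for one library over one academic year
population = 3_250        # students, faculty and researchers served
total_loans = 18_400      # items lent during the period
in_library_uses = 51_000  # recorded in-house uses of the collection
items_on_loan = 1_150     # items out on loan on the sampling date

# Per-capita indicators: each raw count divided by the population served
loans_per_capita = total_loans / population
uses_per_capita = in_library_uses / population
on_loan_per_capita = items_on_loan / population

print(f"Loans per capita:             {loans_per_capita:.2f}")
print(f"In-library uses per capita:   {uses_per_capita:.2f}")
print(f"Resources on loan per capita: {on_loan_per_capita:.3f}")
```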
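Another example is Standard 7, Access. Question 7.1 considers access to the catalog and other library resources, and whether they are available on campus and beyond. Indicators from the Spanish standards were used: the rate of resource use per user, availability of titles, resource recovery, and availability of requested titles. For Standard 5, on instruction and information skills, the indicators are based on the Information Literacy Competency Standards for Higher Education (Association of College and Research Libraries, 2000), which include clear, predefined indicators.

Another element included in the guide for each indicator is a list of possible techniques and quantitative or qualitative methodologies for collecting data, or documents that can provide the answer to the question. Among the methodologies and techniques recommended were focus groups, interviews, questionnaires, flow charts, observation, and the collection of statistical data. Other instruments were set up as column sheets, data analysis matrices, schedules, questionnaires, and forms for collecting specific data. Each library made its own selection, based on the time available for implementation and the library's readiness to use the chosen tool or methodology.

The instrument selected most often was a user satisfaction questionnaire, as most of the standards and indicators involve questions that require input from users. This led to the choice of either using an existing questionnaire or creating one. We reviewed several existing questionnaires and considered the LibQUAL+ questionnaire, but at the time we needed it the instrument was still being translated into Spanish. Questionnaires were therefore developed for students, for teachers and researchers, and for library staff, based on the ACRL questions and the selected indicators. The process involved analyzing the indicators that required this kind of input, reviewing model questionnaires from other libraries, and hiring an expert in questionnaire design, who created the instruments together with the pilot group. The questions covered issues such as services, facilities, and physical resources from several perspectives: how users perceive the services and how the staff works.

Another method used to obtain input from users was the focus group. This methodology was used only on the largest campus, where, because of the campus's complexity, implementing the questionnaires proved too difficult. The focus groups were designed by a group of specialists in business administration and social psychology with expertise in the technique. Focus groups were designed for teachers, researchers, and students; in addition, questionnaires were created for services.

As an illustration of how the responses gathered through such questionnaires could be summarized by service area, the sketch below assumes a 1-to-5 satisfaction scale and invented responses; the areas, ratings, and the threshold for counting a respondent as satisfied are all assumptions, not part of the actual instruments.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical responses: (service area, rating on a 1-5 satisfaction scale)
responses = [
    ("reference service", 4), ("reference service", 5), ("reference service", 3),
    ("facilities", 2), ("facilities", 3), ("facilities", 4),
    ("electronic resources", 4), ("electronic resources", 4),
]

# Group the ratings by service area
by_area = defaultdict(list)
for area, rating in responses:
    by_area[area].append(rating)

# Report the mean rating and the share of satisfied respondents (4 or 5) per area
for area, ratings in sorted(by_area.items()):
    satisfied = sum(r >= 4 for r in ratings)
    print(f"{area}: mean {mean(ratings):.2f}, "
          f"{100 * satisfied / len(ratings):.0f}% satisfied (n={len(ratings)})")
```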

3. Implementation Results

The pilot group ran a test to verify the applicability of the indicators and instruments at each library. After several meetings in which the pilot group and the other library evaluation coordinators discussed the applicability of the indicators and instruments, a decision was reached about which indicators could be used and which should be removed. As a result, a considerable share of the indicators with formulas was not used, because they required many changes in daily work procedures. For example, Standards 6 (Resources) and 7 (Access) presented many difficulties because they involved drastic changes in how the daily work is performed, or required complex formulas to be applied to the daily work routine. Standard 8, Human Resources, departed drastically from what was stipulated in the guide, because the indicators selected from the literature and other models did not take into account the staff regulations of the Government of Puerto Rico, the rules of the UPR, and the stipulations of the labor unions.

The implementation of the guide was arduous and difficult because, in the end, we had a very complex document to be used by a heterogeneous group of libraries with diverse academic focuses, different sizes, and varied resources. The first obstacle to overcome was developing the knowledge needed to use the required methodologies, tools, and processes. Among the strategies used were professional training and consultation with experts in areas such as questionnaire design, focus groups, planning, observation, and rubrics. Another strategy was that the libraries that had accomplished the task and applied the indicators mentored those that were furthest behind.

Some indicators whose data came from the questionnaires could not be used, owing to limitations in the administration of the instrument. Because the data were not considered reliable, the decision was made not to include them in the final evaluation report. Of all the questionnaires created, the one that provided the best data was the staff questionnaire. Another source of confusion was having multiple indicators for a single question: people focused on the first indicator and left the others out.

The use of other methodologies entailed hiring experts or companies to do the work, because the library staff did not have the knowledge to do it. One limitation was the economic factor, since this cost had not been allocated in the libraries' budgets; the costs were covered with special allocations from the Professional Accreditation and Assessment Office of the Vice-President for Academic Affairs. Focus groups took place only on one very complex campus where it was difficult to implement the questionnaires; they collected the input from users, teachers, and researchers needed to answer many of the questions. The evaluation process depended heavily on the assistance offered by the Deans of Academic Affairs of each unit and on the library staffs, who made the greatest efforts to carry the evaluation through.

Another obstacle was the identification, location, and organization of the documentation required to demonstrate the achievement of results. This process cost time and effort, and at times the staff had to devote extra time to the self-study because they lacked a foundation and experience in assessment and appraisal. In the end, all this effort was very favorable: the implementation period of the Indicators Guidelines of UPR Libraries (University of Puerto Rico, 2006) concluded with a self-study document and the internal evaluation report for submission to the external evaluators.

4. Conclusion

The evaluation process in the UPR libraries was accomplished even though the staff did not initially have the knowledge and experience for the task; these were acquired in practice. The start of the assessment and evaluation processes was hard, but the results were rewarding. Fourteen libraries worked together to develop the assessment process and managed to create and use indicators, methods, and tools.

The experience served to improve library services and resources across the UPR. The selection and development of the indicators began with a review of the literature on library assessment initiatives, which guided our experience. At one point there was a very large number of indicators, but the process of discussion and implementation by the pilot group refined them before the guide was rolled out to all units. This led to an implementation using only the 100 questions based on the ACRL standards, 151 indicators, and various possible instruments or evidence.

The development of the Indicators Guidelines of UPR Libraries (University of Puerto Rico, 2006) was achieved thanks to all the members of the Library Evaluation Committee, who put in additional time and effort to achieve the objective. The implementation of the indicators would have benefited from a longer trial period in order to adapt them to the peculiarities of each library. Another aspect that affected the implementation was the lack of knowledge and experience in handling the technical, quantitative, and qualitative methodologies. This mostly affected the implementation of the questionnaires: although general instructions were given, many units selected their samples and administered the instrument as they saw fit, without following the guidelines. In addition, data analysis and reporting is an area that needs to be strengthened.

Although the work involved was demanding and at times unpleasant, it offered the opportunity to learn new skills, review procedures, and update existing documentation. The process led the library staff to examine the way they were working and to recognize the need for changes, adding qualitative and quantitative methodologies to obtain data that show how the work is being carried out and the benefits of the services.

One sign of the mark left by the evaluation process is the staff's request to continue working steadily to generate or enhance the mechanisms for evaluation and appraisal. An example is the creation of a centralized Library Information Gathering System (SAIB, for its Spanish acronym) as a mechanism for integrating standardized data collection on the work of the libraries into daily routines. This will prompt a review of working procedures, the data collected will demonstrate the work done, and the libraries will be able to generate the series of reports needed by the university administration.
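The paper does not describe SAIB's internal design. Purely as a hypothetical sketch of the idea, a centralized system of this kind could store standardized indicator values per library and period, and roll them up into the reports the administration requires.

```python
import sqlite3

# Hypothetical schema for centralized, standardized data collection
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE library_stats (
        library   TEXT,  -- reporting unit
        period    TEXT,  -- e.g. '2006-09'
        indicator TEXT,  -- standardized indicator name
        value     REAL
    )
""")

# Invented monthly figures reported by two units
conn.executemany(
    "INSERT INTO library_stats VALUES (?, ?, ?, ?)",
    [
        ("Natural Sciences", "2006-09", "loans", 1420),
        ("Natural Sciences", "2006-09", "reference_queries", 310),
        ("Humanities", "2006-09", "loans", 980),
    ],
)

# System-wide report: totals per indicator across all units
for indicator, total in conn.execute(
        "SELECT indicator, SUM(value) FROM library_stats GROUP BY indicator"):
    print(f"{indicator}: {total:.0f}")
```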

Another contribution of the first stage is that the assessment working group has been transformed into a community of practice for assessment and appraisal. It no longer acts as a committee but as a group that interacts in a structure that is not imposed, with flexible roles and an interest in continuing the appraisal and assessment processes on an ongoing basis.

5. References

Alonso Arévalo, J. (2003). Evaluación de bibliotecas universitarias con el modelo EFQM. Associação Portuguesa de Bibliotecários, Arquivistas e Documentalistas. Available at: http://eprints.rclis.org/1605/2/lisboa5.pdf

Alonso Arévalo, J., Echeverría Cubillas, M. J., & Martín Cerro, S. (1999). La gestión de las bibliotecas universitarias: indicadores para su evaluación. Seminario sobre Indicadores en la Universidad: información y decisiones, 1(99), 1-12. Available at: http://www.rebiun.org/opencms/opencms/handle404?exporturi=/export/docreb/biblio_alonsoarevalo_otros.pdf

Agencia Nacional de Evaluación de la Calidad y Acreditación. (2004). Manual de procedimiento para la emisión del informe conducente a la obtención del certificado de calidad para los servicios de biblioteca: convocatoria 2004. Available at: http://www.aneca.es/publicaciones/docs/manual_bibliot_092004.pdf

Association of College and Research Libraries. (2000). Information Literacy Competency Standards for Higher Education. Available at: http://www.acrl.org/ala/mgrps/divs/acrl/standards/standards.pdf

Association of College and Research Libraries. (2004). Standards for Libraries in Higher Education. Available at: http://www.ala.org/ala/mgrps/divs/acrl/standards/standardslibraries.cfm

Association of College and Research Libraries. (2005). Guidelines for University Library Services to Undergraduate Students. Available at: http://www.ala.org/ala/mgrps/divs/acrl/standards/ulsundergraduate.cfm

Consejo de Rectores de Universidades Chilenas, Comisión Asesora de Bibliotecas y Documentación. (2003). Estándares para bibliotecas universitarias chilenas (2ª ed.). Available at: http://cabid.ucv.cl/files/estandares/standares.pdf

Consejo de Educación Superior de Puerto Rico (CES). Procedimientos para solicitar licencia: autorización o renovación. Available at: http://www.gobierno.pr/cespr/licenciamientoinstituciones/procedimientos+para+solicitar+lic+autorizacion+renovacion.htm

Middle States Commission on Higher Education. (2002). Characteristics of Excellence in Higher Education: Eligibility Requirements and Standards for Accreditation. Available at: http://www.msche.org/publications/characteristicsbook050215112128.pdf

Nelson, W. N., & Fernekes, R. W. (2002). Standards and assessment for academic libraries: A workbook. Chicago: Association of College and Research Libraries.

Nelson, W. N., & Fernekes, R. W. (2005). Who uses ACRL standards? Gauging the use of standards for libraries in higher education. College & Research Libraries, 66(5), 359-364.

Unidad para la Calidad de las Universidades Andaluzas. (2002). Guía EFQM para la autoevaluación de bibliotecas universitarias. Cádiz: Unidad para la Calidad de las Universidades Andaluzas.

Universidad de Puerto Rico. (2006). Guía de Indicadores de las Bibliotecas UPR (Indicators Guidelines of UPR Libraries) [unpublished document]. San Juan, PR: Universidad de Puerto Rico.

Van House, N., Weil, B., & McClure, C. (1990). Measuring academic library performance: A practical approach. Chicago: ALA.

Whitmire, E. (2002). Academic library performance measures and undergraduates' library use and educational outcomes. Library & Information Science Research, 24(2), 107-128.