ONGOING PROGRAM EVALUATION: A NEW APPROACH*


LEE ANNE JOHNSTON
Education Advisor
Cégep Heritage College

Undertaking a comprehensive program evaluation that addresses all six required evaluation criteria (CEEC, 1994a) within an academic year requires extensive and dedicated CEGEP resources. It represents a huge undertaking for most CEGEPs as well as a time-consuming task for faculty and professional staff. Recognizing the importance of program evaluation to program excellence and student success, Heritage College felt compelled to look for a suitable and sustainable approach that respected the guidelines and requirements of the Commission d'évaluation de l'enseignement collégial (CEEC). The college came to the conclusion that the most manageable approach was to conduct program evaluation on an ongoing basis. Following collaborative committee work between 2007 and 2009, we revised Policy #17, our institutional policy concerning program evaluation, currently posted on our website.1 Its focus is on annual program evaluation activities that lead to a comprehensive program evaluation report every seven years. The responsibility for conducting the ongoing program evaluations rests with each program committee, which reports annually to the Academic Dean. This paper takes a closer look at the rationale for taking an ongoing approach to program evaluation, our progress to date, and what we hope to achieve in the next few years.

OUR CONTEXT

Heritage College is a small English-language college located in the Outaouais. We presently offer twelve regular education programs, four of which are pre-university and eight, career. We register approximately 1,000 students and employ about 120 faculty, three education advisors, and four academic advisors. Our particular context as a small CEGEP has led us to reflect on several aspects of program evaluation.

CEGEPs develop their programs locally based on ministry curricular documents and standards, and are responsible for evaluating them (CEEC, 1994b). To ensure a degree of standardization, the CEEC provides guidelines (1994a), which CEGEPs implement in their local contexts. The guidelines require all programs to address six evaluation criteria: program relevance, program coherence, value of teaching methods and student supervision, program resources, effectiveness, and quality of program management. To implement program evaluation, CEGEPs are expected to prepare an institutional policy concerning program evaluation stipulating how, by whom, and when program evaluation will be carried out. At Heritage, this is Policy #17 concerning Program Evaluation which, prior to 2009, endorsed the comprehensive evaluation of each program every ten years.

Successful programs pay close attention to their contexts, their target groups, their resources, and the opportunities that lie ahead for their graduates, be it in the workplace or higher education. In our ever-changing world, these factors are in constant flux. Today's freshman classes are a good example of recent changes; not only has their formative preparation been very different under the Quebec Education Program reform, but they are truly born of the Net Generation. Their lives are "wired"; they trust media and value connectivity. Understanding the capabilities of these students and learning how to work with them to ensure their successful graduation is a significant factor in program excellence.
At the same time, as CEGEPs are increasingly being asked to do more with less, resource optimization is a critical factor, making it imperative that CEGEPs work in the most efficient ways possible. Finding local program evaluation expertise, an essential resource in this process, can be challenging. Expecting program committees, which tend to be primarily comprised of faculty members who are not required to have program evaluation expertise, to develop and implement program evaluation on their own is neither realistic nor efficient.

* Originally submitted and reviewed in English, this article was written with the assistance of Maureen Hillman, a retired teacher from Cégep de Sherbrooke, and translated into French by les Services d'édition Guy Connolly. Both the English- and French-language versions have been published on the AQPC website with the financial support of the Quebec-Canada Entente for Minority Language Education.

1 [www.cegep-heritage.qc.ca/pdf_files/policies/policy-17.pdf]

Although the ideal solution to program evaluation might be to hire a specialist to lead the comprehensive evaluation of each program, this is simply not feasible for many CEGEPs. Given this, CEGEPs must look internally for program evaluation expertise. In most instances, program evaluation is one of the many responsibilities of the education advisor,2 whose position requires some expertise in this area. Typically, the education advisor takes the lead in program evaluation and works collaboratively with program-designated committees. Since several aspects of our context are shared across the college network, the approach described in the next section might be of interest to other CEGEPs.

OUR APPROACH AND KEY DECISIONS

Heritage College needed an approach that would allow us to continually work on evaluating all programs concurrently, spreading out the evaluation activities and collecting data in an ongoing manner, while building our program evaluation experience and expertise at the same time. It had become increasingly clear to us that our context could not support the comprehensive program evaluation approach mandated by our original institutional policy. It therefore seemed to us that our context required an ongoing, collaborative approach led by the education advisor.

Our first step was to revise our institutional policy to call for "an annual cycle of ongoing program evaluation activities and less frequent in-depth program evaluation activities" (Heritage College, 2009, p. 2). We believed this approach could be sustainable through close collaborative work between Academic Services and the 12 program committees. The education advisor would define a program evaluation framework, develop tools and templates, and offer guidance and assistance to enable programs to carry out their own program evaluation activities. This approach was intended to build program evaluation capacity so that each program developed the ability to respond to the ever-changing needs of its students and the workplace.

One of the pivotal decisions we made was to place the responsibility for conducting the ongoing program evaluations within the program committees. At Heritage, in keeping with the Collective Agreement, program committees are comprised of professors who teach in the program, an academic advisor, and an education advisor. They are typically led by the program coordinator, also a faculty member, who has been given some release time for program administration. It should also be noted that program coordination responsibilities usually rotate every two to three years, resulting in cyclical leadership. The program committees report annually to the academic dean, who is ultimately responsible for program evaluation.

The other pivotal decision was to ask each program committee to select one program evaluation criterion each academic year on which to focus its evaluation activities. Every seven years, after programs have taken a close look at each evaluation criterion, they will be in a position to prepare a synthesis report of how their program is doing across all criteria. This forms the basis of their comprehensive (in-depth) program evaluation study and report. Exceptionally, in compliance with CEEC guidelines, new programs or newly revised programs are required to do a comprehensive evaluation after one cycle of implementation: after two years for pre-university programs and after three years for career programs.
Academic Senate makes decisions and oversees policies that impact our programs of study and the evaluation of student achievement. In support of the ongoing approach to program evaluation, it struck a standing committee called Program Evaluation and Development (PED). A member of Senate leads the PED, soliciting representatives from our career, pre-university, and general-education programs. The PED mandate is to be proactive in sharing program evaluation processes, tools, and best practices, as well as to provide valuable insight about the feasibility and value of the ongoing evaluation process and the usability of its tools.

IMPLEMENTING ONGOING PROGRAM EVALUATION

Aware of the direction and leadership required to initiate ongoing program evaluation, the education advisor designed a program evaluation framework consisting of timelines and a series of templates and tools to address the six required evaluation criteria, all derived from CEEC guidelines (1994a). The templates and tools were made available to all program committees and have been the focus of several professional development sessions. They can be used as is or adapted to a program's particular needs.

2 Education advisor is a general professional title. This position is also known as pedagogical consultant, pedagogical advisor, etc.

In an effort to integrate program evaluation into existing program and institutional structures, the education advisor developed a time frame document (Figure 1). Academic Services felt it important that program evaluation work follow the natural flow of the academic year. In addition, for programs wanting to make changes that require Academic Senate approval, adherence to the suggested time frames will help them be most efficient in preparing their submissions and documentation for the following academic year.

Leaving the choice of evaluation criterion up to the programs seemed logical when Academic Services considered that each program is unique and is challenged by a wide range of circumstances. To help program committees decide on the criterion each year, we created a checklist (Figure 2) to facilitate their decision-making. Once they have chosen their criterion, the program committees are encouraged to prepare a work plan of evaluation activities to ensure completion in a timely manner. For the criterion chosen, they use the corresponding page of evaluation questions to guide their work. Figure 3 shows an example of the guiding questions for one of the criteria, program coherence. Program committees are expected to address each question, supporting their answers with available data. Recommended sources of data can be found in the toolbox. It is important to note that, at present, we consider all tools and templates to be in development and intend to improve and validate them in the coming years.

FIGURE 1: 2011-2012 ONGOING PROGRAM EVALUATION TIME LINES

Note: Use the program evaluation questions in your program evaluation binder (hard copy) and on the Guide to Program Evaluation Moodle page (e-copy).

September
ACTIVITIES: Program Committee: reviews recommendations made in last year's annual program report and confirms planned actions; strikes an ad hoc Program Evaluation Committee, when appropriate. Program Committee / Ad hoc Program Evaluation Committee: receive data updates from Student Services; review the evaluation activities, tools, and reports from previous year(s); confirm time frames, roles, and responsibilities; create program evaluation work plans for the year.
OUTCOMES: Program evaluation work plan covering the evaluation activities to be done and the changes to be made according to planned actions.

December
ACTIVITIES: Program Committee / Ad hoc Program Evaluation Committee review and assess progress with the program department.
OUTCOMES: Updated work plan.

April-May
ACTIVITIES: Program Committee: reviews results of evaluation activities; agrees on recommendations and action plan; writes draft report(s).
OUTCOMES: Data; draft report.

May
ACTIVITIES: Program Committee: consults with committee members to finalize the report; finalizes the action plan for the upcoming year. The ad hoc Program Evaluation Committee presents a report on in-depth/comprehensive evaluation to the Program Committee.
OUTCOMES: Final report; action plan.

June
ACTIVITIES: Program Committee submits the annual program report to the academic dean.
OUTCOMES: Annual report.

FIGURE 2: 2011-2012 PROGRAM EVALUATION CHECKLIST

RELEVANCE
- How well does your program prepare students for university / the labour market?
- How well does your program prepare students for today's society?
- How well does your program meet the expectations of its students?
- Overall, rate the relevance of your program.

COHERENCE
- How clear are your program's learning outcomes and standards?
- How well do the learning activities prepare students to achieve the program's learning outcomes?
- How well are your courses sequenced?
- How demanding are course workloads and requirements for today's student?
- Overall, rate the coherence of your program.

TEACHING & SUPERVISION
- How well does your program engage its students?
- How well does your program identify students at risk?
- How accessible are your teachers?
- Overall, rate the value of teaching methods and supervision.

RESOURCES
- How well staffed is your program (expertise, complementary skills)?
- How well is your program supported (professional and support staff)?
- How well do you support teacher engagement, professional development, and care?
- How adequate are your program's teaching and learning resources?
- How adequate are your program's financial resources?
- Overall, rate your resources.

EFFECTIVENESS
- How well prepared are the students entering your program?
- How well do your student activities demonstrate program success?
- Is your course success rate satisfactory?
- How well do your graduates achieve the program's learning outcomes and standards?
- Overall, rate your program effectiveness.

MANAGEMENT
- How well do you manage your program?
- How well do you monitor its ongoing success?
- How well known is your program?
- Overall, rate the quality of program management.

FIGURE 3: 2011-2012 GUIDING QUESTIONS FOR PROGRAM COHERENCE

1. How clear are your program's learning outcomes and standards?
Related questions: Does each teacher have a copy of and refer to the program curriculum? Do teachers prepare their course outlines based on department-approved course plans? Does your program use its Program Overview? Does each teacher have the opportunity to provide feedback, make requests, etc.?

2. How well do course-learning activities prepare students to achieve the program's learning outcomes?
Related questions: Do teachers demonstrate how the learning outcomes of their courses help the students achieve the program learning outcomes and standards? Do you hold at least one program committee meeting per semester? Is your Program Exit Assessment (PEA) coherent with your program learning outcomes? Does your PEA attest to the student's achievement of the program learning outcomes?

3. How well are your courses sequenced? Courses should be sequenced to encourage the development of critical thought and the integration of skills and knowledge.
Related questions: Is your program profile sequenced and developed progressively based on the students' achievement of program learning outcomes? Is this evident in your course plans? Do the pre-requisites and co-requisites support the student's progressive learning and achievement?

4. How demanding are course workloads and requirements for today's student?
Related questions: Are course ponderations and credit allocations realistic? Are learning activities within each course realistic, given the course ponderation?

TOOLBOX: INFORMATION & DATA SOURCES
- Course evaluations: student & teacher
- Program curriculum documents
- Course plans and outlines
- Pre-requisites & co-requisites
- Program / Department meeting notes
- Program Overview
- ICT Profile

Academic Services provides each program with program information statistics indicating cohort retention and success rates across the same program in the CEGEP network, as well as local statistics. These statistics are available in late April for recent academic years. The college also provides templates for each of the student surveys, collating data centrally for the graduating student survey and the alumni survey.3

As the academic year draws to an end, in early to mid-June, programs submit their Annual Program Report to Academic Services, including their program evaluation activities, findings, and recommendations (Figure 4). The report is meant to be succinct, supported by data generated by the evaluation activities. The Academic Dean and Education Advisor read the reports in preparation for follow-up in the next academic year. Programs recommending changes that require Academic Senate approval are contacted to establish time frames.

The ongoing approach to program evaluation intentionally reinforces the program approach. The ongoing collection of data and the preparation of annual reports invite programs to openly discuss their strengths and weaknesses among all teachers involved in the program. Over time, the data are meant to help programs identify emerging trends and be proactive in their ongoing program development. Kept centrally in designated binders, the data and reports provide evidence when program changes are needed.

3 We have yet to develop the incoming student survey and the current student survey. This work is planned for 2012-2013.

FIGURE 4: 2011-2012 ANNUAL PROGRAM REPORT TEMPLATE

- Provide a general description of your program. (1/2-page summary; append your Program Overview.)
- Provide an overview of your program evaluation activities of this academic year. (1/2-page summary; append your Program Evaluation work plan.)
- Describe your findings in terms of program strengths and weaknesses. (1-page summary; append available data and tools.)
- State your conclusions, decisions, and recommendations for program changes to enhance the following three points. Include an action plan. (1 page, to include: 1. Student success, i.e., students achieving the competencies; 2. Readiness of students to pursue further education or to enter the job market; 3. Program viability.)

Action plan format: Desired Change | Required Actions | Responsibility | Time Frame.

APPENDICES: Program Overview; Program Evaluation Work Plan; Findings; Key Performance Indicators (KPI) data provided by Academic Services; Copies of all survey reports.

LESSONS LEARNED

As we complete the third year of ongoing program evaluation, we are learning how to better implement this approach. Initially, we simply made the tools and templates available in print version. The PED Committee soon after took the initiative to place them all on the college's learning management system. The next step was to create a program evaluation binder for each program and each general education department in preparation for the 2011-2012 academic year. The binder includes a print copy of Policy #17, the program's specific data set, and all program evaluation tools and templates. It also includes a tracking sheet to help provide continuity should there be a change of program coordinator. Feedback suggests that the binder has served to structure and consolidate the process, as well as to help programs understand the purpose and outcomes of program evaluation.

Through professional development sessions, calls for program committee meetings, annual program reports, and continual reference to its importance, we increasingly hear faculty talking about ongoing program evaluation. From the annual program reports received over the last two years, however, we have noticed that some programs do not see the value of carrying out all evaluation activities. To help address this, the PED has initiated the sharing of best practices drawn from specific criterion evaluation activities.

Academic Services understands that adopting an ongoing approach to program evaluation is a major change that requires programs to become directly involved in assessing their own program excellence. Because change always brings resistance, some programs have been slow to comply. Other programs have embraced the approach and have addressed two or three criteria in a rigorous fashion. Exceptionally, two programs were given release time to complete a comprehensive program evaluation because of unique circumstances. Both agreed to use the templates and tools provided for the ongoing approach and, as a result, both have realized that significant changes should be made to their program coherence and profile.

Through implementation, it has become clear that some minor amendments are needed to Policy #17 concerning Program Evaluation. The roles of the various committees and some time frames require updating to better reflect our realities. These amendments will be proposed early in 2012-2013.

The presentation that Heritage College made at the 2011 AQPC Symposium provided an opportunity to present our early work on program evaluation.
As a follow-up to this session, it was suggested that this work be further shared within a committee of representatives from interested CEGEPs. This committee has been formed and meetings held, made possible by the generous support available through APOP-Tandem. Since March, representatives from the English-language colleges have met virtually to share program evaluation approaches, tools, and insights. We met again face-to-face during an open networking session at the AQPC 2012 Symposium, where we shared our ongoing program evaluation approaches with other CEGEPs.

At Heritage, developing ongoing program evaluation is a work in progress. We are acutely aware of how it changes the responsibilities of the program coordinators and program committees. We are also committed to building our capacity to evaluate our programs. Our goal is to integrate program evaluation activities into ongoing program management practices in an effective and efficient manner.

LOOKING AHEAD

Like all CEGEPs, Heritage College strives for student success across all programs. We have chosen ongoing program evaluation as one of our key strategies basic to program excellence and student success. Our implementation of Policy #17 will be self-evaluated in 2014, a process that will help us determine the efficacy of our approach.

Ongoing program evaluation enables programs to address any concern rapidly and bring about change quickly. It also enables programs to be responsive and accommodating to the changing needs of students, universities, and the workplace, becoming stronger and more viable each year. Furthermore, ongoing evaluation requires and inspires a great deal of collaboration, both internally and within the college community. We anticipate that future collaborative work with other CEGEPs will lead to the development of better, more effective program evaluation tools and practices, hence building our collective capacity to evaluate our own programs. This puts the ownership of program evaluation back where it belongs: with the programs themselves.

RÉFÉRENCES BIBLIOGRAPHIQUES

CÉGEP HERITAGE COLLEGE. 2009. Policy #17 Concerning Program Evaluation. Québec: Cégep Heritage College.

COMMISSION D'ÉVALUATION DE L'ENSEIGNEMENT COLLÉGIAL (CEEC). 1994a. L'évaluation des programmes d'études. Cadre de référence. Québec: Gouvernement du Québec, January.

COMMISSION D'ÉVALUATION DE L'ENSEIGNEMENT COLLÉGIAL (CEEC). 1994b. Évaluation des politiques institutionnelles d'évaluation des programmes d'études. Cadre de référence. Québec: Gouvernement du Québec, October.

Lee Anne JOHNSTON is an education professional with over 35 years' experience in education-related employment. Prior to Heritage College, she ran an educational consulting business, taught secondary school, worked as a senior learning design specialist, worked as a coach education consultant in Canadian amateur sport, and developed and monitored several instructional programs for children. She holds an MA, B.Ed., and BA-BPHE.
ljohnston@cegep-heritage.qc.ca