Guidelines for Collecting and Using Stakeholder Input


A) Introduction

Stakeholder input is a necessity for ensuring that academic programs are meeting the expectations of current and future students, graduates (alumni), and present and potential employers. For this reason, the engagement of students, faculty, and employers in evaluating program offerings is embedded in local and international accreditation standards as essential to evaluating programs and to their continuous quality improvement.

B) Who are the stakeholders?

- Current Students
- Alumni/Graduates
- Employers: graduate employers, potential employers & work placement employers
- Faculty
- Curriculum Developers and Evaluators

C) Why do we want the input?

- Enhance stakeholder engagement by collecting their feedback & input and sharing results and conclusions with them
- Inform academic planning for improvement: curriculum, assessment, delivery, resources, collection tools; evaluation of change

Input from current students, graduates (alumni), employers, and faculty informs academic planning for improvements in:

- Student learning: learning outcomes
- Student experience: progression
- Employment
- Effectiveness: delivery; resources
- Currency: curriculum
- Relevance: industry, present & future needs

D) How do we gather input?

Principles: we have to be confident that the input we use is:

- Accurate and trustworthy
- Reasonable and useful information about what is being analyzed

1) Surveys

- Divisional Surveys: available from the Academic Division
- System Surveys: available on the Institutional Effectiveness Management System (IEMS), Intranet Portal, by academic year (refresh & enter the year, e.g. 201310, 201410, etc.)

The system surveys, their focus, and their scope are listed below; an analysis sketch follows the list.

- Student Exit Survey (Student Satisfaction Survey). Focus: teaching quality; preparation for employment; support from teachers; knowledge gained; academic quality; Graduate Outcomes; workplace knowledge & skills; knowledge of career options; overall HCT experience; planning & carrying out projects. Scope: campus; division; level.
- Graduate Satisfaction Survey. Focus: employment/FE status; Graduate Outcomes; teaching facilities; preparation for employment; HCT education; academic preparation; employment relevant to qualification/field. Scope: campus; division; level.
- Industry/Employer Satisfaction Survey. Focus: Graduate Outcomes; workplace knowledge & skills; academic preparation at HCT. Scope: campus; division; level.
- Course Evaluation Survey and Faculty Course Evaluation Survey. Focus: curriculum design, content & organization; teaching, learning & assessment; course resources; delivery issues. Scope: campus; division; course.
- Program Chair (Program Effectiveness) Survey. Focus: curriculum; facilities; faculty & staff; health & safety; learning resources; technology; workplace relevance. Scope: division; program; campus; year (checkbox).
- Student Services Survey. Focus: student support & academic services; library; facilities; IT support; special needs. Scope: campus; division.
- Staff Services Survey. Focus: educational technology services; facilities; library; IT support. Scope: campus; division.
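Because every survey above reports at campus, division, and level or course scope, results usually need to be grouped to the right scope before they can be compared. A minimal Python sketch, assuming a hypothetical flat export where each row carries a campus, a division, and a 1-5 response (the field names and values are illustrative, not the actual IEMS schema):

```python
from collections import defaultdict
from statistics import mean

# Hypothetical survey export rows: (campus, division, response on a 1-5 scale).
# These names and values are illustrative, not the actual IEMS schema.
responses = [
    ("Campus A", "Business", 4),
    ("Campus A", "Business", 5),
    ("Campus B", "Engineering", 3),
    ("Campus B", "Engineering", 2),
    ("Campus B", "Business", 4),
]

# Group responses by (campus, division) so results can be reported
# at the same scope the system surveys use.
by_scope = defaultdict(list)
for campus, division, score in responses:
    by_scope[(campus, division)].append(score)

for (campus, division), scores in sorted(by_scope.items()):
    print(f"{campus} / {division}: n={len(scores)}, mean={mean(scores):.2f}")
```

The same grouping extends naturally to level or course by adding that field to the key.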

Advantages and Disadvantages of System Surveys

Advantages:

- System surveys are managed centrally without using divisional resources
- Results are generally easy to analyze
- Divisional surveys can focus on specific areas of interest to the program

Disadvantages:

- Typically low response rates to system surveys mean questionable reliability and validity
- System surveys are often not updated until the following year, leading to issues of currency
- Designing, delivering and analyzing divisional surveys requires resources and expertise
- Survey response fatigue: too many surveys, resulting in low response rates
- Survey fatigue: respondents become disengaged or bored, resulting in non-completion or in mechanical responses given to finish quickly

Good practice:

- Results over a number of survey administrations (longitudinal data) are more useful in identifying trends that may affect the program than results from one or two administrations.
- Interpretation supported by a clear rationale should be communicated to stakeholders.

2) External Committees

Examples: Industry Advisory Committee (IAC), PACs, Advisory Boards, etc. Sources: meeting minutes and reports of the IAC and Advisory Boards, held by the Academic Division.

Advantages:

- Input can provide in-depth analysis, insights and recommendations from the labour market and experts in the field
- Input can deal with future needs and concerns based on sector trends
- Discussions can explore unexpected areas, providing insight into areas of interest

Disadvantages:

- Minutes of meetings may not reflect the depth of the discussion but simply record approvals or recommendations
- Evaluation of input is subjective, which can affect the weight to be given to any recommendations

Good practice:

- Membership must be appropriate to the nature of the input; for example, HR managers may not be competent to evaluate the currency or relevance of curriculum. Consider establishing a curriculum development interest group with employer nominees, e.g. training managers/specialists.
- Records of discussion must be sufficiently detailed to allow for later evaluation; a record-keeping sketch follows.
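Where minutes record only approvals, the rationale that made a recommendation persuasive is lost. A minimal sketch of a structured record that would support later evaluation; the format, field names, and example content are invented for illustration, not an in-house standard:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class CommitteeRecommendation:
    """One recommendation from an advisory-committee meeting, recorded with
    enough context to be evaluated later. All field names are illustrative."""
    meeting_date: date
    committee: str             # e.g. "Industry Advisory Committee"
    recommendation: str        # what the committee proposed
    rationale: str             # why it was proposed, which minutes often omit
    evidence: list[str] = field(default_factory=list)  # data or trends cited

# Hypothetical example entry.
rec = CommitteeRecommendation(
    meeting_date=date(2016, 10, 5),
    committee="Industry Advisory Committee",
    recommendation="Add a work-placement component in year 3",
    rationale="Employers report that graduates lack on-site experience",
    evidence=["Employer survey comments, 2015-16 administration"],
)
print(f"{rec.recommendation} (rationale: {rec.rationale})")
```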

3) Interviews

Source: records of interviews as conducted by each academic division.

Advantages:

- Can provide expert input in specific areas of interest to present and future program development

Disadvantages:

- Logistics: access to a sufficiently large sample of experts may be difficult to arrange
- Time-consuming to conduct

Good practice:

- Interviews should have common, clear objectives.
- Structure the interview to ensure feedback is obtained on each area of interest, but allow time for the interviewee to add their own points/issues.
- Questions should be consistent across interviews to facilitate evaluation (consistent questioning is illustrated in the sketch after section 4).
- Records of interviews must be sufficiently detailed to allow evaluation; e.g. opinions of interviewees should be supported by evidence or justified with a rationale.

4) Focus groups

Examples: student groups, alumni groups, industry groups, training managers, faculty groups. Source: records of focus group meetings as conducted by each academic division.

Advantages:

- Can provide stakeholder input in specific areas
- Provides an open channel for stakeholders to express concerns and expectations

Disadvantages:

- Logistics: it may be difficult to identify appropriate members and to arrange meetings
- Time-consuming to conduct
- Budget required for hospitality

Good practice:

- Members should be carefully selected to ensure that they are appropriate people to provide feedback on the topics of interest.
- There should be clear objectives for each meeting.
- Meetings must be well managed/moderated to avoid problems such as loss of focus, a few people dominating, or running out of time.
- Explanations of opinions are essential to avoid misinterpretation. Clear records of what was said and why are essential for effective input.
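One way to keep questions consistent across interviews (and focus groups) is to pair a shared question guide with each respondent's answers, while keeping their own additional points separate. A minimal sketch; all questions, names, and the record format are invented for illustration:

```python
# A shared interview guide: asking every interviewee the same core questions
# makes cross-interview comparison possible, while leaving room for extras.
GUIDE = [
    "How current is the curriculum for your sector?",
    "Which workplace skills do recent graduates lack?",
    "What sector trends should the program prepare for?",
]

def record_interview(interviewee: str, answers: list[str],
                     extra_points: list[str]) -> dict:
    """Pair each guide question with its answer; keep free-form additions separate."""
    if len(answers) != len(GUIDE):
        raise ValueError("one answer per guide question is required")
    return {
        "interviewee": interviewee,
        "responses": dict(zip(GUIDE, answers)),
        "extra_points": extra_points,  # the interviewee's own points/issues
    }

# Hypothetical example record.
notes = record_interview(
    "Training manager, hypothetical employer",
    ["Mostly current", "Report writing", "Automation of routine tasks"],
    ["Suggested a joint capstone project"],
)
print(notes["responses"][GUIDE[1]])
```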

E) How is the input analyzed?

Good practice:

- The evaluation procedure should be clear to all stakeholders.
- Analysis is improved when carried out by a range of stakeholders.
- Analysis rarely provides answers; interpretation of the analysis is critical.

Quantitative Analysis

Examples: closed survey questions requiring responses such as Yes/No, or a 5-point scale from Strongly Agree to Strongly Disagree.

Advantages:

- Large amounts of data / numbers of responses can be analyzed quickly
- Enhances objectivity in the analysis

Disadvantages:

- Selection of appropriate statistical analysis is key
- Data/responses are defined by the instrument (e.g. the survey question) and may not capture key feedback, or may lack the detail/reasoning needed for in-depth analysis
- Interpretation of results depends on the analyst(s), and conclusions can be debatable

Good practice:

- Get advice from statistical analysts with appropriate experience.
- Where available, longitudinal data is most powerful in identifying trends and issues (see the sketch at the end of this section).

Qualitative Analysis

Examples: open survey questions requiring written responses expressing perceptions, reasoning, etc.; focus groups; interviews.

Advantages:

- Can provide in-depth responses
- Openness allows unexpected feedback
- Enhances a positive sense of engagement among stakeholders

Disadvantages:

- Usually a small data set/sample size
- Difficult to generalize findings confidently
- Analysis is often subjective and may be affected by the analysts' own opinions
- Categorization and weighting of responses are difficult to assign

Good practice:

- Experience in analyzing qualitative data is invaluable; selecting the right analysts is crucial.
- Be sure to distinguish between your analysis and the feedback provided, i.e. analysis is more than listing quotes.
- Discussing possible interpretations is a vital stage; involve others in the discussions wherever practicable.
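As a small illustration of the longitudinal point above, the sketch below tracks one survey item across several administrations and flags large movements for discussion. The scores and the flagging threshold are invented for illustration, and a statistician should advise on a proper trend test:

```python
from statistics import mean

# Hypothetical longitudinal results: mean agreement (1-5 scale) with one
# survey item across three administrations. All values are illustrative.
administrations = {
    "2013-14": 3.9,
    "2014-15": 3.7,
    "2015-16": 3.4,
}

for year, score in administrations.items():
    print(f"{year}: {score:.1f}")

scores = list(administrations.values())
drift = scores[-1] - scores[0]
print(f"Overall mean: {mean(scores):.2f}; change across administrations: {drift:+.1f}")

# A crude trend signal: flag items that moved by 0.3 or more. The threshold
# is illustrative, not a standard; any flag should be interpreted with a
# clear rationale and communicated to stakeholders.
if abs(drift) >= 0.3:
    print("Flag for interpretation and discussion with stakeholders.")
```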

F) Practical Tips

Make use of what is available as best you can:

- Surveys: results of system surveys of graduating students, graduates (alumni), and graduate employers are published on the Portal; if you do not have access, request it through the Division.
- Committees: the IAC and other employer groups (e.g. PACs) are useful for identifying appropriate interviewees and focus group members.
- Course evaluations: student course evaluations (i.e. not the student evaluation of the teaching faculty) and faculty course evaluations are an excellent source for guiding improvements in teaching and learning.

Get the timing right: changes to programs and curricula are typically implemented in the following academic year, so it is crucial to collect and interpret stakeholder input as early in the academic year as feasible, to allow adequate time for finalizing improvements.