A) Introduction

Stakeholder input is a necessity for ensuring that academic programs meet the expectations of current and future students, graduates (alumni), and present and potential employers. For this reason, the engagement of students, faculty, and employers in evaluating program offerings is embedded in local and international accreditation standards as essential to program evaluation and continuous quality improvement.

B) Who are the stakeholders?

- Current students
- Alumni/graduates
- Employers: graduate employers, potential employers & work placement employers
- Faculty
- Curriculum developers and evaluators

C) Why do we want the input?

- Enhance stakeholder engagement by collecting their feedback and input, and sharing the results and conclusions with them
- Inform academic planning for improvement: curriculum, assessment, delivery, resources, collection tools; evaluation of change

Input from current students, graduates (alumni), employers, and faculty informs academic planning for improvements in:

- Student learning: learning outcomes
- Student experience: progression
- Employment
- Effectiveness: delivery; resources
- Currency: curriculum
- Relevance: industry, present & future needs
D) How do we gather input?

Principles

We have to be confident that the input we use is:
- Accurate and trustworthy
- Reasonable and useful information about what is being analyzed

1) Surveys

- Divisional surveys: available from the Academic Division
- System surveys: available on the Institutional Effectiveness Management System (IEMS), Intranet Portal, by academic year (refresh & enter the year, e.g. 201310, 201410, etc.)

See the following list of system surveys.

Student Exit Survey (Student Satisfaction Survey)
  Focus: teaching quality; preparation for employment; support from teachers; knowledge gained; academic quality; Graduate Outcomes; workplace knowledge & skills; knowledge of career options; overall HCT experience; planning & carrying out projects
  Scope: campus; division; level

Graduate Satisfaction Survey
  Focus: employment/FE status; Graduate Outcomes; teaching facilities; preparation for employment; HCT education; academic preparation; employment relevant to qualification/field
  Scope: campus; division; level

Industry/Employer Satisfaction Survey
  Focus: Graduate Outcomes; workplace knowledge & skills; academic preparation at HCT
  Scope: campus; division; level

Faculty Course Evaluation Survey
  Focus: curriculum design, content & organization; teaching, learning & assessment; course resources; delivery issues
  Scope: campus; division; course

Program Chair (Program Effectiveness) Survey
  Focus: curriculum; facilities; faculty & staff; health & safety; learning resources; technology; workplace relevance
  Scope: division; program; campus; year (checkbox)

Student Services Survey
  Focus: student support & academic services; library; facilities; IT support; special needs
  Scope: campus; division

Staff Services Survey
  Focus: educational technology services; facilities; library; IT support
  Scope: campus; division
Advantages and Disadvantages of System Surveys

Advantages:
- System surveys are managed centrally without using divisional resources
- Results are generally easy to analyze
- Divisional surveys can focus on specific areas of interest to the program

Disadvantages:
- Typically low response rates to system surveys mean questionable reliability and validity
- System surveys are often not updated until the following year, leading to issues of currency
- Designing, delivering and analyzing divisional surveys requires resources and expertise
- Survey response fatigue: too many surveys, resulting in low response rates
- Survey fatigue: respondents become disengaged or bored, resulting in non-completion or mechanical responses given to finish quickly

Good practice:
- Results over a number of survey administrations (longitudinal data) are more useful in identifying trends that may affect the program than results from one or two administrations.
- Interpretation supported by a clear rationale should be communicated to stakeholders.

2) External Committees

Examples: Industry Advisory Committee (IAC), PACs, Advisory Boards, etc.
Source: Academic Division meeting minutes and reports of the IAC and Advisory Boards.

Advantages:
- Input can provide in-depth analysis, insights and recommendations from the labour market and experts in the field
- Input can address future needs and concerns based on sector trends
- Discussions can explore unexpected areas, providing insight into areas of interest

Disadvantages:
- Minutes of meetings may not reflect the depth of the discussion but simply record approvals or recommendations
- Evaluation of the input is subjective, which can affect the weight given to any recommendation

Good practice:
- Membership must be appropriate to the nature of the input; for example, HR managers may not be competent to evaluate the currency or relevance of a curriculum. Consider establishing a curriculum development interest group with employer nominees, e.g. training managers/specialists.
- Records of discussion must be sufficiently detailed to allow for later evaluation.
3) Interviews

Source: records of interviews as conducted by each academic division.

Advantages:
- Can provide expert input in specific areas of interest to present and future program development

Disadvantages:
- Logistics: access to a sufficiently large sample of experts may be difficult to arrange
- Time-consuming to conduct

Good practice:
- Interviews should have common, clear objectives.
- Structure the interview to ensure feedback is obtained on each area of interest, but allow time for the interviewee to add their own points/issues.
- Questions should be consistent across interviews to facilitate evaluation.
- Records of interviews must be sufficiently detailed to allow evaluation; e.g. opinions of interviewees should be supported by evidence or justified with a rationale.

4) Focus Groups

Examples: student groups, alumni groups, industry groups, training managers, faculty groups.
Source: records of focus group meetings as conducted by each academic division.

Advantages:
- Can provide stakeholder input in specific areas
- Provides an open channel for stakeholders to express concerns and expectations

Disadvantages:
- Logistics: may be difficult to identify appropriate members and arrange meetings
- Time-consuming to conduct
- Budget required for hospitality

Good practice:
- Members should be carefully selected to ensure they are appropriate people to provide feedback on the topics of interest.
- There should be clear objectives for each meeting.
- Meetings must be well managed/moderated to avoid problems such as loss of focus, a few people dominating, or running out of time.
- Explanations of opinions are essential to avoid misinterpretation.
- Clear records of what was said, and why, are essential for effective input.
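Where consistent questions are used and records are kept in a structured form, as recommended above for interviews and focus groups, simple cross-session tallies become possible. Below is a minimal sketch in Python; the record layout, interviewee names, and theme labels are entirely hypothetical and only illustrate the idea of tallying recurring themes across detailed records.

```python
from collections import Counter

# Hypothetical structured interview records: each record tags the themes
# raised by the interviewee. All names and labels here are illustrative.
interviews = [
    {"interviewee": "Training manager A", "themes": ["curriculum currency", "soft skills"]},
    {"interviewee": "HR specialist B", "themes": ["soft skills", "work placement"]},
    {"interviewee": "Sector expert C", "themes": ["curriculum currency", "emerging technology"]},
]

def theme_frequency(records):
    """Count how often each theme is raised across all interview records."""
    return Counter(theme for record in records for theme in record["themes"])

freq = theme_frequency(interviews)
# Themes raised by more than one interviewee are stronger candidates for follow-up.
recurring = sorted(theme for theme, count in freq.items() if count > 1)
```

A tally like this is only a starting point for discussion; as the guidance notes, the supporting evidence and rationale recorded alongside each opinion matter more than raw counts.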
E) How is the input analyzed?

Good practice:
- The evaluation procedure should be clear to all stakeholders.
- Analysis is improved when carried out by a range of stakeholders.
- Analysis rarely provides answers; interpretation of the analysis is critical.

Quantitative Analysis

Examples: closed survey questions requiring responses such as Yes/No, or a 5-point scale from Strongly Agree to Strongly Disagree.

Advantages:
- Large amounts of data / numbers of responses can be analyzed quickly
- Enhances objectivity in the analysis

Disadvantages:
- Validity depends on the selection of appropriate statistical analysis
- Data/responses are defined by the instrument (e.g. the survey question) and may not capture key feedback, or may lack the detail/reasoning needed for in-depth analysis
- Interpretation of results depends on the analyst(s), and conclusions can be debatable

Good practice:
- Get advice from statistical analysts with appropriate experience.
- Where available, longitudinal data is most powerful in identifying trends and issues.

Qualitative Analysis

Examples: open survey questions requiring written responses expressing perceptions, reasoning, etc.; focus groups; interviews.

Advantages:
- Can provide in-depth responses
- Openness allows unexpected feedback
- Enhances a positive sense of engagement among stakeholders

Disadvantages:
- Usually a small data set/sample size
- Difficult to generalize findings confidently
- Analysis is often subjective and may be affected by the analysts' own opinions
- Categorization and weighting of responses are difficult to assign

Good practice:
- Experience in analyzing qualitative data is invaluable; selecting the right analysts is crucial.
- Be sure to distinguish between your analysis and the feedback provided; i.e. analysis is more than listing quotes.
- Discussing possible interpretations is a vital stage; involve others in the discussions wherever practicable.
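The points above about 5-point scales and longitudinal data can be made concrete with a small sketch. The Python below assumes Likert responses (1 = Strongly Disagree to 5 = Strongly Agree) have been exported as lists of integers per academic year; the years, scores, and the 0.2 decline threshold are all illustrative assumptions, not part of any HCT system.

```python
from statistics import mean

# Hypothetical Likert-scale responses (1-5) for one survey item,
# keyed by academic year. All figures below are illustrative only.
responses_by_year = {
    "2013-14": [4, 5, 3, 4, 4, 5, 3, 4],
    "2014-15": [3, 4, 3, 3, 4, 3, 3, 3],
    "2015-16": [3, 2, 3, 2, 3, 2, 2, 3],
}

def yearly_means(data):
    """Mean score per survey administration, in chronological order."""
    return {year: round(mean(scores), 2) for year, scores in sorted(data.items())}

def trend(means, threshold=0.2):
    """Flag a sustained decline: every year-on-year drop exceeds `threshold`."""
    values = list(means.values())
    drops = [earlier - later > threshold for earlier, later in zip(values, values[1:])]
    return "declining" if drops and all(drops) else "stable/mixed"

means = yearly_means(responses_by_year)
print(means)         # {'2013-14': 4.0, '2014-15': 3.25, '2015-16': 2.5}
print(trend(means))  # declining
```

As the guidance stresses, a flagged trend is the start of the interpretation stage, not a conclusion; the statistical approach and thresholds should be agreed with experienced analysts.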
F) Practical Tips

Make use of what is available as best you can.
- Surveys: results of system surveys of graduating students, graduates (alumni), and graduate employers are published on the Portal; if you do not have access, request it through the Division.
- Committees: the IAC and other employer groups (e.g. PACs) are useful for identifying appropriate interviewees and focus group members.
- Course evaluation: student course evaluations (i.e. not the student evaluation of the teaching faculty) and faculty course evaluations are an excellent source for guiding improvements in teaching and learning.

Get the timing right.
- Changes to programs and curricula are typically implemented in the following academic year, so it is crucial to collect and interpret stakeholder input as early in the academic year as feasible, to allow adequate time to finalize improvements.