From below average to above average: Six years of reflecting and responding to the National Student Survey


Lizann Bonnar & Steve Kelly
University of Strathclyde, Scotland

ABSTRACT: In recent years, the National Student Survey (NSS) has focused higher education institutions' efforts on improving various aspects of the student experience. The lowest level of student satisfaction recorded by the survey has been in the area of assessment and feedback. Here, a case study is presented, depicting one department's journey (Psychology at the University of Strathclyde) of reflecting on and responding to students' answers to the assessment and feedback questions of the NSS. The journey depicted begins with the results of our first NSS participation in 2007, which revealed low student satisfaction with our assessment and feedback practices, through to a picture of improvement in 2008-2009, stagnation in 2010, and transformation in 2011 and 2012. NSS data from 2007 to 2012 are presented, alongside the actions taken to improve our assessment and feedback practices. We believe some of these changes lay a foundation for more innovative changes suggested in the current assessment and feedback literature.

Background

While not without its critics, it is undeniable that the publication of the results of the annual UK National Student Survey (NSS) has focused further and higher education institutions' efforts to improve the student experience. Introduced in 2005, the NSS comprises 22 questions covering six areas of students' experience at university (the teaching on the course, assessment and feedback, academic support, organisation and management, learning resources, and personal development), plus overall satisfaction. A glance at the figures for the sector as a whole shows a year-on-year increase in student satisfaction, although satisfaction continues to be lowest for questions in the Assessment and Feedback category.
Liam Burns, President of the National Union of Students, commenting on the 2012 results, stated: "Progress has not been as rapid as we would have liked, particularly in areas such as assessment and feedback; results have continued to improve year on year and they must continue to do so." The aim of the present paper is to chart the journey of reflection and action taken by one department, Psychology at the University of Strathclyde, in response to students' feedback on assessment and feedback via the National Student Survey (as well as informal feedback and feedback from module evaluations, the Student-Staff Liaison Committee, etc.). As noted earlier, the NSS is not without its critics; arguments against the survey have related to the marketisation of higher education, and to the survey itself as a flawed instrument that provides little information regarding the quality of education. While we share these concerns, we have taken the view that discussions arising from NSS student feedback have illuminated some practices in our own department that were not fit for purpose. Take as an example the practice of returning feedback on a piece of coursework that could be used to inform a subsequent piece, but returning it too late for the student to use the feedback to improve their work (see Gibbs and Simpson, 2004, on the timeliness of feedback to support learning). This type of practice has been widespread, although it is fair to note that such practices are not always avoidable. Our annual analysis of the NSS results contributes one strand of a wider evaluation of our assessment and feedback practices. In our own context, student participation in the NSS is higher than in local student surveys such as module evaluations, and therefore represents the views of a wider sample of students, rather than those of students at the extreme ends of the spectrum.

This paper will firstly review the NSS results for Psychology at the University of Strathclyde from 2007 to 2012, and will then present the actions taken by the department to improve our assessment and feedback practices so as to better support students' learning. Lastly, we consider current developments in the research literature on assessment and feedback and how these may further shape our practices.

A look at our NSS results

Table 1 shows the five questions in the NSS related to assessment and feedback. The data represent, from 2007 (our first participation in the survey) to 2012, the percentage of Psychology students at the University of Strathclyde who definitely or mostly agreed with each of these five questions. For comparison, the figures in parentheses are the UK sector-wide data. Looking initially at the 2007 data, the results show student satisfaction lower than the sector average for the fairness of assessment arrangements (question six), the promptness of feedback (question seven), and the usefulness of feedback in clarifying things students did not understand (question nine). While we were aware of instances of individual students expressing unhappiness with their feedback (e.g. the consistency of feedback) and with our assessment and feedback practices (e.g. feedback on coursework not returned prior to a subsequent coursework deadline), the extent of the dissatisfaction was surprising and disappointing. The data from 2008 show modest increases in student satisfaction for the fairness of assessment arrangements, detailed feedback, and the usefulness of feedback in helping to clarify things students did not understand, but generally remained below the sector average. Some of the changes we made to our practices following the 2007 results were implemented in our second and third year classes, the latter of which completed the 2009 survey.
It is in the results of the 2009 survey that one sees increases in student satisfaction, relative both to our previous results and to the national average (with the exception of question five), but this is a trend which does not continue into the 2010 results, where the picture is instead mainly one of stagnation. The implementation of enhanced assessment and feedback practices between 2008 and 2011 is, we believe, responsible for the increase in student satisfaction levels in the 2011 and 2012 results, from below the national average to above it.

Table 1. Percentage of respondents (final year Psychology students at the University of Strathclyde) who definitely or mostly agreed with the Assessment and Feedback questions (Questions Five to Nine) in the National Student Survey 2007-2012. The figures in parentheses represent the UK sector-wide data (Higher Education Funding Council for England, 2012).

Question                                                                   2007     2008     2009     2010     2011     2012
5. The criteria used in marking have been clear in advance.                71 (69)  65 (70)  67 (71)  71 (72)  82 (73)  86 (75)
6. Assessment arrangements and marking have been fair.                     55 (72)  60 (73)  81 (73)  71 (74)  82 (74)  79 (76)
7. Feedback on my work has been prompt.                                    50 (54)  48 (57)  66 (58)  66 (61)  65 (63)  75 (66)
8. I have received detailed comments on my work.                           61 (60)  70 (62)  78 (63)  79 (65)  81 (67)  86 (70)
9. Feedback on my work has helped me clarify things I did not understand.  34 (54)  48 (57)  63 (58)  63 (60)  71 (61)  76 (64)

Note: Prior to 2010, students in their third year of study were registered on a three-year faculty BA degree (before being transferred to a named honours programme in fourth year) and were included in the NSS. Since at least 2010, only students in fourth year have been included in the NSS. These cohort differences should be noted when comparing between years.

Our journey to improve assessment and feedback practices

Student feedback on our assessment and feedback practices greatly informed our strategy to improve them. This feedback came from multiple sources: the NSS, the Student-Staff Liaison Committee, module evaluation feedback, and informal feedback. In addition, in 2010 we surveyed our second year students to gain their views on our assessment and feedback practices, the findings of which also contributed to the review of our practices. We also engaged with our institution's recommendations, the 12 Principles of Good Assessment and Feedback (University of Strathclyde, 2008), informed by research conducted by the Re-engineering Assessment Practices (REAP) project, and we consulted the wider research literature on assessment and feedback.

Getting the basics right

In 2008, we completed the transition from paper-based coursework and hand-written feedback to a fully electronic system, via our virtual learning environment (VLE), for all coursework submission across our undergraduate degree. As a result, feedback is always legible, and students do not need to be on campus to submit coursework or to collect feedback. We also ensure that the timetabling of summative assessments across Psychology classes, within a given level of the undergraduate degree, is organised so that students always receive feedback prior to submitting a further piece of coursework, giving them the opportunity to reflect on and use the feedback. As with other departments with large student numbers, we employ Graduate Teaching Assistants (GTAs) to assist with coursework assessment and feedback duties at the early levels of the undergraduate degree.
Feedback from students in first and second year highlighted issues such as feedback that was inconsistent with staff instructions, and feedback that was uninformative as to how students could improve their work in future. As part of our strategy to address these types of issues, we enhanced our departmental GTA training programme and added mentoring support provided by staff.

Clarifying good performance

Criteria for assessing students' work

To clarify expectations to students, and to staff assessing students' work, we revised our assessment criteria to include qualitative descriptors of the characteristics expected of a piece of work for a given grade. This was part of our strategy to clarify what good performance is (Nicol and Macfarlane-Dick, 2006). Assessment criteria were written for each of the four levels of our undergraduate degree (the characteristics expected in an essay awarded a mark of 65 from a third year student differ from those expected from a first year student, with expectations rising as students progress through the degree), with additional criteria written for unique assessments (e.g. presentations, personal development portfolios). Documents containing the criteria are disseminated to all students, staff and Graduate Teaching Assistants for each assessment task. We have also incorporated activities into the degree that involve students working with the assessment criteria to grade and give feedback on sample coursework, in order to familiarise them with the criteria and develop their capacity to judge academic work.

Provision of model coursework

Further to clarifying our assessment criteria, we also followed the suggestion of Nicol and Macfarlane-Dick (2006) to make available exemplars of coursework in order to demonstrate the fulfilment of assessment criteria. A survey of our second year students' views of our assessment and feedback practices showed that 81% of students reported that they would find the provision of exemplars useful for their learning. Since 2010, we have provided exemplar essays and reports (albeit on different topics from those the students are working on) at the start of the academic year, and, with permission from high-performing students, we have made exemplar coursework available following the return of marked coursework and feedback.

Feeding back and feeding forward on exams

As at many institutions, our students have not traditionally received qualitative feedback on exam work, owing to the significant additional resources involved. In 2010, however, we implemented the provision of generic feedback on exams for all undergraduate classes. This feedback typically points out the characteristics of stronger and weaker exam answers, and is fed forward to the subsequent cohort in advance of their exams.

Introducing a dialogic approach

Enabling students to provide and receive feedback from their peers

Recent research by Nicol (2011) and Boud and Molloy (2012) has urged the use of self-assessment and peer-assessment or review activities to enhance students' capacity to make judgements about their own and others' work. Boud and Molloy (2012) have also argued that feedback practices should be more closely aligned to workplace practices, where feedback on performance comes from multiple sources. We recently incorporated a peer review activity into a third year class, where part of the coursework is a practical report based on an empirical group project.
Students were asked to submit the Method section of this report for peer review six weeks prior to the submission deadline for the full report. Students were randomly allocated to give (and receive) anonymous feedback to three of their peers. The benefits of peer review for learning are multiple and are documented by Nicol (2011); in addition to those benefits, this particular implementation provides students with an opportunity to close any gap between current and desired performance (one of the University of Strathclyde's 12 principles of good assessment and feedback practice) for this section of their final report.

Enabling dialogue between student and marker

The works cited above by Boud and Molloy (2012) and Nicol (2011) have also emphasised the need for a shift from a feedback approach based on a transmission model, where a feedback message is communicated from marker to student, to an approach which encourages dialogue between students and teachers. Nicol (2011) argues for feedback to be based on co-construction between students, peers, and teachers. A small step towards incorporating such an approach is our introduction, in 2010, of a new cover page for coursework, which contains a box stating "Dear Marker, I would like feedback on the following". This space allows students to request specific feedback from the marker (e.g. "how well have I critiqued the literature?"), and allows the marker to further tailor the feedback to the development needs identified by the student. The cover page also includes a space for students to record the action they have taken since receiving feedback on a previous piece of coursework, on which markers can in turn comment, noting the extent to which the action has been effective. These measures invite students to reflect on their learning needs and identify them to the marker, allowing the marker to respond to the needs of the student as identified by the student, as well as to those the marker identifies.

Improving communication

Pointing out different forms of feedback

Taking Nicol and Macfarlane-Dick's (2006) broad definition of good feedback as "anything that might strengthen the student's capacity to self-regulate their own performance" (p. 205), we considered it important to point out to students the different sources of feedback they receive, beyond feedback on summative assessments. A statement on such feedback, tailored to each class, has been inserted into all undergraduate class statements. In one class this reads: "Students will receive verbal feedback from their peers and verbal and brief written feedback from a member of staff on presentations, as well as electronic feedback on practical reports. Feedback, however, comes in many forms and at various other points: when a discussion post is responded to, this is feedback; a response to a question before, after, or during a lecture, or by email, is feedback."

Informing students of when they can expect feedback

We acknowledged earlier the importance of returning feedback with sufficient time for students to use it to support their learning effectively, but we also considered it important to inform students of when they can expect their feedback. From the students' perspective, receiving feedback, particularly on summative work, can be an emotional experience, so knowing the date by which feedback will be returned allows students to prepare. Like most higher education institutions, all of our undergraduate classes have a class site on our virtual learning environment (VLE). In the VLE class sites, where students electronically submit their coursework, we included an area called "I've uploaded my coursework, what happens next?" (see the screenshot in Figure 1) to communicate the date when students can expect their feedback to be returned, as well as highlighting other relevant resources.

Figure 1. A screenshot of our VLE detailing the processes following coursework submission.
Using the language of the survey when communicating with students

As psychologists we have some knowledge of the processes involved in memory encoding and retrieval, and of the importance of overlap in content between the two for subsequent recognition (e.g. Tulving & Thomson, 1973). We therefore changed the language we used to communicate established practices to match the language of the survey. For example, the document containing our procedures for ensuring fairness and accuracy in assessment and feedback was renamed "Ensuring fairness and accuracy in assessment and feedback", to reflect the wording of Question six of the survey. We also renamed the documents containing our marking criteria (previously called, e.g., "Marking criteria for third year") to "Criteria used in marking 3rd year work", mirroring more closely the language of Question five of the NSS.

Feeding back to students on feedback

Following attendance at a Higher Education Academy workshop on the National Student Survey, where we learned about the usefulness of a "You said, we did" document for responding to students on feedback they have provided, we introduced a newsletter of this type to communicate our responses to students' feedback, including on assessment and feedback. The aim of the newsletter is also to demonstrate to students that we are genuinely interested in their views of the course, and thereby to encourage them to give feedback in future; as Carless (2009) argues, greater communication is one way to foster the development of trust between students and staff around assessment issues.

Current discussions in the research literature on assessment and feedback

We have presented here some of the actions taken following consideration of students' feedback on our assessment and feedback practices. Some of the actions described follow what Boud and Molloy (2012) call the "Engineering" or "Feedback Mark 1" approach, where a large proportion of feedback is transmitted from teacher to student. Other changes in our practice fit more closely the dialogic approach (Boud and Molloy, 2012; Nicol, 2011), which emphasises co-construction of feedback between students, peers, and teachers. Further changes, centred on improved communication, aimed at building trust through transparency (Carless, 2009) and at raising awareness amongst students of existing resources and practices that support their learning (e.g. renaming assessment criteria documents and making them available in spaces students are continually using).
While these implementations were tailored to the perceived gaps in our assessment and feedback practices, they are in accord with the growing consensus on what constitutes good practice, and may have contributed to the dramatic improvement in our NSS scores on assessment and feedback.

References

Boud, D. and Molloy, E. (2012) Rethinking models of feedback for learning: the challenge of design, Assessment & Evaluation in Higher Education, DOI: 10.1080/02602938.2012.691462

Burns, L. (2012) www.hefce.ac.uk/news/newsarchive/2012/name,75522,en.html

Carless, D. (2009) Trust, distrust and their impact on assessment reform, Assessment & Evaluation in Higher Education, 34, pp 79-89

Gibbs, G. and Simpson, C. (2004) Conditions under which assessment supports students' learning, Learning and Teaching in Higher Education, 1, pp 3-31

Higher Education Funding Council for England (2012) National Student Survey, www.hefce.ac.uk/whatwedo/lt/publicinfo/nationalstudentsurvey/

Nicol, D. (2006) Increasing success in first year courses: assessment re-design, self-regulation and learning technologies. Paper presented at the ASCILITE Conference, Sydney, 3-6 December 2006

Nicol, D. (2011) Developing the students' ability to construct feedback. Paper presented at the QAA Enhancement Themes Conference, Heriot-Watt University, 2-3 March 2011

Nicol, D. and Macfarlane-Dick, D. (2006) Formative assessment and self-regulated learning: a model and seven principles of good feedback practice, Studies in Higher Education, 31, pp 199-218

Tulving, E. and Thomson, D.M. (1973) Encoding specificity and retrieval processes in episodic memory, Psychological Review, 80, pp 352-373

University of Strathclyde (2008) 12 Principles of Good Assessment and Feedback, www.strath.ac.uk/learnteach/informationforstaff/staff/assessfeedback/12principles/ (last accessed 2 May 2013)

Williams, J. and Kane, D. (2009) Assessment and feedback: institutional experiences of student feedback, 1996 to 2007, Higher Education Quarterly, 63, pp 264-286