"The score doesn't mean much": Students' and staff understandings of text matching software


"The score doesn't mean much": Students' and staff understandings of text matching software

Lee Adam, Carol Bond
Higher Education Development Centre, University of Otago

Ari Samaranayaka
Dunedin School of Medicine, University of Otago

We report on a 2012 study exploring students' and staff understandings of SafeAssign text matching software. A total of 326 students and 216 staff at a New Zealand university responded to a questionnaire asking about their knowledge of SafeAssign. Although SafeAssign does not identify plagiarism, 90% of students and 70% of staff thought that it did, indicating confusion about the difference between text matching and plagiarism. Moreover, both students and staff assumed that SafeAssign can do more than it does, and responses indicated a lack of understanding of the sources with which SafeAssign is able to match text. Furthermore, many students were not aware that SafeAssign was being used in their papers, and some thought its function was simply to provide an assignment submission platform. These results indicate that staff and students need more information about what SafeAssign does in order to make more effective use of the software.

Keywords: Plagiarism, SafeAssign, text matching software

Introduction

Many higher education institutions worldwide have turned to the use of text matching software (Badge, 2010) in response to concern over the increasing prevalence of student plagiarism (Atkinson & Yeoh, 2008; Park, 2003). The use of text matching software (TMS), designed for the specific purpose of providing a tool to assist in the detection of students' plagiarism, has increased over the last decade. A wide variety of tools is available, including Turnitin, EVE2, and SafeAssign (SA). The use of TMS rests on users' assumptions about its function and usefulness. But how accurate are these assumptions?
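The core mechanism such tools share can be illustrated with a minimal sketch. This is a hypothetical illustration, not SafeAssign's actual (proprietary) algorithm: it compares overlapping word n-grams from a submission against a set of source texts and reports the percentage of the submission's words covered by a match.

```python
def match_score(submission: str, sources: list[str], n: int = 3) -> float:
    """Percentage of words in `submission` covered by word n-grams
    that also occur in at least one source text (illustrative only)."""
    def ngrams(words):
        return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

    # Pool the n-grams of every source into one lookup set.
    source_grams = set()
    for text in sources:
        source_grams |= ngrams(text.lower().split())

    words = submission.lower().split()
    matched = [False] * len(words)
    for i in range(len(words) - n + 1):
        if tuple(words[i:i + n]) in source_grams:
            for j in range(i, i + n):
                matched[j] = True  # mark every word inside a matching n-gram
    return 100.0 * sum(matched) / len(words) if words else 0.0
```

Note that a properly quoted and referenced passage scores just as highly as a copied one, which is precisely why a match percentage alone cannot identify plagiarism.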
The current research exposes the assumptions of students and staff about the functions of SA, and shows that the software may be being (mis)used for functions beyond those intended (Bond, Adam, Samaranayaka, Finn, Senior, Cross & Deaker, 2013).

The emergence of TMS has occurred in parallel with an exponential increase in knowledge, increased access to information as a result of the Internet, and the impact of increasing participation in higher education. One effect on academic staff is an increase in workloads, and particularly marking. Where once it was possible to recognise the writing styles of individual students and detect plagiarism manually, in many classes this is no longer the case (McKeever, 2006). The growth of the Internet has not only made it easier to plagiarise, but also easier to detect plagiarism using web-based plagiarism detection software (Evans, 2006; McKeever, 2006). However, the literature on such software tends to be limited to a small pool of empirically based research on effectiveness, institutionally based evaluation studies (Bond et al., 2013), students' and staff perceptions (e.g., Atkinson & Yeoh, 2008), and opinion pieces. Much of the literature focuses on Turnitin, a package that provides a similar service to SA and is reportedly widely used (Badge, 2010).

Most TMS have at least three claimed or emphasised functions: text matching (sometimes called the detection of plagiarism, or plagiarism prevention); the deterrence of plagiarism; and an educative tool useful in teaching academic skills such as paraphrasing and citation (Green et al., 2005). The text matching function requires the software to compare an electronic copy of an individual's work against sources available in an online environment. The software searches for "strings of text" (McKeever, 2006, p. 157) and produces an originality report denoting the matched text, hyperlinks to the possible sources, and a percentage score reflecting the number of words that match another source. Thus, the name "plagiarism detection software" is a misnomer, as TMS does not identify plagiarism but only matches text submitted to it with text that it can access in its archives or on the Internet (Ledwith & Risquez, 2008). The deterrence function of TMS relies on the perception that implementing the software will reduce plagiarism, as students' knowledge that the tool is being used acts as a deterrent (Badge et al., 2007). The educative function of TMS depends on students' knowledge of plagiarism and of what the TMS does, alongside staff using the TMS report to identify the problem, and the student being allowed the opportunity to resolve the problem and resubmit

(McKeever, 2006). However, research shows that few staff use TMS in an educative way (Löfström & Kupila, 2013).

Methodology

This mixed methods study explored students' and staff understandings of SA. The research was part of a larger study, requested by the University's Committee for the Advancement of Learning and Teaching (CALT), that also explored students' and staff understandings of plagiarism and their experiences with SA. The purpose of the research was to inform the management of plagiarism and SA at the University.

A sample of students and staff at the University were asked to respond to a questionnaire. The student sample was selected using one-stage cluster sampling. Students enrolled in seven papers using SA in 2012, representing a range of undergraduate paper levels from Health Sciences, Business, and Humanities, were surveyed. Although Science papers were not represented, as SA is not commonly used in these papers, the sample included a first year service course that attracts students from a range of different programmes, including the sciences. Students were administered an in-class, paper-based questionnaire. Because the selected papers used SA, an assumption was made that all the students were familiar with SA. Students were assured that the survey was anonymous and that they could choose whether or not to participate. Questionnaires were collected and data from the 326 respondents were manually entered into an Excel spreadsheet.

The staff sample comprised those who chose to respond to an open email invitation to complete an online questionnaire, making this a self-selected sample. Responses were received electronically and managed by a University staff member outside of the research team. Identifiers were removed and anonymised data from the 216 respondents were supplied to the research team electronically in an Excel spreadsheet. Where possible, similar questions were used for students and staff to allow a comparison of views. The wording of the two questionnaires was modified to suit the different experiences of students and staff. Likert-like scales that included a neutral position and a "don't know" response were used. Open-ended questions were used to generate more qualitative data. General understandings of what SA did and how it worked were obtained through questions put to all student and staff respondents, whether they were users or not.

Analysis

The quantitative data generated by the two surveys were separately subjected to a simple descriptive analysis. Frequencies of Likert scale items were aggregated; for example, "strongly agree" and "agree" were treated as one. All analyses were completed using the svy suite of commands in Stata 12.1 software (2011). Descriptive statistics such as percentages of respondents with relevant characteristics were generated. Confidence intervals were calculated that accounted for the specificities of the corresponding sampling design, such as sampling weights and possible cluster effects from students in the same paper. In relevant situations, percentages were compared for the staff and student samples (e.g., Austin & Hux, 2002). The qualitative data generated by the open-ended questions and requests for comment were analysed and loosely categorised by topic using a simple text analysis (Bond et al., 2013).

Students' and staff understandings of SafeAssign

All the students in the student sample were enrolled in papers that used SA; however, many students were unaware that SA was being used in their papers. Only 258 of the 326 student respondents (79%) identified that they had used SA, and some of these students indicated their belief that SA's main function was as a platform for assignment submission. Of the 216 staff who responded to the survey, only 97 (45%) used SA in the papers they taught; however, all staff were asked about the functions of SA. Students' and staff responses about the functions of SA are displayed in Figure 1 below.
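The cluster-adjusted confidence intervals described under Analysis can be sketched as follows. This is an illustration of the linearised (ratio-estimator) variance for a one-stage cluster sample, with made-up data; it is not the authors' Stata code.

```python
import math

def cluster_proportion_ci(clusters: list[tuple[int, int]], z: float = 1.96):
    """95% CI for a proportion under one-stage cluster sampling.

    `clusters` holds (agree_count, cluster_size) pairs, one per cluster
    (here, one per paper); at least two clusters are required. Uses the
    linearised variance of the ratio estimator, the same quantity survey
    software estimates when clustering is declared.
    """
    k = len(clusters)
    agree = sum(a for a, _ in clusters)
    total = sum(m for _, m in clusters)
    p = agree / total
    # Sum of squared cluster residuals around the overall proportion.
    resid = sum((a - p * m) ** 2 for a, m in clusters)
    se = math.sqrt((k / (k - 1)) * resid / total ** 2)
    return p, max(0.0, p - z * se), min(1.0, p + z * se)
```

With three hypothetical papers of ten students each, of whom 8, 9 and 7 agreed, this gives p = 0.80 with a confidence interval of roughly (0.69, 0.91).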
Most students and staff displayed awareness of the main functions of SA as text matching and the production of a percentage score. Slightly more students (84%) than staff (79%) strongly agreed or agreed that SA identified words and phrases that matched words and phrases in published text. However, fewer staff (59%) than students (74%) were aware that SA provides a percentage score of matched text. In the staff responses, there was little difference between the responses of users and non-users for these two items, but all of the "don't know" responses belonged to non-users. One user commented, "SA can also indicate very good referencing, quotations etcetera will match, so you have to look at the highlighted document; the score doesn't mean much". A very high proportion of students (90%) but fewer staff (70%) thought that SA identified plagiarism. Again

there was little difference between the responses of staff users and non-users, and all the "don't know" responses belonged to non-users. One staff member noted, "SA can detect other students' work but only if it has been previously submitted to SA. However it cannot always tell you the name of the original student". One student commented, "All I know is that it checks for plagiarism".

Students and staff were less aware of other aspects of SA, and their responses also differed from each other to varying degrees. About 43% of students and 32% of staff thought that SA was able to indicate the need for a reference, though many respondents disagreed with this assertion; approximately 16% of students and 31% of staff indicated that they did not know whether this was the case. "Don't know" responses from staff users accounted for 27%. Only 24% of students thought that SA provided information on parts of the assignment that could be changed, compared with 35% of staff. Slightly more staff (15%) than students (8%) thought that SA taught students about referencing; however, 32% of staff indicated their belief that SA identified poor referencing. When asked if SA deterred students from plagiarising, 61% of staff either agreed or strongly agreed. One student responded, "I do not know anything about it, but would like to learn".

[Figure 1: Students' and staff understandings of the functions of SafeAssign, with 95% CI. Percent strongly agree/agree for each item: identifies words and phrases that match words and phrases in text; identifies plagiarism; identifies the % score of how much the assignment is plagiarised; identifies the need for a reference; provides information on parts of the assignment that could be changed; teaches me about referencing*; indicates poor referencing (staff only); deters students from plagiarising (staff only). *Staff were asked a slightly different version: "Provides feedback about what the student needs to learn about referencing".]

[Figure 2: Students' and staff understandings of the sources available to SA to match text, with 95% CI. Sources asked about: academic journals; Internet web pages; theses; subscription databases; textbooks; other students' assignments from the university; search engines such as Google and Google Scholar; assignments written by classmates; students' assignments from other universities; other books (staff only); don't know (students only).]

All students and all staff were asked about the sources SA is able to access for text matching. Responses showed a lack of knowledge. The results are summarised in Figure 2 above. One of the main sources SA accesses is subscription databases such as ProQuest, Academic OneFile or JSTOR; however, only 53% of students and 40% of staff were aware of this. Although SA does not access theses or textbooks, approximately 50% of students and 25% of staff thought that it did. Furthermore, only 60% of students and 50% of staff were aware that SA checks against archives of previous students' assignments, and only 54% of students and 48% of staff identified that it also checks against assignments written by others in the same class. Both staff users and non-users indicated similar levels of agreement for these items.

Discussion

The results of this research show that students and staff believe that SA can do more than it does. Although SA does not identify plagiarism, 90% of the student respondents and 70% of the staff respondents in this study thought that it did. These results indicate that both students and staff are confused about the difference between text matching and plagiarism, and may be conflating them. This misconception makes it more likely that SA will be perceived solely as a plagiarism detection tool, with the software's educative function ignored.
The participants' lack of awareness of SA as a tool to identify areas of an assignment that could be reworked or more effectively referenced provides further indication that the educative function of SA was not being fully utilised by students and staff. This finding is consistent with the literature, which reports that students in particular are unaware of the role TMS can play in academic writing (Löfström & Kupila, 2013). In order for TMS to function as an educative tool, staff must use the report when working with students to identify the problem and a possible resolution. In addition, students need to have the opportunity to resubmit the assignment to ensure the solution is effective (McKeever, 2006). In this way, staff can work effectively with students struggling with paraphrasing practices or correct citation. However, this research suggests that staff and students focused on the percentage score produced by the software and may not understand that the SA report can be used in this way.

Despite overestimating the functions of SA, neither students nor staff had an accurate understanding of the sources of text that SA can use for matching. In general, both students and staff underestimated the range of sources that SA uses for checking, and indicated their belief that it can check against sources that it cannot (for example, textbooks). One of the more concerning results of this research is that many students were unaware that SA was being used in their papers, and many students believed SA's main function was assignment submission. Although one of the reported features of TMS is its ease and convenience for the submission of work (Evans, 2006; Ledwith &

Risquez, 2008), submission of assignments is not a primary function of SA. Students in this study indicated that they should be made aware of the use of SA in their papers.

The results of this research indicate that both students and staff require more information on the functions and use of SA. In particular, they require more information on the educative function of SA as a means to reduce unintentional plagiarism. The results of the survey have been fed back to CALT, with a recommendation that the use of SA be continued, but that more resources be directed into education and support for students and staff using SA.

Acknowledgements

This research was funded by a University of Otago Quality Enhancement Grant.

References

Atkinson, D., & Yeoh, S. (2008). Student and staff perceptions of the effectiveness of plagiarism detection software. Australasian Journal of Educational Technology, 24(2), 222-240.

Austin, P., & Hux, J. (2002). A brief note on overlapping confidence intervals. Journal of Vascular Surgery, 36(1), 194-195.

Badge, J. (2010). How effective are electronic plagiarism detection systems and does it matter how you use them? Reviewing the evidence. Paper presented at the 4th International Plagiarism Conference: Towards an Authentic Future, Northumbria University. Retrieved from http://plagiarismadvice.com/documents/conference2010/papers/4ipc_0023_final.pdf

Badge, J., Cann, A. J., & Scott, J. (2007). To cheat or not to cheat? A trial of the JISC plagiarism detection service with biological sciences students. Assessment & Evaluation in Higher Education, 32(4), 433-439.

Bond, C., Adam, L., Samaranayaka, A., Finn, K., Senior, A., Cross, D., & Deaker, L. (2013). Plagiarism and the use of SafeAssign at the University of Otago: A report prepared for the Quality Advancement Office and the Committee for the Advancement of Learning and Teaching (CALT). University of Otago.

Evans, R. (2006). Evaluating an electronic plagiarism detection service: The importance of trust and the difficulty of proving students don't cheat. Active Learning in Higher Education, 7(1), 87-99.

Green, D., Lindemann, L., Marshall, K., & Wilkinson, G. (2005). Student perceptions of a trial of electronic text matching software: A preliminary investigation. Journal of University Teaching and Learning Practice, 2(3).

Ledwith, A., & Risquez, A. (2008). Using anti-plagiarism software to promote academic honesty in the context of peer reviewed assignments. Studies in Higher Education, 33(4), 371-384.

Löfström, E., & Kupila, P. (2013). The instructional challenges of student plagiarism. Journal of Academic Ethics, 11, 231-242.

McKeever, L. (2006). Online plagiarism detection services - saviour or scourge? Assessment & Evaluation in Higher Education, 31(2), 155-165.

Park, C. (2003). In other (people's) words: Plagiarism by university students - literature and lessons. Assessment & Evaluation in Higher Education, 28(5), 471-488.

Please cite as: Adam, L., Bond, C., & Samaranayaka, A. (2014). "The score doesn't mean much": Students' and staff understandings of text matching software. In B. Hegarty, J. McDonald, & S.-K. Loke (Eds.), Rhetoric and Reality: Critical perspectives on educational technology. Proceedings ascilite Dunedin 2014 (pp. 399-403).

Note: All published papers are refereed, having undergone a double-blind peer-review process. The author(s) assign a Creative Commons by attribution licence enabling others to distribute, remix, tweak, and build upon their work, even commercially, as long as credit is given to the author(s) for the original creation.