Improving Student Writing Abilities in Geography: Examining the Benefits of Criterion-Based Assessment and Detailed Feedback


Journal of Geography, ISSN: 0022-1341 (Print), 1752-6868 (Online). Journal homepage: http://www.tandfonline.com/loi/rjog20

To cite this article: Joseph Leydon, Kathi Wilson & Cleo Boyd (2014) Improving Student Writing Abilities in Geography: Examining the Benefits of Criterion-Based Assessment and Detailed Feedback, Journal of Geography, 113:4, 151-159, DOI: 10.1080/00221341.2013.869245. To link to this article: http://dx.doi.org/10.1080/00221341.2013.869245

Published online: 23 Jan 2014. © 2014 National Council for Geographic Education. Downloaded by [Turku University] on 15 September 2015.

Improving Student Writing Abilities in Geography: Examining the Benefits of Criterion-Based Assessment and Detailed Feedback

Joseph Leydon, Kathi Wilson, and Cleo Boyd

ABSTRACT
Poor-quality writing among undergraduate geography students is a significant concern for university instructors. This article reports on a multipronged strategy aimed at improving student writing in a large, first-year human geography course. The strategy emphasized ways to provide effective feedback through teaching assistant training, criterion-referenced assessment, draft and final submissions, peer review, and in-class writing exercises. Writing activities focused on building geographic understanding by emphasizing geographical content and spatial connections through map and data interpretation. The success of the strategy was evaluated by examining student grades as well as the quality and content of their written work.

Key Words: writing skills, critical pedagogy, geographic education, undergraduate education

Joseph Leydon is a lecturer in the Department of Geography at the University of Toronto Mississauga, Canada, where he teaches a number of large classes in human geography. His primary research interests are in pedagogy and experiential learning.

Kathi Wilson is professor and chair of the Department of Geography at the University of Toronto Mississauga, Canada. She has taught a range of first-year and upper-level human geography courses and is a strong advocate of Universal Design for Learning.

Cleo Boyd is an instructional designer and educational developer and former director of the Robert Gillespie Academic Skills Centre at the University of Toronto Mississauga, Canada.

INTRODUCTION
Many human geography courses require students to annotate articles, write short paragraphs that describe and interpret geographic data, write essays, and answer examination questions.
However, the writing skills of university students remain an ongoing cause of concern among professors (Cadwallader and Scarboro 1982; Gambell 1987; Guise et al. 2008). Poor writing skills are a particular concern given that writing is directly linked with academic progress (English et al. 1999). Other researchers suggest that professors do not teach students how to write because they assume students enter university with the required writing skills (Kautzman 1996; Fitzgerald 2004; Moni et al. 2007) or will acquire such skills over time with practice (Pain and Mowl 1996). In addition, large class sizes and limited time and resources can prevent professors from teaching writing to first-year students in courses that are not explicitly designed to do so (Hay and Delaney 1994; Moni et al. 2007).

Writing in geography at the university level is further affected by the lack of adequate preparation in geography at the high school level, where course offerings are limited and few students take geography beyond grade nine (Bednarz et al. 2006). Consequently, many students enter first-year courses with a partial understanding of the subject matter and a limited ability to think about or communicate geographic content. In addition, students in introductory geography courses fail to recognize the importance of writing to the discipline and frequently consider writing an onerous task (Slinger-Friedman and Patterson 2012).

In recent years, instructors have adopted a number of innovative approaches in university-level geography courses to improve skills in written communication and to enhance students' abilities to think geographically. Indeed, many geography instructors argue that writing in the classroom is important for students to learn to think like geographers (Libbee and Young 1983; Hooey and Bailey 2005; McGuinness 2009; Slinger-Friedman and Patterson 2012).
Building upon innovative approaches adopted by other geography instructors, in this article we describe and evaluate a multipronged strategy for improving the writing skills and geographic knowledge of undergraduates in a first-year human geography course.

IMPROVING STUDENT WRITING IN GEOGRAPHY: AN OVERVIEW OF INNOVATIONS
The literature on writing in geography demonstrates that merely increasing the number of written assignments does not directly improve writing abilities. Rather, writing assignments coupled with clear instruction, effective feedback, and opportunities for revision are key (Kennedy-Kalafatis and Carleton 1996; Manzo 2002; Heyman 2004). Kennedy-Kalafatis and Carleton (1996) show that the submission of multiple drafts, peer review, and self-editing are linked with improvements in writing, including reductions in grammatical and punctuation errors and an increase in identifiable thesis statements between draft and final submissions. A number of strategies designed to improve student writing (and learning) demonstrate the importance of providing students with detailed criteria to help them understand how their written work will be evaluated (for example, see Hay and Delaney 1994). Work by Summerby-Murray (2010) shows that in the absence of detailed and explicit criteria, students struggle to engage in narrative writing. Both Cook (2000) and Park (2003) demonstrate the utility of marking criteria to clarify the importance of specific tasks and to provide students with more detailed feedback in journal writing.

Evaluations of writing initiatives in geography courses tend to focus on both instructor and student reflections (Cook 2000; Sivan 2000; Haigh 2001; Hyers 2001; Park 2003; Savin-Baden and Van Nierkerk 2007; McGuinness 2009; Summerby-Murray 2010; Slinger-Friedman and Patterson 2012). With the exception of Hooey and Bailey (2005), little quantitative research focuses on the relationship between writing intervention strategies and student scores on tests and examinations. This lack of quantitative analysis relates, in part, to the use of writing interventions such as reflective journals, creative writing assignments, and narratives that are difficult to assess empirically. In a recent paper, Slinger-Friedman and Patterson (2012) report that even when students believe an intervention has a positive effect, their writing in examinations remains poor. They conclude that the lack of improvement in performance indicates problems of alignment between writing activities and examination questions.

The findings in the literature with respect to group writing and peer review are quite mixed. In their assessment of the utility of peer review for improving essay writing, Hay and Delaney (1994) find that the majority of students believe that peer review (in this case, writing groups) increased their understanding of the essay-writing process. In contrast, Pharo and Kristy (2009) show that while instructors perceive peer review to be useful (i.e., because it emphasizes active learning), only a minority of first-year geography students actually believe it helps them better understand the assessment. Mowl and Pain (1995) (see also Pain and Mowl 1996) conclude that the value of peer review in improving student learning lies mainly in formative assessment.
While a number of different strategies exist to improve and enhance students' written communication skills, the literature indicates that no one strategy will universally result in improvements. Integrated approaches that involve multiple strategies (e.g., a combination of criterion-based assessment and peer review) are effective for improving writing in geography (Hay and Delaney 1994; Kennedy-Kalafatis and Carleton 1996; Heyman 2004; Levia and Quiring 2008).

COURSE BACKGROUND AND MULTIPRONGED WRITING INITIATIVE STRATEGY
The writing initiative occurred in a first-year introductory geography course that runs from September to April with an annual enrollment of approximately 250-300 students. The class meets twice weekly for lectures and once weekly for tutorials, with approximately twenty-five students in each tutorial group. Weaknesses in student writing included problems with grammar and spelling, poor organization, an inability to analyze geographic ideas, and weak conceptualization and understanding of course content. The writing problems identified in our course reflect problems in first-year courses across departments at the university. In response, the Office of the Dean recognized student writing as a priority and established the Dean's Pilot Project for Writing Development, a fund that supports the implementation of writing initiatives in individual courses. Individual course instructors worked with the director of the Robert Gillespie Academic Skills Centre (RGASC) to develop a writing strategy for their courses. Initiatives received funding initially for one year, with the possibility of renewal based on the success of the initiative. The funding supports an increase in teaching assistant allocations to facilitate writing initiatives.
The goal of our writing initiative was to move students from a surface to a deep approach to learning geography as exhibited through their writing (Marton and Saljo 1976a, 1976b; Biggs 1987; Biggs 1999; Drummer et al. 2008). Our strategy emphasizes writing as a process and as a way of learning to think geographically. The strategy includes a number of important elements: criterion-based assessment, the submission of both a draft and a final product, the use of multiple forms of feedback, and in-class writing activities. The writing initiative has been in place since 2006/2007 and has undergone a number of changes over the past six years. In the first three years, the strategy focused on improving written assignments; more recently it has focused on improving written test answers.

Criterion-Based Assessment
A common weakness in student writing is that the work students submit often does not reflect what instructors want (Gambell 1987; Pain and Mowl 1996; Biggs 1999; Weimer 2002). We used criterion-based assessment (CBA) to help students better understand expectations ahead of time. We developed criterion-based assessments that are authentic to each activity (i.e., they avoid generic criteria) and transparent (i.e., everyone knows what is being assessed, how it is being assessed, and why). For each assignment, the course instructor and the director of the RGASC codesigned a grading rubric. The rubrics varied because they are unique to each assignment; however, all contain expectations of precision in response to the task, selection and organization of relevant information, and depth and clarity of thought.

Drafts
The literature reveals that students may exhibit poor writing skills because they submit work that is not adequately developed or considered (i.e., first drafts) (Nightingale 1991, 5).
The use of near-final and final drafts has been shown to be an effective strategy for improving student writing (Kennedy-Kalafatis and Carleton 1996). As Heyman (2004, 148) asserts, "without revision... students have no incentive for improving their writing or learning from the instructors' comments." As part of our strategy,

we required students to submit both a draft and a final submission for their assignments. The draft and final versions received equal weighting to ensure that all students submitted a high-quality draft. This encouraged students to begin working on assignments much earlier and gave us the opportunity to provide them with important feedback that they could act on prior to their final submission. The questions in the assignments required the construction of descriptive paragraphs (what and where?) through data interpretation (why?) and logical argument development to move students to deeper learning.

Effective Feedback
Levia and Quiring (2008, 217) argue that in order to maximize student learning, instructors must provide students with "appropriate, explicit and timely feedback." Similarly, others suggest feedback is the single most important influence on student achievement (Hattie 1987, cited in Hulme and Forshaw 2009, 34). An important part of the initiative was preparing teaching assistants to engage in the instruction of undergraduates and to provide effective feedback. We selected teaching assistants with a high degree of knowledge of the course content and good basic writing skills. Teaching assistants attended training sessions led by the director of the RGASC on how to apply the criteria to student writing, explain the criteria to students, and provide effective feedback. We also adopted a mandatory undergraduate writing textbook for the course, Making Sense: A Student's Guide to Research and Writing (Northey and Knight 2007). Teaching assistants were required to become familiar with the textbook and to refer students to appropriate sections. We held mock grading sessions for each written assignment during which each teaching assistant, the course instructor, and the director of the RGASC individually graded two or three assignments.
The group as a whole discussed and evaluated the grading performed by each teaching assistant. Finally, the course instructor and the director of the RGASC conducted reliability checks throughout the semester to ensure consistency in grading and effectiveness of feedback.

To facilitate the process of providing meaningful feedback, two deadlines were set: one for the draft and one for the final submission. The teaching assistants assessed the draft according to the criteria and provided detailed written feedback to students within one week of submission. The written feedback focused not only on syntax and organization but also on whether students addressed the task and conveyed an understanding of geographic concepts and ideas. Students revised their written work and submitted their final assignment the following week. Feedback on the final submission not only focused on areas of weakness but also identified areas where students had improved their writing relative to their draft submission. The criterion checklist, which contained the draft and final grade assigned to each component, also enabled students to see numerically the areas in which their writing improved.

We introduced peer assessment as part of the writing strategy in 2008/2009 to provide students with firsthand experience of how to apply the criteria. We hoped that by assessing another student's written work (according to the assignment criteria), students would gain a better understanding of how the teaching assistants would apply the criteria to their own work. To evaluate the impact of the training that the teaching assistants received, the instructor and the director of the RGASC conducted focus groups. The teaching assistants reported that the training and their subsequent interactions with the students added depth to their own geographic thinking and improved their confidence in assessing and giving effective feedback to students.
Several teaching assistants reported that for the first time they understood how using writing as a process to gain depth of understanding of content actually works.

EVALUATING THE WRITING INITIATIVE: DRAFTS AND FINAL ASSIGNMENTS
We evaluated the assignment portion of the initiative by comparing the average grades students earned on their draft and final submissions. Table 1 indicates that the average grade increased between the draft and final submission in all three years. This suggests that student writing improved not only between the draft and final submission but also over the course of the semester, although it is difficult to make direct comparisons since each writing assignment was unique. The teaching assistants indicated that the majority of students demonstrated a higher level of geographic understanding and made significant organizational and structural changes to their assignments based on the comments provided on the drafts.

Table 1. Draft and final assignment grades (average %).

              2006/2007       2007/2008       2008/2009
Assignment    1st    2nd      1st    2nd      1st    2nd
Draft         53     66       60     55       62     59
Final         75     79       77     72       77     73

In 2008/2009, we conducted a short, anonymous formal feedback survey at the end of the course. The survey collected information on students' perceptions of the writing initiative focused on assignments and its effect on their overall writing abilities. One hundred and eighteen students, representing 54 percent of the final class enrollment, completed the survey. The results revealed that the majority of respondents perceived the draft and final submission to be helpful, with 56 percent of respondents giving a rating of very helpful, 41

percent stating it was somewhat helpful, and only 3 percent indicating it was not very helpful. In addition, 90 percent of the students who completed the survey reported that their writing skills had improved because of the initiative. Of the eleven students who did not think their writing skills improved, two were in their final year and felt that they had already acquired ample writing skills during their university studies. Three stated that their writing did not improve because they did not put forth an honest effort in writing their final assignments. The remaining students reported that the assessments were too subjective or that the feedback focused too much on content.

The students who indicated that their writing skills improved identified key areas of improvement; since the question was open-ended, some students reported more than one area. Students cited improvements in their understanding of geographic content, improvements in grammar, and feedback from their teaching assistants that helped them understand how to organize written answers to questions, including the proper use of paragraphs and keeping similar themes together.

Evaluation of their written performance demonstrated an improvement in students' understanding of geographic concepts and ideas between the first and final drafts. For example, one assignment based on population geography required students to construct two population pyramids (one for the province of Québec and one for the territory of Nunavut). The students were required to answer two written questions based on the population pyramids they had created (Appendix 1). In the first draft, most students did not compose their answers as a true comparison of the two pyramids.
Instead, they wrote as if describing each population pyramid separately (i.e., with respect to shape, they did not indicate that the pyramid for Nunavut had a broader base than the pyramid for Québec, which in turn was relatively wider at the top than the pyramid for Nunavut). In addition, the implications of population growth described in the first draft led us to believe that many of the students did not grasp the basic geographic implications of population dynamics. Answers were simplistic, superficial, and reflected a lack of critical thought (e.g., many students simply stated that the faster-growing population in Nunavut would result in a younger population). In contrast, in the final submission most students not only provided a clearly written and detailed comparison of the two pyramids, they also identified convincing implications of the population growth observed (e.g., students noted that faster growth results in a younger population, but also that a younger population places specific demands on the territory, including health care and education, and raises questions about competition for employment when that cohort enters the job market), thereby demonstrating an improvement in their geographic understanding. We observed similar improvements in geographic understanding and critical thinking in the final submissions of the other course assignments.

CLASS TESTS AND FINAL EXAMINATIONS
For the final two years of the initiative, the focus changed from assignments to students' written responses in class tests and final examinations. While the first writing initiative we developed improved student writing on assignments, student writing on class tests and the final examination remained problematic (see also Slinger-Friedman and Patterson 2012). Key problems related to organization, an inability to analyze ideas, and weak conceptualization and understanding of course content.
Evidence of deep learning was absent from tests and examinations, with answers often superficial and reflecting a lack of critical thought. This is not surprising given the context of tests and examinations, with students facing time constraints and pressure to complete a task without access to resources. Consequently, we switched the focus to improving student performance on tests and the examination to help students make the transition to time-pressured examination writing. In the new initiative, we continued to emphasize deep learning, the effective communication of geographic concepts, and clarity in written expression. Again, we employed criterion-based assessment, with an evaluation of three criteria: quality of the response (i.e., content), quality of organization, and quality of writing (Appendix 2). We also had students write drafts and receive feedback in weekly tutorials. Students completed tasks including writing answers to specific questions with emphasis on content, organization, and writing quality. Similar types of questions appeared in formal course class tests and the final examination to ensure appropriate alignment between the tutorial activities and the formal assessment. Questions typically had multiple parts following a scaffolding strategy, with a progression from information provision to assessment of the information to commentary on its relevance. Questions often involved the assessment of geographic data and the interpretation of graphs and maps, which are essential components of a geographical education. In the example of population pyramids, questions tested geographic knowledge (e.g., the key characteristics of a population pyramid), the application and interpretation of such knowledge (e.g., using the data to describe population growth or decline), and the ability to contextualize geographic knowledge (e.g., regional variations in population growth/decline, relevance of geographic data).
Teaching assistant training followed a pattern similar to the writing initiative focused on assignments. We held weekly meetings to train teaching assistants to deliver instruction in content and skills during tutorials, and to ensure that they understood the criteria, could apply the criteria to student writing, could explain the criteria to students, and knew how to provide effective individual feedback to students on their performance. We conducted mock grading sessions where teaching assistants, the course

instructor, and the director of the RGASC scored, for each category of the grading criteria, a number of randomly selected student responses and discussed the scores with reference to the grading criteria. Such sessions were vital to ensure that the teaching assistants evaluated student answers by applying the grading criteria rather than by grading relative to other answers. The instructor carried out reliability checks prior to returning work to students.

EVALUATING THE WRITING INITIATIVE: TESTS AND FINAL EXAMINATIONS
We recorded the grades for quality of the response, quality of organization, and quality of writing for each test and monitored the changes in performance. Student scores improved in all categories over the course of the academic year, with most students achieving a full-point increase and some a two-point increase (see Table 2).

Table 2. Scores on written tests, 2011/2012 (% of students).

        Quality of response      Quality of organization    Quality of writing
Score   Test 1  Test 2  Final    Test 1  Test 2  Final      Test 1  Test 2  Final
0         10      7       6        5       3       3          2       2       2
1         40     33      25       32      23      17         29      25      23
2         34     40      43       41      46      48         41      39      36
3         12     14      17       18      22      25         25      28      33
4          4      6       9        4       6       8          3       6       6

The scores for the quality of the response (i.e., a student's ability to convey geographic concepts and ideas) were consistently lower than the scores for the other criteria. This indicates that students continued to have difficulty identifying the appropriate information with which to answer questions. The quality of student answers was consistently higher in description than in application or analysis. On the other hand, improvements in all three areas occurred over the course of the term. It is also important to note that as quality-of-response skills improved, so too did writing skills.
The more clearly students were able to analyze the task contained within a question, the more clearly they were able to organize their thinking and write about the task. In essence, as a student's ability to understand a particular geographic concept improved, so too did the ability to organize a coherent answer and write about the concept. There were marked improvements in precision and level of detail between responses in the in-class written exercises and in test/examination answers. Students improved in their ability to respond with information that addressed the specifics of the questions. There was also increased engagement with geographic concepts such as location and spatial variation, an improved ability to identify and comment on location-specific issues, and a greater willingness to address interrelationships between locations.1 These improvements further indicate an enhanced ability to deal with the timing pressures associated with tests and examinations. The results suggest that, with practice and focused instruction, students can isolate relevant information to respond to examination questions and structure a response that demonstrates an understanding of geographic concepts.

Peer assessment in this writing initiative produced mixed results. Specifically, when applying the grading criteria, students assigned scores that were either too high or too low, with few grading accurately. This is a concern because it raises questions about students' ability to apply criteria in a uniform manner when formulating their own written responses to questions. Criterion-based assessment requires scoring against a set of explicitly stated criteria rather than comparison of peer answers, and it demands higher-order skills that undergraduate students, especially those in their first year of study, rarely possess.
In addition, the teaching assistants frequently reported that students who assigned low scores when grading sample answers had difficulty, during one-on-one consultations in office hours, applying the same low scores to their own work even when it was of similar quality. These observations led us to question the effectiveness of peer assessment for this type of initiative, especially with first-year students who are just beginning to develop higher-order thinking skills.

DISCUSSION AND CONCLUSIONS

We designed the strategy evaluated in this article to improve the written communication of geographic concepts and ideas using criterion-based assessment, draft submissions, and peer assessment. Our evaluation indicates that the writing strategy did result in improved student writing. Specifically, we observed enhanced development of geographic understanding, evident in the differences in the content of written answers between draft and final submissions and in improved responses on written tests, with most students' written work demonstrating a clearer understanding of the relevant geographic concepts and ideas. Similarly, teaching assistants noticed major organizational and content-based changes in the final assignments. As such, our results suggest that the submission of draft and final versions of assignments and in-class writing exercises provides an ideal opportunity for the provision of effective feedback. Consistent with other authors (Nightingale 1991; Kennedy-Kalafatis and Carleton 1996), our results show that providing students with meaningful and effective feedback enhances their ability to improve the content (i.e., geographic concepts and ideas), organization, and writing quality of their work.

Joseph Leydon, Kathi Wilson, and Cleo Boyd

While feedback is an important mechanism for improving student writing, our attempt to improve writing through peer assessment received mixed reviews. Other research has likewise shown that although peer assessment results in improved writing, and students often enjoy the peer assessment process, most students are not convinced that it is of benefit (Pain and Mowl 1996; Pharo and Kristy 2009). The two main complaints our students raised about the peer assessment process were that peers did not have the expertise to assess work and that peers did not take the process seriously (i.e., they provided minimal feedback and assigned higher grades than the teaching assistants). In hindsight, a weakness of the peer review process was our failure to emphasize strongly enough to students that its real purpose was to give them a better understanding of how the criteria would be applied to their own work when it was assessed by the teaching assistants. In the future, this could be explained to students in the form of a workshop (see Pain and Mowl 1996) in which they learn why and how written work will be assessed and participate in discussions about the assessment criteria. Nevertheless, we feel that even with such changes peer assessment will remain problematic, as it calls upon higher-order skills not sufficiently developed in a first-year undergraduate population. Regardless of the problems in using peer assessment, our results indicate that a multipronged strategy can be effective at improving students' ability to communicate geographic concepts and ideas in written form. Such strategies should begin in first-year courses and continue through upper-level undergraduate courses in geography. Students' critical writing abilities will accrue only through a consistent, developmental approach across the geography curriculum.

NOTE

1.
For additional examples of assignment and examination questions, please contact the authors.

REFERENCES

Bednarz, S. W., R. S. Bednarz, S. Dickson Mansfield, R. Dorn, and M. Libbee. 2006. Geographical education in North America. In Geographical Education in a Changing World: Past Experience, Current Trends and Future Challenges, ed. J. Lidstone and M. Williams, pp. 107–126. Dordrecht, The Netherlands: Kluwer Academic Publishers.
Biggs, J. 1987. Process and outcome in essay writing. Research and Development in Higher Education 9:114–125.
Biggs, J. 1999. Teaching for Quality Learning at University: What the Student Does. Buckingham, UK: Society for Research into Higher Education and Open University Press.
Bowman, N. A. 2010. Can 1st-year college students accurately report their learning and development? American Educational Research Journal 47 (2): 466–496.
Cadwallader, M. L., and M. C. Scarboro. 1982. Teaching writing within a sociology course. Teaching Sociology 9 (4): 359–382.
Cook, I. 2000. Nothing can ever be the case of us and them again: Exploring the politics of difference through border pedagogy and student journal writing. Journal of Geography in Higher Education 24 (1): 13–27.
Dummer, T. J. B., I. G. Cook, S. L. Parker, G. A. Barrett, and A. P. Hull. 2008. Promoting and assessing deep learning in geography fieldwork: An evaluation of reflective field diaries. Journal of Geography in Higher Education 32 (3): 450–479.
English, L., H. Bonanno, T. Ihnatko, C. Webb, and J. Jones. 1999. Learning through writing in a first-year accounting course. Journal of Accounting Education 17 (3): 221–254.
Fitzgerald, M. A. 2004. Making the leap from high school to college: Three new studies about information literacy skills of first-year college students. Knowledge Quest 32 (4): 19–24.
Gambell, T. J. 1987. Education professors' perceptions of and attitudes toward student writing. Canadian Journal of Education 12 (4): 495–510.
Guise, J., J. Goosney, S. Gordon, and H. Pretty. 2008. Evolution of a summer research/writing workshop for first-year university students. New Library World 109 (5/6): 232–250.
Haigh, M. J. 2001. Constructing Gaia: Using journals to foster reflective learning. Journal of Geography in Higher Education 25 (2): 167–189.
Hay, I., and E. J. Delaney. 1994. Who teaches learns: Writing groups in geographical education. Journal of Geography in Higher Education 18 (3): 317–334.
Heyman, R. 2004. Inventing geography: Writing as a social justice pedagogy. Journal of Geography 103 (4): 139–152.
Hooey, C. A., and T. J. Bailey. 2005. Journal writing and the development of spatial thinking skills. Journal of Geography 104 (6): 257–261.
Hulme, J., and M. Forshaw. 2009. Effectiveness of feedback provision for undergraduate psychology students. Psychology Learning and Teaching 8 (1): 34–38.
Hyers, A. D. 2001. Predictable achievement patterns for student journals in introductory earth science courses. Journal of Geography in Higher Education 25 (1): 53–66.
Kautzman, A. M. 1996. Teaching critical thinking: The alliance of composition studies and research instruction. Reference Services Review 18 (2): 113–120.

Kennedy-Kalafatis, S., and D. Carleton. 1996. Encouraging peer dialogue in the geography classroom: Peer editing to improve student writing. Journal of Geography in Higher Education 20 (3): 323–341.
Levia, D., and S. Quiring. 2008. Assessment of student learning in a hybrid PBL capstone seminar. Journal of Geography in Higher Education 32 (2): 217–231.
Libbee, M., and D. Young. 1983. Teaching writing in geography classes. Journal of Geography 82 (6): 292–293.
Manzo, J. T. 2002. Grading book review assignments. Journal of Geography 101 (3): 121–135.
Marton, F., and R. Saljo. 1976a. On qualitative approaches to learning: 1. Outcome and process. British Journal of Educational Psychology 46 (1): 4–11.
Marton, F., and R. Saljo. 1976b. On qualitative approaches to learning: 2. Outcome as a function of learners' conception of task. British Journal of Educational Psychology 46 (2): 115–127.
McGuinness, M. 2009. Putting themselves in the picture: Using reflective diaries in the teaching of feminist geography. Journal of Geography in Higher Education 33 (3): 339–349.
Moni, R. W., K. B. Moni, L. J. Lluka, and P. Poronnik. 2007. A novel writing assignment to engage first year students in large human biology classes. Biochemistry and Molecular Biology Education 35 (2): 89–96.
Mowl, G., and R. Pain. 1995. Using self and peer assessment to improve students' essay writing: A case study from geography. Innovations in Education and Technology 32 (4): 324–335.
Nightingale, P. 1991. Speaking of student writing... Journal of Geography in Higher Education 15 (1): 3–13.
Northey, M., and D. B. Knight. 2007. Making Sense: A Student's Guide to Research and Writing. New York: Oxford University Press.
Pain, R., and G. Mowl. 1996. Improving geography essay writing using innovative assessment. Journal of Geography in Higher Education 20 (1): 19–31.
Park, C. 2003. Engaging students in the learning process: The learning journal. Journal of Geography in Higher Education 27 (2): 183–199.
Pharo, E., and D. S. Kristy. 2009. Implementing student peer review: Opportunity versus change management. Journal of Geography in Higher Education 33 (2): 199–207.
Savin-Baden, M., and L. Van Niekerk. 2007. Narrative inquiry: Theory and practice. Journal of Geography in Higher Education 31 (3): 459–472.
Schwartz, M. 1984. Response to writing: A college-wide perspective. College English 46 (1): 55–62.
Sivan, A. 2000. The implementation of peer assessment: An action research approach. Assessment in Education 7 (2): 193–213.
Slinger-Friedman, V., and L. Patterson. 2012. Writing in geography: Student attitudes and assessment. Journal of Geography in Higher Education 36 (2): 179–195.
Summerby-Murray, R. 2010. Writing for immediacy: Narrative writing as a teaching technique in undergraduate cultural geography. Journal of Geography in Higher Education 34 (2): 231–245.
Weimer, M. 2002. Learner-Centered Teaching: Five Key Changes to Practice. San Francisco: Jossey-Bass.

APPENDIX 1: SAMPLE ASSIGNMENT

The purpose of this assignment is to examine Canadian population pyramids using data from the 2006 Census of Canada.

Assignment Part A: Creating Population Pyramids (30 marks)

Using Excel, create one population pyramid for British Columbia and one for Nunavut. Draw lines on both to identify the labor force, young dependents, and old dependents.

Assignment Part B: Interpreting Population Pyramids (30 marks)

Answer each question in no more than 250 words. All answers must be typed, double-spaced, and in 12-point font.

1) Compare and contrast the two pyramids (i.e., are they similar? how do they differ in terms of the type of population growth depicted, dependents, etc.?). (10 marks)

2) Identify one implication of population growth in British Columbia and one implication of population growth in Nunavut. Describe one policy that you would introduce in each region (the two policies should be different) to address the implication you identified. (20 marks)

Part A: Assessment Criteria (30 marks; 15 marks for each pyramid)

Does not meet expectations:
- Has not completed two population pyramids
- Calculations are incorrect
- Figures not completely and appropriately labeled
- Incorrect placement of lines to identify labor force, young, and old dependents

Meets expectations:
- Has completed two population pyramids
- Calculations are correct
- Each figure is properly labeled
- Correct placement of lines

Part B, Question #1: Assessment Criteria (10 marks)

Does not meet expectations:
- Does not compare and contrast the two population pyramids
- Answer does not reflect question
- Answer does not reflect basic understanding of concepts (does not reflect data in pyramid)
- Answer lacks clarity
- Spelling and grammatical errors

Meets expectations:
- Provides basic description that compares and contrasts the two pyramids
- Answer reflects question
- Answer supported by data in population pyramid
- No spelling and grammatical errors

Exceeds expectations:
- Answer demonstrates insight and thoughtfulness
- Answer exhibits originality

Part B, Question #2: Assessment Criteria (20 marks)

Does not meet expectations:
- Does not identify valid implications of population growth in Nunavut and British Columbia
- Does not identify policies to address implications

Meets expectations:
- Implications are identified and demonstrate basic understanding of concepts
- Identifies policies

Exceeds expectations:
- Implications and policies identified demonstrate initiative, originality, and in-depth understanding of concepts

APPENDIX 2: SAMPLE QUESTION AND RESPONSE DETAILS FOR TESTS/EXAMINATIONS

Part A: Identify the main features of a population pyramid.
Focus: General information/description of the structure of a population pyramid.
In-class written exercise response: Most answers focus on population divided by age and sex.
Test/examination response: Greater precision, with reference to calculations of age cohorts and to dependency.

Part B: Describe a population pyramid for a country experiencing rapid population growth.
Focus: Information and description specific to the pyramid type.
In-class written exercise response: Most answers refer to general shape; some reference to a high birth rate.
Test/examination response: Greater precision, with clear identification of high fertility, a large young-age population, and each successive age group larger than the one before it.

Part C: How is a pyramid displaying rapid growth different from a pyramid illustrating negative growth?
Focus: Information specific to the pyramid type; ability to identify and focus on differences.
In-class written exercise response: Answers tend to describe the pyramid types rather than enunciate the differences.
Test/examination response: Focus on the differences between young and old populations in both pyramids; identified differences in fertility; commented on population replacement.

Part D: Why are population pyramids important to the study of human geography?
Focus: Analytical/conceptual; connect population pyramids to a broader discussion in human geography.
In-class written exercise response: Very general statements on both population geography and human geography; answers not well related to the question; some connections made to population characteristics and development.
Test/examination response: Identification of some of the central themes in human geography (location, regional difference); comments on regional variations in population (growth, stability, decline); connection of population pyramids to development (pyramid types typical of developing and developed countries); comments on connections between population pyramids and future population concerns (youth dependency, age dependency, growth, decline).

Grading Rubric for Tests and Final Examination

Score 0
- Quality of answer: Question not attempted.
- Quality of organization: No organization.
- Quality of writing: Writing is illegible.

Score 1
- Quality of answer: Little understanding of the question; mostly irrelevant information.
- Quality of organization: Ineffective organization; difficult to follow.
- Quality of writing: Serious and recurring errors in grammar, punctuation, and structure.

Score 2
- Quality of answer: Some understanding of the question; some relevant information but much irrelevant information.
- Quality of organization: Some organization; some linking of ideas; poor transitions between ideas.
- Quality of writing: Errors in grammar, punctuation, and structure that interfere with understanding.

Score 3
- Quality of answer: Good understanding of the question; relevant information that is linked to the question.
- Quality of organization: Good organization; most ideas are linked; some effective transitions between ideas.
- Quality of writing: Minor errors in grammar, punctuation, and structure; occasional interference with understanding.

Score 4
- Quality of answer: Excellent understanding of the question; all relevant information provided; answer exhibits insight and thoughtfulness.
- Quality of organization: Excellent organization; all ideas are linked; excellent transitions between ideas.
- Quality of writing: No errors in grammar, punctuation, or structure; writing is clear, concise, and fluid.
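To make the rubric's levels concrete, here is a minimal sketch, an illustration of our own rather than the authors' tooling, that encodes the 0-4 answer-quality descriptors as a lookup table and validates a three-criterion score record. The descriptor strings abbreviate the published rubric, and the record format is an assumption for illustration.

```python
# The 0-4 answer-quality column of the rubric, abbreviated, as a lookup
# table. The score_record function and its dict format are illustrative
# assumptions, not part of the published grading procedure.
QUALITY_OF_ANSWER = {
    0: "Question not attempted",
    1: "Little understanding; mostly irrelevant information",
    2: "Some understanding; some relevant, much irrelevant information",
    3: "Good understanding; relevant information linked to the question",
    4: "Excellent understanding; all relevant information; insight",
}

def score_record(answer: int, organization: int, writing: int) -> dict:
    """Validate a per-criterion score triple (each 0-4) and label the answer."""
    for name, value in [("answer", answer),
                        ("organization", organization),
                        ("writing", writing)]:
        if value not in range(5):
            raise ValueError(f"{name} score must be 0-4, got {value}")
    return {"answer": answer,
            "organization": organization,
            "writing": writing,
            "answer_descriptor": QUALITY_OF_ANSWER[answer]}
```

Storing the three criteria separately, as the authors' Table 2 does, is what makes it possible to see that response quality lags organization and writing even as all three improve.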