Worked Examples are more Efficient for Learning than High-Assistance Instructional Software


DOI 10.1007/s40593-015-0046-z

ARTICLE

Worked Examples are more Efficient for Learning than High-Assistance Instructional Software

Bruce M. McLaren (1) · Tamara van Gog (2) · Craig Ganoe (1) · David Yaron (1) · Michael Karabinos (1)

© International Artificial Intelligence in Education Society 2015

Abstract  The assistance dilemma, an important issue in the Learning Sciences, is concerned with how much guidance or assistance should be provided to help students learn. A recent study comparing three high-assistance approaches (worked examples, tutored problems, and erroneous examples) and one low-assistance approach (conventional problems), in a multi-session classroom experiment, showed equal learning outcomes, with worked examples being much more efficient. To rule out that the surprising lack of differences in learning outcomes was due to too much feedback across the conditions, the present follow-up experiment was conducted, in which feedback was curtailed. Yet the results in the new experiment were the same: there were no differences in learning outcomes, but worked examples were much more efficient. These two experiments suggest that there are efficiency benefits of worked example study. Yet questions remain. For instance, why didn't high instructional assistance benefit learning outcomes, and would these results hold up in other domains?

Keywords: Assistance dilemma · Classroom studies · Empirical studies · Worked examples · Erroneous examples · Tutored problems to solve · Problem solving

* Corresponding author: Bruce M. McLaren, bmclaren@cs.cmu.edu
Tamara van Gog, vangog@fsw.eur.nl
Craig Ganoe, ganoe@acm.org
David Yaron, yaron@cmu.edu
Michael Karabinos, mk7@andrew.cmu.edu

1 Carnegie Mellon University, Pittsburgh, PA, USA
2 Erasmus University Rotterdam, Rotterdam, The Netherlands

Introduction

An important question for the Learning Sciences to answer is how much guidance or assistance should be provided in order to help students learn, i.e., the assistance dilemma (Koedinger and Aleven 2007). In a recent experiment (McLaren et al. 2014), the effectiveness and efficiency of three high-assistance instructional formats (which differ in the amount of student activity required) were compared to low-assistance problems to solve, which students have to attempt to solve largely on their own:

- Worked examples, which present students with a fully worked-out problem solution to study;
- Tutored problems, which provide step-by-step feedback and hints, either when an error is made or on demand; and
- Erroneous examples, which are worked examples with errors in one or more of the problem-solving steps that students have to find and fix.

It was found that worked example study resulted in a large efficiency benefit compared to all other conditions: equal learning outcomes were attained in 50-65% less time and with less self-reported effort invested in the study phase (McLaren et al. 2014). That the more passive high-assistance format was the most efficient is interesting in light of the assistance dilemma. However, it was remarkable that none of the high-assistance instructional formats improved learning outcomes compared to problem solving. Possibly, the feedback that students in all of the conditions, including problem solving, received in the form of a worked example if they made mistakes could explain the lack of effect. To find out whether worked example feedback contributed to the equal performance across conditions, a second study was conducted, reported here. Instead of receiving a correct worked example as feedback, students in all conditions would instead see their correct steps highlighted in green and their incorrect steps highlighted in red.
Thus, the second study, like the earlier one, also directly compared the four instructional conditions, but under different (and reduced) feedback circumstances.

Method

Participants and Design

Participants were 116 tenth- and eleventh-grade students from two high schools in the U.S. (M age = 16.45, SD = 0.76; 48 male; 15 of an original 131 participants were excluded for not fully completing all phases). Participants were randomly assigned to one of the four instructional conditions: (1) Worked Examples (WE), (2) Erroneous Examples (ErrEx), (3) Tutored Problems to Solve (TPS), or (4) Problems to Solve (PS).

Materials and Procedure

We used the same web-based stoichiometry-learning environment as McLaren et al. (2014). Stoichiometry is a subdomain of chemistry in which basic mathematics (i.e., multiplication of ratios) is applied to chemical quantities such as mass and solution concentration. The experiment was conducted at the students' schools

as a replacement for their regular science class. In total, the study took six class periods to complete. Students received a login for the web-based environment and could work at their own pace on the materials they encountered in the learning phase. They first completed a demographic questionnaire, followed by the pretest, which consisted of four stoichiometry problems to solve (isomorphic to the intervention problems, described below) and four conceptual knowledge questions (max. score: 101 points). Subsequently, each condition watched an introductory video explaining how to interact with the web-based user interface. They then watched instructional videos introducing new stoichiometry concepts and procedures (the same in all conditions), after which students were presented with a total of 10 intervention problems, in an instructional format specific to their condition (explained below). The problems were grouped in five isomorphic pairs (e.g., WE-1 and WE-2 are an isomorphic pair, WE-3 and WE-4 are an isomorphic pair, etc.), and each pair was followed by an isomorphic embedded test problem (max. total score: 122 points). After each intervention item, students indicated how much mental effort they had invested in studying/completing it, on a 9-point rating scale (Paas 1992). The complexity of the stoichiometry problems gradually increased, with each pair of intervention problems being preceded by instructional videos explaining new concepts and procedures. When students had finished the intervention phase, they could not immediately progress to the posttest; that test took place in the sixth and final period for all students and was isomorphic to the pretest (max. score: 101 points). Performance was automatically scored, along with time on task and student self-reports of effort.
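The ratio multiplication at the heart of such stoichiometry problems can be sketched in a few lines. The problem below (grams of NaCl dissolved in a given volume of solution) is an invented example in the spirit of the materials, not one of the study's actual intervention problems.

```python
# Invented example of the unit-factor (ratio multiplication) reasoning that
# stoichiometry problems of this kind require; not taken from the study's materials.

molar_mass_nacl = 58.44   # g/mol (Na 22.99 + Cl 35.45), standard values
concentration = 0.100     # mol/L of NaCl in solution
volume = 0.500            # L of solution

# Chain the conversion ratios: L * (mol/L) -> mol, then mol * (g/mol) -> g
moles = volume * concentration
grams = moles * molar_mass_nacl

print(round(grams, 3))    # mass of dissolved NaCl in grams -> 2.922
```

Each step cancels one unit and introduces the next, which is exactly the step-by-step structure the four instructional formats present in different ways.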
The worked examples (WE) consisted of problem statements and screen-recorded videos of the solution to the problem being entered, step by step, into the interface used in all four conditions. The videos lasted between 30 and 70 s and did not include any narration or explanation of why steps were taken; students only saw the steps being completed. When the video finished, students had to indicate the reason for each step by selecting it from a drop-down menu. After entering reasons, they could click the Done button, and correct/incorrect feedback appeared (in the form of green highlighting for correct steps and red highlighting for incorrect steps). The erroneous examples (ErrEx) likewise consisted of screen-recorded videos of 30 to 70 s, except that the items contained 1 to 4 errors that students were instructed to find and fix. They had to correct at least one step before they could click the Done button, at which point correct/incorrect feedback appeared. The tutored problems (TPS) consisted of a problem statement and fields to fill in; students had to attempt to solve the problem themselves, but with assistance received in the form of on-demand hints and error feedback. There were up to 5 levels of hints per step, with the bottom-out hint being both a message giving the answer to that step and a worked example of the problem solved to that point, shown below the interface. Because the tutored problems always ended in a correct final problem state, students received no further feedback. The problems to solve (PS) consisted of a problem statement and fields to fill in by students themselves, without any assistance. They had to fill out at least one step before they could click the Done button. When they clicked the Done button, correct/incorrect feedback appeared. In all conditions, students could review their work, including the correct/incorrect feedback, for as long as they wanted before selecting a Next button and proceeding to the next item.
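The green/red step feedback described above amounts to comparing each entered step against a reference solution. The sketch below is hypothetical (the paper does not describe the implementation; the function name, data shapes, and sample steps are invented for illustration).

```python
# Hypothetical sketch of the Done-button feedback: each entered step is
# compared with a reference solution and tagged for green (correct) or
# red (incorrect) highlighting. Not the authors' actual implementation.

def grade_steps(entered, reference):
    """Return a (step, 'green'|'red') pair for each step the student entered."""
    feedback = []
    for step, correct in zip(entered, reference):
        color = "green" if step is not None and step == correct else "red"
        feedback.append((step, color))
    return feedback

# A student converts 2.0 mol of H2O to grams but slips on the final product
entered   = ["2.0 mol", "18.02 g/mol", "36.4 g"]
reference = ["2.0 mol", "18.02 g/mol", "36.04 g"]
print(grade_steps(entered, reference))
# -> [('2.0 mol', 'green'), ('18.02 g/mol', 'green'), ('36.4 g', 'red')]
```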

Results

Data are presented in Table 1 and were analyzed with ANOVA. Analysis of the pretest scores confirmed that there were no significant differences among conditions in prior knowledge, F(3,112) < 1, p = 0.500. Test performance did not differ significantly among conditions, either on the embedded test problems, F(3,112) = 1.031, p = 0.382, or on the posttest, F(3,112) < 1, p = 0.883. There was a significant difference among conditions in the average mental effort invested in the intervention problems, F(3,112) = 9.709, p < 0.001; Bonferroni post-hoc tests showed that WE < TPS (p < 0.001) and WE < PS (p = 0.002); ErrEx < TPS (p = 0.003); no other differences were significant. Regarding time spent on the intervention problems, significant differences among conditions were found, F(3,112) = 72.93, p < 0.001. Bonferroni post-hoc tests showed: WE < all others, all p < 0.001; ErrEx < TPS and PS, both p < 0.001; TPS > PS, p = 0.014.

Discussion and Conclusions

This study replicated the findings of McLaren et al. (2014), so across the two experiments, evidence was found for enormous efficiency benefits of worked example study, both in terms of effort and time investment, compared to all other conditions (except for effort relative to the erroneous examples in the present experiment). The high-assistance instructional formats did not result in better learning outcomes than problem solving. We can only speculate about potential causes. One possibility is that the instructional videos on stoichiometry that were interspersed throughout the intervention in all conditions, and which sometimes included an example of how to apply a concept during problem solving, provided sufficient support for students in the problem-solving condition to benefit from practice, although that practice was slower and more effortful.
These surprising, and now replicated, results are worthy of further study, especially given that the interspersed conceptual videos, providing both theoretical and procedural explanations, are much closer to real educational practice and therefore give more ecologically valid information about the impact of various instructional formats on learning processes and outcomes.

Table 1  Performance, mental effort, and time on task per condition (means, with standard deviations in parentheses)

                           WE (n=29)       ErrEx (n=28)    TPS (n=27)      PS (n=32)
Pre-test (0-101)           48.69 (17.62)   47.46 (20.27)   41.85 (16.77)   45.25 (16.26)
Embedded test (0-122)      92.21 (25.03)   79.79 (33.29)   85.30 (30.94)   80.69 (31.08)
Effort intervention (1-9)  4.88 (1.44)*    5.31 (1.71)     6.70 (1.25)     6.27 (1.31)
Time intervention (min.)   20.87 (5.50)*   40.48 (11.27)   67.11 (18.91)   56.82 (11.79)
Post-test (0-101)          68.21 (18.21)   65.68 (23.08)   67.78 (19.98)   69.88 (19.17)

Significant differences are indicated by *

This study, conducted in a classroom context, finds a clear time-efficiency advantage of worked examples. This result is a valuable finding for educational practice, although one that should be verified in additional domains, with different materials.

Acknowledgments

The National Science Foundation funded this research, Award No. SBE-0836012 ("Pittsburgh Science of Learning Center").

References

Koedinger, K. R., & Aleven, V. (2007). Exploring the assistance dilemma in experiments with cognitive tutors. Educational Psychology Review, 19, 239-264.

McLaren, B. M., van Gog, T., Ganoe, C., Yaron, D., & Karabinos, M. (2014). Exploring the assistance dilemma: comparing instructional support in examples and problems. In S. Trausan-Matu et al. (Eds.), Proceedings of the Twelfth International Conference on Intelligent Tutoring Systems (ITS 2014), LNCS 8474 (pp. 354-361). Springer International Publishing Switzerland.

Paas, F. (1992). Training strategies for attaining transfer of problem-solving skill in statistics: a cognitive load approach. Journal of Educational Psychology, 84, 429-434.