
Process data as a window into students' minds? The exploitation of computer-generated log files. University of Luxembourg

Overarching goal: Explore the potential of computer-generated log files with the PISA 2012 problem solving data

Some questions in the back of our heads...
Do log files allow us
1) to look into students' minds?
2) to move from the What (students can do) to the How (they actually do it)?
Is there a hidden potential
1) for reporting in large-scale assessments?
2) ... for research?
3) for educational policy?

Log files: The solution to everything? Gartner's hype cycle for emerging technologies: where do we stand?

Agenda
What was Problem Solving in PISA 2012?
What did we do? The rationale: process data in PISA 2012
What did we learn? The results
What's ahead? Implications

PISA 2012 Problem Solving in brief
Innovative PISA domain in 2012: 85 000 students in 44 countries/economies took an internationally agreed 40-minute test
- on computers
- to assess students' capacity to engage in cognitive processing to understand and resolve interactive and dynamically changing problem situations where a method of solution is not immediately obvious
- in addition to the 2h paper-based test (mathematics, reading, science)
Copyright by F. Avvisati

PS and the other domains. Mean R² of the other domains: 77% overlap. Mean R² of Complex Problem Solving with the other domains: 61% overlap. Copyright OECD (2014)

Average performance of 15-year-olds in problem solving (Fig V.2.3): chart of mean problem-solving scores of the participating countries/economies, ranging from strong performance (e.g. Singapore, Korea, Japan) to low performance (e.g. Uruguay, Colombia). Copyright by F. Avvisati

Problem Solving Sample Question 2: Climate Control. This is a harder item (Level 4 on the problem-solving scale). Item text: "You have no instructions for your new air conditioner. You need to work out how to use it. Find whether each control influences temperature and humidity by changing the sliders. Draw lines in the diagram on the right to show what each control influences." Students must engage with the machine, and use the feedback and information uncovered to reach a solution: it is an interactive problem. The main demand is representing and formulating (knowledge acquisition).
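To make the interactive character of the item concrete, here is a minimal toy model of such a machine: sliders go in, temperature and humidity readings come back as feedback. The class name and the specific mapping of controls to outputs are invented for illustration and do not reproduce the actual Climate Control item.

```python
# Toy model of an interactive "climate control" machine: the student moves
# sliders, applies them, and observes how the readings react. The mapping
# (which slider drives which output) is invented for illustration only and
# is NOT the solution of the actual PISA item.

class ClimateMachine:
    def __init__(self) -> None:
        self.temperature = 20.0
        self.humidity = 50.0

    def apply(self, top: int, central: int, bottom: int) -> tuple:
        """Apply slider positions (-2..+2) and return the new readings.
        Hypothetical mapping: top drives temperature, central and bottom
        drive humidity."""
        self.temperature += 2.0 * top
        self.humidity += 3.0 * central + 1.5 * bottom
        return self.temperature, self.humidity


machine = ClimateMachine()
# A VOTAT-style exploration: vary one slider at a time and watch the feedback.
print(machine.apply(top=1, central=0, bottom=0))   # only temperature changes
print(machine.apply(top=0, central=1, bottom=0))   # only humidity changes
print(machine.apply(top=0, central=0, bottom=1))   # only humidity changes again
```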

PISA Creative Problem Solving Tasks

Possible insights from this item: How much time did students spend to solve this item? How many «experiments» did they set up (varying one thing at a time)? Did they press all the controls «at random», or according to a systematic plan? Did some students try to guess the right answer, without first gaining all the relevant information? Copyright by F. Avvisati
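A minimal sketch of how such indicators could be pulled out of an item log, assuming a simplified, hypothetical event format (one record per APPLY click with the slider positions and a timestamp); this is not the actual PISA 2012 log layout.

```python
# Deriving behavioral indicators from a (hypothetical) event log of the
# Climate Control item: time on task, number of experiments, and whether the
# student varied every input in isolation (full VOTAT). The event format and
# field names are assumptions, not the actual PISA 2012 log layout.

from typing import Dict, List

# One record per APPLY click: slider positions plus a timestamp in seconds.
events: List[Dict] = [
    {"time": 12.4, "top": 1, "central": 0, "bottom": 0},  # vary top only
    {"time": 25.9, "top": 0, "central": 1, "bottom": 0},  # vary central only
    {"time": 41.3, "top": 0, "central": 0, "bottom": 1},  # vary bottom only
    {"time": 55.0, "top": 2, "central": 2, "bottom": 0},  # two sliders at once
]

def time_on_task(events: List[Dict]) -> float:
    """Seconds between the first and the last logged interaction."""
    return events[-1]["time"] - events[0]["time"] if events else 0.0

def votat_profile(events: List[Dict]) -> Dict:
    """Classify each APPLY against the neutral slider position (simplifying
    assumption: the student resets the machine between experiments)."""
    isolated = set()
    for ev in events:
        moved = [k for k in ("top", "central", "bottom") if ev[k] != 0]
        if len(moved) == 1:  # a vary-one-thing-at-a-time experiment
            isolated.add(moved[0])
    return {
        "n_experiments": len(events),
        "isolated_inputs": sorted(isolated),
        "full_votat": len(isolated) == 3,  # every input varied in isolation
    }

print(round(time_on_task(events), 1))  # 42.6
print(votat_profile(events))           # full VOTAT despite one sloppy last step
```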

The aim of this TJA fellowship was to showcase log file analyses on the basis of PISA 2012 data. Because of its dynamic nature and the interaction between problem and student, the problem solving assessment is a particularly good candidate for these analyses. All analyses are based on the task Climate Control.

Agenda
What was Problem Solving in PISA 2012?
What did we do? The rationale: process data in PISA 2012
What did we learn? The results
What's ahead? Implications

PISA Creative Problem Solving Tasks


Diagram: the three controls (Top Control, Central Control, Bottom Control) and the two outputs (Temperature, Humidity).

Axis 2: Assessment instruments VI

Agenda
What was Problem Solving in PISA 2012?
What did we do? The rationale: process data in PISA 2012
What did we learn? The results
What's ahead? Implications

How to (even start to) explore log files?
1) How do students behave during assessment?
→ How long do students interact with a task?
→ Does time-on-task vary across countries and subgroups?
2) Can we identify good behavior?
→ Are students applying optimal strategies?
→ Can we explain final performance through behavior?
3) Can we identify different levels of low performance?
→ Where do unsuccessful students fail?
→ Where should interventions be heading?

1) How do students behave during assessment?
→ How long do students interact with a task?
→ Does time-on-task vary across countries and subgroups?

Broad variation of time! Time on task across students: M = 155.39, SD = 85.16

No large difference of time on task by item score! (distributions for correct vs. incorrect responses)
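A small sketch of how these two summaries (the overall spread of time on task, and the split by item score) could be computed; the student records below are simulated for illustration and only mimic the pattern on these slides, they are not PISA data.

```python
# Summarizing time on task across students, overall and by item score.
# The records are simulated (not PISA data) and only mimic the reported
# pattern: broad variation overall, little difference by correctness.

import random
import statistics

random.seed(1)

# (time_on_task_in_seconds, item_correct) per simulated student
students = [(max(5.0, random.gauss(155, 85)), random.random() < 0.5)
            for _ in range(5000)]

times = [t for t, _ in students]
print(f"all       : M = {statistics.mean(times):6.2f}, "
      f"SD = {statistics.stdev(times):6.2f}")

for label, flag in (("correct", True), ("incorrect", False)):
    group = [t for t, c in students if c == flag]
    print(f"{label:10s}: M = {statistics.mean(group):6.2f}, "
          f"SD = {statistics.stdev(group):6.2f}")
```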

How do students behave during assessment?
- Broad variation among students
→ Process data highlight heterogeneous behavior
- BUT: No clear relation to performance
→ No easy solution for explanations of performance on an individual level

2) Can we identify good behavior?
→ Are students applying optimal strategies?
→ Can we explain final performance through behavior?


Application of VOTAT (vary one thing at a time)? Strong connection to performance!

Climate Control Q1   incorrect   correct   Total
no VOTAT             42.2%       9.7%      51.9%
VOTAT                6.9%        41.1%     48.1%
Total                49.1%       50.9%     100.0%

Can we predict students' problem solving performance?
- Correlation (time on task, performance) = -.012
- Correlation (VOTAT, performance) = .667
→ Successful students take the same time but use VOTAT
→ Stable relations across countries
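A sketch of how such correlations could be computed from per-student indicators; because VOTAT use and the item score are binary, the Pearson coefficient amounts to a point-biserial/phi correlation. The data are simulated and only meant to mirror the reported pattern (time on task uninformative, VOTAT strongly predictive).

```python
# Correlating behavioral indicators with item performance. VOTAT use and the
# item score are binary, so the Pearson coefficient below acts as a
# point-biserial / phi correlation. Data are simulated for illustration only.

import math
import random

random.seed(2)

def pearson(x, y):
    """Plain Pearson correlation of two equally long numeric sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Simulated students: VOTAT users mostly solve the item; time on task is noise.
votat = [1 if random.random() < 0.48 else 0 for _ in range(5000)]
score = [1 if random.random() < (0.85 if v else 0.19) else 0 for v in votat]
time_on_task = [max(5.0, random.gauss(155, 85)) for _ in votat]

print(f"r(time on task, score) = {pearson(time_on_task, score):+.3f}")  # near 0
print(f"r(VOTAT, score)        = {pearson(votat, score):+.3f}")         # strongly positive
```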

3) Can we explain low performance?
→ Where do unsuccessful students fail?
→ Where should interventions be heading?

Diagram: the three controls (Top Control, Central Control, Bottom Control) and the two outputs (Temperature, Humidity).

How consistently did students apply VOTAT? (All countries) Bar chart: percent of students, incorrect vs. correct, in each category: no isolated variation / isolated variation of 1 input variable / isolated variation of 2 input variables / isolated variation of all input variables (VOTAT). Data labels: 41.1, 27.3, 10.6, 4.2, 2.5, 4.3, 3.1, 7.0.

How consistently did students apply VOTAT? (Singapore) Bar chart with the same categories. Data labels: 63.3, 11.2, 6.6, 2.9, 2.7, 2.2, 4.7, 6.4.

How consistently did students apply VOTAT? (Brazil) Bar chart with the same categories. Data labels: 45.5, 6.3, 12.9, 1.7, 3.9, 1.8, 4.8, 23.1.

How consistently did students apply VOTAT? (Hungary) Bar chart with the same categories. Data labels: 32.5, 34.2, 2.6, 9.4, 5.8, 4.5, 5.5, 5.5.

Differences in behavior of unsuccessful students → Different training needs
Subgroups:
- No isolated variation at all → Lacking understanding of isolated variation?
- Isolated variation, but not for all input variables → Lacking understanding of why complete variation is necessary?
- Complete VOTAT, wrong solution → Lacking application of available information?
→ Process data as a point of departure for interventions!
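A sketch of how students could be sorted into these subgroups from two derived indicators (number of input variables varied in isolation, final item score); the record structure and example values are illustrative assumptions rather than the operationalization behind these slides.

```python
# Sorting students into intervention-relevant subgroups, based on two derived
# indicators per student: how many of the three input variables they varied
# in isolation, and whether they solved the item. The record structure and
# example values are illustrative assumptions.

from collections import Counter
from typing import NamedTuple

class Student(NamedTuple):
    n_isolated: int   # 0..3 input variables varied in isolation
    correct: bool     # final item score

def subgroup(s: Student) -> str:
    if s.correct:
        return "correct solution"
    if s.n_isolated == 0:
        return "no isolated variation (concept of isolated variation missing?)"
    if s.n_isolated < 3:
        return "partial isolated variation (why vary ALL inputs unclear?)"
    return "full VOTAT, wrong answer (uncovered information not applied?)"

students = [Student(0, False), Student(1, False), Student(3, False),
            Student(3, True), Student(2, False), Student(3, True)]

for group, count in Counter(subgroup(s) for s in students).most_common():
    print(f"{count} x {group}")
```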

Agenda
What was Problem Solving in PISA 2012?
What did we do? The rationale: process data in PISA 2012
What did we learn? The results
What's ahead? Implications

Main results
1) How do students behave during assessment?
→ Lots of variation across students
→ Systematic differences between countries
2) Can we identify good behavior?
→ Identification of optimal strategies possible
→ Excellent prediction of final performance
3) Can we explain low performance?
→ Identification of points of failure possible
→ Point of departure for interventions
→ A lot of insights thanks to process data!

Difficult to look into students' minds, but we can move from the What to the How: introducing a new layer of understanding.

Applications:
- Reporting in large-scale assessments
- A wealth of new research endeavors
- Connection to educational policy

When should we meet again? Gartner's hype cycle for emerging technologies: PISA 2012, PISA 2015, PISA 2018

Thanks to: Fonds National de la Recherche Luxembourg & OECD for funding this work; Francesco Avvisati, Katinka Hardt, Jonas Neubert, Judit Pal, Pablo Zoido; TJA fellowship programme. Contact: Dr. Samuel Greiff (samuel.greiff@uni.lu), Research Area Group "Computer-Based Assessment", 11, Porte des Sciences, 4366 Esch, Luxembourg. Phone: +352-466644-9245

Questions & discussion

Backup slides: How consistently did students apply VOTAT? Per-country bar charts with the same categories as above (percent of students, incorrect vs. correct). Data labels per chart:
All countries: 41.1, 27.3, 10.6, 4.2, 2.5, 4.3, 3.1, 7.0
Singapore: 63.3, 11.2, 6.6, 2.2, 6.4, 2.9, 2.7, 4.7
USA: 53.0, 16.1, 10.1, 2.2, 1.8, 3.0, 3.3, 10.4
Canada: 52.9, 18.0, 10.4, 2.2, 2.9, 5.0, 2.3, 6.4
France: 48.3, 26.9, 8.3, 2.1, 2.0, 2.3, 4.3, 5.8
Germany: 43.5, 23.2, 11.1, 1.4, 0.1, 5.6, 4.0, 11.1
Spain: 26.4, 16.4, 4.5, 2.1, 1.9, 5.4, 6.1, 37.2
Hungary: 32.5, 34.2, 9.4, 2.6, 5.8, 4.5, 5.5, 5.5
Brazil: 45.5, 12.9, 6.3, 1.7, 3.9, 1.8, 4.8, 23.1
Bulgaria: 52.0, 14.5, 6.6, 2.3, 4.5, 1.6, 3.3, 15.1