International Comparison of Information Literacy in Digital Environments


Dr. Mark Zelman, World Bank, United States (marknjpr@yahoo.com)
Dr. Tigran Shmis, World Bank, Russian Federation (tshmis@worldbank.org)
Dr. Svetlana Avdeeva, National Foundation for Training (NFPK), Russian Federation (Avdeeva@ntf.ru)
Dr. Kirill Vasiliev, World Bank, Russian Federation (kvasiliev@worldbank.org)
Dr. Isac Froumin, World Bank, Russian Federation (ifroumin@worldbank.org)

Keywords: ICT Literacy, Information Literacy, Psychometrics, Validity, International comparison

ABSTRACT

Despite the coming of the digital technology age, many have struggled with the concept of minimum information and communication literacy skills - locating, evaluating, and communicating information - as an important outcome of school education. The purpose of this paper is to describe an innovative ICT literacy assessment instrument, developed within the World Bank READ program in Russia, which sought to develop a contemporary problem-based and scenario-based approach to ICT literacy assessment for a multinational secondary school environment and to develop benchmarks that could be used to assess the ICT literacy skills of secondary school students in different countries. The ICT literacy assessment tool was developed and implemented on a sub-national scale in Russia during 2005-2010 and piloted for international use in Tatarstan and Thailand with strong support from the Government of Russia, the Republic of Tatarstan, IPST, and the World Bank.¹ The case studies provide real-life examples of how the ICT instrument was implemented in the participating countries. The paper includes a discussion of (i) how testing results could inform policymakers, secondary school principals and teachers, parents, and students about ICT literacy levels, (ii) secondary school information literacy curriculum development, and (iii) other policy-related issues.

INTRODUCTION

Discussions of ICT in education typically include many fields such as mathematics, economics, geography, etc. [1], each with its own definition of information and communication literacy. There are further discussions that include ICT frameworks such as 21st century skills, ICT competence, web literacy, information fluency, media literacy and media competence, and others. Despite the differences in connotations, all these discussions point out that ICT literacy skills are a 21st century form of literacy, as important today as reading and writing skills were in earlier centuries. ICT literate students master subject content faster, are good problem-solvers, and are better self-directed learners [2]. As a result, primary and secondary schools are beginning to make an effort to improve students' ICT literacy. However, without effective assessment it is difficult to know whether instructional programs and the curriculum overall are successful: are students' ICT literacy skills improving?

¹ Under the Russian Education Aid for Development (READ) fee-based services program and the "Development of ICT in education technologies in Tatarstan" fee-based services program.

ICT LITERACY ASSESSMENT

Starting in 2005, the World Bank (WB) in Russia launched projects to study the growing importance of existing and emerging information and communication technologies and their relationship to literacy. It was agreed that little had been done to address ICT literacy skills in primary and secondary schools. In response, the Russian National Training Foundation (NTF), with the support of the World Bank, developed a scenario-based, computer-delivered ICT literacy assessment that measures students' abilities to research, organize, and communicate information using technology (E-Learning Support project - http://web.worldbank.org/external/projects/main?pagepk=64283627&pipk=73230&thesitepk=305600&menuPK=305634&Projectid=P075387).

The ICT literacy assessment focuses on the cognitive problem-solving, critical reasoning, and critical reading skills associated with using elementary technology to handle information. The assessment measures ICT literacy through seven processes, which represent important problem-solving and critical thinking aspects of ICT literacy skill (Table 1).

Table 1: Components of ICT literacy (developed based on [3])

PROCESS      DEFINITION
Define       Using ICT tools to identify and appropriately represent an information need
Access       Knowing about and knowing how to collect and/or retrieve information
Manage       Organizing information into existing classification schemes
Integrate    Interpreting, summarizing, comparing, and contrasting information using similar or different forms of representation
Evaluate     Reflecting to make judgments about the quality, relevance, usefulness, or efficiency of information
Create       Generating new information and knowledge by adapting, applying, designing, inventing, or representing information
Communicate  Conveying information and knowledge to various individuals and/or groups

The students' ICT literacy was described in terms of ICT literacy proficiency levels (Table 2). Five proficiency levels were defined based on students' performance on each of the seven ICT literacy components or processes (Table 1). The levels are not discrete, discontinuous steps but a method of representing progress.

Table 2: ICT Literacy Proficiency Level Descriptions (with the percentage of students at each level in Tatarstan and Thailand)

Level 5 (Tatarstan: 12%; Thailand: 9%)

When a student defines information, he or she does the following:
- Generates and justifies questions regarding an information need
- Refines a vague research question to one that is appropriately specific
- Discerns a highly appropriate information need from an existing problem

When a student accesses information, he or she does the following:
- Chooses information resources that are appropriate, cost-effective, and efficient
- Searches for information in multiple sources in a directed and reflective manner
- Uses search strategies for a given information need
- Recognizes and respects authorship, copyright, trademark, and confidential information

When a student manages information, he or she does the following:
- Creates or selects an information classification scheme that allows efficient storage, integration, and recall of information to meet anticipated needs
- Recognizes and treats confidential or sensitive information appropriately
- Recognizes and follows security procedures

When a student integrates information, he or she does the following:
- Uses multiple dissimilar sources to satisfy an information need of summarization and synthesis

When a student evaluates information, he or she does the following:
- Establishes criteria for judging the appropriateness of information
- Identifies and selects resources that meet all or nearly all of the criteria
- Recognizes and respects legal and ethical rights of information use

When a student creates information, he or she does the following:
- Draws appropriate conclusions about information, even in contexts in which conflicting information is presented, and supports those conclusions

When a student communicates information, he or she does the following:
- Fulfills all points of his or her communication plan
- Customizes the presentation of information to respond to each audience's information need
- Cites sources appropriately

Level 4 (Tatarstan: 4%; Thailand: 6%)

Students working at level 4 define, access, and manage information products at level 5. However:

When a student integrates information, he or she does the following:
- Uses multiple sources to satisfy an explicit information need of summarization and synthesis, with little inclusion of irrelevant information
- Draws justifiable conclusions about information, though such conclusions may be one-sided
- Supports conclusions with some extraneous information

When a student evaluates information, he or she does the following:
- Applies pre-established criteria for judging the appropriateness of information
- Selects resources that are relevant and that satisfy at least the most critical of the criteria

Proficient standard for Year 9

Level 3 (Tatarstan: 53%; Thailand: 69%)

Students working at level 3 integrate and evaluate information products at level 4.

When a student defines information, he or she does the following:
- Generates questions regarding an information need that are reasonably specific
- Refines a vague research question to one that is specific
- Discerns a marginally appropriate information need from an existing problem

When a student accesses information, he or she does the following:
- Searches for information in a somewhat directed manner
- Selects resultant information that is generally relevant to the information need
- Respects authorship, copyright, and confidential information

When a student manages information, he or she does the following:
- Selects an information classification scheme that allows them to meet a stated information integration or classification need

When a student integrates information, he or she does the following:
- Uses multiple sources to satisfy an explicit information need of summarization or synthesis

When a student evaluates information, he or she does the following:
- Applies pre-established criteria for judging the appropriateness of information
- Identifies and selects resources that are relevant and that satisfy the most critical criteria
- May continue searching, piling up unnecessary information
- Respects legal and ethical rights of information use

Level 2 (Tatarstan: 14%; Thailand: 13%)

Students working at level 2 access, manage, communicate, create, and define information products at level 3. However:

When a student integrates information, he or she does the following:
- Omits information critical to satisfying the need
- Includes irrelevant information
- Organizes information in a chaotic way, or organizes by source rather than by theme

When a student evaluates information, he or she does the following:
- Violates pre-established criteria for judging the appropriateness of information
- Selects resources that are only marginally relevant or completely irrelevant
- Terminates the search before sufficient sources have been obtained
- Ignores explicitly stated legal and ethical rights of information use

Level 1 (Tatarstan: 16%; Thailand: 3%)

Students working at level 1 integrate and evaluate information products at level 2.

When a student defines information, he or she does the following:
- Selects questions that are irrelevant or marginally relevant, unclear, or vague
- Identifies inappropriate information needs from an existing problem, or is unable to identify an information need

When a student accesses information, he or she does the following:
- Chooses an overly general or inappropriate information resource
- Searches for information in a disorganized manner
- Obtains irrelevant results; avoids delimiting terms
- Selects resultant information that is marginally relevant to the information need
- Accesses and uses information regardless of authorship, copyright, and confidentiality

When a student manages information, he or she does the following:
- Applies an existing information classification scheme inappropriately
- Fails to respect explicitly stated confidentiality and security issues

When a student creates information, he or she does the following:
- Draws unsupported conclusions about information
- Supports conclusions with little of the most critical information relevant to the conclusion
- Makes no connections between elements in source material and conclusions
- Organizes information in a chaotic way

When a student communicates information, he or she does the following:
- Fulfills some of the points of his or her communication plan
- Uses the same presentation of information to respond to various audiences' information needs
- Is careless with confidential information

In addition to deriving the ICT literacy proficiency scale, a proficient standard was established for Year 9. The proficient standard was set at level 3 of the proficiency scale and represents a challenging but reasonable expectation for typical Year 9 students to have reached by the end of the year of study.

VALIDITY OF THE ICT LITERACY ASSESSMENT

Before using an assessment, there should be evidence of its validity: the extent to which scores on the assessment reflect students' ICT literacy skills. As Messick [4a, b] suggested, there are basically two kinds of validity evidence: convergent validity evidence and discriminant validity evidence. Convergent validity is supported if assessment scores correlate with other measures that are expected to be related to ICT literacy. Discriminant validity is supported if scores do not correlate with measures thought to be distinct from ICT literacy. These two kinds of validity evidence can be generated using all of the methods of science but, as Messick [4a, b] also observes, there are only a few distinct ways in which validity data are gathered. One of these ways involves examining external structure, i.e., the relationship of scores to other measures or background variables. The external structure of the ICT literacy assessment was examined by means of comparison measures that were developed from questionnaires administered to test-takers before they completed the ICT literacy assessment.

Participants

For the pilot study, a representative sample of 395 Year 9 students from rural and urban schools of the Republic of Tatarstan was chosen. Table 3 shows the demographic and academic characteristics of the participants. It was impossible to obtain a representative sample in Thailand because of a limited budget; the Thai sample was drawn from IPST schools in Bangkok (sample characteristics not shown).

Table 3: Characteristics of the analytic sample

Gender                         %
Female                         56
Male                           44

General academic performance   %
D or D-                        0.2
C-                             13.0
C                              22.0
B-                             31.0
B                              13.0
A-                             13.8
A                              7.0

Procedure

The ICT Literacy Assessment was administered at different schools in Tatarstan and Thailand, so each administration differed in a number of details such as the time of day, the number of students per administration, etc. However, certain characteristics remained consistent. The delivery mode allowed the use of school computers, but in a way that did not affect the results of the ICT literacy assessment. Students first completed a background questionnaire before beginning the assessment. All testing sessions were proctored. If a student did not complete the assessment within the allotted time, the testing software stopped the section and asked the student to alert the proctor. After completing both the background questionnaire and the ICT literacy test, students participated in a focus group concerning their experiences in taking the assessment.

Instruments

ICT Literacy Assessment scores. The purpose of the sample-based large-scale ICT Literacy Assessment delivered in 2011 was to describe the ICT literacy levels of the Year 9 student population in the aggregate (no individual scores or ICT literacy levels were reported). Each test taker received tasks that targeted all seven processes. In addition to the ICT proficiency scale (Table 2), raw scores for each test form were separately scaled to a mean scale score of 150 and a standard deviation of 35 scale score units. Each test taker's results were treated equally, regardless of the particular test form received. This equating across test forms was supported by analyses showing relatively high (low .80s) inter-correlations among the seven ICT literacy processes. Cronbach's alpha reliabilities of the ICT literacy test forms reached the .90s.
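The two psychometric conventions invoked here, linear scaling of each form to a common reporting metric and coefficient alpha, can be made explicit. The formulas below are a minimal sketch using standard textbook definitions, not the authors' own equations; the notation (raw score x with form mean \bar{x} and standard deviation s_x; k items with variances \sigma^2_{Y_i} and total-score variance \sigma^2_X) is ours.

\[
T(x) \;=\; 150 + 35\,\frac{x - \bar{x}}{s_x}
\]

\[
\alpha \;=\; \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k}\sigma^2_{Y_i}}{\sigma^2_X}\right)
\]

Scaling every form to the same 150/35 metric is what allows results from different forms to be treated equally; the high inter-process correlations reported above are the evidence offered that this simple form-by-form scaling behaves like an equating.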

Self-assessment measures. Two types of self-report measures were developed from the demographic and academic questionnaire administered prior to the ICT Literacy Assessment. Table 4 provides more details on the measures as well as descriptive statistics.

1. Self-assessment measures evaluated students' opinions of their own skills related to the ICT literacy proficiencies. There are many instances of self-assessments being used to validate objective measures [5]. Research on self-assessment measures has revealed moderate correlations (high .20s) between self-assessment and objective performance measures [6].

2. Academic performance measures reflect Year 9 students' general academic performance. Students with high general academic performance tend to score better on a broad range of assessments. Therefore, an investigation into the validity of an assessment instrument must examine whether the instrument assesses the underlying trait rather than measuring only general academic performance. However, some connection between academic performance measures and ICT literacy could be expected: students with high general academic performance might be more likely to score higher on the ICT literacy assessment as a result of understanding the importance of ICT literacy skills for their academic studies.

Results and Discussion of Validity of the ICT Literacy Assessment

Correlations between the self-report measures and ICT literacy scores are shown in Table 4 below.

Table 4: Correlations of Self-Report Measures with ICT Literacy Results

Measure                                      Correlation
Academic performance
  General academic performance               0.40
Self-assessment of skills
  Confidence in ICT literacy activities      0.27
  Frequency of ICT literacy activities      -0.01

The correlations of all ICT literacy self-report measures except frequency of ICT literacy activities with performance on the ICT Literacy Assessment are consistent with research comparing self-report measures of skills to assessment scores (e.g., [6]). These findings support the convergent validity of the assessment. Analyses showed a high correlation between the frequency of ICT literacy activities scale and the confidence in ICT literacy activities scale. Therefore, the low correlation of ICT literacy assessment results with frequency of ICT literacy activities might be due to a common belief among Year 9 students that more ICT-literate students have more daily interactions with the Internet, supporting the divergent validity of the ICT literacy assessment.
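The pattern in Table 4 is Messick's convergent/discriminant scheme in miniature. As a formal restatement (our notation and reading, not the authors'): with S the ICT literacy score, convergent evidence asks that the Pearson correlation r(S, C) be clearly positive for measures C expected to relate to the trait (confidence in ICT literacy activities and, to a lesser degree, general academic performance), while discriminant evidence asks that r(S, D) be near zero for a measure D expected to be distinct from the trait (frequency of ICT literacy activities). For paired observations (x_j, y_j), j = 1, ..., n:

\[
r \;=\; \frac{\sum_{j=1}^{n}(x_j - \bar{x})(y_j - \bar{y})}{\sqrt{\sum_{j=1}^{n}(x_j - \bar{x})^2}\,\sqrt{\sum_{j=1}^{n}(y_j - \bar{y})^2}}
\]

On this reading, the observed 0.40 and 0.27 supply the convergent evidence and -0.01 the discriminant evidence.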

These findings also support the claim that frequency of ICT literacy activities, without proper instruction, does not translate into ICT literacy skills. General academic performance correlated weakly with the confidence in ICT literacy activities measure; the correlation coefficient is close to zero. Thus, the ICT literacy confidence measure is distinct from general academic performance, although both measures correlate with ICT literacy assessment results.

ICT LITERACY PROFILE

In comparing achievement across countries, it is important to consider differences in students' curricular experiences, how these differences may affect their ICT skills, and their subsequent ICT assessment results. A Test-Curriculum Matching Analysis was conducted for Tatarstan and Thailand to investigate the extent to which the ICT Literacy Assessment was relevant to each country's ICT literacy curriculum and to evaluate the impact on a country's ICT Literacy Assessment performance. Table 5 below presents the Test-Curriculum Matching Analysis results.

Table 5: ICT Proficiency Test-Curriculum Matching Analysis

Proficiency measured by    Measured by            Measured by
ICT Literacy Assessment    Tatarstan curriculum   Thai curriculum
Cognitive                  -                      -
Technical                  Technical              Technical
Ethical                    -                      -

Definitions:
Cognitive: The desired foundational skills of everyday life at school, at home, and at work. Literacy, numeracy, problem solving, and spatial/visual literacy demonstrate these proficiencies.
Technical: The basic components of digital literacy. It includes a foundational knowledge of hardware, software applications, networks, and elements of digital technology.
Ethical: Ethical and legal access and use of information, for example selecting a license agreement before downloading software.

The Test-Curriculum Matching Analysis showed that, in general, 30 percent of the proficiencies measured by the ICT Literacy Assessment had curriculum coverage in Tatarstan and Thailand. For example, students had been taught how to build slide presentations, but not how to tailor them to an audience; how to store emails in folders, but not how to find them later; and how to search the web, but not how to identify trustworthy information or build knowledge. The ICT literacy assessment therefore includes some tasks that measure proficiencies which have not been taught to many students in the participating countries.

Figure 1 below shows the distribution of ICT literacy across the five proficiency levels described.

Figure 1: Distribution of ICT literacy across proficiency levels for Tatarstan and Thailand

The proficiency level percentages in Table 2 (and those illustrated in Figure 1) show that, overall, Year 9 Thai students are operating approximately one proficiency level higher than Year 9 Tatarstan students around the proficient standard. Figure 1 also shows that a higher proportion of Tatarstan students are at the lower end of the ICT literacy proficiency scale than the corresponding proportion of Thai students: approximately 30 percent of Tatarstan students are working at proficiency levels 1 and 2, whereas approximately 16 percent of Thai students are working at those levels (with only 3 percent at level 1).
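The attainment figures quoted in the conclusions follow directly from Table 2 by summing the level percentages; as a worked check (our arithmetic on the reported values):

\[
\text{Tatarstan: } 53 + 4 + 12 = 69\% \text{ at or above Level 3}; \qquad 16 + 14 = 30\% \text{ at Levels 1-2}
\]

\[
\text{Thailand: } 69 + 6 + 9 = 84\% \text{ at or above Level 3}; \qquad 3 + 13 = 16\% \text{ at Levels 1-2}
\]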

CONCLUSIONS

This paper provides evidence for the validity argument of the ICT Literacy Assessment. It paves the way for the assessment to be used, among other purposes, as a longitudinal tool for comparing ICT literacy levels across countries. The paper has also presented results of the pilots in Tatarstan and Thailand, which showed that 84 percent of Thai students and 69 percent of Tatarstan students reached or exceeded the Year 9 proficient standard by demonstrating the abilities described in Table 1. The inclusion or exclusion of ICT proficiencies in a country's curriculum therefore does not guarantee students the opportunity to learn; just as important is what their teachers choose to teach them. The lessons provided by teachers ultimately determine the ICT literacy skills students are taught. It is hoped that this information will encourage comparison of the different approaches taken by Tatarstan and Thailand in the teaching and learning of ICT. As mentioned above, the sampling error for the Thai sample needs to be taken into account when making these inferences.

REFERENCES

[1] Gapski, H. (2007). Some reflections on digital literacy. Proceedings of the 3rd International Workshop on Digital Literacy (pp. 49-55). Crete, Greece.
[2] Powers, D. (2002). Self-assessment of reasoning skills. ETS Research Report No. RR-02-22. Princeton, NJ: Educational Testing Service.
[3] American Association of School Librarians & Association for Educational Communications and Technology (1998). Information literacy for student learning: Standards and indicators.
[4a] Messick, S. (1989). Meaning and values in test validation: The science and ethics of assessment. Educational Researcher.
[4b] Messick, S. (1989). Validity. In R. Linn (Ed.), Educational measurement (3rd ed.). Washington, DC: American Council on Education/Macmillan.
[5] Mabe, P. A., & West, S. G. (1982). Validity of self-evaluation of ability: A review and meta-analysis. Journal of Applied Psychology, 67.
[6] Love, K. G., & Hughes, F. V. (1994). Relationship of self-assessment ratings and written test score: Implications for law enforcement promotional systems. Public Personnel Management.