
ASSESSMENT GEMS SERIES No. 3, March 2014
Australian Council for Educational Research. ISSN 2203-9406 (Online)

The Latin-American Laboratory for Assessment of the Quality of Education: Measuring and comparing educational quality in Latin America

The Laboratorio Latinoamericano de Evaluación de la Calidad de la Educación (Latin-American Laboratory for Assessment of the Quality of Education, or LLECE) is the network of national systems for the assessment of education quality in Latin America, created in 1994 and coordinated by UNESCO's Regional Bureau for Education in Latin America and the Caribbean (OREALC). LLECE's purpose is to produce data and knowledge that inform educational policy in the region, contribute to capacity building, and serve as a forum for reflection, exchange and generation of new ideas and good practices in education evaluation (UNESCO, 2013).

Origins and context

To provide useful information for education policymaking and implementation, LLECE organises comparative studies that aim to measure the quality of education in the region. Regional assessments have common indicators that allow comparisons between countries while acknowledging the particular features of the Latin American context. They also enable comparisons involving countries that do not participate regularly in large-scale international studies such as the Programme for International Student Assessment (PISA) and the Trends in International Mathematics and Science Study (TIMSS).

The First Regional Comparative and Explanatory Study (PERCE, by its Spanish acronym) was implemented by LLECE in 1997 in 13 countries. Nine years later, in 2006, the Second Study (SERCE) tested students in 16 countries plus one Mexican state. The Third Study (TERCE) was implemented in 2013 in 15 countries and the same Mexican state.

PERCE and SERCE results are not comparable because the latter introduced a series of modifications resulting from the experience and knowledge gained from the implementation of PERCE. Some of the changes relate to sampling, test design, target population and the knowledge domains covered by the assessment. By contrast, SERCE and TERCE share an aligned methodology and are comparable studies that measure the progress countries have made in students' educational achievement since 2006.

The implementation of LLECE assessments is agreed between a National Coordinators Council and UNESCO's Regional Office of Education for Latin America and the Caribbean. Together these two groups define and decide all aspects of the study, including instrument design, administration and analysis. The UNESCO regional office is responsible for deciding on the more technical aspects of implementation and analysis. Quality assurance is also provided by an external Technical Consultative Committee of international experts in measurement and evaluation, who advise and support the technical coordination during the different phases of each assessment.

Each country is expected to contribute financially to participate in LLECE, but countries appear to vary considerably in the resources they have available. Countries unable to fully finance their participation are still encouraged to take part in the assessments; UNESCO provides the additional funding (Bilagher, 2013; Solano-Flores & Bonk, 2008).

Figure 1: Participant countries. Compiled by author based on LLECE regional reports; outlined map of Latin America retrieved from www.worldatlas.com. [The map marks the State of Nuevo Leon (Mexico), Cuba, the Dominican Republic, Honduras, Guatemala, Nicaragua, El Salvador, Costa Rica, Venezuela, Panama, Colombia, Ecuador, Peru, Chile, Brazil, Bolivia, Paraguay, Argentina and Uruguay; its legend distinguishes countries participating in PERCE, SERCE, PERCE & SERCE, SERCE & TERCE, PERCE & TERCE, and PERCE, SERCE & TERCE.]

Purpose of the assessment

LLECE assessments aim to provide information about the quality of education in the region and to guide decision-making in public education policies. In line with this purpose, LLECE's studies not only compare results between participant countries but also investigate the factors associated with student achievement. The focus is on identifying those school factors that can be influenced by educational policies.

Measurement objectives

Cognitive domains

LLECE's assessments measure student achievement in relation to curriculum objectives. In designing each round of assessments, LLECE's members review countries' curricula in the subjects and grades to be assessed, and identify the common content and cultural roots embedded in the curricula of the participating countries. Since SERCE (2006), these assessments have also incorporated the skills-for-life approach promoted by UNESCO. This approach considers that schools should provide knowledge and develop the skills, values and attitudes that students will need for active participation in society, as individuals and citizens (LLECE, 2008).

PERCE assessed students in two key areas of the curriculum: Language and Mathematics. Language is essential for developing knowledge and learning, and gives access to further knowledge and critical thinking. Mathematics is crucial for logical reasoning, problem solving and accuracy in data analysis (LLECE, 2000). In designing the PERCE tests, LLECE defined five conceptual domains to be assessed in each subject, as shown in Table 1.

Table 1. PERCE, conceptual domains assessed

Language:
- Identify different types of text
- Distinguish between text author and audience
- Identify the message in the text
- Recognize specific information within the text
- Identify vocabulary related to the text meaning

Mathematics:
- Numeracy
- Operations with natural numbers
- Common fractions
- Geometry
- General skills (reading graphs, pattern recognition, probabilities, and relationships between data)

SERCE widened the scope of the areas assessed by including Writing and Science, and also integrated the skills-for-life approach into the tests. Since this second study, LLECE's assessments have identified not only what students have and have not learnt, but also how they use this knowledge to understand, interpret and solve problems in a variety of real-life situations and contexts. From the analysis of participant countries' curricula, SERCE defined the conceptual domains and processes to be assessed in each subject, as shown in Table 2.

Table 2. SERCE, conceptual domains and processes assessed

Reading
- Conceptual domains: characteristics of the text (length, type, genre)
- Processes: reading (e.g. locate information, identify narrative argument, metalanguage application)

Mathematics
- Conceptual domains: numbers and operations; geometry; measurement; statistics; variation and change
- Processes: recognition of objects and elements; simple and complex problem solving

Sciences
- Conceptual domains: living beings and health; earth and environment; matter and energy
- Processes: recognition of concepts; interpretation and use of concepts; problem solving

Writing
- Aims to describe in detail children's abilities and knowledge regarding writing a text according to given directions.

Contextual information

Along with the cognitive tests, LLECE assessments collect information about the context of learning by administering questionnaires to students, their parents, teachers and school principals. These questionnaires are crucial for identifying the factors associated with student achievement. Among other topics, the student questionnaire collects information about the family and socio-cultural environment in which children live, the dynamics and interactions in the classroom, and the student's satisfaction with the school, classmates and teachers. The teacher questionnaire addresses aspects of teachers' socio-demographic and educational background, work conditions, teaching experience, satisfaction with the school, and teaching strategies and practice. The questionnaire for principals collects information about their personal characteristics, educational background and experience, approach to school management, expectations, and satisfaction with the school and its members. The principal is also asked to provide information on school facilities and geographic location. The parent questionnaire collects information regarding socio-demographic family characteristics, the availability of services and resources in the household, and the participation and support of parents in their children's education, among other topics.

TERCE introduced a national section on associated factors to the context questionnaires in some countries.[1] Through these national sections, each country could further investigate factors affecting learning that are specific to the national context (such as learning and cultural diversity, or learning and school violence). TERCE also introduced two new sections in some of the questionnaires to address the impact of ICT on the quality of learning, and the relationship between eating habits and learning.

[1] The relationship between learning and the ethnic and cultural diversity of an indigenous population was investigated in Ecuador and Guatemala. The relationship between learning and school violence was investigated in Paraguay and Guatemala. The relationship between learning and use of ICT was investigated in Costa Rica. The relationship between participation in full-day schooling and achievement of learning outcomes was investigated in Uruguay.

Target population and sampling methodology

LLECE's assessments are grade-based: students in the grades defined as the target population are tested regardless of their age. The PERCE target population was students in Grade 3 and Grade 4, who were tested in Language and Mathematics. Grade 3 was included because this is the level at which most Latin-American curricula expect children to have acquired the basic reading, writing and mathematics skills they will need to continue their studies. Grade 4 was included to collect information about the progress in achievement between these two grades (LLECE, 2001). To obtain a better picture of student achievement in primary education, the SERCE and TERCE target populations were students in Grade 3 and Grade 6 (Bilagher, 2013). The younger group was assessed in Reading, Mathematics and Writing; Grade 6 students were tested in the same three subjects plus Science.[2]

[2] The Science test was only administered in nine countries (Argentina, Colombia, Cuba, Dominican Republic, El Salvador, Panama, Paraguay, Peru, and Uruguay) and the Mexican state of Nuevo León.

LLECE administers tests to a stratified sample of students in each country; however, the three studies used different sampling procedures. PERCE defined three strata based on school location (mega-city/urban/rural) and two strata based on school management (public/private); these strata were also used for reporting results. The planned sample size was set at 4000 students per country, and the sampling procedure was neither weighted nor proportional to the target population in each country. Within a school, a minimum of 20 students per grade were randomly selected, and the minimum number of schools per country was 100. In total, nearly 55 000 students from 1500 schools were tested (LLECE, 2001).

SERCE used a stratified random-sampling procedure with selection proportional to size. Two strata were defined based on school location (urban/rural), two based on school management (public/private), three based on school size (small/medium/large), and four based on the grades per school, as represented by the ratio between Grade 6 and Grade 3 enrolment numbers. Within selected schools, all students in Grade 3 and Grade 6 were tested. In SERCE, strata were not used for reporting. At the national level, the minimum number of students required was set at 5300 for Grade 3 and 4700 for Grade 6, and the minimum number of schools per country was set at 150. In total, nearly 196 000 students from 3065 schools were tested (LLECE, 2010).
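To make the stratified, size-proportional selection concrete, the sketch below draws a school sample of this kind from a sampling frame in Python. It is illustrative only, not LLECE's actual procedure: the frame, stratum labels and per-stratum sample size are invented, and a simple weighted draw stands in for a formal PPS (probability proportional to size) systematic selection.

```python
# Minimal sketch of stratified school sampling with selection
# proportional to school size, loosely mirroring the SERCE design.
# All data and parameters below are illustrative assumptions.
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)

# Hypothetical sampling frame: one row per school
frame = pd.DataFrame({
    "school":    [f"S{i:03d}" for i in range(300)],
    "stratum":   rng.choice(["urban-public", "urban-private", "rural-public"], size=300),
    "enrolment": rng.integers(20, 400, size=300),
})

SCHOOLS_PER_STRATUM = 50  # illustrative target; the studies set national minimums instead

sampled = []
for stratum, group in frame.groupby("stratum"):
    # Within each stratum, a school's chance of selection is
    # proportional to its enrolment (weighted draw without replacement).
    weights = group["enrolment"] / group["enrolment"].sum()
    n = min(SCHOOLS_PER_STRATUM, len(group))
    sampled.append(group.sample(n=n, weights=weights, random_state=1))

sample = pd.concat(sampled)
print(sample.groupby("stratum").size())
```

In SERCE, all Grade 3 and Grade 6 students in a selected school were then tested, whereas TERCE (described below) instead drew one intact classroom per grade within each selected school.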
TERCE's sampling approach introduced modifications to the sampling frame of the assessment, changing the exclusion criteria, the number and type of strata, and the approach for selecting schools and students within schools (LLECE, 2013a).

In the previous studies, LLECE excluded from the target population schools with fewer than six students enrolled in the targeted grades. Students whose mother tongue was different from the language of the test, and who had not received at least two years of instruction in that language, were also excluded from the sample. TERCE removed this last criterion with the aim of studying the effect of mother tongue on student achievement. Furthermore, considering that in some countries small schools are a structural characteristic of the education system, and that their exclusion would reduce the representativeness of the sample, TERCE excluded only the smallest two per cent of schools.

TERCE maintained the school location and school type strata, modified the definition of the grades-per-school stratum (only Grade 3/only Grade 6/Grade 3 and Grade 6), and no longer used the school size stratum. School size was instead incorporated into the school selection process: within each stratum, schools were selected proportionally to their size. Within each selected school, an intact classroom per grade was randomly selected. The minimum number of participant schools per country was set at 150. The final number of participants will be known when LLECE releases TERCE's results.

Assessment administration

LLECE's assessments are administered during school hours near the end of the school year (May to June for northern countries and August to December for southern countries). Each country must hire and train independent test administrators, and UNESCO's regional office provides detailed manuals outlining the profile and responsibilities of the National Coordinator and the test administrators. The tests and corresponding context questionnaires are administered in a paper-and-pencil format. Most of the test items are multiple choice, but some open-ended items are also included. The booklets are scanned and organised at the national level and sent to UNESCO's regional office for analysis. As of November 2013, there is no defined frequency for LLECE assessments.

Reporting and dissemination

LLECE reports assessment results on a single continuous scale for each subject, obtained from the application of the Rasch model (an item response modelling approach). For the analysis of factors associated with student achievement, that is, for contextualising results, LLECE uses hierarchical linear models.
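For reference, the Rasch model gives the probability that a student answers an item correctly as a function of the student's ability and the item's difficulty, both expressed on the same logit scale. The notation below is a standard textbook form chosen here for illustration, not taken from LLECE's technical reports:

$$P(X_{pi} = 1 \mid \theta_p, b_i) = \frac{\exp(\theta_p - b_i)}{1 + \exp(\theta_p - b_i)}$$

where $X_{pi}$ is the scored response of student $p$ to item $i$, $\theta_p$ is the student's ability and $b_i$ is the item's difficulty. Because abilities and difficulties share one scale, results from all participating countries can be reported on the single continuous scale described above.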
LLECE's strategy for reporting results consists of two stages. In the first stage, LLECE publishes a report with the overall results for the region and for each country, focusing on comparing countries' average scores and variance in each of the assessed grades and subjects. Where previously agreed, sampling strata are also used to report results, as was the case for PERCE. In this first stage, results are also analysed in terms of performance levels describing what students can do. PERCE defined three overall performance levels per subject and, based on experts' judgement, also set an expected percentage of students for each level (90 per cent, 75 per cent and 50 per cent, from the lowest to the highest level). The distribution of students across these three levels is compared between countries; to be considered as having achieved an adequate performance level, a country should reach the expected percentage of students at each level (LLECE, 2000). SERCE defined four performance levels for each grade and each assessed subject. These levels are specified simultaneously for each content domain and cognitive process assessed (see Table 2), and reflect progressive levels of difficulty. Countries are compared based on the percentage of students reaching each of these levels (LLECE, 2008).

In the second stage, normally two or three years after the assessment has been completed, LLECE publishes a report on associated factors, exploring the relationship between student and school variables (obtained from the context questionnaires) and student achievement. The purpose is not only to relate contextual factors to student performance, but also to identify influential factors that could be modified by educational policy, particularly at the school level.
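Because students are sampled within schools, the associated-factors analyses model achievement hierarchically: students at level one, schools at level two. The sketch below shows the general shape of such a two-level model in Python; it is a minimal illustration under invented data and variable names (score, ses, school), not LLECE's actual analysis code.

```python
# Minimal sketch of a two-level hierarchical linear model:
# students (level 1) nested within schools (level 2).
# Data, variable names and effect sizes are illustrative assumptions.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_schools, per_school = 20, 10

school = np.repeat(np.arange(n_schools), per_school)
school_effect = rng.normal(0, 25, n_schools)[school]   # between-school variation
ses = rng.normal(0, 1, n_schools * per_school)         # socio-economic index
score = 500 + 20 * ses + school_effect + rng.normal(0, 50, n_schools * per_school)

data = pd.DataFrame({"score": score, "ses": ses, "school": school})

# Random intercept per school; the fixed effect for `ses` estimates the
# association between socio-economic background and achievement once
# clustering of students within schools is taken into account.
model = smf.mixedlm("score ~ ses", data, groups=data["school"])
print(model.fit().summary())
```

Estimated school-level effects are the ones most directly relevant to the policy focus described above, since they point to factors that educational policy could plausibly modify.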

Influence

Due to its regional nature, it is difficult to evaluate the impact that LLECE assessments have had in each participant country. LLECE reports always conclude with a chapter of recommendations for education policy development (LLECE, 2013b); however, no information is available about whether these recommendations have triggered any changes in policy or practice. A study on this topic is envisaged in the LLECE Strategic Plan for 2015 to 2019 (Bilagher, 2013). Arguably, LLECE's most important achievement is the strengthening of technical capacity in the design and administration of large-scale assessments in member countries, particularly in those with recently established assessment units (Diaz & Flores, 2008).

References

Bilagher, M. (2013). [Personal communication].

Diaz, M., & Flores, G. (2008). SERCE, Resultados Nacionales [SERCE, National Results]. Mexico D.F.: Instituto Nacional para la Evaluación de la Educación.

LLECE. (2000). Primer Estudio Internacional Comparativo sobre Lenguaje, Matemática y Factores Asociados. Segundo Informe [First International Comparative Study on Language, Mathematics and Associated Factors. Second Report]. Santiago, Chile: LLECE / UNESCO-Santiago.

LLECE. (2001). Primer Estudio Internacional Comparativo sobre Lenguaje, Matemática y Factores Asociados. Informe Técnico [First International Comparative Study on Language, Mathematics and Associated Factors. Technical Report]. Santiago, Chile: LLECE / UNESCO-Santiago.

LLECE. (2008). Segundo Estudio Regional Comparativo de los aprendizajes de los estudiantes de América Latina y El Caribe. Primer Reporte [Second Regional Comparative Study of student learning in Latin America and the Caribbean. First Report]. Santiago, Chile: LLECE / UNESCO-Santiago.

LLECE. (2010). Compendio de los Manuales del SERCE [Compendium of SERCE manuals]. Santiago, Chile: LLECE / OREALC-UNESCO Santiago.

LLECE. (2013a). Diseño Muestral Tercer Estudio Regional Comparativo y Explicativo (TERCE) [Sampling design for the Third Regional Comparative and Explanatory Study (TERCE)].

LLECE. (2013b). Laboratorio Latinoamericano de Evaluación de la Calidad de la Educación. Retrieved November 2013, from http://www.llece.org

Solano-Flores, G., & Bonk, W. (2008). Evaluation of the Latin American Laboratory for the Evaluation of Educational Quality (LLECE). UNESCO Internal Oversight Service, Evaluation Section. Retrieved from http://unesdoc.unesco.org/images/0016/001626/162674e.pdf

UNESCO. (2013). Latin American Laboratory for Assessment of the Quality of Education (LLECE). Retrieved October 2013, from http://portal.unesco.org/geography/en/ev.php-URL_ID=7919&URL_DO=DO_TOPIC&URL_SECTION=201.html

The ACER Centre for Global Education Monitoring supports the monitoring of educational outcomes worldwide, holding the view that the systematic and strategic collection of data on educational outcomes, and factors related to those outcomes, can inform policy aimed at improving educational progress for all learners. http://www.acer.edu.au/gem