Monitoring trends in educational growth


Monitoring trends in educational growth: a partnership service to monitor the educational growth of students in the early to middle years of schooling

The ACER Centre for Global Education Monitoring

Measuring quality and equity in education

Until recently, attention in the field of educational development was focused on improving access to schooling, particularly at the primary level. It is now acknowledged that improving access is a necessary but not sufficient condition for improving educational outcomes. If individuals and countries are to realise the full benefits of education, then attending school must involve learning. This acknowledgement has shifted the focus of educational development from improving access to schooling to improving both access to and the quality of education. This shift in focus encompasses a concern with equity: ensuring that opportunities for education of the same quality are provided impartially to all individuals, regardless of their gender, socioeconomic status, ethnicity or geographical location.

Assessments that measure learning outcomes yield important information about progress in addressing the concerns of quality and equity in education. As such, these assessments have themselves recently become a topic of vigorous discussion and debate. While measuring learning outcomes is widely advocated, there is recognition that it presents technical and practical challenges for many countries. To support countries in this complex undertaking, the Australian Council for Educational Research (ACER) offers a partnership service called Monitoring Trends in Educational Growth (MTEG).

What is ACER?

ACER's work seeks to identify, understand and build on skills and practices that improve learning for all learners. ACER has extensive experience in gathering and using assessment data to monitor educational development and to inform the evidence-based formulation and evaluation of educational policy. It has an extensive record of working with countries to implement learning assessments, and of collaborating with emerging economies and their development partners to support educational improvement.

ACER has led, or contributed to, the development of the assessment tools and methodologies at the core of many international and national assessments. This work draws on staff expertise in sampling, survey management, scaling methodology and survey data analysis, as well as in the interpretation and reporting of results. International projects in which ACER has played a key role include the OECD Programme for International Student Assessment (PISA), the IEA International Civic and Citizenship Education Study (ICCS) and the IEA International Computer and Information Literacy Study (ICILS). ACER has also assisted in the development of national assessment programs in Australia, Afghanistan, Bangladesh, the United Arab Emirates and Zimbabwe, and supported the implementation of PISA 2009 in Tamil Nadu and Himachal Pradesh in India. In addition, ACER has provided support with educational assessment in Bhutan, Botswana, Cambodia, Chile, Colombia, Fiji, India, Indonesia, Malaysia, the Maldives, Mexico, Pakistan, Papua New Guinea, the Philippines and Vietnam.

The MTEG approach

MTEG offers a flexible, collaborative approach to developing and implementing an assessment of learning outcomes that yields high-quality, nationally relevant data.
MTEG is a service in which ACER staff work closely with each country to develop an assessment program that meets the country's monitoring needs while adhering as closely as possible to a set of defined design principles and quality standards.

A flexible, collaborative approach

While different countries have different educational policy priorities, the first step in developing, implementing or evaluating educational policy is the same in all countries: to seek out evidence that supports decision making. Evidence is information, and since policy priorities vary from country to country, so too do information needs. The design of MTEG is flexible, so it can address these different information needs. If information needs remain unformulated, MTEG staff can help to identify them by working collaboratively with national policymakers, policy analysts and other key stakeholders. ACER is well prepared to support efforts towards improvement in all educational systems, and recognises that fragile and conflict-affected countries face particular difficulties in their pursuit of this improvement.

MTEG is designed to be collaborative from the earliest stage, when the direction of the assessment is being determined, and onwards through all stages of the program. It is collaborative because it is a partnership in which ACER's MTEG staff support in-country personnel. In this way, MTEG is more than an assessment tailored to particular information needs: it is also a service that offers significant opportunities for system strengthening across the range of assessment activities.

High-quality, locally relevant data

High-quality data can be obtained from an assessment if the aim of meeting rigorous, predefined technical standards has driven the development of the methods and processes at all stages of the assessment cycle. It is with this aim in mind that ACER has developed the methods and processes that form the foundation of MTEG. Though MTEG's methods and processes may be adapted to suit a particular national context, they remain aligned with international best practice. In MTEG, in-country personnel are supported so that they can follow these best-practice methods and processes, and therefore obtain high-quality data from their assessment of learning outcomes.

Tailoring MTEG

The MTEG assessment model is built around a set of standard tools and design features that can be tailored to the local context in four areas: target populations, assessment domains, contextual questionnaires, and assessment frequency and timing.

Target populations

In the standard MTEG service, the target populations can be one or more of Grade 3, Grade 6 and Grade 9. This range covers most of compulsory schooling, which yields meaningful data about students' growth as they progress through the educational system. Grade 3 and Grade 6 are offered in recognition of the commonly observed interest in assessing learning outcomes in the early to middle years of primary education.
Grade 9 is offered because it can give a picture of students' knowledge and skills as they approach adulthood.

High-quality data can inform policy decision making because they paint a consistent and true picture of how things really are in the population under investigation. But high-quality data usefully inform policy decision making only if they answer national information needs, that is, if they are nationally relevant. MTEG yields nationally relevant data because it is flexible: it encourages participants to focus on matters of interest, and does not oblige them to collect data about, and report on, matters of no national interest.

Assessment domains

In recognition of the fact that education should equip students for life beyond the classroom, MTEG takes into account the curriculum context and measures the extent to which students have consolidated and generalised their learning so that it can be applied in different situations. In the standard MTEG service, the core assessment domains are mathematical literacy, reading literacy and writing literacy. While it is recommended that students are assessed in all three domains (together they cover what are widely considered to be the foundational skills of education), each country can choose which of the three to include in a tailored MTEG model. Countries can also assess students in other domains. For example, for the youngest target population (Grade 3), a country might prefer to assess basic reading skills rather than reading literacy. For the oldest target population (Grade 9), a country might wish to add an assessment of so-called 21st-century skills, including employability skills such as problem solving, initiative and self-management.

Contextual questionnaires

It is not enough in itself to obtain data about student learning outcomes in the selected domains. Data must also be obtained about the assessed students' backgrounds and learning environment. Only through the combination of these two types of data do the variations in learning outcomes across a student population become evident. In MTEG, background information is collected with contextual questionnaires. The standard MTEG contextual questionnaires are a student questionnaire, a parent questionnaire, a teacher questionnaire and a principal questionnaire. Each questionnaire consists of a standard component designed to capture predefined data, and an optional component tailored to capture any additional data of interest to the country. The standard component is included to enable cross-national comparisons, where they are desired.
The questions in the optional component are tailored to address national information needs. The student questionnaire collects demographic and home information about students, and information about their experiences of and attitudes towards school. The parent questionnaire collects information about parents' educational levels and socio-economic status, and about their interaction with their children and cooperation with the children's schools and teachers. The teacher questionnaire collects information about the characteristics of teachers, the instructional practices they employ, and their views about different educational tools and aids. The principal questionnaire collects information about school leadership and about administrative matters in schools, such as educational priorities, infrastructure and resourcing. Although administration of the full set of questionnaires provides the most complete picture of the environment in which students are learning, a tailored MTEG program would include only those questionnaires expected to yield data that address the national information needs.

Assessment frequency and timing

In its recommended standard form, MTEG offers an ongoing assessment cycle of four years. The four-year cycle allows sufficient time for comprehensive data analysis and reporting, and for the implementation of policies whose development has been informed by the assessment data. Beyond the first assessment cycle, this timeframe also allows sufficient opportunity for any implemented policies to have an impact on learning outcomes. While the four-year cycle is recommended, cycles of other lengths can be implemented where they might better address national information needs. Within an MTEG assessment cycle, each target population is assessed at regular intervals. The intervals are set so that data about changes in learning outcomes over time are obtained as soon as practicable after the introduction of the assessment.
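The combination of outcome data and questionnaire background data described under "Contextual questionnaires" can be sketched in a few lines. This is a hypothetical illustration only: the student identifiers, background fields and scale scores are invented, and MTEG's actual data processing is far more involved.

```python
from statistics import mean

# Hypothetical illustration: scale scores keyed by student id, joined with
# student-questionnaire background data, then summarised by subgroup.
# All identifiers, field names and scores are invented for this sketch.
scores = {"s1": 512.0, "s2": 468.0, "s3": 530.0, "s4": 455.0}
questionnaire = {
    "s1": {"gender": "girl", "location": "urban"},
    "s2": {"gender": "boy", "location": "rural"},
    "s3": {"gender": "girl", "location": "rural"},
    "s4": {"gender": "boy", "location": "urban"},
}

def subgroup_means(scores, background, field):
    """Mean scale score for each value of one background field."""
    groups = {}
    for sid, score in scores.items():
        groups.setdefault(background[sid][field], []).append(score)
    return {key: mean(vals) for key, vals in groups.items()}

print(subgroup_means(scores, questionnaire, "gender"))
# {'girl': 521.0, 'boy': 461.5}
```

Only when the two data sources are joined does the variation between subgroups become visible; neither the score file nor the questionnaire file shows it on its own.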

How MTEG reports results

Results from an MTEG assessment are reported on learning metrics. These metrics enable useful comparisons and provide powerful information about changes in learning outcomes over time.

Learning metrics

MTEG's learning metrics quantify student proficiency by assigning it a numerical score located on a continuous scale. But learning metrics do more than quantify student proficiency: they also describe it. They can do this because their continuous scales span and define different proficiency levels, and at each level the details of what students at that level know, understand and can do are described. Proficiency descriptions are derived from the cognitive demands of the test items within each described level, and the items in turn reflect the construct being assessed, as defined in the underlying assessment framework. There is one learning metric for each of the three core MTEG domains of mathematical literacy, reading literacy and writing literacy. The scale for each learning metric is constructed by using some common questions within any year for adjacent grades, and by using some common questions over time, from one year to the next. In this way all the tests in a particular domain are linked and equated, and it is this feature that provides the means of comparing groups and of monitoring growth. Since all tests are linked and equated, proficiency indicators of students in different grades in the same year, or of students in the same grade in different years, can all be placed on the same scale.

Useful comparisons

By reporting results on learning metrics, MTEG can enable three kinds of useful comparison: national benchmarking, comparisons between in-country subpopulations, and international comparisons. The first step in national benchmarking is the establishment of cut-scores used to define standards of student performance that recognise current realities in the country and set achievable goals.
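The common-question linking described under "Learning metrics" can be illustrated with one classical approach, mean/mean linking, in which the shift between two test forms is estimated from the difficulties of the items they share. This is a simplified sketch with invented item difficulties, not ACER's actual scaling methodology.

```python
from statistics import mean

def linking_shift(ref_difficulties, new_difficulties, common_items):
    """Constant that places the new form's scale onto the reference
    scale, estimated from the items the two forms share."""
    ref = mean(ref_difficulties[i] for i in common_items)
    new = mean(new_difficulties[i] for i in common_items)
    return ref - new

# Invented item difficulties (in logits) for two adjacent-grade forms
# that share anchor items q2 and q3.
grade3_form = {"q1": -1.2, "q2": -0.4, "q3": 0.1}
grade6_form = {"q2": -1.0, "q3": -0.5, "q4": 0.8}

shift = linking_shift(grade3_form, grade6_form, ["q2", "q3"])
print(round(shift, 6))  # 0.6: add this to Grade 6 estimates to express
                        # them on the Grade 3 reference scale
```

Once every form is shifted onto the reference scale in this way, results from different grades and different years can be compared directly, which is what makes the growth and trend reporting below possible.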
Establishing cut-scores and defining standards requires expert knowledge of both the assessment domains and the populations being assessed. It is an activity undertaken through collaboration between in-country experts and ACER staff. Once standards have been set, student proficiency as measured in MTEG can be benchmarked against them, and expectations can therefore be contextualised. Comparisons between in-country subpopulations of interest provide information about the relative strengths and weaknesses of those subpopulations, information that is essential for policymakers and policy analysts. If desired, MTEG can also provide comparisons between countries. These can compare results between different MTEG countries, or benchmark a country's MTEG results against the results of other countries in other assessments, such as the Programme for International Student Assessment (PISA), the Progress in International Reading Literacy Study (PIRLS), the Trends in International Mathematics and Science Study (TIMSS), the Annual Status of Education Report (ASER) survey, the Early Grade Reading Assessment (EGRA), the Southern and Eastern Africa Consortium for Monitoring Educational Quality (SACMEQ) project, the Programme d'Analyse des Systèmes Éducatifs de la CONFEMEN (PASEC, the Programme for the Analysis of CONFEMEN Education Systems; CONFEMEN is a conference of education ministries from francophone countries across the world) and the Laboratorio Latinoamericano de Evaluación de la Calidad de la Educación (LLECE, the Latin American Laboratory for Assessment of the Quality of Education).

Changes in learning outcomes over time

Since MTEG is an ongoing assessment service in which results in the core domains are reported on a common scale, it is not limited to a one-off snapshot of student learning outcomes; rather, it can provide a developing picture of the changes in learning outcomes over time. These changes are usually described in terms of growth and trends.
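The national benchmarking described above ultimately reduces to classifying each scale score against the agreed cut-scores. A minimal sketch, with hypothetical cut-scores and band labels:

```python
def proficiency_band(score, cut_scores):
    """Map a scale score to a named performance band.

    cut_scores is a list of (lower_cut, label) pairs in ascending order;
    the values and labels here are invented for illustration.
    """
    band = "Below minimum standard"
    for cut, label in cut_scores:
        if score >= cut:
            band = label
    return band

# Hypothetical national cut-scores on a 0-1000 reporting scale.
cuts = [(400, "Minimum standard"), (500, "Proficient"), (600, "Advanced")]

print(proficiency_band(455, cuts))  # Minimum standard
print(proficiency_band(610, cuts))  # Advanced
```

The substantive work is in setting the cut-scores and writing the band descriptions, which is why MTEG treats standard setting as a collaborative, expert-led activity rather than a purely technical one.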
Educational growth refers to how much students are growing in their proficiency as they move through the school system. A trend is a pattern made over time by the results from a like group. Growth and trend information is important for teachers, education policymakers and policy analysts. Measurements of growth can indicate how much value is being added at different stages of students' education, and can point to the next steps needed in an individual's growth trajectory. Such value-adding is not usually evident in ongoing curriculum-based assessments, because these are typically directed towards a narrow range of curriculum objectives for individual students in a particular grade. Trend information, by highlighting patterns in progress, can show where educational reforms have been successful and where targeted interventions are required. The types of growth and trend information MTEG can provide include: growth between school grades; changes over time in learning outcomes at specific grades; changes over time in growth between grades; and changes over time in the learning outcomes of different subpopulations.
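Because all results sit on one linked scale, the growth and trend quantities listed above reduce to simple differences of cohort means. A sketch with invented numbers:

```python
# Invented cohort means on a common MTEG-style metric:
# assessment year -> grade -> mean scale score.
results = {
    2016: {3: 420.0, 6: 498.0},
    2020: {3: 432.0, 6: 505.0},
}

def growth_between_grades(results, year, lower_grade, upper_grade):
    """Growth between two grades within one assessment year."""
    return results[year][upper_grade] - results[year][lower_grade]

def trend_at_grade(results, grade, earlier_year, later_year):
    """Change over time in outcomes at a specific grade."""
    return results[later_year][grade] - results[earlier_year][grade]

print(growth_between_grades(results, 2016, 3, 6))  # 78.0
print(trend_at_grade(results, 3, 2016, 2020))      # 12.0
```

Changes over time in growth between grades, the third type of information listed, would simply compare `growth_between_grades` across two assessment years.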

The stages of an MTEG cycle

MTEG proceeds via a collaborative partnership between ACER's MTEG staff and the participating country. The partnership is facilitated through the establishment at ACER of a country desk, headed by a dedicated senior ACER staff member responsible for coordinating support for the participating country during all stages of the MTEG cycle. One cycle of the MTEG service consists of four stages: environmental analysis; assessment plan and roll-out; data analysis, interpretation and reporting; and research and evaluation. System strengthening occurs through all four stages, and the result is in-country skills that are solid, lasting and transferable to other national assessment activities.

Stage 1: Environmental analysis

At the start of the first MTEG cycle, an environmental analysis is conducted. This analysis aims to recognise the priorities and needs of key stakeholders, to understand the policy development culture and processes, and to identify any features of in-country systems and structures that might help or hinder the implementation of the MTEG program. Though the first-cycle analysis is the most intense, subsequent MTEG cycles include a collaborative review process to ensure that each cycle begins with up-to-date information about the local context and changing needs and priorities.

Stage 2: Assessment plan and roll-out

During this stage ACER MTEG staff and national personnel work together to identify areas of policy interest and to develop a suitable plan for assessment, reporting and dissemination that reflects the priorities identified. They decide which elements of the standard MTEG design are relevant, and what tailoring of the service is required to meet national needs and priorities. They also work together to assemble and train an in-country project team, and to conduct the data collection activities. Where necessary, ACER MTEG staff can offer support by providing training in aspects of survey participation, by managing the development of survey instruments with in-country input, and by providing guidance in matters of survey operations and data capture.

Stage 3: Data analysis, interpretation and reporting

During this stage, the data analysis and interpretation activities planned in the second stage are undertaken. These activities yield rich and detailed information in the areas of interest in national educational policy. The reporting activities that follow adhere to a structured dissemination plan that was also formulated during the second stage. These dissemination activities support policy review and development. If desired, at this stage ACER MTEG staff can support the establishment of an in-country reporting committee consisting of individuals from the different bodies involved in the educational planning process.

Stage 4: Research and evaluation

Research and evaluation activities in this stage can be conducted by in-country academic institutions with the support of ACER MTEG staff. These activities may aim to present evidence to assist in evaluating an educational policy, or evidence in support of a change to educational policy. They may also inform the direction taken in the subsequent cycle of the MTEG assessment.

Getting involved

ACER's Monitoring Trends in Educational Growth (MTEG) partnership service can support countries in the difficult and complex task of measuring student learning outcomes. MTEG offers:

- guidance and support from internationally renowned experts
- a methodology whose flexibility in relation to target populations, assessment domains, contextual questionnaires, and assessment frequency and timing enables the development of a service that meets national policy needs
- a methodology that is highly collaborative, and therefore provides significant opportunities for local system strengthening across the range of assessment activities
- the opportunity to obtain high-quality, locally relevant data that yield powerful information about growth and trends in learning outcomes within the education system.

To find out more about MTEG, contact ACER at mteg@acer.edu.au.

The ACER Centre for Global Education Monitoring supports the monitoring of educational outcomes worldwide, holding the view that the systematic and strategic collection of data on educational outcomes, and on factors related to those outcomes, can inform policy aimed at improving educational progress for all learners.