Pan-Canadian Assessment Program FAQ: PCAP 2013 Report on the Pan-Canadian Assessment of Science, Reading, and Mathematics

What is PCAP?
The Pan-Canadian Assessment Program (PCAP) is a survey of the knowledge and skills of Canadian students in three core learning areas: science, reading, and mathematics. It was developed and is administered by the Council of Ministers of Education, Canada (CMEC) with the active involvement of all participating ministries and departments of education.

Why was PCAP developed?
CMEC developed PCAP to ensure the availability of statistically valid, comparable data on student achievement in Canada. PCAP data will be used by education researchers, policy-makers, and government officials to understand and improve provincial and territorial education systems.

Which students are assessed in PCAP?
For PCAP 2013, approximately 32,000 students in Grade 8 (Secondary II in Quebec) from over 1,500 schools across the country were tested. Science was the major focus of the assessment; reading and mathematics were also assessed. Approximately 24,000 students were tested in English and 8,000 in French. Students from all provinces participated in PCAP 2013.

Who funds PCAP and how much does it cost?
PCAP is funded by provinces and territories through their long-standing intergovernmental body, the Council of Ministers of Education, Canada (CMEC). On average, PCAP costs approximately $1.2 million annually.

How often is PCAP administered?
PCAP is administered every three years on a nine-year cycle that allows for comparison of results over time in all three domains: science, reading, and mathematics. These data help provinces and territories understand how the performance of their education systems may have changed over time. Each PCAP assessment has a major domain, or focus, and two minor domains. The major domain (marked for each year below) changes every three years:

2007: Reading (major domain), Mathematics, Science
2010: Mathematics (major domain), Reading, Science
2013: Science (major domain), Reading, Mathematics
2016: Reading (major domain), Mathematics, Science
2019: Mathematics (major domain), Reading, Science
2022: Science (major domain), Reading, Mathematics

A major-domain assessment can be compared over time with another minor- or major-domain assessment in the same subject.

Why does PCAP have one major domain and two minor domains?
This structure was chosen to align PCAP with the Organisation for Economic Co-operation and Development's (OECD's) Programme for International Student Assessment (PISA). It is expected that a significant portion of the Grade 8/Secondary II student cohort from PCAP 2013 will take the PISA 2015 assessment when those students are 15 years old. Because PISA 2015 will also have science as its major domain, it will be possible to compare performance patterns between the two assessments.

Can performance among different provinces and territories really be compared?
Education systems and school programs differ from one jurisdiction to another, so comparing results can be a complex task. PCAP allows a variety of education systems to be compared according to a set of common benchmarks in science, reading, and mathematics. The benchmarks have been established through extensive consultation among provinces and territories and with the guidance of statisticians, psychometricians, and education experts. By agreeing to common benchmarks, provinces and territories are able to determine their performance relative to one another, even where their approaches to education differ.

Is the assessment fair to students in each province and territory?
The assessment is not tied to the curriculum of a particular province or territory; instead, it is a fair measure of students' abilities to apply their learning skills to real-life situations. It measures how well students are doing; it does not attempt to assess approaches to learning. Provinces and territories also work to ensure that the unique qualities of our country's education systems are taken into account. Factors such as linguistic differences, rural and urban school locations, and cultural influences are all considered, both in the assessment itself and in the related context questionnaires. In addition, the common curricular framework for each subject incorporates a perspective agreed upon by all jurisdictions and based on the latest pedagogical research.

How are the results from PCAP determined?
PCAP uses four equivalent versions of the test to ensure both broad content coverage and a fair and accurate means of comparing student performance across provinces. To make the scores obtained from the various versions comparable, assessment experts converted the raw scores from the four versions to a single standard scale. Students' total scores in each subject area were transposed onto this common scale, which ranges from 0 to 1,000, with the average for the pan-Canadian population set at 500. The resulting scores are called scale scores. As a result of this conversion, the scores of two-thirds of the students participating in PCAP 2013 fell within the range of 400 to 600 points, consistent with a statistically normal distribution of scores.
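As a rough illustration of that final rescaling step only, the sketch below linearly standardizes a set of hypothetical raw scores to a mean of 500 and a standard deviation of 100, so that roughly two-thirds of scale scores fall between 400 and 600. The actual PCAP procedure involves psychometric equating across the four test versions, which this sketch does not attempt to reproduce; the raw scores and function name here are purely illustrative.

```python
import statistics

def to_scale_scores(raw_scores, target_mean=500, target_sd=100):
    """Linearly rescale raw test scores so the group mean is 500 and the
    standard deviation is 100; with a roughly normal distribution, about
    two-thirds of the resulting scale scores fall between 400 and 600."""
    mean = statistics.mean(raw_scores)
    sd = statistics.pstdev(raw_scores)  # population standard deviation
    return [target_mean + target_sd * (raw - mean) / sd for raw in raw_scores]

# Hypothetical raw scores (out of 40) from one test booklet
raw = [22, 31, 18, 27, 35, 24, 29, 20]
print([round(s) for s in to_scale_scores(raw)])
```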

How does PCAP define scientific literacy?
In PCAP 2013, scientific literacy was broadly defined as a student's evolving competencies in understanding the nature of science and in using science-related attitudes, skills, and knowledge to conduct inquiries, to solve problems, and to reason scientifically in order to understand and make evidence-based decisions about science-related issues. Questions assessing students' scientific literacy were developed to reflect four science sub-domains:

- Nature of science (e.g., describe the processes of science inquiry, distinguish between qualitative and quantitative data, identify characteristics of measurement);
- Life science (e.g., describe characteristics and needs of living things, distinguish between cells and cell components);
- Physical science (e.g., describe the properties and components of matter and explain interactions between those components, demonstrate scientific literacy with respect to physical science issues);
- Earth science (e.g., explain how water is a resource for society, explain patterns of change and their effects on water resources on Earth).

Three competencies were also considered: science inquiry, problem solving, and scientific reasoning. The four sub-domains reflect traditional groupings of science skills and knowledge, while the three competencies apply across all sub-domains. The PCAP 2013 report provides an example of a science unit to show the types of knowledge and skills that are accessible to students at different levels of performance. In addition, questions about students' attitudes toward science were embedded throughout the assessment in order to gauge students' interest in and awareness of science-related issues, their respect and support for evidence-based knowledge, and their awareness of sustainable development and stewardship.

What do the performance levels in science mean?
Performance levels represent how well students are doing based on the cognitive demand and degree of difficulty of the test items. Cognitive demand is defined by the level of reasoning required of the student to answer an item correctly, from high demand to low demand; degree of difficulty is defined by a statistical determination of the collective performance of the students on the assessment. There were four levels of performance in the science component of PCAP 2013:

Level 4: students who scored 655 or above
Level 3: students who scored between 516 and 654
Level 2: students who scored between 379 and 515
Level 1: students who scored 378 or less

Level 2 is the expected level of performance for Grade 8/Secondary II students. Level 1 represents performance below the level expected of students in their grade, while Levels 3 and 4 represent higher levels of performance. The expected levels of performance were established by a panel of assessment and education experts from across Canada and confirmed against actual student test responses.
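For readers who want to reproduce the banding, here is a minimal sketch that maps a scale score onto these four levels. The cut scores come from the list above; the function name is ours and is only illustrative.

```python
def science_performance_level(scale_score: int) -> int:
    """Map a PCAP 2013 science scale score to its performance level,
    using the cut scores listed above. Level 2 is the expected level
    for Grade 8/Secondary II students."""
    if scale_score >= 655:
        return 4
    if scale_score >= 516:
        return 3
    if scale_score >= 379:
        return 2
    return 1

# A score of 540 falls in Level 3, above the expected level for the grade.
print(science_performance_level(540))  # -> 3
```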

What did we learn from PCAP 2013?
Some of the key findings about the performance of students include the following:

- Canadian youth are managing well in science. At the pan-Canadian level, 91 per cent of students are achieving the expected level of performance for their grade, and close to half are achieving at an even higher level. In all provinces, without exception, over 85 per cent of students are achieving at the expected level of performance in science, and in some cases well over 90 per cent.
- With PCAP 2013, some analysis over time is now possible in mathematics and reading. In mathematics, PCAP data show an improvement in Grade 8 student achievement in most provinces between 2010 and 2013.
- Student performance in reading was assessed by PCAP in 2007, 2010, and 2013. Reading performance was stable across Canada between 2007 and 2013 and showed some improvement between 2010 and 2013.
- Gender does not appear to be a factor in performance in either science or mathematics at the Grade 8 level in Canada; PCAP data show no significant difference in the performance of girls and boys in either subject. In all provinces, however, girls continue to perform better than boys in reading, a pattern also observed in PISA and other assessments.
- Across provinces, Alberta and Ontario students show the highest performance in science, while Ontario and Quebec students achieve above the Canadian average in reading and mathematics, respectively.
- In most provinces with English majority-language school systems, students in the English systems do better in science and reading than students in the French systems. The reverse is true in mathematics, where students in the French systems tend to outperform their English counterparts. In Quebec, science and reading results are the same in the English and French systems, while students in the French system do better than those in the English system in mathematics.

PCAP 2013 also collected extensive contextual information from questionnaires completed by students, teachers, and principals. This information will be published in late 2014 and should offer insight into some of the factors that may influence student performance.

When will the next PCAP assessment take place?
PCAP will be administered again in 2016. PCAP 2016 will have reading as its major focus; mathematics and science will be assessed as minor domains.