Success rates and inspection outcomes: a research paper for the Sixth Form Colleges Forum


Nicholas Allen
April 2012

This paper explores the relationship between success rates and inspection outcomes in schools and sixth form colleges.

Key Findings

The Sixth Form Colleges Forum recognises the role that the application of success rates across the learning and skills sector has played in guiding improvements in student outcomes. However:

- Success rates for A levels and AS levels in schools are in line with those in all colleges, but some way below those in sixth form colleges.
- The release of school success rates has revealed that while the top quarter of schools perform in line with the top quarter of sixth form colleges, success rates in many schools are well below those found in sixth form colleges. No sixth form college has a success rate below 71%; 300 school sixth forms do.
- There is a relationship between the size of a school sixth form and its success rate: while the overall success rate in school sixth forms is 79.8%, in schools with fewer than 100 pupils in the sixth form it is just 69.0%.
- The harshness of the college inspection regime towards institutions with a success rate below 76% is not mirrored in inspection judgements on schools with success rates below 76%. No sixth form college with a success rate below 76% is currently graded as better than satisfactory; 48% of schools with a success rate between 71% and 76% are graded as good or better.
- Schools with relatively low success rates often secure high inspection judgements. This paper recognises that until now success rates have not been part of the schools inspection framework, but this relationship suggests that other important aspects of performance, such as retention and the number of students dropping courses, have been neglected.
David Igoe, Chief Executive
Jonathan Godfrey, Chair of Council, Principal of Hereford Sixth Form College
Local Government House, Smith Square, London SW1P 3HZ
020 7187 7349
sfcf@local.gov.uk
www.sfcforum.org.uk

The use of sector specific benchmarks undermines the inspection process and interferes with the ability of students and parents to make an informed choice between different providers. The arguments in this paper are founded on a fundamental principle: fairness. Inspection judgements should be a reflection of the quality of the student experience, regardless of the designation of the institution a student attends. We should not accept lower levels of student performance at certain types of institution; nor should we disguise high levels of performance in certain institutions by setting them a higher benchmark.

The assumed differences in the academic profile of students recruited by sixth form colleges and those recruited by other colleges, which underpin sector specific benchmarks, are without foundation. The GCSE profile of students on AS and A level courses in GFE and tertiary colleges is only slightly below that found in sixth form colleges.

Success rates by institution type and institution size

In 2008 the Department for Education and other partner organisations set out to harmonise the measures of performance used post-16. Success rates were the measure chosen. In February 2012, the success rates for schools from 2009-10 were published. This gives us the opportunity to explore what typical performance is in the schools sector, and to see how this performance varies across institutions. The school success rates have been published as experimental statistics. As such, their purpose is to inform and guide rather than to provide definitive judgement. The analysis presented here goes beyond simply looking at the success rates: it explores the relationship between success rates and inspection judgements, and between success rates and institution size.
The analysis of school success rates and inspection outcomes is built on a database drawn from three sources: the key stage five league tables data for 2011 outcomes [1] (published in January 2012), school success rate data for 2010-11 [2] (published in February 2012) and the inspection outcomes for all section five inspections as published on the OFSTED website [3]. The analysis of college success rates and inspection outcomes is likewise built on a database drawn from three sources: the key stage five league tables data for 2011 outcomes [4] (published in January 2012), the FE choices data for 2009-10 [5] (published in February 2012 on the FE choices website) and the inspection outcomes for all learning and skills inspections as published on the OFSTED website [6].

It is recognised that the schools success rate data for 2009-10 has been compiled using a slightly different methodology [7] to that deployed in colleges, and that schools are relative newcomers to this form of analysis; being assessed using success rates may well (in time) have an impact on success rates. However, these caveats do not reduce the status of these statistics to junk. The statistics are based on school census returns, and schools were given the opportunity to review the source data behind the success rates and to offer additional information where they felt that students had achieved the qualifications they started but this had not been recognised by the matching process.

Our first analysis looks at success rates by institution type. For schools, the overall success rate is used. For colleges, the AS/A2 success rate from the FE choices analysis is used. These provide the closest we can get to a level playing field. The data presentation technique selected is the interquartile range. The interquartile range tells us about performance in the middle half of any distribution: in this case, the middle half of success rate scores for each institution type. In effect we rank all the institutions and then remove the top quarter and the bottom quarter. By removing the extremes we can see how normal performance varies between the different institution types.

[1] http://www.education.gov.uk/schools/performance/download_data.html
[2] http://www.education.gov.uk/schools/adminandfinance/financialmanagement/b00204762/institutions/school-success-rates
[3] http://www.ofsted.gov.uk/resources/official-statistics-maintained-school-inspections-and-outcomes
[4] http://www.education.gov.uk/schools/performance/download_data.html
[5] http://fechoices.skillsfundingagency.bis.gov.uk/pages/home.aspx
[6] http://www.ofsted.gov.uk/resources/official-statistics-learning-and-skills-inspections-and-outcomes
[7] The methodology used in schools does differ from that used in colleges, but in future years it will not. College success rates are based entirely on data supplied by colleges in funding returns. School success rates use data drawn from two sources: the number of students on course is drawn from the school census, while the achievement data is drawn directly from the exam boards. These records are then combined using a process called fuzzy matching. Fuzzy matching uses first name, last name, date of birth, home postcode, QAN and discount codes to match the outcomes to the census. However, in this year's census, achievement data will be collected, so the school and college methods will be harmonised. It has taken eighteen months to produce the data for 2009-10, and outputs are published as experimental statistics. Schools have had the opportunity to submit revisions to the data where they have evidence that students have achieved.
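The interquartile-range comparison just described can be sketched in a few lines of Python. The success-rate figures below are illustrative, not the published data:

```python
# Compute the interquartile range (Q1, Q3) of a set of success rates:
# the band containing the middle half of institutions.
from statistics import quantiles

def interquartile_range(success_rates):
    q1, _median, q3 = quantiles(success_rates, n=4, method="inclusive")
    return q1, q3

# Hypothetical success rates (%): a tightly bunched sector vs a spread-out one.
sfc_rates = [81, 82, 83, 84, 85, 86, 87]
school_rates = [69, 73, 76, 80, 83, 86, 90]

print(interquartile_range(sfc_rates))     # narrow band: little variation
print(interquartile_range(school_rates))  # wide band: large variation
```

A narrow band, as for sixth form colleges in Figure One, means institutions perform similarly; a wide band means performance varies substantially within the sector.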

Figure One: Success rates 2009-10: interquartile range by institution type

Figure One gives the interquartile range for success rates in each of four institution types: sixth form colleges, tertiary colleges, general further education colleges and school sixth forms. The interquartile range shows us two important things: how high success rates are relative to other provider types, and how much performance varies within these "normal" institutions. We find that the middle 50% of sixth form colleges are bunched together between 81% and 86%, a range of just five percentage points. Schools, by contrast, have an interquartile range of twelve percentage points, and GFE colleges fourteen. Differences between sixth form colleges are typically small; differences between schools are much larger.

Figure One also shows us that typical outcomes in sixth form colleges tend to be well above those found in schools and in other colleges: 75% of colleges would be in the top 45% of schools.

What this data also shows is the danger of comparison to sector specific benchmarks. A sixth form college might have a success rate of 81%, putting it at the 25th percentile for sixth form colleges. We might reasonably conclude that the institution needs to improve somewhat, as it is not achieving the standards achieved by most sixth form colleges. But if we look at the figures for schools, we can see that if this college were a school it would actually have achieved an above average performance.

Let us make the focus a little sharper. The sixth form college that volunteered for the no-notice inspection pilot has a success rate at A and AS level of 79.6%. It was graded inadequate. A success rate of 79.6% would place the college in question exactly at the median for school sixth forms. Whilst we accept that success rates should not be the sole driving force behind inspection judgements, this playing field seems far from level. A full quarter of school sixth forms have a success rate below 73%. Only one of the sixth form colleges graded inadequate in the last year had a success rate below 73%.

If we turn our attention to the Chief Inspector's Report for 2010-11 [8], we find that despite poor performance in a sizeable chunk of school sixth forms, only 2% of school sixth forms inspected last year were graded inadequate. Of course, the schools inspectors did not have this success rate data to hand in this form when they made judgements on these schools. On the one hand, it would be unfair of us to expect them to reach the same judgement as us without access to the particular statistic with which we are currently concerned. But does this let the schools inspectorate off the hook? Surely they asked some fairly simple questions to establish the quality of outcomes for sixth form students before deciding on a grade. Presumably they asked how many students reach the end of the course, how many students drop an AS level during the lower sixth, what proportion of students finish three A levels, and so forth. It is not necessary to have a success rate to hand to develop a clear picture of institutional performance.

School success rates and size of sixth form

We can also use the school success rate data to work out what overall performance is across all schools, by matching the success rates with student numbers taken from the league tables data-set.
Having removed any schools with a success rate below 20%, and having weighted each school's success rate by the size of its sixth form, the overall school success rate is 79.8%: exactly in line with the average for all institutions on long courses in the learning and skills sector (but, it must be said, some way below the 83.9% of the sixth form colleges). The closeness of school and college performance adds weight to the suggestion that we should take school success rate statistics seriously. While overall schools perform in line with colleges, this average disguises a rather interesting story underneath.

Figure Two: Schools success rates 2010-11 by size of sixth form

Number of students   Number of schools   Success rate
400 or more          123                 81.8%
300-399              286                 83.6%
200-299              568                 80.6%
100-199              559                 74.2%
Fewer than 100       160                 69.0%

[8] http://www.ofsted.gov.uk/resources/annualreport1011
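The weighting step described above can be sketched as follows. The 20% floor for implausible returns follows the text, but the school records themselves are invented for illustration:

```python
# Overall success rate across schools, weighting each school's rate
# by the size of its sixth form, after dropping implausible returns
# (success rates below the floor).
def overall_success_rate(schools, floor=20.0):
    kept = [s for s in schools if s["success_rate"] >= floor]
    total_students = sum(s["students"] for s in kept)
    weighted = sum(s["success_rate"] * s["students"] for s in kept)
    return weighted / total_students

schools = [
    {"students": 450, "success_rate": 81.8},
    {"students": 80,  "success_rate": 69.0},
    {"students": 250, "success_rate": 5.0},   # implausible return, excluded
]
print(round(overall_success_rate(schools), 1))
```

Weighting by sixth-form size means a large school counts for more than a small one, so the result reflects the outcome for a typical student rather than a typical institution.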

In Figure Two, we find that school sixth forms with 200 or more students are performing well in success rate terms. However, in school sixth forms with fewer than 100 students, success rates are well below national rates [9]. To put it bluntly, in small school sixth forms the success rate is fifteen percentage points below that in sixth form colleges. In human terms, imagine a class of 20 students on an AS course: in a small school sixth form, 14 will achieve the qualification at the end of the year; in a sixth form college, 17 would.

To disguise this excellent performance in sixth form colleges by requiring them to achieve a higher benchmark is totally inappropriate, and completely at odds with the market based approach to education favoured by the current administration (and, to be fair, with the philosophy which has dominated educational policy for the last twenty-five years). Furthermore, on this evidence, the policy of allowing the proliferation of small school sixth forms would seem an inappropriate route to raising student performance.

As a footnote to this discussion, what is the inspection grade profile of the 160 very small sixth forms we identified in Figure Two? We have inspection grades from the last five years for 137 of these institutions: only two were graded inadequate.

Success rates and inspection outcomes

A third analysis looks at the relationship between inspection outcomes and success rates. We can use this to explore whether a particular level of success on A level and AS level programmes is correlated with similar inspection outcomes in different provider types.

[9] The high level of performance in the 300-399 band reflects the significant number of selective school sixth forms of this size.

Figure Three: Success rates and inspection judgements: maintained schools

Figures Three and Four work by plotting the most recent inspection outcome of each school (Figure Three) and each college (Figure Four) against the individual success rate for that institution. Each school or college is represented by a blue dot. A regression line has been added, which allows us to see the trend in the data.

Figure Four: Success rates and inspection judgements: sixth form colleges
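The regression lines in Figures Three and Four can be reproduced with an ordinary least-squares fit of inspection grade against success rate. A minimal sketch, using made-up (success rate, grade) pairs rather than the real inspection data:

```python
# Ordinary least-squares fit: returns (slope, intercept) of the line
# grade = slope * success_rate + intercept.
def fit_line(points):
    n = len(points)
    mean_x = sum(x for x, _ in points) / n
    mean_y = sum(y for _, y in points) / n
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in points)
    sxx = sum((x - mean_x) ** 2 for x, _ in points)
    slope = sxy / sxx
    return slope, mean_y - slope * mean_x

# Illustrative pairs: higher success rates line up with better
# (numerically lower) inspection grades, so the fitted slope is negative.
points = [(70, 4), (75, 3), (80, 3), (85, 2), (90, 1)]
slope, intercept = fit_line(points)
print(slope < 0)
```

Comparing where the two fitted lines sit for the same success rate is the essence of the argument that follows: the same raw performance maps to different expected grades in the two sectors.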

The first thing that is striking about Figure Four is the remarkable level of success rates in sixth form colleges. At A level and AS level, the lowest success rate recorded for a sixth form college is 71%. Over 300 school sixth forms have a success rate worse than the worst sixth form college. Many of these 300 school sixth forms have received outstanding grades both for overall effectiveness of the sixth form and for quality of outcomes in the sixth form. Whilst it is undoubtedly the case that some of the very low success rate scores in schools are a result of errors in completing the school census (which presumably is itself an indication of poor leadership and management), the contrast in the regression lines between school sixth forms and sixth form colleges suggests a rather different level of challenge in school inspections and college inspections.

One way of comparing these graphs is to take a particular success rate percentage (say 75%) and look at the point where the regression line crosses that success rate. In Figure Three, 80% of schools with a success rate of 75% are securing a grade 2. In Figure Four, no sixth form college with that success rate secures better than a grade 3.

We can also express this in a simpler form by looking at the grade profiles of schools and colleges in a particular band. Figure Five gives the grade profile from inspections for schools and colleges with a success rate between 71% and 76%.

Figure Five: Inspection outcomes: institutions with a success rate between 71% and 76%

                      Grade 1   Grade 2   Grade 3   Grade 4   All (number)
School sixth forms    8%        40%       50%       2%        288
Sixth form colleges   0%        0%        75%       25%       8

Figure Five demonstrates that almost 50% of the schools in this success rate band secured a good or better grade for the sixth form; no sixth form college did. It is perhaps here that the unfairness of separate benchmarks for designated sixth form colleges is at its most glaring.
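The banding used in Figure Five can be sketched as a simple filter-and-count over institution records. The records here are invented for illustration:

```python
# Grade profile (percentages) of institutions whose success rate
# falls within a given band, as in Figure Five.
def grade_profile(institutions, low=71.0, high=76.0):
    band = [i["grade"] for i in institutions if low <= i["success_rate"] <= high]
    return {g: round(100 * band.count(g) / len(band)) for g in sorted(set(band))}

schools = [
    {"success_rate": 72.0, "grade": 2},
    {"success_rate": 74.5, "grade": 2},
    {"success_rate": 75.0, "grade": 3},
    {"success_rate": 80.0, "grade": 1},   # outside the band, excluded
]
print(grade_profile(schools))
```

Running the same band over both sectors' records, as the paper does, yields the two rows of Figure Five directly.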
Being the worst sixth form college in success rate terms is not in itself an indication of inadequacy. The use of separate benchmarks is at its most absurd when we consider the example of Suffolk One, a recently opened 16-19 school built with a capacity of over 2,000 students. It is a predominantly A level institution, and almost an archetypal sixth form college. Yet for inspection purposes Suffolk One counts as a school, and will be inspected by the schools inspectorate using the schools framework.

The use of separate benchmarks for sixth form colleges

It is often suggested that GFEs tend to deal with students with a radically different GCSE profile to those found in sixth form colleges. Indeed, GFE colleges make much of their perceived inclusivity. However, Figure Six demonstrates that when one looks at the GCSE profiles of students attending different types of college this assumption is not substantiated, and that a potentially significant unfairness results from sixth form colleges having separate benchmarks.

Figure Six: Average GCSE scores of students completing AS and A level courses, 2011

           SFC   GFE   Tertiary
AS level   5.9   5.7   5.8
A2 level   6.0   5.8   5.9

Figure Six offers an average GCSE score by institution type, where a scale of A* = 8, A = 7 and so forth has been used. We see that the typical AS level student in a sixth form college has an average GCSE score of 5.9 (equivalent to a profile of nine grade Bs and one grade C at GCSE). The figure for GFE colleges is 5.7, equivalent to a profile of seven grade Bs and three grade Cs. The typical student in the sixth form college sector is better qualified than their GFE counterpart, but the differences are slight, and not of the magnitude that might be assumed.

When we look at the average GCSE scores of individual institutions, the case for separate benchmarks for sixth form colleges is further undermined.

Figure Seven: Average GCSE score benchmarks by institution type: A level

            SFC   GFE/Tertiary
25th %ile   5.7   5.6
Median      5.9   5.7
75th %ile   6.1   5.9

Figure Seven summarises an analysis which ranks each individual college according to the GCSE profile of its students taking A levels. The column for sixth form colleges indicates that the middle college has an average GCSE score of 5.9; the top quarter of colleges have an average GCSE score of 6.1 or higher, and the bottom quarter 5.7 or below. It demonstrates that a full quarter of GFE and tertiary colleges have a GCSE score above 5.9, which is above the median for sixth form colleges. Similarly, a full quarter of sixth form colleges are dealing with cohorts whose prior attainment profile would put them below average for a GFE or tertiary college.

Finally, we need to establish that having institutions with differing prior attainment profiles is of significance. To do this we need to explore the relationship between prior attainment and success rates.
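The average GCSE score used in Figures Six and Seven follows directly from the points scale the paper describes (A* = 8, A = 7, and so on):

```python
# Average GCSE score on the points scale A* = 8, A = 7, ..., G = 1.
GRADE_POINTS = {"A*": 8, "A": 7, "B": 6, "C": 5, "D": 4, "E": 3, "F": 2, "G": 1}

def average_gcse_score(grades):
    return sum(GRADE_POINTS[g] for g in grades) / len(grades)

# Nine grade Bs and one grade C, the typical SFC profile quoted above:
print(average_gcse_score(["B"] * 9 + ["C"]))        # 5.9
# Seven grade Bs and three grade Cs, the GFE figure:
print(average_gcse_score(["B"] * 7 + ["C"] * 3))    # 5.7
```

Averaging a whole cohort's scores this way gives the per-institution figures that Figure Seven ranks into quartiles.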
It is at AS level that the unfairness of the raw success rate measure becomes most apparent. Figure Eight shows that there is a very clear relationship between prior attainment and performance at AS level. In the top bands, almost all students successfully achieve the qualifications they start; in the lowest three bands, under two thirds do. At AS level the overall success rate is 80.3%, though this varies from 96% for students with an average GCSE score above 7.0 down to 56% for students with average GCSE scores below 4.7.

If all providers of post-16 education had student bodies with identical GCSE profiles this would not be an issue, but consider this: the sixth form college with the highest prior attainment profile has an average GCSE score of 6.9; the sixth form college with the lowest has an average GCSE score of 5.0. The former draws its students almost exclusively from the top four bands; the latter draws the vast majority of its students from the bottom four bands. Raw success rate analysis treats these institutions as if they were dealing with the same raw material.

Figure Eight: AS level success rates and prior attainment

The principle here must be that designation should have no impact on inspection judgement, and the prior attainment data presented here suggests that the use of separate benchmarks for sixth form colleges is predicated on misguided assumptions about significant differences in the profile of students attending them. The assumed statistical basis for separate SFC benchmarks is unfounded. In effect, sixth form colleges are having a proportion of their superior performance eroded by the use of benchmarks above those for other providers.

One of the sixth form colleges judged inadequate in 2011-12, which has an average GCSE score in the bottom quarter, had significant discussion with OFSTED prior to its inspection regarding which benchmarks would be used. Initially the lead inspector agreed to use GFE benchmarks, but was then informed by OFSTED that this was not appropriate. The use of GFE benchmarks (or, perhaps more appropriately, a benchmark for all providers) may not have made a difference to the overall inspection judgement, but the inspection process would have been seen to be fair.