
Known knowns, known unknowns and unknown unknowns
The Six Dimensions Project Report 2017
Nick Allen
June 2017

SFCA works to lead and support a thriving and sustainable Sixth Form College sector by being an effective advocate, adviser and information provider for members and a reliable and authoritative source of insight, data and expertise for policy-makers and opinion-formers.

For more information about this report please contact Vanessa Donhowe:
Sixth Form Colleges Association
4th Floor, Ergon House
Horseferry Road
London SW1P 2AL
info@sixthformcolleges.org
sixthformcolleges.org

Sixth Form Colleges Association

Known Knowns, Known Unknowns, and Unknown Unknowns (Part I)
The Six Dimensions Project Report 2017

Introduction

It would be difficult to imagine a more turbulent and complex backdrop to this year's analysis. At a structural level, academisation, merger, and the formation of new and occasionally unexpected alliances are reshaping the sector. The transition to funding by programmes of study is nearing final implementation with the gradual tapering of formula protection, and we have an inspection framework that inspects provision types instead of curriculum areas. This year has seen the first appearance of most of the new measures in the league tables: we now have a funding methodology and accountability framework that allows us significant freedom to form programmes of study to meet the needs of students, as long as the needs of students happen to correspond to the imperatives of the league tables.

While we may try to protect students from any real awareness of the above changes, we are less able to disguise the somewhat unmissable reshaping of the curriculum. As far as the curriculum goes, we live in a time of great change, and perhaps the defining characteristic of this change is that it is not happening all at once. It might be described, as it was, memorably, at an SFCA conference a few years back, as "phased confusion". This phasing of confusion is not limited to A level reform. The years 2016, 2017 and 2018 will see the gradual implementation of GCSE reform. The world of general vocational qualifications is not immune to this turbulence. Old style BTECs appear to have a stay of execution: at least for those prepared to make their vocational provision invisible in the performance tables.

In a famous US Department of Defense news briefing in February 2002, Donald Rumsfeld commented on the fundamental problem of military strategy. He said: There are known knowns. These are things we know we know. There are known unknowns.
That is to say, there are things that we know we don't know. But there are also unknown unknowns. There are things we don't know that we don't know. It strikes me that this is a rather neat summary of the situation in which we find ourselves. There are some things we know, other things that we know we don't, and there is always the fear that it is the unknown unknowns that will derail our attempts to navigate the challenges we face. As the reforms in

the policy and assessment environment progress, many of the known unknowns become known knowns and, (Rumsfeld again), Each year we discover a few more of those unknown unknowns. There is a bit of an issue about the known knowns, however: many people think they are dealing with known knowns, but are actually dealing with aspects that remain unknown, or they have imperfect knowledge of the things they think they understand. This report attempts to secure our knowledge about the known knowns, clarify which of these remain unknown, and help us to spot when other people are telling us with great conviction things that they either don't know or wrongly believe they know. As in other spheres of life, education is full of those willing to declaim loudly and with great confidence about what needs to be done whilst possessing almost no actual knowledge about the subject in question. As W. B. Yeats put it, "The best lack all conviction, while the worst are full of passionate intensity."

Our journey will take us through the national context, and will pause to look at what we can learn from the evidence at each turn. In Chapter One: the changing assessment and accountability context we focus on the curriculum and assessment context in which we operate. We summarise the changes to GCSE and A levels, with a view to exploring what we know, what we need to know, and when is the first time we will be able to know it. We include an important discussion of comparable outcomes and how awarding takes place, placed alongside what we already know about awarding in the most recent summer.

Chapter Two: first contact begins the analysis of how colleges are reacting to changing times by looking at trends in the number of subjects taken in the lower sixth, weekly contact time and student outcomes. We know that some colleges have gone for a linear three A levels for two years model, while others retained something as close to a traditional modular model as possible.
What we have not yet developed is a clear sense of exactly how many followed each path. While an analysis of how colleges have approached the challenge of the transition to a linear world is of interest, the range of approaches also throws up significant issues in terms of how colleges appear to the outside world. We examine, in particular, the implications for achievement rates (formerly success rates), which continue to have a key place in Ofsted's risk assessment process. Last year's six dimensions report considered the theoretical implications for achievement rates: this analysis looks at the actual data.

Chapter Three: first blood the experience of the first year of linearity looks at what we can learn from the first set of linear AS exams in terms of student performance. While AS level probably has a rather short shelf life, we can learn much by looking at these exams as a set of reformed specifications. Subsequent waves of reformed subjects at AS level, A level and beyond will share similar features and bring up similar issues.

Chapter Four: on retention looks at the new performance tables retention measure and considers the extent to which it provides a reliable insight into retention patterns at sixth form colleges. We compare the figures from the league tables with actual retention figures from colleges. Our analysis proceeds to look at the funding implications of recording students on two year courses. This may seem like a very niche interest, but the implications for colleges are far reaching.

Chapter Five: on deprivation looks to build an understanding of the performance of students from disadvantaged backgrounds in sixth form colleges. This is particularly timely given

Ofsted's increasing emphasis on disadvantage. For us to respond to Ofsted's interest successfully we need to know the full picture of what happens in terms of participation and outcomes across the sector.

Chapter Six: a note on prior attainment offers a guide to colleges on the calculation of prior attainment in the coming years. It introduces the methodology for calculating average GCSE scores when students have a blend of qualifications using the A*-G and 9-1 grading systems.

Appendix One: performance in revised AS levels works through each of the thirteen subjects that were in the first wave of reform, and examines how performance in the reformed specifications compared to performance in the previous year. Appendix Two: understanding six dimensions reports works through each of the measures in the six dimensions suite and offers guidance on interpreting performance data. There are some changes this year, including the renaming of success and achievement rates, and the move to measures of grades rather than points to bring the analysis in line with government methodology.

Perhaps the most important part of the title of this year's report is the Part I bit. We are just at the start of a period of curriculum change that will severely test us. Even if GCSE reform and A level reform are implemented without any further changes, we are still looking at 2021 before our first set of A level exam results taken by students who have completed a full set of GCSE 9-1 qualifications. It is the aim of the six dimensions project to help colleges navigate these complex waters, and make decisions based on knowledge of what is actually happening in the sector. In many ways this year's analysis establishes a platform on which to build our understanding. In a year's time we can look at changing patterns of delivery and entries per subject and contact time against a clear starting point. We can also track the impact of curriculum decisions in terms of student performance and implications for funding and performance tables.
In September, we will have our first sight of outcomes in reformed A levels, and a second (if rather smaller) set of reformed AS levels to examine. The quality of the information we have, and the focus with which we use it, will help to shape the success of our response to the complexities of the world in which we find ourselves.

Nick Allen, 12 June 2017

Key Findings

- Only one college adopted a fully linear two year approach, while 16 adopted a two year model for linear subjects, 24 adopted a linear model for some subjects, and 49 retained a fully modular structure (Figure 2.0)
- Those colleges that have moved to a linear model see a significant short term boost in achievement rates (formerly success rates) (Figure 2.4)
- 49% of lower sixth students started four subjects, but the proportions in individual colleges varied wildly (Figure 2.8)
- Average contact time per subject for lower sixth students is 4 hours 35 minutes (Figure 2.9)
- Across a range of measures, outcomes for students in colleges with high contact time per subject are lower than those for students in colleges with relatively low contact time per subject (Figure 2.12)
- Overall outcomes in reformed AS levels were not significantly different to performance in the unreformed versions of those subjects the previous year. However, performance in some subjects, notably art and design subjects and English literature, was significantly down on previous years (Figures 3.2 and 3.7)
- The Department for Education's retention measure significantly overestimates retention in sixth form colleges. Overall, retention is overestimated by around 13% (Figures 4.1 to 4.3)
- The move to recording courses over two years has significant implications for the calculation of the retention factor. The risk to sixth form college budgets is around £10,000,000 per year (Figure 4.5)
- Students in receipt of free college meals perform well across a range of measures, and are more likely to take facilitating subjects than their more affluent peers (Figures 5.1 to 5.2)

Chapter One: the changing assessment and accountability context

We start this year's report with an exploration of key features of the national context. What makes this year's account of the environment somewhat different to that of previous years is that a significant part of it focuses on what is going on in schools. Those of you who are familiar with how awarding works, and understand the new key stage four accountability framework, might want to fast forward to the section on GCSE reform and A level reform. For everyone else, this contextual discussion is designed to raise some questions about the students that will be starting post 16 study over the next three years, and note some points about how the accountability context may shape the approaches to qualifications taken in schools and the resulting implications for how prepared students will be for studying at A level. We also present a run-through of how awarding works, and look at how the A level pass rate has changed since grading was first introduced. What emerges from this discussion is the historical uniqueness of the situation in which we find ourselves. We simply do not have a precedent to draw on.

Accountability at key stage four

While schools are battling to deliver the new GCSE specifications with many of the same challenges that colleges are facing at A level, there is an added dimension to the challenge that schools are facing, and this is the introduction of an accountability measure called Progress Eight. What makes Progress Eight different to other measures in the performance tables is that it has a direct relationship with inspection scheduling. Put simply, if a school scores below -0.5 by this measure, it triggers an inspection. We will see below that there are a number of different ways of scoring below -0.5, and not all of them relate to the quality of the school.
While some colleges may have watched the performance tables with interest, and some with significant concern due to local competition, we have never been held to account by the inspectorate in as direct (and as mathematical) a way as schools will be by Progress Eight.

Progress Eight is most easily understood by examining the composition of the measure. Imagine that the performance of each student is represented by three buckets, and there are strict rules regarding which subjects can be included in each bucket. Bucket one is reserved for English and Mathematics GCSEs. The student's Maths GCSE result is put in this bucket and is double counted, reflecting the importance the government attaches to outcomes in Mathematics. If a student does two English qualifications, the highest grade is added to the bucket and this too is double counted. If a student only does one English qualification, this is placed in the bucket but is only single counted. There is a clear incentive for schools to get students to enter both English qualifications (even if they fail one): not doing two English qualifications can have major implications for a student's overall score.

Bucket two is reserved for the best three grades in EBACC subjects. EBACC subjects are similar to facilitating subjects at A level and comprise science subjects, computer science, geography, history and languages. Only the best three grades that a student gets in these subjects are added to the bucket. Other EBACC grades can be saved for bucket three if they are required. If a student has not

done three EBACC subjects then the bucket remains partially full, again running the risk of a poor score by this measure. What we are starting to see here is a clear attempt to shape the curriculum offered to students by manipulating the accountability levers. The motivations are somewhat mixed here. On the one hand you could suggest that the whole concept of the EBACC favours subjects from a different generation and is based on the qualifications that politicians did when they were at school, rather than the subjects required for success today. There is, however, a second motive here which is rather more laudable. If you look at students who qualify for pupil premium, or schools with high proportions of students that qualify for pupil premium, you find that the students are much less likely to be entered for EBACC subjects. Progress Eight can be seen as an attempt to force schools to ensure that appropriate numbers of these students are entered for appropriate qualifications. If we are to widen access to university for students from disadvantaged backgrounds we need to ensure that they follow a suite of GCSE subjects that are likely to facilitate progression. This theme is developed in Chapter Five: on deprivation.

Bucket three can be filled with any three further subjects from the approved qualifications list. This can include the grade in the English qualification not included in bucket one, other EBACC subjects with grades equal to or lower than the ones placed in bucket two, any other GCSEs, and other approved academic, arts or vocational qualifications. The best three grades are included. This bucket has proved to be somewhat controversial, as schools have been searching for qualifications on the approved list that can be delivered with relatively little input and relatively high reward. One subject that saw a sudden surge in popularity was the European Computer Driving Licence.
We know of one school that has delivered this qualification in two hours and that is two hours in total, not two hours a week. Sadly, the European Computer Driving Licence has recently been removed from the approved qualifications list.

Figure 1.0: The composition of the Progress Eight measure

  Bucket One:   Maths (x2), English (x2?)
  Bucket Two:   EBACC, EBACC, EBACC
  Bucket Three: Other, Other, Other

The Progress Eight measure then favours students who do double English, follow a relatively traditional GCSE curriculum, and who do at least eight subjects. One of the issues with the five GCSEs including Maths and English measure was that it only placed value on the first five subjects. This

gives a useful value to breadth. The other thing that Progress Eight rewards is students getting relatively high grades in each of the subjects they attempt. While its sister measure, Attainment Eight, is a simple calculation of the average points total amassed by each student at a school, Progress Eight is a value added measure. That is to say, the judgement the model is making is whether a school's students score as highly as similarly qualified students do nationally. The starting point for comparison is the key stage two results that students secured at age 11.

To calculate a Progress Eight score for an individual, the eight grades in the measure are converted to points. In 2016, a familiar 0-8 scale was used, though this will change slightly for 2017 with the introduction of reformed qualifications in Maths and English. Once the total points score has been established, it is compared to an expected number of points for students with that particular level of prior attainment in key stage two SATs. While students are only given a level 1 to 5 (with 4 the expected level) when leaving primary school, candidates are actually graded at a much finer level of granularity in the background data. It is this granular data that is used in the Progress Eight calculation. This process is repeated for all candidates, and a school's Progress Eight score is the average performance of its students.

It is interesting that a significant thrust of government policy appears to be about giving schools freedom from the constraints of the national curriculum: academies and free schools are not required to follow it at all. However, while these institutions may be free to not follow the national curriculum if they so choose, there are very clear imperatives to follow the Progress Eight curriculum, and there are significant penalties if they do not.
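The bucket-filling rules described above can be sketched in a few lines of code. This is a deliberately simplified model for illustration only: grades are assumed to be already converted to points, and the expected-points lookup is an invented toy table, not DfE data.

```python
# Simplified sketch of the Progress Eight calculation described above.
# Grades are assumed to be already converted to points; the expected-points
# table passed to progress8_score is illustrative, not real DfE data.

def attainment8_points(maths, english, ebacc, other):
    """maths: points for Maths GCSE; english: list of points for English
    qualifications; ebacc: list of points in EBACC subjects; other: list
    of points in any other approved subjects."""
    eng = sorted(english, reverse=True)
    # Bucket one: Maths double counted; best English double counted only
    # if the student entered two English qualifications.
    bucket1 = 2 * maths + (2 * eng[0] if len(eng) >= 2 else sum(eng[:1]))
    # Bucket two: best three EBACC grades.
    ebacc_sorted = sorted(ebacc, reverse=True)
    bucket2 = sum(ebacc_sorted[:3])
    # Bucket three: best three of everything left over, including the
    # spare English grade and any surplus EBACC grades.
    leftovers = sorted(eng[1:] + ebacc_sorted[3:] + other, reverse=True)
    bucket3 = sum(leftovers[:3])
    return bucket1 + bucket2 + bucket3

def progress8_score(points, ks2_band, expected):
    """Gap between actual and expected points for the student's key stage
    two prior-attainment band; the published measure divides the gap by ten."""
    return (points - expected[ks2_band]) / 10

# A student with Maths at 6 points, two English qualifications (6 and 5),
# four EBACC grades (7, 6, 5, 4) and two other subjects (5 and 4):
points = attainment8_points(6, [6, 5], [7, 6, 5, 4], [5, 4])
print(points)  # 56: bucket one 24, bucket two 18, bucket three 14
```

A school's Progress Eight score is then simply the average of its students' individual scores, exactly as the text describes; an unfilled bucket (a missing English entry, fewer than three EBACC subjects) shows up directly as lost points.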
The implications for colleges are significant, as what this represents is a fundamental shift in the raw material that we deal with, most obviously expressed in the incoming prior attainment of students. Our understanding of what a GCSE score represents is not as secure as it used to be. A student who in the past might have been entered only for five or six subjects that the school believed they would pass with a high grade might find themselves entered for eight subjects. While this might improve the Progress Eight score of the school, it might depress the student's average GCSE score. Our ability to judge what a particular average GCSE score has to say about a student's potential at A level is significantly compromised. We consider the implications of Progress Eight and other changes in schools more fully in the section on GCSE reform below.

A level grading: history and process

We are entering into the unknown. The summer of 2017 will see the first awarding of the new linear A levels, and as yet we have no sense of what the grade profiles will look like, or what will happen to these grade profiles over time. In considering what student outcomes will be like (and in turn shaping our response to student performance in a linear world) there are two lines of enquiry that we can pursue that will help to paint the backdrop to the first awarding of grades in the linear A levels. Firstly, it is worth looking at A level grading over time, considering in passing what the pass rate was like the last time a linear

model was in operation. Secondly, and perhaps more importantly, it is worth looking at how awarding actually takes place.

When A level was introduced, in 1951, it did not have graded outcomes. Certificates only recorded whether a candidate had passed or failed. Gradually exam boards started to introduce their own unofficial grading schemes: UCLES used a scale of 1-9, while the JMB recorded achievement in percentages, rounded to the nearest 5%. In 1963, the Secondary Schools Examinations Council (SSEC) introduced a standardised grade system, which survived until 1987. Grades were awarded on a quota basis as follows:

Figure 1.1: A level grade quotas, 1963 to 1987

  Grade   Percentage of entry   Cumulative percentage
  A       10                    10
  B       15                    25
  C       10                    35
  D       15                    50
  E       20                    70
  O*      20                    90
  F       10                    100

In 1987, the Secondary Examinations Council decided to revise its approach and issued new instructions to exam boards to award A level grades on the basis of the examiners' judgement of the quality of student work. If the examiners believed that more than 70% had met the A level standard they could award more students a pass. This was the key change that allowed the A level pass rate and the proportion of high grades awarded to rise.

There is a logic behind both the original quota system and the standards model. With the quotas, employers and universities would know that an A grade put you in the top ten per cent of candidates, a B in the top 25% and so forth. With the revised standards model, the logic shifted to one where all students who had reached a particular standard should be rewarded, because they had reached that standard. In the standards model, the focus of the awarding process became getting the judgement right at key grades. While it would be something of a diversion to give much time to the question, there are some interesting implications here for the discussion of how hard or easy A levels have been over time. Prior to 1987, they were not necessarily hard, they just had a pass rate of 70%, regardless of how well (or how badly) students did.
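Awarding under a quota system is purely a ranking exercise, which a short sketch makes plain. The quota pattern below follows the text (A for the top 10% of the entry, a cumulative pass rate of 70%); the individual splits for the middle grades are illustrative.

```python
# Toy illustration of quota-based awarding: grades go to fixed percentages
# of the ranked entry, regardless of the absolute standard of the work.
# Quota splits are illustrative (A = top 10%, cumulative A-E pass = 70%).

QUOTAS = [("A", 10), ("B", 15), ("C", 10), ("D", 15), ("E", 20), ("O", 20)]

def award_by_quota(marks):
    """Return a list of (mark, grade) pairs, best marks first; candidates
    left over once the quotas are exhausted fail."""
    ranked = sorted(marks, reverse=True)
    n = len(ranked)
    awards, start = [], 0
    for grade, pct in QUOTAS:
        end = start + round(n * pct / 100)
        awards += [(mark, grade) for mark in ranked[start:end]]
        start = end
    awards += [(mark, "F") for mark in ranked[start:]]  # the remainder fail
    return awards

results = award_by_quota(list(range(1, 101)))   # 100 candidates
print(results[0], results[10])  # (100, 'A') (90, 'B')
```

However well (or badly) the cohort as a whole performs, exactly 70 of the 100 candidates pass, which is precisely the property the 1987 reform abandoned.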
In 2009, a further significant change to awarding at A level was implemented with the introduction of comparable outcomes. These changes coincided with the 2008 specification changes, which made some adjustments to the content of curriculum 2000 specifications, reduced the number of modules, and in some cases changed the mode of assessment. Comparable outcomes is based on a very different model to those that preceded it. It rests on the idea that students should get similar results to students of equal prior attainment the previous year. With such an approach, the pass rate could change, but only if the nature of the students attempting the qualification changes. The motivation here was threefold. Firstly, the introduction of comparable outcomes was an attempt to halt grade inflation. Secondly, the methodology is an attempt to ensure that no cohort is disadvantaged because they happen to be preparing for A levels in a year where there is significant curriculum

change. Thirdly, comparable outcomes ensures stability in the currency of A levels in the jobs and higher education market.

Figure 1.2: A level pass rate and number of entries, 1951 to 2011 [chart: pass rate (percentage) and number of entries by year]

Figure 1.2 shows us the A level pass rate over the last 60 years. Until 1987 it fluctuates between 67% and 70%, reflecting the quota system in place at the time. After 1987 the pass rate rises steadily, with year on year improvements. In these years increasing numbers of students were deemed to have met the A level standard, and the awarding arrangements were such that all those who met the standard were awarded the qualification. These improvements start to slow until 2002, where there is a sudden leap upwards, coinciding with the first year of the AS/A2 model. Note that at this point the A level entry dropped significantly. It might not simply have been that the AS/A2 model was easier; it might be that a big part of this change in pass rate was a reflection of a change in the nature of the cohort, with significant numbers of students not completing two years of study. Those reaching the A level exams may well have been better suited to A level study than the cohort in previous years. The pass rate continued to rise until 2009. Since the introduction of comparable outcomes, the A level pass rate has barely moved.

The relevance of this is that it gives us a glimpse as to what things were like the last time a linear model was used. It is unlikely that the A level pass rate will drop to 2001 levels: teachers have become much more skilled at delivering what is required by the specification, and are much more

attuned to the nuance of exam questions, and accountability frameworks encourage teachers to focus on preparing students to meet the assessment requirements of individual qualifications. Monitoring is undoubtedly more thorough, and the use of value added data to monitor student progress far more widespread than it was at the turn of the millennium. It is however likely that the pass rate will drop over the next few years, largely as a result of changes to the cohort of students reaching A level exams, and the relative difficulty of demonstrating A level standard in a linear framework.

Figure 1.3: The proportion of A grades (including A* grades) at A level since 1963 [chart: percentage by year]

Figure 1.3 repeats the time series analysis for A grades. We see patterns identical to those for the pass rate. It is interesting that the proportion of A* grades awarded today is equal to the proportion of A grades under the pre-1987 methodology.

Awarding

To further enhance our understanding of what is likely to happen at A level over the next few years, it is instructive to review how A level awarding actually takes place in the comparable outcomes era. The process rests on the activities of the awarding committee. This is made up of the chair of examiners (for that subject), the chief examiner, principal examiners (who write question papers), principal moderators (responsible for coursework units), the reviser (responsible for revising questions; sometimes called the scrutineer) and an exam board officer.

The awarding committee considers where the A/B and E/U boundaries should be set for each unit. We will see that these two grade boundaries are pivotal in the awarding process. The committee considers whether there has been an improvement in the quality of the entry, whether papers have proved easier than anticipated, and reviews the outcomes for each unit in the light of outcomes for the qualification as a whole.

Awarding itself has a number of steps. Firstly, comparable outcomes analysis produces what is known as a putative grade distribution. This is based on performance in the subject in previous years, adjusted for any changes in the prior attainment profile of students. For example, it might be that a subject sees a surge in popularity as a result of a newfound (and possibly misguided) reputation for being particularly helpful for university entrance. If the new cohort is less well qualified than the traditional cohort in the subject, the putative (expected) grade profile will be lower than it would have been in previous years. If a subject becomes less popular among relatively modestly qualified students, perhaps as a result of colleges switching to BTEC routes for some students in that curriculum area, then the putative grade profile will be higher than that in previous years. To use an image from closer to home, comparable outcomes analysis is like setting target grades for students: we look at the incoming GCSE profiles of students, and then look at what we know about the performance of similarly qualified students in previous years to set targets. Comparable outcomes analysis does the same.

Awarding in the comparable outcomes era is, however, more complicated than a simple statistical calculation of grade profiles. The system also rests on examiner judgement.
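The statistical half of this process can be illustrated with a toy model. The idea is essentially a weighted average: this year's expected grade shares are last year's grade rates for each prior-attainment band, weighted by this year's band mix. All the numbers below are invented purely for illustration.

```python
# Toy sketch of the comparable-outcomes idea: the putative (expected)
# grade distribution follows from last year's outcomes for students with
# the same prior attainment. All figures here are invented for illustration.

def putative_distribution(cohort_by_band, grade_rates_by_band):
    """cohort_by_band: {prior-attainment band: student count this year}.
    grade_rates_by_band: {band: {grade: proportion achieving it last year}}.
    Returns the expected share of each grade for this year's cohort."""
    total = sum(cohort_by_band.values())
    expected = {}
    for band, count in cohort_by_band.items():
        for grade, rate in grade_rates_by_band[band].items():
            expected[grade] = expected.get(grade, 0.0) + count * rate
    return {grade: round(share / total, 3) for grade, share in expected.items()}

rates = {"high": {"A": 0.5, "B": 0.5},
         "low":  {"A": 0.1, "B": 0.9}}

# A balanced cohort, then one that has shifted towards weaker prior attainment:
print(putative_distribution({"high": 50, "low": 50}, rates))  # {'A': 0.3, 'B': 0.7}
print(putative_distribution({"high": 20, "low": 80}, rates))  # {'A': 0.18, 'B': 0.82}
```

As the second call shows, a cohort shift towards weaker prior attainment lowers the putative share of top grades, exactly the effect described in the text.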
There is an expectation that awarding bodies guard the A level standard, so they will also look at the quality of work at particular points and come to a view about whether this work matches their understanding of the grade in question. The second step of the awarding process involves setting key boundaries known as judgemental grades. This is interesting, as it is only at these key boundaries that the quality of student work is scrutinised. The process involves a combination of scrutiny of the putative grade profile alongside examiner judgement. Judgemental grade boundaries are important as the other grade boundaries (e.g. the B/C boundary) are calculated with reference to these key borderlines. At A level, the A/B and E/U borderlines are the judgemental boundaries. The other grades are known as mathematical grade boundaries and are calculated by linear interpolation. Once the boundaries have been fixed, the actual grade profile is produced. The following example shows how this works.

Figure 1.4: Setting judgemental and mathematical grades: an example

  Grade   Judgemental grade   Mathematical grade
  A*                          78
  A       70                  (70)
  B                           62
  C                           54
  D                           46
  E       38                  (38)
  U

Figure 1.4 looks at the decisions made by an awarding committee. Remember, it is only responsible for setting the A/B boundary and the E/U boundary. It decides that the A grade boundary should be 70 marks, and that the mark required to secure an E grade should be 38 marks. The committee does not consider the other grades, which are fixed at equal points between the E and the A boundary.

The judgemental grades are added to the judgemental grades column. The other grades are then calculated mathematically. As the gap covering the four grades between A and E is 32 marks, each grade is set eight marks above the one below it. The mark required for a D becomes 46, the mark for a C becomes 54, and so forth. Note that this does not mean that an equal proportion of students will get each of the grades. It might be that relatively few students get marks close to the A grade border, so there are more C and D grades awarded than there are B grades. To date, the A* grade has also been set through linear interpolation. In our example, it places the A* grade 8 marks higher than the A grade, at 78 raw marks.

An awarding body may want to award a different grade profile to that suggested by the putative grade distribution. If this is the case, it needs to identify the basis on which it is making such a claim. The evidence will be rooted in the quality of work produced by students, alongside an examination of the difficulty of the exam papers in comparison to those set in previous years: just because students score higher marks one year does not automatically mean that the students are better than those in a previous year. They may just have sat an easier exam. If an awarding body wishes to award a profile which is different to that suggested by the putative grade profile, it will then take the case to Ofqual. This rarely happens. Note though that just as an awarding body can ask for a grade profile that is higher than that in previous years, it can also ask for one that is lower, based on the quality of student work that year. It may well be that this becomes a feature over the next few years, for reasons we discuss below.

One final part of the process (at least until the final awarding of modular A level grades) is the conversion to UMS scores.
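Both pieces of arithmetic — interpolating the mathematical boundaries between the two judgemental ones, and converting raw marks onto the uniform (UMS) scale — can be sketched briefly. The marks are those from the worked example; real conversions use further anchor points above A and below E, which this sketch simply ignores by extending the central slope.

```python
# Sketch of the boundary arithmetic described above, using the marks from
# the worked example (A/B boundary = 70 raw marks, E/U boundary = 38).
# Real UMS conversions add anchor points above A and below E; this sketch
# extends the central slope instead.

def mathematical_boundaries(a_mark, e_mark):
    """Interpolate B, C and D between the judgemental A and E boundaries;
    A* sits one grade-width above A."""
    step = (a_mark - e_mark) / 4   # four grade-widths separate A and E
    return {"A*": a_mark + step, "A": a_mark, "B": a_mark - step,
            "C": a_mark - 2 * step, "D": a_mark - 3 * step, "E": e_mark}

def raw_to_ums(raw, a_mark, e_mark, max_ums=100):
    """The A boundary always maps to 80% of max UMS and the E boundary to
    40%, whatever raw marks they happened to require that year."""
    a_ums, e_ums = 0.8 * max_ums, 0.4 * max_ums
    slope = (a_ums - e_ums) / (a_mark - e_mark)   # UMS points per raw mark
    return round(e_ums + (raw - e_mark) * slope)

print(mathematical_boundaries(70, 38))            # B = 62, C = 54, D = 46, A* = 78
print(raw_to_ums(70, 70, 38), raw_to_ums(62, 70, 38))  # 80 70
```

Because the raw boundaries move each year while the UMS anchors do not, the same raw coursework mark can convert to a different UMS score from one year to the next, which is the effect on teachers' expectations discussed below.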
This often confuses and annoys teachers, when they see the marks that they awarded for coursework being converted to lower UMS scores than they were anticipating, even when there has been no adjustment made to their raw marks. Of course, the issue has nothing to do with the conversion to UMS; the issue is purely derived from the judgemental grades set by the awarding committee. Just because 62 marks is enough to secure a grade B in one year does not mean it will be enough to secure a grade B the following year.

Figure 1.5: Raw marks to UMS conversion

  Grade   Judgemental grade   Mathematical grade   UMS conversion
  A*                          78                   90
  A       70                  (70)                 80
  B                           62                   70
  C                           54                   60
  D                           46                   50
  E       38                  (38)                 40
  U                                                0

In Figure 1.5 we use our example from earlier, where the A grade is set at 70 marks, and an E grade at 38 marks. The UMS system is simply a system to allow students to assemble a grade in a qualification based on performance in different modules at different times. In a linear world this may be of little relevance. As the raw marks required to get a grade A in one year may be different to the marks required to get an A the following year, there needs to be a system that converts the raw marks into a uniform scale to allow comparability. With the UMS scale, 80/100 always indicates the A grade boundary, so in our example, 70 raw marks converts into 80 UMS marks. If the raw mark
grade boundaries are close together, a small change in raw marks can produce a relatively large change in UMS. Note also that the number of UMS marks for an A grade will vary according to the size of the component in question.

A level grading has gone through three phases: a quota system, where a fixed proportion of students secured each grade; a standards system, where the number of students passing is determined by the number who are deemed to have attained the A level standard; and a comparable outcomes model, where outcomes are tied to the performance of similarly able students in previous years. This understanding of how awarding works will be of significance when we proceed to look at the awarding and grading of the new GCSE and A level qualifications. The comparable outcomes system can be used to ensure stability in grade profiles, regardless of the quality of work that students produce. In fact, linear specifications are much easier to manage in this way, as all examined components are considered at the same time. The AS/A2 system was comparatively difficult to shape, with exam outcomes from one summer contributing alongside UMS marks collected from previous exam series.

The title of this report is known knowns, known unknowns and unknown unknowns. It strikes me that this is a rather good way of considering the state of knowledge about qualification reform. It is important to draw this distinction, as quite often when we come to write things down, what we think we know is not as secure as we might at first have believed. We can of course discount any discussion of unknown unknowns, but we do know that there will be some, and that they will be revealed in the fullness of time. In essence, the known knowns are the facts of the matter, and the known unknowns are often the implications of those known knowns.

GCSE Reform

As is the case with A level reform, GCSE reform will be completed in three waves.
FIRST EXAMS 2017: English, English Literature, Mathematics

FIRST EXAMS 2018: Ancient languages, Art and Design, Biology, Chemistry, Citizenship Studies, Computer Science, Dance, Double Science, Drama, Food Preparation and Nutrition, Geography, History, Modern Foreign Languages (French, German, Spanish), Music, Physical Education, Physics, Religious Studies.

FIRST EXAMS 2019: Ancient History, Astronomy, Business, Classical Civilisation, Design and Technology, Economics, Electronics, Engineering, Film Studies, Geology, Media Studies, Modern Foreign Languages (Arabic, Bengali, Chinese, Italian, Japanese, Modern Greek, Modern Hebrew, Polish, Punjabi, Russian and Urdu), Psychology, Sociology, Statistics.

There are huge implications for colleges here. Putting aside for one moment the fact that colleges will be delivering these qualifications (and some already are), the reforms represent a fundamental change in the raw material (students) that we deal with. In contrast to A level reform, where there is no intention to change the A level standard, there is an express intention to change the standard of the qualification. The idea (and we will avoid any discussion as to whether it is a plausible idea) is that if you expose a group of students to tougher material and expect a higher standard, then there will be an overall improvement in the standards that students reach.

It is worth considering these reformed GCSEs in terms of the known knowns and known unknowns discussed above. Over the past few years, the number of known knowns has increased, but the understanding of these is often incomplete and confused. Often, the known unknowns that we identify are the consequences (intended or otherwise) of the decisions that have been made.

What we know (known knowns)

- The new GCSEs will be called GCSE 9 to 1 qualifications, and will have a nine grade structure.
- Students enrolling in September 2017 will have a portfolio of mainly legacy GCSEs alongside new GCSEs in English, English Literature and Mathematics.
- Students enrolling in September 2018 will have a portfolio mainly comprising new GCSEs, with a small number of old GCSEs.
- Students enrolling in September 2019 will have a portfolio of new GCSEs.
- Calculating average GCSE scores will be problematic, and the approach to this could potentially differ in each of the next three years.
- In English and Mathematics, the proportion of students securing grades 9 to 4 has been fixed to match the proportion of students who secured A* to C in the summer 2016 GCSE series.
- The concept of a good grade has been replaced by standard pass and strong pass. The Department has confirmed that a grade 4 will be known as a standard pass and a grade 5 as a strong pass. Both will be reported on in performance tables. 1
- The proportion of students securing grades 9 to 7 has been fixed to match the proportion of students who secured A* or A in the summer 2016 GCSE series.
- In setting grade boundaries there are certain grades that are set by examiners (judgemental grades) and other grades which are calculated with reference to the judgemental grades. The 7/6 borderline and the 4/3 borderline will be judgemental grades, set using a combination of statistical pre-analysis of the prior attainment of the cohort and examiner judgement.
- Other grades will be set mathematically, at equal points between a grade 4 and a grade 7.
- Grade 9 will be set statistically, and will reflect the proportion of very well qualified students who attempt a subject. It is likely that around 20% of students who get grades 7, 8 and 9 will get the grade 9 (but if patterns from legacy GCSE qualifications are replicated, this will vary significantly from subject to subject).
- A grade 8 will be set mathematically, at the mark exactly halfway between those required for a grade 7 and a grade 9.

1 Justine Greening's letter on GCSE 9 to 1 grading.
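The grade-setting rules in the bullets above can be sketched in the same way. The boundary marks here are invented for illustration; the grade 9 function uses Ofqual's published "tailored approach" formula, under which the percentage of grade 7 and above candidates awarded a grade 9 is 7% plus half the percentage of the cohort achieving grade 7 or above.

```python
def gcse_boundaries(b4, b7, b9):
    """Grades 4 and 7 are judgemental; 5 and 6 sit at equal thirds
    between them; grade 8 is halfway between the grade 7 and grade 9
    boundaries. The grade 9 boundary (b9) comes from a separate
    statistical exercise."""
    step = (b7 - b4) / 3
    return {4: b4, 5: b4 + step, 6: b4 + 2 * step,
            7: b7, 8: (b7 + b9) / 2, 9: b9}

def grade9_share(pct_grade7_plus):
    """Tailored approach: the share of grade 7+ candidates who are
    awarded a grade 9."""
    return 7 + 0.5 * pct_grade7_plus

# Hypothetical paper: grade 4 at 40 marks, grade 7 at 61, grade 9 at 79.
print(gcse_boundaries(40, 61, 79))
# In a subject where 20% of the cohort reach grade 7 or above, roughly
# 17% of those candidates would receive a grade 9, consistent with the
# "around 20%" rule of thumb quoted above.
print(grade9_share(20))
```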

- Coursework and controlled assessment are being replaced by non-examined assessment (NEA). Of the EBacc subjects, only Computer Science has non-examined assessment.
- As is the case now, there will be different tiers of examination available, but the grades accessible through the different tiers have changed. In Mathematics, students who take the foundation tier will be able to access grades 5 to 1, so students on that tier will be able to get a grade equivalent to a B for the first time (a 5 is a high C / low B in today's money). The higher papers will allow students to access grades 4 to 9, but a student who falls just short of a grade 4 will be awarded a grade 3. Those further adrift will be unclassified, as grades 1 and 2 cannot be awarded on the higher tier.
- Ofqual have published grade descriptors for each subject. 2 For example, the grade descriptors for English Language suggest that to achieve grade 5, in relation to a range of texts, students will be able to:
  o Critical reading and comprehension
    - summarise and evaluate with accuracy and clear understanding
    - understand and make valid responses to explicit and implicit meanings and viewpoints
    - analyse and evaluate relevant aspects of language, grammar and structure
    - support their understanding and opinions with apt references to texts, informed by their wider reading
    - make credible links and comparisons between texts
  o Writing
    - communicate effectively, sustaining the reader's interest
    - produce coherent, well structured and purposeful texts
    - vary sentence types and structures and use vocabulary appropriate to purpose and effect
    - spell, punctuate and use grammar accurately with occasional errors
- There are a number of routes through science GCSE. In addition to Biology, Chemistry and Physics, there are two combined science routes: Combined Science (Trilogy) and Combined Science (Synergy). Trilogy is co-teachable with the individual GCSEs; Synergy is not.
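The tiering arrangements described above reduce to a simple lookup. This is a sketch only: it simplifies the "allowed grade 3" safety net on the higher tier to plain membership of the list.

```python
def accessible_grades(tier):
    """Grades that can be awarded on each tier of the reformed
    Maths GCSE. Foundation runs from 1 to 5 (a 5 being roughly a
    high C / low B); the higher tier targets 4 to 9, with grade 3
    available only to candidates who narrowly miss a grade 4 --
    anyone further adrift is unclassified."""
    tiers = {
        "foundation": [1, 2, 3, 4, 5],
        "higher": [3, 4, 5, 6, 7, 8, 9],   # 3 only as a near-miss award
    }
    return tiers[tier]

print(accessible_grades("foundation"))
print(6 in accessible_grades("foundation"))   # a grade 6 needs the higher tier
```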
2 Grade descriptors for GCSEs graded 9 to 1 (Ofqual).

What we know we don't know (known unknowns)

- While there is an express intention to make GCSEs tougher, we don't know how many marks will be required to achieve particular grades, or (by extension) what standard of work each grade represents.
- We don't know how well prepared someone who has done the new GCSEs will be for A level study. It might be that experience of a greater range of material will have an impact, particularly in subjects such as mathematics, where students will have encountered topics that would hitherto have been left until A level. Theoretically it is possible that someone
with a grade 4 in the new GCSE will be better prepared for A level study than someone with an old grade C. This makes setting entry requirements particularly challenging.
- We don't know what impact exposure to harder exams might have. Anecdotally, schools are saying that many less able students give up on new style GCSE papers, as the questions are at a high level from the start.
- We don't know how schools will respond now that they have lost the ability to fine-tune the tier of entry for individual components.
- We don't know whether being able to get a grade 5 (high C / low B) on foundation papers will change entry patterns in schools. It might be that more students are entered for foundation papers, limiting the numbers able to secure a grade 6 or higher.
- We don't know what impact changes in the accountability measures will have on the curriculum offer in schools. Much has been made of how the new accountability measures, Attainment 8 and Progress 8, will change how schools behave. There is certainly an imperative for students to study more GCSE subjects (in order to fill up bucket 2 of the measure). However, up to three of the qualifications used in the measure are allowed to be other approved academic or vocational qualifications. We may see a surge in non-GCSE qualifications such as the European Computer Driving Licence, which counts towards the measure, can be delivered with very little contact time, and which analysis suggests is remarkably generous: typical outcomes in ECDL compare very favourably with typical outcomes for similarly qualified students in GCSE subjects.
- We don't know what impact the experience of studying the new GCSEs will have on subject choice, and whether these impacts will be short, medium or long term. It might be that people are put off studying particular subjects at A level by their experience of the new GCSEs.
Some colleges have noted decreased applications for maths and English for a September 2017 start, but that picture might be confused by colleges moving from four to three subjects in year 12. It might, though, represent something quite important: the new, tougher specifications may deter people from continuing to study specific subjects, and perhaps put people off studying at A level at all. In terms of recruitment, there might be some benefit here for subjects that are not often studied at GCSE level.
- We don't know how successful individual feeder schools will be in preparing students for the new GCSEs. In times of curriculum change, an overall picture which appears to show stability can mask significant turbulence in individual school and college performance.
- We don't know what impact the reduction of coursework and controlled assessment will have. It is often argued that one feature of the education system associated with the improvement in the performance of female students was the introduction of coursework. It may be that the return to terminal assessment, and the removal of coursework from many subjects, will see an improvement in the performance of male students.
- It might be that the Synergy and Trilogy GCSE pathways differ in how well they prepare people for science A levels.

Historically, the proportion of students getting a grade A or higher who got an A* grade varied significantly from subject to subject. In English Language, 19% of students with an A or higher secured an A*; in Maths it was 36%; in French it was 30%.

A level reform

Colleges should be rather more familiar with the substance of A level reform than they are with the nuance of GCSE reform, but it is worth conducting a similar known knowns / known unknowns analysis. Our analysis here focuses on the implications in terms of quality assurance and the interpretation of A level outcomes, but these quality implications are really just the external expressions of fundamental changes for the students we deal with.

What we know (known knowns)

- There is no intention to change the A level standard. While at GCSE there was a very clear ministerial steer that harder courses and exams were required, there is no such imperative at A level.
- There are fundamental changes to the assessment methodology, assessment patterns and assessment instruments. The role of practical work in science has changed, and the volume of coursework is much reduced in many subjects.
- Colleges are changing their approaches to the delivery of A level programmes of study. Fundamentally these relate to the number of subjects taken, the balance between the number of subjects taken in the lower sixth and the upper sixth year, and the use of external assessment at the end of the lower sixth year. Chapter Two below explores how colleges responded in the first year of reform, though it should be emphasised that this is both an emerging and a moving picture. These changes to delivery started with the shift to funding by programmes of study and the removal of the incentives to offer a broad programme, and continue as colleges navigate the changing assessment framework and respond to funding and cost pressures.
These changes are an important context when considering awarding this summer.
- There have been reductions in the number of students who have sat AS level exams, but these reductions are not evenly distributed across subjects.
- The first round of AS level exams saw some changes in performance, particularly in Art and Design and the English subjects. Chapter Three: First Blood explores this in more detail.
- The changes to A level are not being made in a closed system. Other factors, such as the move to BTEC qualifications and mixed programmes, need to be considered when trying to develop an understanding of what is happening at A level.
- The awarding of the A* grade will be different. To date, to secure an A* grade a student needed to secure an A grade overall and an average of 90% in the A2 modules. Future
awarding of the grade will relate to the prior attainment profile of a subject and the history of awarding of the A* grade in the subject in question.
- The Welsh examining board, WJEC, has been split into two distinct spheres of operation: the Welsh side offering a modular AS/A2 model, and the English side (EDUQAS) offering linear A levels. Note that this will bring significant changes to the cohort that English sixth form colleges will be compared to.

What we know we don't know (known unknowns)

- We don't know what the A level standard will be. This point is worth considering for a moment, to let it sink in and consider what its implications are. Think of what we have been requiring subject teams to do for the last two years: prepare students for exams armed with a rough outline of what they need to cover and relatively little in the way of support materials (save for a few hastily produced textbooks). Most importantly, there has been nothing in the way of actual exam papers: papers written by experienced examiners, scrutinised by experienced examiners, tested on thousands of candidates, subjected to the full awarding process, and reported on with clarity in a chief examiner's report. Perhaps the task in colleges regarding how we respond to results over the next few years is not one of quality assurance: it is more a matter of learning from the experience of the first linear exams at college, subject and teacher level.
- We don't know what an A level piece of work actually looks like. Just as we don't know what the A level standard is, we also don't know what an A level piece of work, or an A grade piece of work, looks like. Consider one example: in Art and Design A level courses this summer, teachers will mark the work of every single candidate, and moderators will scrutinise the work of every single centre.
Not a single person involved in that process can say that they have ever seen a piece of work that has been awarded an A level pass, or an A level grade A, under the new system. The point here is simply to illustrate how incomplete our knowledge is, and how steep the learning curve will be over the next few years.
- We don't know what the A level pass rate will be. We saw above (Figure 1.2) that the move to modular exams (with first awards in 2002) saw a sharp increase in the pass rate at A level, alongside a sharp reduction in the number of A level entries. This can be interpreted in a number of ways. The usual interpretation is that modular exams were easier than the linear specifications they replaced, but the change in the size of the cohort at the same time may reflect the fact that an AS/A2 model has a function of elimination at the midpoint, however well intentioned a college may be. The increase in the pass rate may simply reflect the fact that students likely to fail A level were removed from the process at the midpoint. It is certainly the case that exam boards expect the A level pass rate to fall, though by how far and by when are moot points.
- We don't know what the relationship with prior attainment will be. While the comparable outcomes process will be used to ensure that a similar profile of students will get similar outcomes to those in previous years, this process simply ensures that the overall profile will be correct. It does not necessarily mean that students at particular points in the prior attainment profile will get similar results to the year before. For example, those at higher levels of prior attainment could get lower results than the previous year, those at lower
levels of prior attainment could get better grades: overall the profile would be the same. It would be reasonable to expect that examiners are less willing to use the extremes of the mark scheme in the early years of a specification. This would produce the contrasting fortunes indicated above. This is an issue for value added models which make value added calculations against historical benchmarks.
- We don't know how the A level pass rate will change over time. In the short term the pass rate is likely to hold up, and there are three separate reasons for this. Firstly, comparable outcomes will be used to ensure that a subject with a similar entry to previous years will have a similar grade profile to that in previous years. Secondly, there is a presumption that no year group should be disadvantaged by curriculum change, so we should not expect any sharp changes in performance. Thirdly, while many colleges have given up on AS level courses and exams, most stuck with AS level exams. The AS level entry in reformed subjects was around 80% of that in the previous year. This means that the elimination function mentioned above will still have been in operation, and the ability profile of the cohort similar to the year before. Over time, the AS level elimination function will be diminished. Alongside these reasons to expect stability in the short term, we must mention a very important imperative placed upon the awarding bodies: the need to maintain the A level standard. Put simply, if an awarding body looks at the grade boundaries suggested by the comparable outcomes analysis and does not feel that the work is of A level standard, it can change the grade boundaries. To use an extreme example, comparable outcomes analysis could suggest a pass rate of 90%, but analysis of raw marks could find that at the 10th percentile students only scored 8/100. It would be clearly absurd to award an A level to a student who had only scored 8/100.
- We don't know whether schools are behaving in similar ways to sixth form colleges. This is important because schools enter more students for exams than sixth form colleges do. Their behaviour therefore influences the comparator group against which the performance of our students is examined. In turn, the behaviour of the majority will have a bigger influence on awarding body behaviour than the behaviour of sixth form colleges. For example, we don't know the shelf life of AS levels. Schools are often very attuned to the imperatives of the league tables (in contrast to many colleges, who are entirely indifferent), and many have been surprised to find that AS level performance in those subjects that a student does not take forward to A level is included in league table value added (L3VA) calculations. If schools stop doing AS levels, it is unlikely that there will be much of a market in all but the largest subjects. Apparently schools are not using AS levels because they are worried about the impact on performance tables.
- We don't know the implications of fewer students sitting AS level exams. The obvious way to view the decline of AS level exams is as the removal of a hurdle which used to restrict the number of students progressing to the A level year. It might be that the absence of this hurdle allows increasing numbers of borderline candidates to make it through to the A level exams, creating downward pressure on the pass rate. It could be that something entirely different happens. Perhaps the AS level hurdle has been causing many students who would have made A level standard in a year's time to leave prematurely. After all, students do not learn at an even pace: AS level exams come just eight months after the start of A level study, while A level exams come twenty one months into a programme of study.

- We don't know whether students who start on three subjects will perform better than those who start on four. Previous analysis has suggested that the more subjects students do, the better they do in each of those subjects. This may be correlation rather than causation, and it might represent a selection bias, in that more confident learners (and institutions that are more confident) enter more qualifications. It might be that students who start on four subjects have somewhere to go if one subject goes wrong, and so will get better grades in the three that they eventually take through to A level. Alternatively, it could be argued that the clarity of a twenty one month long course provides a far more effective preparation for A level exams than a confused first year in which students try to decide which of their subjects they should discontinue at the midpoint. There is also the question here of the impact of fewer qualification hours on the learning experience. A student doing three A levels will have more free time than a student doing four subjects, and may find this difficult to manage.
- We don't know whether the cohort of students who have not sat AS level exams will be qualitatively different to the cohort who have done AS levels. This creates issues in terms of quality assurance:
- We don't know what target grades we should set our students. The challenges we face at the moment are huge: in September 2017 we will set students our first set of fully linear target grades. We won't know until September 2019 whether the grades we set will have been appropriate. While we can make a reasonable guess, and I'm not suggesting we will be totally wrong, it does illustrate two things. Firstly, that we are feeling our way through these reforms, and can only rely on professional judgement. Secondly, just as we don't know what the standard will be, nor does anyone else. That includes Ofsted inspectors.
So if any of them make comments on the progress of current students, we should respectfully ask them how they know. It is not just me who thinks so. In a recent blog on inspection and the use of grade predictions, Sean Harford, Ofsted's Director of Education, commented that: "While we know that the national profile of results will be stable, none of us yet knows what a new grade will look like in terms of pupils' work. We also know that there is always more school-level volatility in results when qualifications change." 3

3 Sean Harford, Inspection and the use of grade predictions (Ofsted blog).

There is a wonderful contradiction between an inspection framework which focuses on the progress of current students, and which is grading colleges on that very basis, and a senior Ofsted chap saying that no one knows what the A level standard is. Add to that the fact that inspectors now inspect across programmes of study, scrutinising provision in areas they really know nothing about, and we really ought to be voicing a few concerns.
- We don't know how well different groups of students will respond to the new assessment arrangements. Female performance at GCSE diverged from male performance when coursework was introduced. It might be that a shift to terminal, exam based assessment will suit male learners.

Over time, things that are in the known unknowns box will gradually migrate to the known knowns box. It will be 2021 before we have the full picture of performance in linear exams taken by students who have previously taken a full set of GCSE 9 to 1 qualifications. The job of this project is to provide as
much clarity as we can as soon as that clarity is available, but also to be clear about what we don't know, what we can't know, and what others can't know. Through this gradual implementation of curriculum reform, the one group that we must protect from this turbulence and change is our students. It is imperative that when we are talking to them we do not transmit our own anxiety. As a group of colleges, we are best placed to get things right: we are not restricted by the timetabling needs of a lower school, or distracted by the complications of wholesale GCSE reform, or by an accountability framework that places huge value on key stage four outcomes.

Chapter Two: First Steps in a Linear World

Modes of Delivery

As we take our first tentative steps into a linear world there is much to learn. Most obviously, we need to understand how student performance is changing, but we also need to know how colleges have responded to the changing world. What made 2015-16 so unusual was the collision of a series of competing imperatives: the demands of linear qualifications, the demands of modular qualifications, a funding methodology that funded students rather than enrolments, and a backdrop of a funding rate that continued to exert significant pressure on colleges, and in turn on the breadth of student programmes.

We deal with student performance in Chapter Three: First Blood. In this chapter, we concern ourselves with aspects of how colleges are responding to the challenges of organising the curriculum. The first element of college behaviour we are interested in here is how colleges have arranged student programmes of study. Later in the chapter, we will see that the ways colleges have behaved here have implications for achievement rates in the years ahead (and, in turn, for how colleges are perceived by Ofsted and other external agencies), and for student grades in the summer of 2017. It is very easy to get a fragmented view of what colleges have done, and the view you develop will largely be shaped by the particular sample of people you have spoken to. Our analysis here avoids the perils of small and biased samples: it clarifies what paths have been followed in ninety different colleges and considers the implications of those decisions.

Colleges faced significant challenges in planning for the first year of the linear specifications. In contrast to Curriculum 2000, where colleges could plan for the decisive implementation of a common mode of delivery across the curriculum, 2015-16 was just the first of three years of gradual introduction of reformed specifications. Colleges had three main options:

1.
To make a decisive switch to a linear mode of delivery from September 2015, with all students recorded on two year courses, regardless of whether the specifications were linear or modular. We call this the fully linear model.
2. To maintain a modular mode of delivery, with all students sitting AS exams at the end of the lower sixth year, even in those subjects which had gone linear. We call this the fully modular model.
3. To deliver a hybrid model, with some enrolments on two year courses and others on one year courses. We call this the hybrid model.

To confuse things further, there is a wide range of hybrid approaches that colleges could take, and have taken: a mainly linear approach, a mainly modular approach, or an entirely balanced one. College decisions here will have been influenced by a second set of considerations related to the breadth of student study programmes. A three A level equivalent programme may lend itself to a linear model, while a four subjects in the lower sixth, three in the upper sixth model may lend itself to a modular approach, as colleges need a way to manage the transition from four to three subjects.

Figure 2.0 examines the mode of delivery in ninety sixth form colleges. This information has been derived by looking at student data at enrolment level in each college, and examining the breakdown of enrolments on one and two year courses in each.

Figure 2.0: Modes of delivery 2015-16: sixth form colleges

Mode of delivery (lower sixth students)                     Number of colleges
Fully linear                                                 1
Hybrid (all linear subjects recorded as two year courses)   16
Hybrid (4-9 subjects recorded as two year courses)          13
Hybrid (1-3 subjects recorded as two year courses)          11
Fully modular                                               49

Figure 2.0 reveals that in 2015-16, only one college moved to an entirely linear mode of delivery, with all students enrolled on two year courses. Sixteen colleges moved to a hybrid model where all subjects that had moved to a linear structure were recorded as two year courses, but all other courses were recorded as one year courses in the first instance. So less than a quarter of colleges made significant steps into a linear mode of recording student enrolments. In other colleges, students on a small minority of courses were recorded on two year courses.

To the uninitiated, it may not be entirely clear why the ways in which colleges record student enrolments are of anything more than passing interest. The reason is that modes of delivery have huge implications for achievement rates (formerly known as success rates). Moving to two year recording of courses simply produces achievement rates that are different to those produced by recording students on two one year courses, even if the actual outcomes for students are exactly the same. In the short term, moving to a two year mode of delivery improves achievement rates, but in following years the achievement rate will drop below the rate achieved when recording enrolments as one year courses.

An example should make this clearer. Imagine two colleges. We will call one Oxford Sixth Form College, and the other Chelmsford Sixth Form College. These colleges are very similar: 1,000 students start lower sixth A level courses at both colleges each year.
Figure 2.1: Oxford City SFC and Chelmsford City SFC compared

Oxford City SFC                          Chelmsford City SFC
1000 students each start 3 courses       1000 students each start 3 courses
80% pass rate in external AS exams       80% pass rate in internal exams
75% of students survive into year 2      75% of students survive into year 2
All stick with 3 subjects                All stick with 3 subjects
100% pass rate in A level exams          100% pass rate in A level exams

Students have identical experiences at the two colleges and are just as successful. The only difference between the two colleges is the way they record courses. At Oxford SFC they still use AS exams at the midpoint and record student programmes as two one year courses. At Chelmsford they have adopted a two year linear model. In terms of how well students do (in our example at least) it makes no difference. Where it does make a difference is in how these two colleges appear to the outside world. Despite there being no difference in how successful the students are, the achievement rate at Oxford SFC is 13.6 percentage points higher than that at Chelmsford.
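The arithmetic behind this gap is worth making explicit. A minimal sketch, using the numbers from the worked example above:

```python
def achievement_rate(starts, achievements):
    return 100 * achievements / starts

# Oxford model: programmes recorded as two one-year courses.
as_starts, as_passes = 3000, 2400      # 1000 students x 3 courses, 80% AS pass rate
a2_starts = a2_passes = 2250           # 75% survive into year 2, 100% A level pass rate
oxford = achievement_rate(as_starts + a2_starts, as_passes + a2_passes)

# Chelmsford model: the same students recorded on two-year courses, so
# only the original 3000 starts and the final 2250 achievements count.
chelmsford = achievement_rate(3000, 2250)

print(round(oxford, 1), round(chelmsford, 1))   # 88.6 versus 75.0
```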

Figure 2.2: Achievement rates: Oxford SFC compared to Chelmsford SFC

                              Oxford SFC                               Chelmsford SFC
AS achievement rate:          3000 starts, 2400 achievements (80.0%)
A2 achievement rate:          2250 starts, 2250 achievements (100.0%)
A level achievement rate:                                              3000 starts, 2250 achievements (75.0%)
Overall achievement rate:     5250 starts, 4650 achievements (88.6%)   3000 starts, 2250 achievements (75.0%)

If a college records study programmes as two one year courses, it ends up with a significantly higher achievement rate than one which records study programmes as two year courses. The situation we find ourselves in is one where colleges are moving from the Oxford model to the Chelmsford model. We can model how this affects achievement rates over time.

Figure 2.3: Oxford SFC moves to two year courses

Mode of delivery                                         Student outcomes                 Achievement rate
Year one: all students recorded on one year courses      5250 starts, 4650 achievements   88.6%
Year two: upper sixth still one year courses,
lower sixth recorded on two year courses                 2250 starts, 2250 achievements   100%
All students recorded on two year courses                3000 starts, 2250 achievements   75%

Once a college reaches a fully two year model its achievement rate will be significantly lower, but in the short term achievement rates will jump upwards, as the only students included in the calculations are those who have survived to November of the upper sixth year. We know that A level pass rates are very high, and that retention through the upper sixth year is very high also. Reality is, alas, rather more complicated than the model. We know from Figure 2.0 that colleges are moving at different speeds towards the fully linear two year model, and for the cohort only one college is on the journey outlined in Figure 2.3. The data gives us our first chance to examine the effect of this in reality.
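The transition pattern can be checked with a short sketch, again using the illustrative cohort from the Oxford and Chelmsford example:

```python
# Illustrative figures from the worked example, not real data
cohort_starts = 3000   # lower sixth enrolments per year
as_achieved = 2400     # AS passes under one-year recording
survivors = 2250       # enrolments surviving into the upper sixth

# Year 1 (fully modular): AS and A2 enrolments both reach an end date
year1 = 100 * (as_achieved + survivors) / (cohort_starts + survivors)
# Year 2 (transition): only the old-style upper sixth reaches an end
# date, and those students all achieve
year2 = 100 * survivors / survivors
# Year 3 onward (fully linear): every lower sixth start now counts
year3 = 100 * survivors / cohort_starts

print([round(r, 1) for r in (year1, year2, year3)])  # [88.6, 100.0, 75.0]
```

The spike to 100% in the transition year, followed by the fall to 75%, is exactly the pattern a risk assessment process could mistake for a change in quality.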
We can see the impact by comparing achievement rates in and in the seventeen colleges which recorded at least ten A level subjects on two year programmes.

Figure 2.4: Changes in achievement rates to : colleges with more than ten subjects recorded as two year courses

             Number of subjects recorded    Change in achievement
             in a linear fashion            rates to
College A
College B
College C
College D
College E
College F
College G
College H
College I
College J
College K
College L
College M
College N
College O
College P
College Q

All but one of these colleges saw significant increases in achievement rates. The bad news is that this effect lasts only for the year after migration to a two year model. If a college moves all subjects at once, there is a huge one year effect. If it moves subjects to a two year model over time, the effect will have a less pronounced peak, and a gradual tapering down to lower levels of performance. Given the complexity of the choices made in colleges, this makes Ofsted's annual risk assessment process a highly challenging exercise. Ofsted simply will not have information available at the level of sophistication we are working at here, and so will be unable to see whether changes in performance relate to genuine changes in quality, or are a statistical effect caused by a move to a linear mode of delivery. There are significant implications for colleges here also: we will need to ensure that our self assessment reports are clear about where colleges are on the road to linearity. Figure 2.5 summarises the effect on achievement rates by mode of delivery. We see that those colleges which have migrated to linear modes of delivery for a significant number of their courses have seen a significant improvement in achievement rates.

Figure 2.5: Changes in achievement rates to by mode of delivery

Mode of delivery (lower sixth students)                     Change in achievement rates to
Fully linear                                                7.7
Hybrid (all linear subjects recorded as two year courses)   3.6
Hybrid (4-9 subjects recorded as two year courses)          0.1
Hybrid (1-3 subjects recorded as two year courses)          0.1
Fully modular

Figure 2.6: Recording of lower sixth enrolments (40 highest volume subjects)

Subjects                                         Total starts   One year   Two year   One year %   Two year %
GCE AS Level Accounting
GCE AS Level Art & Design Fine Art
GCE AS Level Art & Design Graphics
GCE AS Level Art & Design Photography
GCE AS Level Art & Design Textiles
GCE AS Level Biology
GCE AS Level Business Studies
GCE AS Level Chemistry
GCE AS Level Computing
GCE AS Level Design & Technology
GCE AS Level Economics
GCE AS Level English Language
GCE AS Level English Language & Literature
GCE AS Level English Literature
GCE AS Level Film Studies
GCE AS Level French
GCE AS Level Further Mathematics
GCE AS Level Geography
GCE AS Level Geology
GCE AS Level Government & Politics
GCE AS Level History
GCE AS Level in Classical Civilisation
GCE AS Level in Communication and Culture
GCE AS Level in Drama and Theatre Studies
GCE AS Level in General Studies
GCE AS Level in Health and Social Care
GCE AS Level in ICT
GCE AS Level in Mathematics
GCE AS Level in Sociology
GCE AS Level Law
GCE AS Level Media Studies
GCE AS Level Music
GCE AS Level Philosophy
GCE AS Level Physical Education
GCE AS Level Physics
GCE AS Level Psychology
GCE AS Level Religious Studies
GCE AS Level Spanish
GCE AS level Use of Maths
Grand Total

Figure 2.6 allows us to look at enrolment patterns on a subject by subject basis. We find that around a fifth of enrolments in the reformed A level subjects (indicated in red) were recorded as two year linear enrolments. The only subjects where the volumes of two year enrolments are higher than this are in the suite of Art and Design AS levels. There were significant concerns in these subjects about the unnecessary burden of assessment in the revised AS level specifications. Indeed, some centres elected to deliver an EDUQAS AS specification in the lower sixth, as it only required one assessment, followed by an AQA/OCR/Edexcel A level in the upper sixth. It seems that in those colleges which have elected to maintain a modular approach, Art and Design departments have had some success in making the case for their being an exception. In the unreformed specifications, the proportion of enrolments recorded as two year courses is very low. In the summer of 2017 we will be able to reflect on the relative success of these different modes of delivery. There is an argument that, as a two year linear course lacks the hurdle of a formal external exam at the mid point, students will access the upper sixth year in greater numbers than previously. The implication is that they will be, on average, lower quality candidates, and will produce work of lower quality that secures lower grades. There is a counterargument that suggests the AS exams have, for some seventeen years, provided an artificial, ill placed and unnecessary hurdle which has caused significant numbers of students to leave the system after just nine months of study. Without this hurdle, students have twenty one months to develop the knowledge base and skills set necessary for success at A level. Having a significant number of subjects with an 80:20 split will give us the material to offer some preliminary conclusions on this.
Breadth of Programmes We can also use our data set to look at the breadth of programmes of study in the first year of reform. We do this by taking the individual enrolments across the subjects and then aggregating them into programmes for individual students. The analysis only looks at lower sixth students who are attempting a lower sixth programme of study for the first time. Those students who were on level three courses during have been excluded, as their programmes may well comprise a complex blend of AS level, A level, A2 level and BTEC qualifications.

Figure 2.7: Breadth of programme lower sixth year

Breadth of programme    Number of students    % of students
Grand Total

Figure 2.7 gives the breadth of programme for all the students in the analysis. The courses counted embrace AS level qualifications, enrolments on the first year of a two year A level qualification for , one year and two year BTEC qualifications (adjusted to the number of A level equivalents each qualification represents), and GCSE Maths and English re sits. Extended project has not been

included in the analysis, but there were around 1,000 students who had that as part of their programme of study. What we see suggests a reasonably even split between those students doing three subjects and those doing four. One might assume that this represents around half the colleges maintaining four courses as their standard offer and the others narrowing the standard programme to three subjects, but the truth is spectacularly more complex than that.

Figure 2.8: breadth of programme

[Chart: for each of the 90 colleges, the proportion of students taking three subjects and the proportion taking four.]

Figure 2.8 looks at each of the 90 colleges in the analysis. It excludes those students doing two subjects and those doing five or more, which allows us to focus on the balance between students doing three and four subjects in each college. The results are remarkable. One might reasonably have expected to see a group of colleges that maintained a four qualification programme, a group with a clear three qualification programme, and a group in the middle with a reasonably even four/three split, with decisions about breadth of programme for individual students shaped by prior attainment. Instead, we see almost every possible combination. Figure 2.8 arranges the colleges by the proportion of students doing four subjects. To the right we see those colleges with the highest proportions of students doing four subjects (represented by the red bars). To the left we see those colleges with the highest proportion of students doing three subjects.

Contact time Over the years the six dimensions project has tried to fill in some of the gaps in our knowledge about what actually happens in the sector. Last year, in Unpicking White Rainbows, we sought to shed some light on set size and the relationship between set size and student outcomes. This year we turn our attention to contact time. Contact time is a significant issue. Many colleges are reviewing their standard contact time alongside consideration of the number of subjects in a typical study programme. The curious factor in these discussions is that no one actually knows two of the key things needed to make a decision on contact time: what contact time is across all the colleges in the sector, and what the relationship is between contact time and student outcomes. Here we provide the answers to these questions. In our analysis of contact time, data was collected from 91 colleges regarding contact time in and . Colleges were also asked to identify whether they had any concrete plans to change things for . Eleven colleges said that contact time was under review for , but this might underrepresent the true figure, as MIS teams may not necessarily be involved in discussions before decisions are made.

Figure 2.9: Contact time and : sixth form colleges

                       AS level / lower sixth    A2 level / upper sixth
Lowest
Highest
Average (Mins)
Average (Hrs, Mins)

Figure 2.9 summarises the data. We see that the lowest weekly contact time per subject for lower sixth students is 4 hours. The highest is one hour 40 minutes per subject higher, at 5 hours 40 minutes. Most colleges offer the same contact time for lower sixth and upper sixth courses, but some do not. What is interesting is that colleges vary in whether they deem upper sixth students or lower sixth students to be in need of more contact time. There are arguments either way.
Some would argue that students need more input in the lower sixth so that they can develop the independent study skills necessary for success at level three and beyond, and can then take these forward into the upper sixth year with less direct contact from teachers. Others take the view that the upper sixth year is harder, and that students therefore need more engagement with teachers. Ultimately, the complexity of running different timetables for lower and upper sixth may be a deterrent to too much deep thinking in this context. It is worth noting, though, that some colleges make exceptions for some subjects: in one college, science subjects have an additional hour of teaching time; in another, BTEC subjects are given an extra 55 minutes of contact time. It is interesting to note that despite there being a lot of noise in the sector about changing contact time, the average contact time in was only two minutes more than it was in

Figure 2.10: A level (upper sixth) contact time per subject

[Chart: colleges arranged by A2 contact time per subject, lowest to highest.]

Figure 2.10 arranges the colleges from the one with the lowest contact time (left hand side) to the one with the highest (right hand side). Note that half of the colleges have contact time of 4 hours 30 minutes, and three quarters of colleges are within ten minutes of this. Now we have established how contact time per subject varies, it would be useful to establish whether contact time seems to make a difference when it comes to student outcomes. To consider the impact of contact time, we divide the colleges into four groups.

Figure 2.11: Contact time banding

Contact time       Band name    Number of colleges    Number of colleges
                                in AS analysis        in A2 analysis
5 00 to 5 40       Band 1
to 4 55            Band 2
4 30               Band 3
Less than 4 30     Band 4

The bands are of reasonably equal size, with the exception of Band 3, which includes only those colleges with 4 hours 30 minutes contact time. Figure 2.12 explores the outcomes in these four bands. It is a standard six dimensions analysis, so has already been adjusted for prior attainment and subject routes.

Figure 2.12: AS level outcomes

          Attend    Ret    Pass    Ach    High    PPS    PPC    PPA
Band 1
Band 2
Band 3
Band 4

The outcomes are counter intuitive, to say the least. At AS level, the outcomes for students at colleges with above average contact time per subject are worse than those at colleges with below average contact time. The outcomes for A2 courses (Figure 2.13) confirm the patterns at AS level. The gaps are not enormous, but what is interesting is that there is not a clear trend in the opposite direction. It might quite reasonably be assumed that students who get more hours of teaching perform better than those who get fewer.

Figure 2.13: A2 level outcomes

          Attend    Ret    Pass    Ach    High    PPS    PPC    PPA
Band 1
Band 2
Band 3
Band 4

So how can we explain this? We suspect that what we are seeing is the primacy of quality of teaching over quantity of teaching. With the exception of a few outliers, contact time at most colleges is around the same basic 4 hours 30 minutes. While some colleges have fifteen minutes more contact, and others fifteen minutes less, students are getting a similar deal wherever they study. It may well be that those colleges with relatively high contact time decided to increase contact time as a strategy to improve performance. This might not be as effective a strategy as might be assumed. Consider this at the level of an individual teacher who is not performing very well and who is delivering an A level course in 4 hours 30 minutes a week. Would we assume that giving them five hours a week would improve their performance? Unless we pay attention to teaching craft and how one might usefully use the additional time, it is unlikely to make much of a difference. It might even make things worse, if the teacher pads out lessons to fill the time available and the pace of learning slows further.
To put it bluntly, an ineffective teacher is unlikely to produce better outcomes over five hours than they might over four: dull, uninspiring teaching plus a bit more dull, uninspiring teaching does not equal good teaching. When we consider the impact of contact time at college level, suddenly it does not seem so surprising that increasing contact time does not necessarily lead to improved results. What is certainly true is that contact time does not equal learning time. Where teaching is effective, learning extends well beyond the classroom; where it is not, it may not even extend to the time in the classroom. There are other reasons that one might increase contact time, particularly as part of a move from four subjects in the lower sixth year to three. Moving to three subjects effectively liberates a seventh

of the teaching resource, and if there is no financial imperative to reduce the number of teachers, colleges might quite reasonably allocate more time to each class. We can also examine what contact time means in terms of total academic contact time, by multiplying the breadth of programme by the contact time per subject. Our calculations only include core academic courses, so exclude tutorial, work experience, workshops, enrichment courses, accredited courses and so forth.

Figure 2.14: Academic contact time : lower sixth students

Percentile         Academic contact time (hours)
90th
75th
50th (Median)      15.8
25th
10th               14.0

Figure 2.14 shows us the range of academic contact time across sixth form colleges for lower sixth students. Total contact time will be lower for the upper sixth, as those students average around 3.0 subjects, whereas lower sixth students average 3.5. The middle college (the median) has contact time of 15.8 hours. The 75th percentile is the figure exceeded by 25% of colleges, the 90th percentile the figure exceeded by 10% of colleges, and so forth. We see there is significant variation in the amount of contact time students get at different colleges. This is, in the main, driven by the number of subjects that students take, but is also influenced by contact time per subject. It will be interesting to see how colleges adjust contact time per subject as, seemingly inevitably, most student programmes move to three A levels or equivalent. If a college moves from a 4:3 model to a 3:3 model it reduces the amount of teaching required (and therefore the number of teachers required) by a seventh. A college that spends £7,000,000 on teachers has choices about how to use that spare £1,000,000. It could invest it in extra contact time. Teachers may of course argue that the new specifications have more content and therefore need more contact time. Our analysis of contact time suggests that this may not be the panacea one might assume.
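The resourcing arithmetic can be made explicit. This is a rough sketch using the illustrative figures quoted in the text; the budget and contact times are examples, not sector data:

```python
# Two-year teaching load per student: 4 subjects then 3, versus 3 then 3
load_4_3 = 4 + 3
load_3_3 = 3 + 3
saving_fraction = (load_4_3 - load_3_3) / load_4_3   # one seventh

teaching_budget = 7_000_000                          # illustrative £7m
freed = teaching_budget * saving_fraction            # roughly £1,000,000

# Total academic contact = breadth of programme x weekly hours per subject
def total_contact(subjects, hours_per_subject):
    return subjects * hours_per_subject

lower_sixth = total_contact(3.5, 4.5)   # 15.75 hours, near the 15.8 median
upper_sixth = total_contact(3.0, 4.5)   # lower, as breadth narrows
```

Note how the sector averages (3.5 subjects at around 4 hours 30 minutes each) land very close to the median total contact time reported in Figure 2.14, which suggests breadth, not per-subject contact, drives most of the variation.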
There are other ways of spending the money, of course: enrichment, tutorial time, workshop time, interest payments on a nice new building and so forth. There is another aspect of the move to three subjects worth considering: the total contact time per student. Colleges may be reluctant to keep a relatively low contact time if students are only following three subjects in the lower sixth. It may be perceived that if students have less than three hours a day of contact they will be less likely to attend, and there is the issue of how to accommodate them if they have large gaps between lessons. The funding requirement to maintain 540 hours a year will be a consideration here also. Our analysis last year demonstrated that large set sizes are not a barrier to success. This year we find that relatively low contact time is not a barrier to success either.

Chapter Three: first blood: the experience of the first year of linear qualifications While AS level exams themselves may have a relatively short shelf life, there is much that we can learn from looking at AS level outcomes in the first thirteen subjects to come on stream. During , most colleges delivered even their linear A levels using a modular approach of sequential AS and A level courses, and in the reformed AS level subjects we are dealing with over 100,000 enrolments in sixth form colleges alone. We get a glimpse here of how different performance looks in a linear world, and gain clues as to what is likely to happen with A level going forward. We gain insights into how students have performed and into how colleges have behaved. Introduction The report that follows is based on analysis of AS level performance in reformed specifications across sixty two sixth form colleges. Of these, ten (or sixteen per cent of the total) elected not to enter any students for exams in the revised subjects. In the remaining 52 colleges, students started 76,035 qualifications across the thirteen reformed specifications. The entry data produced by Ofqual in the summer showed that there had been an overall decline in entry numbers for these subjects, but that the decline had been uneven across subjects, suggesting that decisions on AS entry had been made on a subject by subject basis in individual schools and colleges. In sixth form colleges it seems more the case that providers tended to take an all or nothing, institution level approach to AS entry, rather than leaving it to individual subjects to make the case for whether or not to enter students for the reformed AS level exam.
Figure 3.1: number of enrolments per subject: sixth form colleges: data

Subject                                        Enrolments
GCE AS Level Art & Design 3D Design            370
GCE AS Level Art & Design Fine Art             3278
GCE AS Level Art & Design Graphics             1715
GCE AS Level Art & Design Photography          2967
GCE AS Level Art & Design Textiles             1064
GCE AS Level Biology
GCE AS Level Business Studies                  6832
GCE AS Level Chemistry                         9292
GCE AS Level Computing                         2417
GCE AS Level Economics                         5322
GCE AS Level English                           5597
GCE AS Level English Language & Literature     3231
GCE AS Level English Literature                6986
GCE AS Level History                           8771
GCE AS Level in Sociology
GCE AS Level Physics                           6108
GCE AS Level Psychology
GCE AS Level all subjects

In developing our understanding of performance in the reformed specifications, there are two distinct areas touched upon in this paper. Firstly, we need to see whether the introduction of these reformed qualifications has brought any changes in behaviour and performance within colleges. It might be that we see distinct changes in attendance or retention, reflecting students, teachers and institutions engaging with qualifications differently in a linear world. Secondly, we need to explore whether there are any notable changes in the performance of those who have sat the exam. The exam boards can show us whether the overall grade profile in a subject is different to that in the previous year, but they cannot show us whether there has been any change in the relationship between prior attainment and student outcomes. For example, it might be entirely possible in a subject for the overall grade profile to remain the same while those at the top end of the prior attainment spectrum achieve rather fewer high grades than they did previously, and those at the lower end achieve rather more. In this analysis, we present two main approaches to analysing this year's results in reformed specifications. Together, they give us an important but incomplete picture, as no reference is made (for example) to performance in modular AS levels or the number of subjects taken per student. They do, however, give us some useful clues as to changes in performance across the sixth form college sector. The first analysis we present is a six dimensions analysis using the legacy AS level benchmarks from . This allows us to see in summary form whether there have been any changes in rates of attendance, retention, achievement, success, high grades and points per student in each of the reformed subjects. In the second analysis, a comparison is made across the prior attainment spectrum of performance in terms of points per completer.
This is in effect a standard value added measure, as it looks at the grades secured by those who enter exams. It gets to the heart of the question of whether students are getting similar grades to students the year before. The Six Dimensions Analysis Figures 3.2 and 3.3 present a six dimensions analysis of all the enrolments in reformed specifications. In essence, this analysis involves running the data against the benchmarks established in the previous year ( outcomes). By looking at this we can get an immediate sense of whether there is anything different about this year's outcomes overall. The main purpose of such an analysis is to establish whether there is any substance to suggestions that might emerge in individual colleges regarding changes in performance. Figures 3.2 and 3.3 aggregate performance in all subjects.
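In spirit, the benchmark comparison works like the following sketch. The measure names and rates here are invented for illustration; the real analysis also adjusts for prior attainment and subject mix:

```python
# Prior-year benchmark rates and current-year rates for each measure
# (all values illustrative, chosen to echo the pattern reported below)
benchmark = {"attendance": 0.93, "retention": 0.90, "high_grades": 0.40}
current   = {"attendance": 0.93, "retention": 0.89, "high_grades": 0.37}

# Positive delta: better than similarly qualified students the year before;
# negative delta: worse
deltas = {m: round(current[m] - benchmark[m], 2) for m in benchmark}
print(deltas)
```

A college can then read its own deltas against the sector-wide ones to judge whether a movement is local or national.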

Figure 3.2: Six Dimensions outcomes: Reformed AS level qualifications

[Chart: performance versus national benchmark on each of the six dimensions measures; points measured in QCA points, where 15 points = 1 grade.]

In Figure 3.2, performance in is represented by the zero line. Performance in in reformed subjects is represented by the red and blue bars that (sometimes) extend above or below that zero line. Put simply, if bars extend above the zero line, it suggests that, having adjusted for prior attainment and subject choice, performance was better in than it was the year before. If bars extend downwards, it suggests that performance was not as good. Where a bar sits almost exactly on the zero line, as is the case with achievement in Figure 3.2, it suggests performance is exactly in line with the previous year. We see that for most measures performance is pretty much in line with that of the previous year: attendance is slightly up, retention is slightly down. The proportion of high grades, though, is down by 3%, and the points measures are down by one point, representing a fifteenth of a grade per entry, or one grade fewer for every fifteen students. While these changes are not dramatic, they do provide some useful context for looking at performance in an individual college. A college might be somewhat reassured to know that overall high grades rates are down 3% if its own high grades rate has dropped by a similar amount. This initial analysis helps us to understand whether features in the data are a local problem or part of a national issue. Figure 3.3 provides a four outcomes analysis across the GCSE profile. The most well qualified students are found towards the right hand side of the graph, the less well qualified towards the left. We can use this analysis to see whether changes have been evenly felt across the ability range, or whether there are particular groups of students that have fared less well than others.
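The points-to-grades conversion used throughout the analysis is simple enough to sketch directly; the scale is the QCA points convention stated in the figure:

```python
POINTS_PER_GRADE = 15   # QCA points: fifteen points equal one AS grade

def points_to_grades(points_delta):
    """Express a change in points per entry as a fraction of a grade."""
    return points_delta / POINTS_PER_GRADE

# A one-point drop per entry is a fifteenth of a grade per entry, so
# across fifteen entries it amounts to one whole grade in total
print(points_to_grades(-1) * 15)
print(points_to_grades(-15))   # -1.0: a full grade per entry
```

This is why a one-point movement, small as it sounds, is worth reporting: over a large cohort it represents a real shift in grades awarded.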
We see, reassuringly, that the patterns of leaving, failing, passing with a low grade and passing with a high grade are very similar to those in previous years.

Figure 3.3: Six Dimensions outcomes: all subjects

[Chart: for each prior attainment band, the proportions achieving A* B grades, C E grades, failing, and leaving the course.]

What we are seeing in Figures 3.2 and 3.3 is aggregate performance across all subjects. Performance in all thirteen reformed specifications (with separate analysis for the different strands of the Art specifications) is included as an annex to this report. When we look at these individual subjects we do see some significant variation. While overall performance has not seen dramatic change, there are some quite telling changes at individual subject level. The Art and Design specifications, Psychology and the English specifications have seen a significant drop in performance. We can only speculate as to the factors that influenced these changes, but the change in performance in the Art subjects was, one suspects, intentional. Figures 3.4 and 3.5 examine outcomes for the 6,986 students in our sample who started AS level courses in English Literature in September .

Figure 3.4: Six dimensions outcomes: GCE AS level English Literature

[Chart: performance versus national benchmark; points measured in QCA points, where 15 points = 1 grade.]

We see that in the basic retention, achievement and success analysis, little has changed. Having adjusted for prior attainment and compared performance with AS level English Literature students from , we see that as many students reached the end of the course as would be expected, and around as many passed the qualification as would be expected given the patterns of the previous year. The high grades figure stands out, and is some way adrift of expected levels of performance. The various points measures all come out below performance in the previous year. To develop a fuller understanding of what has happened with high grades, we need to consult the four outcomes graph for the subject.

Figure 3.5: Four outcomes graph: GCE AS level English Literature

[Chart: proportions achieving A* B grades, C E grades, failing, and leaving the course, by prior attainment band.]

High grades are represented by the dark blue colour towards the top right hand side of the graph. The shaded background represents what happened in , and the thin darker bars represent performance in . Concentrating on the darker blue (the A and B grades), we see that in all the prior attainment bands above 5.8 the cohort is performing below expectation in terms of high grades. What is really interesting here is that the gap between expected and actual levels of performance actually widens the further up the prior attainment profile you go. Those centres attracting very well qualified students will have felt the drop in high grades more acutely than those with more modestly qualified students.

Figure 3.6: overall six dimensions outcomes: reformed AS levels versus benchmarks

                                             Att   Ret   Pass   Ach   High   PPS   PPC   PPA
GCE AS Level Art & Design Fine Art
GCE AS Level Biology
GCE AS Level Business Studies
GCE AS Level Chemistry
GCE AS Level Computing
GCE AS Level Economics
GCE AS Level English
GCE AS Level English Lang & Literature
GCE AS Level English Literature
GCE AS Level History
GCE AS Level Art & Design Photography
GCE AS Level Physics
GCE AS Level Psychology
GCE AS Level in Sociology
GCE AS Level Art & Design 3D Design
GCE AS Level Art & Design Graphics
GCE AS Level Art & Design Textiles
GCE AS Level all subjects

Figure 3.6 presents the data in summary form for each of the eight measures in the six dimensions analysis. If performance in a subject on a particular measure is in line with that of similarly qualified students the year before, the score will be zero. A positive score suggests that students performed better in the reformed specifications than their similarly qualified peers the year before; a score below zero suggests that students fared less well. While in English Literature we have seen a drop in high grades, this has not been the experience in all reformed subjects in sixth form colleges. In fact, we see that in some subjects performance was higher in than it was in . Figure 3.7 looks at the points per completer measure, and converts it into an analysis of grades per completer. This is an important measure as it summarises changes in performance for those students who sat the exam.

Figure 3.7: Grades per achiever , versus benchmark

[Chart: grades per completer, by reformed subject.]

Performance in is represented by the zero line. The graph reports in terms of grades per completer. A score of -0.25 would suggest that a quarter of students scored a grade lower than that achieved by similarly qualified students in that subject the year before. A score of +0.1 suggests that one in ten students scored a grade higher than similarly qualified students the year before. We see that in Photography and 3D Design, performance dropped by just over a third of a grade per student: across a class of twenty students, seven scored a grade lower than similarly qualified students the previous year. In contrast, we see modest improvements in performance in Chemistry, Business Studies, Physics and Computing. The Points Per Completer Analysis Of all the measures in the six dimensions analysis, this is the one closest to a traditional value added model, in that it looks at those students who sit exams, and examines the points per entry attained at different points in the prior attainment spectrum. Figure 3.8 presents the points per completer analysis. The benchmark established with the data is represented by the black line; performance in the reformed specifications is represented by the red line. Note that the data table underneath the graph gives the number of entries across all subjects in , and the points scores in each band in both years. Overall, we see a change of one point per entry in the first eight bands, and a narrowing of the gap in the top three bands. Overall, then, there is no significant story in terms of students getting dramatically different results to similarly qualified peers the year before. Do bear in mind that in this context, a situation of no change is as significant as a situation of dramatic change. It helps us understand where we are, and where we need to focus our energy.
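Reading a grades-per-completer score off at class level can be sketched as follows; the class size of twenty is the illustrative one used in the text:

```python
def students_a_grade_lower(score, class_size=20):
    """Roughly how many students in a class a grades-per-completer
    delta corresponds to (negative scores mean a grade lower)."""
    return round(abs(score) * class_size)

# Photography / 3D Design: a drop of just over a third of a grade
print(students_a_grade_lower(-0.35))   # 7 of 20 scored a grade lower
```

The same function makes the smaller movements concrete too: a score of -0.25 corresponds to five students in a class of twenty.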

Note that what we have not attempted here, but will certainly do with the first analysis of reformed A levels in the autumn of 2017, is to examine volatility at individual college level. Overall, the grades might be very similar to those awarded in previous years, but that does not mean that the same centres that did well in 2015 did well in 2016.

Figure 3.8: points per completer: 2016 outcomes compared to 2015 benchmarks

[Chart: points per completer, actual performance compared to benchmarks; fifteen points equal one AS level grade.]

We can repeat this year on year analysis for all the individual subjects, and do so in Appendix One below, but before we do, we ought to see how individual subjects compare to the general pattern of performance indicated in Figure 3.8. The relationship between prior attainment and outcomes varies considerably when individual subjects are examined. Figure 3.9 summarises this information, plotting the lines for each individual subject against a thicker red line, which is the line for all entries in reformed AS level subjects.

Figure 3.9: Points per completer by GCSE band, by reformed AS level subject

[Chart: one line per reformed AS level subject (the Art and Design endorsements, Biology, Business Studies, Chemistry, Computing, Economics, the English subjects, History, Physics, Psychology and Sociology), plus a grand total line.]

In Figure 3.9, performance in all reformed subjects is represented by the thick red line; the other lines represent the individual reformed subjects. We see two major features in the data. The lines for some individual subjects are well above the all subjects line, while those for others are some way adrift of it. Note that the scale used on the vertical axis moves in multiples of fifteen, which is equal to a grade: similarly qualified students secure two grades higher in some subjects than they do in others. Note also that this variation from the all subjects line is not even across the prior attainment profile; towards the top end of the profile the gaps tend to narrow. There is something to consider here for any colleges using target grades that are the same for all subjects, if they are using these for quality assurance (as opposed to pastoral) purposes.

An example might make this point clearer. We can use Figure 3.9 to tell us the average outcome for any subject at any point in the prior attainment spectrum. Take a student with an average GCSE score of 5.4. This student is in the band 5.2 to 5.5. We see that, on average, students in this band achieve grade D in their AS level exams. If we look at the line for Physics, however, we see that physics students in this band secure, on average, an E grade (with as many failing as secure D grades).
In the Art specifications, students are averaging a C grade (with as many getting B grades as get D grades).

We have seen that the relationship between prior attainment and outcome varies significantly from subject to subject, but we should also note how entry patterns vary from subject to subject.

Figure 3.10: Entry patterns by prior attainment profile: reformed AS levels

[Table: entries by prior attainment band for each reformed AS level subject, with a grand total row; the figures were lost in extraction.]

The contrasts in entry patterns from subject to subject are remarkable. Overall, 25% of entries are found in the top three prior attainment bands, but in sociology and business only 8% of enrolments are found in those bands, while in physics the figure is 44%. As we consider changes in performance in each subject, it is instructive to consider the prior attainment profile of the students attempting the qualification. In individual subjects we find the following patterns.

Art and Design (Fine Art), Art and Design (Photography) and English see falls in performance across the prior attainment spectrum in the region of 5.0 points per entry: the equivalent of one in three students getting a grade lower than last year.

Biology, Business Studies, Economics, English Language and Literature, History and Sociology are almost exactly in line with 2015 performance.

Chemistry is very much in line with previous years, but note that the shape of the line is very much an upward sloping curve. This curve is also seen in physics, though in physics the performance is stronger at lower levels of prior attainment.
Computing students performed better in 2016, though performance in the top bands was very similar. There are significant drops in performance in English Literature, and the size of the gap widens as prior attainment increases. In Psychology, performance is lower than in 2015 at lower levels of prior attainment, but this gap narrows as prior attainment rises; at the very top, the 2016 cohort outperforms that from 2015.

How can we explain the changes in performance? With any change in specification, the best institutions will see some of their carefully cultivated advantage eroded. As sixth form colleges outperform other institution types, we are more susceptible than others to this regression towards the mean. We must be careful, though, not to overplay this hand. While in the heyday of modular A levels sixth form colleges were some distance ahead of other providers, with the end of January exams and the reduction in centre assessed coursework the gap between sixth form colleges and other providers has narrowed at A level. We do not have a clear view of performance in schools at AS level, as L3VA only looks at the AS level outcomes of those students who do not continue the qualification to the full A level.

The best colleges might not have entered students for AS level qualifications, so part of the story might be institution level behaviour. It is true that some of the top performing sixth form colleges have moved straight to a linear model, and so we do not see their excellence as part of this particular picture. However, it is also true that many of the top performing sixth form colleges have not moved straight to a linear mode of delivery and have entered all students for AS level exams, so it is unlikely that there is significant distortion here.

The new AS levels bring with them new patterns of assessment, new assessment objectives and new styles of question. In the early stages of a specification, teachers will be less attuned to the nuances of exam questions, and will not have a substantial collection of past papers to draw on in helping students prepare for external assessments. Part of the story might therefore be a teaching and learning explanation.

It is often suggested that examiners make less use of the extremes of the mark scheme when faced with a new specification, so we should consider the significance of examiner behaviour.
As examiners and team leaders become more familiar with a specification, they become more confident in making use of the full range of marks available. The effect of this in the short term is to depress grades at the top end of the ability spectrum and inflate grades at the lower end. There are some subjects where we could suggest that this is a feature (AS level Computing, for example). We should also consider awarding body behaviour as a possible driver: there may be a deliberate attempt by some boards to temper the use of the top bands in the mark scheme. The data would suggest that something along these lines has happened in Art and Design, English and English Literature.

There may also be specification effects, to do with the way that AS level and A level specifications have been designed. It may simply be that it is very difficult to prepare students effectively for the AS level exam while delivering a successful first year of a two year linear A level specification. Many colleges will have encountered particular subjects that were very concerned about this. What is interesting is that the identity of the subjects considered particularly problematic varies from college to college. It may well be that slightly poorer outcomes at AS relate to these difficulties. We should also consider whether there is a student effect at work here: have students prepared as seriously for an exam in a linear specification as they would for a modular exam, for example?

The subjects that are perhaps most interesting are English Literature and the Art endorsements. It would be very strange if the magnitude of the changes we are seeing solely reflected the ways that different colleges were preparing students for the reformed specifications. There must be decisions made in marking, moderating and awarding that produced the patterns we see here.

If one were to summarise the changes we have seen in AS level outcomes, the summary would probably be closer to mild turbulence than to dramatic change. However, the degree of turbulence seems to vary by subject. It reinforces the notion that we need to be cautious of over interpretation, but remain robust in our challenge to underperformance. We find ourselves at the end of the first year of a period of change that is going to take at least five years to work itself through. It will not be until 2019 that we have our first fully linear A level awarding season, and an understanding of the relationship between performance in the revised 9 to 1 GCSE specifications and performance at A level will take longer still.

Chapter Four: on retention

The performance tables measures for 2016 outcomes introduced a retention measure. In many ways this was long overdue. Previous iterations of the performance tables have included a range of measures that look at how well students do when they sit exams, but have not extended to how successful institutions are over two years. Pre 16, the performance tables' focus on exam outcomes makes some sense: schooling is compulsory, and broadly speaking it makes sense to look at the achievements of pupils at the end of key stage two and key stage four. Developing a retention measure for 16 to 18 year olds was always going to be a more challenging exercise than reporting on exam outcomes. While participation is now compulsory, full time attendance for two years at an institution is not.

At first glance it would appear simple to calculate retention: look at the students that started in September in a particular year, and look at how many are still there in June, 21 months later. However, our world is complicated by the various routes through post 16 education, by the choices we have in how to record student enrolments, and by a significant number of students doing level 2 courses in the year after leaving school: some of these students quite legitimately leave after a year to pursue other courses and employment and training opportunities, while others stay for three years.

Before the performance tables were released, we knew that they would be slightly curious. When developing the methodology, the Department made a decision to count students who left after a single year of AS level study as being retained. An example of why this has a distorting effect will make this clearer.
A student who joins a college in September and leaves after six months is counted as not being retained.

A student who joins a college, lasts the whole of the lower sixth and joins the upper sixth, only to decide in November to leave to do an apprenticeship, is counted as not retained.

A student who joins a college in September, is recorded on the ILR as doing three two year linear A levels, but leaves in August having realised that college is not for them, is counted as not retained.

But a student who joins a college in September with the intention of doing a two year A level programme, is recorded on the ILR as doing three AS levels, and leaves in August having realised that college is not for them, is counted as retained.

It is the final case that is the issue, and one that is going to become more of an issue over the next few years. The feature that allows students to be counted as retained even if they leave at the mid point of a two year course is the fact of being recorded on the ILR as doing three AS levels. It means that students who are recorded as being on a modular programme can leave mid course without penalty, while those who are on a linear programme cannot. Colleges can still choose to record students in a modular fashion: they could be recorded on one year AS levels, which are later converted to two year A levels. To accord with the spirit of the funding regulations, students would have to be entered for AS level examinations, but the cost of examination fees might be a small price to pay, as we shall see shortly.
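The cases above reduce to a single rule as the report describes it: a leaver counts as retained for the year if they completed the courses they were recorded on (true for one year AS levels finished in May, false for a two year linear programme abandoned at its mid point). A sketch of that rule; the function and argument names are illustrative, not the real ILR schema:

```python
# Sketch of the retention rule described in the text. A student counts as
# retained if they completed the programme, or if they finished the courses
# they were recorded on in the ILR before leaving. Names are illustrative.

def counted_as_retained(completed_programme, finished_recorded_courses):
    return completed_programme or finished_recorded_courses

# Leaves after six months, no recorded course finished:
print(counted_as_retained(False, False))  # False
# Leaves in August; recorded on three one-year AS levels, all completed:
print(counted_as_retained(False, True))   # True
# Leaves in August; recorded on three two-year linear A levels:
print(counted_as_retained(False, False))  # False
```

The asymmetry the report complains about sits entirely in the second argument: the same student, leaving on the same day, flips from retained to not retained depending only on how their courses were recorded.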

Our first glimpse of the performance tables team's attempt to produce a measure of retention is based on the cohort of students starting in September 2014. In effect, this was the last fully modular cohort to go through the system, and therefore the last cohort with an automatic (and penalty free) jumping off point in the middle of the course. Using the six dimensions data set we can subject the league tables retention measure to some scrutiny. We can ask a number of questions:

What is the picture of retention across the sixth form college sector that emerges from the Department for Education's retention measure?

What is the gap between the retention that the performance tables report and actual retention in colleges?

Does the performance tables measure produce the correct rank order?

In the six dimensions data set there are some 65,359 students who started at least three A levels or BTEC equivalent qualifications in September 2014. Figure 4.0 looks at the retention of these students over two years.

Figure 4.0: Retention: Sixth Form Colleges (six dimensions data)

[Table: numbers and percentages leaving during the lower sixth, between the lower and upper sixth, and during the upper sixth, alongside those completing each year and the overall number and percentage retained; the figures were lost in extraction.]

The methodology used here selects students who do at least three A levels' worth of activity. This includes students recorded on year one of two year A level and BTEC courses, and those recorded on courses that form the basis of the first year of a two year programme of study: AS levels, and BTECs of a width equal to 1.5 A levels or narrower. The retention of these students is examined after one year, and the data for the following year is examined to see whether these students progress to starting and completing the upper sixth year. In all instances we are measuring starting a course in the ILR sense of having completed at least 42 days of study.
Students that leave before the end of six weeks are not included in the data, and any courses that are dropped in this time are not included in the analysis. There are three significant features of the data: many students leave during the lower sixth year; a similar number leave between the end of the lower sixth and the beginning of the upper sixth; and few students leave during the upper sixth. Figure 4.1 looks at the retention rate in each of the colleges in the analysis.

Figure 4.1: Retention: Sixth Form Colleges (six dimensions data)

Retention ranges from 96% in the most highly performing colleges down to around 65%. The interquartile range (the gap between the college at the 75th percentile and the one at the 25th percentile) spans from 85% to 77%.

Figure 4.2: Retention: Sixth Form Colleges (Department for Education data)

Figure 4.2 shows the retention figures as produced in the performance tables. Retention levels range from a remarkable (and frankly unlikely) 99.7% down to 87%, and the interquartile range shows that

college scores are much more bunched, with a spread from 96% to 92%. The league tables data is publicly available. 4

We know from Figure 4.0 that actual retention over two years in sixth form colleges is 81.7%. The performance tables measure puts it at 94.6%. Counting those that leave at the mid point as retained produces a figure that is some way from reality: the performance tables figures are about 13 percentage points out overall.

Figure 4.3: DfE calculated retention and actual retention

Figure 4.3 arranges the sixth form colleges according to the DfE's retention measure. The college at the far left has the lowest calculated retention (86.5%); the one on the far right the highest (99.7%). The red lines indicate how far the DfE figure is from actual retention (as calculated using the six dimensions dataset). The college on the far left has a bar that reaches 16, suggesting that the DfE data overestimates its retention by 16 percentage points. For the next college the gap is even larger, with the DfE measure reporting some 22 percentage points higher than that suggested by the six dimensions analysis. While a couple of colleges are very close, they are very much the outliers in the data set.

What is particularly interesting is the spikiness of the data: it is, for statisticians, leptokurtic. The differences between the calculations vary widely from data point to data point. Sometimes it is helpful if a measure is wrong in a consistent way, but we see no such consistency here. Not only does the Department for Education produce a picture of retention that is a long way from the reality, the picture for individual institutions is unreliable. While in time the measure will become more useful, particularly for A level provision, currently it is not fit for the purpose of reporting retention.

4 school-performance.service.gov.uk/download-data

Before we close this section on the performance tables, it is worth looking at how the retention measure relates to prior attainment at the individual institutions in the analysis. Note that the inclusion of BTEC students means that the profile of average GCSE scores is significantly lower than we are used to seeing when looking at A level alone.

Figure 4.4: Retention and prior attainment (six dimensions)

In Figure 4.4, colleges have been arranged in terms of prior attainment, with the lowest levels of prior attainment at the left. We see that the highest levels of retention are found in those colleges with the highest levels of prior attainment, and that those with lower levels of retention tend to be found at lower levels of prior attainment. However, we also see that the general trend of higher retention at higher levels of prior attainment is not reproduced in every college. This demonstrates the importance of adjusting for prior attainment in any analysis of retention: an opportunity that has been missed in the current performance tables measures. Without this adjustment, we cannot see where the most successful institutions are, and if we do not know where they are, we cannot seek to learn from them. As colleges move to linear models of delivery and two year recording of courses, the impact of the mid point escape clause for students recorded as AS level students will reduce, and in time disappear.
There are significant implications here.

Almost all colleges will see a drop in recorded retention over the next four years, but the speed of the drop will depend on institution level decisions about how to manage programmes of study, and how quickly to record all A level courses as two year courses.

Those colleges that adopted a fully linear model early will see their retention figures fall before other providers.

For some colleges the drops in recorded retention will be dramatic: from around 90% in the 2017 league tables to 70% by 2020.

It will appear that there has been a significant change in performance across the sector, and in individual institutions.

Colleges that work in particularly competitive local environments will need to be mindful that (on average) schools and colleges with high levels of prior attainment will see less of a drop in retention than other providers.

Ofsted are required to use the measures in the performance tables as part of their evidence base for inspection. They will not have the nuanced understanding of the approaches taken in different colleges to be able to judge the significance (or insignificance) of the data they have to hand.

Thus far we have explored retention as one of a number of measures in the performance tables, and, if we are honest, a measure that will not get the attention it deserves in an arena where the A level pass rate is king. There is, however, a much more serious dimension to changes in the recording of A level programmes of study, and this one involves money. In the funding formula, an adjustment to college allocations is made based on the historical retention rates at the college in question. Put simply, the retention figure for the college the previous year is calculated, expressed as a decimal (eg 0.95), divided by 2, and 0.50 is added to the total. A college with a retention rate of 90% will have a retention factor of 0.95; a college with a retention rate of 96% will have a retention factor of 0.98. The impact of the retention factor is to reduce the amount of funding in the allocation: with a retention factor of 0.98, an allocation of £10,000,000 is reduced to £9,800,000.

The concept of a retention factor is not entirely a bad thing. It is saying that it is more expensive to educate a student for a full year than for six months, and that therefore colleges where lots of students leave early should receive less funding than ones where almost all get to the end of the year. The issue here relates to the switch from recording A levels as two one year courses to recording A levels as one two year course.
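The retention factor arithmetic described above is simple enough to sketch. A minimal illustration of the formula as the report states it (the £10,000,000 allocation is the report's own worked example):

```python
# Retention factor as described in the text: the previous year's retention
# rate (as a decimal) is halved and 0.50 is added; the resulting factor
# then scales the funding allocation down.

def retention_factor(retention_rate):
    """retention_rate as a decimal, e.g. 0.90 for 90% retention."""
    return retention_rate / 2 + 0.50

def funded_allocation(allocation, retention_rate):
    return allocation * retention_factor(retention_rate)

print(round(retention_factor(0.90), 2))            # 0.95
print(round(retention_factor(0.96), 2))            # 0.98
print(round(funded_allocation(10_000_000, 0.96)))  # 9800000
```

Note the floor built into the formula: even a retention rate of zero would yield a factor of 0.50, so the adjustment can never remove more than half of an allocation.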
The methodology for calculating the retention factor has not changed: to count as retained, a student either has to finish one of their academic courses in a particular year, or continue into the next year. 5 6

5 16 to 19 funding: how it works
6 Funding rates and formula

In a modular world, a lower sixth student was counted as retained if they stayed to the end of the AS level course, which for most students was recorded as ending in May. If a student left in June, July or August (responding perhaps to the experience of sitting AS exams, or to their own or the college's reaction to AS outcomes), the college was not penalised in the retention factor, as the student had already completed the learning activities for the courses they were recorded on.

When we move to recording A level as a two year course, things become more problematic. Now, a student that leaves in June or July of the lower sixth year is counted as not being retained. Currently colleges are partially protected against the full effect of this, as some courses (at least until September 2017) retain a modular structure. If, for example, a student enrolled on AS maths (as one of their courses) in September 2016, they will be counted as retained as long as they get through to the AS level exam in May 2017, and will not have a negative impact on the funding allocations issued in Spring 2018 for the following funding year. The effect of the change will be experienced gradually, with some colleges not seeing a significant change until a later allocation round.

If students on a two year course leave in June or July, the situation is clear: they are leaving during a two year course, have not completed any of the learning activities, and will count against the college in the retention factor. Where things get rather more confusing is in relation to the question

of what happens to students who leave in August. In the ILR, leaving dates are counted as the date of last attendance. It would be a very strange college that required attendance in early August, so a summer leaver will have their leaving date counted as the last day they attended during the lower sixth. This will be in July (or, in some colleges, June). As far as the ILR is concerned, the student will not have been retained, and will count against the college in the funding formula.

Figure 4.5 provides a rather dramatic illustration of why colleges should be worried about this change. It lists the worst case scenario for each college, in cash terms, of moving to the recording of enrolments over two years rather than one.

Figure 4.5: funding implications of recording enrolments as two year linear courses

[Table: modelled cash reductions for individual colleges, ranging from around £430,000 down to £10,000; most of the figures were corrupted in extraction.]

The methodology deployed here needs a little explanation. In essence, it involves looking at the impact of counting students that leave in June, July or August as not retained in the calculation of the retention factor, as opposed to counting them as retained. In the analysis behind Figure 4.5 we have calculated retention in each college using a one year methodology, and then repeated the analysis using a two year methodology. For the two year methodology, we have assumed that all courses (A level, BTEC or CAMTEC) have been recorded as two year courses. We have used this data to generate a retention factor based on each methodology. By comparing the difference between the two retention factors we can model the cash impact on colleges.
These retention factors have been applied to the volume of level 3 activity in each college, using a standard figure of £4,000 per student. In Figure 4.5, each number represents the reduction in funding in cash terms for an actual college. Figure 4.6 uses a simplified version of this approach to illustrate how this might happen in a college.

Until recently, all A level students were recorded as being on a set of one year AS level courses in the lower sixth, and then recorded on a new set of A2 courses in the upper sixth year. A student was counted as being retained for the lower sixth as long as they reached the end of one of their AS level courses. Under the new arrangements, most colleges are moving towards recording A level over two years. Figure 4.6 looks at how the retention figure for a college changes if you move from one recording system to the other. The crucial difference is that with two one year courses, leaving at the half way point does not matter: you count as being retained for the year. Under the two year recording methodology, a student leaving at the mid point does not count as being retained. The impact in Figure 4.6 is that under a two year methodology, 100 extra students are counted as leaving, which reduces the retention figure from 90% to 80%, and the retention factor used in funding calculations from 0.95 to 0.90: equal to a 5% cut in funding.

Figure 4.6: The impact of recording A levels as two year courses on the retention factor: an example college

                                              Old system (AS then A2)   New system (two year A levels)
Students starting lower sixth          1000
Students leaving during lower sixth      90   Not retained              Not retained
Students leaving between lower and
upper sixth                             100   Retained                  Not retained
Students leaving during upper sixth      10   Not retained              Not retained
Students completing two years           800
Retention                                     90%                       80%
Retention factor                              0.95                      0.90

What makes this cut unusual is that it is an unintended cut. It simply emerges from the curriculum change at A level and the change in the width of the typical lower sixth study programme, both of which bring with them a logic to recording courses over two years. In cash terms for the sector, this equates to a potential cut of £10,748,000. The methodology we have used to model the effect of this change in recording methodology does not take into account students on level 2 programmes of study, and in reality the effects will be slightly softer in colleges with large numbers of students on mixed/BTEC programmes, where enrolments are still recorded as two one year courses. Figure 4.7 repeats the presentation of data in Figure 4.5, but represents the cut as a proportion of college budgets.

Figure 4.7: funding implications of recording enrolments as two year linear courses: % budget cut

There is a little good news to counterbalance some of this. It is likely that, in the absence of a clear external set of exams at the half way point of a typical A level programme of study, more students will progress into the upper sixth anyway. These students will bring with them the funding they attract from being included in the numbers for their upper sixth year (benefitting the subsequent year's allocation), and will also have a positive effect on the retention factor.
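The Figure 4.6 example can be verified directly. A sketch using the figures from the table above (the £4,000-per-student funding volume mirrors the report's modelling assumption):

```python
# Worked version of the Figure 4.6 example: the same cohort, retention
# calculated under modular (two one-year courses) and linear (one two-year
# course) recording, then converted to a retention factor.

STARTERS = 1000
LEFT_DURING_LOWER_SIXTH = 90   # not retained under either system
LEFT_BETWEEN_YEARS = 100       # retained under modular recording only
LEFT_DURING_UPPER_SIXTH = 10   # not retained under either system
COMPLETERS = 800

def factor(retained):
    """Retention factor: rate as a decimal, halved, plus 0.50."""
    return retained / STARTERS / 2 + 0.50

modular_retained = COMPLETERS + LEFT_BETWEEN_YEARS  # 900 -> 90% retention
linear_retained = COMPLETERS                        # 800 -> 80% retention

print(round(factor(modular_retained), 2))  # 0.95
print(round(factor(linear_retained), 2))   # 0.9

# Applied to an allocation modelled at 4,000 pounds per level 3 student:
allocation = STARTERS * 4_000
cut = allocation * (factor(modular_retained) - factor(linear_retained))
print(round(cut))  # 200000 - a 5% cut, as in the text
```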
There are also ways of recording individual enrolments that can soften the impact: entering all students for AS level courses and exams; enrolling them for just one of their courses as an AS level and ensuring that

that is recorded as the 'core aim'; or having an August leaving day, where everyone not continuing into the upper sixth comes in, is counted as attending something in August, and so is signed off for the previous year. While there may be mileage in re-thinking the details of how enrolments are recorded, really we should not have to. In performance terms (retention rates and achievement rates), we are already held to account if students do not reach the end of their courses. This is appropriate. To penalise colleges further is unnecessary and unfair: we bear the costs of educating these students for the year. If a student reaches May of the lower sixth year, they should count as being retained for funding purposes.

Chapter Five: on deprivation

The impact of deprivation on educational performance is finally getting some of the attention it deserves. At GCSE the figures are stark. In the latest published figures, 55.6% of pupils secured five GCSEs including maths and English: for those not eligible for free school meals the figure was 60.5%; for those eligible for free school meals it was 33.5%, a gap of 27.0 percentage points. 7

Alongside the issue of achievement, there is the issue of the quality of the curriculum accessed by pupils from a free school meals background. The EBacc measure (as used in the performance tables) looks at the proportion of students who achieve a pass grade in English, maths, history or geography, the sciences and a language. Whereas 26.6% of pupils from a non free school meals background achieved the EBacc, just 9.7% of free school meals pupils did. This suggests differences in the quality of participation, and that students from a free school meals background are more likely to be offered an impoverished programme of non traditional GCSEs. Even those free school meals pupils that pass five GCSEs are less likely to be doing the subjects that open doors.

The issue of deprivation has not gone unnoticed by Ofsted, and the Raiseonline inspection dashboard now contains a breakdown comparing the progress of students from a free school meals background with that of other students. If we are to engage with this agenda effectively we need to understand what happens across the sixth form college sector. We can ask a number of questions of the six dimensions data set which will provide important context for colleges when considering the participation and performance of students in their own colleges:

What proportion of students in sixth form colleges are in receipt of free college meals?

What is the prior attainment profile of these students?

How well do they perform compared to other students?

Do they access facilitating subjects in similar numbers to other students?
In schools, around 13.8% of pupils in Year 11 are eligible for free school meals. As the proportion of pupils from an FSM background that get five GCSEs including maths and English (the baseline entry requirement for sixth form colleges) is lower than it is for other pupils, we would expect participation of free school meals students to be lower in sixth form colleges than it is in schools. If we look at the students who achieve five or more GCSEs including maths and English, we find just 8.9% of those are from a free school meals background. In SFCs, in AS level provision, 7.7% of enrolments are from students in receipt of free college meals. In BTEC provision the proportion of enrolments is exactly the same. In level 2 provision the proportion of students in receipt of free college meals is higher: in GCSE provision it is 15.0%, and in other level 2 provision 15.0%. Sixth form colleges, then, are doing significant work with those from disadvantaged backgrounds.

The prior attainment of students starting sixth form college from a free college meals (FCM) background is significantly lower than it is for other students. At AS level, the average GCSE score for FCM students is 5.7, against a figure for non-FCM students of 6.0. Given this difference in prior attainment, we would expect to see different patterns of subject entry. In Chapter Three: First Blood, Figure 3.10 showed us that the entry patterns across subjects are very different.

7 GCSE and equivalent attainment by pupil characteristics

Some subjects (business studies and sociology, for example) attract students with lower GCSE scores than others (biology, chemistry, maths). If the free college meals students followed these patterns, it might look like the quality of participation was different. Figure 5.0 produces a fair test to see if FCM students pursue a curriculum similar to equally qualified peers from a non-FCM background.

Figure 5.0: proportion of enrolments on facilitating subjects (AS level), FCM versus non-FCM students, by GCSE band (columns: GCSE band; % of enrolments in facilitating subjects for non-FCM students; % of enrolments in facilitating subjects for FCM students; difference)

Figure 5.0 uses facilitating subjects as a proxy for high quality participation. While the whole concept of facilitating subjects is highly questionable, it does represent a group of traditional subjects. If we did see that FCM students were much less likely to do these subjects than similarly qualified peers from a non-FCM background, then we would have reason to be concerned and could raise questions about bias in college processes that directed FCM students away from traditional subjects. Sociologists have explored concerns such as these over many years. In The Educational Decision Makers8, Cicourel and Kitsuse (1963) examined the impact that guidance counsellors in American schools had on subject choice. They concluded that postcode, previous school and other markers of a student's social class significantly influenced the courses suggested by the counsellors. In sixth form colleges we find that students from an FCM background are actually over-represented in facilitating subjects. In the top prior attainment band, 76.5% of the subjects chosen by FCM students are facilitating subjects. Amongst their more affluent peers the figure is 73.5%. This advantage is found in all but two of the eleven prior attainment bands. This is a powerful endorsement of sixth form colleges.
We have established that FCM students participate well at level 3, and are over-represented in level 2 provision, and we can conclude that their subject choices are every bit as challenging as those of similarly qualified students from a non-FCM background.
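The fair-test comparison described above can be sketched in code. This is a minimal illustration, not the six dimensions implementation: the facilitating-subject list, band labels and enrolment records below are all hypothetical.

```python
from collections import defaultdict

# Hypothetical list of "facilitating" subjects for the sketch.
FACILITATING = {"maths", "biology", "chemistry", "physics",
                "history", "geography", "english literature"}

def facilitating_share(enrolments):
    """Return % of enrolments on facilitating subjects per (band, FCM) group.

    Each enrolment is a (gcse_band, is_fcm, subject) tuple."""
    counts = defaultdict(lambda: {"fac": 0, "all": 0})
    for band, is_fcm, subject in enrolments:
        counts[(band, is_fcm)]["all"] += 1
        if subject in FACILITATING:
            counts[(band, is_fcm)]["fac"] += 1
    return {k: 100.0 * c["fac"] / c["all"] for k, c in counts.items()}

# Illustrative records only: two FCM and two non-FCM enrolments in one band.
enrolments = [
    ("7.5+", True, "maths"), ("7.5+", True, "chemistry"),
    ("7.5+", False, "maths"), ("7.5+", False, "sociology"),
]
shares = facilitating_share(enrolments)
print(shares[("7.5+", True)], shares[("7.5+", False)])  # 100.0 50.0
```

Comparing the two shares within each prior attainment band is what makes the test "fair": like is compared with like.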

The final piece of the jigsaw is to establish how well these students perform. We can explore this in terms of how well they complete their courses, how well they pass the qualifications that they start, and whether they get similar grades to equally qualified non-FCM peers.

Figure 5.1: free college meals versus non-free college meals outcomes: AS level (for each group: starts, with actual and expected retention, achievement and grades per entry)

Figure 5.1 explores the outcomes at AS level. The analysis presented is a standard six dimensions analysis: it looks at how well a particular group performs compared to similarly qualified students nationally. If we look at retention, the free college meals students complete 91.8% of the courses they start. If they had followed national patterns for similarly qualified students, they would have a 92.1% retention rate. The FCM students perform just 0.3% below expectation. In terms of achievement, they are just 0.7% below expectation. In terms of grades per entry, one in twenty students from an FCM background scores a grade lower than would be expected. This is pretty much as close as you can get to being in line with national patterns. What is most pleasing here is the retention figure. Accessing a sixth form college education is of no great use if you leave before the end of the course. Indeed, where we have found other groups that perform significantly below expectation (e.g. students with mental health difficulties) it tends to be retention that is the issue. If students complete the course, they do not tend to achieve grades far away from expectation.

Figure 5.2: free college meals versus non-free college meals outcomes: A2 level (in receipt of free college meals: 10,… starts; non-free college meals: 134,… starts; actual and expected retention, achievement and grades per entry)

At A2 level we see very similar patterns.
While the FCM students perform fractionally below expectation, it is a very small fraction. The picture we have painted is an important, but incomplete, one. In a year's time we will have access to a full two year run of data, so we can examine the extent to which participation is sustained over a two year programme of study. Colleges can look at their students from a free college meals background secure in the knowledge that there is no reason to expect the performance of these students to be different from that of other students.
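The notion of "expectation" running through this chapter can be illustrated with a small sketch: an expected rate is formed by weighting national per-band rates by the cohort's prior attainment mix, then compared with the cohort's actual rate. The band labels and rates below are hypothetical, not figures from the report.

```python
def expected_rate(cohort_bands, national_rates):
    """Expected rate for a cohort, given national rates per prior attainment band."""
    total = sum(cohort_bands.values())
    return sum(n * national_rates[b] for b, n in cohort_bands.items()) / total

cohort = {"low": 40, "mid": 40, "high": 20}           # starts per band (illustrative)
national = {"low": 88.0, "mid": 92.0, "high": 96.0}   # national retention %, per band
exp = expected_rate(cohort, national)
actual = 91.8                                          # the cohort's actual retention %
print(round(exp, 1), round(actual - exp, 1))  # 91.2 0.6
```

A positive difference means the group outperforms similarly qualified students nationally; a small difference either way, as with the FCM students above, means performance is in line with national patterns.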

Chapter Six: a note on prior attainment

In Chapter One we explored the turbulent world of GCSE reform. We saw that over the next three years the bundle of qualifications students arrive with will move, in three easy stages, from one where all subjects are graded using letters to one where all subjects are graded by numbers. The start and end of this process are reasonably straightforward, but how do we calculate an average GCSE score for students who arrive (from September 2017 onwards) with a mixture of numbers and letters? This is a matter of concern now, as we need to be able to set target grades for students arriving in September. We also need to ensure that colleges have calculated average GCSE scores in a way that will make sense when it comes to making returns to Alps or six dimensions. The vast majority of these students will have three subjects graded on a 9 to 1 scale. We need a way of mapping one set of qualifications to the other, and a common points scale to use for both.

There are two ways that this could be approached. One could develop a set of points values for the 9-1 qualifications that rebase to the 8-1 scale that is used currently. Alternatively, one could come up with new points values for the old A*-G qualifications. There are merits to both approaches, but perhaps the most significant factor is that in time we will all use a 9-1 scale. If we translated new grades to the old 8-1 scale, we would at some point in the future have to make a second set of changes. The Department for Education will be using a set of transitional points in some of its performance tables measures, so to avoid a further layer of confusion we have followed their schema. In their arrangement the gaps between grades are not consistent: above a grade C there are 1.5 points per grade; below a C there is 1.0 point between grades.
Figure 6.1: a new points scale for old GCSEs

Old GCSE grade | Old GCSE points | New GCSE points for old GCSEs | New GCSE points for double award GCSEs
A* | 8 | 8.5 | 17.0
A | 7 | 7.0 | 14.0
B | 6 | 5.5 | 11.0
C | 5 | 4.0 | 8.0
D | 4 | 3.0 | 6.0
E | 3 | 2.0 | 4.0
F | 2 | 1.5 | 3.0
G | 1 | 1.0 | 2.0

To calculate an average GCSE score, colleges need to calculate a points total for each student. For legacy GCSEs, the new points in Figure 6.1 should be used. For 9-1 GCSEs, the numbered grade achieved by the student should be used. The aggregate points score should then be divided by the number of subjects taken. Double award qualifications must be given double points in the calculation, and counted as two in the number of subjects taken. Figure 6.2 works through an example.

Figure 6.2: calculating average GCSE scores: an example (a student's legacy grades are converted using the new points for old GCSEs; three reformed GCSEs at grades 8, 7 and 6 contribute 21 points across 3 entries; total points divided by total entries gives an average GCSE score of 6.65)

For those of you that use six dimensions chances graphs for target setting, we will produce a new version of the Great Expectations analysis which will allow you to set subject specific target grades based on the revised average GCSE score scale.
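The calculation can be sketched as a short function. This is a minimal sketch using the transitional points for legacy grades described above; the grades passed in below are illustrative, not the student in Figure 6.2.

```python
# Transitional points for legacy grades (Figure 6.1):
# 1.5 points per grade above a C, 1.0 point per grade below.
OLD_GRADE_POINTS = {"A*": 8.5, "A": 7.0, "B": 5.5, "C": 4.0,
                    "D": 3.0, "E": 2.0, "F": 1.5, "G": 1.0}

def average_gcse(old_grades, new_grades, old_double_awards=()):
    """Average GCSE score for a mix of legacy (lettered) and 9-1 (numbered) GCSEs.

    Double award legacy GCSEs earn double points and count as two entries."""
    points = sum(OLD_GRADE_POINTS[g] for g in old_grades) + sum(new_grades)
    entries = len(old_grades) + len(new_grades)
    for g in old_double_awards:
        points += 2 * OLD_GRADE_POINTS[g]
        entries += 2
    return points / entries

# Three reformed GCSEs at 8, 7 and 6 (21 points) alongside five legacy grades:
print(round(average_gcse(["A", "A", "B", "B", "C"], [8, 7, 6]), 2))  # 6.25
```

Because the 9-1 grades are used at face value, the function needs no mapping for reformed subjects; only the legacy letters are translated.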

Appendix One: performance in revised AS levels

This appendix looks at performance in each of the reformed AS level subjects, and explores whether anything has changed in terms of how well students are doing in external examinations. In short, it attempts to establish whether similarly qualified students are getting the same grades in the reformed specifications as they were in pre-reform years. To examine this we conduct a points per completer analysis. Of all the measures in the six dimensions analysis this is the one that is closest to a traditional value added model, in that it looks at those students that sit exams, and examines the points per entry attained at different points in the prior attainment spectrum.

Figures A1-1 to A1-17 present the points per completer analysis. In all of the graphs, the benchmark established with pre-reform data is represented by the black line, and performance in the reformed specifications is represented by the red line. In each graph, 15 points equates to one AS level grade. Note that the data table underneath each graph gives the points scores in each band in both years.

Figure A1-1: GCE AS level Art and Design Fine Art: points per completer, actual performance compared to benchmarks

Figure A1-2: GCE AS level Art and Design Graphics
Figure A1-3: GCE AS level Art and Design Photography
Figure A1-4: GCE AS level Art and Design Textiles
Figure A1-5: GCE AS level Art and Design 3D Design
Figure A1-6: GCE AS level Biology
Figure A1-7: GCE AS level Business Studies
Figure A1-8: GCE AS level Chemistry
Figure A1-9: GCE AS level Computing
Figure A1-10: GCE AS level Economics
Figure A1-11: GCE AS level English
Figure A1-12: GCE AS level Language and Literature
Figure A1-13: GCE AS level English Literature
Figure A1-14: GCE AS level History
Figure A1-15: GCE AS level Sociology
Figure A1-16: GCE AS level Physics
Figure A1-17: GCE AS level Psychology

Appendix Two: understanding six dimensions reports

Six dimensions reports provide an extended value added analysis of performance in an individual school or college. Whereas traditional value added models restrict themselves to examining the grades students get in the exams they take, six dimensions reports consider a much wider basket of measures of performance. The analysis encompasses all students who started courses, not just those that sit exams. The six dimensions measures cover percentage measures and grade based measures. In essence, the measures examine whether students perform at the level that would be expected of similarly qualified students attempting similar qualifications nationally. The percentage measures ask whether the level of performance reached is in line with national patterns. The grade based measures look at whether students gather the grades that would be typical nationally.

The percentage measures are:

Attendance: of those that complete a course, is their attendance typical for the subject in question and the prior attainment profile of the students concerned?
Retention: of those that start the course, is the proportion of students completing the course typical for the subject in question and the prior attainment profile of the students concerned?
Pass: of those that complete the course, is the proportion of students that go on to pass the qualification typical for the subject in question and the prior attainment profile of the students concerned?
Achievement: of those that start the course, is the proportion that go on to achieve the qualification typical for the subject in question and the prior attainment profile of the students concerned? Hitherto, this measure has been referred to as a success rate, but the name has been changed to Achievement to correspond with the changed terminology used by the Data Service in the QAR reports.
High grades: of those that start the course, is the proportion that go on to secure a high grade pass (defined as grades A*-B) typical for the subject in question and the prior attainment profile of the students?

The grade based measures are:

Grades per starter (GPS): looking at all the students that started the course, are the grades attained what would be expected of students starting the subject in question with similar levels of prior attainment?
Grades per completer (GPC): looking at the students that completed the course, are the grades attained what would be expected of students completing the subject in question with similar levels of prior attainment?

Grades per achiever (GPA): looking just at those who passed the qualification, are the grades attained what would be expected of students passing the subject in question with similar levels of prior attainment?

Six Dimensions Reports

Core reports for college and subject performance all follow the same basic structure:

1. Student profile
2. Value added analysis
3. Performance by prior attainment band
4. Four year trend analysis

The idea here is to give management and subject teams the opportunity to examine the context in terms of the characteristics of the students a department or college is dealing with, explore multiple dimensions of performance, explore performance across the prior attainment spectrum, and examine performance over time.

Student profile

The student profile data is provided to give a sense of what the profile is in a particular institution, how this is changing, and how it relates to national patterns. In our example we see a consistent picture, with a cohort that is 60% female, with around 15% of students drawn from black and minority ethnic groups and an average GCSE score around

The prior attainment profile is summarised in the graph above. The right hand bar is the national profile of all students taking the qualification concerned. The blue section represents those with an average GCSE score below 5.8; the red section represents those students with an average GCSE score of 6.7 or above. In our example, we can immediately see whether the profile is in line with what is typical nationally. Department heads often comment on how their students relate to those found in the subject nationally; this analysis will allow such discussions to be based on actual evidence.

Value Added Analysis

The value added analysis section is where we present the outcomes for the current year in terms of the various different dimensions of performance.

The percentage based measures are found towards the left hand side of the graph. Performance in the individual college or subject concerned is represented by blue bars. The grade based measures are presented towards the right hand side, and performance in the individual college or subject concerned is represented by red bars. National performance is represented by the zero line. If performance is exactly in line with national performance then the score for the measure in question will be zero. The yellow shaded area represents the middle 50% of sixth form colleges. The upper limit of this yellow zone is the 75th percentile; the lower limit is the 25th percentile. If a bar extends beyond this yellow area, it indicates that performance is in the top or bottom quarter nationally.

Above 75th percentile: scores above the 75th percentile are in the top quarter nationally.
Between 25th and 75th percentile: this is, broadly speaking, normal performance. Scores in this range are in the middle 50% of colleges.
Below 25th percentile: scores below the 25th percentile are in the bottom quarter nationally.

The red lines indicate the limits of one standard deviation around the national line; 68% of scores will fall within this zone. A bar extending above this would be in the top 16% nationally. In our example here, the department is not performing particularly well. We see that attendance is 2% above what would be expected in this particular subject for the profile of students the department is serving. Retention is some way below expectation, and the pass rate is close to the national line. The achievement rate is 2% below what would be expected. We see the bar for retention extends below the yellow zone, indicating that this performance is in the bottom 25% nationally.

It is also vitally important to (literally) get a sense of proportion when interpreting scores. If there were only twenty students on a course, then each student represents 5% of the total. On such a course, a cohort could be 4% below the national rate, but would have been above the national rate if one more student had achieved the qualification. Even with much larger cohorts it is important to get a sense of how many additional passes would have been required to reach the national average, the 75th percentile and so forth.

The grades measures express the difference between college performance and national performance, in terms of grades per student. If all students got a grade higher than would be expected, the score would be 1.0. If all students got a grade lower than would be expected, the score would be -1.0.
Score: equates to (in a class of 20 students)
1.0: all of the students reaching one grade higher than would be expected (all 20 students)
0.75: three quarters of the students reaching one grade higher (15 students)
0.5: half of the students reaching one grade higher (10 students)
0.25: a quarter of the students reaching one grade higher (5 students)
0.1: one in ten students reaching one grade higher (2 students)
0.05: one in twenty students reaching one grade higher (1 student)
0: students get the grades that would be expected
-0.05: one in twenty students attaining one grade lower than would be expected (1 student)
-0.1: one in ten students attaining one grade lower (2 students)
-0.25: a quarter of the students attaining one grade lower (5 students)
-0.5: half of the students attaining one grade lower (10 students)
-0.75: three quarters of the students attaining one grade lower (15 students)
-1.0: on average, all students attaining one grade lower (all 20 students)
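Two small helpers illustrate the points above: translating a grades-per-student score into whole-grade movements within a class, and working out how many additional passes a cohort would have needed to reach a benchmark rate. Both are illustrative sketches, not part of the six dimensions model itself.

```python
import math

def grade_shift(score, class_size=20):
    """Translate a grades-per-student score into whole-grade movements in a class."""
    return round(score * class_size)

def passes_needed(actual_pct, target_pct, starts):
    """Extra passes required to lift an achievement rate to a target rate."""
    return max(0, math.ceil((target_pct - actual_pct) / 100.0 * starts))

print(grade_shift(0.25))              # 5: five students a grade up in a class of 20
print(grade_shift(-0.1))              # -2: two students a grade down
print(passes_needed(86.0, 90.0, 50))  # 2: two more passes close a 4-point gap
```

Running the gap through `passes_needed` keeps the "sense of proportion" concrete: a seemingly large percentage shortfall may amount to one or two students.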

Performance by prior attainment band

In the performance by prior attainment band section we return to the graphical presentation we used when exploring the relationship between prior attainment, subject and student outcomes. The shaded background represents national performance in a subject; the narrow bars represent performance in an individual subject department. The graph divides outcomes into four categories (A*-B grade, C-E grade, failed, left course) and includes the number of students that started the course in each band.

We should be very cautious about over-interpretation if the sample size in a particular band is small. For example, in one band in the example above there are only four students, so each will represent 25% of the total. It is performance in the bands where the majority of students lie that will prove most useful. It does, however, give us a really clear idea of what happens nationally, and if performance (for good or bad) is significantly different to what happens nationally in a number of bands, then we need to know why.

Four year trend analysis

The final presentation of data contrasts raw and value added performance. The value added scores are colour coded. Performance in the bottom quarter uses a blue font, performance in the middle half uses a black font, and performance in the top quarter is represented by a red font.
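The four percentage measures defined earlier in this appendix can be sketched from per-student records. The field names below are assumptions for illustration, not the model's actual data structure.

```python
def percentage_measures(records):
    """Compute retention, pass, achievement and high grades rates (all in %).

    Each record is a dict with boolean fields: completed, passed, high_grade."""
    starts = len(records)
    completed = sum(r["completed"] for r in records)
    passed = sum(r["passed"] for r in records)
    high = sum(r["high_grade"] for r in records)
    pct = lambda part, whole: 100.0 * part / whole if whole else 0.0
    return {
        "retention": pct(completed, starts),   # completers / starters
        "pass": pct(passed, completed),        # passers / completers
        "achievement": pct(passed, starts),    # passers / starters
        "high_grades": pct(high, starts),      # A*-B passes / starters
    }

# Four illustrative students: three complete, two pass, one leaves early.
records = [
    {"completed": True,  "passed": True,  "high_grade": True},
    {"completed": True,  "passed": True,  "high_grade": False},
    {"completed": True,  "passed": False, "high_grade": False},
    {"completed": False, "passed": False, "high_grade": False},
]
m = percentage_measures(records)
print(m["retention"], m["achievement"])  # 75.0 50.0
```

The distinction between denominators is the crux: pass rates divide by completers, while retention, achievement and high grades all divide by starters.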

Programmes of Study reports: programme completion and achievement

The programmes of study reports look at the performance of students over two years. They take as their starting point students starting a two year programme of study, and examine how many students stayed for the full two years, and how successful they were in gaining qualifications at the end of the two years. The analysis looks at those students who started a programme of three or more A levels, or BTEC equivalents. Some of the students follow a pure BTEC course, usually a three A level equivalent extended diploma; others pursue a mix of A levels and BTEC qualifications; some pursue a pure A level programme. As the model reports on the number of qualifications students achieve, general studies has been removed from the analysis, as it would distort the analysis in favour of the small number of colleges that continue to enter students for general studies. In this analysis students on these varying curriculum pathways are grouped together, as the intention is for colleges to be able to see the proportion of students achieving various things at different points in the prior attainment spectrum, regardless of the route they were enrolled on.

There are two different programmes of study reports: programme completion and programme achievement.

Programme completion reports have three key measures, all based on the number of students who start Year 12:

1. Proportion of students completing Year 12
2. Proportion of students starting Year 13
3. Proportion of students completing Year 13

Programme achievement reports have three key measures, all based on the number of students who start Year 12:

1. Proportion of students achieving two or more A levels or BTEC equivalents
2. Proportion of students achieving three or more A levels or BTEC equivalents
3. Proportion of students achieving four or more A levels or BTEC equivalents

Programme Completion Analysis

This graph (measures: completed Year 12, started Year 13, completed Year 13) summarises the overall performance of an institution, which is represented by a red bar. If the bar extends above the zero line, it indicates that more students were retained than would be expected. If a bar extends below the line, it suggests that fewer students than would have been expected had

been retained. If the red bar is close to or at zero, it suggests that performance is as would be expected for students with that prior attainment profile. The yellow background represents performance in the middle half of colleges. If a bar extends above this yellow zone, it indicates that a cohort is in the top quarter nationally. If a bar extends below the yellow zone, it suggests a college is in the bottom quarter nationally. The graph reports in percentages. In the illustration above, the college performs particularly well in terms of the proportion of students that enter Year 13, despite the fact that the proportion of students completing Year 12 is entirely average. It suggests that far fewer students than would be expected leave between Year 12 and Year 13.

Programme Completion by Prior Attainment

The retention analysis by prior attainment graph explores performance across the prior attainment spectrum. The background shading represents national performance, and the thin lollipop sticks represent performance in the individual institution. Performance is divided into four possible outcomes: leaving during Year 12, leaving between Year 12 and Year 13, leaving during Year 13, and retained.
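The three programme completion measures can be sketched as follows, each expressed as a percentage of Year 12 starters. The field names are illustrative assumptions, not the report's data schema.

```python
def completion_measures(students):
    """Programme completion measures, each as a % of Year 12 starters.

    Each student is a dict of booleans: completed_y12, started_y13, completed_y13."""
    starts = len(students)
    pct = lambda key: 100.0 * sum(s[key] for s in students) / starts
    return {k: pct(k) for k in ("completed_y12", "started_y13", "completed_y13")}

# Four illustrative students: one leaves in Y12, one between years, one in Y13.
students = [
    {"completed_y12": True,  "started_y13": True,  "completed_y13": True},
    {"completed_y12": True,  "started_y13": True,  "completed_y13": False},
    {"completed_y12": True,  "started_y13": False, "completed_y13": False},
    {"completed_y12": False, "started_y13": False, "completed_y13": False},
]
m = completion_measures(students)
print(m["completed_y12"], m["started_y13"], m["completed_y13"])  # 75.0 50.0 25.0
```

Keeping Year 12 starters as the single denominator is what lets the three bars in the completion graph be compared directly.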

Programme achievement analysis

Six dimensions reports have three measures of achievement at student level: achieved 2+ A levels/BTEC equivalents, achieved 3+ A levels/BTEC equivalents, and achieved 4+ A levels/BTEC equivalents. As with all six dimensions analysis, the expected level of performance is adjusted according to the prior attainment profile of the students involved. A score of zero suggests performance exactly in line with national standards. The relationship between the three measures can often prove interesting. Figure 5.4 shows the scores for a college which is getting a perfectly respectable proportion of its students to achieve at least two A levels, is above average for achieving at least three A levels, but is a little off the pace when it comes to the proportion of students achieving four or more A levels.

Figure 5.5: whole student success by prior attainment band (outcomes: achieving 4+, achieving 3, achieving 2, other outcomes)

Equal Opportunities Monitoring Reports

The equal opportunities monitoring reports compare success rates in a particular college with the patterns found nationally. In the reports, national performance is represented by the cream coloured bars, performance in the individual college is represented by black bars. The score in the college is given in red.

Equal opportunities monitoring extract from an example report:

Female: 2591 starts
Male: 1998 starts
White British: 3021 starts
White Irish: 605 starts
White Other: 3514 starts
Asian any other Asian: 80 starts
Asian Bangladeshi: 483 starts
Asian Chinese: 29 starts

It is worth being clear about what the national lines and college bars represent. Both have been adjusted for prior attainment and subject choice. At national level, the line reports on the question: does the equality and diversity group in question perform in line with performance for all students in terms of success rates, once prior attainment and subject choice have been taken into account? In the extract above we see that the national performance for male and female students is so close to the national line that there is no cream bar extending above or below the zero line. There are a few categories where national performance is above what would be expected, and we see a cream bar extending from the zero line: White Irish students and Asian Chinese students, for example. Including these national variations allows us to avoid lazy assumptions about how well groups perform. It is often assumed that black groups in particular underperform in the English educational system. A college may explain away underperformance among black students by reference to the idea that such underperformance is normal. The background national data in the equality and diversity graphs reveals the true patterns of performance. The performance in a particular college is overlaid on the national picture, and represented by the black bars.
If a black bar extends out of the cream bar, it means that performance is more extreme than is found nationally. In the figure above we see that Asian Chinese students have a success rate 7.2% above what would be expected of similarly qualified students doing similar subjects nationally. Perhaps much more significant for this college is the figure for Asian Bangladeshi students, which is well short of what would be expected nationally, and in contrast to Bangladeshi students nationally, who perform in line with expectation once prior attainment and subject are factored in. Note also that this college has a large cohort of Bangladeshi students (483 starts), so we cannot explain away the difference from national performance by saying that the cohort is very small and therefore any variation is statistically meaningless.
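A rough way to test the "cohort is too small" explanation is to compare the gap to expectation with binomial sampling noise. This is a heuristic sketch under standard statistical assumptions, not the method used in the six dimensions reports, and the rates below are illustrative.

```python
import math

def gap_is_within_noise(expected_rate, starts, actual_rate, z=1.96):
    """Is the gap to expectation within ~95% binomial sampling noise?

    Rates are in %, for a cohort of `starts` students."""
    p = expected_rate / 100.0
    se = math.sqrt(p * (1 - p) / starts) * 100.0   # standard error, in % points
    return abs(actual_rate - expected_rate) <= z * se

# A 5-point shortfall in a cohort of 20 could easily be noise...
print(gap_is_within_noise(90.0, 20, 85.0))   # True
# ...but the same shortfall across 483 starts is unlikely to be.
print(gap_is_within_noise(90.0, 483, 85.0))  # False
```

The point mirrors the argument above: with 483 starts, a substantial gap to expectation cannot plausibly be dismissed as random variation.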

The whole college reporting also includes a summary of performance by equality and diversity category across three key measures: retention, success and points per completer. It also reports on programmes of study measures.

Six Dimensions of Performance Report, Example College: E&D Summary Report. Performance by high level equality and diversity category, outcomes at GCE AS level (rows: All, Female, Male, White, All BME; columns for each of retention, achievement and grades per completer: Starts, Act, Exp, Act - Exp, Nat)

This report provides a highly effective way to monitor performance, as the outcomes for each category are placed alongside each other with direct reference to how the group in question performs nationally. Taking retention as an example, the columns are:

Act: the actual rate for the group concerned
Exp: the rate expected by the six dimensions model
Act - Exp: the difference between actual performance and expected performance
Nat: how the group in question performs nationally

The Act column gives, in effect, the raw retention rate for the college in question. The Exp column indicates the level of performance that would be expected of similarly qualified students pursuing the same courses nationally. The Act - Exp column indicates whether performance at the college is above or below national rates. The Nat column is designed to show whether the group in question tends to do well nationally, and provides a useful reference point for the college score.

Performance by qualification by teacher

In the performance by qualification by teacher reports there is a line for each individual teacher for each qualification that they deliver. If they teach at both AS and A2 level, there will be two separate lines, and if they teach on two or more different specifications each will generate its own line. The analysis does not take performance down to individual class level.

The reports contain two sets of measures: raw outcomes and value added outcomes. The raw measures are the outcomes before any adjustment is made for prior attainment. The value added outcomes look at performance once it has been adjusted for prior attainment and subject choice. In the value added measures, performance in line with national expectations for similarly qualified students following the course in question is represented by zero. A score above zero suggests that performance is above national expectation; a score below zero suggests performance below national expectation.

The colour coding used in this report is different from that used elsewhere in the six dimensions analysis. In the measures expressed as percentages (Attendance, Retention, Achievement, Success, and High Grades), a score 5% or more above expectation is coloured red, and a score 5% or more below expectation is coloured blue. In the measures expressed in grades, performance half a grade per entry above expectation (a score of 0.5 or higher) is coloured red, and performance half a grade per entry below expectation is coloured blue.

Nick Allen, 12/06/17

End
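The colour-coding thresholds for the teacher-level reports can be sketched as a small rule. The thresholds (5 percentage points for percentage measures, half a grade for grade measures) follow the description above; the function and data layout are illustrative, not the report's own implementation.

```python
# A minimal sketch of the colour-coding rule for teacher-level scores.
# Note the report's convention: red marks performance ABOVE expectation,
# blue marks performance BELOW expectation.

PERCENT_MEASURES = {"Attendance", "Retention", "Achievement", "Success", "High Grades"}

def colour_code(measure: str, actual: float, expected: float) -> str:
    """Return 'red', 'blue' or 'none' for a score against its expectation.

    Percentage measures use a +/-5 point band; grade measures use +/-0.5.
    """
    threshold = 5.0 if measure in PERCENT_MEASURES else 0.5
    diff = actual - expected
    if diff >= threshold:
        return "red"
    if diff <= -threshold:
        return "blue"
    return "none"

print(colour_code("Retention", 92.0, 85.0))       # 7 points above expectation
print(colour_code("Grades Per Entry", 4.1, 4.4))  # within half a grade
```

Keeping the rule in one place like this makes the asymmetric conventions (red = above, blue = below) easy to audit against the published description.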


The Ohio State University. Colleges of the Arts and Sciences. Bachelor of Science Degree Requirements. The Aim of the Arts and Sciences The Ohio State University Colleges of the Arts and Sciences Bachelor of Science Degree Requirements Spring Quarter 2004 (May 4, 2004) The Aim of the Arts and Sciences Five colleges comprise the Colleges

More information

Julia Smith. Effective Classroom Approaches to.

Julia Smith. Effective Classroom Approaches to. Julia Smith @tessmaths Effective Classroom Approaches to GCSE Maths resits julia.smith@writtle.ac.uk Agenda The context of GCSE resit in a post-16 setting An overview of the new GCSE Key features of a

More information

Summary results (year 1-3)

Summary results (year 1-3) Summary results (year 1-3) Evaluation and accountability are key issues in ensuring quality provision for all (Eurydice, 2004). In Europe, the dominant arrangement for educational accountability is school

More information

WOODBRIDGE HIGH SCHOOL

WOODBRIDGE HIGH SCHOOL WOODBRIDGE HIGH SCHOOL EXAM POLICY 2017-2018 The 11-19 Exam Policy The purpose of this exam policy is: to ensure the planning and management of exams is conducted efficiently and in the best interest of

More information

Interpreting ACER Test Results

Interpreting ACER Test Results Interpreting ACER Test Results This document briefly explains the different reports provided by the online ACER Progressive Achievement Tests (PAT). More detailed information can be found in the relevant

More information

GCSE. Mathematics A. Mark Scheme for January General Certificate of Secondary Education Unit A503/01: Mathematics C (Foundation Tier)

GCSE. Mathematics A. Mark Scheme for January General Certificate of Secondary Education Unit A503/01: Mathematics C (Foundation Tier) GCSE Mathematics A General Certificate of Secondary Education Unit A503/0: Mathematics C (Foundation Tier) Mark Scheme for January 203 Oxford Cambridge and RSA Examinations OCR (Oxford Cambridge and RSA)

More information