Education Secretary Gavin Williamson is under pressure to resign because of the A level results fiasco

As pressure mounts on universities, including Cambridge, to be more flexible with their offers and follow the example of Worcester College, Oxford, by accepting all state school offer-holders regardless of their grades, Varsity explains the controversy over this year's A level results.

Why are A levels this year different and how are grades being awarded? 

Due to the Covid-19 pandemic, school students didn’t sit any exams this year and so the government was forced to devise a new grading system. This system took a student’s predicted grades before moderating them according to a “standardisation model”.

To award students grades, teachers submitted predicted grades, alongside rankings of students, to exam boards. These grades and rankings were then moderated using each school's grade distribution from 2017 to 2019 and each pupil's past examination performance to produce a student's final grade. 

In practice, this means if a student was ranked in the middle of their school, their grade will be in part determined by placing them in the middle of their school’s past grade distribution. A pupil’s ranking will therefore be contextualised in terms of their past performance and where they featured in their year group. This technical wrinkle supposedly helps to make the system fairer. 
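The rank-to-distribution mechanism described above can be sketched in a few lines of code. This is a deliberately simplified illustration, not Ofqual's actual model: the grade labels, the example distribution and the percentile mapping are all invented for the sketch.

```python
# Simplified sketch of the rank-to-distribution idea: a student's rank
# within their cohort is mapped onto the school's past grade distribution.
# NOT Ofqual's actual model - the numbers below are illustrative only.

def grade_from_rank(rank, cohort_size, historical_distribution):
    """Map a student's rank (1 = top of the cohort) onto the school's
    past grade distribution, given as (grade, share_of_cohort) pairs
    ordered from best grade to worst."""
    # Percentile position of this student within the cohort.
    percentile = (rank - 0.5) / cohort_size
    cumulative = 0.0
    for grade, share in historical_distribution:
        cumulative += share
        if percentile <= cumulative:
            return grade
    return historical_distribution[-1][0]

# Hypothetical school whose 2017-19 results were 10% A*, 20% A,
# 40% B, 20% C, 10% D.
past = [("A*", 0.10), ("A", 0.20), ("B", 0.40), ("C", 0.20), ("D", 0.10)]

# A student ranked 10th of 20 sits in the middle of the cohort, so they
# land in the middle of the school's past distribution: a B.
print(grade_from_rank(10, 20, past))  # B
```

The sketch makes the controversy concrete: the student's own ability enters only through their rank, while the grade they receive is drawn entirely from what previous cohorts at their school achieved.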

However, analysis of the data reveals that rather than a pupil's past examination performance determining their A level results, it is really their school's past performance which is most influential. It is this aspect of the algorithm that has sparked the most controversy, since there is considerable uncertainty about the fairness of the statistical model.

Why is the algorithm considered unfair? 

Over half of A level results showed no change from a student's predicted grade, and just over 2.2% of students' final results increased on their predicted grades. But that still means two fifths of students' A level grades were lowered from their predictions, and because the algorithm values a school's past performance so highly, the downgradings were largely concentrated in poorly performing schools, which are much more likely to be in disadvantaged areas.

So, the students hardest hit by the system are high achieving students in poorly performing schools. 

Even before results were released, the House of Commons education committee expressed alarm that Ofqual’s standardisation model would unfairly penalise “BAME pupils, pupils with special educational needs and disabilities, children looked after, and free school meal-eligible pupils” by comparing them to their past performance and their school’s grade distribution in previous years. 

In the conclusion of their report, the committee called on Ofqual to adjust upwards the grades of students who had been “systematically disadvantaged by calculated grades”. But Ofqual did not heed these warnings and the algorithm was not altered.

Besides the built-in disadvantage facing students from deprived backgrounds, private schools also gained an in-built advantage from the algorithm, thanks to smaller classes and less common subjects.

Teacher-assessed grades were given more weight in subjects with a smaller number of students - five or fewer. Teachers’ predicted grades were found to be “typically higher” than grades determined by Ofqual’s moderation process. 

This means subjects with smaller entry numbers were more likely to experience an increase in their proportion of top grades. Independent schools are more likely to have subjects with fewer candidates, both as a result of smaller class sizes generally as well as less common subjects, such as Classics, being offered. 

While Ofqual claims its moderation process “contains no bias, either in favour or against” any particular school, independent school pupils’ grades were less likely to be heavily downgraded because of fee-paying schools' smaller cohorts.

What has been the effect of the grading system on results?

The raw figures from this year’s A Level results suggest a continuation in many of the trends that have been apparent over the past few years: a higher proportion of students achieved As or A*s and larger numbers are heading to university.

However, independent schools achieved a greater increase in As and A*s than other types of schools. The proportion of A*s and As achieved by independent school students rose by 4.7 percentage points while the equivalent figure for comprehensive schools was 2 percentage points. For state sixth form colleges it was 0.3 percentage points.

The iniquitous effect of the algorithm can be seen most clearly in the percentage of students from certain backgrounds who had their predicted grades lowered. While 8.3% of students from the least deprived backgrounds had grades lowered, that figure rose to 10.4% for pupils from the most deprived backgrounds.

Regional variations were also pronounced. While all regions recorded a rise in the proportion of As and A*s issued, the increase was smaller in the north-west and north-east. In those regions the increases were 1.8 and 1.9 percentage points respectively, compared to 3.4 in the East Midlands and 2.9 in London.

What is the triple lock system? 

Gavin Williamson, the Education Secretary, announced the ‘triple lock’ on the eve of results day. Under the ‘triple lock’, a student’s final result will be the highest of the grade given on results day, their mock exam results, or an optional exam taken in the autumn.
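The 'triple lock' rule amounts to taking the best of up to three grades. One small sketch, with invented grade labels, shows why even this simple rule needs care: a naive alphabetical comparison would rank an A below a B, so the grades must be ordered explicitly.

```python
# Illustrative sketch of the 'triple lock': a student keeps the best of
# three results. The grade scale below is an assumption for this sketch;
# grades are ranked explicitly from worst to best, because a plain
# alphabetical max() would wrongly place "A*" and "A" below "B".
GRADE_ORDER = ["U", "E", "D", "C", "B", "A", "A*"]

def triple_lock(awarded, mock, autumn_exam=None):
    """Return the best of the awarded grade, the mock grade and,
    if taken, the optional autumn exam grade."""
    candidates = [g for g in (awarded, mock, autumn_exam) if g is not None]
    return max(candidates, key=GRADE_ORDER.index)

print(triple_lock("C", "B"))        # B - mock beats the awarded grade
print(triple_lock("B", "C", "A"))   # A - autumn exam beats both
```

As the critics quoted below point out, the weak link in this scheme is the mock grade, which is not produced to any common standard.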

However, the policy has been roundly criticised not only for creating confusion so close to results day but also for misunderstanding the nature of mock exams and for creating uncertainty for those wanting to take the autumn exam.

Regarding mock exams Geoff Barton, leader of the ASCL head teachers’ union, told the BBC that “the government doesn’t appear to understand how mock exams work. They aren't a set of exams which all conform to the same standards. The clue is in the name 'mock'."

This position was echoed by Larissa Kennedy, president of the National Union of Students, who said “the use of mock exam results risks making a mockery of the whole system." She warned there was a "lack of a standard approach to mock exams" and "they are not taken by all candidates".

Can students appeal their results?

The simple answer is yes and no. While Ofqual has barred individual students from making an appeal on academic grounds, schools can appeal on behalf of their students if they receive “a very different pattern of grades to results in previous years.” 

Schools can also appeal if they believe a mistake has been made or if there is evidence that this year’s grades were lower than expected because previous cohorts do not adequately represent this year’s students. 

The government today (15/08) announced that they will cover the cost of appeals so as not to deter head teachers from making appeals. There had been fears that the cost of an appeal - which could reach up to £150 per grade - would prevent schools taking on some students’ cases. 

The Education Secretary has also instructed the schools minister, Nick Gibb, to create a taskforce to oversee the appeals process. 


However, there is still great uncertainty surrounding the appeals process. For example, it is still unknown how long the appeals process will take, which is particularly damaging for students who are awaiting appealed grades in order to go to university. 

Some even claim the appeals process is a “dishonest” farce. Jo Maugham, director of the Good Law Project, which is preparing a judicial review against Ofqual, characterises the appeals process as aimed at only “truly exceptional circumstances”. In its letter to Ofqual, the Good Law Project observes that "there is no ability to appeal an awarded grade on the merits... on the basis of a disparity between their Centre Assessment Grades and their awarded grades."