WHY ALGORITHMS AREN’T MAKING THE GRADE
When the UK government decided to cancel school exams due to the coronavirus pandemic, it gave the exams regulator, Ofqual, a challenge: allocate grades to students anyway, and make sure the grades given out that year were equivalent in standard to previous years. Ofqual’s solution was to create an algorithm – a computer program designed to predict what grades the students would have received if they had taken exams.
Unfortunately, when the computer-generated grades were issued, 40 per cent of A-Level students got lower grades than their teachers had predicted – some of them several grades lower. Promised university places were withdrawn. Lawyers offered to take legal action against Ofqual. Angry teenagers took to the streets with placards saying ‘F**k the algorithm’.
Worse, students at large state schools seemed to have lost out more than those at private schools or those studying less popular subjects such as classics and law. Faced with such glaring unfairness, the education minister announced that teacher predictions would be accepted after all – but not before universities had filled places on their courses, leaving everyone scrambling to sort out revised offers and over-subscribed courses.
Prime minister Boris Johnson called it a “mutant algorithm”, as if the computer program had somehow evolved evil powers to wreck teenagers’ futures.