Beware the harmful algorithm

These past few days British media outlets have been full of stories about the scandalous way 17- and 18-year-olds have been dealt with following the cancellation of ‘A’ Level exams because of the virus. At the root of the problem was an algorithm. Or rather, it was probably not the algorithm itself that was the problem, but how it was applied.

‘A’ Level results essentially determine which University, or other educational or training opportunity, you end up with as your life moves to another level at the end of your time at school. It is usually a pivotal moment in a young person’s life. Not necessarily decisive, but hugely important.

Teachers’ predictions

For readers outside the UK: this year every young person’s teachers were asked to predict the results they would have obtained had they sat ‘A’ Levels. Teachers make such predictions every year, so it is possible to compare predictions with actual outcomes. A great many children perform exactly in line with predictions. A great many do not. That gap is important because it is populated by people, not robots.

In the usual way, University places and the like were offered, or not, on the basis of teachers’ predictions.

Devising an algorithm 

In the absence of actual exams, how were the authorities to determine the final results? The answer they all came up with was to devise an algorithm and apply it to the mass of data showing the predicted grades.

The process of devising the algorithm appears to have started by looking at historic data for the subjects concerned, and the type of school concerned, for which read the demography of its intake.

These data would show that in school A, in a certain kind of area, x% of students could be expected to receive top-level grades in Maths, y% would get the lowest grades in History, and so on, subject by subject, school by school.

Class size matters

One of the other factors in the equation was class size, which is typically a proxy for parental income. Children in smaller classes tend to do better than children in larger classes. Who knew? “Smaller classes”  is often just another way of saying “private school”  or a school in a prosperous part of town. Which tends to get us back to parental income.

According to this way of looking at the matter, year after year a predictable proportion of young people will get a certain spread of grades and therefore end up going to a certain spread of Universities or whatever.

But here’s the kicker. Teachers’ predictions for each child were the very last factor to be entered into the calculations. And it looks like they counted for a lot less than the “framework” established by the historic data.
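The mechanism described above can be sketched in miniature. What follows is a hypothetical illustration, not the actual model used by the exam authorities: the function, names and numbers are all invented. It shows how, once a school's historic grade distribution fixes the quotas, teacher input matters only as a ranking of students within the school. A teacher could predict an A for every pupil and the quotas would still hand most of them lower grades.

```python
# Hypothetical sketch of quota-based grade allocation (invented, for illustration only).
# Teacher input is used solely to rank students; the historic distribution
# decides how many of each grade the school is "allowed".

def allocate_grades(students, historic_distribution):
    """students: list of (name, teacher_rank), rank 1 = strongest.
    historic_distribution: dict of grade -> fraction of the cohort that
    historically achieved it, best grade first, e.g. {"A": 0.2, "B": 0.4, ...}."""
    ordered = sorted(students, key=lambda s: s[1])  # best-ranked first
    n = len(ordered)
    results = {}
    i = 0
    # Walk down the grade ladder, filling each grade's historic quota.
    for grade, share in historic_distribution.items():
        quota = round(share * n)
        for name, _rank in ordered[i:i + quota]:
            results[name] = grade
        i += quota
    # Anyone left over after rounding gets the lowest grade.
    lowest = list(historic_distribution)[-1]
    for name, _rank in ordered[i:]:
        results[name] = lowest
    return results

cohort = [("Amira", 1), ("Ben", 2), ("Cara", 3), ("Dev", 4), ("Ella", 5)]
history = {"A": 0.2, "B": 0.4, "C": 0.4}
print(allocate_grades(cohort, history))
# → {'Amira': 'A', 'Ben': 'B', 'Cara': 'B', 'Dev': 'C', 'Ella': 'C'}
```

Note what is absent: the predicted grades themselves. Even if every student in this toy cohort had been predicted an A, only one of the five could receive it, because the school's past results, not the individuals in front of the teachers, set the ceiling.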

“A process” became a cruel farce.

Marked down

The process has had terrible consequences for huge numbers of children. Why? Because in making the end result fit the pattern of previous years,  large numbers of youngsters were marked down from their teachers’ predictions.

In defence of the system, some officials tried to argue that the large-scale marking down was justified because teachers from certain types of schools, seemingly with equally certain mathematical predictability, are more prone than teachers from other schools to overestimate a child’s performance.

Dazzled by the maths. Lost sight of the young human being

If you happened to be in an improving school about to register its breakthrough moment, well, that’s just bad luck according to the faceless ones who gave their blessing to all this. In allowing themselves to be dazzled by the logic of the maths, they lost sight of the humanity required when handling young people’s dreams.

I listened to two distinguished statisticians explain that “Algorithms work extremely well for populations, but not necessarily for individuals.” They said this quite dispassionately and not in any way to justify what happened this year.

A system that only works for populations and allows substantial injustice to be suffered by an individual is a system not worthy of the name. Young people can improve their individual performance between mocks and finals. Poor mock results are often the spur to pull your finger out. That’s not something a blunt instrument like an algorithm can detect. With young people’s lives we need precision lasers, not hammers.
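The statisticians’ point can be made concrete with a small invented example: an allocation can be flawless “for the population”, in that the awarded grade distribution matches the true one exactly, while still being wrong for a large share of individual students.

```python
# Invented illustration of "works for populations, not individuals".
from collections import Counter

true_grades    = ["A", "B", "B", "C", "C", "C"]  # what each student would really have got
awarded_grades = ["A", "C", "B", "B", "C", "C"]  # what a model handed out

# Population view: the two distributions are identical.
assert Counter(true_grades) == Counter(awarded_grades)

# Individual view: a third of the students got the wrong grade.
wrong = sum(t != a for t, a in zip(true_grades, awarded_grades))
print(f"{wrong} of {len(true_grades)} students misgraded")  # → 2 of 6 students misgraded
```

Every aggregate statistic here is perfect; it is only when you look student by student that the injustice appears.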

A bureaucrat cannot be allowed, with a shrug of the shoulders, to sweep aside and crush a young person. I am tempted to invoke historical examples of other forms of brutal indifference to the individual in the interests of “the plan”, but the point is obvious to anyone with a brain and a heart.

Maths first. People nowhere. Not acceptable.

Worcester College, Oxford shows the way

Starting with Worcester College, Oxford, some Universities have declared they will accept the teachers’ original predictions and ignore any marking down. I applaud them. How easy it will be for others to follow suit I don’t know, but they should all certainly make the effort, and stand ready to explain why if they don’t.

GDPR no help but…

Article 22 of the GDPR says individuals shall have the right “not to be subject to a decision based solely on automated processing, including profiling…”

That is pretty close to what has happened here, but because it is not exactly the same, Article 22 is of no use: the process was not based “solely” on automated processing.

Yet the spirit of Article 22 is clear. There is an appeals process, but it costs money, and it may not be possible to complete the vast number of appeals now anticipated before University and other terms begin.

With the inevitable drop in students from overseas (maybe even all Chinese students), why can’t everyone do what Worcester College did? In the next two to three years perhaps there will be a higher drop-out rate, but at least each student would know this was down to them, not an invisible, unaccountable hand in Whitehall, Holyrood, Belfast or Cardiff. And set against that, there will be some students who get into University or another place who otherwise might not have done. And they will shine.

We should not have to choose between injustices

One last word: I have heard people complain that children from private schools and more affluent parts of town fared less badly than children from elsewhere. That is the inevitable result of how algorithms work. If there is bias in a system, it will be reflected in the data and reproduced by any algorithm built on that data.

This is not a reason to punish the children of well-off families. The individual is what matters here, not their parentage. Insisting that little Billy Rich or Jenny Affluent does not get the University place he or she dreamed of because Frankie Poor and Lucy Skint from down the road didn’t get theirs is the wrong answer to the wrong question.

There are lessons here for all of us, as algorithms seem set to play an ever more important role in the way all kinds of things work, particularly over the internet.

About John Carr

John Carr is one of the world's leading authorities on children's and young people's use of digital technologies. He is Senior Technical Adviser to Bangkok-based global NGO ECPAT International, Technical Adviser to the European NGO Alliance for Child Safety Online, which is administered by Save the Children Italy and an Advisory Council Member of Beyond Borders (Canada). Amongst other things John is or has been an Adviser to the United Nations, ITU, the European Union, the Council of Europe and European Union Agency for Network and Information Security and is a former Board Member of the UK Council for Child Internet Safety. He is Secretary of the UK's Children's Charities' Coalition on Internet Safety. John has advised many of the world's largest internet companies on online child safety. In June, 2012, John was appointed a Visiting Senior Fellow at the London School of Economics and Political Science. More: http://johncarrcv.blogspot.com