London Daily

Focus on the big picture.
Sunday, Mar 01, 2026

Algorithms can drive inequality. Just look at Britain's school exam chaos

Philip blames an algorithm for potentially losing his place to study law at university.

The 18-year-old, whose full name CNN is not disclosing because he feared repercussions from universities, was among more than 300,000 pupils in England, Wales and Northern Ireland who woke on August 13 to critically important A-level exam results, which are broadly equivalent to the US high school diploma.

These exams were canceled this summer due to the pandemic. Student marks were instead determined by an algorithm, the Direct Centre Performance Model, which was chosen by the government's exam regulator. The model drew on a collection of data to produce the grades. A subsequent outcry over alleged algorithmic bias against pupils from more disadvantaged backgrounds has now left teenagers and experts alike calling for greater scrutiny of such technology.

The teachers at Philip's west London school predicted he would gain 2 A grades and a B in his exams, which would have comfortably secured his spot to study law at Exeter University.


Students demonstrate outside the Department for Education in central London on August 14.


On August 13, the student sat at home trying to access the website that would confirm whether or not he had a university place.

"I was upstairs trying to get [the website] to load and my Mum was downstairs doing the same thing," he told CNN. "She got it open and shouted out. And they'd declined me.

"I didn't feel too good," Philip added. "Yeah, I was pretty cross about it. But everyone I was with was in a similar situation."

The model awarded Philip a B grade and 2 Cs. The teenager was not alone; close to 40% of grades in England were downgraded from teacher-predicted marks, with pupils at state-funded schools hit harder by the system than their private school peers. Many subsequently lost their place at university.

Uproar followed, with some teenagers protesting outside the UK Department for Education. Videos from the student protests were widely shared online, including those in which teenagers chanted: "F**k the algorithm!"

Following several days of negative headlines, Education Secretary Gavin Williamson announced that students would be awarded teacher-predicted grades, instead of marks allocated by the model.

The chosen algorithm was meant to guarantee fairness by ensuring that the grade distribution for the 2020 cohort followed the pattern of previous years, with a similar number of high and low marks. It drew on teacher-predicted grades and teachers' rankings of their students to determine grades. But crucially, it also took into account the historical performance of schools, which benefited students from more affluent backgrounds.

Private schools in England, which charge parents fees, typically have smaller classes, whose grades could not easily be standardized by the model. The algorithm therefore gave more weight to teacher-predicted grades for these cohorts, which are often wealthier and whiter than state school cohorts, whose grades were more likely to be downgraded.
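The mechanism described above can be sketched in a few lines of code. The following is a deliberately simplified, hypothetical illustration of distribution-matching standardization, not the actual Direct Centre Performance Model (whose real inputs and rules were far more involved); the function name, grade shares, and pupil labels are invented for the example. It shows how, once a school's historical grade distribution caps this year's results, a strong pupil at a school with no history of top grades cannot receive one.

```python
# Hypothetical sketch of distribution-matching standardization.
# NOT the real Direct Centre Performance Model -- just the core idea:
# pupils ranked by their teachers are fitted to the school's
# historical grade distribution.

from typing import Dict, List

GRADES = ["A*", "A", "B", "C", "D", "E", "U"]  # best to worst

def standardize(ranked_pupils: List[str],
                historical_share: List[float]) -> Dict[str, str]:
    """Assign grades to pupils (ordered best-first by teacher ranking)
    so the cohort matches the school's historical grade distribution.

    historical_share[i] is the fraction of past pupils at this school
    who received GRADES[i].
    """
    n = len(ranked_pupils)
    results: Dict[str, str] = {}
    idx = 0
    for grade, share in zip(GRADES, historical_share):
        quota = round(share * n)  # seats available at this grade
        for pupil in ranked_pupils[idx:idx + quota]:
            results[pupil] = grade
        idx += quota
    for pupil in ranked_pupils[idx:]:  # rounding leftovers
        results[pupil] = GRADES[-1]
    return results

# A school that has never produced an A* caps this year's cohort,
# regardless of individual teacher predictions.
cohort = ["P1", "P2", "P3", "P4", "P5", "P6", "P7", "P8", "P9", "P10"]
history = [0.0, 0.1, 0.3, 0.4, 0.2, 0.0, 0.0]  # no A* ever awarded
print(standardize(cohort, history))
```

Even if a teacher had predicted an A* for the top-ranked pupil, the historical shares make it impossible here: the best available outcome is an A.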

"One of the complexities that we have is that there are lots of ways an algorithm can be fair," said Helena Webb, senior researcher at Oxford University's Department of Computer Science.

"You can see an argument where [the government] said [it] wanted to get results that look similar to last year's. And at a country-wide level, that could be argued as [being] fair. But it completely misses what was fair for individuals.

"Obviously this algorithm is reflecting and mirroring what has happened in previous years," she added. "So it doesn't [reflect] the fact that schools might [improve]. And of course that's going to have worse effects on state schools than on very well known private schools which have consistently higher grades."

"What's made me angry is the way [they] treated state schools," said Josh Wicks, 18, a pupil from Chippenham in Wiltshire, western England. His marks were downgraded from 2 A* and an A to 3 As.

"The algorithm thought that if the school hadn't achieved [high grades] before, [pupils] couldn't get them now," he told CNN. "I just think it's patronizing."

The political storm has left ministers in Boris Johnson's government scrambling for explanations, following heavy criticism of its handling of the coronavirus pandemic. Covid-19 has killed more than 41,000 people in the UK, making it the worst-hit country in Europe.

Why are some algorithms accused of bias?


Algorithms are used across every part of society today, from social media and visa application systems, to facial recognition technology and exam grading.

The technology can be liberating for cash-strapped governments and for corporations chasing innovation. But experts have long warned of the existence of algorithmic bias, and as automated processes become more widespread, so do accusations of discrimination.

"The A-levels thing is the tip of the iceberg," said Cori Crider, co-founder of Foxglove, an organization that challenges the alleged abuse of digital technology. Crider told CNN that the algorithms replicated the biases found in the raw data used.


Students hold placards as they protest outside the Department for Education in central London on August 14.


But Crider warned against the impulse to simply blame policy issues on the technology.

"Anybody who tells you it's a tech problem is [lying]," she said.

"What happened [with the exams] is that a political choice was made to minimize grade inflation. That's a political choice, not a tech one."

Foxglove and the Joint Council for the Welfare of Immigrants recently challenged the British Home Office over its use of an algorithm designed to stream visa applications. The activist groups alleged that the algorithm was biased against applicants from certain countries, making it automatically more likely that such applicants would be denied a visa.

Foxglove alleged that the screening system suffered from a feedback loop, "where past bias and discrimination, fed into a computer program, reinforce future bias and discrimination."

"We have been reviewing how the visa application streaming tool operates and will be redesigning our processes to make them even more streamlined and secure," a UK Home Office spokesperson told CNN.

"But we do not accept the allegations Joint Council for the Welfare of Immigrants made in their Judicial Review claim and whilst litigation is still ongoing it would not be appropriate for the department to comment any further."

Crider said the problems Foxglove found with past data leading to biased algorithms were evident elsewhere, pointing to the debate over predictive policing programs in the United States.

In June, the Californian city of Santa Cruz banned predictive policing over concerns that the analytic software program officers used in their work was discriminating against people of color.

"We have technology that could target people of color in our community -- it's technology that we don't need," Mayor Justin Cummings told Reuters news agency in June.

"Part of the problem is the data being fed in," Crider said.

"Historical data is being fed in [to algorithms] and they are replicating the [existing] bias."
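The feedback loop Crider and Foxglove describe can be made concrete with a toy simulation. Everything below is an invented illustration (district names, numbers, and the allocation rule are assumptions, not a model of any real policing or visa system): patrols are allocated in proportion to previously recorded incidents, and incidents are only recorded where officers patrol. Even when the true incident rates of two districts are identical, an initial skew in the historical records never washes out.

```python
# Toy illustration of a data feedback loop (invented numbers, not any
# real system): resources follow past *recorded* incidents, and new
# incidents are only recorded where resources go. With identical true
# rates in both districts, the historical skew is self-perpetuating.

def simulate(recorded, rounds=20, patrols_per_round=100, true_rate=0.1):
    """recorded: dict mapping district -> historical recorded incidents."""
    for _ in range(rounds):
        total = sum(recorded.values())
        for district, count in list(recorded.items()):
            # Patrols allocated proportionally to past records.
            patrols = patrols_per_round * count / total
            # True rates are equal, so new records track patrols,
            # not underlying behavior.
            recorded[district] = count + patrols * true_rate
    return recorded

records = simulate({"district_a": 30.0, "district_b": 70.0})
share_b = records["district_b"] / sum(records.values())
print(round(share_b, 2))  # the initial 70/30 skew persists: 0.7
```

The point of the sketch is that the system never generates the evidence needed to correct its own starting assumption, which is the pattern critics say appeared in both the visa streaming tool and predictive policing software.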

Webb agrees. "A lot of [the issue] is about the data that the algorithm learns from," she said. "For example, a lot of facial recognition technology has come out ... the problem is, a lot of [those] systems were trained on a lot of white, male faces.

"So when the software comes to be used it's very good at recognizing white men, but not so good at recognizing women and people of color. And that comes from the data and the way the data was put into the algorithm."

Webb added that she believed the problems could partly be mitigated through "a greater attention to inclusivity in datasets" and a push to add a greater "multiplicity of voices" around the development of algorithms.

Increased regulation?


Activists and experts told CNN they hoped recent debates around algorithms would lead to greater oversight of the technology.

"There's a lack of regulatory oversight over how these systems are used," Webb said, adding that companies could also choose to self-regulate.

Some companies are becoming notably more vocal on the issue.

"Some technologies risk repeating the patterns developed by our biased societies," Instagram CEO Adam Mosseri wrote in a statement in June on the company's diversity efforts. "While we do a lot of work to help prevent subconscious bias in our products, we need to take a harder look at the underlying systems we've built, and where we need to do more to keep bias out of these decisions."

Facebook, which owns Instagram, subsequently created new teams to review bias in company systems.

"I would like to see democratic pushback on [the use of algorithms]," Crider said. "Are there areas in public life where it's not acceptable to have these systems at all?"

While the debate continues in boardrooms and academia, these automated systems continue to determine people's lives in numerous and subtle ways.

For Philip, the UK government's scrapping of the exams algorithm has left him in limbo.

"We emailed Exeter [University] and phoned and they're in a kind of mess," he said, adding that he was hopeful he could win his place back. "I think I'll just defer now anyway."

He said he was grateful to be given his predicted grades but said the experience had gone "pretty badly."

"[The government] had months to sort this out," he said. "I get that there's a lot of things going on with the health stuff but [...] it's a pretty poor showing."
