London Daily

Focus on the big picture.
Thursday, Jan 29, 2026

Algorithms can drive inequality. Just look at Britain's school exam chaos

Philip blames an algorithm for potentially losing his place to study law at university.

The 18-year-old, whose full name CNN is not disclosing because he feared repercussions from universities, was among more than 300,000 pupils in England, Wales and Northern Ireland who woke on August 13 to critically important A-level exam results. A-levels are broadly equivalent to the US high school diploma.

These exams were canceled this summer due to the pandemic. Student marks were instead determined by an algorithm, the Direct Centre Performance Model, which was chosen by the government's exam regulator. The model drew on a collection of data to produce the grades. A subsequent outcry over alleged algorithmic bias against pupils from more disadvantaged backgrounds has now left teenagers and experts alike calling for greater scrutiny of such technology.

The teachers at Philip's west London school predicted he would gain 2 A grades and a B in his exams, which would have comfortably secured his spot to study law at Exeter University.


Students demonstrate outside the Department for Education in central London on August 14.


On August 13, the student sat at home trying to access the website that would confirm whether or not he had a university place.

"I was upstairs trying to get [the website] to load and my Mum was downstairs doing the same thing," he told CNN. "She got it open and shouted out. And they'd declined me.

"I didn't feel too good," Philip added. "Yeah, I was pretty cross about it. But everyone I was with was in a similar situation."

The model awarded Philip a B grade and 2 Cs. The teenager was not alone; close to 40% of grades in England were downgraded from teacher-predicted marks, with pupils at state-funded schools hit harder by the system than their private school peers. Many subsequently lost their place at university.

Uproar followed, with some teenagers protesting outside the UK Department for Education. Videos from the student protests were widely shared online, including those in which teenagers chanted: "F**k the algorithm!"

Following several days of negative headlines, Education Secretary Gavin Williamson announced that students would be awarded teacher-predicted grades, instead of marks allocated by the model.

The chosen algorithm was meant to guarantee fairness by ensuring the grade distribution for the 2020 cohort followed the pattern of previous years, with a similar number of high and low marks. It drew on teacher-predicted grades and teacher rankings of students to determine grades. But crucially, it also took into account the historical performance of schools, which benefited students from more affluent backgrounds.

Private schools in England, which charge parents fees, typically have smaller classes, with grades that could not easily be standardized by the model. The algorithm thus gave more weight to the teacher-predicted grades for these cohorts, which are often wealthier and whiter than their downgraded peers at state schools.
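To see why standardizing against a school's past results hurts high-performing pupils at historically weaker schools, here is a deliberately simplified sketch of the idea described above. It is not Ofqual's actual Direct Centre Performance Model; the small-cohort threshold, the grade lists and the mapping rule are all hypothetical, chosen only to illustrate the mechanism.

```python
# Illustrative sketch only -- NOT the actual Direct Centre Performance Model.
# Assumption (hypothetical): small cohorts keep teacher-predicted grades;
# larger cohorts are mapped onto the school's historical grade distribution.

SMALL_COHORT = 5  # hypothetical threshold

def award_grades(ranked_predictions, historical_distribution):
    """ranked_predictions: teacher-predicted grades, best student first.
    historical_distribution: grades the school awarded in past years,
    sorted best to worst, e.g. ["A*", "A", "B", "B", "C"]."""
    n = len(ranked_predictions)
    if n <= SMALL_COHORT:
        # Small classes (common at private schools): trust the
        # teacher predictions directly.
        return list(ranked_predictions)
    # Larger classes: the i-th ranked student receives the grade at the
    # same relative position in the school's historical distribution.
    m = len(historical_distribution)
    return [historical_distribution[i * m // n] for i in range(n)]

# A school that historically awarded mostly Bs and Cs caps its current
# students at those grades, regardless of what teachers predicted.
print(award_grades(
    ["A", "A", "A", "B", "B", "B", "B", "B"],   # teacher predictions
    ["B", "B", "B", "C", "C", "C", "C", "C"],   # school's past results
))  # -> ['B', 'B', 'B', 'C', 'C', 'C', 'C', 'C']
```

Under these toy rules, a two-student class keeps its predicted grades untouched, while an eight-student class is forced into last year's shape of results, which is exactly the asymmetry between small private-school cohorts and larger state-school cohorts described above.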

"One of the complexities that we have is that there are lots of ways an algorithm can be fair," said Helena Webb, senior researcher at Oxford University's Department of Computer Science.

"You can see an argument where [the government] said [it] wanted to get results that look similar to last year's. And at a country-wide level, that could be argued as [being] fair. But it completely misses what was fair for individuals.

"Obviously this algorithm is reflecting and mirroring what has happened in previous years," she added. "So it doesn't [reflect] the fact that schools might [improve.] And of course that's going to have worse effects on state schools than on very well known private schools which have consistently higher grades."

"What's made me angry is the way [they] treated state schools," said Josh Wicks, 18, a pupil from Chippenham in Wiltshire, western England. His marks were downgraded from 2 A* and an A to 3 As.

"The algorithm thought that if the school hadn't achieved [high grades] before, [pupils] couldn't get them now," he told CNN. "I just think it's patronizing."

The political storm has left ministers in Boris Johnson's government scrambling for explanations, following heavy criticism of its handling of the coronavirus pandemic. Covid-19 has killed more than 41,000 people in the UK, making it the worst-hit country in Europe.

Why are some algorithms accused of bias?


Algorithms are used across every part of society today, from social media and visa application systems, to facial recognition technology and exam grading.

The technology can be liberating for cash-strapped governments and for corporations chasing innovation. But experts have long warned of the existence of algorithmic bias, and as automated processes become more widespread, so do accusations of discrimination.

"The A-levels thing is the tip of the iceberg," said Cori Crider, co-founder of Foxglove, an organization that challenges the alleged abuse of digital technology. Crider told CNN that the algorithms replicated the biases found in the raw data used.


Students hold placards as they protest outside the Department for Education in central London on August 14.


But Crider warned against the impulse to simply blame policy issues on the technology.

"Anybody who tells you it's a tech problem is [lying]," she said.

"What happened [with the exams] is that a political choice was made to minimize grade inflation. That's a political choice, not a tech one."

Foxglove and the Joint Council for the Welfare of Immigrants recently challenged the British Home Office over its use of an algorithm designed to stream visa applications. The activist groups alleged that the algorithm was biased against applicants from certain countries, making it automatically more likely that such applicants would be denied a visa.

Foxglove alleged that the screening system suffered from a feedback loop, "where past bias and discrimination, fed into a computer program, reinforce future bias and discrimination."

"We have been reviewing how the visa application streaming tool operates and will be redesigning our processes to make them even more streamlined and secure," a UK Home Office spokesperson told CNN.

"But we do not accept the allegations Joint Council for the Welfare of Immigrants made in their Judicial Review claim and whilst litigation is still ongoing it would not be appropriate for the department to comment any further."

Crider said the problems Foxglove found with past data leading to biased algorithms were evident elsewhere, pointing to the debate over predictive policing programs in the United States.

In June, the Californian city of Santa Cruz banned predictive policing over concerns that the analytic software program officers used in their work was discriminating against people of color.

"We have technology that could target people of color in our community -- it's technology that we don't need," Mayor Justin Cummings told Reuters news agency in June.

"Part of the problem is the data being fed in," Crider said.

"Historical data is being fed in [to algorithms] and they are replicating the [existing] bias."
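The feedback loop Foxglove and Crider describe can be sketched in a few lines. Everything here is hypothetical, the seeded refusal history, the decision rule and the numbers, and it bears no relation to the real visa-streaming tool; it only shows how a system that learns from its own past decisions keeps an initial bias alive.

```python
# Purely illustrative feedback loop -- hypothetical numbers and rules,
# not the Home Office visa-streaming system.
import random

random.seed(0)  # make the simulation reproducible

def refusal_rate(history, nationality):
    """Fraction of past applications from this nationality refused."""
    outcomes = [refused for nat, refused in history if nat == nationality]
    return sum(outcomes) / len(outcomes) if outcomes else 0.0

def decide(history, nationality):
    """Refuse with probability tied to the historical refusal rate,
    so past bias raises the chance of future refusals."""
    return random.random() < 0.1 + 0.8 * refusal_rate(history, nationality)

# Seed the history with a biased pattern against nationality "X":
# 8 of 10 past "X" applications refused, none of the "Y" applications.
history = [("X", True)] * 8 + [("X", False)] * 2 + [("Y", False)] * 10

# Each new decision is recorded and feeds all later decisions.
for _ in range(200):
    for nat in ("X", "Y"):
        history.append((nat, decide(history, nat)))

print(refusal_rate(history, "X"))  # stays far above the rate for "Y"
print(refusal_rate(history, "Y"))
```

Because each refusal is written back into the history the next decision reads from, the gap between "X" and "Y" persists long after the original seed data: the program has no way to distinguish a biased past from a meaningful signal.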

Webb agrees. "A lot of [the issue] is about the data that the algorithm learns from," she said. "For example, a lot of facial recognition technology has come out ... the problem is, a lot of [those] systems were trained on a lot of white, male faces.

"So when the software comes to be used it's very good at recognizing white men, but not so good at recognizing women and people of color. And that comes from the data and the way the data was put into the algorithm."

Webb added that she believed the problems could partly be mitigated through "a greater attention to inclusivity in datasets" and a push to add a greater "multiplicity of voices" around the development of algorithms.

Increased regulation?


Activists and experts told CNN they hoped recent debates around algorithms would lead to greater oversight of the technology.

"There's a lack of regulatory oversight over how these systems are used," Webb said, adding that companies could also choose to self-regulate.

Some companies are becoming notably more vocal on the issue.

"Some technologies risk repeating the patterns developed by our biased societies," Instagram CEO Adam Mosseri wrote in a statement in June on the company's diversity efforts. "While we do a lot of work to help prevent subconscious bias in our products, we need to take a harder look at the underlying systems we've built, and where we need to do more to keep bias out of these decisions."

Facebook, which owns Instagram, subsequently created new teams to review bias in company systems.

"I would like to see democratic pushback on [the use of algorithms]," Crider said. "Are there areas in public life where it's not acceptable to have these systems at all?"

While the debate continues in boardrooms and academia, these automated systems continue to determine people's lives in numerous and subtle ways.

For Philip, the UK government's scrapping of the exams algorithm has left him in limbo.

"We emailed Exeter [University] and phoned and they're in a kind of mess," he said, adding that he was hopeful he could win his place back. "I think I'll just defer now anyway."

He said he was grateful to be given his predicted grades but said the experience had gone "pretty badly."

"[The government] had months to sort this out," he said. "I get that there's a lot of things going on with the health stuff but [...] it's a pretty poor showing."
