London Daily

Focus on the big picture.
Friday, Oct 17, 2025

Algorithms can drive inequality. Just look at Britain's school exam chaos

Philip blames an algorithm for potentially losing his place to study law at university.

The 18-year-old, whose full name CNN is not disclosing because he feared repercussions from universities, was among more than 300,000 pupils in England, Wales and Northern Ireland who woke on August 13 to critically important A-level exam results, which are broadly equivalent to the US high school diploma.

These exams were canceled this summer due to the pandemic. Student marks were instead determined by an algorithm, the Direct Centre Performance Model, which was chosen by the government's exam regulator. The model drew on a collection of data to produce the grades. A subsequent outcry over alleged algorithmic bias against pupils from more disadvantaged backgrounds has now left teenagers and experts alike calling for greater scrutiny of such technology.

The teachers at Philip's west London school predicted he would gain two A grades and a B in his exams, which would have comfortably secured his spot to study law at Exeter University.


Students demonstrate outside the Department for Education in central London on August 14.


On August 13, the student sat at home trying to access the website that would confirm whether or not he had a university place.

"I was upstairs trying to get [the website] to load and my Mum was downstairs doing the same thing," he told CNN. "She got it open and shouted out. And they'd declined me.

"I didn't feel too good," Philip added. "Yeah, I was pretty cross about it. But everyone I was with was in a similar situation."

The model awarded Philip a B and two Cs. The teenager was not alone; close to 40% of grades in England were downgraded from teacher-predicted marks, with pupils at state-funded schools hit harder than their private school peers. Many subsequently lost their places at university.

Uproar followed, with some teenagers protesting outside the UK Department for Education. Videos from the student protests were widely shared online, including those in which teenagers chanted: "F**k the algorithm!"

Following several days of negative headlines, Education Secretary Gavin Williamson announced that students would be awarded teacher-predicted grades, instead of marks allocated by the model.

The chosen algorithm was meant to guarantee fairness by ensuring that the grade distribution for the 2020 cohort followed the pattern of previous years, with a similar number of high and low marks. It drew on teacher-predicted grades and teachers' rankings of students to determine grades. But, crucially, it also took into account the historical performance of schools, which benefited students from more affluent backgrounds.

Private schools in England, which charge parents fees, typically have smaller classes, whose grades could not easily be standardized by the model. The algorithm therefore gave more weight to the teacher-predicted grades for these cohorts, which are often wealthier and whiter than their peers at state schools, whose grades were downgraded.
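The mechanism described above can be sketched in a few lines of code. This is an invented, simplified illustration of the general idea (impose a school's historical grade distribution on the teacher's rank ordering, but fall back to teacher predictions for small cohorts), not the actual Direct Centre Performance Model; the grade shares, cohort threshold, and data are all hypothetical.

```python
def standardise(teacher_grades, historical_distribution, small_cohort=5):
    """Assign grades to a rank-ordered cohort (best student first).

    historical_distribution: fraction of each grade the school achieved
    in past years, best grade first, e.g. {"A": 0.2, "B": 0.5, "C": 0.3}.
    """
    n = len(teacher_grades)
    # Small cohorts cannot be standardised statistically, so fall back to
    # teacher-predicted grades -- the step said to favour small classes.
    if n < small_cohort:
        return list(teacher_grades)

    # Otherwise impose the school's historical distribution on the teacher's
    # rank ordering, ignoring the predicted grades themselves.
    awarded = []
    for grade, share in historical_distribution.items():
        awarded += [grade] * round(share * n)
    lowest = list(historical_distribution)[-1]
    return (awarded + [lowest] * n)[:n]  # pad/trim to cohort size

# A hypothetical ten-pupil state-school cohort where teachers predicted four As,
# at a school that historically produced 20% As:
cohort = ["A", "A", "A", "A", "B", "B", "B", "C", "C", "C"]
history = {"A": 0.2, "B": 0.5, "C": 0.3}
print(standardise(cohort, history))  # only two of the four predicted As survive
```

In this toy version, a pupil's result depends on the school's past record rather than their own predicted grade, which is the core of the complaint against the real model.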

"One of the complexities that we have is that there are lots of ways an algorithm can be fair," said Helena Webb, senior researcher at Oxford University's Department of Computer Science.

"You can see an argument where [the government] said [it] wanted to get results that look similar to last year's. And at a country-wide level, that could be argued as [being] fair. But it completely misses what was fair for individuals.

"Obviously this algorithm is reflecting and mirroring what has happened in previous years," she added. "So it doesn't [reflect] the fact that schools might [improve.] And of course that's going to have worse effects on state schools than on very well known private schools which have consistently higher grades."

"What's made me angry is the way [they] treated state schools," said Josh Wicks, 18, a pupil from Chippenham in Wiltshire, western England. His marks were downgraded from two A*s and an A to three As.

"The algorithm thought that if the school hadn't achieved [high grades] before, [pupils] couldn't get them now," he told CNN. "I just think it's patronizing."

The political storm has left ministers in Boris Johnson's government scrambling for explanations, following heavy criticism of its handling of the coronavirus pandemic. Covid-19 has killed more than 41,000 people in the UK, making it the worst-hit country in Europe.

Why are some algorithms accused of bias?


Algorithms are used across every part of society today, from social media and visa application systems to facial recognition technology and exam grading.

The technology can be liberating for cash-strapped governments and for corporations chasing innovation. But experts have long warned of algorithmic bias, and as automated processes become more widespread, so do accusations of discrimination.

"The A-levels thing is the tip of the iceberg," said Cori Crider, co-founder of Foxglove, an organization that challenges the alleged abuse of digital technology. Crider told CNN that the algorithms replicated the biases found in the raw data used.


Students hold placards as they protest outside the Department for Education in central London on August 14.


But Crider warned against the impulse to simply blame policy issues on the technology.

"Anybody who tells you it's a tech problem is [lying]," she said.

"What happened [with the exams] is that a political choice was made to minimize grade inflation. That's a political choice, not a tech one."

Foxglove and the Joint Council for the Welfare of Immigrants recently challenged the British Home Office over its use of an algorithm designed to stream visa applications. The activist groups alleged that the algorithm was biased against applicants from certain countries, making it automatically more likely that such applicants would be denied a visa.

Foxglove alleged that the screening system suffered from a feedback loop, "where past bias and discrimination, fed into a computer program, reinforce future bias and discrimination."
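The feedback loop Foxglove describes can be shown with a toy simulation. Everything here is invented for illustration (the nationalities, the decision rule, the threshold); it is not the Home Office tool, only a sketch of how a system that learns "risk" from past refusals keeps refusing the same groups.

```python
def refusal_rate(history, nationality):
    """Fraction of past applications from this nationality that were refused."""
    decisions = [d for n, d in history if n == nationality]
    return decisions.count("refused") / len(decisions)

def decide(history, nationality, threshold=0.5):
    # Flag as high risk purely because past applicants like them were refused.
    return "refused" if refusal_rate(history, nationality) > threshold else "granted"

# Invented starting data: nationality X was refused more often in the past.
history = [("X", "refused"), ("X", "refused"), ("X", "granted"),
           ("Y", "granted"), ("Y", "granted"), ("Y", "refused")]

# Each new decision is appended to the history the model learns from,
# so the initial bias against X compounds with every round.
for _ in range(3):
    history.append(("X", decide(history, "X")))
    history.append(("Y", decide(history, "Y")))
```

After a few rounds, applicants from X are refused automatically while applicants from Y are granted, even though no individual merit was ever examined: past discrimination has become future discrimination.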

"We have been reviewing how the visa application streaming tool operates and will be redesigning our processes to make them even more streamlined and secure," a UK Home Office spokesperson told CNN.

"But we do not accept the allegations Joint Council for the Welfare of Immigrants made in their Judicial Review claim and whilst litigation is still ongoing it would not be appropriate for the department to comment any further."

Crider said the problems Foxglove found with past data leading to biased algorithms were evident elsewhere, pointing to the debate over predictive policing programs in the United States.

In June, the California city of Santa Cruz banned predictive policing over concerns that the analytic software program officers used in their work was discriminating against people of color.

"We have technology that could target people of color in our community -- it's technology that we don't need," Mayor Justin Cummings told Reuters news agency in June.

"Part of the problem is the data being fed in," Crider said.

"Historical data is being fed in [to algorithms] and they are replicating the [existing] bias."

Webb agrees. "A lot of [the issue] is about the data that the algorithm learns from," she said. "For example, a lot of facial recognition technology has come out ... the problem is, a lot of [those] systems were trained on a lot of white, male faces.

"So when the software comes to be used it's very good at recognizing white men, but not so good at recognizing women and people of color. And that comes from the data and the way the data was put into the algorithm."

Webb added that she believed the problems could partly be mitigated through "a greater attention to inclusivity in datasets" and a push to add a greater "multiplicity of voices" around the development of algorithms.

Increased regulation?


Activists and experts told CNN they hoped recent debates around algorithms would lead to greater oversight of the technology.

"There's a lack of regulatory oversight over how these systems are used," Webb said, adding that companies could also choose to self-regulate.

Some companies are becoming notably more vocal on the issue.

"Some technologies risk repeating the patterns developed by our biased societies," Instagram CEO Adam Mosseri wrote in a statement in June on the company's diversity efforts. "While we do a lot of work to help prevent subconscious bias in our products, we need to take a harder look at the underlying systems we've built, and where we need to do more to keep bias out of these decisions."

Facebook, which owns Instagram, subsequently created new teams to review bias in company systems.

"I would like to see democratic pushback on [the use of algorithms]," Crider said. "Are there areas in public life where it's not acceptable to have these systems at all?"

While the debate continues in boardrooms and academia, these automated systems continue to determine people's lives in numerous and subtle ways.

For Philip, the UK government's scrapping of the exams algorithm has left him in limbo.

"We emailed Exeter [University] and phoned and they're in a kind of mess," he said, adding that he was hopeful he could win his place back. "I think I'll just defer now anyway."

He said he was grateful to be given his predicted grades but said the experience had gone "pretty badly."

"[The government] had months to sort this out," he said. "I get that there's a lot of things going on with the health stuff but [...] it's a pretty poor showing."
