One in three councils using algorithms to make welfare decisions

One in three councils are using computer algorithms to help make decisions about benefit claims and other welfare issues, despite evidence emerging that some of the systems are unreliable.
Companies including the US credit-rating businesses Experian and TransUnion, as well as the outsourcing specialist Capita and Palantir, a data-mining firm co-founded by the Trump-supporting billionaire Peter Thiel, are selling machine-learning packages to local authorities that are under pressure to save money.

A Guardian investigation has established that 140 of 408 councils have now invested in the software contracts, which can run into millions of pounds, a figure more than double previous estimates.

The systems are being deployed to provide automated guidance on benefit claims, prevent child abuse and allocate school places. But concerns have been raised about privacy and data security, the ability of council officials to understand how some of the systems work, and the difficulty for citizens in challenging automated decisions.

It has emerged North Tyneside council has dropped TransUnion, whose system it used to check housing and council tax benefit claims. Welfare payments to an unknown number of people were wrongly delayed when the computer’s “predictive analytics” erroneously identified low-risk claims as high risk.

Meanwhile, Hackney council in east London has dropped Xantura, another company, from a project to predict child abuse and intervene before it happens, saying it did not deliver the expected benefits. And Sunderland city council has not renewed a £4.5m data analytics contract for an “intelligence hub” provided by Palantir.

A spokesperson for the Local Government Association, which represents councils, said: “Good use of data can be hugely beneficial in helping councils make services more targeted and effective. But it is important to note that data is only ever used to inform decisions and not make decisions for councils.”

But Silkie Carlo, the director of the campaign group Big Brother Watch, said the increasing use of algorithms was leaving vulnerable people at the whim of “automated decisions … they have no idea about and can’t challenge”.

Gwilym Morris, a management consultant who works with IT providers to the public sector, said the complexity of the systems meant the leadership of local authorities “don’t really understand what is going on” and this raised questions about how citizens’ data was used.

North Tyneside stopped using TransUnion’s system last month. It automatically processed data about claimants for housing and council tax benefit to determine the likelihood it was fraudulent – “risk based verification”. But most of the cases deemed high risk by the software were in fact lower risk, and benefit claims were wrongly delayed.
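To make the idea of “risk-based verification” concrete, the sketch below shows, in minimal Python, the general shape of such a step: a claim is scored against a few signals and high-scoring claims are routed to extra manual checks, which is where delays arise. The fields, weights and threshold here are invented for illustration only and are not TransUnion’s actual model.

```python
# Illustrative sketch of a "risk-based verification" routing step.
# All signals, weights and thresholds below are hypothetical.

from dataclasses import dataclass


@dataclass
class Claim:
    claimant_id: str
    declared_income: float           # weekly income declared on the claim
    address_changes_last_year: int
    previous_claims_flagged: int


def risk_score(claim: Claim) -> float:
    """Combine a few hypothetical signals into a single 0-1 fraud-risk score."""
    score = 0.0
    if claim.declared_income == 0:
        score += 0.3                                   # no declared income
    score += 0.2 * min(claim.address_changes_last_year, 3)
    score += 0.4 * min(claim.previous_claims_flagged, 2)
    return min(score, 1.0)


def verification_route(claim: Claim, high_risk_threshold: float = 0.6) -> str:
    """High-risk claims get extra manual checks (and therefore delay)."""
    return "manual_review" if risk_score(claim) >= high_risk_threshold else "fast_track"


if __name__ == "__main__":
    sample = Claim("C-001", declared_income=0.0,
                   address_changes_last_year=2, previous_claims_flagged=1)
    print(verification_route(sample))   # this sample would be held for manual review
```

In a setup of this shape, miscalibrated scoring or thresholds would push genuinely low-risk claims into the manual-review queue, which matches the kind of wrongly delayed payments North Tyneside describes.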

A council report concluded: “TransUnion provides no reason for a case meeting a high-risk category and it was found that in most cases, the reason for it being high risk could not be established. There was no reason for the payment to be withheld, but claims had been delayed.”

A spokesperson for TransUnion said the classification of risk groups was “ultimately a matter for the local authorities to decide”.

They added: “Each local authority also determines what extra checks are required for claimants falling into any particular category, and can monitor for accuracy so that they can adapt their criteria if necessary.

“The time spent on reviewing ‘high-risk’ claims will equally depend on each local authority’s own policy in terms of processing the additional checks.”

TransUnion said it checked benefit claims for fraud for about 70 local authorities in the UK, and Xantura serves the same number of councils. The combined figure does not include other examples of algorithms found by the Data Justice Lab at Cardiff University.

Sunderland council awarded a contract for a new “intelligence hub” to Palantir in 2014 as part of a plan to make efficiency savings. Last year, it was announced that the authority faced a budget gap of about £50m over the following three years.

The hub was used to analyse data to help with the Troubled Families programme, to bring together information on those at risk of sexual exploitation, and to help find areas at risk of flooding. The council said it did not hold a review of the project and did not know how much had been saved.

A council spokesperson said it was always the authority’s intention not to renew the contract and Palantir had worked alongside staff to transfer knowledge so the council would become “self-sufficient”.

Hackney council said “issues of variable data quality meant that the system wasn’t able to provide sufficiently useful insights”.

The Xantura predictive model analyses warning signs about a household, such as a child being expelled from school or a report of domestic violence. The model’s prediction is then passed to a social worker for potential action.
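As a rough illustration of that pipeline shape (warning-sign events in, a risk estimate out, a human decision at the end), here is a minimal Python sketch. The event types, weights and threshold are hypothetical and are not Xantura’s model; the point is that the system flags households rather than deciding on intervention itself.

```python
# Illustrative sketch of a warning-sign scoring pipeline.
# Event names, weights and the threshold are hypothetical.

from collections import Counter

# Hypothetical weights for warning-sign events recorded against a household.
EVENT_WEIGHTS = {
    "school_exclusion": 0.35,
    "domestic_violence_report": 0.45,
    "missed_health_appointments": 0.15,
}


def household_risk(events: list[str]) -> float:
    """Sum weighted warning signs into a capped 0-1 risk estimate."""
    counts = Counter(events)
    score = sum(EVENT_WEIGHTS.get(event, 0.0) * n for event, n in counts.items())
    return min(score, 1.0)


def refer_to_social_worker(events: list[str], threshold: float = 0.5) -> bool:
    """The model only flags; any intervention remains a social worker's decision."""
    return household_risk(events) >= threshold


if __name__ == "__main__":
    history = ["school_exclusion", "domestic_violence_report"]
    if refer_to_social_worker(history):
        print("Flag household for social worker review")
```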

Wajid Shafiq, the chief executive of Xantura, said: “We’re improving the accuracy of our analytics and models continuously but we have never been unable to develop a reliable predictive model.

“There are a number in place right now, adding real value. Not being able to access regular updates of source data to drive the model is a bigger issue – if we don’t get regular feeds, we can’t provide an up-to-date picture of risk factors.”

Simon Burall, a senior associate with the public participation charity Involve, said: “There are never just benefits from these things but risks and harms, namely privacy and data security.

“But also potential wider unintended consequences, including the stigmatisation of communities and unwanted intrusion by particular services. Any benefits must be balanced against those potential risks.”

David Spiegelhalter, a former president of the Royal Statistical Society, said: “There is too much hype and mystery surrounding machine learning and algorithms. I feel that councils should demand trustworthy and transparent explanations of how any system works, why it comes to specific conclusions about individuals, whether it is fair, and whether it will actually help in practice.”