London Daily

Focus on the big picture.
Tuesday, Jul 29, 2025

One in three councils using algorithms to make welfare decisions

One in three councils are using computer algorithms to help make decisions about benefit claims and other welfare issues, despite evidence emerging that some of the systems are unreliable.
Companies including the US credit-rating businesses Experian and TransUnion, as well as the outsourcing specialist Capita and Palantir, a data-mining firm co-founded by the Trump-supporting billionaire Peter Thiel, are selling machine-learning packages to local authorities that are under pressure to save money.

A Guardian investigation has established that 140 councils out of 408 have now invested in the software contracts, which can run into millions of pounds, more than double the previous estimates.

The systems are being deployed to provide automated guidance on benefit claims, prevent child abuse and allocate school places. But concerns have been raised about privacy and data security, the ability of council officials to understand how some of the systems work, and the difficulty for citizens in challenging automated decisions.

It has emerged that North Tyneside council has dropped TransUnion, whose system it used to check housing and council tax benefit claims. Welfare payments to an unknown number of people were wrongly delayed when the computer’s “predictive analytics” erroneously identified low-risk claims as high risk.

Meanwhile, Hackney council in east London has dropped Xantura, another company, from a project to predict child abuse and intervene before it happens, saying it did not deliver the expected benefits. And Sunderland city council has not renewed a £4.5m data analytics contract for an “intelligence hub” provided by Palantir.

A spokesperson for the Local Government Association, which represents councils, said: “Good use of data can be hugely beneficial in helping councils make services more targeted and effective. But it is important to note that data is only ever used to inform decisions and not make decisions for councils.”

But Silkie Carlo, the director of the campaign group Big Brother Watch, said the increasing use of algorithms was leaving vulnerable people at the whim of “automated decisions … they have no idea about and can’t challenge”.

Gwilym Morris, a management consultant who works with IT providers to the public sector, said the complexity of the systems meant the leadership of local authorities “don’t really understand what is going on” and this raised questions about how citizens’ data was used.

North Tyneside stopped using TransUnion’s system last month. It automatically processed data about claimants for housing and council tax benefit to determine the likelihood it was fraudulent – “risk based verification”. But most of the cases deemed high risk by the software were in fact lower risk, and benefit claims were wrongly delayed.

A council report concluded: “TransUnion provides no reason for a case meeting a high-risk category and it was found that in most cases, the reason for it being high risk could not be established. There was no reason for the payment to be withheld, but claims had been delayed.”

A spokesperson for TransUnion said the classification of risk groups was “ultimately a matter for the local authorities to decide”.

They added: “Each local authority also determines what extra checks are required for claimants falling into any particular category, and can monitor for accuracy so that they can adapt their criteria if necessary.

“The time spent on reviewing ‘high-risk’ claims will equally depend on each local authority’s own policy in terms of processing the additional checks.”

TransUnion said it checked benefit claims for fraud for about 70 local authorities in the UK, while Xantura serves a similar number of councils. The combined figure does not include other examples of algorithms found by the Data Justice Lab at Cardiff University.

Sunderland council awarded a contract for a new “intelligence hub” to Palantir in 2014 as part of a plan to make efficiency savings. Last year, it was announced the authority faces a budget gap of about £50m over the next three years.

The hub was used to analyse data to help with the Troubled Families programme, to bring together information on those at risk of sexual exploitation, and to help find areas at risk of flooding. The council said it did not hold a review of the project and did not know how much had been saved.

A council spokesperson said it was always the authority’s intention not to renew the contract and Palantir had worked alongside staff to transfer knowledge so the council would become “self-sufficient”.

Hackney council said “issues of variable data quality meant that the system wasn’t able to provide sufficiently useful insights”.

The Xantura predictive model analyses warning signs about a household, such as a child being expelled from school or a report of domestic violence. The model’s prediction is then passed to a social worker for potential action.

Wajid Shafiq, the chief executive of Xantura, said: “We’re improving the accuracy of our analytics and models continuously but we have never been unable to develop a reliable predictive model.

“There are a number in place right now, adding real value. Not being able to access regular updates of source data to drive the model is a bigger issue – if we don’t get regular feeds, we can’t provide an up-to-date picture of risk factors.”

Simon Burall, a senior associate with the public participation charity Involve, said: “There are never just benefits from these things but risks and harms, namely privacy and data security.

“But also potential wider unintended consequences, including the stigmatisation of communities and unwanted intrusion by particular services. Any benefits must be balanced against those potential risks.”

David Spiegelhalter, a former president of the Royal Statistical Society, said: “There is too much hype and mystery surrounding machine learning and algorithms. I feel that councils should demand trustworthy and transparent explanations of how any system works, why it comes to specific conclusions about individuals, whether it is fair, and whether it will actually help in practice.”