UK police officers raise concerns about 'biased' AI data

Police officers have raised concerns about using "biased" artificial-intelligence tools, a report commissioned by one of the UK government's advisory bodies reveals. The study warns such software may "amplify" prejudices, meaning some groups could become more likely to be stopped in the street and searched.

It says officers also worry they could become over-reliant on automation.

And it says clearer guidelines are needed for facial recognition's use.

"The police are concerned that the lack of clear guidance could lead to uncertainty over acceptable uses of this technology," the Royal United Services Institute (Rusi)'s Alexander Babuta told BBC News.

"And given the lack of any government policy for police use of data analytics, it means that police forces are going to be reluctant to innovate.

"That means any potential benefits of these technologies may be lost because police forces' risk aversion may lead them not to try to develop or implement these tools for fear of legal repercussions."

Rusi interviewed about 50 experts for its study, including senior police officers in England and Wales - who were not named - as well as legal experts, academics and government officials.

The work was commissioned by the Centre for Data Ethics and Innovation, which plans to draw up a code of practice covering the police's use of data analytics next year.


'Self-fulfilling prophecy'


One of the key concerns expressed was about using existing police records to train machine-learning tools, since these might be skewed by the arresting officers' own prejudices.

"Young black men are more likely to be stopped and searched than young white men, and that's purely down to human bias," said one officer.

"That human bias is then introduced into the datasets and bias is then generated in the outcomes of the application of those datasets."

An added factor, the report said, was that people from disadvantaged backgrounds were more likely to use public services frequently. This would generate more data about them, which in turn could make them more likely to be flagged as a risk.
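
To make the mechanism concrete, here is a minimal sketch in Python using entirely synthetic data; the two-group setup, the numbers and the logistic-regression model are illustrative assumptions, not any force's actual system or records. A model fitted to stop records that over-represent one group simply learns to reproduce the disparity, even though both groups behave identically.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 10_000
    group = rng.integers(0, 2, n)    # two hypothetical demographic groups
    risk = rng.normal(0, 1, n)       # identical underlying behaviour in both

    # Historical labels: group 1 was stopped more often for the same behaviour
    stopped = (risk + 0.7 * group + rng.normal(0, 1, n)) > 1.0

    # A model naively fitted to those labels inherits the human bias
    X = np.column_stack([group, risk])
    preds = LogisticRegression().fit(X, stopped).predict(X)

    for g in (0, 1):
        print(f"group {g}: predicted stop rate {preds[group == g].mean():.2%}")
    # group 1 is flagged far more often, despite identical risk distributions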

Matters could worsen over time, another officer said, when software was used to predict future crime hotspots.

"We pile loads of resources into a certain area and it becomes a self-fulfilling prophecy, purely because there's more policing going into that area, not necessarily because of discrimination on the part of officers," the interviewee said.

There was disagreement, however, on how much scope should be given to officers wanting to ignore predictive software's recommendations.

"Officers often disagree with the algorithm," said one.

"I'd expect and welcome that challenge. The point where you don't get that challenge, that's when people are putting that professional judgement aside."

But another officer worried about others being too willing to ignore an app's recommendations, adding: "Professional judgement might just be another word for bias."


'Patchwork quilt'


Mr Babuta said this problem could be addressed.

"There are ways that you can scan and analyse the data for bias and then eliminate it," he told BBC News.

"[And] there are police forces that are exploring the opportunities of these new types of data analytics for actually eliminating bias in their own data sets."

But he added that "we need clearer processes to ensure that those safeguards are applied consistently".
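
The report does not spell out what such a scan involves, but one common form, sketched below in Python with hypothetical records, is to compare outcome rates across demographic groups and flag large disparities; the 0.8 threshold is the "four-fifths" convention from US employment law, used here only as an illustration.

    from collections import defaultdict

    def selection_rates(records, group_key, outcome_key):
        """Rate of positive outcomes per demographic group."""
        totals, positives = defaultdict(int), defaultdict(int)
        for r in records:
            totals[r[group_key]] += 1
            positives[r[group_key]] += bool(r[outcome_key])
        return {g: positives[g] / totals[g] for g in totals}

    records = [  # hypothetical stop-and-search records
        {"group": "A", "stopped": True},  {"group": "A", "stopped": False},
        {"group": "B", "stopped": True},  {"group": "B", "stopped": True},
    ]
    rates = selection_rates(records, "group", "stopped")
    ratio = min(rates.values()) / max(rates.values())
    print(rates, f"ratio: {ratio:.2f}")   # a ratio below 0.8 is a red flag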

In the meantime, one officer described the current landscape as being like "a patchwork quilt - uncoordinated and delivered to different settings and for different outcomes".

The National Police Chiefs' Council has responded, saying UK police always seek to strike a balance between keeping people safe and protecting their rights.

"For many years police forces have looked to be innovative in their use of technology to protect the public and prevent harm and we continue to explore new approaches to achieve these aims," Assistant Chief Constable Jonathan Drake said.

"But our values mean we police by consent, so anytime we use new technology we consult with interested parties to ensure any new tactics are fair, ethical and producing the best results for the public."
