A justice sub-committee of the Scottish Parliament at Holyrood published the report on Tuesday, concluding that there would be “no justifiable basis” for Police Scotland to invest in the software due to privacy and human rights concerns.
The police force had previously stated that they would like to use the technology from 2026, but have now put those plans on hold to take part in a wider debate about the implications of the software.
The report found that live facial recognition technology – which cross-references CCTV images with police databases – was “known to discriminate against females and those from black, Asian and ethnic minority communities.” It added that its use “would be a radical departure from… [the] fundamental principle of policing by consent.”
Police Scotland Assistant Chief Constable Duncan Sloan said the force would now conduct a public consultation on the live software and keep a “watching brief on the trialling of the technology in England and Wales.”
While Scotland has – for the moment – deemed the software “unfit” for use by its police force, England and Wales appear to have judged it workable.
In January, London Metropolitan Police announced that they would be deploying facial recognition cameras for the first time on the streets of the capital within the next month. It comes after recent pilots of the new surveillance technology in London and South Wales.
Its deployment has provoked criticism from privacy groups such as Big Brother Watch, which have warned that the new development in technology represents “an enormous expansion of the surveillance state and a serious threat to civil liberties in the UK.”