Live facial recognition technology in ‘no fit state’ to be used by police in Scotland

Live facial recognition technology is in ‘no fit state’ to be deployed by police in Scotland, politicians have concluded.

A committee tasked with investigating the use of the emerging technology found it to be biased against ethnic minorities and women.

The legal basis for police using such technology is also unclear, with concerns that it may infringe human rights or data protection regulations.

Without safeguards in place, the Justice Sub-Committee at the Scottish Parliament said any investment in the technology would be ‘unjustifiable’.

Although Police Scotland included plans for using live facial recognition technology in its strategic plan, it said it has no current plans to introduce it.

Sub-Committee Convener, John Finnie MSP, said: “The Sub-Committee is reassured that Police Scotland have no plans to introduce live facial recognition technology at this time. It is clear that this technology is in no fit state to be rolled out or indeed to assist the police with their work. 

“Current live facial recognition technology throws up far too many ‘false positives’, and contains inherent biases that are known to be discriminatory. 

“Our inquiry has also shone light on other issues with facial recognition technology that we now want the Scottish Police Authority and the Scottish Government to consider. Not least amongst these are the legal challenges against similar technologies in England and Wales, and the apparent lack of law explicitly governing its use in Scotland – by any organisation.

“So whether this technology is being used by private companies, public authorities or the police, the Scottish Government needs to ensure there is a clear legal framework to protect the public and police alike from operating in a facial recognition Wild West.”

The Sub-Committee has also made a number of other recommendations in its report into facial recognition, including a call for the Scottish Government to explicitly regulate the use of live facial recognition technology. The Sub-Committee suggested that any new legislation should also cover private companies and other public sector organisations, such as local authorities, as well as policing services.

Police Scotland previously outlined its ambition to introduce ‘live’ facial recognition technology in its Policing 2026 strategy. This technology works by flagging up suspects in real time by cross-referencing ‘live’ images picked up by CCTV cameras against police databases. However, Police Scotland’s earlier assessment that it is likely to have a positive impact on equalities and human rights stands in stark contrast to the evidence received by the Sub-Committee. 
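The matching process described above can be sketched in outline: such systems typically reduce each face image to a numeric ‘embedding’ and compare it against a watchlist using a similarity threshold. The sketch below is purely illustrative — the embeddings, names and threshold are invented — but it shows why threshold choice drives the ‘false positive’ problem the Sub-Committee identified.

```python
# Illustrative sketch only: a live camera frame is reduced to an
# embedding vector and compared against watchlist embeddings.
# All vectors, identifiers and the threshold are invented.
from math import sqrt

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b))
    return dot / norm

def match(live_embedding, watchlist, threshold=0.9):
    """Return watchlist IDs whose similarity meets the threshold.

    Lowering the threshold flags more true matches but also produces
    more false positives against people who merely look similar.
    """
    return [person_id for person_id, ref in watchlist.items()
            if cosine_similarity(live_embedding, ref) >= threshold]

# Hypothetical data: two watchlist entries, one live capture.
watchlist = {"person_A": [0.9, 0.1, 0.2],
             "person_B": [0.1, 0.8, 0.6]}
live = [0.88, 0.15, 0.25]
print(match(live, watchlist))  # ['person_A']
```

With the invented data above, only ‘person_A’ clears the 0.9 threshold; dropping the threshold to 0.3 would also flag ‘person_B’, illustrating how an ill-chosen threshold inflates false positives.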

More recently, Police Scotland has indicated that it is not currently planning to introduce live facial recognition. While the Sub-Committee believes Police Scotland should have the best available technology to keep people safe and combat crime, it has asked the force to remove references to live facial recognition from its 2026 strategy if there is no intention to introduce it within the stated timescale. 

Lastly, the Sub-Committee is requesting that the Scottish Police Authority (SPA), and the new Scottish Biometrics Commissioner (if the relevant legislation is passed by Parliament to create the post), review police use of retrospective facial recognition technology. In particular, it wants this review to consider the impact of police accessing images held ‘illegally’ on the UK Police National Database, or on IT systems inherited from legacy Scottish forces, which contain images of people not convicted of any crime.

The independent advisory group on the Use of Biometric Data in Scotland (IAG) found that Police Scotland currently holds more than 1 million custody images in total, and that the new custody system should have the technical capability to identify and dispose of photographs of people who are not convicted or not proceeded against. Currently, it is understood that police are unable to delete these images because of a ‘technical issue’.

Furthermore, the Sub-Committee heard evidence that the UK PND holds:

  • Thousands of images of innocent persons;
  • Hundreds of thousands, if not millions, of custody images of people who were arrested, but not convicted of an offence.