
Survey results show public split over use of live facial recognition technology by police in Scotland

Detective chief superintendent Gordon McCreadie speaking at the biometrics conference in Edinburgh. Photograph: FutureScot

A new survey has shown that the public are almost split down the middle when it comes to the use of live facial recognition (LFR) technology by police in Scotland.

Initial results of a public poll by Police Scotland show that 49 per cent of respondents would be ‘somewhat or very comfortable’ with the use of the AI-powered technology, compared with 48 per cent who would be ‘somewhat or very uncomfortable’.

The top-line figures were revealed by a senior police officer at a conference in Edinburgh on Friday, attended by organisations the force has engaged in a ‘national conversation’ over the course of the last 12 months.

Civil liberties groups including Big Brother Watch, the Scottish Human Rights Commission and the Coalition for Racial Equality and Rights – leading critics of the technology – were among 26 organisations who took part in the process.

Police Scotland, with its oversight body the Scottish Police Authority and the Scottish Biometrics Commissioner – which has a code of practice regulating the use of technologies such as live facial recognition – led the national conversation process.

The force stressed, however, that the public survey was not representative, and that a formal public consultation – with recognised polling methodologies – would need to be carried out before any operational decision to use the technology.

Detective chief superintendent Gordon McCreadie, who oversees biometrics at Police Scotland, said the force was ‘dipping its toe in the water’ with the public poll, as well as the use of focus groups it convened to test public sentiment around LFR.

The findings are to be presented to the Scottish Police Authority’s policing performance committee on June 10.

DCS McCreadie said: “Let me make it clear, in respect of the public survey, this was an open survey. It was a survey in which anybody could participate. It’s not representative.

“So it was very much a toe-in-the-water type of survey, and the results, I think, are quite interesting.”

He added: “However, we do know that it was widely shared on social media by the police and by specific interest groups. So it’s very difficult for us to tell who completed the survey and, if we were to proceed, or even consider proceeding, we will have to make a much more informed, targeted consultation to properly consider a representative sample of society.”

McCreadie added that the survey showed that 30 per cent of those polled would feel “more comfortable” if the technology – which relies on AI-powered cameras matching faces in a crowd against a known ‘watchlist’ – was used in ‘specific circumstances’.

The force tested three different scenarios for that purpose: the first was in the ‘nighttime economy’, where the risk of sexual offending or violence was deemed to be high; the second was for vulnerable or missing people, where the technology could be deployed in busy places like transport hubs; and the third was for indoor events, which could be potential targets for terrorism or where those subject to restrictions, such as registered sex offenders, may be prevented from attending.

When the questions focused on those use cases, the results were more favourable: 60, 59 and 60 per cent of those polled, respectively, said they would be either ‘somewhat or very comfortable’ with the use of the technology in each scenario. However, when the force invited public comments in a qualitative part of the study, the most common feedback was that the technology should not be used.

The focus groups also emphasised the need for continued consultation and engagement with marginalised communities.

McCreadie did not take a stance on the controversial technology, which is in use by the Metropolitan Police in London and South Wales Police.

And he said the fact that Police Scotland, the SPA and Scottish Biometrics Commissioner have been involved in the stewardship of the national conversation should not be taken as a sign that the outcome has been ‘pre-determined’.

“Just because other police forces are doing it doesn’t mean we should do it,” he said. “However, as a responsible service, who are engaged by the public and support the public, we absolutely have a necessity to consider the use of it to protect the public. So that is why we are having this conversation now.”

A separate, earlier survey by an independent organisation found that 66 per cent of respondents supported Police Scotland using LFR, rising to 72 per cent when specific use cases were considered.

The wider context

The Metropolitan police in London officially began operational use of live facial recognition (LFR) technology in February 2020, following several years of trials. The first deployment occurred at the Stratford Centre in East London, where van-mounted cameras scanned thousands of shoppers’ faces against a watchlist of individuals wanted by the police.

The force now deploys the technology around four times per week, with each deployment averaging between four and seven hours, the conference heard. Cameras are mounted prominently on red vans in public places, with signage on display informing the public of their use, and officers informing people locally.

Detective chief superintendent Andy Day, who is the ‘authorising officer’ for the use of facial recognition technology by the force, said that the cameras scan faces in crowds in real-time, with officers monitoring a live feed broadcast on screens inside the vehicles. Passers-by automatically have their faces blurred out on the screens within 0.5 seconds, unless the image captured by the camera is matched to a known face on the ‘pre-determined’ watchlist. Images blurred out are subsequently deleted.

If a suspect is identified by the AI-powered algorithms underpinning the software – a system developed by the Japanese company NEC – an alert is generated and sent to the tablet device of an officer on the ground, who can then approach the person to confirm their identity.
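Based on that description, the core match-and-alert step can be sketched in a few lines of code. This is a minimal, hypothetical illustration – NEC’s production system is proprietary, and every name, type and value below is an assumption – but it reflects the behaviour described: score each detected face against the watchlist, alert an officer only when the similarity clears the threshold, and otherwise blur and delete the image.

```python
# Minimal, hypothetical sketch of the match-and-alert step described above.
# NEC's production system is proprietary; all names and structures here are
# illustrative assumptions.
from dataclasses import dataclass

MATCH_THRESHOLD = 0.6  # similarity cut-off reported at the conference


@dataclass
class WatchlistEntry:
    name: str
    embedding: list[float]  # face template, e.g. derived from a custody image


def similarity(a: list[float], b: list[float]) -> float:
    """Dot product of two L2-normalised face embeddings (cosine similarity)."""
    return sum(x * y for x, y in zip(a, b))


def process_face(face: list[float],
                 watchlist: list[WatchlistEntry]) -> WatchlistEntry | None:
    """Compare one detected face against the watchlist.

    Returns the best-scoring entry if it clears the threshold, in which
    case an alert would go to an officer's tablet for an in-person
    identity check. Otherwise returns None: the face is blurred on
    screen and the captured image deleted, as described above.
    """
    if not watchlist:
        return None
    best = max(watchlist, key=lambda entry: similarity(face, entry.embedding))
    if similarity(face, best.embedding) >= MATCH_THRESHOLD:
        return best  # alert: a human officer confirms identity in person
    return None      # no match: blur within 0.5 seconds, then delete
```

The design point Day returned to is that a positive score only triggers a human check; the system itself makes no decision to stop or arrest anyone.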

In 2024, data shared by DCS Day showed the Met had made 578 arrests, with 424 people charged. The system helped officers stop 616 registered sex offenders, with 58 arrested and 38 charged. Day gave an example of one registered sex offender who was spotted last year walking with a young girl, in breach of his conditions. “Who knows what was stopped as a result of that interaction,” he said.

The algorithm developed by NEC was first tested by the National Physical Laboratory – a UK Government facility specialising in measurement science – in 2021. Day said scientists found no bias in the algorithm across demographics including gender and ethnicity when it was set at a similarity threshold of 0.6, and that they would expect to see around one false alert in every 6,000 faces scanned. In practice, Day said, the force is experiencing a much lower false positive rate of around one in every 35,000 faces scanned. In effect, 0.6 is a cut-off point: only faces that score above that threshold are flagged as potential matches.
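Those two rates are easier to compare when applied to a single deployment’s worth of scans. The faces-scanned figure below is purely an assumption for illustration – per-deployment volumes were not given at the conference – but it shows what each rate would imply in practice.

```python
# Illustrative arithmetic only. The faces-scanned figure is an assumption,
# not a number reported at the conference.
faces_scanned = 20_000           # hypothetical single-deployment volume

npl_estimated_rate = 1 / 6_000   # NPL lab estimate at the 0.6 threshold
observed_rate = 1 / 35_000       # rate the Met reports in practice

print(f"False alerts expected (NPL estimate): {faces_scanned * npl_estimated_rate:.1f}")
print(f"False alerts expected (observed rate): {faces_scanned * observed_rate:.1f}")
# False alerts expected (NPL estimate): 3.3
# False alerts expected (observed rate): 0.6
```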

“So that’s more positive for us, but we do have false alerts,” Day said. “There are instances where false alerts have happened. But as I said, it’s important to have the officer engage with the subject as a result – having that human in the loop.”

Vans are usually deployed in crime hot-spots across London boroughs, using an ‘intelligence-led’ approach, with the watchlist made up primarily of custody images, said Day. As an authorising officer, he said he has to consider multiple pre-deployment factors, including whether the vans will be located near sensitive locations such as schools or places of worship, and balancing human rights considerations such as the right to family life and the freedoms of expression and religion.

High-risk missing persons may also be added to a watchlist, and officers need to consider whether the use of LFR is necessary and proportionate to the policing aims of an operation.

Challenge

Civil liberties campaign groups such as Big Brother Watch have objected to the use of LFR systems by police and the retail sector. The organisation’s Madeleine Stone confirmed during a question and answer session at the event that it continues to support an ongoing civil case over the Met’s use of the technology: a legal challenge by Shaun Thompson, a black anti-knife crime activist, who was detained by police for 20 minutes last year after he was misidentified by an LFR system deployed at London Bridge Station.

More concerning, said Lucien Staddon Foster of the Coalition for Racial Equality and Rights, was that Thompson was detained despite officers realising he did not look like the man on the watchlist.

“He was pulled aside anyway, and was threatened with arrest if he did not provide his fingerprints to prove that he was not, in fact, the man on the watchlist, even though officers are on record saying that they did not think that he looked similar, but trusted the technology enough to just disregard that and continue,” said Staddon Foster, who also questioned whether the 0.6 threshold for the algorithm did enough to eliminate bias, rather than just mask it, and whether the NPL tests had been sufficiently peer-reviewed.

Professor Paul Taylor, the chief scientific adviser at the National Police Chiefs’ Council, insisted that police were aware of the challenges. “We should be really conscious of the limitations of statistics,” he said. “Because one person… is too many people. Of course, statistics operate at an aggregate level, so we’re very aware of it, but we’re seeing improvements in the algorithms which are going in the right direction.”

DCS Day added: “We haven’t had an instance where someone’s been arrested on the back of the false alert, but I do recognise that there are concerns.”

Mission creep was another concern. Although the Met currently uses the system only on mobile vans, Liz Thomson from Amnesty International raised fears that there were plans for permanently mounted live facial recognition cameras in Croydon. DCS Day said there were plans for a pilot of static cameras mounted on ‘street furniture’ over the course of the summer, but said it would be a temporary measure and would be evaluated accordingly.

Regulation

As well as common law, the technology is governed by a raft of primary legislation including the Data Protection Act, GDPR, the Human Rights Act and the Equality Act. There is also a white paper due to be published at Westminster discussing potential legal routes to govern the technology. In Scotland, the office of the Scottish Biometrics Commissioner is the regulating authority for biometrics and has developed a 12-point Code of Practice, which police must adhere to when considering the use of biometric technologies such as live facial recognition. A complaints procedure allows members of the public to raise concerns if they think police are not using their biometric data in line with the code.

Since 11 August 2020, when the Court of Appeal of England and Wales found that an operational deployment of automated facial recognition technology (AFR) by South Wales Police was ‘unlawful’ – in the landmark Bridges case – because of planning shortcomings, there has been added emphasis on securing public trust and compliance with existing legislation.

Police Scotland has not yet used LFR but the Commissioner has stated that it should be available to the chief constable as a strategic or tactical option where there is a ‘significant threat’ to public safety or security. Following the controversial introduction of so-called ‘cyber kiosks’ – software used by police to extract data from people’s devices – there has been added emphasis on winning public trust before introducing new technology: a ‘rights-based pathway’ is now in place at Police Scotland to consult the public before any new technologies are deployed.

Going forward, the conference heard that there may be a need for a single defining piece of legislation that governs the use of the technology, or biometrics more widely. There was even mention of a possible AI act, which LFR could fall under. Madeleine Stone at Big Brother Watch said the different policies and procedures currently in place between police forces creates a ‘patchwork effect’.

“We feel that the police are really being able to write their own rules in this and so we know that…every parliamentary committee who’s looked at this, both in Scotland and the UK, has called for a proper basis in primary legislation,” she said, calling the way police have added certain categories of people to watchlists ‘extraordinarily permissive’.

LFR in Scotland – leadership

Chief constable Jo Farrell has gone on record to state that she thinks policing in Scotland would be enhanced by live facial recognition technology. Farrell, who has overall command and responsibility for the Police Service of Scotland and leads 22,000 officers and staff serving communities across a third of the United Kingdom’s landmass, said last October that ‘live facial recognition may help us keep people safe and communities safer, but we need to start that conversation and listen to the public’s views and concerns’.

At Friday’s conference, Assistant Chief Constable Steve Johnson, who leads on major crime, public protection and local crime for the force, echoed that position.

“I don’t think we will ever get to a true consensus around the use of these sorts of technology,” he said. “The use cases that we’ve got and the interest that we have in Police Scotland is around saving life, preventing and detecting crime, the classic cases that we use…that vulnerable adult or child that goes missing, where we can use the technology to try and find them and prevent them from coming to serious harm; those sex offenders, violent offenders, offenders of all types, that seek to target communities in certain locations.”

He acknowledged the importance of the national conversation with the stakeholder organisations invited into that process, but also said it was a ‘limited’ audience and that the force has to consider the views of the 5.5 million people living in Scotland – for example, those whose loved ones ‘may go missing tomorrow’, or people walking down the street who might object to their images being used by police for no specified purpose. However, he said he hoped the national conversation would come to a conclusion in the ‘fairly near future’, allowing the force to begin a formal public consultation process.

“Again, my hope is that we’re permitted to use the technology with all the right checks and balances that help us save life, prevent and detect crime,” he added.
