In recent months, Police Scotland has signalled its willingness to start exploring the use of controversial live facial recognition (LFR) technology. Software powered by artificial intelligence (AI), coupled with cameras deployed in public spaces – using algorithms to match the faces of known criminals on databases – has the potential to greatly assist officers on the ground in the real-time detection and prevention of crime. 

LFR systems have been used for several years in police operations in London and in South Wales, where strenuous efforts have been made to convince politicians, privacy campaigners and the public of their effectiveness in keeping people and communities safe, and where legal and regulatory challenges have now largely been surmounted.

That debate is now shifting to Scotland, where the Chief Constable of Police Scotland has twice gone on record over the summer to declare that AI could be as important a tool for policing in keeping violent criminals off the streets as it is to doctors treating cancer patients.

The force has been quietly overhauling its internal processes so that new technologies go through a “rights-based” pathway, ensuring that checks and balances on matters such as ethics and privacy are enshrined in the way it considers and deploys technology on the frontline. 

It is important to note that the pathway stems in part from the mishandling of so-called “cyber kiosks”, when the force was accused of introducing technology that allowed officers to access the data on people’s phones.

With hard lessons learned, and new policies in place, Police Scotland is now going through the gears in terms of technology adoption. Body-worn video cameras are finally being deployed after a period of inertia, a national Digital Evidence Sharing Capability platform has been launched, and waiting in the wings is LFR technology.

Although the force already has the tacit approval of the Scottish Biometrics Commissioner to use the technology in “circumstances” where operational policing needs arise, it will have to prove to the public, policymakers and certain interest groups – such as civil liberties campaigners – that the technology safeguards people’s rights, and that there is no mission creep into wider realms of public surveillance.

Paul Roberts of tech firm NEC Software Solutions, whose LFR solution underwent rigorous scientific checks before going live with the Metropolitan Police in London and South Wales Police, said: “It’s clear that Scotland is starting to have these kinds of conversations, which is absolutely right. 

“As a technology provider, all we can do is present the evidence as factually correct and to take part in independent testing that would hopefully give confidence to policymakers and the public that live facial recognition technology is not only safe and effective to use, but that it also greatly assists officers in taking known offenders off the streets.”

Recent statistics revealed by the Met show that 275 known suspects were arrested in the first six months of this year. For Roberts, these are people who would not necessarily have been picked up by police on the ground.

“When you have busy public spaces, identifying suspects, if all you have is a photo, sometimes an old photo, is incredibly challenging,” he says. “The AI can do that for you, even allowing for certain changes, for example the ageing process; it can identify the feature points on a face, and the relationships between our eyes, ears, nose and brow. 

“So, even if facial hair has changed, or other organic changes have taken place, actually the structure of the face and the relationships between feature points are still strong enough that we can get a match.”
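The matching Roberts describes – reducing a face to its feature points and the relationships between them, then comparing against a database – can be illustrated with a minimal sketch. The embeddings, names and threshold below are purely hypothetical and are not NEC’s algorithm or API; they simply show how a “high degree of similarity” score might be computed and flagged.

```python
import math

def cosine_similarity(a, b):
    """Similarity between two face embeddings, in the range [-1, 1]."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def candidate_matches(probe, watchlist, threshold=0.9):
    """Return watchlist entries with a 'high degree of similarity'.

    The system only flags candidates; the decision to act remains
    with the officer reviewing the alert.
    """
    return [name for name, emb in watchlist
            if cosine_similarity(probe, emb) >= threshold]

# Toy example: a slightly changed face (ageing, facial hair) still sits
# close to its stored embedding, while an unrelated face does not.
watchlist = [("suspect_a", [0.9, 0.1, 0.4]),
             ("suspect_b", [-0.2, 0.8, 0.1])]
probe = [0.88, 0.15, 0.38]  # same underlying structure as suspect_a
print(candidate_matches(probe, watchlist))  # → ['suspect_a']
```

The threshold is the operational lever here: raising it trades fewer false alerts against the risk of missing genuine matches, which is why independent testing of error rates matters.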

The underlying algorithm for NEC’s software, developed by AI experts in Japan, has been thoroughly tested by both the National Physical Laboratory (NPL) in the UK and the National Institute of Standards and Technology in the US. Its NeoFace platform has, in fact, been validated to achieve true-positive identification rates of 100 per cent by the NPL for retrospective facial recognition and 89 per cent for live facial recognition.

Encouragingly, the tests also found no difference in the matching rate across ethnic groups, addressing concerns about bias according to skin colour. For Roberts, the fact that NEC removed those biases from its algorithm, which not all suppliers have managed to do, was an important step forward in proving that the technology can work in the real world, safely and ethically.

Accuracy, in general, has also improved. At a House of Lords inquiry in December, Mark Travis, the senior responsible officer for facial recognition at South Wales Police, said the platform had been deployed 14 times that year, and out of 819,943 people reviewed by the force, they had observed an error rate of zero.

“I think once the evidence is laid out, and people can be assured that the algorithm hasn’t just been tested on a small set of images – but tens of thousands of images – then the facts speak for themselves,” says Roberts, who gave evidence at the same inquiry.

“And we mustn’t forget, we are not just letting an algorithm do the police’s work for them. The system matches someone’s face against a database and expresses it as being a ‘high degree of similarity’. The technology is there to assist the officer, not replace them, and the decision to make an arrest remains with the officer.”

And there’s reassurance for people who fear mass surveillance and data harvesting, too. The NeoFace platform automatically pixelates the face of anyone who isn’t on the operation’s database, so in a crowd, the officer watching the screen in a van can see only people of interest. 

For the vast majority of the public, their faces are automatically deleted from the system and not stored on police computer systems. “The use of the system has been operationally scrutinised to the highest levels,” says Roberts. 

“Many of the people on the databases will have had their images taken whilst in custody, and police may have determined to use the system in scenarios where it is thought that those people, who may have a warrant out against them, could appear. So, it is not just a blanket approach that is taken. It is a selective, targeted approach.”

As the technology comes to be debated in Scotland, these are important markers for Roberts in shaping the discussion. “We’re aware that there have been some fixed positions, and to an extent you may not be able to change everyone’s minds, but it’s important to emphasise that nothing – data wise – is stored. 

“It is instantly deleted. And for the five or six people next to a person in a crowd who perhaps is on a database, they will not be visible to the operator viewing them on screen: they will be blurred out.”
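The privacy behaviour described above – only watchlist matches are visible to the operator, everyone else is blurred and not retained – can be sketched as a simple per-frame decision. The function and data structures below are illustrative assumptions, not NEC’s implementation.

```python
def redact_frame(detections, watchlist_ids):
    """Decide, for each detected face in a frame, whether the operator sees it.

    detections    -- list of (face_id, matched_id or None) tuples
    watchlist_ids -- identities on this operation's database
    """
    display = []
    for face_id, matched_id in detections:
        if matched_id in watchlist_ids:
            display.append((face_id, "show"))  # person of interest
        else:
            display.append((face_id, "blur"))  # blurred on screen, not stored
    return display

# Five bystanders around one watchlist match: only the match is visible.
frame = [(i, None) for i in range(5)] + [(5, "suspect_a")]
print(redact_frame(frame, {"suspect_a"}))
```

The key design point is that redaction happens before the operator ever sees the frame, so non-matching faces are never exposed or retained downstream.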

While the Met and South Wales Police are the principal users of the system, there is a shared “mutual aid” agreement with other forces in the UK, allowing officers in different geographies to trial the technology. That has not happened in Scotland – at least not in the live facial recognition sense – but there are other forces now on the procurement journey for LFR platforms. 

The Eastern grouping of police forces, encompassing Kent, Essex, Bedfordshire, Cambridgeshire, Hertfordshire, Norfolk and Suffolk, is working as a collective to adopt the technology, albeit with a different supplier. “As their comfort with it increases, and they realise the benefits of the technology, I think that adoption will grow across the country,” says Roberts. 

“And that applies to the way they bring local communities, and ethics groups, into the conversation as well.”

All this comes against a backdrop of steadily improving technology, and its use in other settings. “We’ve seen airport queues reduce as a result of the introduction of e-gates, where passengers have their passports scanned against their faces for a match,” adds Roberts. “These are accepted uses of the technology which people have been largely supportive of.”

As for future developments, Roberts says advances in camera technology and AI are much like the next iteration of an iPhone. “Every time a new model comes out, you expect it to have better features, to be able to do more things. 

“It’s the same with CCTV cameras, or public cameras in general, where the resolution gets better. But that’s also applicable to the AI in the software: even where the picture is grainy or it’s raining, or there’s low light, algorithms can be tweaked to account for extremes of image quality.”

He says: “It can already account for facial markings like tattoos, and can recognise people wearing masks, despite them obscuring the lower part of their face. Because it was developed in Japan, where mask wearing has been a feature of society for a long time, the tolerances for that have already been built into the algorithm. 

“We expect that as more forces get comfortable with using entry-level AI capabilities, for example with retrospective facial recognition technology, then the appetite will only grow.” 


Partner Content in association with NEC Software Solutions