PRESS RELEASE: Police urged to double AI-enabled facial recognition searches [October 2023]
This press release was issued by the Home Office on 29 October 2023.
Policing Minister challenges police to double retrospective facial recognition searches to track down known offenders by May 2024.
Police should double the number of searches they make using retrospective facial recognition technology to track down known offenders by May 2024, as the Policing Minister urges forces to increase their use of this artificial intelligence (AI) crime-fighting tool.
In a letter to police chiefs, Chris Philp has set out the importance of police harnessing the benefits of innovative technologies to support them in preventing and solving crimes, as well as to keep pace with the changing nature of criminal activity.
He notes that, with a concerted effort from all forces in England and Wales, it will be possible to exceed 200,000 facial recognition searches of still images against the Police National Database by early summer, helping to catch perpetrators and keep our streets safe.
The minister also encourages the police to make wider use of live facial recognition. This state-of-the-art technology captures live footage of crowds and compares it with a watch list of suspects wanted by the police who pose a risk of harm to others. When there is a match, an alert goes out to nearby police officers. Not only does this allow police to quickly identify suspects in a dense crowd, it can also have a strong deterrent effect.
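To illustrate the mechanics described above, the sketch below shows a simplified version of that watch-list comparison: each detected face is reduced to a numerical embedding, compared against the embeddings of people on a watch list, and either raises an alert or is discarded straight away. It is an illustrative toy only; the embedding model, the similarity threshold and the data structures are assumptions and do not reflect the systems forces actually deploy.

```python
# A minimal, illustrative sketch of a live facial recognition watch-list check.
# The embedding model, threshold and watch-list format below are assumptions
# made for illustration; they are not the systems police forces deploy.
import numpy as np

MATCH_THRESHOLD = 0.8  # assumed similarity cut-off for raising an alert


def cosine_similarity(a, b):
    """Cosine similarity between two face-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def check_against_watchlist(face_embedding, watchlist):
    """Return the watch-list ID of the closest match above threshold, else None."""
    best_id, best_score = None, -1.0
    for person_id, reference_embedding in watchlist.items():
        score = cosine_similarity(face_embedding, reference_embedding)
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id if best_score >= MATCH_THRESHOLD else None


# Toy usage: random vectors stand in for real face embeddings.
rng = np.random.default_rng(seed=0)
watchlist = {
    "SUSPECT-001": rng.normal(size=128),
    "SUSPECT-002": rng.normal(size=128),
}
detected_face = rng.normal(size=128)

match = check_against_watchlist(detected_face, watchlist)
if match is not None:
    print(f"Alert nearby officers: possible match for {match}")
else:
    detected_face = None  # no match: the captured data is discarded immediately
```

The discard step in this toy mirrors the point made later in this release: if the system does not make a match, the captured data is deleted immediately and automatically.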
AI technology, such as facial recognition, can help the police quickly and accurately identify those wanted for serious crimes, as well as missing people. It also frees up police time and resources, meaning more officers can be out on the beat, engaging with communities and carrying out complex investigations.
Police use a range of other AI programmes to support their role in keeping the public safe, including tools which help speed up the investigation of digital evidence and the redaction of evidence files, and tools which take on back-office tasks, freeing up officers’ time.
Crime and Policing Minister Chris Philp said:
AI technology is a powerful tool for good, with huge opportunities to advance policing and cut crime. We are committed to making sure police have the systems they need to solve and prevent crimes, bring offenders to justice, and protect the public.
Facial recognition, including live facial recognition, has a sound legal basis that has been confirmed by the courts and has already enabled a large number of serious criminals to be caught, including for murder and sexual offences.
This is not about acquiring new kit and deploying new tech for the sake of it; it is about staying one step ahead of criminals, delivering smarter, more effective policing and, ultimately, making our streets safer.
We know these technologies work in catching criminals. Craig Walters was jailed for life in 2021 after attacking a woman he followed off a bus. He was arrested within 48 hours of the incident thanks to South Wales Police using CCTV footage to identify him. A murder suspect in Coventry was apprehended after images, taken by a member of the public in a nightclub where the incident occurred, were matched to a known individual.
Facial recognition is also being used to tackle shoplifting, with the Retail Crime Action Plan setting out advice for retailers on how to provide the best possible evidence for police to pursue a case, including CCTV footage of the whole incident and an image of the shoplifter.
Live facial recognition technology has also been used successfully, including at last month’s Arsenal v Tottenham north London derby, where police caught three wanted suspects, including one wanted for sexual offences. Another wanted sex offender was identified at the King’s Coronation and sent back to prison the same day.
To ensure transparency with the public, the police will put up notices in areas where they will be using live facial recognition. If the system does not make a match against a watch list, a person’s data is deleted immediately and automatically. Anyone caught with the help of facial recognition and then charged would still face trial in the normal way.
The accuracy of facial recognition technology has developed rapidly. An independent study by the National Physical Laboratory of the algorithm used by the Met and South Wales Police found that the technology was 100% accurate when used on still images and produced false alerts at a rate of only 1 in 6,000 when used on live images. The police have had no false alerts this year across 25 deployments. The study also found no statistically significant differences in performance based on gender or ethnicity at the settings the police use.
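To put the 1-in-6,000 figure in context, the short calculation below shows the number of false alerts that rate would imply at different deployment sizes, on the simplifying assumption (made here for illustration only) that the rate applies per face compared against the watch list.

```python
# Illustrative arithmetic only: expected false alerts implied by a published
# rate of 1 in 6,000, assuming the rate applies per face compared.
FALSE_ALERT_RATE = 1 / 6000

for faces_compared in (6_000, 60_000, 300_000):
    expected = faces_compared * FALSE_ALERT_RATE
    print(f"{faces_compared:>7,} faces compared -> roughly {expected:.0f} expected false alert(s)")
```

On that simplified reading, the expected number of false alerts scales linearly with the number of comparisons made during a deployment.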
Facial recognition use is strictly governed by data protection, equality, and human rights laws, and can only be used for a policing purpose where it is necessary and proportionate. The College of Policing also sets clear guidance on when officers can use live facial recognition and requires that a person’s data is automatically deleted if the system does not match it to the watchlist of suspects.
The government has invested in and continues to build on a tool which uses AI to help officers identify and grade child sexual abuse material more quickly. It highlights images of interest for officers to focus on to aid investigations, enabling them to more rapidly identify and safeguard children, as well as identify offenders. It also supports police officer welfare by reducing officers’ prolonged exposure to indecent images. This is in addition to other tools already in use, for example facial matching technology, and others in development which will use AI to safeguard children and identify perpetrators more quickly.
The government is also supporting industry innovation to tackle the threat from AI generated child sexual abuse images, recognising that criminals are also exploiting the technology. Last month, the UK and US issued a joint statement in which they committed to working together to explore the development of new solutions to fight the spread of AI-generated child sexual abuse imagery.
The Home Secretary is also convening an event on Monday which will bring together government, law enforcement and the tech industry to discuss how best to tackle child sexual abuse images which have been created using AI.
It comes as the government, whilst recognising the significant benefits of AI, is taking a leading role in ensuring we are researching and investing in appropriate safety measures. The UK is hosting the first ever major global AI Safety Summit next week at Bletchley Park, supported by the Frontier AI Taskforce which was created with £100m of initial funding to spearhead the country’s leadership in this area.