Over the course of 2023, there has been a concerted effort by the UK government and law enforcement to push more digital offerings into policing, including hyperscale public cloud infrastructure, various facial-recognition technologies and integrated record management systems.
However, Computer Weekly’s coverage reflects the contentious nature of many of these deployments, which are often plagued by data protection issues and a lack of effective transparency or oversight.
In April, for example, Computer Weekly revealed potentially unlawful data processing and storage by Police Scotland in its cloud-based digital evidence sharing capability (DESC) system, which prompted regulatory action by the Scottish biometrics commissioner.
Computer Weekly’s coverage also focused extensively on the proliferation of facial-recognition and biometric data throughout policing, and the continued lack of clear biometric oversight and frameworks despite government assurances to the contrary.
At the start of April, Computer Weekly revealed that a cloud-based digital evidence-sharing system was being piloted by Police Scotland despite major data protection concerns raised by watchdogs about how the use of Microsoft Azure could be putting people’s sensitive personal data at risk.
According to a data protection impact assessment (DPIA) for DESC – disclosed to independent security consultant Owen Sayers via FOI before being shared with Computer Weekly – major problems with the system included: the potential for US government access under the Cloud Act, which effectively gives the US government access to any data stored in the cloud by US corporations, wherever it is held; Microsoft’s use of generic, rather than specific, contracts; and Axon’s inability to comply with contractual clauses around data sovereignty.
As a result of Computer Weekly’s coverage, Scottish biometrics commissioner Brian Plastow served Police Scotland (the lead data controller for the system) with a formal information notice on 22 April 2023, requiring the force to demonstrate that its use of the system is compliant with Part Three of the Data Protection Act 2018 (DPA 18), which contains the UK’s law enforcement-specific data protection rules.
Responding to the commissioner’s formal notice, Police Scotland confirmed in July 2023 that it had “uploaded significant volumes of images to DESC during this pilot”. Plastow later noted that while the force’s response was helpful, it “did not ameliorate my specific concerns” around the uploading of sensitive biometric data to DESC.
He further reiterated that his office will be conducting a “separate but related assurance review” on Police Scotland’s handling of biometric data in winter 2023 to see whether it complies with the statutory code of practice in Scotland.
The unfolding story around DESC was also picked up by national press in Scotland, but without any attribution to Computer Weekly.
In January, Newham Council voted unanimously in favour of a motion to suspend the use of live facial recognition (LFR) by police in the East London borough.
It mandated that the council’s chief executive write to the Home Office, the Mayor of London and the Metropolitan Police to make its opposition to LFR technology clear, and to request the suspension of LFR use throughout the borough, at least until sufficient biometric regulations and anti-discrimination safeguards are in place.
While Newham as a local council does not have the power to halt LFR deployments throughout the borough itself, Areeq Chowdhury, Labour councillor for Canning Town North, said he hoped it would increase pressure on the government to introduce a national moratorium on police use of the technology.
Responding to the motion, as well as to questions from Computer Weekly about whether it intends to suspend the use of LFR by police in Newham given the lack of consent from the council, the Home Office said the technology plays “a crucial role in helping the police tackle serious offences including knife crime, rape, child sexual exploitation and terrorism”.
When asked if it was able to provide any evidence that LFR had led to arrests for the serious offences it listed, the Home Office said the Met would be best placed to answer as operational leads on the technology.
The Met confirmed that no arrests had been made for those offences up to that point, adding that it deploys LFR “based on a specific intelligence case and with a focus on locating those people who pose a serious risk to the public” but who are difficult to find.
In February, then-biometrics commissioner for England and Wales Fraser Sampson called for clear, comprehensive and coherent frameworks to regulate police use of AI and biometrics in the UK, after publishing a report which described a lack of transparency and accountability around its use.
It also noted the absence of any express requirement for police forces to demonstrate why, and evidence how, their deployments are necessary and proportionate.
Sampson added that he is particularly concerned about the potential for retrospective use of the technology to locate witnesses, as outlined in guidance published by the College of Policing in March 2022, which suggested that witnesses of crime, as well as victims, could be included in facial-recognition watchlists.
He said that any instances where retrospective facial recognition might “legitimately make a significant contribution”, such as in the wake of terrorist attacks or natural disasters, are “mercifully rare and wholly exceptional”.
Sampson also noted that the vast majority of the UK’s biometric surveillance capability is privately owned, and can only be accessed under contractual arrangements between policing bodies and the private sector.
In an appearance before Parliament’s Joint Committee on Human Rights (JCHR) the same month, Sampson told MPs and Lords there was a “culture of retention” around biometric data in UK policing, which has the potential to massively damage public trust.
Noting the proliferation of intrusive surveillance techniques, Sampson said there are a number of human rights concerns around, for example, bias and discrimination against groups or individuals, privacy, freedom of movement, and freedom of assembly or speech.
He also highlighted the ongoing unlawful retention of millions of custody images by the Home Office, despite a 2012 High Court ruling ordering it to destroy them.
“I’m here today saying there are probably several million of those records still,” he said, adding that the response from policing bodies and the Home Office (which owns most of the biometric database used by UK police) is to point out the information is held on a database with no bulk deletion capability.
“I’m not sure that works for public trust and confidence, but even if it did … you can’t [legally] rely on a flaw in a database you built for unlawfully retaining stuff … that’s a technical problem that’s of the country’s and the police’s making rather than the people whose images you’ve kept.”
In April, both the Metropolitan and South Wales Police reaffirmed their commitments to using facial-recognition technologies, after research commissioned by the forces found a “substantial improvement” in the accuracy of their systems, but only if certain settings are used.
Conducted by the National Physical Laboratory (NPL), the research tested the facial detection and recognition algorithms being used by both forces, and found there is “no statistical significance between demographic performance”.
It specifically found that when deploying the NeoFace V4 facial-recognition software provided by Japanese biometrics firm NEC, the two police forces can achieve “equitable” outcomes across gender and ethnicity by setting the “face-match threshold” to 0.6 (where zero indicates the lowest similarity and one the highest).
Despite the improved accuracy of the NeoFace system, civil society groups maintain that the technology is “discriminatory and oppressive”.
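In general terms, a face-match threshold works by converting a similarity score between two face embeddings into a binary match decision. The sketch below is purely illustrative and hypothetical – it is not NEC’s implementation, and the function and variable names are assumptions – but it shows how a configurable threshold such as 0.6 determines whether a probe image counts as a watchlist match.

```python
# Hypothetical illustration of threshold-based face matching (not NEC's
# actual algorithm): a probe embedding is compared against a candidate
# embedding, and a match is declared only if the similarity score meets
# or exceeds the configured face-match threshold.
from math import sqrt

FACE_MATCH_THRESHOLD = 0.6  # the setting the NPL research examined

def cosine_similarity(a, b):
    """Similarity score for two embedding vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def is_match(probe, candidate, threshold=FACE_MATCH_THRESHOLD):
    """Return True only if similarity meets or exceeds the threshold."""
    return cosine_similarity(probe, candidate) >= threshold
```

Raising the threshold trades recall for precision: fewer candidates clear the bar, reducing false matches at the cost of missing some true ones, which is why the specific operating point chosen matters for demographic parity.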
Computer Weekly revealed in August, based on a Freedom of Information disclosure on the Met Police’s website, that the force’s integrated record management system Connect is nearly £64m over budget and still facing major teething problems, with officers and staff raising more than 25,000 support requests in its first four months of operation.
Responding to the FOI’s findings, Caroline Russell, chair of the London Assembly’s Police and Crime Committee, said: “The Met needs to get the basics right to help officers to do their job. They should be fighting crime, not computer systems.”
Computer Weekly was told that a further FOI request would need to be submitted for additional information, including when the Met expects the operational efficiencies to be delivered and whether there was any forecasting of how many support requests were initially expected under Connect.
Computer Weekly also separately revealed that the system was deployed in November 2022 despite multiple data protection “compliance issues” that would inhibit its ability to retrieve data, meet its statutory logging requirements, and respond to subject access requests.
Computer Weekly also contacted the ICO about the open discussion of data protection issues between the Met and the Mayor’s Office for Police and Crime, and asked for clarification on whether the data regulator was made aware of the issues or otherwise approached by any of the bodies involved in rolling out Connect.
The ICO press office said it was aware of ongoing compliance issues but had only limited engagement with the force. Its FOI team later responded to Computer Weekly’s follow-up question about the extent of its engagement with the Met by stating that “we can neither confirm nor deny whether we have been made aware of the issues that you refer to in your request”.
Policing minister Chris Philp outlined his intention in early October to give police forces access to the UK’s passport database, claiming it will enhance their facial-recognition capabilities to help catch shoplifters and other criminals.
Speaking at a fringe event of the Conservative Party Conference, Philp told attendees that he plans to integrate data from the police national database (PND), the Passport Office and other national databases to help police find a match with the “click of one button”.
According to the 2021 census, just over 86% of the British public hold at least one passport.
The Scottish biometrics commissioner later described the policing minister’s “egregious proposal” to link the UK’s passport database with facial recognition systems as “unethical and potentially unlawful”.
After announcing his resignation from the biometrics and surveillance camera commissioner dual role in August, Sampson spoke to Computer Weekly about his time in office, and warned of the declining state of oversight in these areas.
He said there are real dangers of the UK slipping into an “all-encompassing” surveillance state if concerns about these powerful technologies aren’t heeded, and described the government and police as having a “disconnected approach” to technology that he found shocking.
Sampson further warned that the government’s proposed data reforms will further fracture what is already a very fragmented regulatory landscape, and will particularly weaken already scant oversight of the police’s intrusive surveillance capabilities.
In November, UK police chiefs announced plans to equip officers with a mobile-based facial-recognition tool that will enable them to use their phones to cross-reference photos of suspects against a database of millions of custody images.
Known as operator initiated facial recognition (OIFR), the tool uses software supplier NEC’s NeoFace facial-recognition algorithm, and is currently being jointly trialled by South Wales, Gwent and Cheshire police.
The National Police Chiefs’ Council (NPCC) has said the PND-linked tool will be rolled out nationwide in 2024, and that it has further plans to increase the police’s use of retrospective facial-recognition (RFR) software by 100% before May that year.
This is in line with wider efforts to push the mass adoption of facial-recognition tools in UK law enforcement, including policing minister Philp’s intention to roll out the technology to every force in England and Wales, as well as to integrate it with police body-worn video cameras.
On 12 December, the Lords Justice and Home Affairs Committee (JHAC) – which has launched a short follow-up inquiry into the use of artificial intelligence by UK police, this time looking specifically at live facial recognition (LFR) – heard from senior Metropolitan Police and South Wales Police officers about the improving accuracy of the technology, as well as how both forces are managing their deployments.
Claiming there was a “very clear focus” on the most serious criminality, they also told the Lords about the operational benefits of LFR technology, which include the ability to find people they otherwise would not be able to, and its use as a preventative measure to deter criminal conduct.
At the same time, they confirmed for the first time that both forces use generic “crime categories” to determine targets for their live facial recognition deployments, bringing into question claims that their use of the technology is concentrated on specific offenders who present the greatest risk to society.
Academic Karen Yeung, an interdisciplinary professorial fellow in law, ethics and informatics at Birmingham Law School, challenged the proportionality and necessity of this approach during the evidence session, claiming the coercive power of the state means police must be able to justify each entry to the watchlists based on the specific circumstances involved, rather than their blanket inclusion via “crime types”.