In 2024, the UK government and law enforcement leaned heavily on technology to enhance efficiency and cut costs. Since the new Labour government took office in July, law and order has been a priority, in line with its promise to “take back our streets.”
This shift has led to a broader rollout of technology in policing, especially camera systems such as facial recognition and cloud-based AI tools, but long-standing concerns about data protection and ethics persist. Coverage from Computer Weekly raised questions about how new data reforms might lower transparency and oversight of police technology, and pointed out that many people in the UK have little say in how their tax money is spent on surveillance technologies in public areas.
In March, then-chancellor Jeremy Hunt allocated £230 million to police forces to begin deploying technologies such as live facial recognition, drones, and AI, all aimed at improving productivity. The government also discussed using automated systems to redact personal information from documents. A further £75 million went to Violence Reduction Units, targeting police resources at areas with the highest crime rates. Yet doubts linger over the legality of UK police’s use of cloud technology and facial recognition, prompting calls for clearer biometric legislation.
In June, Computer Weekly revealed that Microsoft had acknowledged it could not guarantee the sovereignty of UK policing data stored on its global cloud infrastructure. Documents obtained under freedom of information laws showed that Police Scotland’s Digital Evidence Sharing Capability faced serious data privacy issues, with its data processing agreement overlooking UK-specific protection requirements. The documents also showed that international data transfers are built into the design of Microsoft’s cloud, which poses risks for UK government agencies.
February saw the Metropolitan Police dismantle its controversial Gangs Violence Matrix, a database criticized for racial bias since its launch in 2012. Investigations revealed that the majority of those tracked were from minority backgrounds, with many labeled low-risk or recorded as victims rather than perpetrators of violence. Despite the database’s retirement, there are worries that whatever replaces the GVM will repeat past mistakes.
In November, Prime Minister Keir Starmer announced an additional £75 million for the new Border Security Command (BSC), ramping up surveillance efforts against people smuggling gangs. This funding adds to an earlier commitment of the same amount in September. Critics, however, argue that enforcing stricter border controls could endanger vulnerable individuals seeking refuge.
In November, MPs debated the use of live facial recognition technology for the first time since the Metropolitan Police began deploying it in 2016. Speakers expressed a clear need for specific regulations governing its use, as concerns about privacy, bias, and the lack of a legal framework dominated the discussion. Many felt scrutiny of the technology was long overdue.
Nine police forces are looking to shift their records management systems to a cloud-based solution. However, experts warn that this move, especially if the forces choose a US provider, could lead to serious data protection issues. The government’s proposed data reforms, meant to ease some of these concerns, may instead compromise the UK’s ability to maintain data protection standards the EU deems adequate.
In Lewisham, the Metropolitan Police claimed community support for its facial recognition deployments, yet a community impact assessment indicated minimal communication with residents. Even local councillors voiced frustrations over the lack of engagement prior to the technology rollout.
A Metropolitan Police officer faced dismissal for improperly accessing sensitive files related to the Sarah Everard case while off duty. This incident raised alarms about compliance with legal requirements for accessing sensitive information, especially as the government’s upcoming data reforms intend to ease restrictions on police data handling.
Civil rights organizations submitted a report to the UN highlighting how the use of AI in policing is eroding civil rights for people of color. They called for transparency in AI use and condemned practices like predictive policing, which often exacerbate racial profiling and discrimination.
In response to violent riots fueled by far-right groups in August 2024, Starmer introduced the National Violent Disorder Programme, aimed at swiftly addressing violent groups through enhanced policing and intelligence sharing. However, critics are concerned that this initiative might lead to increased surveillance, particularly targeting peaceful protests and movements challenging state policies.