Online platforms and biometric surveillance: How the UK government weaponizes safety


The Human Rights Act 1998, which gives domestic effect to the protections enshrined in Article 8 of the European Convention on Human Rights (ECHR), safeguards the right to privacy in the UK. Despite this, the UK government is set to spend its final months in power weakening the legal protections that uphold this fundamental right, allowing emerging technologies to be deployed without adequate safeguards and setting a dangerous precedent for state surveillance globally.

 

Online surveillance 

End-to-end encryption scrambles data, such as private messages, so that only the sender and the intended recipient can read it, preventing access by any third party, including the service provider carrying the message. This protection is crucial for the fundamental right to privacy online, but the UK government has continued to exploit concerns surrounding online child sexual abuse and terrorism to introduce new laws that weaken it for all users.
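To make the mechanism concrete, the sketch below illustrates the end-to-end principle in Python using the PyNaCl library. The library choice, key names, and message are illustrative assumptions for this example only, not a description of any particular messaging app; real services such as WhatsApp and Signal layer far more elaborate protocols (key exchange, forward secrecy) on top of the same basic idea.

```python
# A minimal sketch of the end-to-end principle using PyNaCl (libsodium bindings).
# Only the two endpoints ever hold the private keys needed to read the message;
# any server relaying the ciphertext sees only scrambled bytes.
from nacl.public import PrivateKey, Box

# Each user generates a key pair on their own device; private keys never leave it.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Alice encrypts for Bob with her private key and Bob's public key.
ciphertext = Box(alice_private, bob_private.public_key).encrypt(
    b"Meet at the usual place at noon."
)

# An intermediary (or anyone compelled to hand over stored traffic) holding
# `ciphertext` alone cannot recover the plaintext without Bob's private key.

# Bob decrypts with his private key and Alice's public key.
plaintext = Box(bob_private, alice_private.public_key).decrypt(ciphertext)
print(plaintext.decode())  # Meet at the usual place at noon.
```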

To “respond to threats to the public and keep apace with evolving technology,” the government announced it would amend the Investigatory Powers Act, the 2016 legislation that introduced obligations on communication providers to retain customer connection records for up to 12 months. Through secondary legislation introduced in November 2023, the government is now proposing to require technology companies, including those based overseas, to inform the government of any plans to amend security or privacy measures, and to give the Home Office the power to compel companies to remove those measures. Such a move would not only weaken the security of these services, raising concerns about the vulnerability of stored user data, but also threaten to transform private companies into surveillance tools of the state—an accusation that the UK government has itself leveled against authoritarian regimes like China.

Measures weakening online privacy and freedom of speech, however, were already passed into law in 2023 with the Online Safety Act, which places legal obligations on social media platforms to remove illegal or harmful content. The messaging platforms WhatsApp and Signal both threatened to withdraw from the UK over clause 122 of the act, which initially sought to compel companies to scan users’ messages for illegal material, supposedly without compromising user privacy. While the government later admitted that the technology needed to achieve this does not actually exist, the final clause could still compel companies to identify particular kinds of content if they receive notice from Ofcom, the communications regulator. The Electronic Frontier Foundation warns that this requirement could “undermine not just the privacy and security of UK residents, but internet users worldwide.” With the government centering child safety as a core aspect of the new law and accusing platforms of “empowering child sexual abusers” by rolling out encryption, privacy activists face an uncomfortable fight but must continue working with trusted industry and security experts to challenge the government’s overreach.

With platforms facing punishment for failing to comply—including prison time for senior managers—the new act also creates a risk of over-compliance, whereby platforms, likely relying on automated detection software, censor content before it is even posted. Crucially, the act sets a precedent for other governments—authoritarian or democratic—to impose similar restrictions, compromising the safety of those who rely on private communications, including journalists, activists, and even child abuse victims.

 

Biometric surveillance

Offline, the government is exploiting the patchwork of overlapping data protection laws that govern biometric surveillance to roll out live facial recognition (LFR) technology, raising serious concerns about the right to privacy in public spaces. While a 2022 report by the Ada Lovelace Institute found legal uncertainty surrounding the use of LFR, Policing Minister Chris Philp has urged police chiefs to double the number of facial recognition searches and has worked with the Home Office to push the technology into high street shops and supermarkets.

Instead of establishing the regulatory framework necessary to protect citizens’ data from new and intrusive technologies, the government has been actively encouraging the use of LFR to “protect the public” while ignoring issues such as potential racial bias. In London alone, Met Police statistics show that in 2023 the force scanned 366,156 faces across 34 deployments. Additionally, the technology has been discreetly used at a host of public events, from the king’s coronation to Beyoncé’s Renaissance tour.

While the Home Office claims that biometric data is “deleted immediately and automatically” if not matched, the former biometrics and surveillance camera commissioner, Professor Fraser Sampson, has accused police forces of holding images of more than 3 million UK citizens, describing the UK as an “omni-surveillance” society.

Human Rights Watch further warns that this technology is being increasingly used to surveil protests, curb political dissent, and exacerbate existing structural inequalities. Without adequate oversight, LFR technology risks weakening fundamental rights in the UK, including the right to protest and freedom from discrimination, while emboldening what human rights group Liberty has described as “the most intrusive mass surveillance regime of any democratic country.”

 

Conclusion

It is vital that international human rights norms guide the development of regulations to ensure that emerging technologies work to safeguard freedoms, not curtail them. The UK’s weaponization of safety to expand its surveillance apparatus has global implications: it risks weakening online security while legitimizing increased data collection.

Over 180 rights groups and tech experts have already called for a global halt to facial recognition surveillance, warning of the technology’s discriminatory and human rights impacts as well as its lack of a sufficient legal and democratic mandate. These efforts must continue to uphold the right to private conversation and to privacy in public, not just for those seeking to expose injustice but for our collective global safety.