Britain acknowledged potential technical hurdles in its planned crackdown on illegal online content after encrypted messaging companies including WhatsApp threatened to withdraw their services from the country.
Regulator Ofcom can only force tech companies to scan platforms for illegal content, such as images of child sexual abuse, if it is “technically feasible”, culture minister Stephen Parkinson told the House of Lords on Wednesday as the chamber debated the Online Safety Bill. He said the watchdog would work closely with companies to develop or source new solutions.
“If there is no suitable technology that meets these requirements, Ofcom cannot require its use,” Parkinson said. Ofcom “cannot require companies to use proactive technology on private communications in order to comply” with the bill’s safety duties.
The comments are intended to allay concerns from tech companies that scanning their platforms for illegal content could compromise the privacy and encryption of user data, giving hackers and spies a backdoor into private communications. In March, Meta Platforms’ WhatsApp even threatened to withdraw from Britain.
“Today it appears that the Department for Science, Innovation and Technology is offering some wording to the messaging companies to allow them to save face and back down from their threats to leave Britain, their second-largest market in the G7,” said Andy Burrows, a tech accountability campaigner who previously worked for the National Society for the Prevention of Cruelty to Children.
Protecting children
The far-reaching legislation – which aims to make the internet safer – is in its final parliamentary stages after six years of development. Parkinson said Ofcom could nevertheless require companies to “develop or acquire a new solution” to enable them to comply with the law.
“It is right that Ofcom should be able to require technology companies to use their significant resources and expertise to develop the best possible protection for children in encrypted environments,” he said.
Meredith Whittaker, president of encrypted messaging app Signal, had previously welcomed a Financial Times report suggesting the government was pulling back from its standoff with tech companies, which cited anonymous officials as saying no service exists today that can scan messages without undermining privacy.
However, Security Minister Tom Tugendhat and a government spokesman said it was wrong to suggest the policy had changed.
Feasibility
“As has always been the case, as a last resort, on a case-by-case basis and only where strict privacy safeguards are met, Ofcom will be able to direct companies to either use, or make best efforts to develop or source, technology to identify and remove illegal child sexual abuse content – which we know can be developed,” the spokesperson said.
Ministers met major tech companies including TikTok and Meta in Westminster on Tuesday.
Language around technical feasibility has been used by the government in the past. In July, Parkinson told parliament: “Ofcom can only require the use of technology on an end-to-end encrypted service if this is technically feasible.”
The NSPCC, a strong supporter of the crackdown in Britain, said the government’s statement “reinforces the status quo in the bill and the legal requirements for tech companies remain the same.”
Accredited technology
Ultimately, the wording of the legislation leaves it to the government to decide what is technically feasible.
Once the bill comes into force, Ofcom could send a company a notice requiring it to “use accredited technology” to identify and prevent child sexual abuse or terrorist content, or face fines, according to the version of the bill published in July. No accredited technology currently exists because the process of identifying and approving services only begins once the bill becomes law.
Previous attempts to solve this dilemma revolved around so-called client-side or device-side scanning. But in 2021, Apple Inc. shelved such a system, which would have scanned photos on devices for signs of child sexual abuse, after fierce criticism from privacy advocates, who feared it would pave the way for other forms of tracking.
Andy Yen, founder and CEO of privacy-focused VPN and messaging company Proton, said: “As it stands, the bill still allows for the imposition of a legally binding obligation to scan end-to-end encrypted messages in Britain, which undermines citizens’ fundamental right to privacy, and lets the government determine what is ‘technically feasible’.”
“For all the good intentions of today’s statement, without additional safeguards in the Online Safety Bill, all it would take is for a future government to change its mind and we would be back where we started,” he said.
© 2023 Bloomberg L.P.