The Online Safety Bill: Clauses, Backdoors and a Great Britain Without WhatsApp
The Online Safety Bill has laudable aims, but experts believe it will make UK businesses a huge target by undermining encryption. Phil Muncaster investigates
Author: Phil Muncaster
It’s not often a government formulates a policy so misguided that it galvanises some of the biggest names in tech to join forces. But on privacy, the British government has managed it, aligning itself more closely with China than with the Western liberal democracies it purports to lead.
While we usually operate a low-tolerance policy for acronyms on Assured Intelligence, the nature of legislation means it’s almost impossible to write about this topic without abbreviations. So we’re making an exception and hope you can forgive us.
The Online Safety Bill (OSB), which the House of Lords was reviewing at the time of writing, is a mammoth piece of legislation. But it’s Clause 110 that has united the CEOs of some of the world’s most popular messaging services in condemnation. They argue that, in trying to combat child sexual exploitation and abuse (CSEA) content sent via end-to-end encrypted (E2EE) services, the government risks undermining security and privacy for everyone – without really tackling the underlying problem.
Redefining regulation
The Online Safety Bill ran to 225 pages and 194 clauses at the last count. Described in the usual post-Brexit fashion as “world-leading”, it’s certainly ambitious. A product of Boris Johnson’s quixotic and chaotic time as prime minister, the bill seeks nothing short of redefining internet regulation for the 21st century.
The central premise remains: to impose a “duty of care” on the world’s big tech firms, mainly social media companies. Free speech be damned: if the government deems content “harmful” enough, the platforms should be forced to remove it, the argument goes. However, what constitutes harm is where the problem lies. The same ambiguity dogs Clause 110, which covers the monitoring of private messages.
Referred to as the “spy clause” by the Open Rights Group, Clause 110 empowers Ofcom to force tech companies to use accredited technology to identify and rapidly take down CSEA content. Crucially, private messaging services are also covered by the clause. As they currently use E2EE, which ensures messages can only be read by sender and receiver, the bill would in effect require them to break this protection.
According to the messaging app CEOs, this would “open the door to routine, general and indiscriminate surveillance of personal messages of friends, family members, employees, executives, journalists, human rights activists and even politicians themselves, which would fundamentally undermine everyone’s ability to communicate securely.”
Why it can’t be done
Governments and law enforcement agencies have been trying to force tech companies to undermine encryption in their products for years. In 2016, Apple stood its ground against the FBI, despite a court order demanding it unlock a mass shooter’s phone, and was backed by companies across the tech spectrum. In the end, the FBI found a different way to hack the device. That outcome confirmed some privacy experts’ belief that the calls for encryption backdoors are more about making life easier for law enforcement than protecting the public.
Yet still, the demands keep coming. In one corner are governments and law enforcers who say that tech companies should be able to find a way of cracking encryption in isolated circumstances without undermining security for all users. In the other corner are world-leading cryptography experts who maintain it absolutely can’t be done. Which side are you on?
Now, the debate has moved from backdoors to another workaround: client-side scanning (systems that scan message content for similarities to a database of objectionable content). A paper published by the UK National Cyber Security Centre’s technical director, Ian Levy, suggests this could be the answer for the proponents of the Online Safety Bill. He argues that the technology would run locally on devices, using AI to scan for keywords and images indicative of CSEA, or for matches against a database of known illegal content. Crucially, the scan would take place before a message is encrypted and sent. It’s similar to a solution Apple proposed and then quietly shelved after much uproar a couple of years ago.
There are several problems with this ‘solution’, according to experts. Researchers at Imperial College London calculated it would create far too many false positives to be practically useful. They also demonstrated how the tech could be circumvented: bad actors need only change their images slightly so they no longer produce a match with those stored in a CSEA database.
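The evasion problem is easy to illustrate. The toy Python sketch below is a deliberate simplification: real systems use perceptual hashes (such as Microsoft’s PhotoDNA or Apple’s NeuralHash) rather than exact cryptographic digests, but researchers have shown those can be defeated by similar small perturbations. All names here are illustrative, not from any real scanning system.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Toy stand-in for a perceptual hash: an exact SHA-256 digest."""
    return hashlib.sha256(data).hexdigest()

# A blocklist of fingerprints of known illegal content (dummy bytes here).
known_bad = {fingerprint(b"known-illegal-image-bytes")}

def client_side_scan(message: bytes) -> bool:
    """Runs on the device *before* encryption; True means 'block and report'."""
    return fingerprint(message) in known_bad

original = b"known-illegal-image-bytes"
perturbed = original + b"\x00"  # a trivial one-byte tweak to the file

print(client_side_scan(original))   # True  -- an exact match is caught
print(client_side_scan(perturbed))  # False -- the tiny change evades the database
```

Perceptual hashes tolerate more variation than this exact-match toy, but the underlying trade-off is the same: loosen the match enough to catch perturbed images and false positives soar; tighten it and trivial edits slip through.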
Matthew Hodgson, CEO and co-founder of encrypted messaging app Element, adds that such a system would also be a magnet for bad actors.
“If an attacker breaches the moderation system – either by hacking into it or by compromising a moderator – they could access unencrypted content which has been identified by the scanning system,” he tells Assured Intelligence. “A more sophisticated attacker could also hijack the opaque scanning logic to locate and access other sensitive data. Alternatively, a child abuser could break into the system to hit the jackpot of potential child abuse material.” This consequence is unthinkable.
A target on our backs
If client-side scanning were chosen as a means to comply with the Online Safety Bill, then every smartphone user with encrypted messaging apps on their phone could be subject to the scanning of messaging content. That’s an obvious red flag for corporate security, says Hodgson.
“Client-side scanning is absolutely a backdoor: one that lets in the bad guys as well as the good guys. The fact the encryption itself remains intact is meaningless if the client scans content being sent or received and sends it to a third party for inspection,” he argues.
It would be a huge target for nation-states and cyber criminals alike, adds Matt Ellison, a cybersecurity specialist at Corelight.
“Given that organisations with the resources of Microsoft and Apple still find their hardware and software hacked and abused on an almost daily basis, how would we expect a government contractor who had won with the lowest bid to be able to guarantee the solution was hack-proof?” he says to Assured Intelligence. “We’re talking about deploying a mass surveillance infrastructure that would give direct, unfettered access to business and personal messages and potentially images, video and audio.”
“How would we expect a government contractor who had won with the lowest bid to be able to guarantee the solution was hack-proof?” Matt Ellison
ESET global security advisor Jake Moore argues that the plans run directly counter to the aims of the GDPR (General Data Protection Regulation), adding that criminals who want to stay hidden from the authorities will always find a way to do so.
“GDPR is specifically designed to keep data private, so it remains impossible to have both this and a backdoor,” Moore tells Assured Intelligence.
“It is essential that phone users’ data is protected and kept safe from prying eyes. Criminals are very impressive at bypassing techniques designed to capture them, and this proposal would simply move them to more underground locations.”
Criminal communication networks such as EncroChat and Sky ECC have flourished in the past as alternative unpoliced channels – serving tens of thousands of users globally before they were disrupted by law enforcement.
Another possible result of the Online Safety Bill’s Clause 110 is that it could force secure IM providers to exit the UK market rather than undermine security for all their global users. WhatsApp boss Will Cathcart has already stated his intent to do precisely this. A Britain without WhatsApp seems unfathomable right now.
“Not only will this have a blanket and immediate impact across individuals and their ability to have secure IM conversations, but it will immediately isolate any UK business that relies on these tools to converse with colleagues and customers in other countries,” argues Corelight’s Ellison. “There are, of course, technical workarounds, using web-based versions of common tools, for example, but the impact will still be significant.”
What happens next?
The government’s response to such concerns has been typically robust, to put it politely.
“It does not represent a ban on end-to-end encryption, nor will it require services to weaken encryption,” a spokesperson recently said of client-side scanning. “It will not introduce routine scanning of private communication. This is a targeted power to use only when necessary. And when other measures cannot be used.”
However, it is not hard to see how law enforcement or intelligence agencies could abuse these surveillance capabilities in the future. Plus, it still doesn’t change the fact that the scanning infrastructure would be a target for attack.
“The problem is simply that in order to have scanning available, it has to be incorporated into the client, at which point it is present and ready to be exploited by an attacker,” argues Hodgson. “It doesn’t matter whether the moderators need a warrant or not to enable it if, in practice, an attacker can compromise a moderator to hijack the scanning system.”
Most experts agree that there is plenty law enforcement can do to counter CSEA, terrorism, and other crimes that client-side scanning supposedly helps to expose. But doing so requires traditional policing techniques rather than “magical software,” according to noted Cambridge University professor Ross Anderson.
“The idea that complex social problems are amenable to cheap technical solutions is the siren song of the software salesman and has lured many a gullible government department onto the rocks,” Anderson writes.
Many people forget that the government already has a law forcing tech companies to do its bidding on E2EE: the Investigatory Powers Act. It has so far chosen not to pick that battle with big tech over the act’s provisions. A similar stalemate is the most likely outcome of the Online Safety Bill.