Oh dear. It looks like Britain has painted itself into a corner with proposed new legislation that will give tech firms in the country major headaches.

As previously signalled by the Conservative Government, UK authorities will have the power to force internet and tech companies to break encryption. Governments, including the UK, have become increasingly worried that end-to-end encryption will make it impossible for them to conduct surveillance on citizens, and are attempting to legislate ways to access people's communications.

In doing so, though, they're also jeopardising internet security for all of us.

First, to understand what's going on, a definition of strong cryptography: only the creator of a piece of encrypted information can unscramble the material - and decide who else can decode it.

If someone else apart from the creator of the encrypted material can unscramble it, it's no longer strong encryption.
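That distinction can be made concrete with a toy sketch (this is NOT real cryptography - a one-time-pad-style XOR stands in for a proper cipher, and all names here are hypothetical). The point it illustrates is simply that possession of the key is the only thing gating decryption, so a retained copy of the key decrypts just as well as the original:

```python
# Toy illustration only: whoever holds the key can decrypt.
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    # One-time-pad-style XOR; the key must be at least as long as the data.
    return bytes(d ^ k for d, k in zip(data, key))

message = b"meet at noon"
key = secrets.token_bytes(len(message))  # ideally known only to the sender

ciphertext = xor_bytes(message, key)

# The key holder can decrypt...
assert xor_bytes(ciphertext, key) == message

# ...but so can anyone else with a copy of the key. If a provider is
# compelled to retain that copy in escrow, the escrowed copy recovers the
# plaintext identically - which is functionally a backdoor.
escrowed_key = key  # the retained copy
assert xor_bytes(ciphertext, escrowed_key) == message
```

The maths doesn't distinguish between the "legitimate" key and the escrowed one - which is the crux of the argument that mandatory key retention and a backdoor are the same thing.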

By that measure, the promise by Joanna Shields, the Conservative Minister for Internet Safety and Security at the Orwellian-sounding Department for Culture, Media and Sport, that the Government won't require backdoors in encryption is disingenuous (tinyurl.com/nzh-ukhansard). If the Government mandates that anyone providing strong encryption must retain the keys to it, what else would that be but a backdoor? Even so, Shields insisted that the UK Government seeks neither to weaken encryption nor to introduce backdoors.

"The Government does not advocate or require the provision of a backdoor key or support arbitrarily weakening the security of internet applications and services in such a way. Such tools threaten the integrity of the internet itself," she said in the House of Lords last week.

"Current law requires that companies must be able to provide targeted access, subject to warrant, to the communications of those who seek to commit crimes or do serious harm in the UK or to its citizens," she said, during a debate on the Investigatory Powers Bill, which would also force internet providers to retain users' web browsing history for a year.

Shields should have no problems understanding the contradiction inherent in what she said, as she's worked for Google, AOL, RealNetworks and Facebook in senior roles.

The case for allowing authorities to break encryption is hard to argue against, on the face of it: already, criminals, terrorists and sex offenders use unbreakable crypto to make sure their communications stay secret, and that's a bad thing.

Then again, even if tech companies are forced to retain decryption keys for their users' communications, would that make bad people stop using strong encryption? Sure, it would inconvenience them not to be able to use WhatsApp, but it's easy enough to find other non-commercial encrypted alternatives where decryption keys are not retained.

This is also assuming that the authorities are always the good people, and always do the right thing, which isn't necessarily the case. Scotland Yard, for instance, appears to have an endemic corruption problem (tinyurl.com/nzh-corruption).

Apple, which along with Google decided last year that it would no longer hold decryption keys to newer smartphones, in response to the excessive government surveillance revealed by Edward Snowden, is already being grilled by law enforcement.

US authorities don't like it at all, and Apple has been trying to explain in a New York court that it can't actually provide access to an encrypted new iPhone without decryption keys, which the judge didn't seem to want to accept (tinyurl.com/nzh-applecrypt).

As Snowden, whose leaks kicked off Apple's quest for greater privacy protection for users, noted on Twitter, if the British Government is allowed decryption keys, other governments will want them too. The US, China and Russia for instance.

The new law places Apple, Google and other tech companies in an untenable situation. Because they don't actually have the keys, there is no practical way to provide access to devices - the encryption used on them is robust enough for that.

A forced software update that either copies over users' private decryption keys or replaces them with new ones, copies of which are held by, say, Apple and Google, would be one way to do it. There's been talk of using this method as a backdoor, along with a government gag order banning any mention of such a "feature" by companies.

That'd substantially weaken security, though, and be open to interception and abuse. State actors and enterprising hackers would home in on the forced update mechanism instantly, in order to find weaknesses in it that could be exploited. And they'd find them, be sure of that.

The irony here is, of course, that many government employees use strongly encrypted devices and services, and the UK law threatens to weaken their security too.

A bad idea overall, then, and you have to wonder if it wouldn't be better if the Brits just took themselves off the internet instead.