Last week the cold Crypto War between vendors and government turned hot: the United States Department of Justice obtained a court order demanding that Apple hack an iPhone 5c carried by one of the San Bernardino shooters.

The order is meant to help the Federal Bureau of Investigation gain access to the phone's contents, in case there's something incriminating on it. The hack in question would require Apple to write code that disables certain iPhone security features.


The FBI wants the protections against brute-force cracking - guessing an iPhone's lock code - removed so that it can enter codes quickly, as many times as it takes until the right one is found. Without the hack, iPhones stall too-frequent lock code entries, with increasing wait times between each attempt, and the device locks itself after ten incorrect guesses.
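A minimal sketch of that kind of escalating-delay lockout policy looks something like the following. The delay schedule here is illustrative only - the real protection runs in Apple's firmware, and the exact wait times are not taken from the article:

```python
# Illustrative sketch of an escalating-delay lockout policy, as described
# above. The delay values are assumptions for illustration, not Apple's
# actual schedule; on a real iPhone this logic lives in firmware.

WIPE_AFTER = 10  # the device locks itself after ten incorrect guesses

def delay_before_next_attempt(failed_attempts):
    """Return the wait in seconds imposed after `failed_attempts` wrong
    guesses, or None if the device no longer accepts attempts at all."""
    if failed_attempts >= WIPE_AFTER:
        return None          # locked for good - no further guesses
    if failed_attempts <= 4:
        return 0             # the first few guesses carry no penalty
    # Assumed escalation: 1 min, 5 min, 15 min, 15 min, then 1 hour.
    schedule = {5: 60, 6: 5 * 60, 7: 15 * 60, 8: 15 * 60, 9: 60 * 60}
    return schedule[failed_attempts]
```

Removing this throttling is exactly what makes rapid brute-force guessing feasible: with no delays, every attempt costs only the time it takes to enter a code.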


This won't set a precedent, it's not a backdoor, and it is perfectly safe and necessary, the FBI says.

Apple vehemently disagrees, saying that such a hack would endanger other iPhone users as well, including those with devices newer than the iPhone 5c. The company can do it, but doesn't want to, because it believes that once the cat is out of the bag, millions of its customers will be less secure.

If Apple breaks into the phone for the US government, what's to stop other authorities around the world including repressive regimes from demanding the same?

Here, it's worth remembering that what constitutes a serious crime varies hugely around the world.

Security expert Jonathan Zdziarski, who specialises in Apple iOS and has testified in court cases, puts it succinctly:

An American is arrested in a foreign country and faces death for being gay. Foreign country orders that Apple backdoor a phone. Now what?

From now on, governments will pressure Apple to provide the hack the FBI demands; if Apple doesn't have the hack, it will be that much easier to resist that pressure.

Furthermore, keeping the hack secret might not be possible.

Zdziarski believes that the hack would have to be certified as a forensic instrument by the court to be accepted as part of the evidence-gathering process.


That means the hack would have to be extensively documented, tested to make sure it doesn't change anything on the device, and peer reviewed by a US standards body before it is accepted in court. This is standard procedure for forensic instruments, Zdziarski said.

It remains to be seen whether the hack would be subject to such rigorous vetting, but if it were, it would make Apple's concerns about having to provide similar assistance to each and every government moot, as the details would be out in the open.

There are also doubts about how much valuable information Farook's iPhone actually contains.

It was given to him by his employer, subject to monitoring; it was not his personal device. He destroyed two of his own phones.


Furthermore, the iPhone will have had its cellular spoors - location and call data from mobile phone telcos - well and truly captured, along with iCloud backups from Apple. That's quite a bit of data on Farook's activities already.

The iCloud backups only run until October 19 last year. There might have been newer ones, but backups stopped when Farook's employer, at the request of the police, reset the iCloud password after confiscating the phone.

Not only that, the police turned the iPhone off completely. This is a no-no in IT forensics, as it makes access to the device far harder than if it had been left charged and running, with some of its data still decrypted in memory.

Leaving possible police forensic incompetence aside, there's plenty at stake on both sides, and neither side feels it can back down.

A safe guess is that Apple and other tech vendors will do their damnedest to ensure they won't be in a similar Mexican stand-off with the law again. This means more services, devices and storage will be strongly encrypted, with users having the keys, not the vendors.

In other words, the whole exercise could backfire for law enforcement, which will soon have to contend with much more secure devices.

But, if you're gay or a political dissident, or both, and don't want your nearest and dearest and friends to be picked up by the police and murdered along with you because their details were found in your smartphone, what can you do?

Apart from minimising smartphone use, and what's stored on the phone, set a long, complicated passcode. Four-digit PINs are too easy to crack, something Apple recognised in iOS 9 when it made the passcode a minimum of six digits.

Make it longer and more complicated than six digits, with plenty of numbers, letters and symbols, and you'll have served out a life sentence before the police guess your passcode.
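Some back-of-envelope arithmetic shows why length and character variety matter so much. The guess rate below is an assumption - roughly a dozen attempts per second once software delays are stripped away; real rates depend on the hardware - but the relative scale is what counts:

```python
# Back-of-envelope worst-case brute-force times for different passcodes.
# GUESSES_PER_SECOND is an assumed rate with throttling removed, not a
# measured figure for any particular iPhone.

GUESSES_PER_SECOND = 12.0

def worst_case_seconds(alphabet_size, length):
    """Seconds needed to try every possible passcode of the given length
    drawn from an alphabet of `alphabet_size` characters."""
    return alphabet_size ** length / GUESSES_PER_SECOND

print(worst_case_seconds(10, 4) / 60)      # 4-digit PIN: ~14 minutes
print(worst_case_seconds(10, 6) / 3600)    # 6-digit PIN: ~23 hours
# 10 characters from digits + upper/lowercase letters (62 symbols):
print(worst_case_seconds(62, 10) / (3600 * 24 * 365))  # ~2 billion years
```

The jump from a 4-digit PIN to a long mixed passcode is the difference between minutes and geological time, which is the point of the advice above.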

They'll probably try to beat the passcode out of you, but at least you've made them work a bit harder than they'd normally need to.