It's a battle of Goliath against Goliath - the world's most powerful company vs the world's most powerful government. It's a clash over matters of fundamental principle, which could have enormous consequences. And it all boils down to a few simple digits on an iPhone.

In December last year, Syed Rizwan Farook and Tashfeen Malik, a married couple from San Bernardino, went on a shooting spree that left 14 people dead.

To find out more about how he became radicalised, and whom he was in contact with, the FBI would like to get into Farook's iPhone. The problem is that he was killed in a shootout with police, taking the passcode with him.

To crack the phone open, the FBI can't simply type in passcodes at random: the phone imposes delays after failed guesses, and too many wrong attempts can trigger it to erase its data entirely. So the FBI wants Apple to create and install a special version of the phone's operating system that disables these protections, letting investigators make as many guesses as they like, as quickly as they like.
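The arithmetic behind that paragraph can be made concrete with a toy simulation. The retry limit below is illustrative, not Apple's actual implementation, and the function names are invented for this sketch:

```python
# Toy simulation of why blind brute-forcing a passcode is risky.
# WIPE_AFTER is an illustrative threshold, not Apple's real value logic.
WIPE_AFTER = 10  # erase the device after this many wrong guesses

def try_passcodes(secret, guesses):
    """Attempt each guess in turn; give up if the device 'wipes' itself."""
    failures = 0
    for guess in guesses:
        if guess == secret:
            return f"unlocked after {failures + 1} attempts"
        failures += 1
        if failures >= WIPE_AFTER:
            return "device wiped: data lost for good"
    return "gave up"

# A four-digit passcode has 10,000 possibilities; with a wipe after
# ten wrong guesses, guessing blindly almost certainly destroys the data.
print(try_passcodes("7391", (f"{n:04d}" for n in range(10_000))))
# → device wiped: data lost for good
```

The point of the sketch: without the auto-erase and enforced delays, all 10,000 four-digit combinations could be tried in moments - which is exactly the capability the FBI is asking Apple to build.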

The FBI promises this software would be used just this once. Apple claims that the same code, once written, could be used to crack open any other iPhone, or at least to give people who want to do so a good head start. Apple CEO Tim Cook has now published an extraordinary open letter outlining why he is refusing to allow the code to be created, despite a judicial order.

Despite Cook's line-in-the-sand rhetoric, the case isn't clear-cut. Farook's phone is an older model, the iPhone 5c: newer iPhones have tougher security, which some believe would prevent such a hack (Apple denies this).

It's also been claimed that Apple has complied with previous requests to unlock criminals' phones, meaning that taking a stand now is as much about public relations as principle.

Yet while Apple's stand may not be popular with everyone, it's the right one to take. Not because of the details of this particular case, but because of the wider issues at stake.

The United States Government believes that its security depends on it having access, even if only as a last resort and with the appropriate legal safeguards, to any form of encrypted communication. The tech firms, on the other hand, believe that their users' privacy must be paramount.

This resistance is motivated by both principle and pragmatism.

On the principle front, the logic of surveillance, in a society that feels itself under threat, is always towards more: a back door, once opened, is never closed.

And snooping that starts as exceptional inevitably becomes banal.

Yet taking a pro-privacy stance is also good business sense.

In the wake of Edward Snowden's revelations, the tech firms are determined to prove to their users that they can be trusted with their data.

Apple, for example, wants to have an iPhone in the pocket of every customer in the world.

For that to happen, those customers need to know that their governments - many of them far nastier than ours - won't be able to see their private thoughts.

It's not just emails and text messages that are at stake here. Devices such as the Apple Watch, and its successors, could soon be accumulating data about our blood pressure, sleeping habits, even our genetic make-up.

On the face of it, this is as much of a threat to our privacy as government surveillance, if not more. Yet Apple and Facebook and Google know we will trust them with our lives only if we feel we can trust them full stop.

What keeps them in line is the commercial imperative.

The tug-of-war between governments and technology companies - over privacy, taxation and so much more - is one of the emerging themes of our age. Neither will have right on their side all the time. But on this issue, Apple is acting in its own best interests - and ours.