The American government has been putting great pressure on our technology companies to build "back doors" into their software so that the Government can decrypt our private data. This ability would be used sparingly, only when fighting serious crime, and only with a judge's approval, of course.
The government has been looking for a test case to push its position through the courts. It recently obtained a cell phone which had been used by one of the Islamic terrorists who engaged in "workplace violence" in San Bernardino last year. The phone is encrypted. The FBI can't read the data on it, so they want the court to order Apple to break into the phone, just this once. Yep, just this once, and never again.
This is part of Apple's response, which was written by CEO Tim Cook:
We have great respect for the professionals at the FBI, and we believe their intentions are good. Up to this point, we have done everything that is both within our power and within the law to help them. But now the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a back door to the iPhone.
Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software - which does not exist today - would have the potential to unlock any iPhone in someone's physical possession.
The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control. [emphasis added]
The essential, vital, unarguable point is that this hacking tool does not exist today - and, given its immense capacity for mayhem, the world would be better off if it never existed at all.
Mr. Cook doubtless remembers how successful generations of American Presidents have been in putting the nuclear genie back in the bottle. Instead of being limited to a few governments, nuclear weapons capability is spreading to more and more states whose motives are difficult to discern, and it's not hard to imagine a time soon when non-state actors will get their dirty mitts on one.
This, despite the fact that nuclear weapons are very hard to build! Although their existence and their basic operating principles have been known for decades, the actual engineering and manufacture of a working nuclear weapon takes many years and billions of dollars.
A software back door would be completely unlike nuclear weapons, because it wouldn't have the protection of difficulty in replication. As Napster and its countless clones have proven, once created, a digital work can be copied as easily and as universally as a web page.
It would be one thing for a safe manufacturer to turn over the combination to one specific safe under court order. Apple has done that in response to subpoenas, as it should.
It would be quite another thing for a safe manufacturer to first build, and then give the government, a device that would open any safe in the world. Yet that's what the government wants of Apple.
Would gun owners accept the FBI having a tool that could prevent any Smith & Wesson handgun from firing? Or would the company go bankrupt overnight?
How could anyone protect such a desirable secret? How much could you get for a tool which could open any iPhone in the world? How could anyone, no matter how virtuously inclined, guarantee that he would resist the temptation to sell it - particularly since it wouldn't even need to be stolen; a simple software copy would do?
Once out, of course, that's it for any hope of protecting your private information if you lose your phone. But it's actually worse than that: hackers are very skilled at reverse-engineering software to discover vulnerabilities.
What guarantee is there that a clever hacker couldn't analyze Apple's super-cracker software and figure out a way to crack into an Apple device without needing physical possession of it? Mr. Cook doesn't want his engineers figuring out how to do this, or even thinking about it. Neither do we.
Our concern is especially valid when government holds the key. Our government in general, and the Obama administration in particular, has given ample evidence that it sees no obligation to follow the law and that it is incapable of protecting important data.
When encryption is outlawed, only outlaws will have encryption - and when everyone is an outlaw because that's the only way to protect yourself, well - does that mean nobody is an outlaw? Do we really want to find out?