
Steve Jobs understood what people want. His insistence on making hard things easier — for instance, using a personal computer — was an essential part of the Apple success story. Apple CEO Tim Cook has been doing the same thing — but now the “hard thing” is privacy and encryption.

Apple has consistently earned top marks for its privacy and data security policies. That said, since the San Bernardino shooting, which left 14 dead and 22 seriously injured, the company’s privacy-first approach has been experiencing a sort of baptism by fire.

Much debate has arisen around the encryption on San Bernardino shooter Syed Rizwan Farook’s iPhone 5C. Shortly after the shooting, the iCloud password associated with Farook’s phone was reset by a law enforcement officer attempting to gather information.

The snafu purportedly eliminated the chance for the phone to automatically back up to iCloud the next time it connected to a recognized Wi-Fi network, a backup investigators could then have retrieved.

According to ABC News, the last time Farook’s phone had been backed up was Oct. 19, 2015, a month and a half before the attack. Court documents said this fact suggested “Farook may have disabled the automatic iCloud backup function to hide evidence.”

Apple provided the FBI with the iCloud backups made prior to Oct. 19. But the government wanted access to the phone itself, at least partly to discern whether Farook had any terrorist ties. To get to it, the FBI asked Apple to disable a feature that erases an iPhone’s data after 10 failed attempts to unlock it. If Apple did so, the government could use software to guess Farook’s passcode by brute force.
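To see why that erase feature matters, here is a minimal, purely illustrative sketch in Python (a toy model, not Apple’s code; the passcode and wipe limit below are stand-ins) of how quickly a four-digit passcode space collapses once the 10-attempt wipe limit is removed:

```python
# Illustrative sketch only: a toy model of why the 10-attempt erase limit
# blocks brute-force passcode guessing. Not Apple's actual implementation.
from itertools import product

SECRET = "7412"      # hypothetical 4-digit passcode
ERASE_LIMIT = 10     # iOS wipes the device after 10 failed attempts

def brute_force(limit=None):
    """Try every 4-digit code in order; stop if the wipe limit is hit."""
    for attempt, digits in enumerate(product("0123456789", repeat=4), start=1):
        if limit is not None and attempt > limit:
            return None, attempt - 1   # device wiped before the code was found
        guess = "".join(digits)
        if guess == SECRET:
            return guess, attempt

# With the erase feature disabled, the full 10,000-code space is searchable.
code, tries = brute_force()
print(f"No limit: found {code} after {tries} guesses")

# With the erase feature intact, an attacker gets only 10 guesses.
code, tries = brute_force(limit=ERASE_LIMIT)
print(f"10-attempt limit: {'found ' + code if code else 'device wiped'} "
      f"after {tries} guesses")
```

A four-digit passcode offers only 10,000 possibilities, so unrestricted guessing software can exhaust the space almost instantly; the 10-attempt wipe limit, along with the escalating delays iOS imposes between failed attempts, is what makes brute force impractical in the first place.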

The FBI argued its reset of Farook’s password should not prevent Apple from honoring this request.

“It is unknown whether an additional iCloud backup of the phone after that date — if one had been technically possible — would have yielded any data,” the agency said in a statement. “Direct data extraction from an iOS device often provides more data than an iCloud backup contains.”

And, last week, a federal court ordered Apple to develop a custom version of iOS so the FBI could gain access to the phone. Apple is refusing to comply with the order.

“Building a version of iOS that bypasses security in this way would undeniably create a backdoor,” CEO Tim Cook said in an open letter to Apple customers. “And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.”

What’s at Stake

Consumer awareness of privacy and encryption has grown since Edward Snowden’s revelations about the scope of government surveillance at the National Security Agency. Still, the public’s response to Apple’s current plight remains divided.

While some pundits, commentators and high-profile figures have argued the FBI should be able to access phone records in cases where national security may be at risk, others have come to Cook’s defense, arguing he is right to protect Apple customers. I, too, believe he is right to stand his ground here. In an environment where many companies would allow law enforcement to access private information, Apple is standing up for consumers and suggesting they can no longer tolerate routine incursions into their private lives — whether the so-called trespassers hail from the halls of government or invade in the interest of commerce.

Building a custom iOS, or any other kind of backdoor into a personal device, creates moral hazard. The potato chip theory applies to law enforcement and the erosion of the constitutional rights guaranteed to all U.S. citizens: one potato chip leads to another, and it’s hard to stop eating them. In the same way, one legal mulligan leads to another.

There has to be a point in the evolution of consumer privacy (or its disintegration) where we can no longer lower our standards as fast as our situation is deteriorating. When it comes to our privacy, we really have to stand firm, and Tim Cook is doing that.

Ann Cavoukian, executive director of the Privacy and Big Data Institute at Ryerson University, long ago coined the phrase “Privacy by Design” to describe what’s starting to happen in the U.S. marketplace. Her theory was that consumers would start shopping for the best deals on their privacy: the less personal information a product or service requires, the more appealing it will be to consumers.

So in that regard, the Justice Department is right to suggest, as it did last week, that Apple is trying to protect its “public brand marketing strategy.” But in this instance, the strategy is consumer advocacy, nothing more or less. Privacy is not a brand. It is a right. And, contrary to popular belief, protecting it is no longer particularly hard, either. Apple’s strategy is to provide a usable product that is safe and that protects users against a potential war on their privacy.