The recent Game of Phones between the FBI and Apple underscored an area in our jurisprudence that is screaming for more clarity. If there is a tipping point when the protection of consumer privacy should yield to the needs of a criminal investigation, where is it?
Few will dispute the obvious cases in which a citizen's constitutional rights are overridden by a judge who knows (or at least has access to) the legal precedents informing the decision to suspend the right to privacy. A court-ordered search warrant trumps those rights for a defined period of time, and it can be issued fairly quickly when a member of the judiciary believes there is good and sufficient reason for it. Sometimes, in instances involving probable cause and easily discernible physical evidence, the law permits on-the-spot access.
The latter scenario came into play with the phone belonging to San Bernardino shooter Syed Rizwan Farook, an iPhone 5C running iOS 9. Law enforcement officials had every reason to believe there could be time-sensitive information on the device, information that might well save lives. They attempted to reach it through Farook's iCloud account, but in the process they made a mistake: they reset the password remotely. In doing so, they cut off one possible way into the device, an automatic iCloud backup, which might have occurred had the phone been connected to a Wi-Fi network it recognized, in this case the shooter's home wireless network. There was only one way to find out whether that approach would have worked, and it vanished the moment a law enforcement official reset that password.
Locked out, the government requested Apple's help. Apple CEO Tim Cook refused on the grounds that complying would compromise consumer privacy and set a dangerous precedent. The FBI secured a court order demanding that Apple unlock Farook's iPhone, and still the company refused, raising the question: Should the government be allowed special access to information protected by encryption or any other method designed to safeguard user privacy?
In October 2015, the Obama administration decided it was not a good idea to legislatively force decryption at the behest of law enforcement. "The administration has decided not to seek a legislative remedy now, but it makes sense to continue the conversations with industry," FBI director James B. Comey told the Homeland Security and Governmental Affairs Committee. Not long after that announcement, the San Bernardino shooting caused the Justice Department to do a 180, securing a court order compelling Apple to decrypt. The case made daily headlines. Organizations of every stripe filed briefs on both sides of the issue. Then the action became moot because the FBI, reportedly with the help of a third-party technology firm, wormed its way into the phone.
But on the other side of the FBI's successful workaround with Farook's iPhone 5C lies a legal shadowland. Because the bureau breached the phone without Apple's help, that pivotal question about consumer privacy has still not been addressed.
What Now?
When it comes to encrypted devices, can the government be granted special access, in only the most extreme cases, without weakening the privacy protections that encryption affords consumers?
Digital enterprise probably won (by a smidge) in the battle over access to Farook's iPhone because Apple was not required to provide what could have amounted to a permanent backdoor for law enforcement. The FBI said this week that it would help local law enforcement agencies decrypt information on devices, without saying whether it would share the specific means used to crack the San Bernardino shooter's phone. You can be sure that when Apple closes the door on the FBI's exploit, there will be an announcement, and the fight over law enforcement access to encrypted information will resume in earnest.
It is not breaking news in the information security community that the FBI has had a Tor exploit for a while now. Tor is an anonymizing network that allows people to visit websites without being traced. There are as many legitimate reasons to use it as there are illegal ones; among the latter is the trafficking of child pornography, which is why the FBI developed the tracker malware it has used to locate and arrest people who transmit illegal images. What is not known: how many other presumably safe platforms have glass walls for law-enforcement eyes only?
I think it's also worth wondering aloud whether the FBI knew all along that there was a hack to get into Farook's iPhone. Were that so, its motion would have been less about finding a way into the phone and more about two-stepping around the Obama administration's previously stated position: continue the conversations, and don't go to war with Silicon Valley over decryption legislation.
In February, Tim Cook explained to ABC World News Tonight that the FBI had essentially asked him to create “the software equivalent of cancer.” The tension between selling privacy and having it compromised by legal means is not an easy one to navigate, but in this war of words and ideology, we need to do a whole lot better than we have so far.