Remember San Bernardino shooter Syed Farook and his iPhone that Apple, despite pleas from the FBI, refused to unlock? One of the few things all members of the “Apple vs. FBI” Commencement panel on encryption and privacy seemed to agree on was this: both sides were being totally disingenuous.

As Stewart Baker ’69, a technology lawyer who was the first assistant secretary for policy at the Department of Homeland Security, put it, “It’s not privacy versus security. It’s security versus Apple’s current advertising campaign.” He said Apple “decided to pick a fight with the FBI” in order to position itself as “the privacy company” in its competition with Google.
“I was struck by [Tim Cook’s] tone of bravery,” Baker said sarcastically, “as if he were Tank Man standing in front of the U.S. military on behalf of [his] customers.” Meanwhile, Baker said, Apple has repeatedly caved in to China’s requests to weaken iPhone security.
Daniel Kahn Gillmor ’98, staff technologist for the ACLU, said, “The FBI could have had access [to the terrorist’s iPhone] without asking Apple.” The bureau asked Apple publicly, Gillmor said, because it was “looking for something that had emotional resonance with the American public,” i.e., terrorism, that would let it set a legal precedent further encroaching on the right to privacy.
The lively panel, which was sponsored by the Executive Master in Cybersecurity program of Brown’s School of Professional Studies, opened with video clips of Apple CEO Tim Cook and President Barack Obama discussing cybersecurity. In his clip, the president spoke eloquently of his support for privacy, but also of the necessity of national security and his belief that a few trusted entities should be able to unlock the information on a smartphone.
Computer science professor Anna Lysyanskaya, an expert in cryptography, was not impressed. Encryption, she said, is either secure or insecure; there is no gray area. It’s just math. Regarding a “back door” reserved for law enforcement, she said, “If you poll computer scientists on this idea today, one hundred percent will say that this is a bad idea.”
As a computer scientist, Gillmor agreed. Baker was much more skeptical, saying that most encryption programs today can be weakened with software updates, and that there’s always a way in. “At the end of the day, you have to trust somebody,” he said, giving as an example our willingness to trust security updates from our software provider. “We end up with more security by leaving the key with the manufacturer of the software. Take away that trust,” he said, and “you don’t end up with Nirvana, you end up with Somalia.”