February 21, 2016

SCHULMAN | Emotion is Simple, Technology Isn’t


It’s rare for technology to make front-page news. But this week, Apple has been making headlines. No, Apple’s quarterly earnings report isn’t being released. And no, the new iPhone isn’t coming out either. On the surface, the issue at hand is simple. The federal government asked Apple to unlock an iPhone. Apple said no. Obviously, there’s more to the story: this wasn’t just any iPhone. It belonged to one of the terrorists involved in the deadly shooting in San Bernardino last December. The FBI wanted data from the phone to better understand the shooters’ activities.

This is an emotionally charged case. As such, people have told an emotional narrative: a profit-hungry technology company stages a PR stunt, delaying justice in a heinous crime. Yet Apple’s decision has huge implications for digital privacy. Emotion matters, and I want the shooting’s victims to receive justice; however, people seem more concerned with the emotion of the case than with its implications. Apple’s denial of the FBI’s request reflects the broader debate pitting national security against personal liberty. There are obviously two sides to this struggle. But regardless of your opinion or its justification, you should side with Apple. By denying the FBI, Apple has preserved our ability to make that choice for ourselves.

Unlocking the San Bernardino shooter’s iPhone would not only set a dangerous precedent regarding digital privacy; it would also hand that decision to a private corporation. To speed up unlocking the phone, the FBI asked Apple to load a modified version of its operating system, iOS, onto it. If you’ve ever waited 80 minutes to unlock your iPhone after repeatedly entering the wrong password, you know why the FBI asked for Apple’s help. This version of iOS would let the FBI try as many passcodes as it wanted, using a computer to enter them.
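To see why those lockout delays matter, consider a rough back-of-the-envelope sketch (my own illustrative numbers, not Apple’s actual security design): a four-digit passcode has only 10,000 possibilities, so a computer free to guess without interruption could exhaust them in minutes, while even an hour of lockout per failed attempt stretches the same search past a year.

# Back-of-the-envelope sketch with assumed numbers, not Apple's real figures.
TOTAL_CODES = 10_000          # every possible 4-digit passcode
FAST_ATTEMPT = 0.08           # assumed seconds per guess with no lockout
LOCKED_ATTEMPT = 60 * 60      # assume roughly an hour of lockout per guess

fast_total = TOTAL_CODES * FAST_ATTEMPT
locked_total = TOTAL_CODES * LOCKED_ATTEMPT

print(f"No delays:   about {fast_total / 60:.0f} minutes to try every code")
print(f"With delays: about {locked_total / (60 * 60 * 24 * 365):.1f} years to try every code")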

If Apple creates a “master key” (as the company calls it) for unlocking iOS, where does Apple draw the line regarding its use? If this special version of Apple’s software existed, nothing would stop the FBI or other government bodies from asking Apple to use it again. What if the FBI asked to use it in a murkier case? What if foreign governments (like China) asked Apple for access to political dissidents’ information? Whether this “master key” for governments should exist should not be Apple’s decision. It should belong to us and our elected officials.

Of course, don’t think this means Apple denied the FBI’s request out of the goodness of its heart. In many ways, Apple’s press release misrepresents the reality of the situation. Developing the “master key” may not be the sweeping surrender of our privacy that it is made out to be. The terrorists had an older iPhone, and Apple’s “master key” would not necessarily leave newer, more secure versions of the iPhone vulnerable. Apple has also circumvented security features for the FBI in the past. All things considered, the choice to make an announcement about privacy during a high-profile investigation is probably a marketing decision. It frames Apple as the gold standard for protecting personal data, whatever the reality.

But I still think Apple is in the right. Good marketing usually embodies some truth. The conversation about digital privacy had been evolving long before the case in San Bernardino, and it’s about time a large corporation took a stand and demonstrated a commitment to privacy. Taken at face value, Tim Cook’s statement was spot on. I support Apple’s effort to protect our privacy, or at least to give us the choice, regardless of its motivations. I’m not a particularly big Apple fan, but I can’t help taking Apple’s side. That’s my schtick and I’m sticking to it. Stay tuned on alternating Mondays for more!

Eric Schulman is a junior in the College of Arts and Sciences. He may be reached at [email protected]. Schulman’s Schtick appears alternate Mondays this semester.