November 19, 2015

BERKOWITZ | The War We Are Not Seeing


By ETHAN BERKOWITZ

In the past week, it has been nearly impossible to find a social media or digital communication platform that hasn’t been inundated with reactions to the barbaric attacks in Paris. Among these reactions, disputes have broken out over an array of issues, such as the implications for the Syrian refugee crisis and the underreporting of the attacks in Beirut a day earlier.

All the while, a new clash on social media is brewing. This clash is one that we can’t identify in the form of blood, body bags and gunshots, but in 0s and 1s: the clash between Silicon Valley and the government over digital privacy and encryption. In light of the Paris attacks, renewed public anxiety over how the attack proceeded unbeknownst to the intelligence community, compounded by tech companies’ insistence on their right to protect consumer privacy, has transformed digital encryption into one of the most challenging and morally divisive battlegrounds of our ongoing war on terror.

It has long been known that extremist groups like ISIS have used social media networks such as Twitter to spread jihadist propaganda and to try to recruit new members. Counter-terrorism authorities have publicly acknowledged the pros and cons of allowing these accounts to remain online: On one hand, the bad guys are able to operate unimpeded; on the other, authorities can mine useful intelligence to help thwart actual attacks. That said, Twitter is generally not seen as the most secure communication service, as most of its content is public and direct messages are not encrypted.

However, there is a growing concern that extremists have been masking their recruiting and attack planning using other apps and digital communication services, such as the Facebook-owned WhatsApp and Apple’s iMessage. How? These services, and others, are integrating end-to-end encryption to protect users’ privacy. To understand what this encryption means, imagine you have the key to a door: you lock the door and throw away the only key (in this scenario, you are the communication service provider). Sure, you, or anyone else, can try to pick the lock, but nobody has done so, and even if someone did, it would be an incredibly time-consuming process, made more difficult by the fact that the lock keeps changing to make it harder to crack. To be fair, there is no definitive proof that these communication services played a role in the Paris attacks, but the possibility remains that extremists are increasingly attracted to the secrecy these services provide. All the while, government and security officials have been increasingly sparring with many of the tech companies that create these services over how to respond.

The major question is this: How do we balance privacy rights with national security interests? If you are like me, as a consumer of these services and as an American, you want to have your cake and eat it too. That is to say, we don’t want the government to act as a ‘big brother,’ infringing on our right to privacy. However, many of us recognize instances in which there is a legitimate and lawful need for authorities to have access to personal data.

The problem is that nobody has yet found a viable solution that satisfies the authorities, the tech companies and consumers alike. The law regulating encryption is far from settled. In the meantime, critics of end-to-end encryption have proposed that tech companies create a so-called “back door” that would allow the companies themselves to access communications when warranted (morally and, more importantly, legally). On the other hand, advocates of end-to-end encryption, such as Apple CEO Tim Cook, believe that creating a back door for the “good guys” also creates a viable back door for the “bad guys,” whether they be terrorists, foreign governments or anyone else. Additionally, security experts see the evolving tactics used by ISIS as a sign that anyone wishing to evade services with a back door would simply switch to services without one, such as the German-based app Telegram, or start communicating on the Deep Web. While the United States can try to compel American companies, that would not stop the development of messaging services outside its jurisdiction. To make matters worse, the NSA did itself no favors by abusing its access to unencrypted communications through covert programs like PRISM, which helped fuel the public’s interest in privacy and encryption in the first place.

Does that mean nothing should be done? Personally, in the process of researching this story, I’ve gone back and forth between both sides of the argument. That said, I believe it is imperative that our government, along with European governments and tech companies in Silicon Valley and around the world, collectively find some middle ground that keeps us safe while preserving our right to privacy as best as possible. For these sides to defy each other with laws or software updates, rather than identify a solution, is a loss for everybody. And the assumption that terrorists will simply switch to other methods of communication does not mean we should willingly cede the mediums they may currently be using. We have a system of checks and balances in place via our courts to determine whether a so-called middle ground, whatever it may be, has or has not gone beyond what the Constitution would permit. The sooner these sides can collectively arrive at the best plan of action, the sooner we can work toward addressing our country’s main threat, rather than continuing to shoot ourselves in the foot.

Ethan Berkowitz is a senior in the College of Industrial and Labor Relations. Views From the 14853 appears alternate Fridays this semester. Comments may be sent to associate-editor@cornellsun.com.
