September 18, 2017

MORADI | How Do You Like Them Apples?


I recently bought an iPhone after having a Samsung Galaxy for almost five years. I had lamented the lack of iMessage, the terrifyingly janky emojis (Why does the screaming-in-fear emoji have a GHOST coming out of its mouth?) and the front camera that made me look like a cloudy mirage. Tired of being The Girl with the Green Texts, I switched.

The iPhone isn’t a good smartphone. It’s not a humane smartphone. That is, the iPhone is not made with the user’s humanity in mind; rather, iPhone hardware and iOS work to keep a person reliant on their smartphone as much as possible. This isn’t surprising. It’s capitalism, and until recently, its effects were somewhat insignificant and inane. For instance, iOS 10 delivers a separate notification for every bump from Facebook Messenger, every like on your tweet and every single Slack message. You can’t consolidate these notifications into one alert anymore, meaning you constantly check your phone: incessant phone use is hard-wired into your iPhone’s settings.

OK, so not having combined notifications is really not that big of a deal. I realize this; when I got my iPhone, I described the notifications as “like crack but on your phone,” and my brother responded, “You’re overreacting,” and “What is wrong with you?” The stakes here are low and the solution is simple. When Apple revealed the newest iPhone models in its keynote last week, however, things got a little spicier.

The iPhone X’s face-scanning feature is egregious and violatory. When it comes to security, your face is less mutable than a passcode and easier to spoof. As Andy Greenberg, a security writer for Wired, puts it, “Your face sits out in the open, displayed in public, and well-documented across social media platforms. Using it as a secret key is a little like writing your PIN on a Post-It note, slapping it on your forehead and going for a stroll.” That said, if it takes printing a 3D model of a user’s face to unlock a smartphone, then the average user is probably fine taking that risk. If, on the other hand, it takes a police officer forcibly unlocking a phone with a suspect’s face, a practice several courts have found acceptable for fingerprints and Touch ID, users may be less enthusiastic.

The privacy implications are far more insidious. Consider that in order for Face ID to function, the iPhone X needs to be scanning the user almost constantly. Natasha Lomas of TechCrunch duly notes that constant scanning is conducive to surveillance, and that this danger is exacerbated by the omnipresence and ceaseless use of smartphones. While the face scan itself is stored directly on the device and cannot be remotely accessed, any other data collected (e.g., facial reactions to content you view on your phone) could easily be sold to behemoth third parties like Facebook or Google.

Fortunately, the next iOS allows the user to disable Face ID. Unfortunately, many users still won’t do so. Lay users often make the “I have nothing to hide” argument when it comes to surveillance and privacy, and on the surface, that argument is understandable: If you’re not a criminal or cheating on your spouse, it’s difficult to see why privacy matters. I don’t particularly care if anyone knows that I’ve been playing a lot of Boggle With Friends (Play me @PegahM4). The argument is understandable, but stupid. Privacy isn’t just for criminals: There are simply things we don’t want other people to know. You probably don’t want your employer to see your political donations or your neighbor to know your income.

More importantly, research repeatedly shows that individuals change their behavior, consciously or subconsciously, when they know they’re under surveillance: A 2015 paper by Alex Marthews of Digital Fourth and Catherine Tucker of MIT demonstrated that internet searches on personally sensitive and government-sensitive topics decreased following Edward Snowden’s surveillance revelations. In 2016, Jon Penney of Harvard Law School’s Berkman Klein Center and the Oxford Internet Institute wrote extensively on the “chilling effects” he found in traffic to sensitive Wikipedia pages after those same revelations. If you don’t care about surveillance subtly changing your behavior, perhaps you’ll care about paying higher prices as a result of tracking-based price discrimination.

This isn’t “technopanic,” as Adam Thierer of the Mercatus Center described it at a libertarian tech policy summit in July. While the free market is supposed to account for individuals’ privacy and security concerns, consumer preferences have trouble trickling up when the entire U.S. market for a product is effectively a duopoly. According to comScore, as of June 2017, Apple accounted for almost 45 percent of the U.S. smartphone market. The consumer is rendered pretty much helpless, especially when Apple’s design decisions are specifically intended to lock in the user through network effects and interoperability across its own devices.

Antitrust regulation seems apt here, but regulators likely lack the jurisdiction to address these consumer concerns. And while Sen. Al Franken’s (D-Minn.) letter to CEO Tim Cook was a positive sign of prompt government responsiveness, as opposed to the traditional ex post regulatory approach, legislative prospects are bleak. In the meantime, American consumers (yes, you, the millennial with immense importance in the U.S. market!) can benefit from adopting the European attitude on privacy: Refuse to give it up.

Pegah Moradi is a junior in the College of Arts and Sciences. She can be reached at [email protected]. All Jokes Aside appears alternate Mondays this semester.