It’s a new semester, which means new schedules, routes, groupmates and, if all goes well, knowledge and opportunity. It also quite often means new books, technologies and … privacy agreements? As we’re rushing to adapt, we can often overlook important decisions about what the tools our community requires may extract from us in turn. Hear me out.
I open my new textbook on the Cornell Store’s VitalSource page, a sort of digital archive that stores textbook purchases. Navigating to the chapter marked in my syllabus, eager and ready to learn, I’m interrupted by yet another pop-up: a cookies policy. Cookies are, as Emily Stewart at Vox puts it, “small files that websites send to your device that the sites then use to monitor you and remember certain information about you.” You’re probably familiar with the concept. A cookies permission request appears on basically every page from Forbes to, I don’t know, insert something salacious here.
But when I select “reject cookies,” as I commonly do, VitalSource logs me out. I sign back in and read from a link below the apparently wrong button I’ve just pressed: it’s a statement about how the page requires first-party cookies to function. Yes, true. That’s how the internet nominally works. First-party cookies are set by the website you’re on to keep the site itself working smoothly. However, VitalSource gives no option to dismiss third-party cookies alone. Those are the ones that allow advertisers and shady “data partners” to follow you around the web. And even that salacious company you thought of a moment ago will likely allow you to opt out of such trackers as standard practice.
I imagine for a moment what my professor would say if I told her that I couldn’t do an assignment because my rights are not for sale. I picture delivering the line with Calvin and Hobbes-esque gusto. Maybe I’d be wearing a cape. I hit “accept cookies.”
Now I find in my syllabus something about a group project, for which I’ll need Discord. See where this is going? As I try to create an account, I remember why I don’t already have one. Discord, like Twitter and many other professor favorites, denies access to users with VPNs, VoIP numbers or other privacy-protective tools. And sure, in one sense they do this to keep their sites clean (results: dubious), but in another, it’s rather convenient that doing so requires our most trackable information.
So if you’ve been following closely, you may have gathered that I like privacy and the tools that support it. It’s kind of my point here. With just a couple of free or inexpensive changes, say an ad-blocking browser, an encrypted texting app and/or a VPN subscription, you may not solve the internet that Bo Burnham sings about, but you can feel much better about your place in the behavioral futures market (see Zuboff’s The Age of Surveillance Capitalism).
Because of what’s called privacy dependency, every invasive measure that you accept online is shared by your neighbor. Your digital footprint reveals the digital footprints of others, regardless of their own privacy preferences. But, like recycling or sustainable fashion, a few minor alterations can afford you an identity as a reasonably ethical member of society. Which brings me back to school.
You may have heard that companies like Dell and Apple donate hardware to universities to inspire brand loyalty among the future product ambassadors of the universe (us). It’s why we get discounts on Spotify, Hulu, Amazon, etc. It’s how they get us hooked. Which is fine. I’m not here to debate transparent capitalist incentives. What does bother me — and should bother you — are the sneakier, less mutualistic moves of a Macmillan, McGraw Hill or even a VitalSource. Education-industry companies take advantage of student reliance on their wares by imposing uncommonly strict data contracts. And where these practices are supported by our alma mater, we are trained to accept exploitation.
I get it. Nobody reads privacy agreements. And maybe you’re the type of user who breathlessly hits the “accept cookies” button each time you see one — like you’re catching the last Pokémon. But our time in university is meant to teach us to navigate our futures with greater insight and agency. And evading privacy concerns in the digital era is like refusing to teach assembly line safety to Industrial-era laborers. You’d think that was an exaggeration … but learning to read bids for our data and the amplified, manipulated forms in which it’s delivered back at us is a first step to media literacy. It’s a first step to identifying online propaganda and avoiding scams and hacks. The first step to countering FOMO and online hate speech and that “little bit of everything all of the time,” as Bo Burnham sings, with a rational, sovereign response.
So here’s what we can do (and by we, I mean the Cornell administration):
1. Replace all default browsers on campus computers with Brave or Opera. Brave, specifically, is built on Chromium, so it runs everything Chrome can while blocking ads and trackers, saving bandwidth and costing nothing.
2. Suggest that students use privacy-protecting apps. Signal, for instance, is a texting app that uses end-to-end encryption so that not even the app can access private messages. Such a suggestion may seem oblique, but it’s a nudge, subtly telling our thousands of future alumni that there are better options and they are worth it.
3. Tell education-specific companies that engage in exploitative practices that they need to change their policies or look elsewhere for buyers. A little more difficult. But not only could a large, leading institution like Cornell make this happen, it would be praised for doing so.
4. Encourage professors to designate privacy-supporting tech in course syllabi. Put simply, we shouldn’t have to choose between education and privacy.
This time in our lives is a unique and powerful initiation into greater techno-capitalist society: that elusive “real world” so often referenced in university halls. The technology practices endorsed and enforced by those we look to for answers now will stay with us for the foreseeable future. As I close here, I find myself recalling how the Silicon Valley creators of privacy-reducing apps and tech do their best to avoid using them — and ensure their children do too. This is in essence the most informed response, the cutting-edge model of digital conduct. And I don’t know about you, but the cutting edge is just what I came here to learn.
Stephen Young is a senior in the College of Agriculture and Life Sciences. Comments can be sent to [email protected]. Guest Room runs periodically this semester.