Snapchat’s ‘My AI’ Chatbot Raises Privacy and User Experience Concerns Among Students

May 2, 2023

Snapchat’s new chatbot, My AI, has generated privacy concerns among Cornell students. (Yufei Wang/Sun Contributor)

Snapchat’s recent entry into the AI race with the release of its new chatbot, My AI, caused an immediate and significant uproar among users who discovered it had unexpected access to their personal information.

“I’m your new AI chatbot,” the opening prompt says. “You can ask me just about anything and I’ll do my best to help. I’m always here for a laugh, and you can give me a name if you’d like. Is there anything I can do for you today?” 

On April 20, My AI appeared pinned to the top of the chat feed of every user with automatic updates enabled, without warning or consent, and it cannot be removed, blocked or hidden. Though the chatbot is powered by OpenAI’s ChatGPT technology, its reception has been far from favorable among students The Sun spoke to, even compared to its parent AI, with Cornellians concerned about intrusive behavior and potential privacy breaches.

My AI can track a user’s city-level location and general distance if the user shares their location with Snapchat, even when they are in “Ghost Mode,” which prevents the user from appearing on the Snap Map. According to Snapchat’s website, My AI cannot view the location of users who do not grant the app permission to track their location.

“I’ve seen people ask what their location is, and the AI will respond with their location, and they’ll [the user] say how do you know, and the AI will say I don’t,” Ben Chenven ’26 said.

For years, Snapchat has been a trusted platform for many, primarily due to its emphasis on privacy, according to students who spoke with The Sun. The inability to remove or control the My AI feature has eroded their confidence.

“Snapchat is an app that is known for ‘deleting’ the snaps after they’re viewed, but I feel like I don’t trust Snapchat in general,” Alice Liu ’25 said. “I definitely feel cautious about it.”

With the rollout of My AI, users feel their boundaries have been overstepped and have begun to question their experience on Snapchat. Neil Janjikhel ’25 said he viewed the chatbot as “creepy” and expressed concern about its undisclosed knowledge.

“I don’t know [what else it knows], and that’s what worries me,” Janjikhel said. 

However, Kurt Fass ’25 said the concern lies not with the chatbot’s abilities but with users failing to read terms and conditions.

“I think the main concern isn’t so much about the AI,” Fass said. “It’s more about these apps that we all sign up for because we never read the terms and conditions. We always scroll through, click ‘submit,’ waive our rights and just say, ‘Alright, that’s good enough.’”

Fass elaborated that social dynamics and convenience cause people to willingly give up privacy.

“There’s a certain point in which we trade privacy for convenience,” Fass said. “In this case, we use all these apps and waive our rights and sign these long contracts that we’ll never read, because one, everyone uses it, so peer pressure, and two, because it’s just convenient. We trade our privacy for convenience, so I’m not surprised.”

Canela Garcia DeMetropolis ’23 is a Sun contributor and can be reached at [email protected].