

Amid AI Privacy Concerns, Private Version of Copilot Available to Cornell-Affiliated Users

More than half of college students say artificial intelligence has helped them improve their grades, according to a June 2024 Pearson report. However, recent federal investigations into AI giants are raising concerns over how and why user data is stored.

To protect user data, Cornell-affiliated users can now access a private version of Microsoft’s artificial intelligence-powered chat assistant, Copilot, an initiative designed to provide a safer, privacy-first environment for experimenting with generative AI.

First launched in early March 2024 and later updated, the private version of Copilot ensures that user input is not used to train the tech giant’s future AI models, giving users more control over when their data is stored and how it is used.

“By logging into Copilot with a Cornell NetID, users will be provided with Enterprise Data Protection which provides additional security, privacy and copyright protections for the information users entered into Copilot,” the Cornell Information Technologies (CIT) office wrote in an email to The Sun. “This ensures that data is not used to train Microsoft’s AI models.”

CIT told The Sun that it promoted the private version of Copilot throughout the fall semester, meeting with student groups as well as the University Assembly and the Student Assembly. According to the office, the service currently has just under 1,900 daily users.

While signing into Copilot with a NetID does provide extra privacy and copyright protection compared to the free version, users are still advised to exercise caution when inputting sensitive data.

“Copilot has been approved for use only with low-risk data,” CIT said, “though discussions are underway that may permit broader use in the future.”

Kathleen Anderson ’25, an information science student who studies the relationship between AI and humans, said she was not aware of the private version of Copilot. Last semester, Anderson founded the Responsible AI Network, a student-led initiative aimed at educating students about the ethical implications of AI. She believes Cornell could better communicate the resources available to students.

“Cornell is a very tech-heavy school. We have a lot of people here that are interested in AI and using these tools and building these tools,” Anderson said. “I think it would be to the benefit of everyone — the school, the students [and] the professors — if that information was widely available to people who would want to use it or get involved.”

Prof. Ayham Boucher, information science, said that privacy initiatives with AI are crucial, citing the release of ChatGPT.   

“Because [ChatGPT], like many other free tools, had no privacy guarantees or contractual agreement with enterprises, conversations could be recorded and used to train the next version of ChatGPT, improving its ability to understand user intent,” Boucher said. “That’s why institutions quickly blocked it — people were inputting sensitive data without any assurance that it wouldn’t be used to train the model and as a result leak this data unintentionally to other users.”

Although Microsoft does not use information from Cornell users to train future AI tools, Copilot still offers chat history, which lets users return to recent conversations with their previous questions and details intact, according to Boucher. He said this feature was not previously available, a shortfall that had annoyed users.

Boucher emphasized the distinction between storing data as training material and saving it as context to improve the user’s experience. Chat history preserves the convenience of ongoing conversations while ruling out the possibility that personal information is saved and used as training data without the user’s knowledge.

The rollout is one of several AI privacy proposals CIT is juggling as the protection of sensitive data remains a consideration for University administration.

Prof. Bruce Lewenstein, communication, who studies the public perception of science and new technology, explained that much of the public understands less about how their data is used than they should.

“It turns out that we live in a world where it’s actually very, very hard to protect your privacy … where the access to information about us is extreme,” Lewenstein said. “Most of us just accept that.”

According to both professors, universities have a responsibility to provide privacy-focused resources for students.

“People don’t realize they’re doing stuff that’s not going to be safe, and if there is some relatively straightforward way for us to protect them, then we should do that,” Lewenstein said. “People who are heads of IT do think about that, and think about how we can make it so the default [setting] is to preserve as much privacy as possible.”

