August 26, 2014

Cornell Professor Involved in Facebook Study Affecting 700,000 Unknowing Users

Two Cornellians — Prof. Jeffrey Hancock, communication and information science, and former doctoral student Jamie Guillory ’13 — authored a controversial study earlier this summer, part of which involved Facebook altering the news feeds of nearly 700,000 users without their consent.

The study, published June 17, was part of an experiment to test whether social media websites have an effect on users’ moods by deliberately showing users content that was happier or sadder than normal. Facebook controls the order — and has the ability to control the types — of stories users view in their news feeds, according to The New York Times.

Facebook revealed in late June that it had manipulated users’ news feeds in the interest of psychological testing. The research was published in an article titled “Experimental Evidence of Massive-Scale Emotional Contagion Through Social Networks,” in the journal Proceedings of the National Academy of Sciences.

“[The results of the experiment] indicate that emotions expressed by others on Facebook influence our own emotions,” the study said.

Hancock and Guillory analyzed the results of Facebook’s research and, along with Facebook data scientist Adam Kramer, co-authored the article, which documented the emotional effects of manipulating 689,003 users’ news feeds.

The experiment produced widespread backlash nationally, largely because Facebook users felt they had not consented to the research, according to The New York Times. Facebook argues that by agreeing to its Terms of Use, every user consents to experimentation by the company.

However, Hancock and Guillory’s work was limited to initial discussions with Facebook and analysis of the research results, according to University Spokesperson John Carberry. Neither Hancock nor Guillory was directly involved in collecting user data, Carberry added.

“Cornell University’s Institutional Review Board concluded that he was not directly engaged in human research and that no review by the Cornell Human Research Protection Program was required,” Carberry said.

Though some say Facebook should have obtained more explicit consent from users, Hancock said in an August interview with The New York Times that he believes there are times when consent can and should be waived in social research.

“This is a new era,” Hancock said. “I liken it a little bit to when chemistry got the microscope.”

In large-scale social experiments involving corporations, he said, researchers have different priorities than they would have in smaller studies.

“Informed consent is a really important principle, the bedrock of a lot of social science, but it can be waived when the intervention, the test, is minimally risky,” Hancock said in an August interview with The Atlantic. “[It] isn’t the be-all, end-all of how to do social science at scale.”