Facebook secretly manipulated the feelings of nearly 700,000 users to study "emotional contagion," prompting anger and putting the social network giant on the defensive.
For one week in 2012 -- and without the explicit consent or knowledge of users -- Facebook tampered with the algorithm used to place posts into their news feeds to study how this affected their mood.
The researchers wanted to see if the number of positive, or negative, words in messages they read affected whether users then posted positive or negative content in their status updates.
The study, conducted by researchers affiliated with Facebook, Cornell University, and the University of California at San Francisco, appeared in the June 17 edition of the Proceedings of the National Academy of Sciences.
Word of the study spread -- along with anger and disbelief -- when the online magazine Slate and The Atlantic's website wrote about it on Saturday.
"Emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness," the study authors wrote.
"These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks."
While other research has used metadata to study trends, this study appears to be unique in deliberately manipulating what users saw in order to measure a reaction.
Facebook, which says it has more than one billion active users, said in a statement: "This research was conducted for a single week in 2012 and none of the data used was associated with a specific person's Facebook account.
"We do research to improve our services and to make the content people see on Facebook as relevant and engaging as possible.
"A big part of this is understanding how people respond to different types of content, whether it's positive or negative in tone, news from friends, or information from pages they follow. We carefully consider what research we do and have a strong internal review process."
In the paper, the researchers said the study "was consistent with Facebook's Data Use Policy, to which all users agree prior to creating an account on Facebook."
But that did nothing to stem the growing anger of Facebook users.
"#Facebook MANIPULATED USER FEEDS FOR MASSIVE PSYCH EXPERIMENT... Yeah, time to close FB acct!" read one Twitter posting.
Other tweets used words like "super disturbing," "creepy" and "evil," as well as angry expletives, to describe the psychological experiment.
Susan Fiske, a Princeton University professor who edited the report for publication, told The Atlantic that she was concerned about the research and contacted the authors.
Fiske said the authors allayed her concerns, saying they had received approval from an ethics review board before conducting the experiment.
Nevertheless, Fiske admitted to being "a little creeped out" by the study.
• More than 600,000 Facebook users had their newsfeeds manipulated in January 2012.
• For a week, some users saw mostly negative status posts, while others saw mostly positive posts.
• Results showed that what people saw influenced what they then posted themselves.