Researchers at Facebook were given almost free rein to manipulate the news feeds and sometimes the emotions of many of the company's 1.3 billion users without their knowledge, a former employee has disclosed.
The company's data science team is said to have operated with virtually no supervision or corporate oversight, conducting experiments so frequently that researchers worried they were repeatedly using the same information.
The revelations, reported by The Wall Street Journal, follow controversy over the disclosure that Facebook ran psychological experiments on almost 700,000 users without informing them, highlighting negative or positive postings from their friends in their news feeds to determine how their emotions were affected.
"There's no review process, per se," said Andrew Ledvina, who worked as a Facebook data scientist from February 2012 to July 2013. "Anyone on that team could run a test. They're always trying to alter people's behaviour."
His disclosures signal that the secret experiments were more widespread than the controversial mood manipulation study, revealed last week, which sought to discover whether the content presented on users' pages made them happier or sadder.
In one experiment described by the newspaper, thousands of users received a message that they were being locked out of the social network because Facebook believed they were robots or using fake names. The message was actually a test designed to help improve Facebook's anti-fraud measures.
Mr Ledvina described discussions within the team about conducting other tests without informing users. "I'm sure some people got very angry somewhere," he said. "Internally, you get a little desensitised to it."
Facebook's data science team has reportedly run hundreds of tests and studies without the knowledge or consent of participants since its creation in 2007, relying on the terms of service agreement which states that user data can be used for research.
The controversy about the human emotions experiment took Facebook by surprise after its organisers published details in an academic study.
The company said it had imposed stricter controls on its data science team, including reviews by a 50-person panel of experts in security and privacy. "We are taking a very hard look at this process to make more improvements," a spokesman said.
Sheryl Sandberg, Facebook's chief operating officer, this week apologised that the study was "poorly communicated", but not for the research itself.