Facing heat for conducting a psychological experiment on unwitting users, Facebook last summer began another study to see whether the social network insulates members from diverse opinions. The conclusion: Nope.
Facebook researchers sought to determine whether its customised news feeds seal off users from a variety of perspectives. The study concluded that personal choice about what to click on has a greater effect than Facebook's own formula for the news feed, the stream of postings shown to a user, according to findings published on Thursday in Science magazine.
Now that almost half the Internet-connected population is on the social network, Facebook is seeking to understand its cultural effects, for insight into how to create products and for academic purposes. The Science report adds to the limited public knowledge about how the network's 1.4 billion users behave socially.
Researchers began the study in July, as Facebook faced scrutiny from regulators for conducting an experiment without user consent to see how making news feeds more positive or negative would affect people's moods. The Menlo Park, California-based company in October tightened the standards for its research policies.
The study published on Thursday, for which researchers reviewed 10.1 million US accounts anonymously, acknowledges an "echo chamber" effect in which users predominantly see views they agree with. But researchers found that the network's news-feed algorithm isn't largely responsible.
"Our work shows that social media exposes individuals to at least some ideologically cross-cutting viewpoints," wrote the researchers, who came from Facebook and the University of Michigan. "The power to expose oneself to perspectives from the other side in social media lies first and foremost with individuals."
In a completely random news feed, more than 40 per cent of the hard-news content would come from opposing points of view, the study found. Accounting for friendships and the popularity of postings, that percentage declines. Liberals tend to see a narrower universe of opinions: they get 24 per cent of their hard-news content from conservatives, while conservatives get 35 per cent from liberals.
Facebook's news-feed algorithm, which gives priority to postings from close friends and other content deemed relevant to the individual, reduces opposing opinions by 8 per cent for liberals and 5 per cent for conservatives.
What a user clicks on has a greater effect cumulatively. A liberal user's clicks actually cut exposure to other views by a little less, 6 per cent, but a conservative's choices reduce such exposure by 17 per cent, the study found.
"On average, more than 20 per cent of an individual's Facebook friends who report an ideological affiliation are from the opposing party, leaving substantial room for exposure to opposing viewpoints," the study found.
- Bloomberg