Facebook is extending its ban on hate speech to prohibit the promotion and support of white nationalism and white separatism.
The company previously allowed such material even though it has long banned white supremacists. The social network said Wednesday that it hadn't previously applied the ban to expressions of white nationalism because it associated those expressions with broader concepts of nationalism and separatism, such as American pride or Basque separatism, which are still allowed.
But civil rights groups and academics called this view "misguided" and have long pressured the company to change its stance. Facebook said it concluded after months of "conversations" with them that white nationalism and separatism cannot be meaningfully separated from white supremacy and organized hate groups.
Critics have "raised these issues to the highest levels at Facebook (and held) a number of working meetings with their staff as we've tried to get them to the right place," said Kristen Clarke, president and executive director of the Lawyers' Committee for Civil Rights Under Law, a Washington, D.C.-based legal advocacy group.
"This is long overdue as the country continues to deal with the grip of hate and the increase in violent white supremacy," she said. "We need the tech sector to do its part to combat these efforts."
Though Facebook said it has been working on the change for three months, it comes less than two weeks after the company drew widespread criticism when the suspect in shootings at two New Zealand mosques, which killed 50 people, was able to broadcast the massacre on live video on Facebook.
As part of the change, people who search for terms associated with white supremacy will be directed to a group called Life After Hate, which was founded by former extremists who want to help people leave the violent far-right.
Clarke called the idea that white supremacism is different from white nationalism or white separatism a misguided "distinction without a difference."
She said the New Zealand attack was a "powerful reminder about why we need the tech sector to do more to stamp out the conduct and activity of violent white supremacists."
The rise and spread of white nationalism on Facebook were thrown into sharp relief in the wake of the deadly neo-Nazi rally in Charlottesville, Virginia, in 2017, when self-avowed white nationalists used the social networking site as an organising tool.
Facebook's new policy comes as the company continues to struggle to take down other content that attacks people on the basis of their race, ethnicity, national origin and a host of other "protected characteristics."
Between January 1 and September 30, 2018, Facebook took action against eight million pieces of content that violated its rules on hate speech, according to its latest transparency report. Facebook is not legally required to remove this content, but its rules prohibit it.
To help enforce its policies, Facebook has developed and deployed artificial intelligence tools that can spot and remove content before users even see it. But the technology isn't perfect, particularly when it comes to hate speech: the company said last year that it removes only about 50 per cent of such posts at the moment users upload them.
- AP, with Washington Post