Social platforms have learned to remove violent videos of extremist shootings more quickly over the past few years. It's just not clear they're moving quickly enough.
Police say that when a white gunman killed 10 people and wounded three others — most of them black — in a "racially motivated violent extremist" shooting in Buffalo Saturday, he livestreamed the attack to the gaming platform Twitch, which is owned by Amazon. It didn't stay there long; a Twitch spokesperson said it removed the video in less than two minutes.
That's considerably faster than the 17 minutes Facebook needed to take down a similar video streamed by a self-described white supremacist who killed 51 people in two New Zealand mosques in 2019. But versions of the Buffalo shooting video still quickly spread to other platforms, and they haven't always disappeared quickly.
In April, Twitter enacted a new policy on "perpetrators of violent attacks" to remove accounts maintained by "individual perpetrators of terrorist, violent extremist, or mass violent attacks", along with tweets and other material produced by perpetrators of such attacks. On Sunday, though, clips of the video were still circulating on the platform.
One clip purporting to display a first-person view of the gunman moving through a supermarket firing at people was posted to Twitter at 8.12am Pacific time, and was still viewable more than four hours later.
Twitter said Sunday it was working to remove material related to the shooting that violates its rules. But the company added that sharing videos and other material from the shooter may not violate its rules when people post them to condemn the attack or provide context. In those cases, Twitter said, it places a "sensitive material" cover over the images or videos that users must click through in order to view them.
At a news conference following the attack, New York Gov. Kathy Hochul said social media companies must be more vigilant in monitoring what happens on their platforms and called it inexcusable that the livestream wasn't taken down "within a second".
"The CEOs of those companies need to be held accountable and assure all of us that they're taking every step humanly possible to be able to monitor this information," Hochul said Sunday on ABC's This Week.
"How these depraved ideas are fermenting on social media – it's spreading like a virus now."
Hochul said she holds companies responsible for "fomenting" racist views. "People are sharing these ideas. They're sharing videos of other attacks. And they're all copycat. They all want to be the next great white hope that's going to inspire the next attack," she said on NBC's Meet the Press.
A law enforcement official told the Associated Press that investigators were also looking into a diatribe the gunman posted online, which purports to outline the attacker's racist, anti-immigrant and anti-Semitic beliefs, including a desire to drive all people not of European descent from the US.
Police said the suspected gunman, identified as Payton Gendron, of Conklin, New York, shot 11 black and two white victims in a Buffalo supermarket, echoing a deadly attack in a German synagogue that was also streamed on Twitch in October 2019.
Twitch is popular among video game players and has played a key role in the rise of esports. A company spokesperson said Twitch has a "zero-tolerance policy" against violence. So far, the company hasn't revealed details about the user's page or the livestream, including how many people were watching it. The spokesperson said the company has taken the account offline and is monitoring others who might rebroadcast the video.
In Europe, a senior European Union official with oversight of digital affairs for the 27-nation bloc said Sunday that the livestreaming on Twitch showed the need for regulators to continue working with online platforms so that any future broadcasts of killings can be quickly shut down.
But Margrethe Vestager, who is an executive vice-president of the European Commission, also said it would be a stiff challenge to stamp out such broadcasts completely.
"It's really difficult to make sure that it's completely waterproof, to make sure that this will never happen and that people will be closed down the second they would start a thing like that. Because there's a lot of livestreaming which, of course, is 100% legitimate.
"The platforms have done a lot to get to the root of this. They are not there yet.
"But they keep working and we will keep working."
Meta, which owns Facebook and Instagram, said Sunday that it had quickly designated the shooting a "terrorist attack" on Saturday, triggering an internal process that identifies the suspect's account, copies of his writings, and any copy of or link to video of his attack.
The company said it has removed the video of the shooting from the platform and added that instances of it still being shared are through links to streaming sites. These links, in turn, are blocked and "blackholed" by the company, meaning they can't be uploaded again.
But new links created as people upload copies to outside sites would have to be individually blocked in a game of cat and mouse — unless the company chooses to block an entire streaming site from its platform, which is unlikely.
Jared Holt, a resident fellow at the Atlantic Council's Digital Forensic Research Lab, said live-content moderation continues to be a big challenge for companies. He noted Twitch's response time was good and that the company was smart to watch its platform for potential re-uploads.
"It would behoove other video hosting platforms to also be aware of this content — which, to the extent that it may have been recorded, may also be republished on their own products," Holt said.